Health Services Research. 2019 Apr 22;54(4):940–946. doi: 10.1111/1475-6773.13156

Survey mode effects and insurance coverage estimates in the redesigned Gallup well‐being index

Benjamin D Sommers 1,2, Anna L Goldman 1,3, Dennis Lee 1, Arnold M Epstein 1,2
PMCID: PMC6606554  PMID: 31006859

Abstract

Objective

To test whether a change from a telephone to mail and Internet survey in January 2018 affected the sample composition and uninsured estimates in the Gallup‐Sharecare Well‐Being Index.

Data Sources

Gallup‐Sharecare Well‐Being Index (2013‐2018).

Study Design

Regression discontinuity analysis identified changes after the survey redesign in the estimated U.S. uninsured rate (adults 18‐64) and in the sample's demographic composition.

Principal Findings

After the survey redesign, the estimated uninsured rate fell 5.3 percentage points (P < 0.001), and respondents were older, disproportionately white, more likely to have a college degree, and had higher average incomes. These changes were modestly reduced by survey weights.

Conclusions

The shift to a mail survey (with a web option) led to an older, more educated sample with fewer minorities, and a significant break in trend in the estimated uninsured rate.

Keywords: insurance, survey methods

1. INTRODUCTION

After the passage of the Affordable Care Act (ACA), the Gallup‐Sharecare Well‐Being Index (WBI—previously the Gallup‐Healthways Well‐Being Index) served as a valuable source of timely data on trends in insurance coverage and other health measures.1, 2, 3, 4 From 2008 to 2017, the WBI was administered as a daily, random‐digit dialing telephone survey, but on January 1, 2018, the methodology changed to a monthly mail‐ and web‐based approach. A previous validation study demonstrated that the phone‐based survey produced relatively similar estimates of insurance coverage, access to care, and health status compared to five large government‐sponsored surveys.5 However, the survey's 2018 redesign may have impacted the sample composition and the WBI's ability to accurately estimate changes in insurance coverage over time.

More broadly, telephone response rates have been declining in recent years,6 making phone‐based surveys more costly to conduct. Other large surveys may similarly opt to change data collection tactics to improve response rates.

In this study, we used a regression discontinuity design to assess changes in sample composition and the estimated uninsured rate at the time of the WBI's redesign, in order to gauge mode effects and assess the WBI's potential to serve as a data source for future analyses of coverage trends.

2. METHODS

2.1. Data

Our data source was the Gallup‐Sharecare WBI, 2013‐2018, obtained under contract with Gallup; however, our analysis was independently conducted and used only secondary data. Since 2008, the WBI has surveyed adult residents of the United States aged 18 and older in all 50 states. Data are collected on sociodemographic characteristics, self‐reported health, insurance coverage, and several other measures of well‐being.7

From 2008 to 2017, the survey used random‐digit dialing of cellphone and landline numbers, conducting at least 500 daily interviews in English and Spanish, producing sample sizes of at least 150 000 adults per year. Household members were randomly selected for interview, based on the most recent birthday. Survey data were then weighted to account for nonresponse, unequal selection, and double counting of adults who had access to both landlines and cellphones.7 Gallup derives its WBI survey weights based on gender, age, race, Hispanic ethnicity, education, region, population density, and phone status, from the Current Population Survey, Decennial Census, and National Health Interview Survey (NHIS).8

Beginning in 2018, Gallup elected to change the WBI to a dual mail‐ and web‐based survey. The new design uses an address‐based sampling frame, and paper survey invitations are mailed monthly to a nationally representative sample of U.S. households. Respondents are chosen at random within each household based on who has the next birthday. Recipients may respond via mail or online using a link contained in the paper invitation. The new survey sample is slightly smaller, containing approximately 10 000 adults each month. The pre‐2018 WBI using the phone‐based methodology had an average response rate (varying by year) of 9‐11 percent, whereas the 2018 WBI (postredesign) has a 19 percent response rate, using the AAPOR3 definition.5, 9

2.2. Analysis

First, we computed descriptive statistics for the study sample before and after the design change, presenting statistics from the American Community Survey (ACS) for comparison. We then assessed the monthly uninsured rate for adults aged 18‐64 for 2013‐2018; this time period captures the major declines in the uninsured rate under the ACA previously assessed using this data source. The overall sample included 606 154 nonelderly adults.

Next, we conducted regression discontinuity (RD) analyses to test for discrete changes in demographic measures and the uninsured rate before and after January 1, 2018. RD models allow for varying associations between study outcomes and the “running variable”—in this case, the interview date (a variable Days, indicating the number of days before or after January 1, 2018).

Our primary approach was a nonparametric model using local linear regression: we narrowed the study sample to a window around the survey redesign date and fitted a model with linear time trends that were allowed to differ before and after January 1, 2018. The variable of interest was an indicator (called Post2018) for observations on or after January 1, 2018. We tested the sensitivity of our results to how narrowly we trimmed the sample, in number of days (the "bandwidth"), on either side of January 1, 2018, examining bandwidths from 10 to 90 days as well as the "optimal" bandwidth, which minimizes the model's average mean‐squared error.10, 11
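The local linear approach can be sketched in code. This is an illustrative simulation, not the study's actual code or data: the model (linear trends in Days on each side of the cutoff, plus a Post2018 level shift, fit within a bandwidth) follows the text, but the series, noise level, and size of the break are invented.

```python
import numpy as np

def local_linear_rd(days, y, bandwidth):
    """Local linear RD: within the bandwidth around the cutoff (Days = 0),
    fit y = b0 + b1*Days + b2*Post2018 + b3*Days*Post2018 and return b2,
    the estimated break in level at the cutoff."""
    mask = np.abs(days) <= bandwidth              # trim sample to the bandwidth
    d, yy = days[mask], y[mask]
    post = (d >= 0).astype(float)                 # Post2018 indicator
    X = np.column_stack([np.ones_like(d), d, post, d * post])
    beta, *_ = np.linalg.lstsq(X, yy, rcond=None)
    return beta[2]

# Simulated daily uninsured rates with an invented -5 point break at the cutoff
rng = np.random.default_rng(0)
days = np.arange(-90, 91).astype(float)
y = 13.0 + 0.01 * days - 5.0 * (days >= 0) + rng.normal(0, 0.3, days.size)

estimate = local_linear_rd(days, y, bandwidth=24)  # recovers a value near -5
```

Narrower bandwidths trade bias for variance: fewer observations enter the fit, but the linear approximation to the underlying trend is more credible close to the cutoff.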

We also considered several parametric models, using data from January 2017 through March 2018, to test whether our results were affected by using linear, quadratic, or cubic time trends. In these models, we interacted Days, Days squared, and Days cubed with the Post2018 indicator, allowing for differential polynomial time trends before and after the survey redesign.
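The parametric variant can be sketched similarly, again on invented data; only the polynomial-with-interactions design matrix is taken from the text.

```python
import numpy as np

def polynomial_rd(days, y, degree):
    """Parametric RD: polynomial time trend in Days up to `degree`, fully
    interacted with the Post2018 indicator. Returns the coefficient on
    Post2018, the estimated break at the cutoff."""
    post = (days >= 0).astype(float)
    cols = [np.ones_like(days), post]
    for p in range(1, degree + 1):
        cols.append(days ** p)           # Days, Days squared, Days cubed, ...
        cols.append(post * days ** p)    # ... each interacted with Post2018
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# Roughly January 2017 through March 2018, with an invented -5 point break
rng = np.random.default_rng(0)
days = np.arange(-365, 91).astype(float)
y = 13.0 + 0.005 * days - 5.0 * (days >= 0) + rng.normal(0, 0.3, days.size)

linear_est = polynomial_rd(days, y, degree=1)
cubic_est = polynomial_rd(days, y, degree=3)
```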

Essentially, all of these models identify the extent of any break in trend associated with the survey redesign, but make different assumptions about what the time trend in the uninsured rate would look like in the absence of any discontinuity at January 1, 2018. Our results were similar across these various models.

The study outcomes were the overall uninsured rate among nonelderly adults (18‐64) and key demographic features—age, sex, race/ethnicity, education, and income. For income, we followed previous research using this dataset to create an imputed continuous measure of income as a percentage of the Federal Poverty Level (FPL) from the survey's categorical data on income (eg, less than $720 per year, $720 to $5999, $6000 to $11 999), combined with information on household size.1, 3
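A minimal sketch of the midpoint‐imputation idea: each categorical response maps to its bin midpoint, which is then expressed as a percentage of a poverty threshold that depends on household size. The bin edges mirror the examples in the text, but the FPL schedule below is a hypothetical stand‐in, not the official guidelines or the exact procedure used in the cited studies.

```python
def income_as_pct_fpl(bin_bounds, household_size,
                      base_fpl=12_000, per_person=4_000):
    """Impute income as % of FPL from a categorical income bin.

    bin_bounds: (low, high) annual-income bounds of the chosen category.
    base_fpl / per_person: assumed (hypothetical) FPL schedule by household size.
    """
    low, high = bin_bounds
    midpoint = (low + high) / 2                        # impute the bin midpoint
    fpl = base_fpl + per_person * (household_size - 1)
    return 100 * midpoint / fpl

# eg, the "$6000 to $11 999" category for a two-person household
pct = income_as_pct_fpl((6_000, 11_999), household_size=2)
```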

For all analyses, we present both unweighted estimates and survey‐weighted estimates. This approach allows us to determine whether changes in the underlying sample arising from the survey mode can be adequately addressed by the WBI survey weights. For the nonparametric model, we also tested replacing the survey weights with triangular kernel weights, which more heavily weight observations near the RD threshold—a technique common in RD analyses.12 Finally, in our analysis of the uninsured rate, we tested the effect of directly adjusting for the following covariates in the regression: age, sex, race/ethnicity, education, and income (in addition to using weights). While weighting and direct adjustment should have somewhat similar effects, we tested both simultaneously in a "belt‐and‐suspenders" approach to see whether the combination more effectively mitigates any break in trend due to the survey redesign.
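Triangular kernel weights have a simple closed form; a sketch follows (the kernel is standard, the application to this dataset is illustrative):

```python
import numpy as np

def triangular_weights(days, bandwidth):
    """Triangular kernel: weight 1 at the cutoff (Days = 0), declining
    linearly to 0 at the bandwidth edge; observations outside get 0."""
    return np.clip(1.0 - np.abs(days) / bandwidth, 0.0, None)

# In a weighted least-squares RD fit, scaling each row of the design matrix
# and the outcome by sqrt(w) lets an ordinary lstsq call solve the
# kernel-weighted problem.
days = np.arange(-24, 25).astype(float)
w = triangular_weights(days, bandwidth=24)
```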

We also present “placebo” testing using analogous RD methods to assess any potential changes in the uninsured rate and demographics between December 31, 2016, and January 1, 2017, when no survey redesign occurred. This tests whether any changes detected in our main model could be plausibly attributed to an effect of the new calendar year (eg, open enrollment for insurance coverage effective January 1 each year).
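The placebo test reuses the same local linear machinery at a cutoff where nothing happened. A sketch on invented data: the simulated series breaks only at day 0, so the estimate at the placebo cutoff one year earlier should be near zero.

```python
import numpy as np

def rd_at_cutoff(dates, y, cutoff, bandwidth):
    """Local linear RD estimate of the level break at an arbitrary cutoff."""
    d = dates - cutoff                       # recenter the running variable
    mask = np.abs(d) <= bandwidth
    d, yy = d[mask], y[mask]
    post = (d >= 0).astype(float)
    X = np.column_stack([np.ones_like(d), d, post, d * post])
    beta, *_ = np.linalg.lstsq(X, yy, rcond=None)
    return beta[2]

# Smooth trend with an invented break only at day 0 (the "redesign" date)
rng = np.random.default_rng(1)
dates = np.arange(-500, 91).astype(float)
y = 13.0 + 0.005 * dates - 5.0 * (dates >= 0) + rng.normal(0, 0.3, dates.size)

real_break = rd_at_cutoff(dates, y, cutoff=0, bandwidth=30)        # near -5
placebo_break = rd_at_cutoff(dates, y, cutoff=-365, bandwidth=30)  # near 0
```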

3. RESULTS

Table 1 compares sociodemographic characteristics of the 2013‐2017 phone survey cohort to the 2018 mail survey cohort. Compared to the telephone cohort, the postredesign cohort was older, had more female and white respondents, and had higher income and education levels. Table S1 shows that the WBI phone survey sample was more representative of the actual U.S. population than the new mail survey cohort along these dimensions.

Table 1.

Characteristics of the study sample, 2013‐2017 vs 2018

| Variable | 2013‐2017 telephone survey (unweighted) | Q1 2018 mail/Internet survey (unweighted) | P‐value | 2013‐2017 telephone survey (weighted) | Q1 2018 mail/Internet survey (weighted) | P‐value |
|---|---|---|---|---|---|---|
| Sample size | 586 669 | 19 485 | N/A | 586 669 | 19 485 | N/A |
| Age (y) | | | | | | |
| 18‐25 | 14% | 5% | <0.001 | 19% | 13% | <0.001 |
| 25‐34 | 15% | 14% | <0.001 | 18% | 20% | <0.001 |
| 35‐44 | 18% | 19% | 0.004 | 20% | 20% | 0.82 |
| 45‐54 | 23% | 26% | <0.001 | 22% | 23% | 0.004 |
| 55‐64 | 30% | 38% | <0.001 | 22% | 23% | <0.001 |
| Race/ethnicity | | | | | | |
| White | 71% | 78% | <0.001 | 66% | 63% | <0.001 |
| Hispanic | 11% | 8% | <0.001 | 15% | 17% | <0.001 |
| Black | 10% | 7% | <0.001 | 13% | 12% | 0.004 |
| Other a | 8% | 7% | <0.001 | 6% | 8% | <0.001 |
| Education | | | | | | |
| Less than a high school degree | 6% | 2% | <0.001 | 11% | 9% | <0.001 |
| High school graduate or GED | 26% | 21% | <0.001 | 33% | 32% | 0.04 |
| Some college | 26% | 26% | 0.04 | 25% | 24% | 0.26 |
| College graduate | 43% | 50% | <0.001 | 31% | 35% | <0.001 |
| Male | 54% | 43% | <0.001 | 50% | 49% | 0.02 |
| Income (% FPL) | 375% | 423% | <0.001 | 330% | 335% | <0.001 |

Sample contains respondents aged 18‐64 in the Gallup‐Healthways WBI, 2013 through Q1 2018 (N = 606 154). Binary and categorical variables were compared using Pearson chi‐square tests; continuous variables were compared using two‐sample t tests.

a The Gallup survey instrument changed the categories available in its survey several times during the study period, precluding a consistent way to characterize the size of the Asian American, Native Hawaiian, Pacific Islander, or Native American populations. We also include "missing or don't know" in this category.

Abbreviation: FPL, Federal Poverty Level.

Figure 1 displays the unweighted and weighted trends in the nationwide uninsured rate from 2013 to 2018. The unweighted trend (panel A) drops from over 13 percent in October 2017 to about 7 percent in January 2018, the period when the survey shifted from phone‐based to mail‐based. This decline is roughly equal in size to the more gradual change that occurred in the first twelve months after the ACA's 2014 coverage expansion took effect. The overall pattern was similar in the weighted trend (panel B). Panel B also shows estimates from the NHIS, which do not indicate any major change in early 2018.13

Figure 1.

Unadjusted WBI monthly uninsured estimates for adults aged 18‐64, from January 2013 to March 2018.

Note: "WBI" is the Gallup‐Healthways Well‐Being Index, 2013 through Q1 2018 (n = 606 154). The survey redesign occurred in January 2018. Weighted results used Gallup survey weights. "NHIS" shows weighted, nationally representative estimates from the 2013‐2018 National Health Interview Survey.

Table 2 presents the results of our RD models of the uninsured rate and demographic measures. In the unweighted model, the uninsured rate fell by 5.3 percentage points (P < 0.001) in the postredesign cohort compared to the preredesign cohort. The estimate was essentially unchanged by the use of survey weights (5.2 percentage point decrease, P = 0.02). After weighting and adjustment for sociodemographic characteristics, the RD change in the uninsured rate no longer reached statistical significance, but the magnitude of the estimated break in trend remained considerable (3.3 percentage point decrease, P = 0.12).

Table 2.

Estimated changes in uninsured rate and demographic characteristics in 2018, based on regression discontinuity (RD) design

| Outcome | Unweighted, unadjusted RD estimate | P‐value | Weighted, unadjusted RD estimate | P‐value | Weighted, adjusted RD estimate | P‐value |
|---|---|---|---|---|---|---|
| % uninsured | −0.053 | <0.001 | −0.052 | 0.018 | −0.033 | 0.12 |
| Age (y) | 2.533 | <0.001 | −0.584 | 0.52 | N/A | N/A |
| Race/ethnicity | | | | | | |
| White | 0.173 | <0.001 | 0.096 | 0.004 | N/A | N/A |
| Hispanic | −0.063 | <0.001 | −0.053 | 0.065 | N/A | N/A |
| Black | −0.064 | <0.001 | −0.040 | 0.085 | N/A | N/A |
| Education | | | | | | |
| Less than a high school degree | −0.041 | <0.001 | −0.047 | 0.064 | N/A | N/A |
| High school graduate or GED | −0.075 | 0.001 | −0.020 | 0.52 | N/A | N/A |
| Some college | −0.041 | 0.069 | −0.007 | 0.81 | N/A | N/A |
| College graduate | 0.168 | <0.001 | 0.081 | 0.004 | N/A | N/A |
| Male | −0.046 | 0.071 | 0.083 | 0.010 | N/A | N/A |
| Income (% of FPL) | 64.6% | <0.001 | 22.6% | 0.15 | N/A | N/A |

Data source is Gallup‐Healthways WBI 2017‐2018 (n = 10 592). Adjusted model controlled for education, race/ethnicity, sex, age, and income. We did not estimate adjusted models for demographic outcomes—only for the % uninsured. All models show results of local linear regressions using the optimal bandwidth approach. See text for further details.

Abbreviation: FPL, Federal Poverty Level.


The unweighted models also showed large RD changes in sample composition after the 2018 survey redesign. Mean age increased by 2.5 years (P < 0.001), and the proportion of white respondents increased by 17.3 percentage points (P < 0.001). The share with a college degree increased by 16.8 percentage points (P < 0.001), and income as a percentage of the Federal Poverty Level increased by 64.6 percent (P < 0.001).

Using the survey weights decreased the magnitude of several estimates, though many demographic features continued to show substantial discontinuity. After weighting, the survey redesign in January 2018 was associated with an increased share of male respondents (8.3 percentage points, P = 0.01), an increased proportion of white respondents (9.6 percentage points, P = 0.004), and an increased share with a college degree (8.1 percentage points, P = 0.004). The latter two estimates indicate that weighting reduced the discontinuity for those characteristics by roughly one‐half.

Figure S1 displays the relationship between the unweighted, unadjusted RD estimates of the uninsured rate and bandwidth size, ranging from 10 to 90 days (our main model uses an optimal bandwidth of 24 days). The RD estimates reached significance (P < 0.05) above a bandwidth of 15 days. As bandwidth increased, the RD estimates became more precise (presumably due to larger sample sizes) and stabilized in the range of −3 to −6 percentage points. The analogous figure for the weighted RD estimates is Figure S2, which shows that using the survey weights reduced the magnitude of the RD estimate compared to the unweighted analysis for some bandwidths (and some estimates became nonsignificant), though not near the optimal bandwidth of 24 days.

We also examined parametric models using data from 2017 to 2018 and polynomial time trends (Table S2). In both quadratic and cubic models, the RD estimate for the change in the uninsured rate as of January 1, 2018, was large and significant (−6.0 and −6.1 percentage points, respectively, both P < 0.001). Results for the local linear model using a triangular kernel were fairly similar to our main model (−3.9 percentage points, P = 0.018).

We also tested whether similar discontinuities were evident as of January 1, 2017 (a year prior to the survey redesign). Those results (Table S3) show no significant RD changes in the uninsured rate as of January 1, 2017, with or without survey weighting, and only one significant demographic change (white race) that was no longer significant when using survey weights. Other demographic variables were unchanged from 2016 to 2017.

4. DISCUSSION

In this analysis of the Gallup‐Sharecare WBI, we found that the shift from phone‐based to mail‐ and web‐based data collection in January 2018 produced significant discontinuities in the estimated trends of the U.S. nonelderly uninsured rate and in the demographic composition of the survey sample. In the postredesign cohort, the uninsured rate fell more than 5 percentage points compared to the preredesign cohort, an apparently spurious change comparable in magnitude to the effects of the ACA's coverage expansion and contradicted by stable coverage estimates from the NHIS during this period. Postredesign survey respondents were older, included fewer racial/ethnic minorities, and had higher average education and income levels. Survey weights designed to mitigate nonresponse bias reduced the size of the demographic shifts, in some cases by up to half of the original discontinuity (eg, for white race and percentage with a college degree), but had minimal effects on the change in the uninsured rate. Direct regression‐based adjustment for demographic changes attenuated some of the change in the uninsured rate, but not enough to eliminate the large discontinuity in the WBI. These substantial discontinuities suggest that researchers may want to consider alternative data sources for studying post‐ACA trends in the uninsured rate in 2018 and beyond.

Our study has implications beyond this particular dataset. The WBI is one of several large national surveys adopting new modes of data collection to improve survey response rates.14, 15 This trend will likely continue as telephone surveys become increasingly difficult and expensive to conduct. However, our results suggest that, even with sophisticated reweighting to correct for the demographic effects of a survey redesign, changing data collection modes can cause major disruptions in trends.

The Gallup organization urges caution when using data drawn from before and after the method redesign, noting that the demographic composition of the sample and response patterns vary with the survey mode.16 A 2019 Gallup report on the uninsured rate released after the redesign describes using "scientifically determined mode adjustments" to eliminate the 2018 break in trend; no further details have been released yet, but Gallup plans to issue a methodology report in the future. Validation of this approach will be a useful area for future research.

Notably, the WBI's response rate improved after the redesign, but nonresponse bias may have increased, as the WBI uninsured estimate (which was already roughly two points lower than federal estimates) fell even further. This is consistent with prior research indicating that response rates alone may not be adequate indicators of nonresponse bias.17 It also points to a trade‐off between response rate and accurate population representation for web‐based data collection in comparison with phone surveys,18 a disparity largely explained by differential Internet access and literacy among U.S. adults.19 Although RDD phone surveys suffer from high nonresponse rates, nonresponse bias is lessened by the ease of phone use and the higher penetration of telephones compared to Internet usage.20

A substantial literature suggests that the survey mode can also lead to systematic changes in responses even without a change in sample composition.21, 22, 23 In particular, interpersonal modes of questioning (ie, by phone or in person) may lead respondents to provide more socially desirable responses compared to mail or online questions.24, 25 However, we found a lower uninsured rate in the depersonalized mail/web‐based survey, counter to this predicted effect. More likely, the decline in the uninsured rate in the 2018 WBI cohort is due to a selective change in sample composition that is only partially captured by observable demographic variables. Our results underline the importance of survey weighting in low‐response rate surveys, but also indicate the limitations of such approaches in the face of potentially large nonresponse bias and changes in survey mode.

4.1. Limitations

While we identified clear discontinuities in trends over time, our results do not provide insight into how closely future trends in the WBI's estimated uninsured rate may track with “gold standard” national surveys. It is possible that, after accounting for the large drop in the uninsured rate in January 2018, the WBI may nonetheless be a reasonable early indicator of future national coverage changes. New validation studies are needed to address these issues. Ideally, these studies would compare changes in the uninsured rate over time and across states in the WBI vs federal surveys such as the NHIS or ACS, similar to prior validation studies of the WBI.5 However, such studies cannot be performed until late 2019, when 2018 data from large federal surveys will become publicly available.

Our analysis was also limited to the secondary dataset available for purchase from Gallup, which restricted our ability to determine which aspects of the survey mode change were responsible for our findings. In particular, beyond changing the sampling frame and the potential for nonresponse bias, the mail survey may have altered rates of misreporting, which we could not assess. Previous work shows that survey respondents often struggle to describe their health insurance status,26 and this may be worse in a mail survey than in a live interviewer phone survey; however, it is less of a concern for the overall uninsured rate, which is less prone to misreporting than the type of coverage. The revised survey was also shorter and had a different question order than the original survey, which may have contributed to some of the detected changes.

We were unable to assess changes in racial composition for several minority groups (including Native Americans and Asian Americans) because the WBI has used a changing set of racial and ethnic categories over time. We did not analyze several measures of access to care previously studied in the WBI, since these items were dropped after the survey redesign. Lastly, we did not examine changes in coverage type, as only the uninsured rate has been previously validated in this dataset.5

5. CONCLUSIONS

As major changes in the ACA continue,27 rapid response surveys like the WBI can serve an important role in evaluating coverage trends; however, our results provide a note of caution for researchers considering this data source for such analyses—at least until future validation studies comparing the redesigned WBI to government surveys can be conducted. More broadly, our findings suggest that the redesign of large surveys may cause significant mode effects and potential disruptions in time trends.

Supporting information

ACKNOWLEDGMENTS

Joint Acknowledgment/Disclosure Statement: This project was supported by a research grant from the Robert Wood Johnson Foundation (RWJF). The authors have no financial conflicts of interest to report. Drs. Epstein and Sommers previously served in the Office of the Assistant Secretary for Planning and Evaluation at the Department of Health and Human Services (HHS), but the views presented here are those of the authors and do not represent HHS or RWJF.

Sommers BD, Goldman AL, Lee D, Epstein AM. Survey mode effects and insurance coverage estimates in the redesigned Gallup well‐being index. Health Serv Res. 2019;54:940–946. 10.1111/1475-6773.13156

REFERENCES

1. Sommers BD, Gunja MZ, Finegold K, Musco T. Changes in self‐reported insurance coverage, access to care, and health under the Affordable Care Act. JAMA. 2015;314:366‐374.
2. Nasseh K, Vujicic M. Early impact of the Affordable Care Act's Medicaid expansion on dental care use. Health Serv Res. 2017;52:2256‐2268.
3. Sommers BD, Musco T, Finegold K, Gunja MZ, Burke A, McDowell AM. Health reform and changes in health insurance coverage in 2014. N Engl J Med. 2014;371:867‐874.
4. Sommers BD, Clark KL, Epstein AM. Early changes in health insurance coverage under the Trump administration. N Engl J Med. 2018;378:1061‐1063.
5. Skopec L, Musco T, Sommers BD. A potential new data source for assessing the impacts of health reform: evaluating the Gallup‐Healthways Well‐Being Index. Healthc (Amst). 2014;2:113‐120.
6. Meyer BD, Mok WK, Sullivan JX. Household surveys in crisis. J Econ Perspect. 2015;29:199‐226.
7. How does the Gallup‐Sharecare Well‐Being Index work? Gallup; 2017. http://www.gallup.com/175196/gallup-healthways-index-methodology.aspx. Accessed August 2, 2018.
8. How does the Gallup‐Sharecare Well‐Being Index work? Gallup; 2018. http://www.gallup.com/224870/gallup-sharecare-index-work.aspx. Accessed July 10, 2018.
9. Gallup. Gallup‐Sharecare Well‐Being Index Methodological Documentation. Washington, DC: Gallup; 2018.
10. Imbens G, Kalyanaraman K. Optimal bandwidth choice for the regression discontinuity estimator. Rev Econ Stud. 2012;79:933‐959.
11. Calonico S, Cattaneo M, Titiunik R. Robust nonparametric confidence intervals for regression‐discontinuity designs. Econometrica. 2014;82:2295‐2326.
12. Lee DS, Lemieux T. Regression discontinuity designs in economics. J Econ Lit. 2010;48:281‐355.
13. Martinez ME, Zammitti EP, Cohen RA. Health Insurance Coverage: Early Release of Estimates From the National Health Interview Survey, January‐June 2018. Hyattsville, MD: National Center for Health Statistics; 2018.
14. Kreider AR, French B, Aysola J, Saloner B, Noonan KG, Rubin DM. Quality of health insurance coverage and access to care for children in low‐income families. JAMA Pediatr. 2016;170(1):43‐51.
15. Tesler R, Sorra J. CAHPS Survey Administration: What We Know and Potential Research Questions. Rockville, MD: Agency for Healthcare Research and Quality; 2017.
16. Marlar J. Why Phone and Web Survey Results Aren't the Same. Washington, DC: Gallup; 2018.
17. Davern M. Nonresponse rates are a problematic indicator of nonresponse bias in survey research. Health Serv Res. 2013;48:905‐912.
18. Chang L, Krosnick JA. National surveys via RDD telephone interviewing versus the Internet: comparing sample representativeness and response quality. Public Opin Q. 2009;73:641‐678.
19. Berrens RP, Bohara AK, Jenkins‐Smith H, Silva C, Weimer DL. The advent of Internet surveys for political research: a comparison of telephone and Internet samples. Political Analysis. 2003;11:1‐22.
20. Hays RD, Liu H, Kapteyn A. Use of Internet panels to conduct surveys. Behav Res Methods. 2015;47:685‐690.
21. Dillman DA, Christian LM. Survey mode as a source of instability in responses across surveys. Field Methods. 2005;17:30‐52.
22. Elliott MN, Zaslavsky AM, Goldstein E, et al. Effects of survey mode, patient mix, and nonresponse on CAHPS hospital survey scores. Health Serv Res. 2009;44:501‐518.
23. Dillman DA, Phelps G, Tortora R, et al. Response rate and measurement differences in mixed‐mode surveys using mail, telephone, interactive voice response (IVR) and the Internet. Soc Sci Res. 2009;38:1‐18.
24. Greene J, Speizer H, Wiitala W. Telephone and web: mixed‐mode challenge. Health Serv Res. 2008;43:230‐248.
25. Bowling A. Mode of questionnaire administration can have serious effects on data quality. J Public Health. 2005;27:281‐291.
26. Davern M, Klerman JA, Baugh DK, Call KT, Greenberg GD. An examination of the Medicaid undercount in the Current Population Survey: preliminary results from record linking. Health Serv Res. 2009;44:965‐987.
27. Collins SR, Gunja MZ, Doty MM, Bhupal HK. First Look at Health Insurance Coverage in 2018 Finds ACA Gains Beginning to Reverse. New York, NY: Commonwealth Fund; 2018.
