Abstract
This cross-sectional study examines the reliability and validity of a newly developed 15-item Medicare proficiency questionnaire (MPQ) in a mixed sample of participants enrolled and unenrolled in Medicare. The MPQ was designed to assess beneficiary knowledge across a variety of Medicare topics; it was developed by selecting questions from the 2003 Medicare Current Beneficiary Survey and supplementing them with researcher-generated Medicare Part D questions. In February 2024, participants enrolled and unenrolled in Medicare, recruited on Prolific, completed online surveys that included the MPQ and demographic questions. We found that the MPQ has adequate internal consistency reliability (Cronbach’s alpha = .73 across participants enrolled and unenrolled in Medicare) as well as adequate validity, as demonstrated by positive relationships of MPQ scores to education level and to enrollment status (enrollees scoring higher).
Background and Objectives
In December 2023, approximately 66.9 million people were enrolled in Medicare (Centers for Medicare & Medicaid Services Data, n.d.-b). Making Medicare-related decisions has been reported to be a challenging and overwhelming experience for beneficiaries (Castillo et al., 2023). During this process, a beneficiary must learn unfamiliar terms, adhere to enrollment deadlines, evaluate various plans, and choose one plan based on their personal preferences and situation. Older adults have been found to have a poor understanding of the services covered by Original Medicare and Medicare Advantage plans (Ankuda et al., 2020; Sivakumar et al., 2016). Even intelligent digital assistants have shown imperfect knowledge, though large language models are generally superior to Medicare beneficiaries (Langston et al., 2024).
Identifying and addressing such poor health literacy is essential, given its association with poorer health status and higher mortality rates among older people (Berkman et al., 2011). Medicare beneficiaries’ knowledge of the Medicare system is assessed through Medicare Current Beneficiary Surveys (MCBSs). These surveys are distributed annually to a nationally representative sample to collect beneficiary information, including beneficiaries’ understanding of Medicare.
Bann and McCormack (2005) published data from 2,634 respondents who completed the “Medicare Knowledge” section within the MCBS. This version of the Medicare Knowledge section included true/false questions from the following sections: “Eligibility for and Structure of Original Medicare,” “Medicare + Choice,” “Plan Choices and Health Plan Decision-Making,” “Information and Assistance,” “Beneficiary Rights,” and “Medigap/Employer-Sponsored Supplemental Insurance.”
The “Eligibility for and Structure of Original Medicare” section included 17 questions regarding the coverage and cost of Original Medicare. The “Medicare + Choice” section refers to the “Medicare + Choice” program, the name of the Medicare Advantage program before December 2003; it included 17 questions regarding coverage and eligibility related to Medicare managed care plans (HMOs). The “Plan Choices and Health Plan Decision-Making” section included six questions regarding plan flexibility, differences between Medicare Advantage plans and Original Medicare, and their associated costs. The “Information and Assistance” section included six questions relating to informational resources available to beneficiaries. The “Beneficiary Rights” section included four questions regarding appealing Medicare decisions, as well as beneficiaries’ right to privacy. The “Medigap/Employer-Sponsored Supplemental Insurance” section included five questions related to Medigap and supplemental insurance eligibility and coverage.
Objective, fact-based questions such as those included in the 2003 MCBS are important in assessing beneficiary knowledge, as older adults with a poor understanding of Medicare have been shown to be more likely to report that they have adequate knowledge of Medicare (Sivakumar et al., 2016). Our aim in this project is to provide a short Medicare proficiency questionnaire (MPQ) that could benefit: a) Medicare advisors, allowing them to quickly assess clients’ proficiency and tailor advice accordingly; b) older adults, who could self-test their proficiency when deciding whether to consult others for Medicare decision making; and c) researchers who want to assess the influence of Medicare knowledge on Medicare decision-making processes. Drawing questions from the Medicare Knowledge section of the 2003 Medicare Current Beneficiary Survey, adapting them to terminology changes, and supplementing them with new questions, we created and assessed the reliability and validity of a short, 15-question Medicare proficiency quiz (see Table 1).
Table 1.
15 Questions Used in the Medicare Proficiency Questionnaire with Answers and Question Category
| Question number | Question | Correct Answer | Question Category |
|---|---|---|---|
| 1 | A beneficiary who is enrolled in a Medicare Advantage plan can go to any doctor or hospital in the United States for routine care and the visit will be covered. | False | Medicare + choice |
| 2 | If your Medicare Advantage provider leaves Medicare and you do not choose another one, you will be covered by the Original Medicare plan. | True | Medicare + choice |
| 3 | Part A of the Medicare program covers hospital stays. | True | Eligibility for Medicare |
| 4 | I can drop and switch my Medicare drug plan at any time. | False | Part D |
| 5 | If you have Medicare, your health insurance plan or your doctor must keep your information private, unless you give your permission for them to share it. | True | Beneficiary Rights |
| 6 | The premium, or monthly payment, that Medicare beneficiaries have to pay for doctors’ visits and other medical services, can change at any time during the year. | False | Eligibility for Medicare |
| 7 | Medicare beneficiaries can buy a Medigap or supplemental health insurance policy at any time, regardless of their health. | False | Medigap |
| 8 | No matter which health insurance option you choose, your out-of-pocket costs will be the same. | False | Plan choices |
| 9 | I can only get drug coverage through Medicare Advantage plans. | False | Part D |
| 10 | Information is available about the quality of care people get with different Medicare health insurance options. | True | Information and Assistance |
| 11 | Medicare Advantage plans often cover more health services, like prescribed medicines, than Medicare without a supplemental insurance policy. | True | Medicare + choice |
| 12 | With Medicare Part D, people can receive coverage for any prescription. | False | Part D |
| 13 | You have the right to appeal a Medicare Advantage plan’s decision about which health care services it will pay for. | True | Beneficiary Rights |
| 14 | Your out-of-pocket costs will vary depending on which Medicare health plan options you choose. | True | Plan choices |
| 15 | Part B of the Medicare program covers medical services like doctors’ visits. | True | Eligibility for Medicare |
Research Design and Methods
Development of Medicare Proficiency Questionnaire
When developing our shorter Medicare proficiency questionnaire, we chose 12 questions from the “Medicare Knowledge” section of the 2003 version of the MCBS used by Bann and McCormack (2005). We chose questions from all six sections, with half coming from the two sections with the largest number of questions: “Eligibility for and Structure of Original Medicare” and “Medicare + Choice.” Beneficiaries who took part in the 2003 MCBS could respond to each question with “I don’t know,” in addition to the answer choices of “true” or “false.” For each question within the Medicare Knowledge section of the 2003 MCBS, Bann and McCormack (2005) published the percentage of participants who answered correctly, incorrectly, or with “I don’t know.” Consulting these percentages, we selected a set of questions with varying difficulty levels. The 12 questions we selected had a mean percent correct of 52.6 (SD = 18.1) and a mean percent “don’t know” of 34.2 (SD = 12.3). Three of these questions originally used the term “HMOs”; for clarity, we edited them to use the current term “Medicare Advantage plan.”
When the 2003 MCBS was developed, Medicare Part D had not yet been established (Centers for Medicare & Medicaid Services, n.d.-a). Therefore, we developed three true/false Part D questions based on content included in the 2022 version of the official “Medicare and You” handbook. The 15 questions were selected to balance the number of “true” and “false” correct answers: eight of the 12 MCBS questions had “true” as their correct answer, while the remaining four MCBS questions and all three Part D questions had “false” as their correct answer (see Table 1).
Participant Recruitment
Using Prolific (https://www.prolific.com/), we recruited participants aged 62 or older, both enrolled and unenrolled in Medicare, to take this Medicare proficiency survey. Participants were eligible for the study if they lived in the United States and were of American nationality. Participants first completed an eligibility survey through Qualtrics (https://www.qualtrics.com/), which allowed us to recruit them for the second, main survey. Participants could complete this eligibility survey between February 15 and February 23, 2024, and were compensated at a rate of $10 an hour. This initial survey asked participants their age, birth year, and whether they were enrolled in Medicare. In place of a traditional informed consent page, participants who attempted the initial eligibility survey were given an information sheet outlining the study procedure. Following this information sheet, participants were asked three true/false questions about its content, to ensure that they were not cognitively impaired to the level at which they would be ineligible for participation. All participants who attempted to complete this survey answered these questions correctly.
To determine the minimum sample size needed to test the study hypothesis that those who are enrolled would perform better on the MPQ, we conducted an a priori power analysis using G*Power version 3.1.9.7 (Faul et al., 2007). Results indicated that the required sample size to achieve 80% power for detecting a small effect of enrollment on MPQ performance, at a significance criterion of α = .05, was N = 204 for a one-tailed independent-samples t-test. In total, 478 participants (355 enrolled in Medicare and 123 unenrolled) completed the initial survey; 11 additional participants dropped out partway through. Every participant who completed this survey answered all of its questions. The survey remained open until we received at least 120 responses from each cohort (enrolled and unenrolled), with the goal of having at least 102 participants from each cohort complete the main survey. A total of 119 unenrolled and 191 enrolled participants who completed the eligibility survey were assessed for eligibility based on their reported age and year of birth. After removing participants below the 62-year age limit, we invited 184 enrolled and 117 unenrolled participants to complete the second survey.
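This calculation can be reproduced outside of G*Power; below is a minimal sketch in Python using statsmodels, assuming an effect size of d = 0.35 (the text reports only “a small effect,” so this value is illustrative, chosen to reproduce N = 204):

```python
# A priori power analysis for a one-tailed independent-samples t-test,
# mirroring the G*Power settings reported above. The effect size is an
# assumption; G*Power and statsmodels use the same noncentral-t machinery.
import math
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.35,      # assumed Cohen's d ("small effect")
    alpha=0.05,            # significance criterion
    power=0.80,            # desired power
    alternative="larger",  # one-tailed: enrolled > unenrolled
)
print(math.ceil(n_per_group))      # 102 per group
print(2 * math.ceil(n_per_group))  # 204 total
```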
Administration of Medicare Proficiency Questionnaire
From February 20 to February 26, 2024, 130 eligible participants enrolled in Medicare and 105 eligible unenrolled participants completed the main survey and were compensated at a rate of $10 an hour. One unenrolled participant who began the survey did not complete it; every enrolled participant who began the survey completed it. All participants who completed the survey answered every question. Like the eligibility survey, this survey was administered using Qualtrics and included an information sheet, in place of an informed consent form, which described the general format of the survey and its purpose. Following this information sheet, participants answered questions about their age, gender, and education level. After this demographics section, participants who had indicated on the eligibility survey that they were enrolled in Medicare were asked whether they were currently enrolled in a “Medicare Advantage plan (also known as Medicare’s managed care plan)” or a “private supplemental insurance plan.” These questions were included because these data were reported by Bann and McCormack (2005), though Medicare plan structures have changed since that time, making comparisons difficult.
Following these sections, participants were given instructions for completing the MPQ, which asked them to answer questions to the best of their ability and not to consult outside sources such as Google. Participants were also asked to answer “I don’t know” if they were not fairly confident that they knew the answer. Following the MPQ, participants were asked whether they had ever consulted the official “Medicare and You” handbook or visited Medicare.gov. We also asked participants to self-report their knowledge of Medicare (“How much would you say you know about Medicare?”) using a five-point scale ranging from “none at all” to “a great deal.” However, because this question followed the objective test of knowledge, it was likely affected by perception of performance on the quiz. We administered two multiple-choice attention check questions: one immediately before the 15-question MPQ, which asked participants whether they strongly agreed, agreed, disagreed, or strongly disagreed with the statement “I currently live on Mars,” and one at the end of the survey, which asked the same about the statement “I was born on planet Earth.” Participants were considered to pass the attention checks if they selected “disagree” or “strongly disagree” for the first statement and “agree” or “strongly agree” for the second. Of the 130 enrolled participants who completed this survey, we excluded data from five because they failed one or more of the two attention check questions. All 105 unenrolled participants who completed the survey passed the attention checks. This left data from 125 enrolled participants and 105 unenrolled participants for analysis.
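For concreteness, scoring a participant’s responses into correct, incorrect, and “I don’t know” counts can be sketched as follows; the answer key follows Table 1, while the response coding and function name are our own illustration:

```python
# Tally correct, incorrect, and "I don't know" responses for one participant.
# ANSWER_KEY follows Table 1; the 'dont_know' coding is illustrative.
ANSWER_KEY = {
    1: False, 2: True, 3: True, 4: False, 5: True,
    6: False, 7: False, 8: False, 9: False, 10: True,
    11: True, 12: False, 13: True, 14: True, 15: True,
}

def score_mpq(responses: dict) -> dict:
    """`responses` maps question number to 'true', 'false', or 'dont_know'."""
    counts = {"correct": 0, "incorrect": 0, "dont_know": 0}
    for question, key in ANSWER_KEY.items():
        answer = responses[question]
        if answer == "dont_know":
            counts["dont_know"] += 1
        elif (answer == "true") == key:
            counts["correct"] += 1
        else:
            counts["incorrect"] += 1
    return counts
```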
Analysis Plan
Our aim was to generate a Medicare proficiency test that was quick to administer, up to date (by incorporating new items beyond those sampled in Bann and McCormack, 2005), reliable, and valid. First, we analyzed the mean time to answer the MPQ. Timing data were collected by Qualtrics, which captured the number of seconds participants spent on the MPQ. Then, using the statistical analysis software JASP (JASP Team, 2024), we analyzed the MPQ’s dimensional structure using factor analysis to assess the cohesion (factor loadings) of new questions and previously validated ones, and then its reliability (Cronbach’s alpha). Medicare knowledge is unlikely to be unidimensional, and a multidimensional structure can bias reliability estimates (Trizano-Hermosilla et al., 2021). Following Bann and McCormack (2005), we assessed validity in two ways: 1) by comparing the performance of those with higher and lower educational attainment, and 2) by comparing those with prior Medicare experience (enrollees) to those with less prior experience (the unenrolled), expecting higher test scores for those with more education and for those who were enrolled. We also provided a new validity check on how prior knowledge influences test performance: through a regression analysis, we evaluated whether self-reported use of Medicare.gov and the “Medicare and You” handbook mediated the expected effects of education and enrollment status on MPQ scores.
Results
Overall, the 15-item questionnaire took participants less than three minutes (M = 2.8, SD = 1.7) to complete. We assessed the dimensionality of the scale by conducting a principal component analysis with orthogonal varimax rotation on participants’ answers to the 15 Medicare proficiency questions. For the whole sample, five factors with eigenvalues greater than one emerged (see Table 2), indicating that the scale is not unidimensional. Factor 1, which included questions 1, 4, 8, 12, and 13, was the strongest factor identified, with an eigenvalue of 3.82 accounting for 25.5% of the variance. Questions 4 and 12 came from the “Part D” category, while questions 1, 8, and 13 each came from a different category (see Table 2). Factor 2 was composed of questions 3 and 15, both from the “Eligibility + Structure” category of the questionnaire. Table 3 compares the response distributions on the MPQ for enrolled participants in this sample with those of Bann and McCormack (2005) for shared questions.
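A minimal sketch of this dimensionality analysis, assuming item responses are scored 1 (correct) and 0 (otherwise) in a participants × 15 DataFrame (the file name and column layout are illustrative); the text above reports a varimax rotation, and changing the `rotation` argument to `"promax"` yields the oblique variant named in Table 2:

```python
# Principal component analysis of the 15 scored MPQ items.
import pandas as pd
from factor_analyzer import FactorAnalyzer

items = pd.read_csv("mpq_item_scores.csv")  # hypothetical file: 0/1 scores

fa = FactorAnalyzer(n_factors=5, rotation="varimax", method="principal")
fa.fit(items)

# Unrotated eigenvalues determine how many components exceed 1.
eigenvalues, _ = fa.get_eigenvalues()
print((eigenvalues > 1).sum())  # expected: 5

loadings = pd.DataFrame(
    fa.loadings_,
    index=items.columns,
    columns=[f"Factor {i}" for i in range(1, 6)],
)
uniqueness = pd.Series(fa.get_uniquenesses(), index=items.columns)
print(loadings.round(3), uniqueness.round(3), sep="\n")
```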
Table 2.
Oblique Promax Principal Component Analysis Component Loadings for the MPQ Performance of Participants Enrolled and Unenrolled in Medicare
| Question Number | Question Category | Factor 1 | Factor 2 | Factor 3 | Factor 4 | Factor 5 | Uniqueness | Eigenvalue |
|---|---|---|---|---|---|---|---|---|
| 1 | Medicare + Choice | .750 | | | | | .400 | |
| 4 | Part D | .623 | | | | | .532 | |
| 8 | Plan Choice | .529 | | | | | .553 | |
| 13 | Beneficiary Rights | .473 | | | | .411 | .473 | |
| 12 | Part D | .428 | | | | | .565 | 3.82 |
| 3 | Eligibility + Structure | | .883 | | | | .182 | |
| 15 | Eligibility + Structure | | .831 | | | | .243 | 1.30 |
| 11 | Medicare + Choice | | | .753 | | | .416 | |
| 14 | Plan Choice | | | .613 | | | .525 | |
| 2 | Medicare + Choice | | | .422 | | | .650 | 1.20 |
| 9 | Part D | | | | .665 | | .413 | |
| 6 | Eligibility + Structure | | | | .589 | | .555 | |
| 7 | Medigap | | | | .540 | | .436 | |
| 10 | Information and Assistance | | | | .533 | .604 | .293 | 1.02 |
| 5 | Beneficiary Rights | | | | | .719 | .411 | 1.02 |
Table 3.
Percent of MPQ Items Answered Correctly or With “I Don’t Know” for the Current Sample and Bann and McCormack’s (2005) Sample

| Question Number | Percent correct: Bann & McCormack (2005) | Percent correct: Enrolled | Percent correct: Unenrolled | Percent “I don’t know”: Bann & McCormack (2005) | Percent “I don’t know”: Enrolled | Percent “I don’t know”: Unenrolled |
|---|---|---|---|---|---|---|
| 1 | 49 | 62.4 | 39.1 | 42 | 20.8 | 42.9 |
| 2 | 40 | 26.4 | 26.7 | 51 | 52.8 | 49.5 |
| 3 | 68 | 84.0 | 64.8 | 24 | 8.8 | 27.6 |
| 4 | N/A | 64.8 | 50.5 | N/A | 23.2 | 39.1 |
| 5 | 86 | 89.6 | 85.7 | 11 | 9.6 | 11.4 |
| 6 | 40 | 71.2 | 58.1 | 31 | 16.8 | 34.3 |
| 7 | 27 | 43.2 | 29.5 | 41 | 37.6 | 44.8 |
| 8 | 50 | 90.4 | 78.1 | 37 | 7.2 | 20.0 |
| 9 | N/A | 77.6 | 58.1 | N/A | 12.8 | 28.6 |
| 10 | 48 | 58.4 | 62.9 | 42 | 32.0 | 31.4 |
| 11 | 35 | 85.6 | 71.4 | 51 | 9.6 | 19.1 |
| 12 | N/A | 49.6 | 40.0 | N/A | 32.0 | 44.8 |
| 13 | 52 | 64.8 | 60.0 | 41 | 30.4 | 32.4 |
| 14 | 70 | 88.8 | 89.5 | 27 | 6.4 | 8.6 |
| 15 | 75 | 83.2 | 58.1 | 19 | 13.6 | 36.2 |
Data, analytical methods, and materials used in this study are available to researchers for replication purposes, upon request.
Although we have shown that the MPQ is not unidimensional, we still need an estimate of its reliability as a quick-to-administer general measure. Hence, we conducted a Bayesian unidimensional reliability analysis in JASP (JASP Team, 2024), treating each of the 15 items as correct versus not correct (with incorrect and “I don’t know” responses collapsed). The questionnaire had acceptable reliability (Cronbach’s alpha = .73) across both enrolled and unenrolled participants. For enrolled participants alone, reliability was questionable (Cronbach’s alpha = .64), while for unenrolled participants it was acceptable (Cronbach’s alpha = .76).
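For reference, the classical (non-Bayesian) Cronbach’s alpha can be computed directly from the dichotomized item scores; a minimal sketch, assuming the same hypothetical 0/1 item matrix as above:

```python
# Cronbach's alpha for a (participants x items) array of 0/1 item scores.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    k = scores.shape[1]                           # number of items
    item_var_sum = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

scores = np.loadtxt("mpq_item_scores.csv", delimiter=",", skiprows=1)
print(round(cronbach_alpha(scores), 2))  # reported: .73 for the full sample
```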
Our primary hypotheses concerned the effects of enrollment status and education on MPQ accuracy. We hypothesized that participants who reported being enrolled in Medicare would perform better on MPQ questions than those who reported being unenrolled, and that those with higher educational levels would also perform better. We also assessed potential confounding variables in exploratory analyses.
An independent-samples t-test on the association between enrollment status and the number of MPQ questions answered correctly showed a significant effect of enrollment status [t(228) = −4.29, p < .001; enrolled M = 10.4, SD = 2.6; unenrolled M = 8.72, SD = 3.3; d = −0.57]. An independent-samples t-test on the number of MPQ questions answered with “I don’t know” also showed a significant effect of enrollment [t(228) = 3.62, p < .001; enrolled M = 3.1, SD = 2.7; unenrolled M = 4.7, SD = 3.9; d = 0.48]. We did not find a significant effect of enrollment on the number of questions answered incorrectly [t(228) = 0.545, p = .586; enrolled M = 1.5, SD = 1.5; unenrolled M = 1.6, SD = 1.5; d = 0.132].
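A minimal sketch of these group comparisons, assuming per-participant totals in a DataFrame with `group` and `n_correct` columns (file and column names are illustrative):

```python
# Independent-samples t-test and Cohen's d for enrolled vs. unenrolled.
import numpy as np
import pandas as pd
from scipy import stats

df = pd.read_csv("mpq_totals.csv")  # hypothetical per-participant totals
unenrolled = df.loc[df["group"] == "unenrolled", "n_correct"]
enrolled = df.loc[df["group"] == "enrolled", "n_correct"]

t, p = stats.ttest_ind(unenrolled, enrolled)  # order matches reported sign

# Cohen's d from the pooled standard deviation.
n1, n2 = len(unenrolled), len(enrolled)
pooled_sd = np.sqrt(((n1 - 1) * unenrolled.var(ddof=1)
                     + (n2 - 1) * enrolled.var(ddof=1)) / (n1 + n2 - 2))
d = (unenrolled.mean() - enrolled.mean()) / pooled_sd
print(f"t({n1 + n2 - 2}) = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```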
When using chi-squared tests to compare the answers of enrolled participants with those of unenrolled participants for each of the 15 questions, we found no significant difference between the two groups in the proportions of correct, incorrect, and “I don’t know” answers for questions 2, 5, 7, 10, 12, 13, and 14 (see Table 3). However, for questions 1, 3, 4, 6, 8, 9, and 15, chi-squared tests showed that enrolled participants answered correctly at a significantly higher rate than those who were unenrolled, and unenrolled participants answered these seven questions with “I don’t know” at a significantly higher rate than those who were enrolled. Enrolled and unenrolled participants answered questions incorrectly at equivalent rates.
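Each per-question comparison is a chi-squared test on a 2 (group) × 3 (response type) contingency table; a minimal sketch with placeholder counts (not values from Table 3):

```python
# Chi-squared test for one MPQ question; counts are placeholders.
import numpy as np
from scipy.stats import chi2_contingency

#                  correct  incorrect  "I don't know"
table = np.array([[78,      21,        26],    # enrolled (n = 125)
                  [41,      19,        45]])   # unenrolled (n = 105)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```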
The mean age of all participants was 66.7 years (SD = 4.5). As Table 4 indicates, age was somewhat confounded with enrollment status: a chi-squared test showed that enrollment status was significantly associated with age cohort, such that the majority of enrolled participants were between the ages of 65 and 75 and the majority of unenrolled participants were under 65. An ANOVA examining the relationship between age cohort (under 65, 65–75, or over 75) and MPQ answers found a significant effect of age cohort on the number of questions answered correctly [F (2, 227) = 5.98, p = .003; under 65 M = 8.9, SD = 3.3; 65–75 M = 10.1, SD = 2.8; over 75 M = 11.5, SD = 1.2; η2 = .05], on the number answered with “I don’t know” [F (2, 227) = 3.40, p = .035; under 65 M = 4.3, SD = 3.8; 65–75 M = 3.6, SD = 3.0; over 75 M = 1.8, SD = 1.1; η2 = .03], and on the number answered incorrectly or with “I don’t know” combined [F (2, 227) = 5.98, p = .003; under 65 M = 6.1, SD = 3.3; 65–75 M = 4.9, SD = 2.8; over 75 M = 3.5, SD = 1.2; η2 = .05]. Older age was associated with more correct answers and fewer unsure answers. Although this result is not consistent with typical age effects, in which older groups underperform younger ones, it may be an example of an age-knowledge effect due to greater experience in a domain, consistent with life-span increases in crystallized knowledge. For instance, representative surveys of Swedish citizens have shown that pension knowledge increases with age (Elinder et al., 2022).
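A minimal sketch of this one-way ANOVA, with eta squared computed from the between-groups and total sums of squares (again assuming the hypothetical `mpq_totals.csv` with an added `age_cohort` column):

```python
# One-way ANOVA of correct answers across age cohorts, plus eta squared.
import pandas as pd
from scipy.stats import f_oneway

df = pd.read_csv("mpq_totals.csv")  # hypothetical file name
groups = [g["n_correct"].values for _, g in df.groupby("age_cohort")]

F, p = f_oneway(*groups)

grand_mean = df["n_correct"].mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_total = ((df["n_correct"] - grand_mean) ** 2).sum()
eta_sq = ss_between / ss_total
print(f"F({len(groups) - 1}, {len(df) - len(groups)}) = {F:.2f}, "
      f"p = {p:.3f}, eta2 = {eta_sq:.2f}")
```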
Table 4.
Demographic Data by Enrollment Status
| Variable | Enrolled N | Enrolled % | Unenrolled N | Unenrolled % | Chi-square Significance |
|---|---|---|---|---|---|
| Gender | | | | | p = .508 |
| Male | 53 | 42.4 | 40 | 38.1 | |
| Female | 72 | 57.6 | 65 | 61.9 | |
| Age | | | | | p < .001 |
| Under 65 | 11 | 8.8 | 89 | 84.8 | |
| 65–75 | 104 | 83.2 | 15 | 14.3 | |
| Over 75 | 10 | 8.0 | 1 | 1.0 | |
| Education | | | | | p = .138 |
| Some High School | 16 | 12.8 | 11 | 10.5 | |
| Some College | 35 | 28.0 | 19 | 18.1 | |
| College Degree or Higher | 74 | 59.2 | 75 | 71.4 | |
| Experience Consulting Official Handbook | | | | | p < .001 |
| Yes | 88 | 70.4 | 14 | 13.3 | |
| No | 30 | 24.0 | 90 | 85.7 | |
| Unsure | 7 | 5.6 | 1 | 1.0 | |
| Experience Consulting Official Website | | | | | p < .001 |
| Yes | 104 | 83.2 | 36 | 34.3 | |
| No | 17 | 13.6 | 67 | 63.8 | |
| Unsure | 4 | 3.2 | 2 | 1.9 | |
Most of our participants (64.8%) reported having a college degree or higher, 23.5% reported some college education, and 11.7% reported some high school education. A chi-squared test on the proportion of participants at each educational level showed no association between enrollment status and education [X2 (2, N = 230) = 3.96, p = .138]. As predicted, ANOVA tests revealed a significant effect of education on the number of questions answered correctly by all participants [F (2, 227) = 3.06, p = .049; some high school M = 9.0, SD = 3.2; some college M = 9.0, SD = 3.4; college degree or higher M = 10.0, SD = 2.8; η2 = .03], as well as on the number of questions answered incorrectly or with “I don’t know” [F (2, 227) = 3.06, p = .049; some high school M = 6.0, SD = 3.2; some college M = 6.0, SD = 3.4; college degree or higher M = 5.0, SD = 2.9; η2 = .03], such that completing a college degree was associated with answering more questions correctly.
Most enrolled participants reported that they had consulted the “Medicare and You” handbook (70.4%) and the Medicare.gov website (83.2%), whereas most unenrolled participants reported that they had consulted neither the handbook (85.7% had not) nor the website (63.8% had not) (see Table 4). There was a significant effect of enrollment status on consulting the “Medicare and You” handbook [X2 (2, N = 230) = 87.11, p < .001], as well as the Medicare.gov website [X2 (2, N = 230) = 62.19, p < .001].
Chi-squared tests on the proportion of women and men in our sample between enrolled and unenrolled cohorts showed no association of enrollment status to gender [X2 (1, N = 230) = 0.44, p = .508].
To assess whether use of Medicare information sources mediated the effects of enrollment status and education, we ran a regression analysis predicting the number of questions answered correctly from enrollment status, age, gender, education, and exposure to the “Medicare and You” handbook and Medicare.gov website (a combined information-source variable), treating education as a continuous variable. Together, these variables had a significant effect on the number of questions answered correctly [F (5, 224) = 8.38, p < .001]. As seen in Table 5, educational level and combined information-source exposure were the unique predictors, with enrollment status no longer significant.
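A minimal sketch of this regression, assuming the predictors live in the same hypothetical DataFrame (categorical columns are dummy-coded automatically by the formula interface; all names are illustrative):

```python
# OLS regression of MPQ correct answers on enrollment status, age, gender,
# education, and combined information-source exposure.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mpq_totals.csv")  # hypothetical file name
model = smf.ols(
    "n_correct ~ enrolled + age + gender + education + info_source",
    data=df,
).fit()
print(model.summary())     # unstandardized coefficients, t- and p-values
print(model.rsquared_adj)  # reported: adjusted R2 = .139
```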
Table 5.
Regression Analysis for The Impact of Demographic Variables on MPQ performance
| Variable | Unstandardized | Standard Error | Standardized | T-Value | P-Value |
|---|---|---|---|---|---|
| Intercept | 4.03 | 3.52 | N/A | 1.14 | 0.255 |
| Enrollment Status | 0.49 | 0.57 | N/A | 0.86 | 0.389 |
| Age | 0.04 | 0.05 | 0.06 | 0.70 | 0.482 |
| Gender | 0.08 | 0.38 | N/A | 0.21 | 0.832 |
| Education | 0.67 | 0.27 | 0.15 | 2.47 | 0.014 |
| Combined information source | 1.00 | 0.28 | 0.28 | 3.62 | < .001 |
The regression was significant [F (5, 224) = 8.38, p < .001], adjusted R2 = .139.
We compared the results of our participants enrolled in Medicare on the 12 questions in our questionnaire taken from the 2003 MCBS to the performance of Bann and McCormack’s (2005) sample on the same questions (see Table 3, Table S1). Our enrolled participants performed significantly better overall across the 12 questions [X2 (2, N = 17304) = 436.57, p < .001], answering 70.7% of the questions correctly, 8.9% incorrectly, and 20.5% with “I don’t know,” compared to 53.3%, 11.9%, and 34.8%, respectively, in Bann and McCormack’s (2005) sample. Our enrolled participants differed slightly from the enrolled participants in Bann and McCormack’s (2005) sample in age and education: Bann and McCormack’s (2005) sample was significantly older and significantly less likely to hold a college degree. Although speculative, the better performance of current older adult cohorts is consistent with a time-of-measurement effect, the “Flynn effect” (Fox & Mitchum, 2014), in which more recent cohorts outperform earlier same-age cohorts. However, other effects of history, or differences in sampling or administration of the survey, may have played a role as well. For further information about our sample, and a comparison of our enrolled participants to the Bann and McCormack sample, see Tables S1, S2, and S3.
Discussion
Our results indicate that a short, 15-item Medicare knowledge questionnaire, the MPQ, shows adequate reliability and validity. We hypothesized that enrolled participants, given their experience with Medicare, would perform better overall on the 15 MPQ questions than those who reported being unenrolled, and that those reporting a higher education level would answer more MPQ questions correctly than those reporting a lower one. Both hypotheses were supported, though the effect of enrollment status was mediated by exposure to the “Medicare and You” handbook and the Medicare.gov website.
When comparing the MPQ results of our enrolled participants to those of Bann and McCormack’s (2005) sample for shared questions, we found that, overall, our enrolled participants answered most questions more accurately. However, for the questions whose distribution of answer types differed significantly across the two samples, participants in Bann and McCormack’s (2005) sample chose the “I don’t know” option more frequently than those in our enrolled sample.
Because MCBSs are conducted over the phone or face to face with a Medicare representative, it is possible that those participants felt less confident in their abilities than the Prolific sample and therefore chose the “I don’t know” answer more frequently. It may also be that our sample, surveyed 20 years after Bann and McCormack’s (2005), had greater knowledge of the material covered in these questions because of exposure to online information. Because our sample was not a representative one, it is also possible that mean performance differences are attributable to our recruiting a more elite, internet-using sample.
By adding questions that tap knowledge of new aspects of Medicare, our instrument advances the measurement of Medicare knowledge. The results of the principal component analysis, which revealed five factors, indicate that Medicare knowledge may develop rather unevenly. Thus, the MPQ is most appropriate for a quick assessment of general knowledge.
There were several limitations to this study. Our sample was significantly younger than that used by Bann and McCormack (2005) across both enrolled and unenrolled cohorts. The majority (83.2%) of our enrolled participants fell into the 65–75 cohort, whereas the most populous cohort in Bann and McCormack’s (2005) sample was the older-than-75 cohort (45.3% of respondents), followed by the 65-to-75 cohort (40.4%). As of 2021, 49% of Medicare beneficiaries were between ages 65 and 74, and 35% were 75 or older. Our sample therefore skews younger than recent estimates of beneficiary age, which may bias it toward more recent, hence more easily accessible, knowledge of Medicare among enrollees than was the case in the 2005 sample.
Additionally, because the question asking participants to self-report their level of Medicare knowledge followed the 15-item MPQ, participants most likely referenced their experience of taking the MPQ when answering it. Asked at this point in the protocol, the question may therefore be a biased indicator of participants’ assessments of their own Medicare knowledge outside the context of the MPQ. Future studies should ask participants to self-report their Medicare knowledge before taking the MPQ, in order to assess how well calibrated they are to their actual knowledge.
By limiting the MPQ to a total of 15 questions across seven categories (“Eligibility for and Structure of Original Medicare,” “Medicare + Choice,” “Plan Choices and Health Plan Decision-Making,” “Information and Assistance,” “Beneficiary Rights,” “Medigap/Employer-Sponsored Supplemental Insurance,” and “Part D”), we created a questionnaire that can be completed in under three minutes on average. However, because there are only one to three questions per category, the questionnaire would need to be supplemented by a more detailed one to evaluate knowledge of a specific subcategory of Medicare.
In summary, for those in need of a quick assessment of Medicare knowledge, the MPQ is a reliable and valid measure. It can supplement or replace self-reports of knowledge. We believe that it can be helpful to Medicare navigators (SHIP program) and Medicare administrators who wish to provide tailored guidance based on assessed knowledge levels of beneficiaries and new enrollees. The MPQ may also be useful to older adults who want to self-test their knowledge before making Medicare decisions. Researchers could use the MPQ as a predictor or mediator of Medicare decision-making processes. Finally, the MPQ can serve as a short benchmark test to assess comparative accuracy and stability of AI tools (Langston et al., 2024).
Supplementary Material
What this paper adds
Provides an updated, short, reliable, and valid quiz tapping Medicare knowledge for older people enrolled and unenrolled in Medicare.
Demonstrates that the better performance on the quiz by Medicare enrollees is mediated by self-reported access to the Medicare website and the Medicare beneficiary handbook.
Applications of study findings
Introduces a short questionnaire which can be used by Medicare advisors to quickly assess Medicare beneficiary knowledge, by Medicare beneficiaries to assess their own knowledge, and by researchers who need to account for the role of general Medicare knowledge when predicting Medicare task performance. It can also serve as a benchmark test for assessing the comparative accuracy and stability of AI tools.
Funding
This work was supported in part by a grant from the National Institute on Aging, under the auspices of the Center for Research and Education on Aging and Technology Enhancement [1P01AG073090].
Footnotes
Declaration of Conflicting Interests
The authors declare that there is no conflict of interest.
Institutional Review Board Approval
This study, STUDY00004568, was approved as exempt by the Florida State University Institutional Review Board.
References
- Ankuda CK, Moreno J, McKendrick K, & Aldridge MD (2020). Trends in older adults’ knowledge of Medicare Advantage benefits, 2010 to 2016. Journal of the American Geriatrics Society, 68(10), 2343–2347. https://doi.org/10.1111/jgs.16656
- Bann CM, & McCormack L (2005). Measuring knowledge and health literacy among Medicare beneficiaries. Research Triangle Park, NC: RTI International. https://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/Reports/Research-Reports-Items/CMS062191
- Berkman ND, Sheridan SL, Donahue KE, Halpern DJ, & Crotty K (2011). Low health literacy and health outcomes: An updated systematic review. Annals of Internal Medicine, 155(2), 97–107. https://doi.org/10.7326/0003-4819-155-2-201107190-00005
- Castillo A, Rivera-Hernandez M, & Moody KA (2023). A digital divide in the COVID-19 pandemic: Information exchange among older Medicare beneficiaries and stakeholders during the COVID-19 pandemic. BMC Geriatrics, 23(1). https://doi.org/10.1186/s12877-022-03674-4
- Centers for Medicare & Medicaid Services. (n.d.-a). History. CMS.gov. https://www.cms.gov/about-cms/who-weare/history#:~:text=The%20MMA%20also%20expanded%20Medicare,went%20into%20effect%20in%202006
- Centers for Medicare & Medicaid Services Data. (n.d.-b). Medicare monthly enrollment. Data.CMS.gov. https://data.cms.gov/summary-statistics-on-beneficiary-enrollment/medicare-and-medicaid-reports/medicare-monthly-enrollment
- Elinder M, Hagen J, Nordin M, & Säve-Söderbergh J (2022). Who lacks pension knowledge, why and does it matter? Evidence from Swedish retirement savers. Public Finance Review, 50(4), 379–435. https://doi.org/10.1177/10911421221109061
- Faul F, Erdfelder E, Lang A-G, & Buchner A (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175–191.
- Fox MC, & Mitchum AL (2014). Confirming the cognition of rising scores: Fox and Mitchum (2013) predicts violations of measurement invariance in series completion between age-matched cohorts. PLoS ONE, 9(5), e95780. https://doi.org/10.1371/journal.pone.0095780
- JASP Team (2024). JASP (Version 0.18.3) [Computer software]. https://jasp-stats.org/
- Langston E, Charness N, & Boot W (2024). Are virtual assistants trustworthy for Medicare information? An examination of accuracy and reliability. The Gerontologist, 64(8). https://doi.org/10.1093/geront/gnae062
- Sivakumar H, Hanoch Y, Barnes AJ, & Federman AD (2016). Cognition, health literacy, and actual and perceived Medicare knowledge among inner-city Medicare beneficiaries. Journal of Health Communication, 21(sup2), 155–163. https://doi.org/10.1080/10810730.2016.1193921
- Trizano-Hermosilla I, Gálvez-Nieto JL, Alvarado JM, Saiz JL, & Salvo-Garrido S (2021). Reliability estimation in multidimensional scales: Comparing the bias of six estimators in measures with a bifactor structure. Frontiers in Psychology, 12. https://doi.org/10.3389/fpsyg.2021.508287