Abstract
BACKGROUND
Economics and reimbursement have become a daily part of practicing physicians' lives. Yet, few internal medicine (IM) programs have offered formal curricula during residency about practice management or economics.
OBJECTIVE
To determine perceived, desired, and actual knowledge of Medicare billing and reimbursement among residents compared with community-based general internists.
DESIGN AND PARTICIPANTS
Cross-sectional needs assessment survey of community and university-based second-year IM residents from 4 geographic regions of the United States.
RESULTS
One hundred and thirty-three second-year IM residents completed the questionnaire. Residents rated their level of knowledge about Medicare as 2.0 (SD = 0.9) on a Likert scale (1 = “very low,” 5 = “very high”). Residents agreed that Medicare reimbursement should be taught in residency, with a score of 4.0 (SD = 1.1; 1 = “strongly disagree,” 5 = “strongly agree”). On the knowledge assessment portion of the questionnaire, residents scored significantly lower than a group of general IM physicians who completed the same questions (percent correct = 41.8% vs 59.0%, P<.001). Residents' scores correlated with their self-assessed level of knowledge (P = .007).
CONCLUSIONS
Our study demonstrates that second-year IM residents feel they have a low level of knowledge regarding outpatient Medicare billing, and their test scores, which were lower than those of practicing internists, support this perception. The residents also strongly agree that they do not receive enough education about Medicare reimbursement, and believe it should be a required part of residency training.
Keywords: internship and residency, curriculum, medicare, needs assessment
Economics and reimbursement have become a daily part of practicing physicians' lives. In 2002, the Accreditation Council for Graduate Medical Education (ACGME) implemented 6 general competencies for residencies to establish and evaluate within their programs. The systems-based practice and practice-based learning competencies have often been interpreted as requiring education about practice management.1 Yet, the ACGME has not required, and few internal medicine (IM) programs have offered, formal curricula during residency about practice management or economics.2,3 The few residents who have participated in training courses on managed care have found them very satisfactory.4
Congress established Medicare in 1965; the program now provides health coverage to 38.4 million beneficiaries who are elderly, disabled, or have end-stage renal disease. Medicare pays hospitals to finance the salaries of resident physicians who serve these patients.5 At the end of residency, these same physicians must be competent to navigate Medicare's system of reimbursement.
This study is a cross-sectional needs assessment survey of the knowledge and attitudes of IM residents about Medicare billing and reimbursement. It was designed to obtain a sample of residents representative of different residency program structures and regions. We hoped to determine (1) whether residents believed training in Medicare billing and reimbursement was needed, (2) how often Medicare billing and reimbursement was taught, and (3) the baseline knowledge of Medicare billing and reimbursement among residents compared with community-based general internists.
METHODOLOGY
Study Participants
In the fall of 2004, we obtained a list of all categorical IM residency programs listed in the ACGME Directory.6 The 388 IM residency programs were divided into 2 groups, community-based and university-based, using the American College of Physicians website.7 Programs in each group were then divided into the 4 geographic regions of the United States, for a total of 8 groups of residencies. We chose 8 programs from each group (64 total programs) by using a random number generator.
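To make the stratified draw concrete, a minimal sketch in Python follows. It is an illustrative reconstruction rather than the authors' actual procedure: the program list, the even stratum sizes, and the seed are hypothetical placeholders.

```python
import random

random.seed(2004)  # arbitrary seed so the illustration is reproducible

REGIONS = ("Northeast", "South", "Midwest", "West")
TYPES = ("community", "university")

# Placeholder directory: the 388 programs would be classified by type
# and region; roughly even strata are assumed here for simplicity.
programs = [
    {"name": f"Program {i}", "type": TYPES[i % 2], "region": REGIONS[i % 4]}
    for i in range(388)
]

# Draw 8 programs at random from each of the 8 type-by-region strata.
selected = []
for prog_type in TYPES:
    for region in REGIONS:
        stratum = [p for p in programs
                   if p["type"] == prog_type and p["region"] == region]
        selected.extend(random.sample(stratum, 8))

assert len(selected) == 64  # 8 programs x 8 strata
```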
We contacted the chief resident and/or program administrator at each program to establish a site coordinator who would be willing to distribute our survey to 10 postgraduate year 2 (PGY2) IM residents and collect the completed questionnaires. PGY2 residents were chosen because they had been at their programs long enough to have been exposed to medical billing. Twenty of the 64 programs agreed to participate.
As we were unable to establish a site coordinator at the remaining 44 residencies, we randomly selected replacement residencies of the same type from within the same region. Twenty-two residency programs were approached in this fashion. Six agreed to participate, increasing our total number of participating programs to 26. Because some regions did not have more than 8 community- or university-based programs, no further programs were approached, in order to maintain our stratification.
Questionnaire Development
We selected topics presented during our own residency curriculum on the business of medicine. We also reviewed the CMS website8 and a book on Medicare reimbursement.9 The questionnaire contained 27 questions and was 4 pages long. The first section gathered demographics about the respondents, including age, business background, and future plans. We also asked whether the resident was a participant in a primary care track within their residency. The next section asked the respondent to self-evaluate his or her own level of knowledge about outpatient Medicare billing. The final section evaluated the respondent's actual level of knowledge about outpatient Medicare billing, with specific questions about the types of services Medicare pays for, billing regulations, and sample clinic notes to be coded appropriately. We termed this portion of the questionnaire the “Medicare Billing and Reimbursement Test.” We reviewed each question with a doctorate-level education specialist and a Medicare billing and coding specialist to ensure face and content validity.
The questionnaire was given to a pilot group of 12 third-year residents from our own institution. The average time for the respondents to complete the survey was 5 minutes. The residents reported that the questions were easy to understand, and the instructions were clear. They verbally stated that they felt their level of knowledge was very low. Feedback from the pilot test was used to increase the clarity of the instrument.
We invited a group of 22 practicing general internists to serve as our reference group. These physicians worked in private practices and in resident clinics at a tertiary care urban hospital. They were sent the surveys with return envelopes. Thirteen completed the questionnaire (59%).
We sent the resident questionnaires to a contact person at the residency programs by mail with a return envelope enclosed. An enclosed letter specified how to administer the survey. When the questionnaires were returned, we sent a 5-dollar bookstore gift card to the contact person. We made reminder telephone calls and sent e-mails to our contacts and to the program directors to increase the response rate.
Validation and Reliability of the Knowledge Test
We calculated the classical probability index and difficulty index for each question. We used the responses of the practicing internists as a reference against which to compare the residents' responses. Twenty-two questions were determined to be valid and useful by this method.
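As an illustration of this kind of classical item analysis, the sketch below computes a difficulty index (proportion answering correctly) and a point-biserial discrimination index for each question. The response matrix is synthetic, and the specific indices are standard textbook choices assumed here, since the paper does not give its formulas.

```python
import numpy as np

# Synthetic stand-in for the answer data: rows = examinees,
# columns = items; 1 = correct, 0 = incorrect.
rng = np.random.default_rng(0)
responses = rng.integers(0, 2, size=(133, 22))

# Difficulty index: proportion of examinees answering each item correctly.
difficulty = responses.mean(axis=0)

# Point-biserial discrimination: correlation of each item with the
# total score (higher values mean the item separates strong from weak).
total = responses.sum(axis=1)
discrimination = np.array([
    np.corrcoef(responses[:, j], total)[0, 1]
    for j in range(responses.shape[1])
])
```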
In order to further determine the discriminant ability of each question, we needed a standard way of grouping participants according to proficiency on the test. Given that the distribution of the “Medicare Billing and Reimbursement Test” percent-correct variable closely approximated a normal curve, we used the classic bell-curve standard deviation method of grading to divide the participants into common grade categories: A (more than 1.5 SD above the mean), B (0.5 to 1.5 SD above the mean), C (within 0.5 SD of the mean), D (0.5 to 1.5 SD below the mean), and F (more than 1.5 SD below the mean).10 We then used discriminant analysis to determine which combination of questions best divided the subjects into these 5 groups. All but 4 questions contributed to a model with good ability to discriminate between the 5 groups. Based upon the methodologies presented above, we included 18 questions in the final “percent correct” calculation.
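The grade banding is simple to reproduce. The sketch below assigns the 5 grade categories from z-scores; the scores are synthetic, and the subsequent discriminant analysis could be run with a routine such as scikit-learn's LinearDiscriminantAnalysis, though the authors do not name their software.

```python
import numpy as np

# Synthetic MBRT percent-correct scores standing in for the real data.
rng = np.random.default_rng(1)
percent_correct = rng.normal(41.8, 15, size=133).clip(0, 100)

mean = percent_correct.mean()
sd = percent_correct.std(ddof=1)
z = (percent_correct - mean) / sd

# Classic bell-curve grade bands: conditions are checked in order, so
# each score falls into the first band whose threshold it exceeds.
grades = np.select(
    [z > 1.5, z > 0.5, z > -0.5, z > -1.5],
    ["A", "B", "C", "D"],
    default="F",
)
```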
We used factor analysis to see whether any of the topic areas grouped together under 1 reliable domain, indicating that they measured the same knowledge element and therefore should be weighted appropriately. We believed this was particularly important because topic areas were often grouped under the same stem of a question. We used Varimax and Promax factor loadings to determine into which factor individual questions would fall. Factor analysis revealed 9 factors from the 18 questions, only 1 of which had face validity, had previously been grouped under the same stem on the questionnaire, and had a Cronbach's α indicating a reliable factor (α = 0.6). This factor is indicated in Table 2. The 4 questions from this factor were averaged together and weighted as only 1 point toward the total “percent correct” calculation.
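Cronbach's α itself has a closed form, so the reliability check for the 4-item factor can be sketched directly; the data below are synthetic placeholders.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (examinees x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# e.g., scores on the 4 questions that loaded on the single reliable
# factor; the paper reports alpha = 0.6 for that factor.
rng = np.random.default_rng(2)
four_items = rng.integers(0, 2, size=(133, 4)).astype(float)
print(round(cronbach_alpha(four_items), 2))
```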
Table 2. Performance of PGY2 Residents and Internists on the Medicare Billing and Reimbursement Test

| Subject Area | PGY2 Residents (n = 133), % correct | Internists (n = 13), % correct | P value |
| --- | --- | --- | --- |
| Determining whether patient qualifies for Medicare: age | 44 | 62 | NS |
| Determining whether patient qualifies for Medicare: quadriplegia | 53 | 15† | .009 |
| Determining whether patient qualifies for Medicare: mental retardation | 51 | 15† | .01 |
| Knowing number of review of systems needed for level 5 visit | 41 | 85 | .003 |
| Knowing the definition of established patient | 11 | 69 | <.001 |
| Knowing whether Medicare covers prescription medications | 32 | 77 | .001 |
| Knowing whether Medicare Part A covers physician fees | 32 | 62 | .04 |
| Knowing whether Medicare Part A covers hospital days | 70 | 77 | NS |
| Knowing whether Medicare Part A covers medications | 62 | 69 | NS |
| Knowing whether Medicare Part A covers ambulance services | 68 | 46† | NS |
| Knowing whether Medicare Part A covers hospice services | 25 | 39 | NS |
| Knowing whether Medicare Part B covers:* |  |  |  |
| Durable medical equipment | 40 | 77 | .01 |
| Inpatient hospital services | 50 | 69 | NS |
| Inpatient skilled nursing services | 40 | 54 | NS |
| Home health services | 49 | 69 | NS |
| Knowing history elements needed for outpatient level 3 visit | 34 | 46 | NS |
| Correctly coding a patient SOAP note with 45 minutes spent counseling | 17 | 77 | <.001 |
| Recognizing billing codes for a new patient clinic visit | 34 | 54 | NS |
| Additional elements not included in the MBRT total score |  |  |  |
| Determining whether patient qualifies for Medicare: ESRD | 67 | 62 | NS |
| Determining whether patient qualifies for Medicare: history of MI | 93 | 54 | <.001 |
| Knowing whether Medicare Part B covers hospice services | 47 | 39 | NS |
| Knowing review of systems needed for outpatient level 2 visit | 5 | 31 | .001 |
| Group mean total score (SD) |  |  |  |
| MBRT total score | 41.8 (15) | 59.0 (21) | <.001 |

*This group of questions was treated as 1 knowledge element based upon factor analysis and reliability testing; these 4 elements counted as only 1 point on the test.

†On these 3 questions, the reference group performed worse than the residents; the questions nonetheless had good discriminant ability and were included in the total score.

NS, not significant; PGY2, postgraduate year 2; MI, myocardial infarction; ESRD, end-stage renal disease; MBRT, Medicare Billing and Reimbursement Test
Statistical Analysis
For the continuous variables, we examined frequency distributions and descriptive statistics (means, standard deviations, frequencies, etc.) for evidence of skewness, outliers, and nonnormality. Categorical variables were recoded when certain categories were sparse. Bivariate analyses included t tests, ANOVA, and Pearson's r, as appropriate to the level of measurement.
We used stepwise multivariable regression analysis to determine which variables were independent predictors of the score on the Medicare Billing and Reimbursement Test (MBRT). We began the procedure with variables that were significant in bivariate analysis and variables that were of particular interest. The F test was used to determine whether a variable should be included, with an entry criterion of P = .05 and an exclusion criterion of P = .10. Because the 133 residents in this study came from 26 different programs, we also fit a mixed effects regression model with a random program effect to accommodate potential clustering of residents within programs; this model confirmed that program membership did not affect our outcome.
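A minimal sketch of such a mixed effects model follows, using statsmodels' mixedlm, which fits a linear mixed model with a random intercept per group. The column names and data are hypothetical stand-ins for the survey variables described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey data: one row per resident, with
# hypothetical column names for the predictors described above.
rng = np.random.default_rng(3)
n = 133
df = pd.DataFrame({
    "mbrt_score": rng.normal(41.8, 15, n),
    "primary_care_track": rng.integers(0, 2, n),
    "hours_training": rng.integers(0, 4, n),
    "knowledge_vs_students": rng.integers(1, 6, n),
    "program_id": rng.integers(0, 26, n),  # 26 residency programs
})

# A random intercept per program accommodates clustering of residents
# within residency programs.
model = smf.mixedlm(
    "mbrt_score ~ primary_care_track + hours_training + knowledge_vs_students",
    data=df,
    groups=df["program_id"],
)
print(model.fit().summary())
```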
Sample Size
We hoped 64 residency programs would partner with us to survey 10 residents each. Assuming that only 50% of the residents would respond, we expected around 300 resident respondents, enough for 80% power to detect a 5% difference on the MBRT between 2 equal groups if the SD were about 15%. When we found that only 26 programs were willing to participate, we reran our power analysis to see what effect the smaller sample would have on our ability to detect differences. With a 50% response rate from these 26 programs, we would have 80% power to detect an 8% difference on the MBRT between 2 equal groups. Based upon the performance of our pilot group and our internist reference group, we believed this lower sample size would be sufficient for our study. The Saint Luke's Hospital Institutional Review Board approved the study.
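Both power calculations can be checked with a standard two-sample t-test power routine; the sketch below uses statsmodels and reproduces both figures under the stated SD assumption.

```python
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower()
sd = 15.0  # assumed SD of MBRT percent-correct scores

# Original plan: sample size per group to detect a 5-point difference
# (Cohen's d = 5/15) with 80% power at alpha = .05.
n_per_group = power.solve_power(effect_size=5 / sd, power=0.80, alpha=0.05)
print(n_per_group)  # about 142 per group, i.e. roughly 300 in total

# Revised plan: with about 65 respondents per group, what difference
# is detectable at the same power?
d = power.solve_power(nobs1=65, power=0.80, alpha=0.05)
print(d * sd)  # about 7.4 points, consistent with the ~8% figure above
```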
RESULTS
Characteristics of Responding Residents
Twenty-six programs agreed to participate in our study. The participating programs had a mean size of 65.5 residents (range, 29 to 176). From these programs, 133 second-year IM residents completed the questionnaire (response rate = 51%). The pertinent characteristics of the respondents are summarized in Table 1. The sample included a smaller proportion of women than IM residencies nationally (35% vs 42%).11 On average, 5.0 residents (range, 1 to 10) participated from each program. Fewer residents responded from the Northeast and the West. The average age of the responding residents was 29.4 years (range, 24 to 50). Eighty-six percent (115/133) of the residents completing the questionnaire reported no prior business experience or training.
Table 1. Characteristics of Responding Residents

| Characteristic | No. | % |
| --- | --- | --- |
| Female | 47 | 35.3 |
| Community-based program | 54 | 40.6 |
| University-based program | 79 | 59.4 |
| Region |  |  |
| West | 26 | 19.5 |
| Northeast | 22 | 16.5 |
| South | 40 | 30.1 |
| Midwest | 45 | 33.8 |
| Primary care track resident | 34 | 25.8 |
| Categorical resident | 98 | 74.2 |
Thirty-six (27%) respondents planned to enter practice as a general internist immediately after residency, while the majority (66%) planned to pursue fellowship training. Nearly half (48%) planned to work in private practice after training, while 38% anticipated working in a university-based practice.
Residents' Perceived Level of Knowledge about Medicare Reimbursement
We asked respondents to self-assess their level of knowledge about Medicare reimbursement using a Likert scale from 1 (“very low”) to 5 (“very high”). Overall, responding residents reported a mean level of 2.0 (SD = 0.9). We used a similar scale from 1 (“much less than”) to 5 (“much greater than”) to ask how they compared themselves with other groups. They believed they had about the same level of knowledge as medical students (2.9, SD = 1.0) and interns (3.0, SD = 0.7). When comparing themselves with attending physicians, they reported feeling less knowledgeable (1.6, SD = 0.7).
Fifty-six (42.1%) of the 133 residents reported receiving “0 hours a month” of Medicare reimbursement training, while 69 (51.9%) reported “0 to 3 hours a month.” We used a Likert scale from 1 (“strongly disagree”) to 5 (“strongly agree”) to ask whether residents believed they received enough education and whether the topic should be taught. Respondents disagreed with the statement that they receive enough education about Medicare reimbursement (1.9, SD = 0.8). They agreed that Medicare reimbursement should be taught in residency (4.0, SD = 1.1).
Actual Level of Knowledge about Medicare Reimbursement of Residents
Eighteen test questions were used to assess residents' knowledge on the Medicare Billing and Reimbursement Test (MBRT; see Table 2). As expected, residents scored significantly lower than the group of practicing general IM physicians who served as a reference group (41.8% vs 59.0%, P<.001). The residents' score correlated significantly with their self-assessed level of knowledge (P = .007), and even more strongly with their self-assessed level of knowledge in comparison with medical students and interns (P = .002).
Comparisons Between Subgroups of Residents
Selected subgroup comparisons were performed to determine whether responses differed according to specific characteristics of the residents and their programs. When we compared self-assessed level of knowledge and the percent correct on the MBRT between residents of community and university programs, and between residents from different regions of the country, there were no significant differences.
Primary care track residents scored significantly lower on the MBRT than did categorical residents (34% vs 44%, P<.001). The results were unchanged when we limited the analysis to only those 16 residencies that had both primary care track and categorical residents (no residency was represented by only primary care track residents). However, the perceived level of knowledge as well as the self-reported amount of education received on this topic was not significantly different between these 2 groups.
No correlation was found between the size of the residency program and residents' percent correct on the MBRT. Similarly, there was no association between postresidency training plans (start practice vs fellowship) or preferred postresidency practice setting (private practice vs university-based) and the MBRT score.
A significant negative correlation (r = −.211, P = .016) was found between program size and agreement that Medicare reimbursement should be taught: residents from smaller programs felt more strongly that there should be more education in this area.
Stepwise multivariable linear regression modeling was used to determine if any of the respondent and program characteristics were independently associated with the MBRT score. Three variables combined for a significant model: being a primary care track resident (negatively associated, P<.001), having some education per month on Medicare reimbursement (P = .047), and the perceived level of knowledge about Medicare reimbursement compared with medical students (P = .070; Adjusted R2 for the model = .14, df = 3, F = 8.087, P<.001).
DISCUSSION
Internal medicine residents in our study strongly agree that they do not receive enough education about Medicare reimbursement, and believe it should be a requirement in residency training. We were glad to learn that a majority of the residents receive up to 3 hours per month of training in Medicare reimbursement; some of this may have been informal teaching during rounds. In our review of the literature, at least 1 institution has reported a monthly conference dedicated to teaching these issues.12 The amount of training our respondents reported receiving correlated well with their actual knowledge on the MBRT. One curriculum evaluation demonstrated that residents who participated in a 2-week class on managed care had a significant increase in their knowledge.13 The question remains: how well prepared is a resident to bill and code appropriately once he or she has graduated from residency? Our study suggests not optimally.
Internal medicine residents' actual knowledge of Medicare billing was significantly associated with their perceived knowledge, perhaps indicating appropriate insight into their lack of preparedness. While there were no differences between regions or between university and community programs, we were very surprised that primary care track residents demonstrated a lower level of knowledge than categorical residents. Practice management training has been recommended as a staple of primary care track curricula for more than a decade.14 However, this sample of 34 residents may not be representative of primary care track residents nationwide.
Our study had some limitations. We found it exceedingly difficult to enlist the full complement of programs we set out, probably too optimistically, to recruit. Also, only 59% of the general IM practitioners we asked to complete the survey did so, despite repeated requests. On the other hand, we obtained a resident sample that is well represented by region and type of program: 35% were female, and 40.6% were from community-based programs. Overall, we believe this group of respondents is a good snapshot of second-year IM residents.
However, giving this questionnaire to second-year residents may be another weakness of our study. It is possible that residencies focus on billing and reimbursement training later in the second or third year. It is apparent, however, that the residents are interested in gaining more knowledge and concerned that they might not get enough. It would be interesting to give the questionnaire to graduating residents and fellows to see whether they score higher than the residents in our study.
We were concerned that utilizing different contact people at each residency program across the country could create nonuniform survey administration. We attempted to minimize differences in questionnaire completion by designating a person at each program to supervise the residents who participated in our survey. Sending specific instructions to each contact person promoted uniformity compared with having each resident participate alone through a mailed paper questionnaire or a web-based survey.
Another potential weakness was the use of a written test to assess resident knowledge. While direct observation of clinical billing would be the optimal method, achieving the generalizability and power needed in such a study would be difficult without extraordinary resources. The study also relied on self-assessment of knowledge about reimbursement and self-report of the amount of time spent learning about billing and reimbursement. We had no curricular documents from the residencies to verify the educational content, nor objective evaluations of the residents' knowledge from the residency offices. However, the correlation of these measures with the MBRT score supports their validity.
The fact that the practicing internist reference group averaged only 59% on the MBRT does not invalidate the test. Many standardized exams used in medical training and certification have an average percent-correct score of around 60%. The test construction methodology we used supported the validity and reliability of the test. Also, serving as a reference group does not mean that the internists were “experts” on coding; continuing education on this topic is clearly in demand for physicians such as these.
Our study demonstrates that second-year IM residents feel they have a low level of knowledge regarding outpatient Medicare billing, and their test scores, lower than those of practicing internists, bear this perception out. While graduating residents fresh from residency may know as much as or more than practicing physicians about the biomedical aspects of medicine, they are knowingly deficient in the Medicare billing and reimbursement system by which many will support their practice. Because of the increasing number of Medicare enrollees, and the consequences of Medicare fraud and abuse, it is vital that residents receive training in these concepts and skills.
Tufts Health Care Institute is one of the organizations that has already developed a curriculum which addresses education on managed care and insurance plans.15 According to our respondents, few programs provide education in this area. Over the next several years, with further implementation of ACGME's core competencies, more programs will be faced with deciding on a curriculum that works within their institutions.
Acknowledgments
We would like to express gratitude to Anthony Paolo, Director of Assessment and Evaluation in the Office of Medical Education at the University of Kansas Medical Center, and to Shellie Fortney, LPN, CPC, founder and director of Coding & Compliance Initiatives Inc. Their expertise and time helped develop our questionnaire and methods.
Contributions of authors: K.A., DO: lead author, survey development, data entry, study design, manuscript drafting, statistical analysis, editing.
M.B., MD: survey development, data entry, study design, manuscript drafting, editing.
B.W.B., MD: mentoring author, study design, survey design, data analysis, manuscript drafting, editing.
REFERENCES
- 1. Accreditation Council for Graduate Medical Education. ACGME general competency requirements. Accessed December 28, 2005. Available at: http://www.acgme.org/acWebsite/irc/irc_competencies.pdf.
- 2. Accreditation Council for Graduate Medical Education. Program requirements for residency education in internal medicine. Accessed May 23, 2005. Available at: http://www.acgme.org/acWebsite/downloads/RRC_progReq/140pr703_u704.pdf.
- 3. Council on Graduate Medical Education. Preparing learners for practice in a managed care environment. Accessed 23, 2005. Available at: http://www.cogme.gov/resource.htm.
- 4. Hakim A, Kachur E, Santilli V. A comprehensive curriculum in managed care for residents. Acad Med. 2001;76:560. doi:10.1097/00001888-200105000-00107.
- 5. Iglehart JK. Support for academic medical centers: revisiting the 1997 Balanced Budget Act. N Engl J Med. 1999;341:299–304. doi:10.1056/NEJM199907223410424.
- 6. American Medical Association. Graduate Medical Education Directory 2004–2005. American Medical Association; 2004–2005.
- 7. American College of Physicians. Residency database search. Accessed August 22, 2004. Available at: http://www.acponline.org/residency/index.html.
- 8. Centers for Medicare and Medicaid Services. Accessed December 28, 2005. Available at: http://www.cms.hhs.gov/.
- 9. American Medical Association. Medicare Resident & New Physician Guide: Helping Health Care Professionals Navigate Medicare. 7th ed. Baltimore, MD: American Medical Association; 2003.
- 10. Ward AW, Murray-Ward M. Assessment in the Classroom. Belmont, CA: Wadsworth; 1999. p. 74.
- 11. American Board of Internal Medicine. Internal medicine residency programs: percentage of third-year residents by gender and type of medical school. Accessed May 23, 2005. Available at: http://www.abim.org/resources/trainim.shtm.
- 12. Kravet SJ, Wright SM, Carrese JA. Teaching resource and information management using an innovative case-based conference. J Gen Intern Med. 2001;16:399–403. doi:10.1046/j.1525-1497.2001.016006399.x.
- 13. Callahan M, Fein O, Stocker M. Educating residents about managed care: a partnership between an academic medical center and a managed care organization. Acad Med. 2000;75:487–93. doi:10.1097/00001888-200005000-00021.
- 14. Martin GJ, Curry RH, Yarnold PR. The content of internal medicine residency training and its relevance to the practice of medicine: implications for primary care curricula. J Gen Intern Med. 1989;4:304–8. doi:10.1007/BF02597402.
- 15. Tufts Health Care Institute. Health care system overview. Accessed December 28, 2005. Available at: http://www.tmci.org/curriculum/category1.htm.