Abstract
Background
Specialty-to-specialty variation in use of outpatient evaluation and management service codes could lead to important differences in reimbursement among specialties.
Objective
To compare the complexity of visits to physicians whose incomes are largely dependent on evaluation and management services to the complexity of visits to physicians whose incomes are largely dependent on procedures.
Design, Setting, and Participants
We analyzed 53,670 established patient outpatient visits reported by physicians in the National Ambulatory Medical Care Survey (NAMCS) from 2013 to 2016. We defined high complexity visits as those with an above-average number of diagnoses (> 2) and/or medications (> 3) listed. We based our comparison on time intervals corresponding to typical outpatient evaluation and management times as defined by the Current Procedural Terminology Manual, and on specialty utilization of evaluation and management codes based on 2015 Medicare payments.
Main Outcome and Measures
Proportion of complex visits by specialty category.
Key Results
We found significant differences in the content of similar-length office visits provided by different specialties. For level 4 established outpatient visits (99214), the percentage involving high diagnostic complexity ranged from 62% for internal medicine, 52% for family medicine/general practice, and 41% for neurology (specialties whose incomes are largely dependent on evaluation and management codes), to 34% for dermatology, 42% for ophthalmology, and 25% for orthopedic surgery (specialties whose incomes are more dependent on procedure codes) (p value of the difference < 0.001). High medication complexity was found in the following proportions of visits: internal medicine 56%, family medicine/general practice 49%, and neurology 43%, as compared with dermatology 33%, ophthalmology 30%, and orthopedic surgery 30% (p value of the difference < 0.001).
Conclusion
Within the same duration visits, specialties whose incomes depend more on evaluation and management codes on average addressed more clinical issues and managed more medications than specialties whose incomes are more dependent on procedures.
Electronic supplementary material
The online version of this article (10.1007/s11606-019-05624-0) contains supplementary material, which is available to authorized users.
KEY WORDS: evaluation and management services, outpatient visit complexity, duration of visit, NAMCS
BACKGROUND
Office visits are the most common service delivered in the US health care system. The existing evaluation and management (E/M) service codes were developed from a wide range of specialty-specific service codes used before the Medicare Physician Fee Schedule (MPFS). Prior to the inception of the MPFS in 1992, many specialties each had a repertoire of service codes, each with descriptors tailored to capture the breadth, depth, and intensity of care specific to that specialty.1 The collapse of all the outpatient codes for new and established patients to a total of ten—five for new patients and five for established patients—required all specialties to choose from a narrow set of options for billing purposes. This set, now almost 30 years old, may not adequately capture the nuances and full topology of cognitive work—particularly the critical thinking involved in data gathering and analysis, planning, management, decision making, and judgment in ambiguous or uncertain situations.2, 3 The numbers of clinical items addressed, medications managed, and co-morbidities considered have all increased.4 When patients present for an office visit, the physician medical decision making (MDM)—the cognitive work—is more complex than it was 30 years ago.5–9 If the E/M definitions are not updated to reflect the increasing complexity of cognitive work for the most intense levels of E/M care, payment distortions may become exacerbated, creating income disparities among specialties and leading to geographic maldistribution and inadequate access to lower-paid specialties.10
We analyzed the National Ambulatory Medical Care Survey (NAMCS) data to determine if there are differences in the content of the services provided by different specialties over similar time intervals, which would suggest that there is significant specialty-to-specialty variability in the use of the E/M codes. NAMCS questionnaires are sent to a nationally representative sample of practicing physicians each year by the Centers for Disease Control and Prevention (CDC). The instrument asks each respondent to provide patient-level data based on individual clinical encounters. It is designed to reflect the clinical experience of practitioners in both a quantitative and a qualitative manner. Physicians record what is clinically relevant for each encounter, independent of billing (service code selection) or risk adjustment (diagnoses associated with claim submissions). Each physician is asked to record their time with each patient encounter. Thus, the NAMCS database may be the most unbiased representation of the services delivered to patients by different specialties.
MATERIALS AND METHODS
Data
We studied data from the 2013–2016 NAMCS, a nationally representative survey conducted by the National Center for Health Statistics (NCHS). The NAMCS collects data from all physicians listed by the American Medical Association (AMA) and the American Osteopathic Association (AOA) who provide office-based, direct patient care. Using a 3-stage sampling design, selected physicians are asked to complete surveys for individual patient encounters, resulting in a systematic random sample of office-based outpatient visits.11 For each sampled visit, responding physicians report patient demographic information, up to five clinical diagnoses, all medications prescribed, the time spent in face-to-face clinical care, and the diagnostic, preventive, and therapeutic services provided.
Since NAMCS physician categorization does not crosswalk readily to the Centers for Medicare & Medicaid Services’ (CMS) specialty categories, we focused our analysis on visits to physicians who specialized in family medicine and general practice (FM/GP), internal medicine (IM), dermatology, neurology, ophthalmology, and orthopedic surgery, specialties that could be clearly distinguished within the NAMCS database and reliably cross-walked to the Medicare specialty designations. In addition, we restricted our analysis to established outpatient visits to compare the complexity of continuity care for the select specialty categories. Visits with only a nurse or medical assistant and no physician (1.83% of visits) were excluded, leaving a final analytical sample of 53,670 established office visits to the six specialties between 2013 and 2016.
We used 2015 Medicare Provider Utilization and Payment Data: Physician and Other Supplier Public Use File (Medicare PUF) to compare specialties’ income dependency on E/M and non-E/M procedures.12 A list of the E/M services and corresponding Current Procedural Terminology (CPT) codes is attached in Appendix A.
Variables
Time spent with a physician is self-reported and defined as the amount of time in minutes that a physician spent with the patient, not including patient time spent waiting for an appointment or with another type of practitioner.10 We matched physician-reported times with the typical times defined by CPT guidelines for the outpatient E/M codes. Typical times are conventions based on the average range of time it takes a physician to complete all face-to-face work, as defined by the 1995/1997 CMS guidelines,13, 14 for a visit at a specific service code level. For example, the office visit code 99213 assumes a typical time range up to 15 min, while a 99214 is assigned a typical time up to 25 min. We matched the physician time reported to outpatient E/M codes as follows: < 15 min (99211 and 99212), 15–24 min (99213), 25–39 min (99214), and ≥ 40 min (99215). We did not categorize any visits in the 99211 range of 1–10 min since this code is not designed for physician use.
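The time-to-code matching above is a simple binning rule. A minimal sketch in Python (the function name and the combined "99211/99212" label are ours, for illustration only):

```python
def em_code_from_minutes(minutes: int) -> str:
    """Map physician-reported face-to-face time (in minutes) to the
    established outpatient E/M code whose CPT typical time it matches.
    Bins follow the intervals in the text: <15 min -> 99211/99212
    (grouped), 15-24 min -> 99213, 25-39 min -> 99214, >=40 min -> 99215.
    """
    if minutes < 15:
        return "99211/99212"
    if minutes < 25:
        return "99213"
    if minutes < 40:
        return "99214"
    return "99215"
```

For example, a reported 30-minute visit falls into the 99214 bin under this scheme.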
We considered two measures to represent visit complexity: the number of diagnoses listed and the number of medications prescribed. For the years included in our analysis (2013–2016), the NAMCS provided the option to record up to 5 clinical diagnoses at each visit using the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes and up to 30 medications ordered, supplied, administered, or continued per visit. Medications in the NAMCS are broadly defined and include prescription and over-the-counter drugs. We chose the number of clinical diagnoses as a proxy for the number of issues actively considered simultaneously at a given visit. NAMCS questionnaires do not ask for billing diagnoses and therefore are less confounded by incentives for physicians to list billable clinically concurrent problems or to add financially beneficial diagnoses for risk adjustment. In this way, NAMCS diagnostic data represent what the physician sees as sufficiently active clinically to merit recording.
Prescription medications, our second measure of complexity, have demonstrated a robust relationship to visit complexity.9 We counted only prescription medications since we were not certain whether over-the-counter (OTC) medications were consistently recorded, and we could not determine whether the responding physician was personally recommending a given OTC medication or merely listing what the patient had chosen on their own. Prescription medications with two or more active ingredients were counted once. The proportions of combination medications prescribed did not differ across the selected groups of specialties (comparison of means). For our analysis, complex visits are defined as those with over two diagnoses (the mean number of diagnoses per visit across all NAMCS encounters for all specialties) and/or those with over three prescription medications (the mean number of medications per visit). The mean cutoff point was chosen based on previous studies of medication regimen complexity and patient care complexity.15–18 In addition, results from a sensitivity analysis using the median as the cutoff point to define the level of complexity were consistent with the primary findings.
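The complexity definition reduces to a pair of threshold tests. A sketch assuming the mean-based cutoffs stated above (function and parameter names are ours, for illustration):

```python
def is_complex_visit(n_diagnoses: int, n_rx_medications: int,
                     dx_cutoff: int = 2, rx_cutoff: int = 3) -> bool:
    """Flag a visit as high complexity when it exceeds the NAMCS-wide
    mean number of diagnoses (>2) and/or prescription medications (>3),
    per the definition in the text."""
    return n_diagnoses > dx_cutoff or n_rx_medications > rx_cutoff
```

A visit with 3 diagnoses and 1 medication, or 1 diagnosis and 4 medications, is flagged as complex; a visit with exactly 2 diagnoses and 3 medications is not.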
Statistical Analysis
We calculated total Medicare part B payments as well as payments for E/M and non-E/M services for the selected specialties in 2015. Next, we calculated and compared the percentage of total payments derived from E/M services across the specialties.
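Computing a specialty's E/M income share from the Medicare PUF is a sum-and-divide over payment rows. A hedged sketch with hypothetical data (the row layout, abbreviated code list, and dollar values are illustrative; the full list of E/M codes is in Appendix A):

```python
from collections import defaultdict

# Hypothetical rows from the Medicare PUF: (specialty, HCPCS code, payment $).
rows = [
    ("Internal Medicine", "99214", 120.0),   # E/M visit
    ("Internal Medicine", "93000", 30.0),    # ECG (procedure)
    ("Dermatology", "99213", 80.0),          # E/M visit
    ("Dermatology", "17000", 220.0),         # lesion destruction (procedure)
]

# Outpatient E/M codes for new and established patients (abbreviated).
EM_CODES = {"99201", "99202", "99203", "99204", "99205",
            "99211", "99212", "99213", "99214", "99215"}

total = defaultdict(float)
em_total = defaultdict(float)
for specialty, code, payment in rows:
    total[specialty] += payment
    if code in EM_CODES:
        em_total[specialty] += payment

# Share of each specialty's part B payments derived from E/M services.
em_share = {s: em_total[s] / total[s] for s in total}
print(em_share)
```

With these toy rows, internal medicine derives most of its payments from E/M codes while dermatology derives most of its payments from procedures, mirroring the pattern in Figure 1.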
Descriptive analyses examined average visit complexity within each category of visit duration by physician specialty using 2013–2016 NAMCS data. Pearson chi-square tests for independence (categorical variables) or t tests for independent samples (continuous variables) were used to examine differences in characteristics of patients and visits seen by specialties whose income was largely dependent on E/M services and specialties whose income was largely dependent on procedural services. The independent two-sample t test was applied to compare the mean number of medications and diagnoses between the E/M-dependent and the procedural-dependent groups of specialties for each visit time category. The proportion of the complex established patient office visits for each outpatient E/M service code time interval was calculated for each specialty for illustrative purposes. All analyses were performed using Stata version 15.1 (Stata Corporation, College Station, TX). All data were publicly available de-identified data, and thus exempt from review by the Institutional Review Board.
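The group comparison can be sketched from scratch. The analysis itself was done in Stata; this pooled-variance two-sample t statistic, applied to illustrative (not study) data, shows the form of the test:

```python
from math import sqrt
from statistics import mean, variance

def two_sample_t(x, y):
    """Pooled-variance independent two-sample t statistic, the form of
    test used to compare mean medication/diagnosis counts between the
    E/M-dependent and procedure-dependent specialty groups."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * variance(x) + (ny - 1) * variance(y)) / (nx + ny - 2)
    return (mean(x) - mean(y)) / sqrt(pooled_var * (1 / nx + 1 / ny))

# Illustrative per-visit medication counts for two small visit samples.
t_stat = two_sample_t([5, 4, 6, 3, 5], [3, 2, 4, 2, 3])
```

In practice the p value would then be read from the t distribution with nx + ny - 2 degrees of freedom (e.g., via Stata's `ttest` or `scipy.stats.ttest_ind`).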
RESULTS
Figure 1 compares income dependency on E/M vs. non-E/M codes across the six specialties. For FM/GP, IM, and neurology specialties, E/M work makes up a larger percentage of their total part B payments as follows: FM/GP 70.5%, IM 75%, and neurology 51.3%. A larger percentage of income is derived from non-E/M services for ophthalmology (94.7%), dermatology (77.6%), and orthopedic surgery (77%).
Figure 1.
Shares of Medicare payments to providers for E&M and Non-E&M services by type of specialty, 2015. E&M services, evaluation and management services; FM/GP, family medicine and general practice; IM, internal medicine.
When patient characteristics are compared across the six specialties (Table 1), patients reported by the specialties with a higher dependency on E/M codes (FM/GP, IM, and neurology) were more likely to be current smokers (14.7% vs. 8.8%, p < 0.001), more likely to have a new problem (35.6% vs. 22.1%, p < 0.001), more likely to have 2 or more chronic conditions (28.9% vs. 13.8%, p < 0.001), more likely to be on 3 or more medications (49.1% vs. 29.1%, p < 0.001), and more likely to have two or more diagnoses (50.8% vs. 34.5%, p < 0.001) when compared with the patients reported by those specialties with a higher dependency on procedures (dermatology, ophthalmology, orthopedic surgery).
Table 1.
Characteristics of Office-Based Outpatient Visits, by Specialty Dependence on E/M Services, NAMCS (2013–2016)
| Specialty income dependent on E/M services* | Specialty income dependent on procedures† | p value§ | |
|---|---|---|---|
| Visit characteristics (n (%))‡ | |||
| Patient age (mean (SD)) | 53.51 (21.06) | 59.13 (20.47) | < .001 |
| Patient sex | |||
| Female | 19,383 (57.65) | 11,244 (56.09) | < .001 |
| Male | 14,240 (42.35) | 8803 (43.91) | |
| Tobacco use | |||
| Not current | 21,040 (62.58) | 12,350 (61.61) | < .001 |
| Current | 4996 (14.86) | 1768 (8.82) | |
| Major reason for visit | |||
| New problem | 11,754 (35.59) | 4321 (22.13) | < .001 |
| Chronic problem, routine | 11,851 (35.89) | 6880 (35.24) | |
| Chronic problem, flare-up | 2403 (7.28) | 1698 (8.70) | |
| Pre-/post-surgery | 3368 (10.20) | 5028 (25.75) | |
| Preventive care | 3646 (11.04) | 1598 (8.18) | |
| Number of chronic conditions | |||
| 0 | 9305 (27.67) | 8262 (41.21) | < .001 |
| 1–2 | 14,164 (42.13) | 8450 (42.15) | |
| > 2 | 9725 (28.92) | 2775 (13.84) | |
| Number of medications | |||
| ≤ 3 | 17,111 (50.89) | 14,198 (70.82) | < .001 |
| > 3 | 16,512 (49.11) | 5849 (29.18) | |
| Number of diagnoses |
| ≤ 2 | 16,550 (49.22) | 13,127 (65.48) | < .001 |
| > 2 | 17,073 (50.78) | 6920 (34.52) | |
E&M services evaluation and management services
*Specialties with income dependent on E&M services include FM/GP, IM, and neurology
†Specialties with income dependent on procedures include dermatology, ophthalmology, and orthopedic surgery
‡Percentages may not add up to 100% due to missing data
§p values calculated from chi-square (categorical) and Student’s t tests (comparison of means)
Figure 2 illustrates the differences in the percentages of high complexity visits for each outpatient E/M service code time interval by specialty. For example, for those visits where the time would correspond to a level 4 (99214) established outpatient visit, the percentages with high diagnostic complexity were as follows: FM/GP, 52%; IM, 62%; neurology, 41%; dermatology, 34%; ophthalmology, 42%; orthopedic surgery, 25%. The corresponding figures for high medication complexity were as follows: FM/GP, 49%; IM, 56%; neurology, 43%; dermatology, 33%; ophthalmology, 30%; orthopedic surgery, 30%.
Figure 2.
Percent distribution of complex visits, by time-based visit levels and type of specialty, 2013–2016. FM/GP, family medicine and general practice; IM, internal medicine. Complex visits are defined as those with over two diagnoses (the mean of diagnoses made per visit) and/or those with over three prescription medications (the mean of medications prescribed per visit).
When the six physician categories are split into two groups, those whose income is largely dependent on outpatient E/M work (FM/GP, IM, and neurology) and those whose income is largely dependent on procedures (dermatology, ophthalmology, orthopedic surgery), there are further differences across all visit time intervals (Table 2). For the time corresponding to a level 4 established outpatient visit (99214), the mean number of medications reported for those specialties whose compensation is largely E/M dependent was 4.65 (± 4.27) per visit, while the mean number of medications reported for those that are procedurally dependent was 2.99 (± 3.80). The mean ± SD of the number of medications and diagnoses, and p values from t tests between the E/M-dependent and procedural-dependent groups, are shown in Table 2.
Table 2.
Average Number of Medications Prescribed and Diagnoses Listed, by Time Levels and Specialty Dependence on E/M Services, NAMCS 2013–2016
| Number of Medications | Number of Diagnoses |||||
|---|---|---|---|---|---|---|
| Specialty income dependent on E/M services* | Specialty income dependent on procedures† | p value | Specialty income dependent on E/M services | Specialty income dependent on procedures | p value | |
| Visit time levels | Mean (SD) | Mean (SD) | Mean (SD) | Mean (SD) | ||
| Levels 1 and 2 | 4.56 (4.25) | 2.98 (3.82) | < .001 | 2.51 (1.36) | 2.17 (1.25) | < .001 |
| Level 3 | 4.42 (4.08) | 2.78 (3.64) | < .001 | 2.53 (1.35) | 2.12 (1.23) | < .001 |
| Level 4 | 4.65 (4.27) | 2.99 (3.80) | < .001 | 2.67 (1.37) | 2.17 (1.22) | < .001 |
| Level 5 | 4.61 (4.26) | 2.85 (3.73) | < .001 | 2.66 (1.37) | 2.20 (1.24) | < .001 |
E&M services evaluation and management services
*Specialties with income dependent on E&M services include FM/GP, IM, and neurology
†Specialties with income dependent on procedures include dermatology, ophthalmology, and orthopedic surgery
§p values calculated from Student’s t tests (comparison of means)
DISCUSSION
The pricing of physician services within the MPFS ripples through all payment models.19,20 For traditional Medicare, the fee schedule directly determines physician compensation. For new models of care, ranging from Alternative Payment Models (APMs) to CPC+ payments, service code values determine payment calculations.
Virtually all the services in the MPFS are subject to change as medical and surgical practices change. When the times spent completing a range of procedures were compared with the times used by the AMA’s Relative Value Scale Update Committee (RUC) for pricing these procedures, inconsistencies were identified.21 Some procedures were priced to take longer than what was observed while others were priced to take less time. Likewise, there have been repeated calls for reworking the definitions and valuations of the E/M codes based on the changes in clinical practice over the last several decades.1,22–26 For the primary care physician, there are more ambitious goals for the treatment of prevalent conditions such as hypertension, diabetes, and hypercholesterolemia. In addition, there are new combination therapies for these conditions, each with more and more potential adverse interactions. This is in addition to the increased numbers of diagnostic tests, the aging demographic, the work burden imposed by Electronic Health Records (EHRs), and innovative interventions, such as the biomodulators, that confer new and higher risks.
CMS is responsible for accurate pricing within the MPFS. To fulfill this expectation, the fee schedule and its relative valuations must be continually updated to reflect actual practice. The imprecision of the times reported for the surgical codes illustrates how the codes have not been adjusted to reflect the changing nature of practice.21 Importantly, the codes with underestimated time included cardiothoracic surgery,21 a profession where there has been upward movement in the average age of patients undergoing surgery, with attendant increase in risks and challenges for the surgeon.
Likewise, our analysis of the NAMCS data would indicate that within similar visit times, there are important differences in how different specialties assess the intensity of their own work. The number of problems addressed and the number of medications managed were higher for the specialties of FM/GP, IM, and neurology in comparison with dermatology, ophthalmology, and orthopedic surgery. For example, when the intensity of care was compared across specialties for the time intervals that correspond to the level 4 (99214) established outpatient E/M code, 56% of IM visits had over 3 medications managed compared with 33% of dermatology visits. The corresponding comparison for greater than two diagnoses was 62% for IM and 34% for dermatology.
We note several limitations. First, the NAMCS data comprise voluntary responses by physicians. There may be specialty-specific underreporting of either diagnoses or medications. However, this underreporting could be yet another indicator of decreased complexity within an encounter, and problems and medications not listed may indicate issues not addressed. Second, because NAMCS specialty designations do not consistently correspond to CMS specialty categories, we were forced to confine our selection of specialties to the six we could reliably identify in both the NAMCS and CMS databases. For example, the NAMCS category cardiovascular disease could include cardiologists, surgeons, or physiatrists specializing in cardiac rehabilitation. Third, there are shortcomings in the instrument itself. The number of diagnoses that can be listed per patient was capped at five, a critical restriction for those physicians, such as IM physicians, whose patients might have concurrent conditions well beyond five. Fourth, the reported time spent with patient care may not be comparable among respondents. There were no direct observations to validate the times recorded. The time spent in face-to-face care for IM, FM/GP, and neurology likely underestimates the total time of an encounter since post-visit time was not requested in the NAMCS questionnaire. Fifth, the request to list medications was not interpreted uniformly by respondents. Some respondents provided long lists including all over-the-counter medicines. Many listed only prescription medications (88.6%). Some listed vitamins, which may or may not have been medications. We counted combination medications only once but found no significant recording differences across the six specialties. Aspirin was a special case since the questionnaire did not ask whether it was physician directed or taken by patient choice. Sixth, there were no data on the billing code selected.
The code selection itself is a proxy for visit "intensity" since either time spent with a patient or visit content can be used to support billing. However, the absence of coding in the NAMCS data may be a hidden asset. In some specialties, the time actually spent in care would suggest specialty-specific patterns of overcoding. For example, 81.8% of dermatology visits and 68.2% of ophthalmology visits were below the level 3 threshold of 15 min, suggesting that a larger portion of visits should be at level 2 (99212). Lastly, we assume that documentation practices are uniform across specialties. It may be that there are specialty-specific documentation habits that could skew any analysis based on recorded conditions or medications.
The last CMS guidelines for the outpatient E/M codes were issued in 1995 and revised in 1997.13,14 The absence of any revisions over the last two decades may have created specialty-to-specialty variation in how these guidelines are interpreted. CMS efforts to improve the valuations of the E/M service codes must be coupled with a commitment to ensuring that service code selection within each specialty is appropriate with respect to the full range of patient care provided to all Medicare beneficiaries. In other words, there should be well-defined cross-specialty guidelines for the selection of the appropriate outpatient E/M service code level. Ideally, such revised guidelines would be based on a nationally representative sampling of actual day-to-day clinical practice. The importance of accurate valuations and definitions for physician services cannot be overstated. The outpatient E/M codes cover the largest category of Medicare payment for professional services, roughly $27 billion annually and growing. For the workforce to meet national needs for primary care, specialty care, and procedural care, any inherent bias in the fee schedule or deficiencies in service code definition that lead physicians to choose the wrong service code could lead to long-term specialty-specific undercompensation or overcompensation, and ultimately system-wide workforce deficiencies.10
NAMCS data suggest that there are important specialty-to-specialty variations in the complexity and intensity of outpatient E/M cognitive care provided within similar visit times. Our results imply that the distribution of outpatient E/M service codes should vary substantially among specialties. Some specialties, such as IM, FM/GP, and neurology, with higher numbers of conditions addressed and medications managed for all visit time intervals, should utilize higher codes more frequently, while other specialties, such as dermatology, ophthalmology, and orthopedic surgery, should utilize higher codes much less frequently.
Electronic supplementary material
(DOCX 23.1 kb)
Funding Information
Supported by a grant from the Office of the Director, National Institutes of Health (NIH Director’s Early Independence Award, 1DP5OD024564, to Dr. Song)
Compliance with Ethical Standards
Conflict of Interest
The authors declare that they do not have a conflict of interest.
Footnotes
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1. Kumetz EA, Goodson JD. The Undervaluation of Evaluation and Management Professional Services: The Lasting Impact of Current Procedural Terminology Code Deficiencies on Physician Payment. Chest. 2013;144(3):740–745. doi: 10.1378/chest.13-0381.
- 2. Landon BE. A Step toward Protecting Payments for Primary Care. N Engl J Med. 2019;380(6):507–510. doi: 10.1056/NEJMp1810848.
- 3. Nakhasi A, Goodson J. The case for the redefinition and revaluation of the outpatient Evaluation and Management (E&M) service codes and the development of new documentation expectations. J Gen Intern Med. https://www.sgim.org/File%20Library/SGIM/Communities/Advocacy/LEAHP/SGIM-WP-July-2015-Goodson-E-M-paper%2D%2D2-final.pdf
- 4. Abbo ED, Zhang Q, Zelder M, et al. The increasing number of clinical items addressed during the time of adult primary care visits. J Gen Intern Med. 2008;23(12):2058–2065. doi: 10.1007/s11606-008-0805-8.
- 5. Katerndahl D, Wood R, Jaen CR. Complexity of ambulatory care across disciplines. Healthc (Amst). 2015;3(2):89–96. doi: 10.1016/j.hjdsi.2015.02.002.
- 6. Horner RD, Matthews G, Yi MS. A conceptual model of physician work intensity: guidance for evaluating policies and practices to improve health care delivery. Med Care. 2012;50(8):654–661. doi: 10.1097/MLR.0b013e31825516f7.
- 7. Katerndahl DA, Wood R, Jaen CR. A method for estimating relative complexity of ambulatory care. Ann Fam Med. 2010;8(4):341–347. doi: 10.1370/afm.1157.
- 8. Horner RD, Szaflarski JP, Jacobson CJ, et al. Clinical work intensity among physician specialties: how might we assess it? What do we find? Med Care. 2011;49(1):108–113. doi: 10.1097/MLR.0b013e3181f3801f.
- 9. George J, Phun YT, Bailey MJ, Kong DC, Stewart K. Development and validation of the medication regimen complexity index. Ann Pharmacother. 2004;38(9):1369–1376. doi: 10.1345/aph.1D479.
- 10. Goodson JD, Song Z, Shahbazi S. Physician Payment Disparities and Access to Services—A Look Across Specialties. J Gen Intern Med. 2019. Accepted for publication.
- 11. National Center for Health Statistics. 2016 National Ambulatory Medical Care Survey (NAMCS) micro-data file documentation. Hyattsville, MD.
- 12. Centers for Medicare & Medicaid Services. Medicare fee-for-service provider utilization & payment data: Physician and other supplier public use file—A methodological overview. https://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/Medicare-Provider-Charge-Data/Downloads/Medicare-Physician-and-Other-Supplier-PUF-Methodology.pdf. Accessed March 10, 2019.
- 13. Centers for Medicare & Medicaid Services. 1995 documentation guidelines for evaluation and management services. http://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNEdWebGuide/Downloads/95Docguidelines.pdf. Accessed April 20, 2019.
- 14. Centers for Medicare & Medicaid Services. 1997 documentation guidelines for evaluation and management services. http://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNEdWebGuide/Downloads/97Docguidelines.pdf. Accessed April 15, 2019.
- 15. Muir AJ, Sanders LL, Wilkinson WE, Schmader K. Reducing medication regimen complexity: a controlled trial. J Gen Intern Med. 2001;16(2):77–82. doi: 10.1046/j.1525-1497.2001.016002077.x.
- 16. Linnebur SA, Vande Griend J, Metz K. Patient-level medication regimen complexity in older adults with depression. Clin Ther. 2014;36(11):1538–1546. doi: 10.1016/j.clinthera.2014.10.004.
- 17. Libby AM, Fish DN, Hosokawa PW, et al. Patient-level medication regimen complexity across populations with chronic disease. Clin Ther. 2013;35:385–398.e1. doi: 10.1016/j.clinthera.2013.02.019.
- 18. Bazargan M, Smith J, Yazdanshenas H, Movassaghi M, Martins D, Orum G. Non-adherence to medication regimens among older African-American adults. BMC Geriatrics. 2017;17(1):163. doi: 10.1186/s12877-017-0558-5.
- 19. Berenson RA, Goodson JD. Finding Value in Unexpected Places--Fixing the Medicare Physician Fee Schedule. N Engl J Med. 2016;374(14):1306–1309. doi: 10.1056/NEJMp1600999.
- 20. Berenson RA, Ginsburg PB. Improving The Medicare Physician Fee Schedule: Make It Part Of Value-Based Payment. Health Aff (Millwood). 2019;38(2):246–252. doi: 10.1377/hlthaff.2018.05411.
- 21. Chan DC, Huynh J, Studdert DM. Accuracy of Valuations of Surgical Procedures in the Medicare Fee Schedule. N Engl J Med. 2019;380(16):1546–1554. doi: 10.1056/NEJMsa1807379.
- 22. McCall N, Cromwell J, Braun P. Validation of physician survey estimates of surgical time using operating room logs. Med Care Res Rev. 2006;63(6):764–777. doi: 10.1177/1077558706293635.
- 23. Goodson JD. Unintended consequences of resource-based relative value scale reimbursement. JAMA. 2007;298(19):2308–2310. doi: 10.1001/jama.298.19.2308.
- 24. Berenson RA, Basch P, Sussex A. Revisiting E&M visit guidelines--a missing piece of payment reform. N Engl J Med. 2011;364(20):1892–1895. doi: 10.1056/NEJMp1102099.
- 25. Schroeder SA, Frist W. Phasing out fee-for-service payment. N Engl J Med. 2013;368(21):2029–2032. doi: 10.1056/NEJMsb1302322.
- 26. Katz S, Melmed G. How Relative Value Units Undervalue the Cognitive Physician Visit: A Focus on Inflammatory Bowel Disease. Gastroenterol Hepatol (N Y). 2016;12(4):240–244.