Abstract
CONTEXT
The impact of residents on hospital finance has been studied; there are no data describing the economic effect of residents on attending physicians.
OBJECTIVE
In a community teaching hospital, we compared allowable inpatient visit codes and payments (based on documentation in the daily progress notes) between a general medicine teaching unit and nonteaching general medicine units.
DESIGN
Retrospective chart review, matched cohort study.
SETTING
Six hundred twenty-bed community teaching hospital.
PATIENTS
Patients were discharged July 1998 through February 1999 from Saint Barnabas Medical Center. We randomly selected 200 patients in quartets. Each quartet consisted of a pair of patients cared for by residents and a pair cared for only by an attending physician. In each pair, 1 of the patients was under the care of an attending physician who usually admitted to the teaching service, and 1 was under the care of a usually nonteaching attending. Within each quartet, patients were matched for diagnosis-related group, length of stay, and discharge date.
MAIN OUTCOME MEASURES
We assigned the highest daily visit code justifiable by resident and attending chart documentation, determining relative value units (RVUs) and reimbursements allowed by each patient's insurance company.
RESULTS
Although more seriously ill, teaching-unit patients generated a mean 1.75 RVUs daily, compared with 1.84 among patients discharged from nonteaching units (P = .3). Median reimbursement, daily and per hospitalization, was similar on teaching and nonteaching units. Nonteaching attendings documented higher mean daily RVUs than teaching attendings (1.83 vs 1.76, P = .2). Median allowable reimbursements were $267 per case ($53 daily) among teaching attendings compared with $294 per case ($58 daily) among nonteaching attendings (Z = 1.54, P = .1). When only the resident note was considered, mean daily RVUs increased 39% and median allowable dollars per day 27% (Z = 4.21, P < .001).
CONCLUSIONS
Nonteaching attendings appear to document their visits more carefully from a billing perspective than do teaching attendings. Properly counter-documented, resident notes could substantially increase payments to attending physicians.
Keywords: health care finance, residents, coding
The effect of resident care on costs of hospitalization has been well studied, although infrequently with consideration of reimbursements generated by that care.1–3 We recently reported that patients on an internal medicine inpatient teaching service generated payments to the sponsoring community hospital that were higher, on average and for most diagnosis-related groups (DRGs), than payments for nonteaching patients. At least part of this difference was attributable to extensive documentation by residents in the medical record.4
In this study, we extend our work on resident care from its association with hospital reimbursement to its effect on physician payments for hospital care. We were interested in determining whether resident note writing affected allowable reimbursement to the attending physician. Moreover, we hypothesized that the potential effect of residents would be greater than the observed effect because of imperfect counter-documentation of their notes by attending physicians. We studied not actual daily reimbursements but insurer-specific, allowable reimbursement based on billing documentation guidelines.
METHODS
Site
As previously described, this study was performed at the Saint Barnabas Medical Center (SBMC), a 620-bed community teaching hospital in Livingston, New Jersey.4 Affiliated with the Mount Sinai School of Medicine, New York, SBMC had, at the time of the study, 40 residents in a fully accredited free-standing internal medicine training program.
There were 487 attending staff members in internal medicine and family practice who had the option of admitting patients to a geographic 40-bed general medicine teaching unit. Attending physicians could alternatively admit to 310 beds on nonteaching medical/surgical units or to teaching units dedicated to intensive care, nephrology, cardiology, pulmonology, and oncology. General medicine teaching patients were always admitted to the teaching unit; however, when the hospital was full, nonteaching patients were also placed on the teaching unit.
Teaching patients were under the care of residents who were supervised by attending physicians in accordance with requirements of the Internal Medicine Residency Review Committee of the American Council on Graduate Medical Education.5 Nonteaching patients were cared for by attending physicians without residents. For these patients, additional care was available at night and on weekends by licensed house physicians who were not members of the resident staff. Nurse-to-patient ratios were the same on the geographic general medicine teaching unit as on nonteaching medical/surgical units. Social workers and case managers were equally available on all units.
Patient Selection
The cohort from which subjects were selected for this study consisted of patients discharged by internists or family practitioners between July 1, 1998 and January 1, 1999. We included all 1,614 inpatients discharged from general medicine units who were assigned a medical diagnosis-related group during this period. Two hundred two attending physicians admitted 917 patients to the teaching unit and 697 patients to other nonspecialty units. We excluded 1 otherwise eligible patient due to incomplete data.
Teaching- and nonteaching-unit patients were further characterized by the teaching habits of their attending physicians. Attending physicians who admitted 50% or more of their patients to the teaching unit were designated teaching attendings, the others nonteaching attendings. Each patient in the cohort thus belonged to 1 of 4 groups: teaching unit/teaching attending (T/T), teaching unit/nonteaching attending (T/N), nonteaching unit/teaching attending (N/T), and nonteaching unit/nonteaching attending (N/N).
For the present study, we randomly chose 200 patients from this cohort. Patients were selected in quartets so that every quartet consisted of 1 patient from each of the 4 groups. Within each quartet, patients were matched for DRG, length of stay (LOS), and date of discharge.
Patient selection was accomplished in the following manner. Initially, we assigned a random number to each of the 1,613 eligible patients. We next separated patients into the 4 groups based on unit and attending physician type as described above. Within each group, patients were then ordered according to their randomly assigned numbers. From group N/N we selected the first patient. From groups N/T, T/T, and T/N we selected the patient discharged closest in time to the N/N patient who was assigned the same DRG and had the same LOS (within 1 day). The 4 subjects thus selected were removed from the list of available patients. This process was then repeated, starting successively with the first available patient in groups N/T, T/T, and T/N. In this manner, 50 quartets of matched patients were selected. When it was impossible to find all 3 matches for an initial candidate, we chose the next initial candidate within the same group and rematched the entire quartet.
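The matching procedure above is essentially a greedy algorithm. The sketch below is an illustrative reconstruction, not the authors' actual program: the patient fields (`group`, `drg`, `los`, `discharge_day`) are assumed names, and an initial candidate that cannot be fully matched is simply set aside rather than rematched.

```python
import random

def select_quartets(patients, n_quartets=50, seed=0):
    """Greedy sketch of the quartet-matching procedure: shuffle each
    group (standing in for random-number assignment), then match on
    DRG, LOS within 1 day, and nearest discharge date."""
    rng = random.Random(seed)
    groups = {g: [p for p in patients if p["group"] == g]
              for g in ("N/N", "N/T", "T/T", "T/N")}
    for pool in groups.values():
        rng.shuffle(pool)

    start_order = ["N/N", "N/T", "T/T", "T/N"]  # rotate the starting group
    quartets = []
    while len(quartets) < n_quartets:
        start = start_order[len(quartets) % 4]
        if not groups[start]:
            break  # no candidates left to start a quartet
        index_pt = groups[start][0]
        matches = {start: index_pt}
        for g in set(groups) - {start}:
            candidates = [p for p in groups[g]
                          if p["drg"] == index_pt["drg"]
                          and abs(p["los"] - index_pt["los"]) <= 1]
            if candidates:
                # the closest discharge date to the index patient wins
                matches[g] = min(candidates, key=lambda p:
                                 abs(p["discharge_day"] - index_pt["discharge_day"]))
        if len(matches) == 4:
            quartets.append(matches)
            for g, p in matches.items():
                groups[g].remove(p)  # matched patients leave the pool
        else:
            # simplification: discard the unmatched initial candidate
            groups[start].pop(0)
    return quartets
```

With 200 patients drawn this way, 50 complete quartets result whenever every index patient can be matched.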
Before medical record review, discharge from the teaching unit was the sole criterion for assignment of teaching status. We expected, however, from our previous sampling of teaching-unit discharges, that about a third of these patients had not in fact been cared for by a resident. Upon chart review, patients were considered teaching if there was a resident admission note and evidence of ongoing resident involvement for patients who stayed more than 3 days. Ongoing involvement was defined as at least 1 resident progress note on the last or next-to-last day of hospitalization.
Patients found to be incorrectly assigned to the teaching unit group were removed from the study. Replacements were identified by matching the first patient of the relevant quartet with remaining available candidates in the group from which the patient had been removed. In selecting replacements, we chose the patient with the same DRG whose LOS was closest to the first patient of the relevant quartet. When 2 or more patients qualified equally by LOS, the patient was selected whose discharge date was closest to the first patient of the quartet. When no suitable replacement could be found the entire quartet was removed from the study.
Data Gathering
We obtained the following information about each discharge from a patient database (TRENDSTAR by HBOC, Atlanta, Ga): DRG, attending physician, demographic data, payer, patient care unit, whether admission was emergent, disposition, costs and reimbursements to the hospital, and LOS. From our previous work we had determined which attending physicians were teaching and which nonteaching during the study period by the criteria described above.
Risk adjustment was performed by New Solutions, Inc. (New Brunswick, NJ). The analysis, based on Systemetrics methodology, uses variables derived from Uniform Bill-92 claims data. Co-morbidities, demographics, and a count of involved body systems are entered into a multivariate logistic regression against a large database of patients to derive likelihood weights. These weights are then summed and applied to calculate each patient's probability of death during the admission.6
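The risk-adjustment arithmetic is that of an ordinary logistic model: each patient's likelihood weights are summed to a logit, the logistic transform converts the logit to a death probability, and a group's expected deaths (as reported in Table 1) is the sum of its patients' probabilities. A minimal sketch, assuming the weights have already been estimated; the function names and any weight values are ours, not New Solutions':

```python
import math

def death_probability(patient_weights):
    """Sum a patient's likelihood weights (one per risk factor,
    illustrative values only) and apply the logistic transform."""
    logit = sum(patient_weights)
    return 1.0 / (1.0 + math.exp(-logit))

def expected_deaths(cohort_weights):
    """Expected in-hospital deaths for a group: the sum of the
    individual patients' predicted probabilities."""
    return sum(death_probability(w) for w in cohort_weights)
```

The "Expected in-hospital deaths" entries in Table 1 are group sums of this kind, which is why they are fractional.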
One of the authors (LJ), a certified inpatient coder, then reviewed the initial and subsequent attending physician and resident progress notes in each inpatient medical record. We assigned the most complex patient visit code (evaluation and management code, or E and M) consistent with the medical record documentation for the day. Coding judgments were based on contemporary standards published in Current Procedural Terminology (CPT) and followed 1995 documentation guidelines.7
Briefly, we counted references in each daily note to body areas and systems approved by Health Care Financing Administration (HCFA) guidelines. We also counted references to approved areas of concern within the history of present illness, and noted the presence of a past history, family history, or social history. These elements were then combined in accordance with HCFA guidelines to characterize the history and physical examination as “problem focused,” “expanded problem focused,” “detailed,” or “comprehensive.” The number of diagnoses or therapeutic options considered in the daily note, the amount and complexity of data, and the risk associated with the main diagnosis were combined to choose the level of decision making from among “straightforward,” “low,” “moderate,” and “high.” Finally, these descriptors of history, physical exam, and case complexity were combined to yield the appropriate E and M code.
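As a concrete illustration of the rollup from documentation elements to a visit code, the sketch below applies the general 1995-guideline rule for subsequent hospital care, under which 2 of the 3 key components (history, examination, decision making) must meet or exceed the level of the code. The actual CPT/HCFA criteria are considerably more detailed; the threshold table here is a simplification of ours.

```python
HISTORY = ["problem focused", "expanded problem focused",
           "detailed", "comprehensive"]
DECISION = ["straightforward", "low", "moderate", "high"]

# Minimum (history, exam, decision-making) levels for each follow-up
# code, as indices into the lists above (simplified threshold table).
FOLLOW_UP_LEVELS = {
    "99231": (0, 0, 1),  # problem focused history/exam, low decision making
    "99232": (1, 1, 2),  # expanded problem focused, moderate
    "99233": (2, 2, 3),  # detailed, high
}

def follow_up_code(history, exam, decision):
    """Return the highest subsequent-visit code for which at least
    2 of the 3 key components meet or exceed the required level."""
    levels = (HISTORY.index(history), HISTORY.index(exam),
              DECISION.index(decision))
    best = None
    for code in ("99231", "99232", "99233"):
        required = FOLLOW_UP_LEVELS[code]
        met = sum(a >= r for a, r in zip(levels, required))
        if met >= 2:
            best = code
    return best
```

For example, a detailed history and exam with high-complexity decision making rolls up to 99233, while a single comprehensive component cannot, by itself, justify 99233.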
In assignment of E and M codes, resident progress notes were considered together with attending physician counter-documentation, according to published standards for counter-documentation of resident notes.11 A second experienced physician coder reviewed a randomly determined 16% sample of charts to measure inter-observer variation in code assignment.
In order to assess the potential effect of resident notes on allowable reimbursement, we assigned 2 sets of codes to each teaching-unit patient. The first code reflected actual allowable dollars and relative value units (RVUs) based on both the resident note and attending physician counter-documentation. The second code reflected dollars and RVUs as they would be assigned using only the resident's documentation, irrespective of the attending physician's counter-documentation.
From the contemporary rate schedule of each patient's insurance carrier (applicable in northern New Jersey at the time the patient was hospitalized), we noted reimbursement to the attending physician for each E and M code used in the study. We estimated “self pay” reimbursement as 15% of Medicare rates and applied the average rates of named managed care organizations to the 9 patients in the hospital's database category “other managed care.” Contemporary RVUs for E and M codes were obtained from federal regulations and reported as the total of physician work RVU, transitional facility practice expense RVU, and malpractice RVU.8
By study design, chart review was the gold standard of teaching-unit assignment, but coders were blind to the teaching status of the attending physician.
Endpoints
We compared the 4 principal groups for possibly confounding differences in demographics, proportion of emergent admissions, disposition after discharge, distribution of insurance payers, hospital costs and reimbursement, LOS, and risk-adjusted likelihood of in-hospital death. Primary endpoints of the study were the total and daily RVUs and dollar reimbursements within patient groups. The secondary endpoint was distribution of E and M codes within patient groups.
Analysis
For discrete data, we tested significance using χ2 analysis. We applied nonparametric testing (Mann-Whitney U test) to continuous data sets in which the mean and median differed by more than 15%, or in which fewer than 2% or more than 7% of the data points fell beyond 1.96 standard deviations from the mean; we calculated Z scores for these tests. For all other continuous data sets we performed t tests and reported P values.
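The rule for choosing between the parametric and nonparametric tests can be stated as a small function. This is our reading of the criterion, assuming the 15% difference is taken relative to the mean and that the tail criterion refers to points beyond ±1.96 standard deviations:

```python
import statistics

def choose_test(data):
    """Pick the significance test for a continuous data set, per the
    study's normality screen (a sketch of our interpretation)."""
    mean = statistics.mean(data)
    median = statistics.median(data)
    sd = statistics.pstdev(data)

    # Criterion 1: mean and median differ by more than 15%.
    skewed = mean != 0 and abs(mean - median) / abs(mean) > 0.15

    # Criterion 2: the fraction of points beyond 1.96 SDs from the
    # mean falls outside the 2%-7% band expected of normal data.
    if sd > 0:
        tail_fraction = sum(abs(x - mean) > 1.96 * sd for x in data) / len(data)
        bad_tails = tail_fraction < 0.02 or tail_fraction > 0.07
    else:
        bad_tails = True  # degenerate data: avoid the t test

    return "Mann-Whitney U" if (skewed or bad_tails) else "t test"
```

Under this screen, the right-skewed dollar distributions fall to the Mann-Whitney U test, while the normally distributed daily RVUs qualify for the t test.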
RESULTS
Patient Characteristics
Of the 100 patients discharged from the teaching unit, chart review found 66 cared for by residents. Eleven patients in the T/T group and 13 in the T/N group were found to be nonteaching-unit patients, for whom matching replacements were available. The remaining 10 patients had no suitable replacements. They and the other patients in their quartets were eliminated from the study. Of the 100 patients discharged from nonteaching units, 1 received resident care and a matching replacement was found. Forty of the original 50 quartets (160 patients) were therefore available for study after replacements.
Patient characteristics after replacements are shown in Table 1. The 160 patients accounted for 910 billable days. There were 475 billable days among the 80 teaching patients and 435 among the 80 nonteaching patients. Teaching attendings accounted for 436 days and nonteaching attendings for 474. No significant differences in patient demographics were noted among the groups. Overall, the population was 59% female and 68% white, with a median age of 73.5 years. Medicare insured 62% of all patients, with no significant differences between groups in the distribution of insurers. Most patients were admitted through the emergency department and most were discharged home, with no significant differences between groups. Median LOS was 4.5 days among teaching-unit patients and 4.0 days among nonteaching-unit patients, a nonsignificant difference. There were also no significant differences between groups in median hospital cost ($3,611 among nonteaching-unit and $3,921 among teaching-unit patients) or median hospital reimbursement ($6,195 among nonteaching-unit and $6,442 among teaching-unit patients).
Table 1.
Characteristics of Study Patients*
| | Teaching Unit: Teaching Attending (N = 40) | Teaching Unit: Nonteaching Attending (N = 40) | Nonteaching Unit: Teaching Attending (N = 40) | Nonteaching Unit: Nonteaching Attending (N = 40) |
|---|---|---|---|---|
| Male, n | 19 | 17 | 16 | 14 |
| Median age, y | 74.0 | 72.0 | 72.0 | 75.0 |
| Ethnicity | | | | |
| White | 29 | 30 | 21 | 29 |
| African American | 5 | 3 | 11 | 7 |
| Hispanic | 0 | 0 | 0 | 0 |
| Other | 0 | 0 | 0 | 0 |
| Unknown | 6 | 7 | 8 | 4 |
| Source of admission | | | | |
| Emergency department | 39 | 36 | 35 | 36 |
| Scheduled | 1 | 2 | 2 | 4 |
| Unscheduled | 0 | 2 | 3 | 0 |
| Disposition | | | | |
| Home | 34 | 26 | 31 | 32 |
| Expired | 1 | 2 | 1 | 4 |
| Skilled nursing facility | 2 | 8 | 3 | 1 |
| Intermediate care | 1 | 1 | 1 | 1 |
| Home health care | 1 | 1 | 0 | 0 |
| Other | 1 | 2 | 4 | 2 |
| Expected in-hospital deaths | 3.51 | 2.85 | 2.57 | 2.65 |
| Most common DRGs | | | | |
| 416 Sepsis | 6 | 6 | 6 | 7 |
| 14 Stroke | 4 | 4 | 4 | 4 |
| 182 Gastrointestinal | 4 | 4 | 4 | 4 |
| 89 Pneumonia | 4 | 4 | 4 | 4 |
| 174 Gastrointestinal hemorrhage | 4 | 4 | 4 | 4 |
| 183 Gastrointestinal | 2 | 2 | 2 | 2 |
| 127 Heart failure | 2 | 2 | 2 | 2 |
| 296 Metabolic | 1 | 1 | 1 | 1 |
| Length of stay | | | | |
| Total patient days | 222 | 253 | 214 | 221 |
| Median LOS | 4.5 | 4.5 | 4.0 | 4.0 |
| Most common payers | | | | |
| Medicare | 22 | 27 | 24 | 26 |
| Medicare HMO | 1 | 0 | 4 | 3 |
| Other commercial | 1 | 0 | 2 | 0 |
| Blue Cross | 3 | 2 | 3 | 4 |
| Oxford | 2 | 1 | 2 | 2 |
| Other managed care | 3 | 3 | 1 | 2 |
| Median hospital reimbursement, $ | 5,901 | 6,591 | 5,613 | 7,077 |
| Median hospital cost, $ | 4,068 | 3,778 | 3,832 | 3,462 |
| Median profit, $ | 1,833 | 2,813 | 1,781 | 3,615 |
* Patients discharged from the teaching unit who were found on chart review to have received no care from residents were replaced. If replacement was not possible, the entire quartet was eliminated from the study.
DRG, diagnosis-related group; LOS, length of stay.
Reliability of Code Assignment
The 13 patient records randomly selected for review by a second coder represented 72 patient days (8% of all study days). Of these, 85% were coded identically by both reviewers. Eleven percent of days were coded higher and 4% lower by the second reviewer.
Complex Codes
Complex codes (99223 as an initial visit code, 99233 as a subsequent visit code, and 99239 as a discharge code) occurred with similar frequency among teaching-unit and nonteaching-unit patients. Of the 475 billable patient days among teaching-unit patients, 22 (4.6%) were assigned these complex codes. Of the 435 billable patient days among nonteaching-unit patients, 19 (4.4%) were assigned complex codes, a nonsignificant difference. Complex codes were, however, significantly more common among patients of nonteaching attendings: a complex code was justified by chart documentation for 23 of their 474 billable days (4.9%), compared with 18 of 436 days (4.1%) among teaching attendings (P < .05). The distribution of E and M codes among teaching and nonteaching units and among teaching and nonteaching attendings is shown in Table 2, both for allowable coding and for coding based only on resident documentation.
Table 2.
Distribution of E and M Codes Among Patient Groups*
| Visit Type | CPT Code | Relative Value Unit | All Teaching-Unit Patients, % | All Teaching Attending Patients, % | All Nonteaching-Unit Patients, % | All Nonteaching Attending Patients, % | All Teaching-Unit Patients, Resident Note Only, % |
|---|---|---|---|---|---|---|---|
| Unbillable | | | 10.5 | 8.9 | 6.2 | 8.0 | 4.7 |
| Initial | 99221 | 2.0 | 5.9 | 9.9 | 11.0 | 7.0 | 2.1 |
| | 99222 | 3.3 | 3.2 | 2.8 | 3.7 | 4.0 | 8.2 |
| | 99223 | 4.2 | 3.6 | 2.8 | 3.2 | 4.0 | 5.7 |
| Follow-up | 99231 | 1.0 | 43.2 | 46.6 | 44.6 | 41.4 | 16.1 |
| | 99232 | 1.6 | 16.4 | 11.0 | 13.3 | 18.6 | 45.0 |
| | 99233 | 2.2 | 1.0 | 0.2 | 0.5 | 0.8 | 1.7 |
| Discharge | 99238 | 1.8 | 16.2 | 16.7 | 16.8 | 16.2 | 16.1 |
| | 99239 | 2.4 | 0.4 | 1.1 | 0.7 | 0 | 0.4 |
Teaching-unit and nonteaching-unit patients and patients of teaching and nonteaching attendings are shown. Distribution of E and M codes based only on resident documentation is also shown.
CPT, current procedural terminology.
RVUs
Total RVUs were 665 among teaching-unit patients and 646 among nonteaching-unit patients. Among teaching attendings, total RVUs were 610, and among nonteaching attendings, 700. Values for RVUs per day were normally distributed among the 160 patients. Mean daily RVUs were 1.75 among teaching-unit patients and 1.84 among nonteaching-unit patients (P = .3). Among patients of nonteaching attendings, mean daily RVUs were 1.83, compared with 1.76 among patients of teaching attendings (P = .2).
Allowable Dollars
Distributions of allowable dollars per day and allowable dollars per patient were skewed to the right among the 160 patients. Median allowable dollars per day was $56 among nonteaching-unit patients and $55 among teaching-unit patients (Z = 0.54; P = .6). Median allowable dollars per patient was $279 among teaching-unit patients and $277 among nonteaching-unit patients (Z = 0.03; P = .98). Patients of nonteaching attendings generated a median $58 per day and $294 per discharge. Among teaching attendings, median allowable dollars per day was $53 and per discharge $267. These differences were of borderline significance (Z = 1.58; P = .1 for dollars per day and Z = 1.56; P = .1 for dollars per discharge). Although resident notes were typically detailed, they usually did not affect allowable reimbursement because guidelines for counter-documentation were seldom followed.
Resident Documentation
When only resident documentation was considered in assigning E and M codes, total RVUs among teaching-unit patients increased by 26%, from 665 to 835. Mean daily RVUs increased by 39%, from 1.75 to 2.88 (P < .01). Median dollars per patient increased by 27%, from $279 to $355 (Z = 2.78; P = .007), and median dollars per day increased by 27%, from $55 to $70 (Z = 4.27; P < .001). Complex codes increased from 22 (4.6%) to 37 (7.8%) when only resident documentation was considered (P < .01). Examples of 2 resident notes, with optimal and suboptimal counter-documentation, are shown in Figure 1.
FIGURE 1.
Examples of resident and attending documentation. In (a), the attending physician failed to follow teaching guidelines, resulting in an allowable code of 99231 despite extensive resident documentation. In (b), the correct use of teaching criteria resulted in an allowable 99223 code, although the attending physician's note was brief.
Expected and Observed Mortality
Among the 80 teaching-unit patients, 6.36 deaths were expected and 3 observed; among the 80 nonteaching-unit patients, 5.22 deaths were expected and 5 observed (P = .18). Among teaching attendings, 6.08 deaths were expected and 2 observed, compared with 5.50 expected and 6 observed deaths among the nonteaching attendings (P = .09).
DISCUSSION
We found that nonteaching attendings, caring for patients with a lower expected mortality, nevertheless generated more complex codes and higher RVUs and payments than teaching attendings. Our principal finding, however, is that properly counter-documented resident notes could have raised reimbursements to the attending physician by a median $76 per discharge. This incentive to mastery of E and M teaching guidelines should be considered alongside the possible disadvantages of adding further complexity to the writing of progress notes.
Over the last decade, the progress note has become a complex construct. Physicians use their notes to help formulate medical thinking and communicate plans, to interpret available data, to argue for a diagnosis or diagnostic approach, and as an aid to memory. However, hospitals, insurers, and compliance regulators have also laid claim to the content of these notes. Administrative purposes that must now be addressed include documentation to optimize DRG coding, justification for continued hospitalization, charting to avoid or mitigate lawsuits, and demonstration that criteria have been met to effect a patient transfer. Most recently, hospitals and physicians have also been under pressure to itemize co-morbidities for the purpose of accurate risk adjustment. Sophistication in chart documentation to enhance physician billing adds yet another consideration to those of patient care, legal prudence, and regulatory compliance.
Prior to 1992, physician E and M services to Medicare and Medicaid patients (whether in the office or hospital) were coded at levels which reflected a first or subsequent visit and the amount of time spent with a patient.9 Actual payments, however, depended also on each physician's usual charges, leading to wide disparities in reimbursement for the same services.
In 1992, acting in concert with the HCFA, the American Medical Association published a revised CPT. This new edition included definitions and codes for E and M services that focused far more closely on individual elements of care and on the content of progress notes as an indication that the billed service was actually performed.10 The resident's role in supplying medical data for reimbursement was also defined in detail, in section 15016 of HCFA's Medicare Carrier Manual Instructions.11 Prominent in these requirements is the attending physician's obligation to summarize and comment on resident notes used to document billing. By the mid 1990s, private insurers had adopted the HCFA's documentation-based billing approach, even as guidelines became more complex, and over objections from physician groups and influential medical journals.12,13
Our finding of potentially enhanced revenue from proper counter-documentation must be viewed in light of these developments and of the other burdens borne by the daily progress note. With these considerations in mind, the financial incentive we found may not alone be sufficient to produce widespread adherence to counter-documentation guidelines, nor, indeed, is it clear that widespread adherence would be in the best interests of patient care or of teaching. For example, a recent study concluded that documentation for billing purposes is “a major detriment to the quantity of teaching on inpatient services.”14
There is, however, another and more sinister incentive for teaching physicians to learn and apply CPT documentation guidelines: the criminalization of incorrectly billed visits. Since 1995, with the inception of “Operation Restore Trust,” the Department of Health and Human Services (which oversees the HCFA) has shown increasing zeal in ensuring physician adherence to E and M guidelines, particularly when residents are involved. Resources of the criminal justice system have been brought to bear in areas formerly governed by regulatory procedure.15 For example, failure of several major teaching institutions to comply fully with the provisions of Section 15016 recently led to well-publicized investigations by the Office of the Inspector General (OIG) and to the imposition of large fines.16,17 At least in part as a result of these developments, compliance with HCFA documentation and billing standards has increased.18,19 The OIG reported that incorrectly paid Medicare reimbursements declined from 14% in 1995 to under 8% in 1999.20
Our method of examining allowable codes as a surrogate for actual billing derives credibility from such reports and allows documentation practices to be more easily standardized. Nevertheless, an important weakness of this study is the undetermined relationship between actual and allowable reimbursement for inpatient visits. Another limitation is that only patients from a single community teaching hospital were studied. Finally, the cohort from which study patients were drawn excluded those discharged from specialty care units, including the intensive care units.
We found that the detail typical of resident documentation can contribute importantly to the allowable reimbursement of supervising attending physicians, but only if these physicians become skilled at standardized counter-documentation in the progress notes. The widespread development of such skills may insulate teaching institutions against prosecution and may help fund residency training programs, but there may also be a clinical price: medical records that are cluttered, and physicians who are distracted from patient care and teaching by paperwork.
Acknowledgments
The authors wish to thank Mr. Joseph Jaeger for statistical help and Mr. J.J. Braidner and Dr. Allan Brett, who critically reviewed the manuscript.
REFERENCES
- 1. Tallia AF, Swee DE, Winter RO, Lichtig LK, Knabe FM, Knauf RA. Family practice graduate medical education and hospitals' patient care costs in New Jersey. Acad Med. 1994;69:747–53. doi: 10.1097/00001888-199409000-00021.
- 2. Deamond HS, Fitzgerald LL, Day R. An analysis of the cost and revenue of an expanded medical residency. J Gen Intern Med. 1993;8:614–8. doi: 10.1007/BF02599717.
- 3. Rosenthal E. Where it pays not to teach. New York Times. February 23, 1997:4.
- 4. Shine D, Beg S, Jaeger J, Penacak D, Panush R. Association of resident coverage with cost, length of stay, and profitability at a community hospital. J Gen Intern Med. 2001;16:1–8. doi: 10.1111/j.1525-1497.2001.00314.x.
- 5. American Medical Association. Graduate Medical Education Directory, 2000–2001. Chicago, Ill: American Medical Association; 2000.
- 6. Naessens JM, Leibson CL, Krisban I, Ballard DJ. Contribution of a measure of disease complexity (COMPLEX) to prediction of outcome and charges among hospitalized patients. Mayo Clin Proc. 1992;67:1140–9. doi: 10.1016/s0025-6196(12)61143-4.
- 7. American Medical Association. CPT 1996: Physicians' Current Procedural Terminology. Chicago, Ill: American Medical Association; 1995.
- 8. Relative Value Studies, Inc. Relative Values for Physicians. Reston, Va: St. Anthony Publishing, Inc.; 1997.
- 9. American Medical Association. CPT 1991: Physicians' Current Procedural Terminology. Chicago, Ill: American Medical Association; 1990.
- 10. American Medical Association. CPT 1992: Physicians' Current Procedural Terminology. Chicago, Ill: American Medical Association; 1991.
- 11. Health Care Financing Administration. Carrier Manual Instructions. Section 15016: Supervising Physicians in Teaching Settings. Available at: http://www.hcfa.gov/pubforms/14_car/3615000.htm#_1_8.
- 12. Editorial. Healthc Financ Manage. 1997;51:54–9.
- 13. Brett AS. New guidelines for coding physicians' services—a step backward. N Engl J Med. 1998;339:1705–8. doi: 10.1056/NEJM199812033392312.
- 14. McConville JF, Rubin DT, Humphrey H, Carson SS. Effects of billing and documentation requirements on the quantity and quality of teaching by attending physicians. Acad Med. 2001;76:1144–7. doi: 10.1097/00001888-200111000-00019.
- 15. Kalb PE. Health care fraud and abuse. JAMA. 1999;282:1163–8. doi: 10.1001/jama.282.12.1163.
- 16. Eichenwald K. Investigations of hospitals will proceed. New York Times. July 16, 1997:A10.
- 17. Clough JD. Medical McCarthyism: Medicare, teaching hospitals, and charges of health care fraud. Cleve Clin J Med. 1997;64:517–9. doi: 10.3949/ccjm.64.10.517.
- 18. Miller DD, Getsey CL. Impact of a compliance program for billing on internal medicine faculty's documentation practices and productivity. Acad Med. 2001;76:266–72. doi: 10.1097/00001888-200103000-00017.
- 19. Larios D. Impact of latest OIG compliance guidelines on physician billing practices. Tenn Med. March 1999:83–4.
- 20. Health Care Financing Administration. HCFA status report on evaluation and management documentation guidelines. June 2000:3. Available at: www.hcfa.gov/medicare/emdg20.doc. Accessed April 30, 2002.

