Author manuscript; available in PMC: 2011 Nov 1.
Published in final edited form as: Int J Med Inform. 2010 Oct 14;79(11):772–777. doi: 10.1016/j.ijmedinf.2010.09.004

The Introduction of a Diagnostic Decision Support System (DXplain) into the workflow of a teaching hospital service can decrease the cost of service for diagnostically challenging Diagnosis-Related Groups (DRGs)

Peter L Elkin 1, Mark Liebow 2, Brent A Bauer 2, Swarna Chaliki 2, Dietlind Wahner-Roedler 2, John Bundrick 2, Mark Lee 2, Steven H Brown 3,4, David Froehling 2, Kent Bailey 2, Kathleen Famiglietti 5, Richard Kim 5, Ed Hoffer 5, Mitchell Feldman 5, G Octo Barnett 5
PMCID: PMC2977948  NIHMSID: NIHMS245866  PMID: 20951080

Abstract

Background

In an era of short inpatient stays, residents may overlook relevant elements of the differential diagnosis as they try to evaluate and treat patients. However, if a resident’s first principal diagnosis is wrong, the patient’s appropriate evaluation and treatment may take longer, cost more, and lead to worse outcomes. A diagnostic decision support system may lead to the generation of a broader differential diagnosis that more often includes the correct diagnosis, permitting a shorter, more effective, and less costly hospital stay.

Methods

We provided residents on General Medicine services access to DXplain, an established computer-based diagnostic decision support system, for 6 months. We compared charges and cost of service for diagnostically challenging cases seen during the fourth through sixth month of access to DXplain (intervention period) to control cases seen in the six months before the system was made available.

Results

564 cases were identified as diagnostically challenging by our criteria during the intervention period, along with 1173 cases during the control period. Total charges were $1,281 lower (P=.006), Medicare Part A charges were $1,032 lower (P=.006), and cost of service was $990 lower (P=.001) per admission for intervention cases than for control cases.

Conclusions

Using DXplain on all diagnostically challenging cases might save our medical center over $2,000,000 a year on the General Medicine Services alone. Using clinical diagnostic decision support systems may improve quality and decrease cost substantially at teaching hospitals.

Keywords: Expert Systems, Clinical Decision Support, Medical Education, Medical Economics

Introduction

Case-based payment systems for hospitals such as Medicare’s Prospective Payment System create strong financial incentives for short stays and limited inpatient testing. Residents tell us they are aware of these incentives. Residents admitting patients are anxious to develop a working diagnosis, confirm the diagnosis as rapidly as possible, treat the patient until stable, and then discharge the patient for further care as an outpatient. However, a rapid pace can discourage the contemplation of the condition of newly admitted patients, even though admission is often the time when such contemplation can be most useful. Lack of time to consider a differential diagnosis, especially for the elderly, complex patients who make up an increasing percentage of our inpatient services, can lead to an incomplete differential diagnosis and, from there, premature closure on an incorrect diagnosis. An incorrect diagnosis can increase costs and length-of-stay while reducing quality of care, a doubly undesirable outcome.

An adequate differential diagnosis is a way to avoid premature diagnostic closure (PDC). Keeping several diagnoses in mind makes it easier to change a working diagnosis that is becoming less plausible as a patient’s evaluation progresses. Having a differential diagnosis reduces the temptation to interpret the data to fit a single diagnosis to which the resident may have become attached.

Computerized decision support systems may help residents generate a more thorough differential diagnosis. DXplain is one such system: a computer-based medical education, reference, and decision support system (“expert system”) that has continued to evolve since its initial development in the mid-1980s by the Laboratory of Computer Science at Massachusetts General Hospital.[1,2] DXplain has characteristics of a decision support tool, a medical reference system, and an electronic medical textbook. As a decision support tool, it uses an interactive format to collect clinical information and then draws on data on the crude probabilities of about 5000 clinical manifestations (history, examination findings, laboratory and imaging data) to generate, from the manifestations present in a patient, a differential diagnosis ranked by the probability of each diagnosis. The database is continuously updated, and the system uses a modified form of Bayesian logic along with numerous heuristics to produce its differential diagnosis. DXplain also lists other findings that, if present, would support a diagnosis, as well as findings entered by the user that are not usually found in a given disease. As a medical textbook and reference, DXplain provides a comprehensive description and selected references for more than 2200 different diseases, emphasizing the signs and symptoms present in each disease, the etiology, the pathology, and the prognosis.[3,4] DXplain has been used by thousands of physicians and medical students for clinical assistance over the last twenty years. Ten years ago, the Laboratory of Computer Science began to make DXplain available over the Internet to hospitals, medical schools, and other medical organizations.[5]

How can an expert system help? A first-year medical resident, not yet an experienced clinician, may admit 5 patients, some of whom are diagnostically challenging, in 24 hours while being responsible for up to 30 inpatients at night. The resident may have little time to read about or consider the differential diagnoses for the newly admitted patients. Using an expert system may help a resident generate an adequate differential diagnosis quickly and easily while also teaching medical knowledge and diagnostic strategies that may be useful for future patients. Faculty can also glean teaching points from the differential diagnoses generated by the expert system.

We undertook this exploratory study to see whether the use of DXplain as a clinical decision support system has the potential to decrease the cost of care for diagnostically challenging cases, as compared with usual practice, on General Medicine services in a teaching hospital.

Computer-based Diagnostic Decision Support Systems

As early as 1959, Ledley and Lusted suggested that computers could help doctors in the diagnostic process.[6] Many papers subsequently appeared demonstrating the accuracy of medical diagnosis by computer, generally in a very limited field such as thyroid disease or congenital heart disease. Few of these early systems were used outside the environment of their developers because of their limited knowledge bases, poor user interfaces, and the many obstacles to sharing computer systems in the early 1960s. In the current environment of the Internet and widespread availability of personal computers, the potential for routine use of decision-support systems to assist health professionals in the diagnostic process has become reality. Tim de Dombal at the University of Leeds created the first abdominal pain diagnosis program based on Bayesian probability theory. The system differentiated among appendicitis, diverticulitis, perforated ulcer, cholecystitis, small-bowel obstruction, pancreatitis, and non-specific abdominal pain using data acquired from thousands of patient presentations.[7] Ted Shortliffe, while at Stanford University, developed a program named MYCIN, which provided consultation regarding the empiric antibiotic management of infectious diseases.[8] MYCIN used production rules consisting of conditional statements in If/Then form (e.g., If the site of the infection is the meninges and the patient is 15–55 years old, Then the likely organisms causing the infection include Streptococcus pneumoniae, etc.).[9] This methodology falls under the general computer science category of artificial intelligence.[10]
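To make the If/Then idea concrete, the minimal sketch below shows how such a production rule can be represented and evaluated in code; the rule, the patient facts, and the returned organisms are purely illustrative, and MYCIN's actual rules were written in Lisp and carried certainty factors that are omitted here.

    # Minimal sketch of an If/Then production rule of the kind described above.
    # Everything here is illustrative: MYCIN's real rules were written in Lisp
    # and carried certainty factors, which are omitted.
    def meningitis_rule(facts):
        """IF the infection site is the meninges AND the patient is 15-55 years old
        THEN the likely organisms include Streptococcus pneumoniae (etc.)."""
        if facts.get("infection_site") == "meninges" and 15 <= facts.get("age", 0) <= 55:
            return ["Streptococcus pneumoniae", "Neisseria meningitidis"]
        return []

    patient = {"infection_site": "meninges", "age": 32}
    print(meningitis_rule(patient))  # ['Streptococcus pneumoniae', 'Neisseria meningitidis']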

Homer R. Warner at the University of Utah created the HELP system, an integrated hospital information system with associated decision support.[11,12] The HELP system incorporated a complete electronic medical record within a hospital information system. Rules in the HELP system were written in a predefined syntax that eventually became standardized as the Arden Syntax.[13] Each complete rule set is called a medical logic module, and each such module reaches its own conclusions.[14]
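The Arden Syntax packages each rule as a medical logic module with named slots (its knowledge category contains, among others, evoke, logic, and action parts). The sketch below mimics that slot structure in plain Python to show how a self-contained module reaches its own conclusion; it is schematic, the hypokalemia example is invented, and it is not actual Arden Syntax.

    # Schematic stand-in for a medical logic module: a self-contained rule with
    # evoke/logic/action parts, loosely following the slot structure standardized
    # by the Arden Syntax. Illustrative Python only, not Arden Syntax itself.
    low_potassium_mlm = {
        "library": {"title": "Hypokalemia alert (illustrative)"},
        "knowledge": {
            "evoke": lambda event: event.get("type") == "lab_result"
                                   and event.get("test") == "potassium",
            "logic": lambda event: event.get("value", 99.0) < 3.0,
            "action": lambda event: "Alert: potassium %.1f mmol/L is low." % event["value"],
        },
    }

    def run_mlm(mlm, event):
        k = mlm["knowledge"]
        if k["evoke"](event) and k["logic"](event):
            return k["action"](event)
        return None

    print(run_mlm(low_potassium_mlm, {"type": "lab_result", "test": "potassium", "value": 2.7}))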

Randy Miller and Jack Myers created the Quick Medical Reference (QMR), a diagnostic decision support system for general medicine.[15] QMR was employed at the University of Pittsburgh as a consult service, which functioned under the model that a physician with a computerized clinical diagnostic decision support system is more effective at making diagnoses than the physician alone.[16] In QMR, manifestations are associated with diagnoses, and the positive association of each manifestation is graded by its frequency of occurrence and by its evoking strength (i.e., how often one should think of this diagnosis on noting this particular manifestation). Manifestations and diagnoses are also graded by their importance, and this information is used together with the weightings to provide a ranked list of diagnoses for a given set of manifestations.[17]
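A toy illustration of this weighting scheme follows; the disease profiles, the numeric scales, and the additive combining rule are invented for illustration and do not reproduce QMR's actual knowledge base or scoring algorithm.

    # Toy illustration of frequency / evoking-strength weighting: each disease
    # profile links manifestations to a frequency score and an evoking strength,
    # and a ranked differential is produced by summing the weights of the
    # manifestations that are present. Profiles and scales are invented.
    DISEASE_PROFILES = {
        "Iron deficiency anemia": {
            "fatigue":      {"frequency": 4, "evoking_strength": 1},
            "pallor":       {"frequency": 3, "evoking_strength": 2},
            "low ferritin": {"frequency": 5, "evoking_strength": 4},
        },
        "Hypothyroidism": {
            "fatigue":      {"frequency": 4, "evoking_strength": 1},
            "weight gain":  {"frequency": 3, "evoking_strength": 2},
            "elevated TSH": {"frequency": 5, "evoking_strength": 5},
        },
    }

    def rank_differential(findings):
        scores = {}
        for disease, profile in DISEASE_PROFILES.items():
            scores[disease] = sum(
                profile[f]["frequency"] + profile[f]["evoking_strength"]
                for f in findings if f in profile
            )
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    # A patient with fatigue, pallor, and a low ferritin ranks iron deficiency first.
    print(rank_differential(["fatigue", "pallor", "low ferritin"]))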

DXplain, a computer-based medical education, reference, and decision support system, was developed in the early 1980s by the Laboratory of Computer Science (LCS) at Massachusetts General Hospital (MGH).[18,19] DXplain has the characteristics of an electronic medical textbook, a medical reference system, and a decision support tool. In the role of a medical textbook, DXplain can provide a comprehensive description and selected references for over 2,300 different diseases, emphasizing the signs and symptoms that occur in each disease, the etiology, the pathology, and the prognosis. As a decision support tool, DXplain uses its knowledge base of the crude probabilities with which approximately 5,000 clinical manifestations (history, physical examination findings, laboratory data, radiographic findings, and elements of the past medical history) are associated with individual diseases, and generates from them a differential diagnosis.[20,21] The system uses an interactive format to collect clinical information and makes use of a modified form of Bayesian logic to produce a ranked list of diagnoses that might be associated with the clinical manifestations. DXplain uses this same knowledge base and logic to list other findings that, if present, would support a particular disease, and also lists which findings entered by the user are not usually found in a particular disease. The system also provides references and disease descriptions for each of the diagnoses in its database.[22]
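As a rough sketch of how a ranked differential can be produced from crude finding probabilities, the code below applies a naive Bayes-style score and also lists the findings that would further support the leading diagnosis; the diseases, priors, and conditional probabilities are invented, and DXplain's actual knowledge base, modified Bayesian logic, and heuristics are considerably richer than this.

    import math

    # Naive Bayes-style sketch of ranking a differential from crude finding
    # probabilities. All numbers are invented for illustration; DXplain's scoring
    # combines a modified Bayesian approach with many additional heuristics.
    KB = {
        # disease: (prior, {finding: P(finding | disease)})
        "Pulmonary embolism": (0.02, {"dyspnea": 0.8, "pleuritic chest pain": 0.5,
                                      "hemoptysis": 0.1, "tachycardia": 0.4}),
        "Pneumonia":          (0.05, {"dyspnea": 0.6, "fever": 0.8,
                                      "productive cough": 0.7, "tachycardia": 0.3}),
    }
    BASELINE = 0.05  # assumed probability of a finding not listed for a disease

    def rank(findings_present):
        scored = []
        for disease, (prior, p_given_d) in KB.items():
            log_score = math.log(prior)
            for f in findings_present:
                log_score += math.log(p_given_d.get(f, BASELINE))
            scored.append((disease, log_score))
        return sorted(scored, key=lambda x: x[1], reverse=True)

    def supporting_findings_not_entered(disease, findings_present):
        # Other findings that, if present, would support this diagnosis.
        return [f for f in KB[disease][1] if f not in findings_present]

    present = ["dyspnea", "pleuritic chest pain", "tachycardia"]
    print(rank(present))
    print(supporting_findings_not_entered("Pulmonary embolism", present))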

Over the past sixteen years DXplain has been used by thousands of physicians and medical students. Nine years ago, LCS began to make DXplain available over the Internet to hospitals, medical schools, and medical organizations.[23]

Methods

The study was conducted on the General Medicine services at Saint Mary’s Hospital, a 1200-bed hospital operated by the Mayo Clinic in Rochester, Minnesota. The control period was April through September 2000. Information on length of stay, charges, and cost of care was collected on “diagnostically challenging” cases. From October 2000 through March 2001, DXplain was made available to residents doing month-long rotations on the 5 General Medicine services. First-year residents attended a presentation on DXplain and were encouraged to use it for cases they considered “challenging”. They could use the results from DXplain to expand their differential diagnosis or to discuss diagnostic and treatment strategy with a supervising resident or the attending physician. Supervising third-year residents also attended the presentation on DXplain and were encouraged to ask their first-year residents about their use of DXplain. We allowed a three-month run-in (October through December 2000) after DXplain was first introduced so residents could become familiar with the program before our “intervention period” of January through March 2001. Our study period consisted of the control period and the intervention period. Some residents may have done rotations on General Medicine services in both the control and intervention periods, but no resident did more than one rotation during the intervention period. Resident rotations were determined by the residency program and were not influenced by the investigators.

We required residents to use individually identifiable passwords so we could record how often residents signed on to the system. We did not record patient identifiers or how many cases were entered by each resident in a session.

Two Mayo Clinic general internist faculty members who were unaware of the study hypotheses retrospectively examined the Diagnosis-Related Groups (DRGs) assigned to admissions during the study period and divided them into “diagnostically challenging” (e.g. DRG 179-inflammatory bowel disease) and “not diagnostically challenging” (e.g. DRG 236-fractures of the hip and pelvis). For most cases, clinicians do not need an expert system to diagnose a hip fracture. The two physicians agreed on all assignments. We compared total charges, Medicare “Part A” charges, and total costs from admissions in diagnostically challenging DRGs in the control period to those in the intervention period. We used the Mayo Clinic cost accounting system to calculate costs and charges. Mayo Clinic has a cost data warehouse that allows accurate determination of the costs of providing services to patients, using standard cost accounting procedures. These costs are described in “real” dollars from a constant time base. Because costs and charges were not normally distributed, we used a logarithmic transformation to estimate the mean differences in costs and charges (i.e. savings from the intervention). We estimated the mean total decrease in costs and charges by applying the percentage change implied by the mean log difference to the mean costs and charges during the control period. Charges and costs in the intervention and control periods were compared using the rank sum test. We calculated a two-tailed P value for the rank sum results and considered a result of 0.05 or less as significant.
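For concreteness, the snippet below sketches this analysis pipeline with synthetic stand-in data: a two-tailed rank sum test on the raw values, a mean difference on the log scale, and the conversion of that difference into an estimated per-admission saving by applying the implied percentage change to the control-period mean. The lognormal parameters are invented; the actual patient-level data are not reproduced here.

    import numpy as np
    from scipy.stats import ranksums

    # Sketch of the analysis described above, with synthetic stand-in cost data.
    rng = np.random.default_rng(0)
    control_costs = rng.lognormal(mean=8.6, sigma=0.9, size=1173)
    intervention_costs = rng.lognormal(mean=8.5, sigma=0.9, size=564)

    # Two-tailed rank sum test on the raw (skewed) costs.
    statistic, p_value = ranksums(control_costs, intervention_costs)

    # Mean difference on the log scale and the percent reduction it implies.
    mean_log_diff = np.log(control_costs).mean() - np.log(intervention_costs).mean()
    percent_reduction = 1 - np.exp(-mean_log_diff)

    # Estimated saving per admission, applied to the control-period mean cost.
    estimated_saving = percent_reduction * control_costs.mean()

    print(f"rank sum P = {p_value:.4f}; reduction = {percent_reduction:.1%}; "
          f"saving per admission ~ ${estimated_saving:,.0f}")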

Results

During the intervention period residents used DXplain 323 times; eleven entries were specified as “hypothetical”. Because logons were recorded by session rather than by case, we do not know how many cases were entered during each DXplain session, and we did not record patient identifiers or how many cases were entered by each individual resident.

1173 cases were identified as being in the “diagnostically challenging” DRGs during the control period and 564 during the intervention period, representing about 190 cases a month. Control period costs and charges are shown in Table IV. The mean total charges per case fell from $12,684 in the control period to $11,403 in the intervention period, while Part A charges fell from $10,422 to $9,390, as noted in Table V. Costs fell from $8,318 to $7,328. This represents a 10% drop in charges and a 12% drop in costs (see Tables III and VI). These differences are statistically significant, as noted in Tables I and II.

Table IV.

Control Period Costs and Charges prior to instituting DXplain on the Hospital Services.

Charges and Costs Mean Median Geometric Mean
Total charges $12,684 $8,010 $8,074.77
Part A charges $10,422 $6,799 $6,668.83
Costs $8,318 $5,454 $5,468.17

Table V.

Costs in the diagnostically challenging DRGs were significantly lower for the DXplain group than for the controls, by an average of $990 per admission.

Measure N for Control N for DXplain Mean for Control Mean for DXplain Median for DXplain 25th Percentile for Control 25th Percentile for DXplain
Total charges 1173 564 $12,684 $11,403 $7,173.32 $4,324 $3,794
Costs 1173 564 $8,318 $7,328 $4,779.64 $2,928 $2,568
Part A charges 1173 564 $10,422 $9,390 $5,832.46 $3,565 $3,166
Length of stay (days) 1173 564 4.14 3.99 3.00 2.00 2.00
Measure (continued) 75th Percentile for Control 75th Percentile for DXplain Percent Reduction Confidence Interval for Percent Reduction Two-Sample T-test P (Winsorized) Two-Tail P for Rank Sum Z Statistic
Total charges $15,607 $13,367 10.1% (1.2% to 18.2%) 0.0180 0.0056
Costs $9,957 $8,571 11.9% (3.7% to 19.5%) 0.0054 0.0012
Part A charges $12,677 $11,157 9.9% (1.1% to 18.0%) 0.0173 0.0058
Length of stay (days) 5.00 5.00 3.62% — 0.88993 0.42580

Table III.

Relative decreases in total charges, Part A charges, non-Part A charges, and costs.

Reduction in Charges Percent Reduction with Confidence Interval
Total charges reduction 10.1% (1.2% to 18.2%)
Part A Charges reduction 9.9% (1.1% to 18.0%)
Non-A charges reduction 10.5% (0.3% to 19.6%)
Costs reduction 11.9% (3.7% to 19.5%)

Table VI.

The log scale differences between the DXplain and Control groups.

Measure Log Mean Difference (Control vs. DXplain) Confidence Interval Percent Reduction Two-Sample T-test P (Winsorized) Two-Tail P for Rank Sum Z Statistic
Total charges 0.1063 0.0117 to 0.2009 10.1% (1.2% to 18.2%) 0.0180 0.0056
Costs 0.1270 0.0376 to 0.2164 11.9% (3.7% to 19.5%) 0.0054 0.0012
Part A charges 0.1046 0.0110 to 0.1983 9.9% (1.1% to 18.0%) 0.0173 0.0058

Table I.

Statistically significant improvements were seen in the DXplain group for total charges, Part A charges, and non-Part A charges, as well as for total costs.

Charges Significance
For total charges p=0.0056
Part A charges p=0.0058
Non-A charges p=0.0136
Total costs p=0.0012

Table II.

The mean differences and confidence intervals on a log scale for charges and costs.

Measure Mean Log Difference Confidence Interval
Total charges 0.1063 0.0117 to 0.2009
Part A charges 0.1046 0.0110 to 0.1983
Non-A charges 0.1107 0.0032 to 0.2183
Costs 0.1270 0.0376 to 0.2164

Using the rank sum test, we tested the significance of the cost and charge reductions seen in the intervention group using DXplain (Table I). The mean differences and confidence intervals on the log scale are given in Table II; these were transformed to relative reductions in charges and costs, yielding the mean reductions and confidence intervals provided in Table III. Table IV shows the costs and charges for the control group. Table V compares the costs, charges, Part A charges, and length of stay between the intervention and control groups, and Table VI presents the log-scale differences between the intervention and control groups for charges, costs, and Part A charges.
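The published log-scale differences (Table VI) back-transform to the relative reductions of Table III via 1 - exp(-d), and applying those reductions to the control-period means in Table IV approximately reproduces, to within rounding, the per-admission dollar savings quoted in the abstract. The short check below uses only the published values.

    import math

    # Back-transform the published log-scale mean differences (Table VI) into the
    # relative reductions of Table III, then into approximate per-admission dollar
    # savings using the control-period means of Table IV.
    log_diffs = {"Total charges": 0.1063, "Part A charges": 0.1046, "Costs": 0.1270}
    control_means = {"Total charges": 12684, "Part A charges": 10422, "Costs": 8318}

    for name, d in log_diffs.items():
        reduction = 1 - math.exp(-d)               # e.g. 1 - exp(-0.1063) = 10.1%
        saving = reduction * control_means[name]   # e.g. 0.101 * $12,684 = ~$1,280
        print(f"{name}: {reduction:.1%} reduction, about ${saving:,.0f} per admission")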

Discussion

In this exploratory study, costs, charges, and length of stay decreased for diagnostically challenging cases when residents had access to the decision support system DXplain, compared with the control period without DXplain availability. We recognize this is a correlation and does not prove causation. However, we believe that access to DXplain is the most likely explanation for the decrease, as there were no other changes we believe could account for it. The decreases are unlikely to reflect improving clinical skills of the residents over the academic year: the intervention period fell, on average, neither earlier nor later in the academic year than the control period (three of the control months fell earlier and three fell later), we did not see such changes in comparable time periods in other academic years, and the average level of experience was identical in the intervention and control periods. The case mix did not vary substantially between the control and intervention periods. It is possible, though unlikely, that there was substantial variation in the acuity of cases within DRGs between the control and the intervention periods.

How could the use of DXplain have helped to reduce the cost of care? We believe it may have done so by helping residents generate a broader differential diagnosis, one that more often included the correct diagnosis, in the minds of the physicians caring for a patient. It may seem paradoxical that evaluating a patient with several diagnoses in mind could be cheaper than doing so with only one, until one considers how expensive it is to pursue a single but incorrect diagnosis. In addition, the ranked differential diagnosis provided by DXplain may focus a clinical team on evaluating the more plausible diagnoses first, which we believe will usually reach the correct diagnosis sooner and at lower cost; this should also improve quality of care. Use of the system did represent a time commitment on the part of the residents; however, they were uniformly happy with the interaction and felt that it was time well spent.[24] Costs per case averaged $990 less in the intervention period than in the control period, for a savings of $558,360 for the 564 diagnostically challenging cases seen in three months. Over a year, assuming this many diagnostically challenging cases each quarter, savings on the General Medicine services might exceed $2,250,000. Although a lower percentage of cases on other internal medicine services would be considered diagnostically challenging, the annual savings for the internal medicine patients at Saint Mary’s Hospital might exceed $4,000,000. While this is only a small fraction of the total cost of caring for internal medicine patients at Saint Mary’s, the savings would go directly toward the operating margin for such patients. That operating margin is small, and $4,000,000 a year would be a substantial resource for our institution. Because payment for each case is essentially fixed under its DRG, the decrease in Part A charges would not count against the $990 saving associated with each diagnostically challenging case. Instead, the drop in Part A charges explains the savings by showing that, on average, less billable service was provided to the intervention group.
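As a back-of-envelope check on the extrapolation above, using only the figures given in the text (a $990 average saving per case, 564 cases per quarter, and roughly 190 diagnostically challenging cases a month):

    # Back-of-envelope reproduction of the savings extrapolation, using figures
    # stated in the text; the whole-department and nationwide estimates in the
    # paragraph above involve additional assumptions not modeled here.
    saving_per_case = 990
    cases_per_quarter = 564
    cases_per_month = 190

    quarterly_saving = saving_per_case * cases_per_quarter   # $558,360
    annual_saving = saving_per_case * cases_per_month * 12   # $2,257,200

    print(f"Quarterly: ${quarterly_saving:,}   Annual: ${annual_saving:,}")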

Access to DXplain was provided at no charge for this study by Massachusetts General Hospital. The annual cost for an unlimited-use license for an institution the size of Mayo Clinic is $4000–$6000 a year. This return on investment may seem too good to be true, but may be real if using the decision support system leads to a substantial and systematic improvement in patient care. While further studies are needed to see if similar savings could be obtained in other settings, if these results could be generalized nationwide, savings for Medicare patients alone could approach $100 million annually.

We believe clinical decision support systems such as DXplain can facilitate resident education in the hospital. Residents now do much of their work using a computer, so it would be a small step to integrate a high-powered, user-friendly diagnostic clinical decision support system into the routines of admitting residents. It should at least expand their differential diagnoses while providing the “just-in-time” learning considered an appropriate educational method for adult learners. The system may be particularly useful under resident work-hour restrictions, which have led to reduced time to read and often to decreased time with faculty. The system may complement the traditional faculty role by encouraging residents to interpret clinical data in the context of additional diagnoses.

Clinical decision support system (CDSS) alerts have been a challenge to healthcare organizations and to clinical practice.[25] Our study suggests an alternative way to think about computer-based decision support, as a means of improving the value associated with our clinical practice.

By helping trainees with differential diagnoses, a clinical decision support system may free faculty to focus on other important issues, such as management of existing problems, appropriate use of laboratory testing and imaging studies, and how to use consulting services. Faculty can also review the system’s differential diagnosis to help address how to prioritize diagnoses and how well this patient’s clinical data fit each of the suggested diagnoses.

Limitations

There are several limitations to our study. The study was conducted at one hospital. DXplain may not have been used for all diagnostically challenging patients. We did not compare outcomes of patients in the intervention period with those of patients in the control period, nor did we compare outcomes for patients for whom DXplain was used with those for whom it could have been used but was not. Further studies at multiple centers could address some of these limitations. We used historical controls (half with more experience and half with less than the intervention group) rather than concurrent controls, both to avoid contamination between the control and intervention groups and because we believed case mix, acuity, and average trainee skill level would not differ substantially between the control period and the intervention period.

Summary

Costs and charges for diagnostically challenging cases decreased significantly after medical residents were given access to DXplain and encouraged to use it to help with developing a differential diagnosis for newly admitted patients. These results, though preliminary, suggest clinical decision support systems may help trainees manage inpatient admissions more efficiently. There may also be educational benefits to trainees and faculty who use a decision support system. Further studies are needed to show that our results are reproducible in other settings, to show outcomes are improved by using such a system, and to understand how using a decision support system changes resident thinking and behavior.

Expert systems such as DXplain can be useful knowledge sources for clinicians. There is more and more that clinicians are required to know every day, so the need for tools that deliver high-quality, patient-specific information just when it is most useful continues to grow. As the era of personalized medicine grows closer, clinicians must become comfortable in the role of knowledge seeker, and informatics specialists must produce methods and tools to assist busy clinicians in their efforts toward best practices.

Acknowledgments

This work has been supported in part by a grant from the National Library of Medicine LM06918, and grants from the Centers for Disease Control and Prevention PH000022 and HK00014 and in part by no-cost access to DXplain from the Laboratory of Computer Science, Massachusetts General Hospital.

The authors thank Inna Gurewitz, MPH for her assistance in preparation of this manuscript.

Footnotes

Authors Contributions:

Specific contributions from each author:

- Peter L. Elkin, M.D. – Designed and ran the study, authored the manuscript, and contributed to the study design, review and commentary

- Mark Liebow, M.D. – Authored the manuscript, contributed to the study design, review and commentary

- Brent A. Bauer, M.D. – Contributed to the study design, review and commentary

- Swarna Chaliki, M. D. – Contributed to the study design, review and commentary

- Dietlind Wahner-Roedler, M.D. – Contributed to the study design, review and commentary

- John Bundrick, M.D. – Contributed to the study design, review and commentary

- Mark Lee, M.D. – Contributed to the study design, review and commentary

- Steven H. Brown, M.D. – Contributed to the study design, review and commentary

- David Froehling, M.D. – Contributed to the study design, review and commentary

- Kent Bailey, Ph.D. – Contributed to the study design, review and commentary

- Kathleen Famiglietti – Contributed to the study design, review and commentary

- Richard Kim – Contributed to the study design, review and commentary

- Ed Hoffer, M.D. – Contributed to the study design, review and commentary

- Mitchell Feldman, M.D. – Contributed to the study design, review and commentary

- G. Octo Barnett, M.D. – Contributed to the study design, review and commentary

Conflicts of Interest: Massachusetts General Hospital holds the Intellectual Property Rights for DXplain.

Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final citable form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

References

1. Barnett GO, Cimino JJ, Hupp JA, Hoffer EP. DXplain - an evolving diagnostic decision-support system. JAMA. 1987 Jul 3;258(1):67-74. doi: 10.1001/jama.258.1.67. (This was the original reference on DXplain.)
2. Packer MS, Hoffer EP, Barnett GO, Famiglietti KT, Kim RJ, McLatchey JP, et al. Evolution of DXplain: A Decision Support System. Proc Annu Symp Comput Appl Med Care. 1989 Nov 8:949-951.
3. Elkin PL, Barnett GO, Famiglietti KT, Kim RJ. Closing the Loop on Diagnostic Decision Support Systems. Proc Annu Symp Comput Appl Med Care. 1990 Nov 7:589-593.
4. Elkin PL, McLatchey J, Packer M, Hoffer E, Cimino C, Studney D, et al. Automated Batch Searching of MEDLINE for DXplain. Proc Annu Symp Comput Appl Med Care. 1989 Nov 8:436-440.
5. Barnett GO, Famiglietti KT, Kim RJ, Hoffer EP, Feldman MJ. DXplain on the Internet. Proc AMIA Symp. 1998:607-611. (This is the primary paper about DXplain availability over the Internet.)
6. Ledley RS, Lusted LB. The role of computers in medical diagnosis. Med Dok. 1961 Jul;5:70-8.
7. de Dombal T, Clamp S, Margulies M, Chan M. Computer training for doctors and students. BMJ. 1994 Nov 5;309(6963):1234-5. doi: 10.1136/bmj.309.6963.1234c.
8. Shortliffe EH, Axline SG, Buchanan BG, Merigan TC, Cohen SN. An artificial intelligence program to advise physicians regarding antimicrobial therapy. Comput Biomed Res. 1973 Dec;6(6):544-60. doi: 10.1016/0010-4809(73)90029-3.
9. Shortliffe EH, Davis R, Axline SG, Buchanan BG, Green CC, Cohen SN. Computer-based consultations in clinical therapeutics: explanation and rule acquisition capabilities of the MYCIN system. Comput Biomed Res. 1975 Aug;8(4):303-20. doi: 10.1016/0010-4809(75)90009-9.
10. Shortliffe EH, Buchanan BG. Artificial intelligence. N Engl J Med. 1980 Jun 26;302(26):1482.
11. Warner HR, Olmsted CM, Rutherford BD. HELP--a program for medical decision-making. Comput Biomed Res. 1972 Feb;5(1):65-74. doi: 10.1016/0010-4809(72)90007-9.
12. Pryor TA, Gardner RM, Clayton PD, Warner HR. The HELP system. J Med Syst. 1983 Apr;7(2):87-102. doi: 10.1007/BF00995116.
13. Hripcsak G, Wigertz OB, Kahn MG, Clayton PD, Pryor TA. ASTM E31.15 on health knowledge representation: the Arden Syntax. Stud Health Technol Inform. 1993;6:105-12.
14. Pryor TA. The use of medical logic modules at LDS hospital. Comput Biol Med. 1994 Sep;24(5):391-5. doi: 10.1016/0010-4825(94)90007-8.
15. Miller R, Masarie FE, Myers JD. Quick medical reference (QMR) for diagnostic assistance. MD Comput. 1986 Sep-Oct;3(5):34-48.
16. Bankowitz RA, McNeil MA, Challinor SM, Parker RC, Kapoor WN, Miller RA. A computer-assisted medical diagnostic consultation service. Implementation and prospective evaluation of a prototype. Ann Intern Med. 1989 May 15;110(10):824-32. doi: 10.7326/0003-4819-110-10-824.
17. Giuse DA, Giuse NB, Miller RA. A tool for the computer-assisted creation of QMR medical knowledge base disease profiles. Proc Annu Symp Comput Appl Med Care. 1991:978-9.
18. Barnett GO, Hoffer EP, Packer MS, Famiglietti KT, Kim RJ, Cimino C, et al. DXplain - Important Issues in the Development of a Computer Based Decision Support System. Proc Annu Symp Comput Appl Med Care. 1990 Nov 7:1013.
19. Barnett GO, Cimino JJ, Hupp JA, Hoffer EP. DXplain - an evolving diagnostic decision-support system. JAMA. 1987 Jul 3;258(1):67-74. doi: 10.1001/jama.258.1.67. (This was the original reference on DXplain.)
20. Packer MS, Hoffer EP, Barnett GO, Famiglietti KT, Kim RJ, McLatchey JP, et al. Evolution of DXplain: A Decision Support System. Proc Annu Symp Comput Appl Med Care. 1989 Nov 8:949-951.
21. Elkin PL, Barnett GO, Famiglietti KT, Kim RJ. Closing the Loop on Diagnostic Decision Support Systems. Proc Annu Symp Comput Appl Med Care. 1990 Nov 7:589-593.
22. Elkin PL, McLatchey J, Packer M, Hoffer E, Cimino C, Studney D, et al. Automated Batch Searching of MEDLINE for DXplain. Proc Annu Symp Comput Appl Med Care. 1989 Nov 8:436-440.
23. Barnett GO, Famiglietti KT, Kim RJ, Hoffer EP, Feldman MJ. DXplain on the Internet. Proc AMIA Symp. 1998:607-611. (This is the primary paper about DXplain availability over the Internet.)
24. Bauer BA, Lee M, Bergstrom L, Wahner-Roedler D, Bundrick J, Litin S, et al. Internal medicine resident satisfaction with a diagnostic decision support system (DXplain) introduced on a teaching hospital service. Proc AMIA Symp. 2002:31-5.
25. Koppel R, Metlay JP, Cohen A, Abaluck B, Localio AR, Kimmel SE, et al. Role of computerized physician order entry systems in facilitating medication errors. JAMA. 2005 Mar 9;293(10):1197-203. doi: 10.1001/jama.293.10.1197.
