Author manuscript; available in PMC: 2021 Dec 1.
Published in final edited form as: Healthc (Amst). 2020 Oct 22;8(4):100488. doi: 10.1016/j.hjdsi.2020.100488

Challenges Involved in Establishing a Web-based Clinical Decision Support Tool in Community Health Centers

Rachel Gold a,b, Mary Middendorf b, John Heintzman b, Joan Nelson b,1, Patrick O’Connor c, JoAnn Sperl-Hillen c, Deepika Appana c, Erik Geissal b,2, Vijay Thirumalai c, Christina Sheppler a, Maryjane Dunne b
PMCID: PMC7680381  NIHMSID: NIHMS1642819  PMID: 33132174

Background

Clinical decision support (CDS) systems (CDSS) that summarize a patient’s cardiovascular disease (CVD) risks and provide individualized care recommendations can improve rates of guideline-concordant CVD care.[1-16] Some form of CDSS is in place in most US healthcare systems; by 2017 almost all hospitals and 80% of other care settings had implemented some CDS functionalities,[17] but use of such CDS varies widely.[18, 19] Deployment of a given CDSS beyond the care system in which it was developed is often limited,[20] in large part because guideline-based CDSS must be continually updated as new evidence emerges, which requires resources including clinical and programming expertise.[21] Care systems might minimize the burden of maintaining CDSS by sharing a single, regularly updated, web-based CDSS that integrates with their EHR – a ‘hub-and-spoke’ approach to CDSS.

A 2018 report from the Office of the National Coordinator for Health Information Technology called for expanded efforts to enable CDSS that can integrate data from different sources,[21] and described the barriers to doing so. One such barrier – lack of standardization of data elements – may be addressed by efforts currently underway (such as Fast Healthcare Interoperability Resources (FHIR),[22] CDS Hooks,[23] and the Logical Observation Identifiers Names and Codes (LOINC) standardization of medical laboratory observations).[24] However, other challenges remain, and guidance on establishing hub-and-spoke CDSS in primary care is needed.

CV Wizard©, developed at HealthPartners Institute (HPI),[15] provides encrypted CDS generated from a single firewall-protected web service. A build package installed in the EHR initiates a secure data transfer (e.g., selected vital signs, laboratory test results, medications, diagnoses, allergies, along with a unique patient identifier code) to the web service. The web service processes these data through algorithms that are informed by current national guidelines [25-28] and updated when new evidence-based guidelines are released. (For example, when the guidelines defining blood pressure control changed in 2017, CV Wizard incorporated them within a few months.) The CV Wizard© system alerts clinic staff when a given patient’s modifiable 10-year CVD risk is ≥10%. When staff click on a URL link within the alert, CV Wizard© generates a personalized, prioritized summary of the patient’s modifiable CVD risks (blood pressure [BP], Body Mass Index [BMI], lipids, glycated hemoglobin [A1c], smoking, aspirin use) for point-of-care review. A clinician summary includes care recommendations that account for the patient’s medications, allergies, comorbidities, etc. (Figure 1). A low-literacy patient summary can be printed (in English or Spanish) and given to patients to facilitate shared decision-making (Figure 2). CV Wizard© also enables users to send questions or criticisms about the tool’s care suggestions to the Wizard team[15] for review and resolution as indicated. CV Wizard’s seamless presentation within the EHR, individualized and prioritized recommendations, adaptation to a patient’s changing clinical state, and feedback functionality make it one of the most sophisticated CDSS tools for CVD risk management.

Figure 1. Example of CV Wizard© Clinician Summary

Figure 2. Example of CV Wizard© Patient Summary

CV Wizard’s adoption rates and user satisfaction at HealthPartners are high, and its use significantly improved patient outcomes in that setting.[15, 29, 30] CV Wizard is currently in use at multiple integrated care systems in 11 U.S. states, with an average of 250,000 web service calls per day. Its application in these systems required creating a business associate agreement and a service agreement to permit secure data transfer between each system’s EHR and the CV Wizard web service, and took 4-6 months for the data mapping, programming, and testing required for implementation.

Given CV Wizard’s successful adoption in the HealthPartners care delivery system, we are formally testing its effectiveness in the community health center (CHC) context, in a 5-year randomized trial (R01HL133793). This paper describes the challenges encountered and addressed in establishing the CV Wizard CDSS in the CHC setting, as part of this trial. We present this work to guide others and facilitate the spread of ‘hub-and-spoke CDSS’ in CHCs and other primary care settings.

Organizational Context

OCHIN, Inc., is a non-profit organization that provides information technology support, including a shared Epic© EHR, to >600 CHC clinics in 47 states. These CHCs’ socioeconomically vulnerable patients have high rates of hypertension, lipid disorders, obesity, and diabetes. The 68 CHCs taking part in the clinical trial are OCHIN members.

To activate CV Wizard in the study CHCs, the OCHIN and HPI teams – which included experienced health services researchers, primary care clinicians, and EHR programmers – met every two weeks by phone from March 2017 to April 2018. At these meetings, the teams identified and resolved numerous challenges involved with establishing the CV Wizard CDSS in the EHR shared by the study CHCs. Problems identified and solved while establishing CV Wizard for use in the OCHIN clinics fell into three categories: legal / compliance, technical, and clinical implementation. These categories were anticipated a priori, based on team members’ past experience in implementing EHR-based tools in four large care systems. They were addressed based on the HPI team’s experience to the extent possible, given the differences between OCHIN and other settings where CV Wizard had been implemented previously. No standard ‘playbook’ for implementing CV Wizard existed, but it was established from the outset that working on legal / compliance, technical, and clinical / operational requirements simultaneously would be necessary for implementation within the pre-specified project timeline (about 6 months for full CDSS integration).

Problems and Solutions: Legal / Compliance

Address data security and other legal considerations.

Both parties’ legal counsel conferred to ensure the legality of the data exchange and formalize data governance. Documents created to this end included: a Master Service Agreement to specify the terms of CV Wizard’s delivery (e.g., both parties’ responsibilities, trademark use, monetary exchange, future use); a Business Associate Agreement covering terms for disclosure of personal health information from OCHIN to HPI; and a Secure File Transfer Protocol for transferring EHR data and proprietary programming code. Finalizing these documents took five months.

Ensure that data are sent securely.

We established and executed a plan for securely exchanging data between the CDSS and EHR, including personal health information. In brief, CV Wizard generates and stores a unique patient identifier. When a ‘display call’ occurs, the CDSS matches this identifier against a key to display the CDSS tools. For analytic purposes, CV Wizard also creates a per-patient study ID which is saved in the EHR.
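The identifier-matching step described above can be sketched as follows; the keying scheme and all names are illustrative assumptions, not CV Wizard’s actual implementation:

```python
import hmac
import hashlib
import secrets

# Hypothetical sketch of pseudonymous patient-ID handling. The HMAC scheme,
# names, and storage layout are assumptions for illustration only.
SITE_KEY = secrets.token_bytes(32)  # secret held by the CDS web service

def cdss_patient_id(ehr_patient_id: str) -> str:
    """Derive a stable, non-reversible CDSS identifier from the EHR patient ID."""
    return hmac.new(SITE_KEY, ehr_patient_id.encode(), hashlib.sha256).hexdigest()

# The web service keeps its own lookup ("key") from identifier to stored CDS output.
results_by_id: dict = {}

def handle_display_call(ehr_patient_id: str):
    """On a 'display call', match the derived identifier against stored results."""
    return results_by_id.get(cdss_patient_id(ehr_patient_id))
```

Because the derived identifier is keyed with a secret held only by the web service, EHR data in transit need never carry the patient’s name for the match to succeed.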

Decide who maintains records.

We determined that EHR data sent to the CDSS is owned by OCHIN, CV Wizard and the CDS web service are owned by HPI, and all ‘study materials,’ including CDS results, belong to OCHIN. HPI purges the patient name but saves all other received data and sent responses for two weeks, to enable addressing provider questions about the CDS output. In a non-study situation, these data would then be purged; for study purposes, after the two-week hold, the data is saved in a CV Wizard-generated analysis dataset. These datasets, sent to OCHIN monthly, include a study ID that links via crosswalk to the EHR. Both parties maintain copies of the analytic datasets throughout the study, but HPI cannot use the data without permission, and will destroy it upon request.

Determine which data are sent to the CDSS.

CV Wizard’s algorithms use patient data (medications, laboratory results, allergies, vital signs, demographics, diagnoses) relevant to CVD risk. Because CDS algorithms are updated as new evidence and guidelines emerge, which can happen frequently, we had to choose between two approaches: send only the EHR data the algorithms currently use, risking reprogramming on OCHIN’s end every time updated algorithms require new data points; or send all patient data in a given category, risking slower data exchange and the sharing of data the CDSS might never use. After weighing concerns about data safety against future maintenance needs, informed by both organizations’ compliance teams, we opted to send all data from the relevant categories. The deciding factor was a desire to facilitate ongoing use of CV Wizard.

Problems and Solutions: Technical

Establish a structure for data exchange between the CHCs’ shared EHR and the CV Wizard server.

EHR data had to be sent securely to the CDS web service, and CDS results had to be returned for real-time display in the EHR. Enabling this exchange involved developing data extraction tables within the EHR (e.g., medication, laboratory, diagnostic codes, vitals), and EHR MUMPS (“Massachusetts General Hospital Utility Multi-Programming System”) program routines, to transmit EHR data to the CDS web service, save responses in EHR flowsheets, and display results in the EHR.
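A minimal sketch of the request the EHR-side routines might assemble from the extraction tables for the web service; the field names are assumptions, not the actual interface specification:

```python
import json

def build_cds_request(patient_token: str, extraction_tables: dict) -> str:
    """Assemble the data categories pulled from the EHR extraction tables into
    a single payload for the CDS web service (illustrative field names)."""
    payload = {
        "patient": patient_token,  # unique identifier code, never name or MRN
        "vitals": extraction_tables.get("vitals", []),
        "labs": extraction_tables.get("labs", []),
        "medications": extraction_tables.get("medications", []),
        "diagnoses": extraction_tables.get("diagnoses", []),
        "allergies": extraction_tables.get("allergies", []),
    }
    return json.dumps(payload)
```

The web service’s response would then be written back into EHR flowsheets for display, as described above.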

Manage lab data from multiple sources.

OCHIN’s >600 clinics receive laboratory data from >60 interfaces, which add result components to the system on an ongoing basis. Because prior implementations of CV Wizard relied on static lists of laboratory-specific variables to find laboratory data, a new system was needed to accommodate OCHIN’s dynamic generation of result components. HPI redesigned the algorithm code to let key elements be defined in an interface table (instead of being hard-coded), and to allow the algorithms’ existing lab keys to map to flexible metrics. These metrics look up past lab values using standard application program interfaces (APIs) based on dynamically identified groups of lab components, using any of a number of standard component classification schemes, including ‘base names,’ which OCHIN already uses and sets according to LOINC codes supplied by lab interfaces. HPI also programmed a ‘listener’ on incoming labs to detect components not already in their tables and to flag any lab data that could not be mapped to these components.
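The table-driven mapping and ‘listener’ can be sketched roughly as follows; the table contents and names are invented for illustration:

```python
# Interface table mapping classification codes (e.g., LOINC-derived 'base
# names') to the algorithms' lab keys, instead of hard-coding component IDs.
# Entries here are illustrative, not CV Wizard's actual table.
INTERFACE_TABLE = {
    "HDL": "lab_hdl",
    "LDL": "lab_ldl",
    "HBA1C": "lab_a1c",
}

unmapped_components: set = set()  # 'listener' output: components flagged for review

def map_lab_result(base_name: str, value: float, patient_labs: dict) -> None:
    """Route an incoming lab component to the algorithms' key, or flag it."""
    key = INTERFACE_TABLE.get(base_name.upper())
    if key is None:
        unmapped_components.add(base_name)  # not in the table: flag, don't drop silently
    else:
        patient_labs[key] = value
```

When a new lab interface begins supplying components, only the table needs updating, not the algorithm code.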

Improve the CDSS’ MUMPS program routine’s performance.

In initial tests of CV Wizard in OCHIN’s EHR, the routine through which its server ‘calls’ the data was untenably slow for certain patients. Because the routine independently checked every visit in the past 5 years for diagnoses and laboratory results, we switched to more efficient dual-key indexing and standard APIs to retrieve encounter diagnoses, problem list, allergies, medications, vitals, and other patient data. This reduced the amount of data retrieved from the database to the elements needed for the CDS calculations, increasing speed and avoiding unnecessary lag as the systems exchanged data.
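The performance change amounts to replacing a per-visit scan with an indexed lookup; a toy illustration (data and names are invented for the example):

```python
from collections import defaultdict

# Toy visit records standing in for five years of encounters.
visits = [
    {"patient": "p1", "dx": ["E11.9"]},
    {"patient": "p2", "dx": ["I10"]},
    {"patient": "p1", "dx": ["I10"]},
]

# Original approach: scan every visit on every request -- cost grows with
# the total number of visits, which was untenably slow for complex patients.
def diagnoses_by_scan(patient: str) -> set:
    return {d for v in visits if v["patient"] == patient for d in v["dx"]}

# Indexed approach: build a patient-keyed index once, then retrieve directly,
# analogous to the dual-key indexing / standard APIs adopted here.
dx_index = defaultdict(set)
for v in visits:
    dx_index[v["patient"]].update(v["dx"])

def diagnoses_by_index(patient: str) -> set:
    return dx_index[patient]
```

Both functions return the same answer; only the cost per call differs.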

Manage other data exchange components.

EHR data on medications and allergies were mapped by HPI to fit the algorithms’ code. Because OCHIN’s EHR uses a different medication database vendor than HPI’s prior CV Wizard implementations, this required substantial effort over six months, including 25% full-time equivalent of an experienced Epic programmer to conduct initial identification, medication matching, coding of pharmacologic class information, loading data, and iterative testing, with ongoing input and reviews by the study leads (physicians).

Optimize the CDS in the shared EHR.

To ensure that CV Wizard worked accurately in the CHCs’ shared EHR, the CDS results and displays were quality-tested through OCHIN’s standard processes for testing all changes made to the EHR. This involved working in a testing environment, using fictional patient records designed to be challenging for the CDSS to process, e.g., by having highly complex clinical characteristics. Additionally, CV Wizard is designed to target decision support suggestions to high-risk patients. To ensure that the CHCs’ staff were alerted when a high-risk patient was identified, an EHR alert was built to tell clinic rooming staff when CV Wizard had identified one or more care gaps and had actionable suggestions for a given patient. A link was embedded in the alert so that the user could go to the CV Wizard interface with one click.

Problems and Solutions: Clinical Implementation

Establish how to generate data on CDS use rates.

The CV Wizard team observed in prior applications that providing clinicians and clinic leaders with regular feedback on CDS print rates is key to achieving and maintaining high CDS use rates. We sought to provide similar data to OCHIN clinics, but because OCHIN is a collaborative of hundreds of clinics, the reports needed to be self-service and clinic-specific, without requiring representatives from all those clinics to log into HPI’s website for their reports. Instead, OCHIN created reports using Epic’s panel management tools to show how often CV Wizard was opened for targeted high-risk patients when it was recommended for them, at the clinic and provider levels. HPI also added an option for OCHIN to “call back” into CV Wizard after each visit to find out if the CDS was printed. Once this was implemented at OCHIN, print rates were added to the use rate reports.

Modify the ‘feedback’ function.

At HPI, clinician feedback is sent via email to a single CV Wizard point of contact for the whole EHR. OCHIN sought to enable clinicians and staff to send feedback to someone at their clinic as a first step, similar to other existing systems for user feedback on the EHR. Thus, CV Wizard’s feedback was changed from a centralized to a distributed system. We routed feedback data to each clinic’s EHR specialist via the EHR’s InBasket ‘Help Desk’ function. These specialists were trained to receive and triage Wizard feedback (answering the question directly or sending it to their clinic’s CV Wizard ‘Clinician Champion’). Questions remaining unresolved after these steps would be sent to the OCHIN study team and discussed with HPI as needed.

Customize the risk threshold for triggering the CDSS.

If a CDSS alerts too rarely (e.g., only for narrowly defined groups of patients), the likelihood of consistent use may decline; if it alerts too frequently, users may grow frustrated.[31] HPI achieved CDS use rates of 70-80% at alerted encounters with a risk threshold set such that CDS use was recommended at about 20% of all adult patient visits. We were concerned that differences between HPI and OCHIN CHC patient populations would mean that using the same risk threshold at OCHIN could generate alerts at unacceptably high or low proportions of visits: CHCs serve a high percentage of multimorbid patients, which might increase the alert firing rate, but also a high percentage of younger patients, which might decrease the rate. We ultimately decided to begin by using the same risk threshold criteria as at HPI, targeting adults with: 10-year modifiable CVD risk ≥10%; or diabetes plus ≥1 uncontrolled CVD risk factor or high A1c; or existing ASCVD plus ≥1 uncontrolled major CVD risk factor. We agreed to revisit this choice if we observed that CDS was targeting >30% of adult care visits in a given month for any of the study clinics; as this did not occur, the original criteria were retained.
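The triggering criteria above translate directly into a predicate; this sketch assumes the risk score and the ‘uncontrolled’ and ‘high A1c’ determinations are computed upstream by the algorithms:

```python
def should_alert(ten_year_modifiable_risk: float,
                 has_diabetes: bool,
                 has_ascvd: bool,
                 uncontrolled_risk_factors: int,
                 high_a1c: bool) -> bool:
    """Return True when a visit meets any of the CDS triggering criteria.
    (Illustrative translation of the criteria listed above; how each input
    is derived from EHR data is not shown.)"""
    if ten_year_modifiable_risk >= 0.10:  # modifiable 10-year CVD risk >= 10%
        return True
    if has_diabetes and (uncontrolled_risk_factors >= 1 or high_a1c):
        return True
    if has_ascvd and uncontrolled_risk_factors >= 1:
        return True
    return False
```

Tuning the threshold (here 0.10) is what shifts the proportion of visits at which the alert fires.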

Unresolved Questions / Lessons for the Field

It is inefficient (and could introduce unwanted variation) to reprogram multiple CDS systems whenever the underlying evidence base changes, which can occur frequently. Remotely operated hub-and-spoke CDSS that accept and process data from many care delivery systems could address this inefficiency and expedite the adoption and dissemination of current care guidelines. Establishing such hub-and-spoke CDSS in new settings presents numerous challenges: ensuring that data exchange meets legal standards; resolving complex technical issues inherent to exchanging the data needed for CDSS to operate; and modifying CDSS components to fit the new setting while minimizing customization. Such challenges can be resolved, as demonstrated here, but future efforts to use hub-and-spoke CDSS should be aware of and prepare for them. (As noted above, national data standardization efforts will remove some of these challenges and may further enhance health care systems’ ability to use evidence-based, hub-and-spoke CDSS like CV Wizard©.) As these challenges are addressed, hub-and-spoke CDSS could be used both in CVD care, as discussed here, and for myriad other health conditions; given the known potential for CDSS to improve health outcomes,[21] diverse clinical content areas might be well-served by such tools.

As little has been published regarding the challenges involved in establishing hub-and-spoke CDSS in new settings, the lessons presented here may be useful for others seeking to harness the benefits of such CDSS. While aspects of the problems and solutions described here are likely to vary between settings, several overarching themes are likely to be broadly applicable. Notably: time and resources will be needed to manage legal considerations when connecting a new setting to a hub-and-spoke CDSS, to ensure secure exchange of data, and to determine ownership of records generated by the CDSS. A balance will need to be struck between sending data currently needed for the CDSS and data that might be needed in the future, while maintaining efficiency in the data exchange process. Linking data from a new source to an existing CDSS is likely to require time and staff with appropriate expertise, as will standardizing the data. If the new setting differs substantially from that where the CDSS originated, assessing how to optimize the CDS output for users at the new setting may be critical.

Getting such CDSS to work accurately in distal EHR systems is only the first step in such tools’ adoption by clinical teams; the uptake of advanced EHR tools like CDS varies widely between clinical sites, teams, and individual providers.[18, 19, 32] Far more research is needed on best practices for helping care teams adopt such tools into standard workflows, and sustain their use in practice.

IMPLEMENTATION LESSONS.

  • Establishing a shared ‘hub-and-spoke,’ web-based clinical decision support system (CDSS) in an EHR shared by >600 community health centers presented myriad challenges, which are summarized here to guide others seeking to use similar CDSS.

  • Legal and compliance challenges involved ensuring secure data exchanges, determining which entity maintains data records, and deciding which data are sent to the CDSS.

  • Technical challenges involved using lab data from multiple sources and improving the CDSS’ MUMPS routine performance in its new setting.

  • Clinical implementation challenges involved identifying optimal strategies for generating data on CDSS use rates, modifying the CDSS functionality for obtaining clinician / staff feedback, and customizing the risk thresholds that trigger the CDSS for the new setting.

ACKNOWLEDGEMENTS

We would like to thank Arwen Bunce, Carmit McMullen, James Davis, Laurel Nightingale, Nadia Yosuf, Stuart Cowburn, Dagan Wright, Julianne Bava, David Killaby, Lauren Crain, Christina Wood, and Debra McCauley for their contributions to this study.

FUNDING

Research reported in this publication was supported by the National Heart, Lung, And Blood Institute of the National Institutes of Health under Award Number R01HL133793. During the study period, Drs. O’Connor and Sperl-Hillen were also partially supported by National Institute of Health Award Number P30DK092924. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Footnotes

Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

COMPETING INTERESTS

Aside from the grants reported in the Funding section, the authors of this manuscript have no competing interests to report.

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

REFERENCES

1. Gilmer TP, O’Connor PJ, Sperl-Hillen JM, Rush WA, Johnson PE, Amundson GH, Asche SE, Ekstrom HL. Cost-effectiveness of an electronic medical record based clinical decision support system. Health Serv Res. 2012;47(6):2137–58.
2. Dudl RJ, Wang MC, Wong M, Bellows J. Preventing myocardial infarction and stroke with a simplified bundle of cardioprotective medications. Am J Manag Care. 2009;15(10):e88–e94.
3. Feldstein AC, Perrin NA, Unitan R, Rosales AG, Nichols GA, Smith DH, Schneider J, Davino CM, Zhou YY, Lee NL. Effect of a patient panel-support tool on care delivery. Am J Manag Care. 2010;16(10):e256–e66.
4. O’Connor PJ. Adding value to evidence-based clinical guidelines. JAMA. 2005;294(6):741–3.
5. Ash JS, Sittig DF, Guappone KP, Dykstra RH, Richardson J, Wright A, Carpenter J, McMullen C, Shapiro M, Bunce A, Middleton B. Recommended practices for computerized clinical decision support and knowledge management in community settings: a qualitative study. BMC Med Inform Decis Mak. 2012;12:6.
6. Ash JS, Sittig DF, Dykstra R, Wright A, McMullen C, Richardson J, Middleton B. Identifying best practices for clinical decision support and knowledge management in the field. Stud Health Technol Inform. 2010;160(Pt 2):806–10.
7. Bright TJ, Wong A, Dhurjati R, Bristow E, Bastian L, Coeytaux RR, Samsa G, Hasselblad V, Williams JW, Musty MD, Wing L, Kendrick AS, Sanders GD, Lobach D. Effect of clinical decision-support systems: a systematic review. Ann Intern Med. 2012;157(1):29–43.
8. Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330(7494):765.
9. Lobach D, Sanders GD, Bright TJ, Wong A, Dhurjati R, Bristow E, Bastian L, Coeytaux R, Samsa G, Hasselblad V, Williams JW, Wing L, Musty M, Kendrick AS. Enabling health care decisionmaking through clinical decision support and knowledge management. Evid Rep Technol Assess (Full Rep). 2012;(203):1–784.
10. Souza NM, Sebaldt RJ, Mackay JA, Prorok JC, Weise-Kelly L, Navarro T, Wilczynski NL, Haynes RB. Computerized clinical decision support systems for primary preventive care: a decision-maker-researcher partnership systematic review of effects on process of care and patient outcomes. Implement Sci. 2011;6:87.
11. Roshanov PS, You JJ, Dhaliwal J, Koff D, Mackay JA, Weise-Kelly L, Navarro T, Wilczynski NL, Haynes RB. Can computerized clinical decision support systems improve practitioners’ diagnostic test ordering behavior? A decision-maker-researcher partnership systematic review. Implement Sci. 2011;6:88.
12. Jaspers MW, Smeulers M, Vermeulen H, Peute LW. Effects of clinical decision-support systems on practitioner performance and patient outcomes: a synthesis of high-quality systematic review findings. J Am Med Inform Assoc. 2011;18(3):327–34.
13. Moja L, Kwag KH, Lytras T, Bertizzolo L, Brandt L, Pecoraro V, Rigon G, Vaona A, Ruggiero F, Mangia M, Iorio A, Kunnamo I, Bonovas S. Effectiveness of computerized decision support systems linked to electronic health records: a systematic review and meta-analysis. Am J Public Health. 2014;104(12):e12–e22.
14. Murphy EV. Clinical decision support: effectiveness in improving quality processes and clinical outcomes and factors that may influence success. Yale J Biol Med. 2014;87(2):187–97.
15. Sperl-Hillen JM, Rossom RC, Kharbanda EO, Gold R, Geissal ED, Elliott TE, Desai JR, Rindal DB, Saman DM, Waring SC, Margolis KL, O’Connor PJ. Priorities Wizard: multisite web-based primary care clinical decision support improved chronic care outcomes with high use rates and high clinician satisfaction rates. EGEMS (Wash DC). 2019;7(1):9. doi: 10.5334/egems.284.
16. O’Connor PJ, Sperl-Hillen JM. Current status and future directions for electronic point-of-care clinical decision support to improve diabetes management in primary care. Diabetes Technol Ther. 2019;21(S2):S226–S34. doi: 10.1089/dia.2019.0070.
17. PSNet (Patient Safety Network). Clinical decision support systems. Agency for Healthcare Research and Quality; 2019 [accessed October 6, 2020]. Available from: https://psnet.ahrq.gov/primer/clinical-decision-support-systems.
18. Ancker JS, Kern LM, Edwards A, Nosal S, Stein DM, Hauser D, Kaushal R. How is the electronic health record being used? Use of EHR data to assess physician-level variability in technology use. J Am Med Inform Assoc. 2014;21(6):1001–8. doi: 10.1136/amiajnl-2013-002627.
19. Rumball-Smith J, Shekelle P, Damberg CL. Electronic health record “super-users” and “under-users” in ambulatory care practices. Am J Manag Care. 2018;24(1):26–31.
20. Desai JR, Wu P, Nichols GA, Lieu TA, O’Connor PJ. Diabetes and asthma case identification, validation, and representativeness when using electronic health data to construct registries for comparative effectiveness and epidemiologic research. Med Care. 2012;50 Suppl:S30–S5.
21. Tcheng JE, Bakken S, Bates DW, Bonner H III, Gandhi TK, Josephs M, Kawamoto K, Lomotan EA, Mackay E, Middleton B, Teich JM, Weingarten S, Hamilton Lopez M, editors. Optimizing Strategies for Clinical Decision Support: Summary of a Meeting Series. Washington, DC: National Academy of Medicine; 2017.
22. Health Level Seven International. HL7 International; 2020 [accessed October 8, 2020]. Available from: http://www.hl7.org/.
23. HL7 & Boston Children’s Hospital. CDS Hooks [accessed October 6, 2020]. Available from: https://cds-hooks.org/.
24. Logical Observation Identifiers Names and Codes (LOINC); 2020 [accessed October 6, 2020]. Available from: https://en.wikipedia.org/wiki/LOINC.
25. Stone NJ, Robinson JG, Lichtenstein AH, Bairey Merz CN, Blum CB, Eckel RH, Goldberg AC, Gordon D, Levy D, Lloyd-Jones DM, McBride P, Schwartz JS, Shero ST, Smith SC Jr., Watson K, Wilson PW, Eddleman KM, Jarrett NM, LaBresh K, Nevo L, Wnek J, Anderson JL, Halperin JL, Albert NM, Bozkurt B, Brindis RG, Curtis LH, DeMets D, Hochman JS, Kovacs RJ, Ohman EM, Pressler SJ, Sellke FW, Shen WK, Smith SC Jr., Tomaselli GF. 2013 ACC/AHA guideline on the treatment of blood cholesterol to reduce atherosclerotic cardiovascular risk in adults: a report of the American College of Cardiology/American Heart Association Task Force on Practice Guidelines. Circulation. 2014;129(25 Suppl 2):S1–45.
26. Eckel RH, Jakicic JM, Ard JD, de Jesus JM, Houston MN, Hubbard VS, Lee IM, Lichtenstein AH, Loria CM, Millen BE, Nonas CA, Sacks FM, Smith SC Jr., Svetkey LP, Wadden TA, Yanovski SZ, Kendall KA, Morgan LC, Trisolini MG, Velasco G, Wnek J, Anderson JL, Halperin JL, Albert NM, Bozkurt B, Brindis RG, Curtis LH, DeMets D, Hochman JS, Kovacs RJ, Ohman EM, Pressler SJ, Sellke FW, Shen WK, Smith SC Jr., Tomaselli GF. 2013 AHA/ACC guideline on lifestyle management to reduce cardiovascular risk: a report of the American College of Cardiology/American Heart Association Task Force on Practice Guidelines. Circulation. 2014;129(25 Suppl 2):S76–S99.
27. Bibbins-Domingo K. Aspirin use for the primary prevention of cardiovascular disease and colorectal cancer: U.S. Preventive Services Task Force recommendation statement. Ann Intern Med. 2016;164(12):836–45. doi: 10.7326/m16-0577.
28. Standards of medical care in diabetes-2016: summary of revisions. Diabetes Care. 2016;39 Suppl 1:S4–5. doi: 10.2337/dc16-S003.
29. Sperl-Hillen JM, Crain AL, Margolis KL, Ekstrom HL, Appana D, Amundson G, Sharma R, Desai JR, O’Connor PJ. Clinical decision support directed to primary care patients and providers reduces cardiovascular risk: a randomized trial. J Am Med Inform Assoc. 2018;25(9):1137–46. doi: 10.1093/jamia/ocy085.
30. O’Connor PJ, Sperl-Hillen JM, Rush WA, Johnson PE, Amundson GH, Asche SE, Ekstrom HL, Gilmer TP. Impact of electronic health record clinical decision support on diabetes care: a randomized trial. Ann Fam Med. 2011;9(1):12–21. doi: 10.1370/afm.1196.
31. Ancker JS, Edwards A, Nosal S, Hauser D, Mauer E, Kaushal R. Effects of workload, work complexity, and repeated alerts on alert fatigue in a clinical decision support system. BMC Med Inform Decis Mak. 2017;17(1):36. doi: 10.1186/s12911-017-0430-8.
32. Poon EG, Wright A, Simon SR, Jenter CA, Kaushal R, Volk LA, Cleary PD, Singer JA, Tumolo AZ, Bates DW. Relationship between use of electronic health record features and health care quality: results of a statewide survey. Med Care. 2010;48(3):203–9. doi: 10.1097/MLR.0b013e3181c16203.
