Missouri Medicine. 2017 Jul–Aug;114(4):316–320.

Embedding a Medical Search Engine Within an Electronic Health Record

Patricia Alafaireet, Jeff Belden, Matt Botkin, Karl Kochendorfer, Robin Kruse, Dylan Strecker, Jayne Williams
PMCID: PMC6140091  PMID: 30228619

Abstract

This study investigates an information retrieval tool embedded in an electronic health record (EHR). 1-Search provides a single search for retrieving information from a variety of content sources. 1-Search’s usefulness and impact were determined by measuring the extent of physicians’ information needs, pre- and post-implementation user satisfaction, and the impact of 1-Search on clinical decision-making. Results support incorporation of 1-Search into the EHR, the continued use of 1-Search, and further development.

Introduction to the Problem

Studies show that 64–70% of clinical questions go unanswered, despite the existence of an answer, because physicians did not expect to find an answer or lacked time to pursue one.1–4 On average, physicians generate about two questions for every three patient encounters, and often need answers to medical questions while they are seeing patients.1,4–11 In 2009, patients made 1.3 billion visits to physicians’ offices, hospital outpatient departments, and hospital emergency departments.12 These visits generated an estimated 867 million clinical questions, of which, by basic calculation, about 607 million (70%) went unanswered, potentially affecting the quality of care for 405 million patient visits.12
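As a check on the arithmetic, the headline estimates above follow directly from the cited figures. This minimal sketch assumes 1.3 billion visits, roughly two questions per three encounters, and a 70% unanswered rate; the 405 million figure for affected visits depends on additional assumptions not spelled out in the source and is not reproduced here.

```python
# Back-of-envelope reproduction of the clinical-question estimates above.
# Assumptions (from the cited figures): 1.3 billion visits, about two
# questions per three patient encounters, and a 70% unanswered rate.
visits = 1.3e9
questions = visits * 2 / 3      # clinical questions generated
unanswered = questions * 0.70   # questions that go unanswered

print(f"questions generated: {questions / 1e6:.0f} million")   # ~867 million
print(f"unanswered: {unanswered / 1e6:.0f} million")           # ~607 million
```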

Physicians’ information needs are often complex and require access to high-quality, specialized information from multiple, dispersed sources, including electronic resources produced by publishers and content aggregators, each of which varies in breadth of content, depth of information, and quality of evidence.2,13 These sources may be difficult to navigate and time-consuming to access at the point of care.3,13,14 Searching multiple information sources during the typical 15-minute patient consultation may be too time-consuming and detract from patient-physician communication. The time-consuming nature of searching for information results from a number of barriers, including the inability to link directly from the EHR to algorithms and patient education materials; the burden of accessing multiple websites, each with its own internal search engine; and the inherent difficulties of navigating websites and other sources across levels of information ranging from guidelines to primary research.15 Additionally, most search results are not easily exported to a personal digital library of preferred content.15

Insufficient time to search is one of the two major reasons physicians do not pursue answers to clinical questions.13 Another common reason that physicians’ questions go unanswered is their uncertainty that an answer exists, in spite of an enormous output of published clinical evidence.2,16 In the last ten years, more than 9 million citations have been added to the MEDLINE database alone.17 Muir Gray, the former director of Britain’s National Library for Health, referred to this as an information paradox: a situation in which physicians are overwhelmed by new information yet have many unanswered questions.16 Recent studies comparing several evidence-based electronic resources for their ability to answer clinical questions demonstrate that no single resource answers all questions.1,13,14 One study found that the success rate for finding an answer improves as the number of databases searched increases.13 In short, no single source can answer all questions, and a variety must be available to comprehensively meet physicians’ information needs.18

Un-embedded information retrieval systems can increase the evidence that physicians use in decision-making, but the obstacles to their use in the clinical environment are substantial.19–23 These barriers include unintuitive and time-consuming navigation as well as the scope of retrieval and the quality of the retrieved information. PubMed, perhaps the best-recognized biomedical search engine, requires its users to navigate to external websites to access information. Other general web search engines often fail by retrieving information that, for physicians, lacks depth and relevance. Many commonly accessible web search engines are horizontal, meaning they provide a wide scope of search results from a variety of different sources. These search engines frequently locate web pages only by keywords, titles, and other easily identified data; their ability to search proprietary database content is superficial at best.24 Horizontal search engines (e.g., Google) are more likely to guide users to popular sites, and may offer little information about the quality of those resources. Although “popularity ranking” may serve non-clinicians well, it may not work at all for a biomedical textbook or a healthcare organization’s intranet resources.18

Another less researched barrier is the lack of web and enterprise architecture that leads to the storage of valuable information in silos.15 This separateness, along with the varied construction of databases, websites, and intranets, adds another level to the already time-consuming and difficult process physicians must carry out to access or share information. This increases the probability that a clinical question will go unanswered.

Solution Development

MedSocket of Missouri Inc., formerly MedSocket LLC, is a health IT company dedicated to delivering needed information to healthcare providers at the point of care. MedSocket’s patented medical search engine, 1-Search, was invented by Dr. Karl Kochendorfer. 1-Search was designed from the healthcare provider’s perspective, with specific attention to clinical information needs at the point of care. Through a single interface, 1-Search aggregates customized sets of pre-appraised content sources (e.g., UpToDate, OVID MEDLINE, DynaMed), organizational data (e.g., a secure intranet), and personal notes (e.g., uploaded and shared documents). Search filters allow users to narrow their search by information type (e.g., guidelines, patient handouts, drug information). It also allows individual institutions to customize the aggregation of content sources according to their electronic resource licenses and subscriptions. Additional features offer customized (user-specific) profiles that automatically retrieve information from the clinician’s area of expertise when performing a search. These profiles enable information gathering from the most highly utilized textbooks and resources covering the breadth of a given subject. The ability of profiles to act as custom filters has enormous potential for increasing the relevance of search results while maintaining user autonomy.
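The article does not describe 1-Search’s internal architecture, so the federated-search idea it describes can only be sketched under assumptions: one query fanned out to several source adapters, merged, then narrowed by an information-type filter. All names here (`Result`, `one_search`, the sample sources) are illustrative inventions, not MedSocket’s API.

```python
from dataclasses import dataclass

@dataclass
class Result:
    source: str
    title: str
    info_type: str  # e.g., "guideline", "patient_handout", "drug_info"

# Hypothetical source adapters: each returns matching documents from one
# content source (in reality these would call the source's own search API).
SOURCES = {
    "textbook": [{"title": "Hypertension management", "type": "guideline"}],
    "intranet": [{"title": "Hypertension clinic protocol", "type": "guideline"},
                 {"title": "Hypertension patient handout", "type": "patient_handout"}],
}

def search_source(name, docs, query):
    q = query.lower()
    return [Result(name, d["title"], d["type"])
            for d in docs if q in d["title"].lower()]

def one_search(query, info_type=None):
    # Fan the query out to every source, merge, then apply the type filter.
    results = [r for name, docs in SOURCES.items()
               for r in search_source(name, docs, query)]
    if info_type:
        results = [r for r in results if r.info_type == info_type]
    return results

hits = one_search("hypertension", info_type="guideline")
print([(r.source, r.title) for r in hits])
```

A per-user profile, as described above, could be modeled as a default `info_type` filter or a preferred subset of `SOURCES` applied automatically to each query.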

A 1-Search family medicine profile was established and first deployed at the University of Missouri’s Department of Family and Community Medicine in August 2011. The Department supported this technology financially with an annual license and commissioned an evaluation of the technology. Preliminary data showed that more than 80 users performed over 4,400 searches with the un-embedded version, and demonstrated extensive heterogeneity of information needs. A key finding of this evaluation was that users frequently said they would prefer to access 1-Search from the EHR. A second key finding was that although the Department mandates the use of the EHR, significant variation exists in how clinicians use it. This variability appears to provide sufficient sample stratification to allow identification of potential differences in the frequency of information access during patient consultations. These characteristics of the test site and population supported the subsequent evaluation with grant funding from the National Institutes of Health. The team sought and received Small Business Innovation Research grant funding to research the feasibility of integrating its medical search engine within an EHR and to measure its effect on information seeking by clinicians.

In 2013, 1-Search was embedded in Cerner’s EHR system as part of the funded study (R43LM011590). Clinicians in the Department of Family and Community Medicine at the University of Missouri were able to access patient records through the EHR and use the embedded search box to seamlessly search across multiple clinical evidence resources and the healthcare organization’s intranet without switching or logging into another system.

Evaluation Strategy

This project focused on enhancing physicians’ and other clinical providers’ access to information from high-quality resources at the point of care by integrating 1-Search with Chart Search in Cerner’s EHR system. To measure the impact of this integration, the researchers gathered and evaluated qualitative and quantitative data by measuring time spent searching, level of information quality, and physicians’ ability to make decisions, as well as other factors. This approach allowed for a thorough evaluation in a limited timeframe and laid the groundwork for longitudinal patient-outcome studies assessing the value of information retrieved using 1-Search. We used two strategies to measure the impact of the use of 1-Search on physicians: surveys and analysis of computer log files. This combination provided maximum information with minimum demands upon physician time. The physician experience surveys were based on the DeLone and McLean model for information system success.25

Evaluation began with an invitation to all family physicians at the University to participate in the research with the option to opt out at any time without penalty. Participants were informed that survey completion constituted informed consent. The Health Sciences Institutional Review Board at the University of Missouri approved the study.

Recruitment of physicians, including residents, into the study began with presentations during the Department’s weekly faculty and resident meetings one month prior to 1-Search’s launch in the EHR. Training was accomplished with two 30-minute sessions (one for faculty physicians and another for residents) as well as material incorporated into weekly faculty and resident meetings. An e-mail invitation was sent to all faculty and resident physicians. The invitation included general study information, the pre-implementation survey, and notification of the introductory training sessions. A basic user manual for 1-Search was accessible to participants during the study period.

To protect confidentiality, each participant was assigned a unique, arbitrary identifier. The identifier was also embedded in survey links so that an individual’s pre- and post-surveys could be linked. Because both attending physicians and resident physicians participated, and because physicians rotated within our academic training programs and sites, not all participants completed both pre- and post-surveys.

Results from Pre- and Post-Surveys

The pre-survey was sent to 98 physicians and the post-survey to 99. The pre-survey was administered before training in how to use 1-Search; the post-survey was administered after training and a minimum trial period of 90 days of 1-Search use. Both surveys were conducted using SurveyMonkey. Participation was voluntary and no incentives were provided. Sixty-nine of 98 responded to the pre-survey, 58 of 99 responded to the post-survey, and 41 people answered both. In the pre-survey, 29 respondents (42%) reported prior 1-Search use. In the post-survey, 30 respondents (52%) reported using 1-Search. The proportion of 1-Search users did not differ between surveys (p=0.23). Of the 41 who answered both, 17 reported on both surveys that they had used 1-Search, 13 reported on both that they had not, and the remaining 11 gave different answers on the two surveys.
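A paired pre/post comparison like the one above is typically evaluated with McNemar’s test, which considers only the discordant pairs. The article does not report how the 11 discordant pairs split; the 8-versus-3 split below is a hypothetical assumption that happens to reproduce the reported p=0.23, used purely for illustration of the method.

```python
from math import comb

def mcnemar_exact(b, c):
    """Two-sided exact McNemar test: a binomial test on the b + c
    discordant pairs under the null hypothesis p = 0.5."""
    n = b + c
    k = max(b, c)
    # Double the one-sided tail probability, capped at 1.
    tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# 41 paired respondents: 17 concordant-yes, 13 concordant-no, 11 discordant.
# An assumed 8-vs-3 discordant split yields p ≈ 0.23, matching the report.
print(round(mcnemar_exact(8, 3), 2))  # 0.23
```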

Of those who responded to both surveys, 36 reported how upset they would be if 1-Search were no longer available. On the pre-survey, only 6% indicated they would be moderately to very upset, compared with 22% on the post-survey (p=0.02). Among those who reported 1-Search use, 52% reported being satisfied or extremely satisfied with 1-Search on the pre-survey compared with 66% on the post-survey (p=0.31). In the pre-survey, respondents reported using multiple search engines (Google – 85%, PubMed/MEDLINE – 49.2%, Google Scholar – 31%, Wikipedia – 31%, an internal SharePoint site – 28%, WebMD – 15%, 1-Search – 12%, and Bing – 1.5%). The use of 1-Search did not appear to increase the likelihood that clinicians had their clinical questions answered using electronic resources, but 87% felt they could find clinical answers using electronic sources. In fact, 86% were confident that they could find clinical information online, and 66.67% thought electronic resources were quick and easy to use, yet 49% frequently failed to find the needed information when searching. Multiple barriers to the use of electronic resources for answering clinical questions exist (See Table 1).

Table 1.

Barriers to Answering Clinical Questions Reported by Respondents to the Pre-Survey (N=61).*

Issue or Barrier                Number (Percent) responding “Yes”
Finding time                    43 (70.5)
Must check multiple places      29 (47.5)
Knowing where to look           24 (39.3)
No answer seems to exist        19 (31.1)
Lack of searching skills        17 (27.9)
Lack of remote access           11 (18.0)
Searching too slow               9 (14.8)
Material in difficult format     5 (8.2)
Lack of computer access          3 (4.9)
Lack of computer skills          2 (3.3)

* Respondents who chose at least one barrier.

In the pre-survey, respondents identified multiple pre-deployment barriers to the use of 1-Search (See Figure 1). Barriers identified are not mutually exclusive. The most common barriers include lack of time (70.5%), the need to check multiple sources (47.5%), and knowing where to look (39.3%).

Figure 1. Potential barriers identified by respondents regarding 1-Search use (pre-deployment).

Clinicians expected to use 1-Search in a number of ways. Figure 2 illustrates expected uses reported on the pre-survey and the ways 1-Search was actually used as reported in the post-survey. Respondents could endorse more than one response. The results suggest that the use of 1-Search helped clinicians gain access to treatment information and to locate guidelines and other information to support improved interaction with patients.

Figure 2. Expected and actual use of 1-Search.

In the post-survey, more than 65% of physicians reported being satisfied with 1-Search and almost 38% felt that using 1-Search saved them time. However, more than 13% of users felt 1-Search was too slow. Nearly 80% of users agreed that 1-Search results were useful in their clinical practice, while only 28% felt that 1-Search often failed to retrieve useful information. Results were mixed regarding the reliability of the evidence retrieved by 1-Search and its ease of use. However, 58.6% of users believed that 1-Search helped them diagnose and treat patients, and 31% believed they had learned something new from the use of 1-Search. No users felt that using 1-Search was potentially harmful to a patient.

Results from log file analysis

We analyzed computer log files to investigate the frequency of 1-Search use as well as other search characteristics. The data were collected over a period of 62 days. 1-Search use was highest on Wednesdays (35%), likely reflecting the higher volume of clinic visits on that day. The type of source for accessed material varied widely. Seeking clinic information was the most frequent activity (33.5%); accessing patient handouts accounted for about 14% of uses. The actual duration of a search (not including reading or printing time) ranged from less than one second to more than five seconds, with the largest percentage of searches taking 3 to 4.9 seconds to complete. Generally, only one click was needed for physicians to access the needed information. Attending physicians were by far the most common users, and use of 1-Search was relatively evenly distributed across the different clinic sites.
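The article does not publish the log schema, so the aggregation reported above can only be sketched under assumptions. The example below assumes a hypothetical export with `timestamp` and `duration_ms` fields and tallies searches by weekday and by the duration buckets mentioned in the text.

```python
from collections import Counter
from datetime import datetime

def summarize(rows):
    """Tally searches by weekday and by duration bucket.
    Assumes each row has ISO-8601 'timestamp' and 'duration_ms' fields."""
    by_day, by_bucket = Counter(), Counter()
    for row in rows:
        ts = datetime.fromisoformat(row["timestamp"])
        by_day[ts.strftime("%A")] += 1
        secs = float(row["duration_ms"]) / 1000
        if secs < 1:
            bucket = "<1s"
        elif secs < 3:
            bucket = "1-2.9s"
        elif secs < 5:
            bucket = "3-4.9s"
        else:
            bucket = "5s+"
        by_bucket[bucket] += 1
    return by_day, by_bucket

# Illustrative sample rows, not actual study data.
rows = [
    {"timestamp": "2013-10-02T09:15:00", "duration_ms": "3400"},
    {"timestamp": "2013-10-02T11:05:00", "duration_ms": "800"},
    {"timestamp": "2013-10-04T14:30:00", "duration_ms": "5200"},
]
days, buckets = summarize(rows)
print(days)     # search counts per weekday
print(buckets)  # search counts per duration bucket
```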

Conclusion

The results of this research suggest that 1-Search, as an embedded information retrieval tool, holds promise to improve care through reduced cognitive workload, more concise delivery of reference information, and timely access to information within a typical clinical workflow. Physicians supported the incorporation of 1-Search into the EHR and provided evidence to support its continued use as well as its further development.

Biography

Patricia Alafaireet, PhD, (above) and Dylan Strecker, BS, are in the Department of Health Management and Informatics, University of Missouri. Jeff Belden, MD, and Robin Kruse, PhD, are in the Department of Family and Community Medicine, University of Missouri. Karl Kochendorfer, MD, is with the Department of Family and Community Medicine, University of Illinois Hospital and Health System. Matt Botkin and Jayne Williams, MA, are with MedSocket, Columbia.

Contact: AlafaireetP@health.missouri.edu


Footnotes

Disclosure

KK, JW, and MB are employed by MedSocket of Missouri, Inc. They did not take part in any data analysis.

References

1. Covell D, Uman G, Manning P. Information needs in office practice: are they being met? Annals of Internal Medicine. 1985;103(4):596–599. doi:10.7326/0003-4819-103-4-596.
2. Ely JW, Osheroff JA, Chambliss ML, Ebell MH, Rosenbaum ME. Answering physicians’ clinical questions: obstacles and potential solutions. Journal of the American Medical Informatics Association. 2005;12(2):217–224. doi:10.1197/jamia.M1608.
3. Ely JW, Osheroff JA, Ebell MH, et al. Analysis of questions asked by family doctors regarding patient care. BMJ. 1999;319(7206):358–361. doi:10.1136/bmj.319.7206.358.
4. Gorman PN, Helfand M. Information seeking in primary care: how physicians choose which clinical questions to pursue and which to leave unanswered. Medical Decision Making. 1995;15(2):113–119. doi:10.1177/0272989X9501500203.
5. Ely J, Burch R, Vinson D. The information needs of family physicians: case-specific clinical questions. Journal of Family Practice. 1992;35(3):265–269.
6. Osheroff JA, Forsythe DE, Buchanan BG, Bankowitz RA, Blumenfeld BH, Miller RA. Physicians’ information needs: analysis of questions posed during clinical teaching. Annals of Internal Medicine. 1991;114(7):576–581. doi:10.7326/0003-4819-114-7-576.
7. Timpka T, Arborelius E. The GP’s dilemmas: a study of knowledge need and use during health care consultations. Methods of Information in Medicine. 1990;29(1):23–29.
8. Dee C, Blazek R. Information needs of the rural physician: a descriptive study. Bulletin of the Medical Library Association. 1993;81(3):259–264.
9. Smith R. What clinical information do doctors need? BMJ. 1996;313(7064):1062–1068. doi:10.1136/bmj.313.7064.1062.
10. Coumou H, Meijman F. How do primary care physicians seek answers to clinical questions? A literature review. Journal of the Medical Library Association. 2006;94(1):55–60.
11. Gonzalez-Gonzalez AI, Dawes M, Sanchez-Mateos J, et al. Information needs and information-seeking behavior of primary care physicians. Annals of Family Medicine. 2007;5(4):345–352. doi:10.1370/afm.681.
12. National Center for Health Statistics. Health, United States, 2011: With Special Feature on Socioeconomic Status and Health. Hyattsville, MD; 2012. DHHS Publication No. 2012-1232.
13. Alper B, Stevermer J, White D, Ewigman B. Answering family physicians’ clinical questions using electronic medical databases. Journal of Family Practice. 2001;50(11):960–965.
14. Gehanno JF, Paris C, Thirion B, Caillard JF. Assessment of bibliographic databases performance in information retrieval for occupational and environmental toxicology. Occupational & Environmental Medicine. 1998;55(8):562–566. doi:10.1136/oem.55.8.562.
15. Hersh W. Ubiquitous but unfinished: grand challenges for information retrieval. Health Information & Libraries Journal. 2008;25(Suppl 1):90–93. doi:10.1111/j.1471-1842.2008.00815.x.
16. Smith R. Strategies for coping with information overload. BMJ. 2010;341:c7126. doi:10.1136/bmj.c7126.
17. U.S. National Library of Medicine. Detailed Indexing Statistics 1965–2016. https://www.nlm.nih.gov/bsd/index_stats_comp.html. Accessed 07/27/2017.
18. Hersh W. Information Retrieval: A Health and Biomedical Perspective. New York, NY: Springer Verlag; 2009.
19. Magrabi F, Westbrook JI, Coiera EW. What factors are associated with the integration of evidence retrieval technology into routine general practice settings? International Journal of Medical Informatics. 2007;76(10):701–709. doi:10.1016/j.ijmedinf.2006.06.009.
20. Tannery NH, Epstein BA, Wessel CB, Yarger F, LaDue J, Klem ML. Impact and user satisfaction of a clinical information portal embedded in an electronic health record. Perspectives in Health Information Management. 2011;8:1d.
21. Van Duppen D, Aertgeerts B, Hannes K, et al. Online on-the-spot searching increases use of evidence during consultations in family practice. Patient Education & Counseling. 2007;68(1):61–65. doi:10.1016/j.pec.2007.04.008.
22. Westbrook JI, Coiera EW, Gosling AS. Do online information retrieval systems help experienced clinicians answer clinical questions? Journal of the American Medical Informatics Association. 2005;12(3):315–321. doi:10.1197/jamia.M1717.
23. Westbrook JI, Coiera EW, Sophie Gosling A, Braithwaite J. Critical incidents and journey mapping as techniques to evaluate the impact of online evidence retrieval systems on health care delivery and patient outcomes. International Journal of Medical Informatics. 2007;76(2–3):234–245. doi:10.1016/j.ijmedinf.2006.03.006.
24. Steinbrook R. Searching for the right search – reaching the medical literature. New England Journal of Medicine. 2006;354(1):4–7. doi:10.1056/NEJMp058128.
25. DeLone W, McLean E. Information systems success: the quest for the dependent variable. Information Systems Research. 1992;3(1):60–95.
