Abstract
Objective: The objectives were (1) to develop an academic, graduate-level course designed for information professionals seeking to bring evidence to clinical medicine and public health practice and to address, in the course approach, the “real-world” time constraints of these domains and (2) to further specify and realize identified elements of the “informationist” concept.
Setting: The course took place at the Division of Health Sciences Informatics, School of Medicine, Johns Hopkins University.
Participants: A multidisciplinary faculty, selected for their expertise in the course core competencies, and three students, two post-graduate National Library of Medicine (NLM) informationist fellows and one NLM second-year associate, participated in the research.
Intervention: A 1.5-credit, graduate-level course, “Informationist Seminar: Bringing the Evidence to Practice,” was offered from October through December 2006. In this team-taught course, a series of lectures by course faculty and panel discussions involving outside experts were combined with in-class discussion, homework exercises, and a major project that involved choosing and answering, in both oral and written form, a real-world question based on a case scenario in clinical or public health practice.
Conclusion: This course represents an approach that could be replicated in other academic health centers with similar pools of expertise. Ongoing journal clubs that reiterate the question-and-answer process with new questions derived from clinical and public health practice and incorporate peer review and faculty mentoring would reinforce the skills acquired in the seminar.
Highlights
Interdisciplinary faculty designed and offered a graduate-level course to teach the skills required by an informationist in clinical and public health practice, further elaborating a model for preparing informationists.
Implications
This scalable approach to teaching skills for the transfer of evidence into practice could be replicated in academic health centers with similar pools of expertise; such replication could contribute data toward validating this training approach.
Greater clarity on an appropriate, or “good enough,” standard of evidence for supporting point-of-action decision making is needed.
Based on the assumption that practicing skills increases confidence and the likelihood that skills will be applied, this course included mentored practice of oral and written evidence presentation skills. Further research could determine whether a course that includes such mentored practice increases the likelihood that students will apply their newly acquired skills.
INTRODUCTION
Current and accurate information is a critical component of good health care practice, and the role that information professionals can play in facilitating the transfer of evidence into practice is continuing to expand. Over thirty years ago, Lamb drew attention to the gap that can exist between the published clinical literature and the knowledge that individual physicians bring to bear on a patient's care [1, 2]. As part of her work in this area, she established the first clinical librarian program at the University of Missouri–Kansas City School of Medicine in 1971. In the 1990s, Giuse and her colleagues at Vanderbilt University Medical Center expanded this concept by redefining the clinical librarian's role, implementing and evaluating innovative practice with intensive care and other teams that emphasized acceptability by clinicians and demonstrated competencies of medical librarians [3–6].
In 2000, Davidoff and Florance [7, 8] expressed concern about inadequate inclusion of new knowledge from the published literature in clinical decision making and called for development of a new professional to address this gap. To respond to clinical realities, they proposed developing a “national program, modeled on the experience of clinical librarianship, to train, credential, and pay for the services of information specialists” [9]. These “informationists” would be cross-trained specialists who have specific content knowledge, could provide in-depth information services, and would be uniquely qualified to apply their expertise to information problem solving in a specific domain [9].
The name, the role, and even the need for such information specialists are topics under debate in the literature. Whether the informationist is really a new professional category or just a new word for a clinical librarian, whether the informationist can be seen as naturally evolving from the clinical librarian's historical role, and what might be the distinguishing characteristics of the informationist are just some of the questions being asked [7, 8, 10–19].
Models in which information specialists facilitate the application of evidence-based answers to questions arising in clinical practice have been tested internationally as well. In 2002, Greenhalgh and colleagues [20] at the University College London Medical School described and compared two models of “informaticist” service: one, “more academically rigorous with a research component and little personal contact with practitioners,” and the other, “based in a general practice and one that took a more flexible, facilitative approach.”
Considerable attention has been paid to the application of the informationist concept to the clinical setting, and the importance of such a role in basic research has been considered [21, 22]. However, less attention has been paid to the role of the informationist in the public health setting, although there is a growing need for research into roles that combine the unique attributes of public health expertise (i.e., the focus on a population versus an individual, the drive to advance prevention, and the need to evaluate programs and activities) with expertise in library and information science and technology. In 2003, Swain and colleagues [23] reported an early exploration of a team role for information professionals in public health as part of a training exercise for bioterrorism preparedness and response at the US Centers for Disease Control and Prevention (CDC). For the training exercise, they demonstrated strategies to meet the needs of public health professionals working in the field, with all its associated information retrieval and connectivity challenges. In an independently administered evaluation of the exercise, team members stated that the informationist's expertise as an information professional contributed positively to the team's ability to respond effectively to those needs.
From this literature emerges an implicit mandate to train informationists. In 2005, the Welch Medical Library at Johns Hopkins University responded to this need by sponsoring two National Library of Medicine (NLM) informationist fellows, designing individualized curricula to provide training in the skills required for their new roles as informationists. This paper describes the design and content of this approach to informationist training and discusses the results and lessons learned from the first offering of this seminar in the fall of 2006.
COURSE DESIGN AND OVERVIEW
The “Informationist Seminar: Bringing the Evidence to Practice” course was a 1.5-credit, graduate-level course offered through the School of Medicine at Johns Hopkins. Taught for the first time from October through December 2006, this seminar was designed for information professionals seeking to develop the core competencies required of an informationist in the fields of clinical medicine and public health: how to identify a question; how to search for and critically appraise the available, relevant evidence; and how to effectively present that evidence in response [18]. In particular, the goal of this first offering was to address the specific curriculum needs of two NLM informationist fellows: a senior fellow whose focus was clinical and a junior fellow whose focus was public health. Both of these NLM informationist fellows were medical librarians. A third student, a second-year NLM associate fellow and medical librarian, audited the course. All three either had completed or were currently taking basic coursework in statistics.
The course faculty, chosen on the basis of their expertise in the required competencies, consisted of a librarian with public health training, who served as course director and recruited the other participating faculty; an NLM post-doctoral informatics fellow, who has been a faculty member and administrator in a graduate school of library and information science; a clinician educator, who practices medicine and teaches evidence-based medical care; a systematic review expert, who serves as co-director of the Johns Hopkins Evidence-based Practice Center; an expert searcher, who has post-graduate training and several years of experience conducting searches and teaching search methodology; and a scientific editor, who teaches biomedical writing. The involvement of faculty with a wide range of expertise reflected the recognition that the skills needed by informationists are multifaceted. The faculty worked together as a team to develop and teach the course, meeting extensively in the months preceding the course to read and discuss the relevant literature, to arrive at a concerted understanding of the informationist concept, and, especially for the nonlibrarian faculty, to fully appreciate and acknowledge the merit in training information professionals for a role in facilitating the transfer of evidence into practice. Based on their discussion, the faculty agreed on the competencies to be taught and identified the most effective didactic approaches to meet the objectives of the course.
Competencies
Skill in identifying and searching resources relevant to a particular question is an essential part of librarianship. The faculty chose to include searching as a competency for the course because the students' experience in searching the literature varied, with some being more experienced than others. The faculty reasoned that at a minimum, this approach would offer a valuable review, provide a context for the subsequently taught competencies, and could conceivably improve already-acquired skills.
Traditional training in clinical librarianship enables students to understand the clinical context and theory of levels of evidence but not how to identify, evaluate, and effectively present evidence in response to a clinical or public health question. Medical residents often gain such competency in applying evidence-based medicine principles to the bedside during the course of months of clinical care and rounds. To provide informationists with a comparable level of skill in identifying and presenting relevant evidence in a shorter period of time, the course faculty made these core competencies a focus of this team-based course [24].
Course objectives
The course objectives were to teach students to (1) understand and demonstrate the evidence-based answering cycle, including the ability to define a taxonomy of questions; (2) describe the answer process and identify a question domain; (3) search, screen, and evaluate evidence to support a query; (4) demonstrate appropriate skills for presenting evidence-based data; and (5) demonstrate team membership and participation in a framework defined by real-world time constraints.
Educational strategies
A modified seminar format, combining lectures and laboratory exercises with in-class discussion and a major project, was used to teach the skills involved in finding, analyzing, and delivering evidence for clinical and public health decision making: the identification of a question embedded in a case presentation, development of effective search strategies for relevant evidence to address the question, evaluation and synthesis of the identified evidence, and effective presentation of that evidence. The course followed the eight-week format typically used by the schools of medicine and public health, with the class meeting once weekly for three hours.
Course prerequisites included a master's degree from an accredited library and information science program or permission of the instructors. Evidence of an applicant's training and experience in literature searching was considered an important factor in the instructors' evaluation of course applicants who might seek an exception to the required professional information degree.
Evaluation of student performance was based on class participation, completion of reading assignments and homework exercises, and satisfactory completion of a major project (described below) that involved choosing and answering, in both oral and written form, a real-world question based on a case scenario in clinical or public health practice.
Course format
In this team-taught course, a series of lectures by course faculty and panel discussions involving outside experts were combined with in-class discussion and homework exercises. In general, each class session included a didactic lecture or panel discussion, laboratory exercises applying the principles articulated in the didactic presentation, a discussion of the previous week's assignments, and a class component related to each student's major project. The final class was devoted to the students' oral presentations and the response of the course faculty, students, and invited guests to those presentations.
Course content
Introductory sessions
The course began with an introduction to the concept and evolution of the informationist role, which was followed by panel discussions involving several invited clinical and public health experts who described their work environments and the types of questions that are asked in their work settings. The panel members elucidated their need for evidence, including the timeframes in which they needed questions answered and the types of evidence they required. During the first class session, course participants were also given a one-hour overview of the process for developing a systematic review of the literature, as the “gold standard” for assessing the state of the published literature addressing a clinical or public health question [18, 25–29].
The major course project: answering a real-world question
A major focus of this course was a project designed to give the students hands-on experience in analyzing and presenting evidence to support decision making by health practitioners in a particular situation. Because it was such an integral part of the course, the project was woven into each class session.
At the beginning of the course, the students began work on this major course project by selecting a primary question from a list of real-world questions contributed by several members of the clinical and public health faculties at Johns Hopkins. These questions involved health care scenarios that were of particular professional interest to the experts who had posed them. Examples included: “There was a recent review article that salmeterol can kill patients. How should we be using it in pediatric asthma?” and “Provide a background literature review for a randomized controlled trial (RCT) being undertaken with methamphetamine-using youth in Thailand. Specifically: what is the human immunodeficiency virus (HIV) risk among drug-using youth?”
The selected questions were then used as real-world examples for the class exercises and formed the basis of the students' final oral and written presentations. The students were given contact information for the person who had originated the question they had chosen, and they were encouraged to establish ongoing contact with the question originator to allow them to obtain feedback and, as necessary, to refine the original question and their answer.
Students' progress in carrying out the major project, including oral and written presentations at the conclusion of the course, was monitored on an ongoing basis during class sessions. They received periodic feedback from the faculty and their fellow students with regard to the formulation of their question, their search strategy, and their analysis of the evidence they obtained. Homework exercises and each assignment related to the major project (defining a search strategy, conducting the search, evaluating the search results, and presenting the results in written and oral form) were reviewed by the appropriate faculty members. The students' presentations at the end of the course were evaluated by course faculty, with additional input from the clinical and public health practitioners who posed the questions addressed by students. In addition, each student acted as a peer reviewer of another student's search strategy and final written presentation.
Course sessions
The remaining class sessions—“Question and Context Specification,” “Searching and Screening,” and “Effective Presentation of Search Results”—were designed to address a series of identified learning objectives related to the core skills needed by informationists. The approach taken to help students achieve each of these objectives is described in the following sections.
Objectives 1 and 2: To understand and demonstrate the evidence-based answering cycle, including the ability to locate the question at hand in an established taxonomy of questions, to describe the answer process, and to identify a question domain. Ely and Osheroff have articulated the concept of a taxonomy of questions that are relevant to clinicians [30–32]. The course faculty used this concept to raise the students' awareness of the attributes of possible questions, not only their domain (e.g., cardiac versus endocrine), but also their type and purpose, as suggested by the Ely-Osheroff taxonomy. Echoing the working hypothesis underlying these two researchers' investigations, the faculty pointed out the need for informationists to match the resources and the type of evidence summary to the type of questions being asked.
Objective 3, part 1: To learn to search and screen evidence to support a query. Two class sessions were devoted to searching, screening, and evaluating the literature. While teaching the mechanics of good searching is relatively straightforward, the challenge for informationists is to define a “sufficient search”: a search that, while not a comprehensive search of the kind performed for a systematic review, is of sufficient scope and depth so as to avoid introducing bias into the process. While the clinical informationist program at Vanderbilt has developed standards of practice, such as determining the best articles representing multiple viewpoints found to address a question under examination [6, 33–35], there is currently no widely established standard for a sufficient search-and-screening process. The focus of a sufficient search, as presented in this course, was on identifying pre-appraised evidence.
Students were first advised to identify the highest quality evidence available, starting with a search for evidence-based guidelines from the National Guideline Clearinghouse [36] and websites of relevant professional societies. They were then instructed to look for systematic reviews through searches of The Cochrane Library, PubMed, and Embase, and, finally, for primary studies. For questions addressing therapy or treatment, students were advised to seek RCTs, which are considered the gold standard of evidence for such questions. For other sorts of questions, faculty and students discussed what they considered to be appropriate sources. For instance, for questions about the incidence or prevalence of diseases, the recommendation was to seek population-based studies or surveys from the appropriate agencies. For each of these types of evidence, clear definitions and suggestions for screening results were provided. Some of the screening criteria discussed with the students related to recency and directness, that is, how closely an article matched the question being asked.
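The staged strategy above (guidelines first, then systematic reviews, then primary studies filtered by question type) can be sketched as a small query-building helper. This is an illustrative sketch only, not part of the course materials: the function, question categories, and term lists are hypothetical, though the PubMed field tags ([pt], [sb], [mh]) are standard PubMed search syntax.

```python
def build_pubmed_query(topic_terms, question_type):
    """Combine topic terms with a publication-type filter suited to the
    question type: therapy questions favor RCTs, background questions
    favor systematic reviews, prevalence questions favor
    population-based study designs. Categories are hypothetical."""
    filters = {
        "therapy": "randomized controlled trial[pt]",   # RCTs as gold standard
        "review": "systematic[sb]",                     # systematic review subset
        "prevalence": "cross-sectional studies[mh]",    # population-based designs
    }
    topic = " AND ".join(topic_terms)
    pub_filter = filters.get(question_type)
    if pub_filter:
        return f"({topic}) AND {pub_filter}"
    return f"({topic})"

# For example, the salmeterol question posed in the course might become:
query = build_pubmed_query(["salmeterol", "asthma", "child"], "therapy")
# -> "(salmeterol AND asthma AND child) AND randomized controlled trial[pt]"
```

In practice, the informationist would still begin with guideline sources and pre-appraised evidence before falling back to a primary-study query like this one.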
As a homework exercise, the students were asked to develop a search strategy for their primary question and for a classmate's question and to present them orally at the next class session. During the following session, the students presented their own search strategy and reviewed and critiqued each other's strategies.
Objective 3, part 2: To evaluate and synthesize evidence to support a query. In these sessions, the faculty reviewed how to identify different types of guidelines and studies and pointed out the potential sources of bias for each of these sources of evidence. This discussion led to an overview of critical appraisal, including the identification of existing forms and resources for completing appraisals. Standard critical appraisal forms for therapy, diagnosis, etiology, and prognosis studies are available from a number of sources, including the Evidence Based Medicine Tool Kit produced by the University of Alberta and the Center for Evidence-based Medicine tools [37]. The JAMA Users' Guides to the Medical Literature [38], covering a broader range of article types, are also freely available online. Through feedback on their presentations, the students reviewed the appropriate language to use to describe the results of studies and the recommendations from guidelines.
Objective 4: To demonstrate appropriate skills for presenting evidence-based answers. As preparation for their final presentation, the students first drafted a written synthesis of the evidence they identified as addressing their real-world question. The students then participated in a class session on effective writing, which focused on stylistic approaches to achieving accuracy, brevity, clarity, and responsiveness to the question and its context. Illustrative examples drawn from the students' drafts and other related documents were used as a starting point for a discussion that underscored common errors that can interfere with comprehension and emphasized the importance of technical accuracy and sensitivity to the readers' expectations. The students then read each other's drafts and offered constructive suggestions for their improvement. The course instructors also provided feedback regarding both content and writing style. On the basis of these comments, the students produced a revision for which they received a grade. Before their final oral presentation to the faculty and the practitioners who had developed the case scenarios, the students' written syntheses were edited for grammar and style by one of the faculty members. The students used these suggestions to polish their final written syntheses that were distributed during the oral presentations.
Objective 5: To demonstrate team membership and participation through oral presentation of an answer, within a framework defined by real-world time constraints. In the context of the learning environment, the students had several weeks to work with one question and prepare an answer. However, decisions about sources to search, types of evidence to consider, and specific content and format of the answer were made based on the premise that, in the real world, the students would have a very limited time to prepare and present an answer. In a lecture and laboratory session addressing the oral presentation of an answer in the clinical or public health context, specific attention was paid to the challenge of garnering the respect of those asking a question. Practice presentations were videotaped and replayed as needed during the discussion that followed. A recurring theme in the class was the necessity for the informationist to balance the rapid response time needed by the clinical team with the desire for complete and correct results. In particular, the faculty pointed out the lack of research to guide the informationist in making this tradeoff: There is currently no available evidence- or principle-based method to help clinicians, librarians, or informationists decide when to stop a search and to summarize what has already been found. Learning the art of successfully achieving this tradeoff between completeness and rapid turnaround is a key goal in on-the-job learning, and that future goal was pointed out to the students. More concretely, the presentation itself was done with the entire class standing, to model the pressured environment and immediacy of clinical information exchange on the ward.
Particular emphasis was placed on understanding the context of the clinical group, because this is one of the most stressful, high-pressure environments in which informationists might be asked to function. Students were advised that they could gain respect in such situations by finding clinically relevant resources that the medical residents and attending physicians had not found and by demonstrating knowledge of the clinical context and its constraints. Advice regarding the pragmatics of answering questions was based on the faculty members' experiences with clinical services, both as librarians and as attending physicians concerned with teaching evidence-based medicine. The students were advised to base their presentations on the following outline:
— Restatement of the question: This restatement would remind the group of the question, focus their attention, and indicate the clinical assumptions that informationists intuited because of their domain knowledge, which would further serve to increase credibility (e.g., a request for information about tympanograms implies that pneumatic otoscopy has already been attempted).
— A brief mention of the sources searched: The purpose here was to point out that there is more to searching than just consulting PubMed.
— A brief summary of the “answers” provided in the resources and any critical-appraisal issues or conflicts between the sources: This type of summary served to temper the perception that a single article answers the clinical question.
The videotaped final oral presentation simulated a rounds experience, with the audience standing and the presenter expected to present the highlights with credibility, reserving details for requests only. Immediate constructive criticism and suggestions were offered by the faculty and fellow students. First, the audience's visceral reaction was sought: Was the presentation convincing? Did the audience feel that it got an answer? Was it an authoritative answer? Was it the right answer? Then, both the style and content were discussed.
The videotapes were useful not only for discussions with students during the practice sessions but also for other purposes, and they will continue to be. For example, one faculty reviewer who was unable to attend the final presentation later used the videotape to provide feedback and comments to one of the fellows. The course faculty will also draw on the videotaped practice and final presentations from the first course in planning the next offering (scheduled for spring 2008).
MAIN OUTCOMES
Three students enrolled in this first offering of the seminar: two NLM informationist fellows and an NLM second-year associate fellow. All three students were active participants in the class, and they successfully completed the assigned exercises and delivered final oral and written presentations of evidence for their chosen real-world cases.
At the conclusion of the course, the students were asked to complete an anonymous online evaluation adapted from a standard Johns Hopkins course evaluation form. The course received an overall evaluation of 4.66, with 5 (“excellent”) being the highest possible score. Student comments ranged from “this was a really good course” to suggestions for improvement in workload pacing, timing for synthesis instruction (one suggested it start earlier), and concern about mid-course revisions of the syllabus. All students rated the course content goals and the usefulness and practicality of the content as 5 out of 5. The degree to which the goals were achieved received an average score of 4.3 out of 5. Clarity of goals, organization and sequence of content, and style of education all received an average rating of 4.3. All students said they would recommend the course to other fellows, agreed that the course would improve their professional effectiveness, and found the course intellectually challenging.
Positive feedback was also received from the clinical and public health practitioners who provided the questions addressed by the students in their major project. One of these practitioners commented that the search turned up articles he had previously not seen and that he had subsequently retrieved them for review. There was also a general consensus among the course planners that the course, as taught, was clearly only introductory in nature. The course faculty sensed that the competencies presented in the course would actually merit a full graduate degree program, in which expanded coursework in each competency would be combined with mentored internships and ongoing seminars. In recent months, other librarians at Johns Hopkins have expressed interest in the course, which will next be offered in the spring of 2008. One of the enrolled NLM informationist fellows reported use of the skills he acquired in the course: Shortly after the seminar concluded, he was called on to respond to a question from a clinician and to present the relevant data found. The clinician soon returned with another question and request for available evidence. The fellow interpreted the second request as indicating the clinician's satisfaction with the first response and as illustrating the usefulness of the skills he had acquired in the course.
The course faculty met at the conclusion of the course to assign grades and reflect on the lessons learned from the course, the most important of which are discussed below.
LESSONS LEARNED AND CONCLUDING REMARKS
The course described here represents the efforts of an interdisciplinary faculty—drawing on the evidence in the literature and their own teaching, research, and clinical experience—to develop the informationist concept by designing a course to teach the skills required by an informationist to bring evidence to clinical and public health practice. At present, many of the skills involved in facilitating evidence-based practice are taught individually and are scattered across a range of separate courses offered by library schools, in schools of medicine or public health, and in continuing education venues at professional meetings. The new seminar described here combines a range of such skills in one course and supplements other existing resources for informationist training, such as Vanderbilt's recent (2006) initiative in publishing case studies of informationist practice [27].
Because, like many other academic medical institutions, Johns Hopkins had no practicing informationists who could train or act as role models for its fellows, this informationist training course was developed by assembling an interdisciplinary team of faculty members, each of whom had expertise in one or more of the required skills. The varied backgrounds of the faculty, along with the prior clinical and public health experience of the students, enriched the classroom discussions and highlighted differences in disciplinary approaches to information gathering and evidential problem solving. For example, the clinical approach to information seeking tends to focus on issues related to applying results of a systematic review of RCTs to a specific patient, where the degree of fit between the patient and the evidence must continually be considered by the clinician. In public health, on the other hand, decisions regarding a particular treatment approach may be applied to large populations comprising thousands of individuals and representing a multitude of personal situations and contexts.
The faculty members all brought expertise in different aspects of the skill set needed for the informationist role, but their exposure to the concept of this new professional role varied. During the sessions on searching and screening, several of the course faculty shared additional insights with the class that addressed searching in their own particular domains of expertise, and the exchange that ensued demonstrated the iterative nature of question formulation and search development.
An interesting feature of the planning process was the extensive discussion of the literature required during course development. This continued discussion among the faculty not only led to an evolving course design, but also promoted faculty buy-in to the concept and role of this new professional, the informationist.
The fellows enrolled in this first offering brought varying levels of background in statistics, epidemiology, search skills, and critical appraisal. During their post-course evaluation, the faculty agreed that future offerings of the seminar should include prerequisites pertinent to the searching and evaluation section of the course: (1) introductory coursework in epidemiology and statistics, (2) more hands-on critical appraisal exercises, and (3) participation in peer-reviewed, search-skill seminars or extensive experience with in-depth searching, such as that associated with systematic reviews. This additional coursework and experience would help ensure that students began at a comparable level of understanding, permitting more in-depth discussion of bias, avoiding confusion over the description of study results, and ensuring that students were advanced searchers. These additions to the course design would also heighten students' sensitivity to the importance of having relevant domain knowledge, using language appropriate to their particular context, and demonstrating superior search skills. Such contextual sensitivity and search expertise help informationists not only convey knowledge about a topic effectively, but also fit in and be perceived as valued members of a clinical or public health team.
One challenge the faculty encountered in offering this course was the absence of an established "good enough" standard of evidence, short of a full systematic review, that meets the time constraints of real-world practice. Future research is needed to establish such a standard.
Student follow-up at a later date is needed to establish the long-term impact of the course and to confirm that its objectives were appropriate and pitched at the right level. This first experience, and the early feedback the faculty received, suggest that the approach described here could profitably be replicated in other academic health centers with similar pools of expertise.
This introductory course is only a first step in the development of an effective informationist. Ongoing journal clubs, which could reiterate the information-seeking process by posing new questions derived from clinical and public health practice and incorporating peer review and faculty mentoring, would reinforce the skills acquired in the course described here.
In summary, an interdisciplinary group of faculty at Johns Hopkins has developed and offered a course for informationists that represents a novel approach to teaching skills important to the transfer of evidence into practice. This successful first experience with a small group of informationist fellows suggests that a course of this type has the potential to serve as a scalable model for training adequate numbers of these new information professionals. The authors believe that a major advantage of this interdisciplinary approach to teaching informationist skills lies in the ease with which the model could be replicated at other academic medical institutions, which are likely to have faculty with a similar range of necessary expertise. In addition, although the course described here focused on enhancing evidence-based clinical and public health practice, the authors suggest that the same principles and interdisciplinary approach could be applied to teaching evidence-based library and information science practice in a wide variety of fields.
Acknowledgments
The course faculty thank the NLM fellows and first students—Robert Swain, Doug Varner, AHIP, and Lisa Massengale—for their contributions to this course. Thanks are also due to the participating panelists and domain experts for the questions and wisdom they contributed: Ronald Gray, Pat Thomas, Jennifer McIntosh, and Jinlene Chan.
Footnotes
* The authors acknowledge with appreciation the contributions that funding from the National Library of Medicine (NLM) made to the training programs described (1F37LM008608-01, 1F38LM008610-01, and the second-year associate fellowship). Kathleen Burr Oliver acknowledges the Centers for Disease Control and Prevention Information Center and its director, Jocelyn Rankin, AHIP, FMLA, for funding an Oak Ridge Institute for Science and Education (ORISE) fellowship that led to one of the NLM informationist fellowships described above and enabled some development and testing of the public health informationist concept.
† Based on a presentation at the “Evidence Based Library and Information Practice Conference (EBLIP4)”; Chapel Hill, NC; May 6–9, 2007.
REFERENCES
- Acari R, Lamb G. The librarian in clinical care. Hosp Med Staff. 1977 Dec; 6(12):18–23. [PubMed] [Google Scholar]
- Lamb G, Jefferson A, White C. And now, clinical librarians on rounds. Hartford Hosp Bull. 1975;30(2):77–86. [Google Scholar]
- Giuse NB, Huber JT, Giuse DA, Kafantaris SR, and Stead WW. Integrating health sciences librarians into biomedicine. Bull Med Libr Assoc. 1996 Oct; 84(4):534–40. [PMC free article] [PubMed] [Google Scholar]
- Giuse NB, Huber JT, Kafantaris SR, Giuse DA, Miller MD, Giles DE, Miller RA, and Stead WW. Preparing librarians to meet the challenges of today's health care environment. J Am Med Inform Assoc. 1997 Jan–Feb; 4(1):57–67. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Giuse NB. Advancing the practice of clinical medical librarianship [editorial]. Bull Med Libr Assoc. 1997 Oct; 85(4):437–8. [PMC free article] [PubMed] [Google Scholar]
- Giuse NB, Kafantaris SR, Miller MD, Wilder KS, Martin SL, Sathe NA, and Campbell JD. Clinical medical librarianship: the Vanderbilt experience. Bull Med Libr Assoc. 1998 Jul; 86(3):412–6. [PMC free article] [PubMed] [Google Scholar]
- Davidoff F, Florance V. The informationist: a new health profession? Ann Intern Med. 2000 Jun 20; 132(12):996–8. [DOI] [PubMed] [Google Scholar]
- Davidoff F, Florance V. The informationist [letters]. Ann Intern Med. 2001 Feb 6; 134(3):252–3. [PubMed] [Google Scholar]
- Oliver K. The Johns Hopkins Welch Medical Library as base: information professionals working in library user environments. In: Library as place: rethinking roles, rethinking space. Washington, DC: Council on Library and Information Resources, 2005:66–75. [Google Scholar]
- Houghton B, Rich EC. The informationist [letters]. Ann Intern Med. 2001 Feb 6; 134(3):251–2. [DOI] [PubMed] [Google Scholar]
- Jorgensen DB. The informationist. Ann Intern Med. 2001 Feb 6; 134(3):251. [DOI] [PubMed] [Google Scholar]
- Kronenfeld M. “The informationist: a new health profession?” so what are we? chopped liver? Natl Netw. 2000 Oct; 25(2):115. [PubMed] [Google Scholar]
- Plutchak TS. Informationists and librarians [editorial]. Bull Med Libr Assoc. 2000 Oct; 88(4):391–2. [PMC free article] [PubMed] [Google Scholar]
- Plutchak TS. The informationist—two years later [editorial]. J Med Libr Assoc. 2002 Oct; 90(4):367–9. [PMC free article] [PubMed] [Google Scholar]
- Sandroni S. The informationist. Ann Intern Med. 2001 Feb 6; 134(3):251. [DOI] [PubMed] [Google Scholar]
- Schott MJ. The informationist. Ann Intern Med. 2001 Feb 6; 134(3):252–3. [DOI] [PubMed] [Google Scholar]
- Detlefsen EG. The education of informationists, from the perspective of a library and information sciences educator. J Med Libr Assoc. 2002 Jan; 90(1):59–67. [PMC free article] [PubMed] [Google Scholar]
- Sackett DL. Evidence-based medicine: how to practice and teach EBM. 2nd ed. Edinburgh, Scotland, UK: Churchill Livingstone, 2000. [Google Scholar]
- Hammet T, Oliver K, Rankin J. GAP project (an internal proposal to test the informationist role in the Centers for Disease Control and Prevention's Global AIDS Program). 2003. [Google Scholar]
- Greenhalgh T, Hughes J, Humphrey C, Rogers S, Swinglehurst D, and Martin P. A comparative case study of two models of a clinical informaticist service. BMJ. 2002 Mar 2; 324(7336):524–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Shipman JP, Cunningham DJ, Holst R, and Watson LA. The informationist conference: report. J Med Libr Assoc. 2002 Oct; 90(4):458–64. [PMC free article] [PubMed] [Google Scholar]
- Lyon J, Giuse NB, Williams A, Koonce T, and Walden R. A model for training the new bioinformationist. J Med Libr Assoc. 2004 Apr; 92(2):188–95. [PMC free article] [PubMed] [Google Scholar]
- Swain R, Oliver K, Rankin J, Bonander J, Loonsk J. Bioterrorism alert: reference and literature support for Emergency Operations Center [EOC] and investigative teams. Ref Serv Rev. 2004;32(1):74–82. [Google Scholar]
- Kern DE, Thomas PA, Howard DM, and Bass EB. Curriculum development for medical education: a six-step approach. Baltimore, MD: Johns Hopkins, 1998. [Google Scholar]
- Brownson R, Baker E, Leet T, and Gillespie K. Evidence-based public health. New York, NY: Oxford University Press, 2003. [Google Scholar]
- Fielding JE, Briss PA. Promoting evidence-based public health policy: can we have better evidence and more action? Health Affairs. 2006;25(4):969–78. doi: 10.1377/hlthaff.25.4.969. [DOI] [PubMed] [Google Scholar]
- Jerome RN, Miller RA. Expert synthesis of the literature to support critical care decision making. J Med Libr Assoc. 2006 Oct; 94(4):376–81. [PMC free article] [PubMed] [Google Scholar]
- Oliver KB, Roderer N. Working toward the informationist. Health Informatics J. 2006;12(1):41–8. doi: 10.1177/1460458206061207. [DOI] [PubMed] [Google Scholar]
- Giuse NB, Sathe NA, and Jerome R. Envisioning the information specialist in context: a multi-center study to articulate roles and training models [web document]. Chicago, IL: Medical Library Association, 2006. [cited 10 Aug 2007]. <http://www.mlanet.org/members/pdf/isic_final_report_feb06.pdf>. [Google Scholar]
- Ely JW, Osheroff JA, Maviglia SM, and Rosenbaum ME. Patient-care questions that physicians are unable to answer. J Am Med Inform Assoc. 2007 Jul–Aug; 14(4):407–14. Epub 2007 Apr 25. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ely JW, Osheroff JA, Chambliss ML, Ebell MH, and Rosenbaum ME. Answering physicians' clinical questions: obstacles and potential solutions. J Am Med Inform Assoc. 2005 Mar–Apr; 12(2):217–24. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ely JW, Osheroff JA, Gorman PN, Ebell MH, Chambliss ML, Pifer EA, and Stavri PZ. A taxonomy of generic clinical questions: classification study. BMJ. 2000 Aug 12; 321(7258):429–32. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Giuse NB, Koonce TY, Jerome RN, Cahall M, Sathe NA, and Williams A. Evolution of a mature clinical informationist model. J Am Med Inform Assoc. 2005 May–Jun; 12(3):249–55. Epub 2005 Jan. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Jerome RN, Giuse NB, Gish KW, Sathe NA, and Dietrich MS. Information needs of clinical teams: analysis of questions received by the Clinical Informatics Consult Service. Bull Med Libr Assoc. 2001 Apr; 89(2):177–84. [PMC free article] [PubMed] [Google Scholar]
- Rosenbloom ST, Giuse NB, Jerome RN, and Blackford JU. Providing evidence-based answers to complex clinical questions: evaluating the consistency of article selection. Acad Med. 2005 Jan; 80(1):109–14. [DOI] [PubMed] [Google Scholar]
- Department of Health and Human Services. National Guideline Clearinghouse [web document]. Washington, DC: The Department. [cited 13 Aug 2007]. <http://www.guidelines.gov>. [Google Scholar]
- Evidence-based medicine toolkit [web document]. University of Alberta. Edmonton, AB, Canada: The University. [rev. 11 April 2003; cited 16 Aug 2007]. <http://www.ebm.med.ualberta.ca>. [Google Scholar]
- Centre for Health Evidence, University of Alberta. Users' guides to evidence-based practice [web document]. Edmonton, AB, Canada: The University. [cited 15 Aug 2007]. <http://www.cche.net/usersguides/main.asp>. [Google Scholar]
