Abstract
Background
Effective learning can occur at the point of care, when opportunities arise to acquire information and apply it to a clinical problem. To assess interest in point-of-care learning, we conducted a survey to explore radiologists' attitudes and preferences regarding the use of just-in-time learning (JITL) in radiology.
Materials and Methods
Following Institutional Review Board approval, we invited 104 current radiology residents and 86 radiologists in practice to participate in a 12-item Internet-based survey to assess their attitudes toward just-in-time learning. Voluntary participation in the survey was solicited by e-mail; respondents completed the survey on a web-based form.
Results
Seventy-nine physicians completed the questionnaire, including 47 radiology residents and 32 radiologists in practice; the overall response rate was 42%. Respondents generally expressed strong interest in JITL: 96% indicated a willingness to try such a system, and 38% indicated that they definitely would use a JITL system. They expressed a preference for learning interventions of 5–10 min in length.
Conclusions
Current and recent radiology trainees have expressed a strong interest in just-in-time learning. The information from this survey should be useful in pursuing the design of learning interventions and systems for delivering just-in-time learning to radiologists.
Key Words: Continuing medical education (CME), radiology education, just-in-time learning, survey research, radiology workflow, systems integration
Introduction
Rapid advances in medical science and technology require physicians to continue learning throughout their careers. Although physicians frequently have questions that arise during the course of their work, most questions go unanswered.1 There is evidence that traditional methods of medical education – particularly those of continuing medical education (CME) intended to support lifelong learning – are ineffectual and in need of change.2,3 A traditional CME architecture of receptive learning has little or no impact on the quality of medical care.4
When learning is remote in time, place, and context from the environment in which the content can be applied, physicians may never use the learned information and may not understand how to apply the knowledge in other clinical settings.1,5 To be effective, learning must be incorporated into a schema that supports its application to a range of “real world” situations. Lave and Wenger have described “situated learning,” in which the learning experience is integrated into the setting in which the knowledge can be used.6 Also, adults are self-directed learners who desire to learn when the learning can be applied directly to a practical need or question.7
Just-in-time learning (JITL) provides brief educational experiences targeted to a specific need or clinical question. Digital radiology is ideally suited for situated learning environments because one can readily identify the context of a radiologist's work. As part of a project to integrate context-oriented JITL into the workflow of radiologists' practice, we explored the perceived needs and preferences of radiologists for a JITL system in radiology.
Materials and Methods
To assess the perceived instructional needs and interests of potential users of JITL in radiology, we conducted a survey of radiology trainees and radiologists in practice. The Human Research Review Committee of the Medical College of Wisconsin (MCW) approved the study protocol. We obtained the names and e-mail addresses of 104 current residents from three diagnostic radiology training programs: MCW Affiliated Hospitals, University of Southern California, and Loma Linda University. We also obtained a list of 157 radiologists in practice who had completed a radiology residency or fellowship at MCW Affiliated Hospitals since June 1990; of these, we polled the 86 radiologists for whom e-mail addresses were available. Voluntary participation in the survey was solicited by e-mail messages to all subjects.
A 12-item multiple-choice questionnaire was presented to participants using a commercial web survey site (www.surveymonkey.com). The survey described JITL briefly and included questions about demographics (year of completion of residency, gender), CME activities, level of interest in JITL, and preferences for a JITL system. Respondents were offered a chance to win a $25 gift certificate from a drawing.
The response data were analyzed blindly. The results for overall preference and for learning-module duration were further analyzed by partitioning the data by residency status (current vs. completed), gender, picture archiving and communication system (PACS) use, and year of completion of residency training. The chi-square test was used to assess differences between observed and expected frequencies.
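As an illustration of this analysis, the sketch below applies a chi-square test to a hypothetical 2 × 2 partition of responses; the counts shown are invented for demonstration and do not reproduce the study data or its reported p-values.

```python
# Minimal sketch of the chi-square analysis described above, using a
# hypothetical 2 x 2 contingency table of residency status vs. response.
# The counts are NOT the study's raw data; they only illustrate the method.
from scipy.stats import chi2_contingency

# Rows: current residents (n = 47), practicing radiologists (n = 32)
# Columns: "definitely would use" vs. all other responses (hypothetical split)
observed = [
    [20, 27],
    [10, 22],
]

# For a 2 x 2 table, scipy applies Yates' continuity correction by default.
chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```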
Results
Seventy-nine physicians completed the questionnaire, including 47 radiology residents (response rate, 45%) and 32 radiologists in practice (response rate, 37%). Respondents included 55 men (70%) and 24 women (30%). The average time for respondents to complete the questionnaire was 3.8 ± 1.8 min. Seventy-two percent of respondents completed or will complete their residency training after 2002; 13% completed training 10 or more years ago.
PACS usage was frequent, particularly among the current trainees. Sixty-six percent of practicing radiologists and 89% of current residents used a PACS to interpret more than 75% of imaging studies in their practice. All current residents used a PACS to interpret at least 25% of studies. The educational activities in which respondents most frequently participated were (in order of decreasing frequency): journal-based CME, on-line CME, and national meetings of general radiology societies.
Respondents expressed interest in using a JITL system, or trying a JITL system and using it if they liked it (Table 1). Overall, 76 of 79 respondents (96%; with a 95% confidence interval of 91–100%) indicated a willingness to try such a system. As shown in Table 2, radiologists expressed the greatest preference for learning modules or interventions of 5–10 min in duration. The aspect of learning that respondents identified as most important was the ability to design a personalized curriculum specific to their topics of interest for learning (Table 3).
Table 1.
We are Building a System to Deliver Personal “Just in Time” Learning. The System Will Let You Review Brief Modules for CME Credit (“Micro-CME”) on Demand Through Your PACS System or the Web. Each Module Might Take 3 to 15 Minutes to Complete. How Would You Rate Such a System for Your Personal Use?
Response | Number | Percent (%) | 95% CI |
---|---|---|---|
I definitely would use it | 30 | 38 | 27–49 |
I would try it, and might use it if I liked it | 46 | 58 | 47–70 |
Unsure | 3 | 4 | 0–9 |
I probably would not use it | 0 | 0 | 0–1 |
I would not use it | 0 | 0 | 0–1 |
The 95% confidence interval (CI) was computed for each response.
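The report does not state which confidence-interval method was used. A minimal sketch, assuming a standard normal-approximation (Wald) interval, comes close to the values reported in Table 1; a small adjustment such as a continuity correction would account for the remaining one-point differences, but this is an illustration rather than the authors' exact calculation.

```python
# Sketch of a 95% confidence interval for a response proportion using the
# normal (Wald) approximation. The survey's exact CI method is not stated,
# so the reported intervals may differ slightly from these values.
import math

def wald_ci(count: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Return (lower, upper) bounds of the 95% CI for count/n, clipped to [0, 1]."""
    p = count / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half_width), min(1.0, p + half_width)

# Example: 76 of 79 respondents were willing to try a JITL system.
lower, upper = wald_ci(76, 79)
print(f"{76 / 79:.0%} (95% CI, {lower:.0%} to {upper:.0%})")  # ~96% (92% to 100%)
```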
Table 2.
What Would be the Ideal Amount of Time for Each Learning (CME) Module?
Time per Learning Module (min) | Number | Percent (%) |
---|---|---|
3 | 3 | 4 |
5 | 28 | 35 |
10 | 33 | 42 |
15 | 15 | 19 |
Table 3.
Percentage of Respondents Who Described Each Feature as “Very Important,” “Important,” or “Not Important.” The Score Value Ranges from 0 (Completely “Not Important”) to 100 (Completely “Important”), and is Computed as a Weighted Average of Responses in Each Category. Radiologists Were Posed the Question: “Please Rate the Importance of Each Feature of A ‘Just in Time’ CME System”
Feature | Very Important (%) | Important (%) | Not Important (%) | Score |
---|---|---|---|---|
Ability to specify the number of modules to be viewed per day | 35 | 48 | 16 | 59 |
CME content closely related to the organ system or body part you're viewing on PACS | 37 | 58 | 5 | 66 |
CME content related to the imaging modality you're viewing on PACS | 29 | 62 | 9 | 60 |
Ability to define your own “curriculum” to indicate the areas in which you want CME content | 63 | 37 | 0 | 82 |
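The captions of Tables 3, 4, and 6 describe the score as a weighted average on a 0–100 scale but do not list the weights. A minimal sketch, assuming equally spaced weights across the response categories, approximately reproduces the reported scores; small discrepancies reflect rounding of the displayed percentages.

```python
# Sketch of the 0-100 score described in the captions of Tables 3, 4, and 6.
# The weights are an assumption (equally spaced across response categories);
# they approximately reproduce the reported scores.
def weighted_score(percentages: list[float], weights: list[float]) -> float:
    """Weighted average of response percentages, scaled to a 0-100 score."""
    return sum(p * w for p, w in zip(percentages, weights))

# Table 3 uses three categories: Very Important, Important, Not Important.
three_level_weights = [1.0, 0.5, 0.0]
# "Ability to define your own curriculum": 63% / 37% / 0% -> reported score 82
print(round(weighted_score([63, 37, 0], three_level_weights)))   # 82

# Tables 4 and 6 add an "Unsure" category; assume equally spaced weights.
four_level_weights = [1.0, 2 / 3, 1 / 3, 0.0]
# "Images of related teaching cases": 51% / 48% / 1% / 0% -> reported score 83
print(round(weighted_score([51, 48, 1, 0], four_level_weights)))  # 83
```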
In terms of content presentation of the learning experience, respondents indicated that text, images, and drawings were the most useful to them (Table 4). There was a stronger preference for “bullet points” (that is, abbreviated text outlines, such as review points) over full sentences such as those found in journal articles (Table 5). Interactivity in and feedback from the learning experience were of intermediate importance. The ability to form groups and work with others found the least favor among respondents (Table 4). As shown in Table 6, immediate feedback and explanations of the answer choices were the most important aspects of the multiple-choice questions to be included in learning modules.
Table 4.
Percentage of Respondents Who Described Each Feature as “Very Important,” “Important,” “Unsure,” or “Not Important.” The Score Value Ranges from 0 (Completely “Not Important”) to 100 (Completely “Important”), and is Computed as a Weighted Average of Responses in Each Category. Radiologists Were Posed the Question: “How Important are the Following Potential Features of Learning Modules?”
Feature | Very Important (%) | Important (%) | Unsure (%) | Not Important (%) | Score |
---|---|---|---|---|---|
Textual descriptions of diseases and imaging findings | 46 | 52 | 3 | 0 | 81 |
Images of related teaching cases | 51 | 48 | 1 | 0 | 83 |
Anatomical drawings or schematic diagrams | 56 | 38 | 5 | 0 | 84 |
Animation | 10 | 34 | 35 | 20 | 45 |
Interactive modules | 16 | 59 | 19 | 5 | 62 |
Opportunity to get feedback | 21 | 59 | 12 | 9 | 64 |
Opportunity to give feedback to authors | 8 | 41 | 24 | 27 | 43 |
Ability to form groups so you can work with others | 1 | 8 | 49 | 42 | 23 |
Table 5.
Preference for Text Style: Full Sentences vs. Lists of “Bullet Points”
Text Style | Prefer (%) | Unsure (%) | Do Not Prefer (%) |
---|---|---|---|
Full sentences | 25 | 49 | 26 |
Bullet points | 72 | 24 | 4 |
Table 6.
Percentage of Respondents Who Described Each Feature as “Very Important,” “Important,” “Unsure,” or “Not Important.” The Score Value Ranges from 0 (Completely “Not Important”) to 100 (Completely “Important”), and is Computed as a Weighted Average of Responses in Each Category. Radiologists Were Asked: “Each CME Module Will Include A Few Multiple-Choice Questions. Please Rate the Value of Each Feature”
Feature | Very Important (%) | Important (%) | Unsure (%) | Not Important (%) | Score |
---|---|---|---|---|---|
Ability to bypass the multiple-choice questions and skip CME credit | 15 | 41 | 33 | 11 | 53 |
Immediate feedback to show the correct answers | 57 | 42 | 1 | 0 | 85 |
Explanations of why each of the choices is right or wrong | 72 | 24 | 4 | 0 | 89 |
Feedback to compare your performance with that of your peers | 13 | 27 | 30 | 30 | 41 |
As noted above, current radiology residents were significantly more likely than practicing radiologists to use a PACS for a larger proportion of their interpretations (p = 0.002). Women were more likely than men to express strong interest in using a JITL system (p = 0.034). Otherwise, there was no significant difference in either the overall interest in JITL or the preferred duration of learning modules (p ≥ 0.09) when the data were partitioned by residency status (current vs. completed), year of completion of residency training, gender, or PACS use.
Discussion
Web-based information technology has created robust and easily accessible opportunities for continuing medical education and continued professional development. Internet-based programs have been shown to impart knowledge as effectively as traditional CME activities.8,9 When Internet-based CME programs support independent and individualized self-directed learning, they more closely meet physicians' needs and motivations.10,11 The American Medical Association has recently endorsed point-of-care educational interventions, including Internet searching and independent learning, that lead to performance improvement.12
Medical decision making is aided by immediate, targeted information delivered at the point of care.13 Interactive and contextually relevant (“situated”) CME can improve knowledge, skills, attitudes, behavior, and health care outcomes.14 Thus, “just-in-time” learning helps promote evidence-based radiology, where scientific knowledge is linked to radiology practice.15
Traditional CME activities and many on-line CME programs – such as those offered in RadioGraphics or in the New England Journal of Medicine – have been constructed in 1-hour units. Some newer approaches, such as the case exercises on AuntMinnie.com, offer “fractional” CME credits. In our survey, radiologists expressed a preference for learning experiences of 5–15 min in length. It has been noted that when radiology residents and practicing radiologists have access to the Internet in their clinical work areas, they frequently use www.Google.com and other search tools to find relevant information to address a specific clinical question. We hypothesized that radiologists would find it appealing to obtain CME credit for brief, focused educational interventions. The current study lends support to that hypothesis.
The survey has helped identify priorities for elements to include in the instructional design of JITL modules. Graphical elements, such as radiological images and anatomic drawings, are important. Radiologists want informative feedback from a JITL system to indicate and explain the correct answers. They are less interested in working in groups or comparing their performance with that of their peers, although the latter is a requirement for Maintenance of Certification. Respondents indicated a strong preference for the ability to construct their own curriculum to individualize the areas in which JITL would be delivered. They indicated an intermediate preference for learning content linked to the organ system, body part, or imaging modality being viewed, and for the ability to bypass multiple-choice questions.
The current study has several limitations. The study sample was drawn from a limited set of training programs. The brief nature of the questionnaire necessarily limits the amount of information gathered; we believed the brevity of the survey was necessary to obtain a sufficiently large response sample. The study population was skewed intentionally to include “younger” radiologists, that is, those still in training and those who completed training in the past 15 years. Radiologists who received American Board of Radiology certification after 2002 or any subspecialty certification face the requirements for Maintenance of Certification; CME forms a critical part of that process.16 Current radiology residents were more likely than practicing radiologists to use a PACS; given the trends for increasingly widespread adoption of PACS, this potential bias was considered acceptable.
Because a rich JITL system for radiologists has not yet been developed, the respondents stated preferences in the absence of direct experience with such a technology; their preferences could change or may not generalize to a larger group. Also, the fact that respondents received the e-mail solicitation and completed the web-based survey may reflect a selection bias: radiologists less familiar or less facile with information technology may have been excluded or may have chosen not to participate. These limitations of Internet-based surveys are well recognized.17,18
Radiology education provides an excellent context in which to develop robust tools for just-in-time learning because radiology practice is intimately linked to information technology. Information about patients is managed by a radiology information system (RIS), and radiological images are stored, transmitted, and displayed by means of a computer-based PACS. We posit that an interactive, individualized Internet-based system of brief, highly focused educational modules will add value to radiology practice. The development of core elements, including reusable learning objects, would foster development of individualized curricula and learning plans. Such a system will form another component of enterprise integration, as espoused by the Integrating the Healthcare Enterprise (IHE) initiative.19,20
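As an illustration only, and not a description of any implemented system, the sketch below shows one way contextual attributes available from a PACS (imaging modality and body part, as recorded in standard DICOM fields) could be matched to a catalog of brief learning modules; the module catalog and matching rule are hypothetical.

```python
# Hypothetical sketch of context-oriented module selection: match the study
# currently displayed on PACS to learning modules by modality and body part.
# Attribute names follow standard DICOM tags (Modality, BodyPartExamined);
# the catalog and matching rule are illustrative, not an implemented system.
from dataclasses import dataclass

@dataclass
class LearningModule:
    title: str
    modality: str        # e.g., "CT", "MR", "US"
    body_part: str       # e.g., "CHEST", "ABDOMEN", "PELVIS"
    duration_min: int    # survey respondents preferred 5-10 minute modules

CATALOG = [
    LearningModule("Pulmonary embolism on CT angiography", "CT", "CHEST", 10),
    LearningModule("Adnexal masses on pelvic US", "US", "PELVIS", 5),
]

def suggest_modules(modality: str, body_part: str) -> list[LearningModule]:
    """Return modules whose context matches the study open on the PACS workstation."""
    return [m for m in CATALOG if m.modality == modality and m.body_part == body_part]

# Example: the radiologist opens a chest CT on the PACS workstation.
for module in suggest_modules("CT", "CHEST"):
    print(f"Suggested: {module.title} (~{module.duration_min} min)")
```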
Acknowledgments
This study was supported in part by a research grant to C.E.K. from the Society for Computer Applications in Radiology (SCAR). The authors are grateful to the radiologists who participated in the survey. We thank Ms. Susan Liberski for her assistance in preparing and conducting the survey, and Dr. Paul Nagy for critical review of the manuscript.
References
- 1. Fox RD, Mazmanian PE, Putnam RW, editors. Changing and Learning in the Lives of Physicians. New York: Praeger; 1989.
- 2. Greiner AC, Knebel E, editors. Health Professions Education: A Bridge to Quality. Washington, DC: National Academies Press; 2002.
- 3. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance. A systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700–705. doi: 10.1001/jama.274.9.700.
- 4. Patel MR, Meine TJ, Radeva J, et al. State-mandated continuing medical education and the use of proven therapies in patients with an acute myocardial infarction. J Am Coll Cardiol. 2004;44:192–198. doi: 10.1016/j.jacc.2004.03.070.
- 5. Brown JS, Collins A, Duguid P. Situated cognition and the culture of learning. Educ Res. 1989;18:32–42.
- 6. Lave J, Wenger E. Situated Learning: Legitimate Peripheral Participation. Cambridge: Cambridge University Press; 1990.
- 7. Knowles MS. The Modern Practice of Adult Education: From Pedagogy to Andragogy. New York: Cambridge, The Adult Education Company; 1980.
- 8. Wutoh R, Boren SA, Balas EA. eLearning: A review of Internet-based continuing medical education. J Contin Educ Health Prof. 2004;24:20–30. doi: 10.1002/chp.1340240105.
- 9. Casebeer L, Kristofco RE, Strasser S, et al. Standardizing evaluation of on-line continuing medical education: Physician knowledge, attitudes, and reflection on practice. J Contin Educ Health Prof. 2004;24:68–75. doi: 10.1002/chp.1340240203.
- 10. Kaufman DM. Applying educational theory in practice. BMJ. 2003;326:213–216. doi: 10.1136/bmj.326.7382.213.
- 11. Harden RM. A new vision for distance learning and continuing medical education. J Contin Educ Health Prof. 2005;25:43–51. doi: 10.1002/chp.8.
- 12. American Medical Association. Internet Point of Care. 2005. <http://www.ama-assn.org/ama/pub/category/15085.html>. Accessed 7 June 2005.
- 13. Elson RB, Faughnan JG, Connelly DP. An industrial process view of information delivery to support clinical decision making: Implications for systems design and process measures. J Am Med Inform Assoc. 1997;4:266–278. doi: 10.1136/jamia.1997.0040266.
- 14. Robertson MK, Umble KE, Cervero RM. Impact studies in continuing education for health professions: Update. J Contin Educ Health Prof. 2003;23:146–156. doi: 10.1002/chp.1340230305.
- 15. Evidence-Based Radiology Working Group. Evidence-based radiology: A new approach to the practice of radiology. Radiology. 2001;220:566–575. doi: 10.1148/radiol.2203001465.
- 16. Madewell JE, Hattery RR, Thomas SR, et al. Maintenance of Certification. Am J Roentgenol. 2005;184:3–10. doi: 10.2214/ajr.184.1.01840003.
- 17. Schleyer TKL, Forrest JL. Methods for the design and administration of web-based surveys. J Am Med Inform Assoc. 2000;7:416–425. doi: 10.1136/jamia.2000.0070416.
- 18. Wyatt JC. When to use web-based surveys. J Am Med Inform Assoc. 2000;7:426–429. doi: 10.1136/jamia.2000.0070426.
- 19. Flanders AE, Carrino JA. Understanding DICOM and IHE. Semin Roentgenol. 2003;38:270–281. doi: 10.1016/S0037-198X(03)00044-0.
- 20. Carr CD, Moore SM. IHE: A model for driving adoption of standards. Comput Med Imaging Graph. 2003;27:137–146. doi: 10.1016/S0895-6111(02)00087-3.