The term “education” has many meanings, although education as it is commonly delivered shows little effect on the performance of clinicians or the outcomes of health care. This lack of effect is especially true in continuing medical education (CME), where education often implies a large, group-based session held in a hotel or conference setting. The American Medical Association defines CME as “any and all ways by which physicians learn and maintain their competence” — clearly a much more comprehensive construct than attending a short course.1
In this paper, we describe educational interventions that are designed to promote the incorporation of best evidence into the practices of health professionals. We address a theoretical basis for the learning and education of physicians (we refer mainly to physicians because most studies in this area have involved them). We also provide an outline of effective large-group methods, describe innovations in formal education that use high-tech (and low-tech) strategies and discuss future trends in CME.
What are the purposes of education?
Why do health care professionals learn? They are driven by many external forces. These forces include the explosion of knowledge, interest in CME among specialty societies, the use of CME “credits” to document the maintenance of knowledge and skills, and substantial involvement in CME by pharmaceutical and other commercial interests that recognize it as a means of influencing the practices of physicians. Of course, many internal forces are at work as well, including an innate sense of professionalism on the part of most health care workers.
The question of how physicians learn is equally complex. Fox and his colleagues asked over 300 North American physicians what practices they had changed and what forces had driven that change.2 The forces for change described by the physicians were varied. Whereas some changes had arisen from traditional educational experiences, many more resulted from intrapersonal factors (e.g., a recent personal experience) or from changing demographics (e.g., aging or changing populations and changes in the demands of patients). The changes varied from small adjustments or accommodations (e.g., adding to a regimen a new drug within a class of drugs already known and prescribed) to much larger “redirections,” such as adopting an entirely new method of practice.
In another approach to understanding how physicians learn, Schön describes the internal process of learning and reflection. He suggests that a potent mechanism of learning arises from self-appraisal and awareness built from clinical experiences, leading to a new and expanded competency or “zone of mastery.”3 Candy’s description of the traits of the self-directed learner is another model for learning.4 These traits include discipline and motivation, analytic abilities, an ability to reflect and be self-aware, curiosity, openness and flexibility, independence and self-sufficiency, well-developed information-seeking and retrieval skills, and good general learning skills. Clearly these are desirable — if not always fully achievable — attributes.
Many authors have suggested steps in the process of change.5 Rogers6 referred to this process as the innovation–decision process, whereas Prochaska and Velicer7 referred to it as the transtheoretical model. Specifically focusing on physicians, Pathman and colleagues8 used the model known as awareness–agreement–adoption–adherence to describe the process. Here, the physician becomes aware of a new finding or practice, moves to a process of agreement with it and then to an adoption of it, either on a trial or irregular basis. Finally, she adheres to the practice, conforming, for example, to guideline-based recommendations whenever indicated. These stages of learning are important when considering the effect of educational interventions.
What is the process for education?
Education is one means to effect changes in performance and improve practice-related outcomes, thereby achieving knowledge translation. The PRECEDE model (Predisposing, Reinforcing and Enabling Constructs in Educational Diagnosis and Evaluation) of Green and Kreuter,9 which incorporates elements characterized as predisposing, enabling and reinforcing, helps with the conceptualization of education as an intervention. In this model, predisposing elements include mailed guidelines, didactic lectures, conferences and rounds, all of which may predispose toward change in the uptake of knowledge. Enabling elements include patient education materials and other tools (e.g., flow charts) that might enable the change. Finally, reinforcing strategies include reminders, or audit and feedback, which are useful in solidifying a change already made. A systematic review10 supports this construct and enables us to consider aligning educational interventions with the stage of learning (Table 1).
Table 1.
| Continuum of learning or change8 | Awareness | Agreement | Adoption | Adherence |
|---|---|---|---|---|
| Elements of change9 | Predisposing elements | Enabling strategies | Reinforcing elements | |
| Possible roles for educational interventions | Conferences, lectures, rounds, print materials | Small-group learning activity; interactivity in lectures | Workshop; materials distributed at conferences; audit and feedback | Audit and feedback; reminders |
The characteristics of the educational intervention and the process through which the learner adheres to a new practice provide a framework to highlight the development and use of educational interventions. This framework uses the four steps in the Pathman model as an organizing principle. First, several systematic reviews have identified that most didactic conferences10,11 or mailed materials,12 which employ only one technique, infrequently produce change in performance. However, this finding may be “unfair” to such traditional modalities because they may play a crucial role in predisposing to change. For example, where health professionals are unaware of new evidence, conferences, print materials and rounds may predispose them to change. Second, when learners are aware of a new finding or guideline but do not agree with it, small-group learning or increased interactivity in the conference-based setting exposes them to peer influence.13 Such influences are a strong predictor of increased discussion and possible consensus. Third, if the issue is one of adopting a new manual skill, a communication skill or a complex algorithm for care, more in-depth workshops or online learning experiences may facilitate the change.13 Finally, once the process has been adopted, system-based interventions such as reminders or audit and feedback may be considered to facilitate sustainability.14
What educational interventions can we use to effect knowledge translation?
Large-group sessions
Educational events for large numbers of learners are commonplace, although the evidence indicates that this type of educational intervention produces little, if any, change in performance. However, several studies10,11,15,16 have outlined relatively useful and effective strategies within the large-group model to increase the impact on performance and outcomes in health care. These strategies include needs assessment,15 increased interactivity16 and variation in the educational method.11
Determining needs and setting objectives
There is ample evidence that not only the needs of learners but also the needs of their patients and the gaps in the systems of health care within which learners practise should drive CME.11 Health care systems frequently use only objectively determined needs (e.g., the clinical care “gap”) to drive the educational agenda, a process that neglects how clinicians learn and may fail to lead to benefit. In contrast, CME planners frequently use subjective assessments of needs, despite evidence that clinicians may be poor self-assessors17,18 and that objectively determined gaps may more closely link the CME process to demonstrable outcomes.
Subjective strategies for assessment of needs include questionnaires, focus groups, structured individual interviews and diaries or log books. To offset the deficiencies of self-assessment inherent in these methods and to create a more holistic strategy for needs assessment, objective tools can be used. These tools include standardized assessments of knowledge or skills, audits of charts, peer review, observation of the practice of health professionals and reports of patterns of practice, and data on the performance of physicians.19
The results of these assessments can be used to produce objectives for educational activities. Continuing medical education has shifted from conceiving of these as learning objectives (i.e., what the learner should know or be able to do at the end of the activity) to conceiving of them as behavioural objectives (i.e., what the learner should be expected to do as a result of what has been learned).
Formatting the large-group session
Several strategies can enhance the delivery of effective formal, large-group CME. These strategies include employing multiple methods within the framework of the activity, increasing the interactivity of the sessions and using other strategies to increase the reach and impact.11
Multiple methods
As discussed in a previous article in this series,20 no clear evidence exists of a benefit of multicomponent interventions over single-component interventions. However, it has been suggested that multicomponent interventions could be more effective than single interventions if they address different types of barriers to change. Within formal CME events, the most recent evidence shows that multiple methods used within the context of the activity may promote uptake and translation into practice.11
The methods may be characterized in several ways. First, formal sessions may use a variety of media for presentation (e.g., audiotapes to present heart sounds, actual or standardized patients or videotapes, panel-based discussions to present conflicting perspectives on one topic, debates to highlight issues where agreement is lacking, and quizzes to determine learning needs). Second, given that knowledge is a necessary but not sufficient condition for change in performance to occur, practice-enablers may be distributed and used during the course of a standard CME event. Examples of practice-enablers are reminders, protocols and flow sheets for care of patients, patient education materials, wall charts and other tools that may be used in the practice-based setting after the conclusion of the CME activity.11
Third, CME activities may use clinical scenarios and vignettes in an attempt to increase relevance and applicability of educational material. Vignettes are frequently derived from actual clinical cases and modified to ensure patient confidentiality and exemplify details of history, diagnosis or management.21 They are used to promote reflection and interaction. There are many methods of presenting such cases or clinical stories. Short paper-based cases can involve prompts for discussion of diagnosis or management. Standardized patients can present highly credible clinical findings and histories. Video- and audio-based cases, role-playing and simulation-based techniques may add relevance and increase potential for learning.11
Staging a multimethod learning experience so that it is interrupted has been shown to increase its effect.11 For example, two workshops of three hours each, held a month apart, allow the learners to absorb information from the first event, apply it in the work setting and then discuss this process, with reinforcement of learning, during the second event. Another example of this interrupted learning process is the opportunity afforded by the weekly or monthly recurrence of clinical rounds.
Interactivity
With fairly clear evidence for effect,16 interactivity increases the interplay among audience members, or between participants and the presenter. There are a number of ways in which this interactivity can be accomplished.
To facilitate interaction between the presenter and participants, planners may expand the question-and-answer portions of lectures, divide lectures into 10-minute segments of presentation followed by questions and answers21 or use an audience-response system.22 The last method may employ technology to poll the audience for responses to projected questions, or it may use low-tech (though not so anonymous) options such as colour-coded cards.
To facilitate interaction among participants, buzz groups — described by the noise they make in a normally quiet audience — can be used. This method allows participants to engage neighbouring audience members in conversation. Pyramiding or snowballing builds on interactions between pairs of participants by expanding them to groups of four or six and eventually involving all participants. An example of this method is “think–pair–share,” in which a reflective exercise occurs first (i.e., a quiet moment for participants to think of a particular case), followed by discussion of the idea with a neighbouring participant and then sharing of the idea with the larger audience.
Small-group learning
Small-group learning in CME is one of many innovations spurred by the growth of problem-based learning methods in undergraduate medical education. This method uses groups of 5–10 people and employs many of the principles of effective CME (e.g., case vignettes, group-based discussion and high degrees of interactivity). Groups meet regularly, usually without an expert, and are led by one of their own members, who acts as a facilitator. Common in Canada and in Europe, these groups have shown an impact on competence and performance. This impact is most likely due to a combination of their concentration on evidence-based materials and their heavy reliance on peer pressure and influence.13,23 An example is the practice-based small-group learning activities of the Foundation for Medical Practice Education, which includes over 3000 family physicians.23 While some groups are informal and self-organizing, many others are part of national maintenance-of-competence and CME programs, such as those of professional certifying bodies.24
Distance-based education techniques
While in-person CME remains a primary vehicle for the delivery of knowledge, other ways exist in which knowledge translation may be accomplished. For example, programs with a visiting speaker may use web-, video- or audio-casts. These activities must be interactive to engage the learner and improve their impact. They may employ interactive cases and other methods to stimulate the learner to use critical thinking and problem-solving. Recent studies have shown increases in knowledge and retention of knowledge by physicians after participation in online CME courses.25 If appropriately designed, such courses may be superior to live activities in effecting changes in physician behaviour.26
Online communities of practice27 are another potential intervention for knowledge translation. Groups of learners participate in audio conferences and case discussions. Follow-up or support is achieved electronically, using reminders, cases and other tools to promote networking and consulting among peers. These groups can assist in evaluating the effectiveness of the education as well as determining needs for new activities. They can build a knowledge base that is both community-based and shared.
Self-directed learning
Some health professionals prefer more self-directed choices because of learning style or logistic need. Such choices include traditional sources such as textbooks, monographs, clinical practice guidelines and journals that provide clinical information. Important developments to aid self-directed learning include the advent of printed or computerized self-assessment programs, which provide learners with feedback about their competence as they read materials and answer questions.
Portfolio-based learning28,29 is also an important tool in self-directed learning and is derived from the concept of the artist’s or photographer’s collection of work. More complex, however, than a simple accumulation of exemplary work, the portfolio is intended to document educational activities undertaken by the clinician; quality-related records (i.e., chart reviews carried out or performance-related milestones achieved); gaps in learning that have been identified; examples of learning plans and objectives and the resources used to meet them; and other data related to performance and outcomes in health care. Portfolios can be used for self-reflection, for self-assessment and learning, or in an educational manner (providing grist for conversation with a peer or other mentor), or applied to questions of relicensure, recertification and other needs. Some professional organizations have tools to facilitate this activity.30
Current trends in CME
Trends in and challenges to the construct, delivery and use of CME have led to a more holistic and integrated understanding, by learners and CME providers, of this last and longest phase of clinicians’ learning. Among the trends is the changing construct of CME, from the traditional understanding of it as a vehicle for the transfer of information to a more complete understanding of the learning process and the complex health system in which it occurs. Another trend is an increasing focus on performance and outcomes in health care and the use of performance measures. This focus is leading planners of CME to pay greater attention to Levels 4–6 of the Moore31 evaluation schema (Table 2), as opposed to their previous focus on lower levels.
Table 2.
| Level | Outcome | Metrics or indicators |
|---|---|---|
| 1 | Participation | Attendance |
| 2 | Satisfaction | Satisfaction of participant |
| 3 | Learning | Changes in knowledge, skills or attitude |
| 4 | Performance | Changes in performance in practice |
| 5 | Patient-specific health | Changes in health status of patient |
| 6 | Population-specific health | Changes in health status of population |
New and emerging diseases represent one of the challenges. A need exists for rapid-response educational technologies in the face of serious issues related to SARS-like diseases, pandemic influenza and bioterrorism. These issues call for the use of technologies such as short-message-service texting, fax-based networks, email and other means of communication that are based on “push” technology. Management of chronic diseases is another challenge. Researchers have outlined the need for improved management of chronic diseases in an aging population with comorbidities. This need shows promise as a driver of the educational aspects of knowledge translation, such as the creation of interprofessional educational initiatives, the dissemination and incorporation of algorithms for complex care and the use of point-of-care resources, among other methods for learning.
Maintenance of licensure and certification constitutes a further challenge. The traditional notion of credit, which, for physicians at least, is linked to mandatory participation in CME, is increasingly questioned by licensing bodies and specialty societies. The traditional credit-hour has served to document participation in CME but falls short in showing translation to maintained competence or improved performance. With the movement toward more self-directed, practice-based learning, critics have argued for a system of relative value that provides higher-value credit for those activities that show improved practice. This concept is incorporated into the movement toward maintenance of licensure and recertification in the United States and Canada.30,32
Gaps in educational interventions
Several areas of research are important in an era of accountability and movement toward demonstrated competence and performance as the result of participation in CME. These areas include questions about the learner (i.e., are self-assessment and self-directed learning innate character traits, or can they be taught? If the latter, how can this best be accomplished?), about the vehicles for communication (i.e., what vectors for the transmission of knowledge work best: PDA-mediated educational messages or traditional educational ones?) and about the context of learning (e.g., the setting of learning, its remuneration pattern, its linkage to resources in information technology) and its effects on learning and uptake. Finally, a large area for research involves determining the factors that influence the uptake of information, including questions about the nature, complexity, compatibility and level of the evidence to be adopted.
The book Knowledge Translation in Health Care: Moving from Evidence to Practice, edited by Sharon Straus, Jacqueline Tetroe and Ian D. Graham and published by Wiley-Blackwell in 2009, includes the topics addressed in this series.
Key points
The effectiveness of large-group sessions in continuing medical education can be enhanced by using rigorous needs assessments and increasing interactivity and engagement in the learning process.
Other interventions that show promise include small-group learning, communities of practice and distance-based education.
Self-directed learning may be enhanced by the addition of portfolio-based learning and self-assessment exercises.
Articles to date in this series
Straus SE, Tetroe J, Graham ID. Defining knowledge translation. www.cmaj.ca/cgi/doi/10.1503/cmaj.081229
Brouwers M, Stacey D, O’Connor A. Knowledge creation: synthesis, tools and products. www.cmaj.ca/cgi/doi/10.1503/cmaj.081230
Kitson A, Straus SE. The knowledge-to-action cycle: identifying the gaps. www.cmaj.ca/cgi/doi/10.1503/cmaj.081231
Harrison MB, Légaré F. Adapting clinical practice guidelines to local context and assessing barriers to their use. www.cmaj.ca/cgi/doi/10.1503/cmaj.081232
Wensing M, Bosch M, Grol R. Developing and selecting interventions for translating knowledge to action. www.cmaj.ca/cgi/doi/10.1503/cmaj.081233
Footnotes
Funding: No external funding was received for this paper.
Competing interests: None declared.
Contributors: Both of the authors were involved in the conception and development of the article. Dave Davis performed the review of the literature. Both of the authors drafted the manuscript, critically revised it for important intellectual content and approved the final version submitted for publication.
This article has been peer reviewed.
REFERENCES
1. American Medical Association. The Physician’s Recognition Award and credit system: information for accredited providers and physicians. 2006 revision. Chicago (IL): The Association; 2006. p. 2. Available: www.ama-assn.org/ama1/pub/upload/mm/455/pra2006.pdf (accessed 2009 Oct. 19).
2. Fox RD, Mazmanian PE, Putnam W. Changing and learning in the lives of physicians. New York (NY): Praeger Publishers; 1994. p. 371.
3. Schön DA. The reflective practitioner: how professionals think in action. London (UK): Temple Smith; 1983. p. 137.
4. Candy PC. Self-direction for lifelong learning. San Francisco (CA): Jossey-Bass Publishers; 1991. pp. 86–94.
5. Grol R, Wensing M, Eccles M. Improving patient care: the implementation of change in clinical practice. London (UK): Elsevier; 2005. pp. 41–59.
6. Rogers E. Diffusion of innovations. 5th ed. New York (NY): Free Press; 2003. pp. 146–50.
7. Prochaska JO, Velicer WF. The transtheoretical model of health behaviour change. Am J Health Promot. 1997;12:38–48. doi: 10.4278/0890-1171-12.1.38.
8. Pathman DE, Konrad TR, Freed GL, et al. The awareness-to-adherence model of the steps to clinical guideline compliance: the case of pediatric vaccine recommendations. Med Care. 1996;34:873–89. doi: 10.1097/00005650-199609000-00002.
9. Green LW, Kreuter MW. Health promotion planning: an educational and ecological approach. 4th ed. Toronto (ON): McGraw Hill; 2005. pp. 140–7.
10. Davis DA, Thomson MA, Oxman AD, et al. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700–5. doi: 10.1001/jama.274.9.700.
11. Marinopoulos SS, Dorman T, Ratanawongsa N, et al. Effectiveness of continuing medical education. Evid Rep Technol Assess (Full Rep). 2007:1–69.
12. Wagner TH. The effectiveness of mailed patient reminders on mammography screening: a meta-analysis. Am J Prev Med. 1998;14:64–70. doi: 10.1016/s0749-3797(97)00003-2.
13. Peloso PM, Stakiw KJ. Small-group format for continuing medical education: a report from the field. J Contin Educ Health Prof. 2000;20:27–32. doi: 10.1002/chp.1340200106.
14. Jamtvedt G, Young JM, Kristoffersen DT, et al. Audit and feedback: effects on professional practice and health care outcomes [review]. Cochrane Database Syst Rev. 2006;(2):CD000259. doi: 10.1002/14651858.CD000259.pub2.
15. Davis D, O’Brien MA, Freemantle N, et al. Impact of formal continuing medical education: do conferences, workshops, rounds and other traditional continuing education activities change physician behaviour or health outcomes? JAMA. 1999;282:867–74. doi: 10.1001/jama.282.9.867.
16. Thomson O’Brien MA, Freemantle N, Oxman AD, et al. Continuing education meetings and workshops: effects on professional practice and health care outcomes [review]. Cochrane Database Syst Rev. 2001;(2):CD003030. doi: 10.1002/14651858.CD003030.
17. Sibley JC, Sackett DL, Neufeld V, et al. A randomized trial of continuing medical education. N Engl J Med. 1982;306:511–5. doi: 10.1056/NEJM198203043060904.
18. Davis DA, Mazmanian PE, Fordis M, et al. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296:1094–102. doi: 10.1001/jama.296.9.1094.
19. Lockyer J. Needs assessment: lessons learned. J Contin Educ Health Prof. 1998;18:190–2.
20. Wensing M, Bosch M, Grol R. Developing and selecting interventions for translating knowledge to action. CMAJ. 2009 Dec. 21 [Epub ahead of print]. doi: 10.1503/cmaj.081233.
21. Brose JA. Case presentation as a teaching tool: making a good thing better. J Am Osteopath Assoc. 1992;92:376–8.
22. Gagnon RJ, Thivierge R. Evaluating touch pad technology. J Contin Educ Health Prof. 1997:20–26.
23. Foundation for Medical Practice Education. Available: www.fmpe.org/en/about/background.html (accessed 2009 Oct. 14).
24. College of Family Physicians of Canada. MainPro program. Available: www.cfpc.ca/English/cfpc/cme/mainpro/maintenance%20of%20proficiency/default.asp?s=1 (accessed 2009 Oct. 14).
25. Casebeer LL, Kristofco RE, Strasser S, et al. Standardizing evaluation of on-line continuing medical education: physician knowledge, attitudes and reflection on practice. J Contin Educ Health Prof. 2002;24:68–75. doi: 10.1002/chp.1340240203.
26. Fordis M, King JE, Ballantyne CM, et al. Comparison of the instructional efficacy of Internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA. 2005;294:1043–51. doi: 10.1001/jama.294.9.1043.
27. Wenger EC, Snyder WM. Communities of practice: the organizational frontier. Harv Bus Rev. 2000;78:139–45.
28. Parboosingh J. Learning portfolios: potential to assist health professionals with self-directed learning. J Contin Educ Health Prof. 1996;16:75–81.
29. Campbell C, Parboosingh J, Gondocz T, et al. Study of physicians’ use of a software program to create a portfolio of their self-directed learning. Acad Med. 1996;71:S49–51. doi: 10.1097/00001888-199610000-00042.
30. Royal College of Physicians and Surgeons of Canada. Continuing professional development. Ottawa (ON): The College; 2005. Available: http://rcpsc.medical.org/opd/index.php (accessed 2009 Apr. 27).
31. Moore DL. A framework for outcomes evaluation in the continuing professional development of physicians. In: Davis DA, Barnes BE, Fox RD, editors. The continuing professional development of physicians. Chicago (IL): AMA Press; 2003. pp. 249–69.
32. American Board of Medical Specialties. ABMS maintenance of certification. Chicago (IL): The Board; 2009. Available: www.abms.org/maintenance_of_certification/abms_moc.aspx (accessed 2009 Oct. 19).