SYNOPSIS
This article describes lessons learned by the University at Albany Center for Public Health Preparedness (UA-CPHP) in using three technologies to deliver preparedness training for public health professionals in New York State. These three technologies are:
Audience response system (ARS, or the “clicker” system)—Purchased to improve engagement of all participants in heterogeneous training audiences, it also markedly reduces staff time while improving training evaluation (cost: $4,500).
Satellite broadcast programs—UA-CPHP produced more than 50 broadcasts, which remain available as videostreams and/or podcasts. Viewership of archived programs sometimes surpasses that of the live event (cost estimate: $23,000 to $39,000).
Interactive online courses—Seventeen courses have registered more than 44,000 learners worldwide. The Pandemic Influenza course alone has reached more than 16,000 registrants from all 50 U.S. states and at least 56 other countries (cost estimate: $30,000 to $65,000).
UA-CPHP's experience as a preparedness training center has confirmed that contemporary technology can be employed to improve and increase the reach of these training efforts. An additional finding was that, quite unintentionally, the intensive use of distance-based educational modalities designed to reach public health practitioners in New York State has afforded UA-CPHP a substantial national and international audience as well, and at no additional cost.
The attacks of September 11, 2001 (9/11), and the anthrax letter threats during the ensuing weeks shocked the public health system into recognizing how inadequately prepared we were to prevent, detect, and respond to large-scale emergencies. Late in 2002, the Institute of Medicine report, The Future of the Public's Health in the 21st Century,1 expressed concern that much of the public health workforce had little formal education in public health, and those who did have this education had been trained in a pre-9/11 milieu. The rapid proliferation of responsibilities related to emergency preparedness (EP) took many public health professionals into programmatic areas they could never have anticipated during their academic programs.
Responding to this nationwide need, the Centers for Disease Control and Prevention expanded the academic Centers for Public Health Preparedness (CPHP) program from four to 27 centers. The imperative for these CPHPs was to link academic expertise to state and local health agencies' needs to advance the state of readiness in their jurisdictions.
Not unlike that of the other 26 CPHPs, the primary mission of the University at Albany Center for Public Health Preparedness (UA-CPHP) is to train public health workers in an all-hazards framework to prevent, detect, and respond to emergencies. Given the depth of training needs and the geographic breadth of New York State (NYS), UA-CPHP adopted distance-based technology to deliver diverse training programs reaching more than 18,700 public health workers across all 62 NYS counties. In pursuing its mission, UA-CPHP developed programs in a variety of educational modalities appropriate to the subject matter: in-person training, including workshops, conferences, drills, and exercises; interactive online courses; and satellite broadcasts, which remain available in videostream, podcast, and DVD formats.
This article describes UA-CPHP's experiences and lessons learned with three disparate technologies we employ to deliver or augment workforce training—(1) an audience response system (ARS) for in-person trainings, (2) satellite broadcasts, and (3) interactive Web-based continuing education courses.
TRAINING METHODS
ARS
UA-CPHP delivers in-person trainings to audiences ranging widely in size, skill level, and knowledge. Engaging heterogeneous groups and eliciting participation from all segments of the audience are challenging. After observing colleagues using an ARS to address this issue successfully, UA-CPHP purchased its own system. An ARS is a real-time data collection system comprising small, handheld keypads (clickers) that audience members use to answer questions posed by the instructor.2 (Subsequent discussion refers specifically to characteristics of our radio-frequency-based system from Turning Technologies, LLC [http://www.turningtechnologies.com]. Detailed descriptions of ARSs and their use appear in an article by Caldwell.3)
UA-CPHP trainers develop conventional presentations in Microsoft® PowerPoint, augmented with proprietary software from the vendor, and intersperse slides posing ARS questions. Audience members respond by selecting an answer from the choices provided and pressing the corresponding number on their clickers. Once the entire audience has responded, the trainer displays the tabulated or graphed responses with a single click. UA-CPHP trainers often use the ARS as a modern, technology-augmented version of the Socratic method,3 using guided conversation and discussion to lead learners toward understanding of key concepts. Although the ARS is engaging, learners can be irritated by technical difficulties,4 so we advise instructors to rehearse and prepare to troubleshoot potential problems before using the system in an actual training session.
Satellite broadcast programs
Since 2002, UA-CPHP has partnered with the University at Albany Center for Public Health Continuing Education (CPHCE) to create and deliver satellite broadcasts on EP topics. These broadcasts are produced as live, scripted interviews with one or more experts in a studio at NYNetwork—a professional broadcast facility owned by the State University of New York (SUNY) and charged with disseminating NYS governmental information. Producing telecasts at NYNetwork ensures that every county in NYS has at least one site capable of downlinking our broadcasts. However, any site with compatible reception equipment can receive a broadcast once it has the technical specifications unique to that broadcast. UA-CPHP provides these details when sites register with us; the only information we require in return is the size of the expected audience. For some broadcasts, we partnered with the Upper Midwest CPHP to downlink the broadcast and provide a live videostream feed. More recently, NYNetwork has developed the capacity to provide the live videostream directly, which simplifies the process and produces a more consistently high-quality Web feed by bypassing any interference in the satellite downlink.
Potential broadcast topics and speakers are identified by various means, including recommendations (solicited or unsolicited) from our stakeholders and other contacts, staff awareness of emergent issues, and literature reviews. Once one or more expert speakers have been identified, one of our training staff works with the expert to develop an outline and a PowerPoint slide presentation. During the program, the moderator uses these materials to ask questions scripted to draw out the information in a conversational manner. Our staff person helps adapt the slides to support the unfolding narrative and addresses technical and aesthetic design issues specific to the broadcast medium. Depending on the budget, the topic, and the lead time, the video production team in CPHCE may be engaged to produce original, short video clips, interspersed throughout the program to sustain viewer interest, demonstrate a technique, or provide additional information.
Satellite broadcasts have a potentially wide reach, especially when streamed over the Internet. Live telecasts allow some limited interactivity, as viewers are invited to submit questions through telephone, e-mail, or fax, and the presenter responds to as many of the questions as possible during the broadcast. All programs are transferred to hard media (VHS tape historically and now DVD) for dissemination, and provided as archived videostreams available through UA-CPHP's website; many are available as free podcasts on our website and through the iTunes® Store.
Interactive online courses
In 2004, UA-CPHP formed a partnership to develop self-paced, interactive online courses with the Instructional Technologies Unit in the University at Albany Rockefeller Institute's Professional Development Program (PDP). In designing courses, our developers endeavor to incorporate attributes that, in our collective experience, have proven to be attractive to adult learners. These include interactivity, accuracy, and timeliness of content; real-life examples; quality technical design; and provision of a printable completion certificate and continuing education credits. UA-CPHP's online courses require anywhere from 30 minutes to six hours for the user to complete.
Typically, UA-CPHP begins course development by identifying timely topics for which colleagues with EP experience have noted gaps in educational offerings. Topics also may be chosen based on specific requests from colleagues. For example, we partnered with the University of South Carolina CPHP to develop an online course on public health ethics.
Once the topic is determined, UA-CPHP staff identifies the target audience, drafts course objectives, and develops the curriculum to meet the needs and knowledge level of that audience. UA-CPHP staff members work closely with the multidisciplinary group at PDP to guide the technical layout of the course. Depending on the project's scope, UA-CPHP staff can draw on expertise from any or all of PDP's diverse skill set, including instructional design, graphic design, scriptwriting, website design, and programming in numerous software languages. Writers divide content into several topic-specific modules, and the user can stop at any point and resume at a later time. To maximize visual clarity, each page contains a limited amount of material. Developers incorporate pictures and graphics to illustrate concepts, and real-life examples to integrate the concepts presented. Quizzes and exercises throughout the course make the experience interactive and test the user's learning. Quizzes (except for the final exam) provide immediate feedback, allowing learners to gauge their comprehension of the subject matter as they progress through the course.
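For illustration only, the following is a minimal sketch of the immediate-feedback behavior described above; the question content, function name, and structure are hypothetical and are not drawn from an actual UA-CPHP course.

```python
# Hypothetical sketch of an in-module quiz item that gives immediate feedback;
# a final exam would withhold feedback until grading. Content is invented.
def ask(question, choices, correct_index, feedback):
    print(question)
    for i, choice in enumerate(choices, start=1):
        print(f"  {i}. {choice}")
    answer = int(input("Your answer (number): "))
    print("Correct." if answer == correct_index else "Not quite.", feedback)
    return answer == correct_index

ask("At what point do in-module quizzes give feedback?",
    ["Immediately after each answer", "Only at the end of the course"],
    1,
    "Immediate feedback lets learners gauge comprehension as they progress.")
```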
RESULTS
ARS
UA-CPHP staff members have used the ARS in approximately 15 presentations, ranging from technical seminars for professional groups to community presentations in public libraries. Audience members are consistently enthusiastic about using the ARS. In trainings for Medical Reserve Corps members, 95% (88 of 93) of trainees agreed with the evaluation statement that “the ARS was beneficial to the training” (data not shown).
The system is easy to use. It takes at most 10 minutes to train presenters to use the system, and even less time for audience members to become familiar with it and answer questions quickly. The instructor orients audience members to the mechanics of the clicker by posing a simple, nonthreatening question, such as a prediction about an upcoming sporting event or local election. If the objective is to promote interaction, the presenter must be comfortable enough with the system that it does not become intrusive, and must frame questions in ways that facilitate discussion or understanding.
Logistical challenges associated with the ARS are few. Managing the distribution and collection of the clickers is the most significant issue we have identified; having staff designated to collect the clickers at the end of the session is advisable. Although we have not yet conducted a formal, quantitative comparison, the ARS appears to increase evaluation response rates because evaluation questions are commingled with the presentation. The ARS software creates a data file with all responses from each uniquely coded clicker in a single line record. This data structure affords two important analytical opportunities: (1) the ability to stratify results from substantive questions by demographic and other attributes, and (2) the ability to conduct matched-pair analyses of pre- and posttest responses.
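As an illustration of the analyses this one-record-per-clicker structure allows, the following is a minimal sketch assuming the session has been exported to a CSV file; the file name and column names (role, pre_q1, post_q1) are hypothetical and do not describe the vendor's actual export format.

```python
# Minimal sketch of the two analyses described above, assuming the ARS session
# has been exported to a CSV file with one row per uniquely coded clicker.
# The file name and column names (role, pre_q1, post_q1) are hypothetical.
import pandas as pd

df = pd.read_csv("ars_session_export.csv")

# (1) Stratify a substantive question by an audience attribute collected
#     during the same session (e.g., professional role).
print(df.groupby("role")["post_q1"].value_counts(normalize=True))

# (2) Matched-pair pre/post comparison: each participant's pretest and
#     posttest answers are paired through the clicker's unique code.
correct = "B"  # keyed answer for this item (hypothetical)
improved = (df["pre_q1"] != correct) & (df["post_q1"] == correct)
print(f"{improved.mean():.0%} moved from an incorrect pretest answer "
      "to the correct posttest answer.")
```

Because each clicker is uniquely coded, pre- and posttest answers can be paired in this way without collecting any identifying information from participants.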
Satellite broadcast programs
UA-CPHP has produced 53 satellite broadcasts on a wide range of public health preparedness topics such as “Ethics in a Time of Bioterrorism,” “Risk Communication,” and “Pandemic Influenza Planning” (a complete list is available at http://www.ualbanycphp.org/broadcasts.cfm). Beyond the NYS workforce, people from all 50 states and numerous countries regularly view these broadcasts. Based on information provided by broadcast site coordinators, live broadcasts attract an average of 3,871 viewers spanning 246 sites. (Unless otherwise specified, all data pertaining to broadcasts and online courses refer to the period ending 10/31/09.)
While we tend to focus on the live broadcast as the actual educational event, for some broadcasts the audience for archived videostreams persists over time, sometimes eclipsing the number viewing the live event. To illustrate, Table 1 shows the live event viewership and the yearly number of archived videostream viewers for the broadcasts with the three fewest and the three greatest numbers of total viewers in the most recent three complete years, 2007–2009. For example, “Assessing Chemical Exposure: A Different Approach” was viewed live by 3,736 people on January 6, 2005. Almost two years later, in 2007, 962 viewers accessed the broadcast by videostream. The number of viewers actually increased in subsequent years, to 1,334 in 2008 and 1,460 in 2009. Over just these three years, total archived viewership exceeded the live event audience; the same was true for two other broadcasts.
Table 1.
Viewers of archived videostreams compared with live satellite broadcast viewership for selected University at Albany CPHP emergency preparedness training programs, 2007–2009a
aData are provided for the three most viewed and the three least viewed programs from 2007 to 2009—the three most recent, complete years. The year 2007 is designated here as a baseline to characterize changes within this period.
CPHP = Center for Public Health Preparedness
Interactive online courses
More than 44,000 users have registered for the 17 courses we have published to date. The most popular courses, “Preparedness and Community Response to Pandemics” (Pandemic Influenza) and “Terrorism, Preparedness, and Public Health,” have reached nearly 30,000 registrants from all 50 U.S. states and at least 56 foreign countries (Table 2).
Table 2.
Geographic locations of registrants in six self-paced interactive online courses in emergency preparedness training developed by the University at Albany CPHP from course launch date through October 2009
aPuerto Rico and Guam
CPHP = Center for Public Health Preparedness
Evaluation by end users is consistently positive. Table 3 summarizes responses summed across all courses to the three (of 10 to 12) evaluation questions we deem to be the most central measures of quality and utility. Among more than 20,000 respondents, 96% rated course quality as good or excellent and 99% would recommend the course to others. On average, 88% of respondents felt the course they took would help them perform their job more effectively.
Table 3.
Evaluation summary for interactive online courses (including lowest- and highest-rated course on each of three questions) in emergency preparedness training developed by the University at Albany CPHP
aThe question was not asked in three courses: Emergency Preparedness Training for Hospital Clinicians, Your Family Disaster Plan, and Working in a POD.
b“Lowest-rated” and “highest-rated” course refers to the courses with the most extreme ratings specific to each question. Examination of the sample sizes in the “highest-rated” column makes it clear that these values represent data from three different courses.
CPHP = Center for Public Health Preparedness
Comparison of preliminary cost estimates
It is beyond the scope of this article to analyze costs systematically. We can, however, identify the major cost items with certainty and give reasonable ranges of low and high cost estimates. With only an hour or two of training required for a presenter, the sole significant cost of incorporating the ARS into trainings is the initial purchase of the hardware package, including software. The ARS radio-frequency system from Turning Technologies LLC cost approximately $4,200 for a receiver and 100 clickers. The only recurring costs we anticipate are for replacing batteries and lost clickers.
The costs of the two distance-based technologies, shown in Table 4, scale considerably with the complexity of the presentation. Fixed basic costs of a satellite broadcast include studio time; satellite uplink; and personnel costs to create presenter slides in video-compatible formats, serve as an in-studio producer, schedule the event, develop and disseminate publicity materials, and register downlink sites. Travel and honorarium costs for the speaker and moderator are basic costs that vary somewhat, depending on their employment and location. Options that may incur additional costs include an afternoon rebroadcast to accommodate Western time zones, live video streaming, multiple expert speakers, and use of a larger studio. Costs also increase if original video is produced, at approximately $1,000 per broadcast minute. We estimate overall broadcast costs range from $23,000 to $39,000 depending on the options included.
Table 4.
Comparison of estimated costs of emergency preparedness training courses by type of distance-based technology, UA-CPHP
aPersonnel costs include salaries and fringe benefits.
bCosts include studio production services, satellite fee, and production of master copy.
cEstimates for lower-cost programs are skewed lower because they also tend to be among the older ones. Current costs are considerably higher, by 50% or more, depending on the complexity.
dSatellite broadcasts include only the live event audience.
UA = University at Albany
CPHP = Center for Public Health Preparedness
PDP = University at Albany Rockefeller Institute's Professional Development Program
In our experience, the costs of self-paced, Web-based courses are almost entirely personnel costs. Given the diverse technical needs of researching the topic, programming, instructional design, and graphics production, costs can be considerable. Total costs per course range from $30,000 to $65,000. Production costs scale in proportion to course duration and the complexity of the presentation. Beyond that, costs increase markedly if original video, still photography, and/or programming in Web animation/interactivity software are specified.
These costs may seem daunting, but the large audience sizes of distance-based trainings yield reasonable costs per learner (Table 4). Using the average audience size of 3,871, the cost of a satellite broadcast ranges from $6 to $10 per learner in the live broadcast audience. Including viewers of archived videostreams would reduce this cost by about 50% (data not shown). Our Web-based courses have a per-learner cost ranging from $4 to $19. The estimate for “high cost” Web courses shown in Table 4 represents data for the Pandemic Influenza course. Despite the estimated $65,000 cost, when spread across 16,412 registrants, the cost was a reasonable $4 per learner.
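As a worked illustration of the per-learner arithmetic cited above (all dollar amounts and audience figures are taken from the text; the rounding is ours):

```python
# Worked example of the per-learner cost arithmetic cited above,
# using the figures reported in the text.
low_broadcast, high_broadcast = 23_000, 39_000
avg_live_audience = 3_871
print(low_broadcast / avg_live_audience)    # ~5.9  -> about $6 per learner
print(high_broadcast / avg_live_audience)   # ~10.1 -> about $10 per learner

pandemic_flu_cost = 65_000
pandemic_flu_registrants = 16_412
print(pandemic_flu_cost / pandemic_flu_registrants)  # ~4.0 -> about $4 per learner
```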
DISCUSSION
ARS
While we adopted the ARS in an effort to increase audience engagement, we did not anticipate the additional benefits it would bring to both training and evaluation. An instructor can quickly gauge the audience's level of understanding and adjust the presentation as needed. If differences are anticipated in advance, the instructor can use ARS feedback to branch into alternative pathways, such as showing definitions of terminology unfamiliar to some audience segments. In situations where individuals from different disciplines and agencies are training together, which is often the case during just-in-time training, the ARS can be a mechanism for equalizing the audience by allowing all participants to provide input while protecting their anonymity.5
Our relatively brief ARS experience largely agrees with what others have reported:3,5–7 the ARS is viewed positively by learners, engages the learner, and provides an effective mechanism for data collection.7 A randomized, controlled trial evaluating the use of ARS in continuing medical education showed that in presentations using an ARS, the quality of the speaker, overall presentation, and level of attention were rated higher than when the same presentation was delivered without the ARS.6 Our observations are consistent with other reports that suggest participants appreciate the immediate feedback afforded in ARS-enhanced trainings,3,6 and this increased engagement can lead to more effective learning.3,8
Two other lessons in using the ARS have emerged in training evaluation. First, the ARS radically reduces the time spent on evaluation. Previously, we usually developed scannable paper forms to evaluate in-person trainings. We estimate that process takes six to eight hours: modifying our standard template; registering the scanned form; photocopying, distributing, collecting, and manually checking and cleaning each submitted form; scanning the forms; and generating a standardized report. In contrast, the incremental time to add a similar number of ARS questions to the presentation slide set and generate a report is an hour or less. Not only do we attain nearly 100% response rates, but standard reports are also available instantaneously. Second, once stakeholders observed the data collection possibilities the ARS provides, we unexpectedly received several invitations to assist with evaluation of their programs.
Satellite broadcast programs
Satellite broadcasts also have been shown to be effective in increasing knowledge and changing attitudes.9 Although a professionally produced broadcast can be an efficient and effective means of delivering topic-specific material to a large audience, it is very resource intensive. Planning and producing a broadcast typically starts two to three months in advance and requires specialized skills and resources. Of the three technologies discussed in this article, our experience with satellite broadcasts is probably the least transferable to other institutions, possibly with the exception of CPHPs located in other capital cities. The University at Albany School of Public Health's close integration with the New York State Department of Health (NYS DOH), one of the nation's largest and most reputable state health agencies, affords us access to nationally prominent public health officials, health-care regulators, and bench scientists at a world-class public health laboratory. Expert speakers from NYS DOH cost us nothing, as they cannot accept honoraria, and for many, the studio is in the same building complex as their office. Participating in their senior EP staff meetings routinely yields recommendations for important emerging topics and expert speakers. Likewise, NYNetwork's government-centric mission keeps costs down and assures us access to every county government in NYS.
Because the satellite broadcasts are available as free, archived videostreams and podcasts, these programs constitute an enduring resource that anyone with Internet access can use in their own training. This greatly broadens our reach at minimal additional cost.
Interactive online courses
The success of our online courses may reflect the adult learner's preference for trainings that are (1) easily accessible from remote locations,10,11 (2) self-paced,10 and/or (3) free. Public health professionals in Wisconsin were found to prefer short, self-paced, asynchronous, non-degree continuing education programs over other online models.12 Although we have not compared the success of our online courses with other instructional methods, published evidence suggests that Internet-based training can produce results similar to live interactive workshops providing continuing medical education.13
Focused primarily on reaching the NYS workforce, we did not anticipate the global utilization of our online courses. Beyond the international registrants identified in Table 2, staff from two Latin American agencies found the Pandemic Influenza course to be of such quality that they requested a Spanish translation. We were able to provide a translated transcript. Creating an interactive version, however, was cost-prohibitive, as it would also have necessitated re-creating many graphics and extensive user navigation guidance. The lesson learned from this experience was that course designers should construct courses to facilitate future modifications to the extent possible within budget limits.
Without question, the hardest lessons learned about delivering online courses relate to capturing evaluation data. Earlier in UA-CPHP's development, the focus was primarily on data needed for certifying continuing education credits. Thus, course completion was defined as the user passing a final exam—available only after completing all modules. Users who are not concerned with earning credits may complete the course but not the exam; however, they cannot easily be differentiated from others dropping out earlier. Our partners at PDP have explained that retrofitting courses to report other categories requires restructuring at a fairly deep—and, therefore, costly—level. We are working with them to analyze as much information as possible from user logs sometimes exceeding a million records per course.
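The following is a minimal sketch of the kind of user-log analysis described above, assuming each log record identifies the user and the module reached; the file format, field names, and module numbering are hypothetical assumptions and do not describe PDP's actual logging system.

```python
# Hypothetical sketch: classify registrants by the furthest module reached,
# separating users who finished the content but skipped the final exam from
# users who dropped out earlier. File format and field names are assumptions.
import csv
from collections import defaultdict

last_module = defaultdict(int)
with open("course_user_log.csv", newline="") as f:
    for row in csv.DictReader(f):        # assumed columns: user_id, module_number
        user, module = row["user_id"], int(row["module_number"])
        last_module[user] = max(last_module[user], module)

FINAL_MODULE, EXAM = 6, 7                # assumed: six modules plus a final exam
reached_exam = sum(1 for m in last_module.values() if m >= EXAM)
finished_content_only = sum(1 for m in last_module.values() if m == FINAL_MODULE)
dropped_earlier = len(last_module) - reached_exam - finished_content_only
print(reached_exam, finished_content_only, dropped_earlier)
```

Even a coarse classification of this kind would separate learners who finished the content but skipped the exam from those who dropped out earlier.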
Similarly, the online course registration does not ask users to identify their nation of residence. Our belated recognition and quantification of UA-CPHP's international reach occurred only because one of our staff noticed a number of international domain names while reviewing course registrant lists; she then manually extracted the data reported in Table 2. In UA-CPHP's early forays into online course development, the focus was entirely on NYS public health workers, and a question about the registrant's nation of residence might have seemed frivolous or irrelevant.
The lessons we draw from this experience start at the earliest design stage. From inception, evaluators should identify and operationally define all metrics that can be recorded automatically and transparently as the learner moves through the course. Such metrics could include the time spent in each module and the number and identity of incorrect responses before passing each quiz. However, when considering data elements that would require additional data entry by the learner, the incremental gain in information from each candidate element must be weighed against the risk of frustrating the end-user to the point that they elect not to continue. Finally, the underlying database should be designed to accommodate the addition of data elements, such as nation of residence, that were not identified as a priority at the outset.
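A minimal sketch of the kind of extensible data capture we have in mind appears below, using SQLite for brevity; the table and column names are illustrative assumptions, not a description of the PDP production system.

```python
# Illustrative schema sketch: metrics captured automatically as the learner
# moves through a course, stored so that new fields (e.g., nation of
# residence) can be added later. Table and column names are hypothetical.
import sqlite3

con = sqlite3.connect("course_metrics.db")
con.executescript("""
CREATE TABLE IF NOT EXISTS registrant (
    registrant_id  INTEGER PRIMARY KEY,
    state          TEXT,
    nation         TEXT                -- can be added later via ALTER TABLE
);
CREATE TABLE IF NOT EXISTS module_event (
    registrant_id  INTEGER REFERENCES registrant(registrant_id),
    module_number  INTEGER,
    seconds_spent  INTEGER,            -- time in module, recorded transparently
    quiz_attempts  INTEGER,            -- attempts before passing the module quiz
    recorded_at    TEXT DEFAULT CURRENT_TIMESTAMP
);
""")
con.commit()
```

Keeping automatically captured events in their own table, separate from registrant attributes, makes it straightforward to add a demographic field such as nation later without restructuring the course itself.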
CONCLUSION
Our experience as a preparedness training center has substantially confirmed that contemporary technology can be employed to improve and increase the reach of our training efforts. Audience response technology does appear to engage diverse audiences, as intended; its positive effect on evaluation has been a welcome surprise. Satellite broadcasts have an unexpectedly large, sustained audience in the form of videostreams and podcasts. Were we granted one do-over, we would opt to improve our data collection to better enumerate and evaluate our highly successful interactive Web courses.
Our primary mission has been to provide preparedness-related training to the NYS workforce. Quite unintentionally, the intensive use of distance-based educational modalities has afforded UA-CPHP a substantial national and international audience. Upon reflection, we note that a generation ago, a small training staff in Albany, New York, could not have imagined reaching a public health worker in Guam or Zambia or even the rural mountains of Montana. But today, we can and we do this daily—without ever intending to do so.
Acknowledgments
Many of the training programs described were developed by the University at Albany Center for Public Health Preparedness (UA-CPHP) under the leadership of former principal investigators, directors, and assistant/associate directors, including Guthrie S. Birkhead, Carol D. Young, Peter J. Levin, Robert G. Westphal, Cheryl Reeves, and Eric Gebbie. The authors thank them for establishing the strong foundation on which to build.
The authors also thank Centers for Disease Control and Prevention (CDC) Project Officer Gregg Leeman for his enthusiastic and insightful support of UA-CPHP activities and for his public comments on the geographic reach of the UA-CPHP programming, which suggested the title of this article.
The authors also thank Rebecca Stanley and her colleagues at the Instructional Technologies Unit in the University at Albany Rockefeller Institute's Professional Development Program for their contributions to the success of UA-CPHP's Web-based courses.
Footnotes
This project was supported under cooperative agreement #5 U90 TP224249 from CDC. The contents of this article do not necessarily represent the official views of CDC. Several UA-CPHP broadcasts and online courses were developed in conjunction with NYCEPCE under the Office of the Assistant Secretary for Preparedness and Response Bioterrorism Training and Curriculum Development Program (T01HP01411) and the Western New York Public Health Alliance Rural Advanced Practice Center, supported by NACCHO in collaboration with CDC.
Collaborating institutional partners on some of these projects include the New York Consortium for Emergency Preparedness Continuing Education (NYCEPCE) based at the Columbia University Center for Health Policy Studies; the South Carolina and Upper Midwest CPHPs; and the National Association of County and City Health Officials (NACCHO) and its Advanced Practice Center at the Western New York Public Health Alliance.
REFERENCES
1. Institute of Medicine. The future of the public's health in the 21st century. Washington: National Academies Press; 2002.
2. Institute for Teaching, Learning and Academic Leadership, University at Albany, State University of New York. Teaching & learning resources: clickers. 2010 [cited 2010 Jun 29]. Available from: URL: http://www.albany.edu/teachingandlearning/tlr/teaching_resources/clickers.shtml#5
3. Caldwell JE. Clickers in the large classroom: current research and best-practice tips. CBE Life Sci Educ. 2007;6:9-20. doi: 10.1187/cbe.06-12-0205.
4. Draper S, Cargill J, Cutts Q. Electronically enhanced classroom interaction. Australian J Educ Technology. 2002;18:13-23.
5. Copeland HL, Hewson MG, Stoller JK, Longworth DL. Making the continuing medical education lecture effective. J Contin Educ Health Prof. 1998;18:227-34.
6. Miller RG, Ashar BH, Getz KJ. Evaluation of an audience response system for the continuing education of health professionals. J Contin Educ Health Prof. 2003;23:109-15. doi: 10.1002/chp.1340230208.
7. Gamito EJ, Burhansstipanov L, Krebs LU, Bemis L, Bradley A. The use of an electronic audience response system for data collection. J Cancer Educ. 2005;20(1 Suppl):80-6. doi: 10.1207/s15430154jce2001s_16.
8. Schackow TE, Chavez M, Loya L, Friedman M. Audience response system: effect on learning in family medicine residents. Fam Med. 2004;36:496-504.
9. Peddecord KM, Holsclaw P, Jacobson IG, Kwizera L, Rose K, Gersberg R, et al. Nationwide satellite training for public health professionals: Web-based follow-up. J Contin Educ Health Prof. 2007;27:111-7. doi: 10.1002/chp.109.
10. Donavant BW. The new, modern practice of adult education: online instruction in a continuing professional education setting. Adult Educ Q. 2009;59:227-45.
11. Whitten P, Ford DJ, Davis N, Speicher R, Collins B. Comparison of face-to-face versus interactive video continuing medical education delivery modalities. J Contin Educ Health Prof. 1998;18:93-9.
12. Zusevics KL, Gilmore GD, Jecklin RA, Swain GR. Online education: the needs, interests, and capacities of Wisconsin public health professionals. MERLOT J Online Learning and Teaching. 2009;5:531-45.
13. Fordis M, King JE, Ballantyne CM, Jones PH, Schneider KH, Spann SJ, et al. Comparison of the instructional efficacy of Internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA. 2005;294:1043-51. doi: 10.1001/jama.294.9.1043.