Abstract
Background
The worldwide market for continuing medical education (CME) was severely affected by the COVID-19 pandemic, which precipitated an increase in web-based CME course attendance. Virtual education methods may be effective for engaging learners and changing behaviors. However, more information is needed about physician preferences for in-person vs. livestreamed CME courses in the postpandemic era. Because of the paucity of data regarding this topic, the current study was designed to evaluate CME participant characteristics, preferences, engagement, and satisfaction with traditional in-person vs. virtual educational methods.
Methods
A cross-sectional study was performed of attendees of two large internal medicine CME courses held in 2021. Both CME courses were offered via in-person and livestream options, and were taught by Mayo Clinic content experts. Participants, who consisted of practicing physicians seeking CME, completed a 41-question survey after CME course completion. Statistical comparisons were performed by using Fisher exact tests for all survey items, except for those with ordinal response sets, which were compared with Cochran-Armitage trend tests.
Results
A total of 146 participants completed the survey (response rate, 30.2%). Among the 77 respondents who attended in-person courses, the most frequent reasons indicated were the opportunity to travel (66%) and collaboration/networking with others (25%). Among the 68 respondents who attended the livestream courses, the most frequent reasons indicated included COVID-19–related concerns (65%), convenience (46%), and travel costs (34%). The percentage of respondents who indicated that they would choose the same mode of attendance if given the option again was higher for those who attended in person than for those who attended via livestream (91% vs. 65%, P < .001).
Conclusions
These data suggest that in-person course offerings will continue to be a preferred learning method for some physicians. However, most respondents who attended virtually preferred that method. Therefore, hybrid CME models offering both in-person and virtual options may be most beneficial for meeting the needs of all CME learners.
Supplementary Information
The online version contains supplementary material available at 10.1186/s12909-024-06046-1.
Keywords: Continuing medical education, Hybrid learning, In-person learning, Livestream learning
Introduction
Physicians and other health care professionals use continuing medical education (CME) courses to stay up to date in their medical knowledge, skills, and professional performance [1]. CME is most effective when it is interactive, involves multiple content exposures, and focuses on topics that are considered important and timely [1]. When properly delivered, CME may improve clinician attitude, increase knowledge and skills, and foster behavioral changes, which may, in turn, improve patient outcomes [2].
Traditionally, in the United States, most CME courses have consisted of lectures, presentations, and workshops in large didactic settings [2]. However, the worldwide CME market was disrupted in early 2020 by the COVID-19 pandemic, which required fundamental changes to graduate and postgraduate education, as well as CME [3–5]. Because of physical distancing requirements and limitations placed on travel, CME course providers (both academic and private organizations) were compelled to transition from live, in-person CME events to internet-based events [6, 7]. Consequently, traditional didactic methods of education have become less feasible, and interest in distance-learning methods, particularly virtual methods, has increased [8, 9].
Virtual educational methods, both synchronous and asynchronous, were characterized as safe and effective long before the COVID-19 pandemic began. Indeed, internet-based learning strategies have been shown to be both practical and well tolerated by a broad spectrum of medical learners, including physicians seeking CME credit [10–16]. Moreover, in an increasingly digitally connected world, medical learners have shifted expectations about medical learning options [17–20].
Before the COVID-19 pandemic, Mayo Clinic had observed an increase in popularity and availability of livestreamed CME events. However, after the pandemic began, all institutions had to either cancel CME offerings entirely or quickly adapt and learn how to deliver a livestreaming option. These institutions also had to market their livestream CME courses, encourage end-user participation and engagement, and train faculty members on how to best use the digital platform selected for livestreaming. This brought about new challenges, and the education field has had to reinvent itself with blended instructional formats for its course offerings [21–27]. Most studies characterizing the results of internet-based learning initiatives have included younger learners in academic settings, which may not reflect the needs and preferences of CME learners [21].
A recent three-year study of Mayo Clinic CME participants reported no difference in learner engagement between in-person and livestreamed CME courses [15]. However, little is known about participant preferences for in-person vs. livestreamed CME courses in the postpandemic era, including the potential effect of each format on participant satisfaction with the overall course curricula. Because of the paucity of data on this topic, we sought to evaluate participant characteristics, preferences, engagement, and satisfaction with traditional in-person vs. livestream options for CME courses. Our findings may help CME providers optimize current and future CME offerings to meet the needs of participants in a dynamic marketplace.
Methods
Study overview
This study was reviewed by the Mayo Clinic Institutional Review Board (IRB) and determined to be exempt under 45 CFR 46.101(b), item 2. During the course of the study, all major changes to the study design and procedures were filed with and reviewed by the IRB, which maintained the exempt status of the study. We conducted a cross-sectional survey of attendees of two CME courses provided by Mayo Clinic: A Systematic Approach to Medically Unexplained Symptoms (MUS) 2021 and Updates in Internal Medicine (UIM) 2021. Although these courses were not marketed to an international audience, a number of international attendees registered for both the livestream and in-person offerings. We conducted this study in accordance with the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guidelines for cross-sectional observational studies.
Setting
The MUS 2021 CME course took place from August 11, 2021, through August 14, 2021, in Marina del Rey, California, and the UIM 2021 course took place from October 21, 2021, through October 23, 2021, in Lake Buena Vista, Florida. MUS 2021 comprised 27 separate sessions taught by expert physicians and health care practitioners. UIM 2021 comprised 20 separate sessions taught by expert physicians and health care practitioners, with two additional optional precourse sessions. Both courses were taught in a large didactic format for in-person attendees, with learning objectives clearly listed at the beginning of each session. Livestreaming for both courses occurred via Zoom, and recordings were made available to course participants through an online course registration platform. Both in-person and livestream participants were able to ask questions, and in-person courses were held in compliance with the social distancing requirements in effect at the time, including mandatory masking. Courses were sponsored by the Mayo Clinic Division of General Internal Medicine and cost between $850 and $995 to attend. Depending on the course, participants were eligible to receive up to 22 hours of CME credit through the American Academy of Family Physicians (AAFP), the American Board of Internal Medicine (ABIM), or the American Medical Association (AMA). Full details for each course remain available on the course websites [28, 29].
Participants
Inclusion criteria were the ability to read and write English, attendance at one of the two CME courses, and voluntary completion of the electronic survey after reading the introduction letter, which served as passive consent for participation. Surveys were emailed to all eligible course attendees within 4 days after the course concluded. All course registrants were eligible to receive surveys.
Survey development and data collection
The survey used in this report was developed and is copyrighted by Mayo Clinic (Supplemental Material). Our institutional Research Electronic Data Capture (REDCap) tool [30, 31] was used to create and distribute the survey. The survey covered demographic information (e.g., age, sex, and length of time in medical practice) and educational preferences (e.g., in-person vs. livestream, time of day, and duration of conferences). Several questions used branching logic, and most had Likert scale response options, such as strongly agree, agree, neutral, disagree, and strongly disagree. The five overarching components of the survey were as follows: (1) current mode of participation and preferences (14 questions), (2) mode of participation in past CME courses (four questions with three parts each; 12 questions in total), (3) digital media comfort (five questions), (4) background and practice (five questions), and (5) demographics (five questions). Because most physician learners maintain busy practices, efforts were made to keep the survey instrument brief. The resulting survey contained 41 questions and required approximately five minutes to complete.
During the daily course announcements, participants were informed of our intent to send the survey after course attendance. Attendees were informed that completion of the survey instrument was voluntary and anonymous. Surveys were distributed via a REDCap-generated email, which provided the following information: study aims, the voluntary nature of the study, contact information for questions or complaints, and a disclaimer stating that participation or nonparticipation in the survey would not affect participant health care or employment at their institution. Attendees who chose to participate accessed the survey through a hyperlink included in the email. Nonresponders received up to three email reminders before correspondence regarding the survey ceased. Survey respondents received no incentives for participating in the study.
Data analysis
All survey data were collected, compiled, and summarized with the REDCap tool. Responses were summarized as frequency counts and percentages and were compared between those who attended in-person sessions vs. those who attended livestreamed sessions. Statistical comparisons were performed with Fisher exact tests for all survey items except for those with ordinal response sets, which were compared with Cochran-Armitage trend tests. Continuous variables were summarized as median (IQR) and compared with Wilcoxon rank sum tests. In all cases, two-tailed P values less than 0.05 were considered statistically significant. Statistical analyses were performed with SAS software, v9.4 (SAS Institute Inc).
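The analyses were performed in SAS, and the underlying code is not published. For orientation only, the sketch below reproduces two of the reported comparisons in Python, using counts copied from Tables 1 and 4; the Cochran-Armitage implementation assumes equally spaced category scores, which the paper does not specify.

```python
# Illustrative reanalysis sketch; the study itself used SAS 9.4, and this
# Python version is ours, not the authors'. Counts are from Tables 1 and 4.
import numpy as np
from scipy.stats import fisher_exact, norm

# Fisher exact test on a 2 x 2 table: sex (female/male) by attendance mode.
sex_table = np.array([[38, 48],   # female: in-person, livestream
                      [35, 19]])  # male:   in-person, livestream
_, p_sex = fisher_exact(sex_table)
print(f"Sex by attendance mode: P = {p_sex:.3f}")  # expect ~.02 (Table 1)

def cochran_armitage(counts_a, counts_b, scores=None):
    """Two-sided Cochran-Armitage trend test for a 2 x k table.

    counts_a, counts_b: per-category counts for the two groups, categories
    in their natural order. scores: category scores; defaults to equally
    spaced 0..k-1 (an assumption, not stated in the paper).
    """
    a = np.asarray(counts_a, dtype=float)
    b = np.asarray(counts_b, dtype=float)
    col = a + b                          # category (column) totals
    n, ra = col.sum(), a.sum()           # grand total, group A total
    s = np.arange(len(col)) if scores is None else np.asarray(scores, float)
    t = (s * a).sum()                    # observed score sum in group A
    mean = ra * (s * col).sum() / n      # expectation under independence
    # Exact conditional (hypergeometric) variance of t given the margins.
    var = ra * (n - ra) / (n - 1) * ((s ** 2 * col).sum() / n
                                     - ((s * col).sum() / n) ** 2)
    z = (t - mean) / np.sqrt(var)
    return 2 * norm.sf(abs(z))           # two-sided P value

# Ordinal item: importance of having lectures available after the course
# (not at all ... extremely), in-person vs. livestream attendees.
p_trend = cochran_armitage([4, 1, 13, 21, 38], [0, 2, 7, 17, 42])
print(f"Availability importance trend: P = {p_trend:.3f}")  # ~.05 (Table 4)
```

With equally spaced scores, this sketch lands close to the P values reported in the tables; other score choices would shift the trend test slightly.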
Results
A total of 483 attendees received surveys, of whom 299 attended MUS 2021 and 184 attended UIM 2021. Of those who received surveys, 146 (30.2%) responded: 104 from MUS 2021 (34.8%) and 42 from UIM 2021 (22.8%). Of the respondents, 77 indicated that they attended the course in person, 68 indicated that they attended via livestream, and one indicated attending by viewing the course recording online. Data for the respondent who viewed the online recording were excluded, leaving 145 survey respondents in the analysis. In addition, some participants did not respond to every survey item, so the number of participants for each response item may be less than the overall number of surveys completed.
Respondent demographics and practice characteristics according to mode of attendance are summarized in Table 1. Most respondent characteristics were similar between attendance groups. However, a significantly higher proportion of livestream attendees than in-person attendees were female (72% vs. 52%, P = .02), and significantly more in-person attendees than livestream attendees indicated a specialty other than internal medicine or family medicine (23% vs. 4%, P = .01).
Table 1.
Characteristics of participants attending continuing medical education courses
Data are No. (%) of respondents.

| Characteristic | In-person course (n = 77) | Livestreamed course (n = 68) | P^a |
|---|---|---|---|
| Age, y | (n = 73) | (n = 67) | 0.84 |
|   20–30 | 3 (4) | 2 (3) | |
|   31–40 | 15 (21) | 12 (18) | |
|   41–50 | 20 (27) | 18 (27) | |
|   51–60 | 20 (27) | 25 (37) | |
|   61–70 | 13 (18) | 8 (12) | |
|   ≥71 | 2 (3) | 2 (3) | |
| Sex | (n = 73) | (n = 67) | 0.02 |
|   Female | 38 (52) | 48 (72) | |
|   Male | 35 (48) | 19 (28) | |
|   Transgender | 0 (0) | 0 (0) | |
|   I do not wish to identify | 0 (0) | 0 (0) | |
|   I do not wish to answer | 0 (0) | 0 (0) | |
| Ethnicity | (n = 72) | (n = 66) | 0.77 |
|   Hispanic or Latino | 7 (10) | 5 (8) | |
|   Not Hispanic or Latino | 65 (90) | 61 (92) | |
| Race | (n = 71) | (n = 66) | 0.41 |
|   Asian | 14 (20) | 10 (15) | |
|   Black/African American | 6 (8) | 4 (6) | |
|   White | 49 (69) | 46 (70) | |
|   Other | 2 (3) | 6 (9) | |
| Practice | (n = 73) | (n = 67) | 0.66 |
|   Academic | 22 (30) | 15 (22) | |
|   HMO | 5 (7) | 2 (3) | |
|   Industry | 1 (1) | 2 (3) | |
|   Single-specialty group | 17 (23) | 20 (30) | |
|   Multispecialty group | 19 (26) | 20 (30) | |
|   Solo/Self-employed | 4 (5) | 1 (1) | |
|   Government or military | 1 (1) | 2 (3) | |
|   Other | 4 (5) | 5 (7) | |
| Specialty | (n = 73) | (n = 67) | 0.01 |
|   Internal medicine | 27 (37) | 31 (46) | |
|   Family medicine | 29 (40) | 33 (49) | |
|   Medical specialty | 14 (19) | 3 (4) | |
|   Nonmedical specialty | 3 (4) | 0 (0) | |
| Practice location | (n = 74) | (n = 66) | 0.35 |
|   Same geographic region as course | 18 (24) | 21 (32) | |
|   Different geographic region than course | 56 (76) | 45 (68) | |
| Course attended | (n = 77) | (n = 68) | > 0.99 |
|   MUS 2021 | 55 (71) | 48 (71) | |
|   UIM 2021 | 22 (29) | 20 (29) | |
| Reason for attending | (n = 71) | (n = 67) | 0.66 |
|   Recertification | 12 (17) | 14 (21) | |
|   General knowledge | 59 (83) | 53 (79) | |

Abbreviations: HMO: health maintenance organization; MUS: A Systematic Approach to Medically Unexplained Symptoms; UIM: Updates in Internal Medicine

^a P values determined with Fisher exact tests
The reasons indicated for selecting the given mode of attendance are summarized in Table 2. Among the 77 respondents who chose to attend in person, the most frequent reasons indicated were the opportunity to travel (66%), collaboration/networking with others (25%), convenience (19%), flexibility in time (18%), and the speakers (17%). Among the 68 respondents who attended via livestream, the most frequent reasons indicated included COVID-19–related concerns (65%), convenience (46%), travel costs (34%), flexibility in time (32%), and clinical responsibility conflicts (15%).
Table 2.
Reasons for attending course in the chosen mode
Data are No. (%) of respondents.^a

| Reason | In-person course (n = 77) | Livestreamed course (n = 68) |
|---|---|---|
| COVID-19–related concerns | 0 (0) | 44 (65) |
| Travel costs | 1 (1) | 23 (34) |
| Collaboration/networking with others | 19 (25) | 0 (0) |
| Employer restrictions | 0 (0) | 6 (9) |
| Clinical responsibility conflicts | 0 (0) | 10 (15) |
| Accrediting body | 6 (8) | 1 (1) |
| Registration cost | 2 (3) | 0 (0) |
| Flexibility in time | 14 (18) | 22 (32) |
| Convenience | 15 (19) | 31 (46) |
| Speaker(s) | 13 (17) | 1 (1) |
| Travel opportunity | 51 (66) | 0 (0) |
| Other | 8 (10) | 8 (12) |

^a Respondents were instructed to select all responses that apply
Responses to questions regarding satisfaction with the course are summarized in Table 3. Most surveyed attendees (96% in-person and 93% livestream) were aware that different attendance options were provided. The percentage of respondents who indicated that they would choose the same mode of attendance if given the option again was higher for those who attended in person than for those who attended via livestream (91% vs. 65%, P < .001). Nonetheless, the majority of both attendance groups (94% in-person and 91% livestream) indicated that they would recommend attending the course via the same mode chosen, and nearly all respondents indicated that their experience was the same or better than they expected.
Table 3.
Participant responses regarding satisfaction with course attended
Data are No. (%) of respondents.

| Survey question/response | In-person course (n = 77) | Livestreamed course (n = 68) | P^a |
|---|---|---|---|
| Were you aware of different options? | (n = 77) | (n = 67) | 0.47 |
|   No | 3 (4) | 5 (7) | |
|   Yes | 74 (96) | 62 (93) | |
| If you could do it over, would you choose the same mode? | (n = 75) | (n = 68) | < 0.001 |
|   No | 1 (1) | 11 (16) | |
|   Yes | 68 (91) | 44 (65) | |
|   Unsure | 6 (8) | 13 (19) | |
| Which mode would you have chosen? | (n = 1) | (n = 11) | |
|   Live, in-person | 0 (0) | 10 (91) | |
|   Online/internet-based, prerecorded | 1 (100) | 1 (9) | |
| Would you recommend this course using this mode? | (n = 77) | (n = 68) | 0.79 |
|   No | 1 (1) | 2 (3) | |
|   Yes | 72 (94) | 62 (91) | |
|   Unsure | 4 (5) | 4 (6) | |
| How was your experience? | (n = 77) | (n = 67) | 0.83 |
|   Worse than I expected | 2 (3) | 3 (4) | |
|   The same as I expected | 33 (43) | 28 (42) | |
|   Better than I expected | 42 (55) | 36 (54) | |

^a P values determined with Fisher exact tests
Additional survey responses regarding the availability of course content after course completion and computer/technical problems during the course are summarized in Table 4. Most respondents indicated that it was very important or extremely important for recordings of the lectures to be made available after the course; this rating was more common among livestream attendees (87%) than in-person attendees (76%), although the difference did not reach statistical significance (P = .05). The percentage of respondents who preferred that PDF files or downloadable slide decks be made available was similar between attendance groups (P = .85); however, the percentage who preferred that recordings be made available was significantly higher for livestream attendees than for in-person attendees (76% vs. 57%, P = .02).
Table 4.
Respondent preferences for course material availability and technical support^a

| Survey question/response | In-person course (n = 77) | Livestreamed course (n = 68) | P |
|---|---|---|---|
| How important is it to you to have the lectures made available after the course? | (n = 77) | (n = 68) | .05^b |
|   Not at all | 4 (5) | 0 (0) | |
|   Slightly | 1 (1) | 2 (3) | |
|   Moderately | 13 (17) | 7 (10) | |
|   Very | 21 (27) | 17 (25) | |
|   Extremely | 38 (49) | 42 (62) | |
| What formats would you prefer?^c | | | |
|   PDFs/downloadable slide decks | 57 (74) | 52 (76) | .85^d |
|   Recordings | 44 (57) | 52 (76) | .02^d |
|   Other | 1 (1) | 1 (1) | > .99^d |
| Was it clear where to go for computer/technical problems? | (n = 72) | (n = 67) | .83^d |
|   No | 12 (17) | 13 (19) | |
|   Yes | 60 (83) | 54 (81) | |
| Computer/technical problems presented a significant challenge | (n = 53) | (n = 61) | .23^b |
|   Strongly disagree | 25 (47) | 30 (49) | |
|   Disagree | 18 (34) | 27 (44) | |
|   Agree | 9 (17) | 4 (7) | |
|   Strongly agree | 1 (2) | 0 (0) | |
| Digital media comfort score^e | 5 (3–5) | 5 (4–5) | .59^f |

^a Digital media comfort score summarized as median (IQR); all other data summarized as No. (%) of respondents
^b P values determined with Cochran-Armitage trend tests
^c Respondents were instructed to select all responses that apply
^d P values determined with Fisher exact tests
^e The survey asked respondents whether they knew how to perform the following five tasks: downloading files, saving images found online, using shortcut keys, opening a new tab in a browser, and bookmarking a website. For each respondent, the digital media comfort score was calculated as the number of tasks rated as true or mostly true
^f P value determined with Wilcoxon rank sum test
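Footnote e above defines the digital media comfort score as a simple count. A minimal sketch of that calculation (field names and response coding are invented for illustration) could be:

```python
# Hypothetical illustration of the digital media comfort score (footnote e):
# the count of the five tasks a respondent rates as true or mostly true,
# giving a score from 0 to 5. Task keys are invented for this sketch.
TASKS = ("download_files", "save_online_images", "use_shortcut_keys",
         "open_new_browser_tab", "bookmark_website")

def comfort_score(responses: dict) -> int:
    """Count tasks rated 'true' or 'mostly true' for one respondent."""
    return sum(responses.get(task) in ("true", "mostly true") for task in TASKS)

# Example: comfortable with every task except using shortcut keys -> score 4.
answers = {task: "true" for task in TASKS}
answers["use_shortcut_keys"] = "false"
print(comfort_score(answers))  # 4
```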
Table 5 summarizes the responses to a series of questions asking how often, and by what mode, respondents attended CME courses in the past 3 years. Respondents who attended the CME course in person had attended significantly more in-person courses during the prior 3 years than those who attended via livestream (P = .009). Those who attended in person also preferred to attend more courses in person (P = .002) and reported a better experience with in-person courses (P < .001) than those who attended via livestream.
Table 5.
Past course experience and preferences for future courses
Data are No. (%) of respondents.

| Survey question/response | In-person course (n = 77) | Livestreamed course (n = 68) | P^a |
|---|---|---|---|
| During the past 3 years, how often did you obtain CME credits from an online (prerecorded) distance-learning course? | (n = 73) | (n = 62) | 0.46 |
|   Never | 16 (22) | 11 (18) | |
|   Less than 1 time per year | 19 (26) | 17 (27) | |
|   1 or 2 times per year | 22 (30) | 13 (21) | |
|   Several times per year | 12 (16) | 18 (29) | |
|   At least monthly | 1 (1) | 2 (3) | |
|   At least weekly | 3 (4) | 1 (2) | |
| During the past 3 years, how often did you obtain CME credits from an online (live) distance-learning course? | (n = 74) | (n = 62) | 0.20 |
|   Never | 15 (20) | 9 (15) | |
|   Less than 1 time per year | 15 (20) | 15 (24) | |
|   1 or 2 times per year | 32 (43) | 21 (34) | |
|   Several times per year | 11 (15) | 14 (23) | |
|   At least monthly | 1 (1) | 2 (3) | |
|   At least weekly | 0 (0) | 1 (2) | |
| During the past 3 years, how often did you obtain CME credits from a live onsite course? | (n = 74) | (n = 63) | 0.009 |
|   Never | 3 (4) | 10 (16) | |
|   Less than 1 time per year | 12 (16) | 11 (17) | |
|   1 or 2 times per year | 36 (49) | 31 (49) | |
|   Several times per year | 22 (30) | 11 (17) | |
|   At least monthly | 1 (1) | 0 (0) | |
| Ideally, how often would you want to obtain CME credits from an online (prerecorded) distance-learning course? | (n = 71) | (n = 63) | 0.02 |
|   Never | 22 (31) | 4 (6) | |
|   Less than 1 time per year | 15 (21) | 16 (25) | |
|   1 or 2 times per year | 17 (24) | 22 (35) | |
|   Several times per year | 12 (17) | 17 (27) | |
|   At least monthly | 1 (1) | 1 (2) | |
|   At least weekly | 4 (6) | 3 (5) | |
| Ideally, how often would you want to obtain CME credits from an online (live) distance-learning course? | (n = 71) | (n = 63) | 0.006 |
|   Never | 17 (24) | 1 (2) | |
|   Less than 1 time per year | 20 (28) | 13 (21) | |
|   1 or 2 times per year | 19 (27) | 31 (49) | |
|   Several times per year | 11 (15) | 17 (27) | |
|   At least monthly | 1 (1) | 0 (0) | |
|   At least weekly | 3 (4) | 1 (2) | |
| Ideally, how often would you want to obtain CME credits from a live onsite course? | (n = 75) | (n = 65) | 0.002 |
|   Never | 1 (1) | 2 (3) | |
|   Less than 1 time per year | 3 (4) | 9 (14) | |
|   1 or 2 times per year | 33 (44) | 36 (55) | |
|   Several times per year | 32 (43) | 17 (26) | |
|   At least monthly | 2 (3) | 0 (0) | |
|   At least weekly | 4 (5) | 1 (2) | |
| How would you rate your experience with online (prerecorded) distance-learning courses in the past 3 years? | (n = 65) | (n = 55) | 0.002 |
|   Never again – I will avoid this … | 11 (17) | 2 (4) | |
|   Average – I could take it or leave it. | 24 (37) | 18 (33) | |
|   Good and bad, I would do this again … | 27 (42) | 23 (42) | |
|   Best – I would do this every time … | 3 (5) | 12 (22) | |
| How would you rate your experience with online (live) distance-learning courses in the past 3 years? | (n = 68) | (n = 61) | 0.008 |
|   Never again – I will avoid this … | 2 (3) | 4 (7) | |
|   Average – I could take it or leave it. | 25 (37) | 5 (8) | |
|   Good and bad, I would do this again … | 36 (53) | 40 (66) | |
|   Best – I would do this every time … | 5 (7) | 12 (20) | |
| How would you rate your experience with live onsite courses in the last 3 years? | (n = 70) | (n = 59) | < 0.001 |
|   Never again – I will avoid this … | 0 (0) | 3 (5) | |
|   Average – I could take it or leave it. | 1 (1) | 3 (5) | |
|   Good and bad, I would do this again … | 12 (17) | 25 (42) | |
|   Best – I would do this every time … | 57 (81) | 28 (47) | |

Abbreviation: CME: continuing medical education

^a P values determined with Cochran-Armitage trend tests
Discussion
Summary of main results
Satisfaction levels were generally high among all survey respondents who attended the Mayo Clinic MUS and/or UIM CME courses in 2021, regardless of the mode of attendance (in-person or livestream). However, those who livestreamed the course were less likely to indicate that they would choose the same mode of CME course delivery for future conferences. Compared with learner populations in most published studies of internet-based education, our CME course attendees were older, more likely to be out of training and in active clinical practice, and less likely to be affiliated with an academic medical center. Our survey data suggest that in-person course offerings will continue to be a preferred learning method among more experienced physicians in primary specialties. Respondents who attended via livestream were more likely to be practicing in general internal medicine or family medicine. Together, our findings indicate that hybrid CME models incorporating both livestream and in-person options are, when feasible, the best fit for the full range of participants.
Result significance and further research needs
Although hybrid CME models may be ideal, they can be limited logistically by the need for additional resources, including support staff and physical space. Participants who attended CME conferences in person were motivated primarily by the opportunity to travel to the course location and by the chance to collaborate and network with others. We did not observe a statistically significant difference in age, comfort with digital media use, or previous experience with online CME courses between the attendance groups. Thus, pandemic fatigue and the need for social connection are possible, if not likely, drivers of the desire to collaborate and network [32–35]. This may suggest a dual role for in-person CME courses: improving medical knowledge and providing interpersonal contact in the setting of social isolation protocols. Therefore, in-person CME courses will most likely be successful when they promote active interaction between participants. Further studies are needed, but this desire for networking among CME course attendees suggests that live presentations are more likely to be well received than prerecorded presentations without an interactive component, regardless of attendance mode.
Our data suggest that participants who livestream CME conferences are more likely to prioritize convenience and flexibility, to be female, and to be concerned about travel costs and the risk of SARS-CoV-2 infection. These participants are also more likely to prioritize having materials and presentation recordings made available after the course, which suggests an increased desire for asynchronous learning. Therefore, best practices would include making recordings and slide decks available after course completion to accommodate livestream learners. The higher proportion of female respondents among livestream attendees may reflect gender-based differences in attitudes about COVID-19 [36, 37]. Increased concern regarding travel costs may be attributed to the financial effect of the COVID-19 pandemic on the health care system and to declining CME benefits, reflecting decreases in the time and income physicians have available to pursue in-person CME education [38–41]. Notably, a number of livestream participants suggested that in-person learning is ideal, despite attending via livestream. Consistent with prior surveys, this finding suggests that although livestream education can yield similar engagement levels among learners, it does not fully replace the in-person CME course experience [15, 35]. Therefore, active learning and engagement strategies tailored to the livestream experience will most likely be well received, although further studies are needed [33].
Limitations
Both of our CME courses occurred in August and October 2021, when COVID-19 restrictions were rapidly evolving; thus, our findings reflect a snapshot of physician attitudes and preferences during that time. Moreover, both courses took place in popular vacation destinations, so the destination effect of a travel opportunity may have influenced physician learning style preference. Our study represents a learner population that is older and more likely to practice outside of academic medical centers, which may limit its generalizability to younger, more academic learner populations. In addition, because the overall response rate for the survey was 30.2%, we cannot exclude the potential effect of nonresponse bias on our findings; however, our response rate exceeds previously published response rates for similar surveys [42, 43]. Moreover, our satisfaction survey was written to assess satisfaction with the method of course delivery and may not reflect satisfaction with course quality; perceived course quality may, in turn, have influenced reported satisfaction with delivery. Finally, several of the course speakers presented virtually, even for the in-person CME offerings, which may have affected overall satisfaction. Nevertheless, satisfaction was notably high for all respondents.
Overall, our study had a robust sample size, and our findings may be generalizable to other CME courses. To our knowledge, our study is the first to compare learner preferences for livestream vs. in-person CME learning in the postpandemic era. Therefore, our findings may have implications for future CME course design, particularly for older learners who practice in primarily nonacademic settings.
Conclusion
Hybrid learning opportunities may meet the needs of diverse groups of CME learners: those who value in-person collaboration and networking as well as those who value flexibility. Best practices should include making all CME course materials available online for asynchronous access after the course. For in-person conferences, opportunities for networking and collaboration should be provided. Because in-person CME learners prefer in-person conferences for networking, best practices should encourage all speakers to present onsite rather than via virtual, prerecorded presentations. As the postpandemic landscape evolves, in-person conference safety assessments will change, and further understanding of learner preferences and attitudes will be essential for future CME conferences.
Electronic supplementary material
Below is the link to the electronic supplementary material.
Acknowledgements
The study team thanks Ryan T. Hurt, MD, PhD, and Erin M. Pagel, MS, MHI, for their support and patience during this study. The study team also thanks all study participants, without whom this study would not have been possible.
Nisha Badders, PhD, ELS, Mayo Clinic, substantively edited the manuscript. The Scientific Publications staff at Mayo Clinic provided proofreading and administrative and clerical support.
Abbreviations
- CME
continuing medical education
- IRB
Institutional Review Board
- MUS
A Systematic Approach to Medically Unexplained Symptoms
- REDCap
Research Electronic Data Capture
- UIM
Updates in Internal Medicine
Biographies
Michael R. Mueller, MD
is a consultant in the Division of General Internal Medicine, Mayo Clinic, Rochester, Minnesota, and an assistant professor of medicine, Mayo Clinic College of Medicine and Science.
Ivana T. Croghan, PhD
is an associate consultant in the Division of General Internal Medicine, Division of Community Internal Medicine, and Department of Quantitative Health Sciences, Mayo Clinic, Rochester, Minnesota, and a professor of medicine, Mayo Clinic College of Medicine and Science.
Darrell R. Schroeder, MS
is a principal biostatistician in the Division of Clinical Trials and Biostatistics, Mayo Clinic, Rochester, Minnesota, and an assistant professor of biostatistics, Mayo Clinic College of Medicine and Science.
M. Nadir Bhuiyan, MD
is a consultant in the Division of General Internal Medicine, Mayo Clinic, Rochester, Minnesota, and an assistant professor of medicine, Mayo Clinic College of Medicine and Science.
Ravindra Ganesh MBBS, MD
is a consultant in the Division of General Internal Medicine, Mayo Clinic, Rochester, Minnesota, and an associate professor of medicine, Mayo Clinic College of Medicine and Science.
Arya B. Mohabbat, MD
is a consultant in the Division of General Internal Medicine, Mayo Clinic, Rochester, Minnesota, and an assistant professor of medicine, Mayo Clinic College of Medicine and Science.
Sanjeev Nanda, MD
is a consultant in the Division of General Internal Medicine, Mayo Clinic, Rochester, Minnesota, and an assistant professor of medicine, Mayo Clinic College of Medicine and Science.
Elizabeth C. Wight, MD
is a consultant in the Division of General Internal Medicine, Mayo Clinic, Rochester, Minnesota, and an assistant professor of medicine, Mayo Clinic College of Medicine and Science.
Deb L. Blomberg, MBA
is a program manager for continuing medical education in Internal Medicine Administrative Services, Mayo Clinic, Rochester, Minnesota, and an instructor in medicine, Mayo Clinic College of Medicine and Science.
Sara L. Bonnes, MD, MS,
is a consultant in the Division of General Internal Medicine, Mayo Clinic, Rochester, Minnesota, and an associate professor of medicine, Mayo Clinic College of Medicine and Science.
Author contributions
Michael R. Mueller: Conceptualization, Methodology, Validation, Investigation, Writing – Original Draft, Writing – Review and Editing, Visualization. Ivana T. Croghan: Conceptualization, Methodology, Validation, Investigation, Data Curation, Writing – Original Draft, Writing – Review and Editing, Visualization, Supervision. Darrell R. Schroeder: Formal Analysis, Visualization. M. Nadir Bhuiyan: Validation, Resources, Writing – Review and Editing. Ravindra Ganesh: Conceptualization, Writing – Review and Editing, Visualization, Supervision. Arya B. Mohabbat: Validation, Resources, Writing – Review and Editing. Sanjeev Nanda: Validation, Resources, Writing – Review and Editing. Elizabeth C. Wight: Validation, Resources, Writing – Review and Editing. Deb L. Blomberg: Investigation, Resources. Sara L. Bonnes: Conceptualization, Methodology, Validation, Supervision.
Funding
This study was supported, in part, by the Mayo Clinic Department of Medicine, General Internal Medicine Division. The REDCap system is supported, in part, by a Center for Clinical and Translational Science award (UL1 TR000135) from the National Center for Advancing Translational Sciences.
Data availability
The authors confirm that the data supporting the findings of this study are available within the article and its supplementary materials.
Declarations
Ethics approval and consent to participate
In accordance with the Declaration of Helsinki, this study was reviewed and determined to be exempt by the Mayo Clinic Institutional Review Board (IRB) according to 45 CFR 46.101(b), item 2. All modifications to the study design or procedures were submitted to the IRB during the course of the study to determine whether the study remained exempt. The written contact cover letter (which served as passive informed consent information) and the survey were both reviewed and acknowledged by the IRB. Informed consent was obtained from all study participants before study initiation.
Previous presentations
None.
Consent to publish
Not applicable.
Competing interests
M.R.M. has received funding from The France Foundation; R.G. has received funding from The France Foundation and is on the scientific advisory board for Alpaca Health; S.L.B. is on the scientific advisory board for CorMedix; A.B.M. has a grant from Purina.
Footnotes
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1. Cervero RM, Gaines JK. The impact of CME on physician performance and patient health outcomes: an updated synthesis of systematic reviews. J Contin Educ Health Prof. 2015;35(2):131–8.
- 2. Mazmanian PE, Davis DA. Continuing medical education and the physician as a learner: guide to the evidence. JAMA. 2002;288(9):1057–60.
- 3. Simulescu L, Meijer M, Vodusek DB; with the support of the BioMed Alliance CMEEPC. Continuing Medical Education (CME) in time of crisis: how medical societies face challenges and adapt to provide unbiased CME. J Eur CME. 2022;11(1):2035950.
- 4. Liu CH, You-Hsien Lin H. The impact of COVID-19 on medical education: experiences from one medical university in Taiwan. J Formos Med Assoc. 2021;120(9):1782–4.
- 5. Brady AK, Pradhan D. Learning without borders: asynchronous and distance learning in the age of COVID-19 and beyond. ATS Sch. 2020;1(3):233–42.
- 6. Ruiz-Barrera MA, Agudelo-Arrieta M, Aponte-Caballero R, et al. Developing a web-based congress: the 2020 International Web-Based Neurosurgery Congress method. World Neurosurg. 2021;148:e415–24.
- 7. Kanneganti A, Sia CH, Ashokka B, Ooi SBS. Continuing medical education during a pandemic: an academic institution's experience. Postgrad Med J. 2020;96(1137):384–6.
- 8. Ghanem O, Logghe HJ, Tran BV, Huynh D, Jacob B. Closed Facebook groups and CME credit: a new format for continuing medical education. Surg Endosc. 2019;33(2):587–91.
- 9. Madrigal E, Mannan R. pathCast: an interactive medical education curriculum that leverages livestreaming on Facebook and YouTube. Acad Med. 2020;95(5):744–50.
- 10. Wilcha RJ. Effectiveness of virtual medical teaching during the COVID-19 crisis: systematic review. JMIR Med Educ. 2020;6(2):e20963.
- 11. Casebeer L, Engler S, Bennett N, et al. A controlled trial of the effectiveness of internet continuing medical education. BMC Med. 2008;6:37.
- 12. Wang ZY, Zhang LJ, Liu YH, et al. The effectiveness of e-learning in continuing medical education for tuberculosis health workers: a quasi-experiment from China. Infect Dis Poverty. 2021;10(1):72.
- 13. Ramnanan C, Di Lorenzo G, Dong S, Pak V, Visva S. Synchronous vs. asynchronous anatomy content delivery during COVID-19: comparing student perceptions and impact on student performance. FASEB J. 2021;35:S1.
- 14. Praharaj SK, Ameen S. The relevance of telemedicine in continuing medical education. Indian J Psychol Med. 2020;42(5 Suppl):S97–102.
- 15. Stephenson CR, Yudkowsky R, Wittich CM, Cook DA. Learner engagement and teaching effectiveness in livestreamed versus in-person CME. Med Educ. 2023;57(4):349–58.
- 16. Mueller M, Ganesh R, Schroeder D, Beckman TJ. A post-COVID syndrome curriculum for continuing medical education (CME): in-person versus livestream. Front Med. 2024;11.
- 17. Stoehr F, Muller L, Brady A, et al. How COVID-19 kick-started online learning in medical education: the DigiMed study. PLoS ONE. 2021;16(9):e0257394.
- 18. McMahon GT, Skochelak SE. Evolution of continuing medical education: promoting innovation through regulatory alignment. JAMA. 2018;319(6):545–6.
- 19. Fiuzzi M. Outcomes and observations of on-line CME activities during the pandemic. J CME. 2023;12(1):2167286.
- 20. Blomberg D, Stephenson C, Atkinson T, et al. Continuing medical education in the post COVID-19 pandemic era. JMIR Med Educ. 2023;9:e49825.
- 21. He L, Yang N, Xu L, et al. Synchronous distance education vs traditional education for health science students: a systematic review and meta-analysis. Med Educ. 2021;55(3):293–308.
- 22. Dedeilia A, Sotiropoulos MG, Hanrahan JG, Janga D, Dedeilias P, Sideris M. Medical and surgical education challenges and innovations in the COVID-19 era: a systematic review. In Vivo. 2020;34(3 Suppl):1603–11.
- 23. Papapanou M, Routsi E, Tsamakis K, et al. Medical education challenges and innovations during COVID-19 pandemic. Postgrad Med J. 2022;98(1159):321–7.
- 24. Ismail II, Abdelkarim A, Al-Hashel JY. Physicians' attitude towards webinars and online education amid COVID-19 pandemic: when less is more. PLoS ONE. 2021;16(4):e0250241.
- 25. Yadav SK, Para S, Singh G, Gupta R, Sarin N, Singh S. Comparison of asynchronous and synchronous methods of online teaching for students of medical laboratory technology course: a cross-sectional analysis. J Educ Health Promot. 2021;10:232.
- 26. Curran VR, Fleet LJ, Kirby F. A comparative evaluation of the effect of internet-based CME delivery format on satisfaction, knowledge and confidence. BMC Med Educ. 2010;10:10.
- 27. Rivera R, Smart J, Sakaria S, et al. Planning engaging, remote, synchronous didactics in the COVID-19 pandemic era. JMIR Med Educ. 2021;7(2):e25213.
- 28. A Systematic Approach to Medically Unexplained Symptoms 2021. https://ce.mayo.edu/internal-medicine/content/systematic-approach-medically-unexplained-symptoms-2021. Accessed September 6, 2024.
- 29. Updates in Internal Medicine 2021. https://ce.mayo.edu/internal-medicine/content/updates-internal-medicine-2021
- 30. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap): a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42(2):377–81.
- 31. Harris PA, Taylor R, Minor BL, et al. The REDCap consortium: building an international community of software platform partners. J Biomed Inform. 2019;95:103208.
- 32. Samara O, Monzon A. Zoom burnout amidst a pandemic: perspective from a medical student and learner. Ther Adv Infect Dis. 2021;8:20499361211026717.
- 33. de Sobral OK, Lima JB, Lima Rocha DLF. Active methodologies association with online learning fatigue among medical students. BMC Med Educ. 2022;22(1):74.
- 34. Elbogen EB, Lanier M, Griffin SC, et al. A national study of Zoom fatigue and mental health during the COVID-19 pandemic: implications for future remote work. Cyberpsychol Behav Soc Netw. 2022;25(7):409–15.
- 35. Schulte TL, Groning T, Ramsauer B, et al. Impact of COVID-19 on continuing medical education: results of an online survey among users of a non-profit multi-specialty live online education platform. Front Med (Lausanne). 2021;8:773806.
- 36. Galasso V, Pons V, Profeta P, Becher M, Brouard S, Foucault M. Gender differences in COVID-19 attitudes and behavior: panel evidence from eight countries. Proc Natl Acad Sci U S A. 2020;117(44):27285–91.
- 37. Laufer A, Shechory Bitton M. Gender differences in the reaction to COVID-19. Women Health. 2021;61(8):800–10.
- 38. Fried JE, Liebers DT, Roberts ET. Sustaining rural hospitals after COVID-19: the case for global budgets. JAMA. 2020;324(2):137–8.
- 39. Glied S, Levy H. The potential effects of coronavirus on national health expenditures. JAMA. 2020;323(20):2001–2.
- 40. Khullar D, Bond AM, Schpero WL. COVID-19 and the financial health of US hospitals. JAMA. 2020;323(21):2127–8.
- 41. Cutler DM, Summers LH. The COVID-19 pandemic and the $16 trillion virus. JAMA. 2020;324(15):1495–6.
- 42. Stephenson CR, Bonnes SL, Sawatsky AP, et al. The relationship between learner engagement and teaching effectiveness: a novel assessment of student engagement in continuing medical education. BMC Med Educ. 2020;20(1):403.
- 43. Schoen MJ, Tipton EF, Houston TK, et al. Characteristics that predict physician participation in a web-based CME activity: the MI-Plus study. J Contin Educ Health Prof. 2009;29(4):246–53.