Abstract
Background Emergency medicine (EM) residency programs can provide up to 20% of their planned didactic experiences asynchronously through the Individualized Interactive Instruction (III) initiative. Although blogs and podcasts provide potential material for III content, programs often struggle with identifying quality online content.
Objective To develop and implement a process to curate quality EM content on blogs and podcasts for resident education and III credit.
Methods We developed the Approved Instructional Resources (AIR) Series on the Academic Life in Emergency Medicine website. Monthly, an editorial board identifies, peer reviews, and writes assessment questions for high-quality blog/podcast content. Eight educators rate each post using a standardized scoring instrument. Posts scoring ≥ 30 of 35 points are awarded an AIR badge and featured in the series. Enrolled residents can complete an assessment quiz for III credit. After 12 months of implementation, we report on program feasibility, enrollment rate, web analytics, and resident satisfaction scores.
Results As of June 2015, 65 EM residency programs are enrolled in the AIR Series, and 2140 AIR quizzes have been completed. A total of 96% (2064 of 2140) of participants agree or strongly agree that the activity would improve their clinical competency, 98% (2098 of 2140) plan to use the AIR Series for III credit, and 97% (2077 of 2140) plan to use it again in the future.
Conclusions The AIR Series is a national asynchronous EM curriculum featuring quality blogs and podcasts. It uses a national expert panel and novel scoring instrument to peer review web-based educational resources.
What was known and gap
Emergency medicine (EM) residents may receive up to 20% of their didactic education asynchronously through online learning.
What is new
A national expert panel of educators curates high-quality blogs and podcasts for graduate medical education, with post-education testing; 65 EM residency programs participate.
Limitations
The system for teaching material selection and the testing/scoring system do not have established validity evidence.
Bottom line
The program offers a free national curriculum and high-quality teaching materials for EM residents' individualized, asynchronous learning.
Introduction
The rising popularity of social media educational resources, such as blogs and podcasts, suggests that adult learners are interested in self-directed learning.1 Online educational resources are readily discoverable, accessible, and timely, in contrast to the synchronous, classroom-based, pedagogical curriculum typically employed in residency didactic conferences. The Accreditation Council for Graduate Medical Education (ACGME) Review Committee for Emergency Medicine (EM) initiative, called Individualized Interactive Instruction (III),2 enables EM residency programs to replace up to 20% of traditional didactic conference with faculty-monitored asynchronous learning. “Individualized learning away from groups of similar level learners, which allows learners to consume material at their own pace on their own timetable,”3 coincides with the rapid increase and popularity of educational blogs and podcasts in EM.4–6
Although a few individual residency programs have instituted asynchronous learning activities to replace didactic conference time,7 many programs are hesitant to adopt blogs and podcasts as III-eligible asynchronous learning material because few have peer review or quality assurance measures. Despite early studies on identifying and ensuring quality,8,9 no academic guidelines currently exist to help health professions educators identify trustworthy blogs and podcasts for their learners. Furthermore, there are few published descriptions of centralized, collaborative programs to develop, evaluate, and track resident participation in asynchronous learning activities involving social media resources.
In July 2014, we launched the Approved Instructional Resources (AIR) Series10 on the Academic Life in Emergency Medicine (ALiEM) website in response to these quality assurance challenges. Using a process called “content curation,” which involves identifying, filtering, and sharing relevant content with a target audience, we critically appraised online educational material and published quality modules for EM learners. This innovation report describes the AIR Series and its development, implementation, and initial outcomes, including feasibility, learner satisfaction, residency enrollment, and web analytics.
Methods
The AIR Series was developed in line with the Kern 6-step framework for curriculum development.11
1. Problem Identification and General Needs Assessment
The AIR Series was developed to address residencies' general need for quality online content that meets III criteria, a need identified through a literature search and reinforced by informal conversations with program directors at national meetings. Academic Life in Emergency Medicine (www.ALiEM.com), a public EM education and faculty development blog founded in 2009, now garners over 1 million page views annually worldwide, suggesting that learners are choosing to pursue self-directed learning online.
2. Targeted Needs Assessment
The target audience for the AIR Series is all US EM residents. ACGME requirements for III drove the targeted needs assessment. Residents and residency programs were informed of the AIR Series resource primarily through an ALiEM blog post announcement (www.aliem.com/new-air-series-aliem-approved-instructional-resources) and through a post in the Council of Emergency Medicine Residency Directors (CORD) e-mail listserv; informal feedback from these notifications further informed the targeted needs assessment.
3. Goals and Objectives
The overarching goal of the AIR Series is to identify and critically appraise available online EM educational materials in order to provide residents with an avenue for self-directed learning that meets III criteria. Although the learner objectives are module-specific, each module is designed to prepare the learner to apply core EM content to clinical scenarios, describe an evidence-based best practice approach to common patient presentations, and identify controversies in the literature for a given clinical question.
4. Educational Strategies
Three months prior to the AIR Series launch, a senior EM resident team leader recruited, by e-mail, a voluntary editorial board with expertise in graduate medical education and/or social media. The 8 faculty panelists consist of 2 professors, 2 associate professors, 3 assistant professors, and 1 clinical instructor. Of these, 7 are educational leaders (eg, dean, chair, residency director, associate residency director, clerkship director), and 3 are active educators on social media platforms. The editorial board also includes an additional faculty member with expertise in multiple-choice question writing. The team leader shepherds the entire workflow process, serves as the administrative contact person for residency programs, and is involved with every step of the curation and rating process. Instructional design strategies aligned with ACGME III criteria are shown in table 1.2
table 1.
5. Implementation
The AIR Series modules are posted monthly on the AIR Series web page (www.aliem.com/aliem-approved-instructional-resources-air-series). The workflow process, subdivided by team member roles and stepwise time investment, is summarized in figure 1. Each module features a core EM topic, mirroring the CORD Practice Test calendar (http://www.cordtests.org/Test%20Schedule.htm). The top 50 open-access blog and podcast sites in EM and critical care, identified by the Social Media Index (SMi-50),12 are searched for subject-specific content published within the previous 12 months. When an initial search yields more than 30 posts, a subgroup of 3 faculty raters filters them to exclude lower-quality content (defined as inaccurate content, missing references, or opinion pieces); if 2 of the 3 raters deem a post lower quality, it is excluded.
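To make the prefilter step concrete, the following is a minimal sketch of the 2-of-3 exclusion vote described above; the post identifiers, rater flags, and function name are hypothetical and serve only to illustrate the logic, not to represent the editorial board's actual tooling.

```python
# Minimal sketch of the 2-of-3 prefilter: a post is excluded when at least
# 2 of the 3 screening raters flag it as lower quality.
# Post names and flags below are hypothetical.

def prefilter(posts, screening_votes, threshold=2):
    """Keep a post unless at least `threshold` screeners flag it as lower quality.

    posts           -- list of post identifiers (e.g., URLs)
    screening_votes -- dict mapping post -> list of booleans, True = "lower quality"
    """
    kept = []
    for post in posts:
        flags = screening_votes.get(post, [])
        if sum(flags) >= threshold:   # 2 of 3 raters flag it: exclude
            continue
        kept.append(post)
    return kept

# Hypothetical example: three screeners review two posts on the month's topic.
posts = ["blogpost-aortic-dissection", "podcast-troponin-myths"]
votes = {
    "blogpost-aortic-dissection": [False, False, True],  # 1 flag: kept
    "podcast-troponin-myths":     [True, True, False],   # 2 flags: excluded
}
print(prefilter(posts, votes))  # ['blogpost-aortic-dissection']
```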
The 8 faculty raters grade the final list of blogs and podcasts using the AIR Scoring Instrument, which is based on 5 domains: the Best Evidence in Emergency Medicine (BEEM) Rater Scale, Content Accuracy, Educational Utility, Evidence-Based Medicine, and Referencing (table 2). The BEEM score has been shown to accurately predict citation rate and theoretical “impact” in peer-reviewed journal publications.13 Because the BEEM score presumes that a post's content is accurate, we created a second grading domain, Content Accuracy. To reward posts with high-quality, resident-specific educational content, the Educational Utility domain was added. Finally, to assess bias and transparency, the 2 domains of Evidence-Based Medicine and Referencing were added.
table 2.
Each of these 5 domains is scored on a 7-point Likert scale with behavioral anchors, and each rater independently enters his or her scores using Google Forms. Posts with a mean score of ≥ 30 of 35 points are awarded an AIR badge. Posts with a mean score of 27 to 29, or those receiving many positive free-text comments from raters, are also included as AIR Honorable Mention posts.
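As an illustration of these thresholds, the sketch below applies the stated cutoffs to hypothetical rater totals. It is not the editorial board's actual tabulation; in particular, the qualitative free-text-comment criterion for Honorable Mention is reduced here to a simple flag.

```python
# Minimal sketch of the AIR badge logic: each of 8 raters scores 5 domains
# on a 1-7 scale (maximum 35); the mean total across raters drives the label.
# The rater totals and comment flag below are hypothetical.
from statistics import mean

def air_label(rater_totals, strong_comments=False):
    """rater_totals: one 5-35 total per rater (sum of five 1-7 domain scores)."""
    avg = mean(rater_totals)
    if avg >= 30:
        return "AIR"
    if 27 <= avg <= 29 or strong_comments:
        return "Honorable Mention"
    return "Not featured"

# Hypothetical totals from the 8 faculty raters for two posts
print(air_label([31, 30, 33, 29, 32, 30, 31, 34]))  # mean 31.25 -> AIR
print(air_label([28, 27, 29, 28, 30, 27, 28, 29]))  # mean 28.25 -> Honorable Mention
```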
For each featured post, a team of content experts, including a member with item-writing expertise, identifies a core teaching point to optimize content validity and writes and revises a correlating multiple-choice question using best practice guidelines. This includes creating a clear stem, a single correct answer, and incorrect distractors; copyediting for content clarity; and piloting to optimize response process validity. Because the emphasis is on learning rather than achieving a score, and because there is no gold standard for a passing grade, the quizzes require the resident to select the correct answer before being allowed to progress to the next question. Multiple attempts are allowed.
Each monthly module's AIR and Honorable Mention posts are published on the ALiEM website along with a Google Forms link to the multiple-choice quiz and postmodule feedback survey that includes 3 items: (1) I anticipate that this activity will improve my competency in the emergency department (strongly agree, agree, neutral, disagree, strongly disagree); (2) I plan to use this AIR Series module for residency conference credit (yes, no); and (3) I plan to revisit the ALiEM AIR Series as a future educational resource (yes, no). Response process validity of these survey questions was addressed by piloting the questions in a focus group and revising them based on feedback. Resident responses to the quiz and survey are recorded and tracked in a master Google Drive spreadsheet. Residency directors participating in the AIR Series for III conference credit are provided with access to the master spreadsheet and have e-mail access to the team leader for questions or problems. All monthly posts are archived and retrievable by any resident on the ALiEM AIR website.
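The following sketch illustrates how rows from such a master tracking spreadsheet could be summarized for a participating program. The column names, sample entries, and helper function are hypothetical; the actual Google Drive spreadsheet layout is not specified here.

```python
# Minimal sketch: aggregating hypothetical rows from a master tracking
# spreadsheet so a program director can verify III credit and review feedback.
# Field names and sample data are illustrative, not the actual schema.
from collections import Counter

rows = [
    {"resident": "Resident A", "program": "Program X", "module": "2015-03",
     "quiz_completed": True, "improve_competency": "agree"},
    {"resident": "Resident B", "program": "Program X", "module": "2015-03",
     "quiz_completed": True, "improve_competency": "strongly agree"},
]

def program_summary(rows, program):
    """Count completed quizzes and tally the competency-feedback responses."""
    completed = [r for r in rows if r["program"] == program and r["quiz_completed"]]
    feedback = Counter(r["improve_competency"] for r in completed)
    return {"quizzes_completed": len(completed), "competency_feedback": dict(feedback)}

print(program_summary(rows, "Program X"))
# {'quizzes_completed': 2, 'competency_feedback': {'agree': 1, 'strongly agree': 1}}
```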
6. Evaluation and Feedback
The initial acceptability and impact of the AIR Series innovation were evaluated using 4 outcome measures: feasibility, residency program enrollment, website traffic analytics, and learner feedback.
This study received Institutional Review Board exemption from Oregon Health & Science University.
Results
The AIR Series was feasible to implement, incurring zero direct costs because it uses the free, cloud-based Google Drive platform, and the results are published on an existing blog site. Although many free blogging platforms are available (such as WordPress, Tumblr, and Medium), partnering with an established blog site provided the advantages of an existing readership, dissemination network, and reputation at the launch of the AIR Series. Regarding time costs, a total of 63 to 81 person-hours is required per monthly module, with the team leader accounting for approximately half of the time commitment (figure 1). An additional 10 to 20 hours may be required to build a new blog site if not partnering with an established one.
From an acceptability standpoint, since July 2014 the AIR Series has enrolled 65 US EM residency programs (figure 2) from the Midwest (16), Northeast (20), Southeast (16), Southwest (5), and West (8). In the first year (July 2014 to June 2015), the AIR Series featured 38 AIR and 45 Honorable Mention posts over 9 modules, recording 456 to 1830 page views from 188 to 470 cities in 29 to 50 countries within each module's first 30 days after publication. As of June 1, 2015, residents had completed 2140 AIR quizzes, with 96% (2064 of 2140) of participants agreeing or strongly agreeing that the activity would improve their clinical competency, 98% (2098 of 2140) planning to use the AIR Series for residency conference III credit, and 97% (2077 of 2140) planning to revisit the ALiEM AIR Series in the future.
Discussion
We found the ALiEM AIR Series, a free, national, asynchronous curricular resource peer reviewed by a geographically diverse editorial board, to be feasible to develop and implement. The program has enrolled 65 EM residency programs, has garnered high satisfaction scores from residents, and poses no financial burden and little administrative burden on programs. Our panel peer-review process using the AIR scoring instrument is a novel, feasible, and reproducible approach to identifying high-quality social media content at the national level for other health professions specialties and learners. This process adds a transparent layer of peer review and quality assurance that is missing from most of these resources.
As the AIR Series has gained widespread adoption for III conference credit across EM residency programs, we have added team members to ensure sustainability. Additions include a new resident member who identifies SMi-50 blog posts and podcasts under team leader supervision and 2 qualified faculty graders to allow a rotating grading cycle. Additionally, the AIR Series will soon transition to a custom learning management system to provide a more user-friendly interface to monitor learner progress and visualize analytic data.
There are several limitations in the design and implementation of the AIR Series. The Social Media Index is based on Alexa website rankings and social media followership counts; it lacks extensive validity evidence and may miss some quality educational content.12 Regarding the scoring tool, although the 5 domains (BEEM Rater Scale, Content Accuracy, Educational Utility, Evidence-Based Medicine, and Referencing) have some content validity as markers of educational quality, they have not been shown to be causally linked to increased accurate knowledge transfer in EM residents. Additionally, although we sought to optimize the content and response process validity of the scoring tool and assessment quizzes through expert development, review, and pilot testing, further validity evidence is needed to guide score interpretation. Finally, although interrater reliability of the scoring tool has not been assessed, the tool was constructed to be simple and unambiguous.
Future directions include demonstrating further learner-centered outcomes, such as the impact of the AIR Series on resident knowledge and performance; comparing the AIR Series to traditional learning methods such as textbooks, didactics, and journals; and collecting further validity evidence for the AIR scoring instrument.
Conclusion
The AIR Series curates quality blogs and podcasts for graduate medical education using a national expert panel of educators, a novel scoring instrument to peer review social media resources, and custom assessment quizzes. The initiative serves as the foundation for a free national curriculum featuring blogs and podcasts that can be incorporated into the EM curriculum to provide high-quality teaching materials for residents' online self-learning.
References
1. Holmes G, Abington-Cooper M. Pedagogy vs andragogy: a false dichotomy? J Technol Stud. 2000;26(2):50-55.
2. Accreditation Council for Graduate Medical Education Review Committee for Emergency Medicine. Frequently asked questions: emergency medicine. 2012. http://www.acgme.org/Portals/0/PDFs/FAQ/110_Emergeny_Medicine_FAQs.pdf. Accessed April 7, 2016.
3. Reiter DA, Lakoff DJ, Trueger NS, Shah K. Individual interactive instruction: an innovative enhancement to resident education. Ann Emerg Med. 2013;61(1):110-113.
4. Cadogan M, Thoma B, Chan TM, Lin M. Free Open Access Meducation (FOAM): the rise of emergency medicine and critical care blogs and podcasts (2002–2013). Emerg Med J. 2014;31(e1):76-77.
5. Mallin M, Schlein S, Doctor S, Stroud S, Dawson M, Fix M. A survey of the current utilization of asynchronous education among emergency medicine residents in the United States. Acad Med. 2014;89(4):598-601.
6. Purdy E, Thoma B, Bednarczyk J, Migneault D, Sherbino J. The use of free online educational resources by Canadian emergency medicine residents and program directors. CJEM. 2015;17(2):101-106.
7. Scott KR, Hsu CH, Johnson NJ, Mamtani M, Conlon LW, DeRoos FJ. Integration of social media in emergency medicine residency curriculum. Ann Emerg Med. 2014;64(4):396-404.
8. Paterson QS, Thoma B, Milne WK, Lin M, Chan TM. A systematic review and qualitative analysis to determine quality indicators for health professions education blogs and podcasts. J Grad Med Educ. 2015;7(4):549-554.
9. Thoma B, Chan TM, Paterson QS, Milne WK, Sanders JL, Lin M. Emergency medicine and critical care blogs and podcasts: establishing an international consensus on quality. Ann Emerg Med. 2015;66(4):396-402.e4.
10. Lin M. ALiEM AIR Series. Academic Life in Emergency Medicine. 2014. http://www.aliem.com/aliem-approved-instructional-resources-air-series. Accessed February 15, 2016.
11. Kern DE, Thomas PA, Howard DM, Bass EB. Curriculum Development for Medical Education: A Six-Step Approach. Baltimore, MD: Johns Hopkins Press; 1998.
12. Thoma B, Sanders JL, Lin M, Paterson QS, Steeg J, Chan TM. The social media index: measuring the impact of emergency medicine and critical care websites. West J Emerg Med. 2015;16(2):242-249.
13. Carpenter CR, Sarli CC, Fowler SA, Kulasegaram K, Vallera T, Lapaine P, et al. Best evidence in emergency medicine (BEEM) rater scores correlate with publications' future citations. Acad Emerg Med. 2013;20(10):1004-1012.