Perspectives on Medical Education. 2023 Jan 9;12(1):25–40. doi: 10.5334/pme.817

Guidelines: The Do’s, Don’ts and Don’t Knows of Creating Open Educational Resources

Faran Khalid 1, Michael Wu 1, Daniel K Ting 2, Brent Thoma 3, Mary R C Haas 4, Michael J Brenner 5, Yusuf Yilmaz 6,7, Young-Min Kim 8, Teresa M Chan 9,10
PMCID: PMC9997113  PMID: 36908747

Abstract

Background:

In medical education, there is a growing global demand for Open Educational Resources (OERs). However, OER creators are challenged by a lack of uniform standards. In this guideline, the authors curated the literature on producing OERs for medical education and distilled it into practical guidance in the form of Do’s, Don’ts and Don’t Knows, with the aim of improving the impact and quality of OERs in medical education.

Methods:

We conducted a rapid literature review by searching OVID MEDLINE, EMBASE, and Cochrane Central database using keywords “open educational resources” and “OER”. The search was supplemented by hand searching the identified articles’ references. We organized included articles by theme and extracted relevant content. Lastly, we developed recommendations via an iterative process of peer review and discussion: evidence-based best practices were designated Do’s and Don’ts while gaps were designated Don’t Knows. We used a consensus process to quantify evidentiary strength.

Results:

The authors performed full-text analysis of 81 eligible studies. A total of 15 guidelines (Do’s, Don’ts, and Don’t Knows) were compiled and presented alongside relevant evidence about OERs.

Discussion:

OERs can add value for medical educators and their learners, both as tools for expanding teaching opportunities and for promoting medical education scholarship. This summary should guide OER creators in producing high-quality resources and pursuing future research where best practices are lacking.

Introduction

The rapid expansion of remote and electronic learning has coincided with a digital transformation of the educational landscape, affecting how content is disseminated, applied, and integrated into practice. Open Educational Resources (OERs), defined as “teaching, learning and research materials in any medium, digital or otherwise, that reside in the public domain or have been released under an open license that permits no-cost access, use, adaptation and redistribution by others with no or limited restrictions” [1], are increasingly used at all levels of medical education. They include freely accessible, openly licensed text, media, and other digital assets for teaching, learning, assessing, and research that may be re-mixed, improved and redistributed under some licenses [2].

Unfortunately, as podcasts, blogs, online journal clubs, online textbooks, and other forms of OER have proliferated, standards for quality have lagged, with potential negative impacts on the quality of the education delivered via these newer media [3,4]. This is particularly relevant at a time when these resources must distinguish themselves from the substantial amount of medical misinformation being published online [5].

Evidence-based guidance in the creation of OERs would support increased quality and consistency in these resources while also ensuring alignment between resource design and learner needs [6,7,8]. We enlisted medical education experts to curate the available evidence with the goal of defining best practices in the creation of OERs in medical education.

Methods

Development of this guideline proceeded in four stages: literature review; guideline development; consensus development; and grading. The literature review was conducted using a combination of the results from a rapid review of OER evaluation methods by Ting et al., a hand search of all references from the articles from the Ting review, and expert feedback from our guidelines’ senior authors who are well-published in this area [4]. These stages resulted in a comprehensive list of guidelines in the form of “Do’s, Don’ts, and Don’t Knows”.

Authorship Team

We recruited leading medical educators from Canada, the United States, Turkey, and Korea to contribute. We identified expert participants on the basis of their expertise in medical education and open access resources and their previous experience developing guidelines or consensus documents, drawing from a closed online research community of practice called the Technology Education and Collaboration Hub [9]. Additional authors were recruited for their roles as frontline users of OERs (e.g. students, frontline teachers). We aimed for multidisciplinary representation, with representatives from medical specialties, surgical specialties, and the learning sciences.

Literature Review

Articles for guideline development were aggregated from four sources. First, we procured all citations from the rapid review by Ting et al., a literature review of evaluation tools for OERs [4]. Second, we crowd-sourced additional studies relevant to OER creation from the authorship team. Third, we conducted a time-restricted search of OVID MEDLINE, EMBASE and Cochrane Central from 2010–2020 using the keywords “open educational resources” and “OER.” Fourth, we hand searched the reference lists of each included article for other relevant articles.
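For readers who wish to replicate or update the time-restricted keyword search, a programmatic query can make the keyword and date logic explicit. The sketch below is illustrative only: the review searched OVID MEDLINE, EMBASE and Cochrane Central through their own interfaces, whereas this example assumes Biopython and NCBI's public PubMed E-utilities, and the email address is a placeholder.

```python
# Illustrative sketch only: the guideline's search used OVID MEDLINE, EMBASE,
# and Cochrane Central; this example runs the same keyword and date logic
# against PubMed via NCBI's E-utilities (Biopython) to show how such a
# time-restricted search can be reproduced programmatically.
from Bio import Entrez

Entrez.email = "you@example.org"  # NCBI requires a contact address (placeholder)

# Keywords from the Methods, combined with OR; dates bound the 2010-2020 window.
handle = Entrez.esearch(
    db="pubmed",
    term='"open educational resources" OR "OER"',
    datetype="pdat",   # restrict by publication date
    mindate="2010",
    maxdate="2020",
    retmax=1000,       # cap on returned PMIDs
)
record = Entrez.read(handle)
handle.close()

print(f"Records matched: {record['Count']}")
print("First PMIDs:", record["IdList"][:10])
```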

Guideline Development

The lead authors (MW, FK, TMC) distilled key takeaway points and suggestions for OER curation and presented these findings for discussion with the larger authorship team. The authorship team provided input based on practical experience and literature. Through discussion and review, recommendations were developed and categorized. The lead authors then translated these recommendations into draft guidelines in the form of Do’s, Don’ts and Don’t Knows.

Consensus Development

The guideline list and references were then circulated for all authors to suggest further evidence and edits. When opinion was divided on whether a guideline should be classified as a Do or a Don’t, it was moved into the Don’t Know category. After further refinement based on comments from the authorship team, the lead authors circulated the guidelines for final approval.

Grading

Guidelines were then graded for strength of evidence using the Perspectives on Medical Education (PMED) grading framework [10,11] by the lead authors (MW, FK, TMC).

Results

Eighty-one studies were initially included for full-text analysis and identification of key categories. The primary authors identified the key takeaways from these articles and formulated them as guidelines; based on our thematic analysis of these studies, we derived a final list of 15 guidelines through discussion and consensus-building. Figure 1 depicts our workflow. The studies that were ultimately included can be found in the reference section of this paper, while Appendix A lists all studies considered regardless of their inclusion in the final publication. After careful consideration and discussion within the group, only categories pertinent to the creation of OERs directed guideline development.

Figure 1. Flow diagram for included studies. Legend: OER = Open Educational Resource.

The background evidence for each guideline is described within each subsection. The strength of the recommendations is summarized in Table 1. The raw results table can be found in Appendix A.

Table 1.

Consensus criteria for strength of recommendation.


GUIDELINE NUMBER CLASSIFICATION OF DO’S, DON’TS AND DON’T KNOWS (RECOMMENDATION LEVEL IN BRACKETS)

1 Do: Conduct a needs assessment for OERs (Moderate)

2 Do: Create OER for rarely performed procedures or seldom encountered clinical presentations (Moderate)

3 Do: Use critical appraisal and tools to guide creation of OERs (Moderate)

4 Do: Contribute to a virtual community of practice of educators interested in OER creation (Strong)

5 Do: Use video archival to enhance online learning and education (Moderate)

6 Do: Teach learners to develop OERs as scholarship (Moderate)

7 Do: Develop infographics and graphical abstracts by following criteria for content, formatting, and style (Strong)

8 Do: Create and use podcasts as learning resources, knowing that trainees may be concurrently dual tasking (Moderate)

9 Do: Consider creation of OERs that may be suitable for use at the point-of-care (Tentative)

10 Don’t: underestimate the uptake and influence of OERs in trainee education (Tentative)

11 Don’t: overlook the need to encourage learners to critically appraise OERs (even your own) (Tentative)

12 Don’t Know: Best practices for cross-language and cross-cultural sharing of OERs (Tentative)

13 Don’t Know: Efficacy of various types of OERs and best practices for OERs to optimize learning (Tentative)

14 Don’t Know: The ideal way to incorporate OERs into existing curricula (Tentative)

15 Don’t Know: Ethical considerations in use of OERs that are developed with industry support (Tentative)

Strong: A large and consistent body of evidence.

Moderate: Solid empirical evidence from one or more papers plus consensus of the authors.

Tentative: Limited empirical evidence, but clear consensus of the authors.

While most of the guidelines apply to all OERs, in some cases we focused our recommendations on particular types of OERs (e.g. infographics, blogs, and podcasts). For guidelines referencing specific types of OERs, the subtype is explicitly mentioned in the subheading.

Guideline 1 – Do: Conduct a needs assessment for OERs

The complex process of OER creation often fails to consider the needs of learners. Neglecting learner needs can result in irrelevant content ill-suited to the curricular demands of its primary audience. Many studies, however, focus on understanding the content and the mode of delivery that learners desire [12,13,14,15]. The evidence from these papers should guide the creation of OERs to better facilitate learning and fill existing gaps. To understand learners’ needs, educators and content creators should conduct needs assessments through mechanisms such as surveys and focus groups. For example, Mallin et al. found that 80% of residents choose an OER topic for studying based on a recent clinical encounter [16], while Forestell and colleagues directly surveyed both trainees and educators to determine their curricular plan [12]. Studies like these can help creators decide on topics for their OERs.

Assessing learner preference for medium can increase the likelihood that learners access and share content. For example, Mallin et al. found that emergency medicine (EM) residents prefer podcasts and social media to home reading assignments [16]. Given that many EM residencies operate on a rotating curriculum covering all areas of the field, educators may consider suggesting timely resources, such as podcasts on topics the resident is likely to encounter in the near future. Although tailoring to learning styles has little evidence of effectiveness, adopting preferred formats can improve exposure [17]. Furthermore, an emphasis on active over passive formats enhances learner engagement [18], although one study found that active formats are rare [19].

From the junior learner (medical student or first-year resident) perspective, existing OERs often lack or underemphasize foundational content [20,21]. Online survey-based needs assessments can elucidate differences in educator and student perceptions of topic importance [12,13,22,23,24]. In 2020, Forestell et al. conducted an online survey-based needs assessment in which educators and students assigned value to each of 32 EM topics [12]. Of the 32 topics, the authors identified 23 that learners and educators mutually agreed were important for future OERs. This needs assessment identified general consistency between the topics that students and educators deemed high priority. However, EM educators valued some topics (i.e. chest pain, dyspnea) more highly than students, suggesting that students may not always self-identify their own knowledge gaps. Consequently, OER creators must seek out the perspectives of both learners and teachers to best identify existing gaps in knowledge and/or curricula.

Guideline 2 – Do: Create OERs for rarely performed procedures or seldom encountered clinical presentations

Educators should create more OERs for procedures and clinical presentations that learners, including rural physicians, encounter less frequently during clinical care. The current literature provides an abundance of content on common conditions and a relative paucity of credible research on rarer topics and procedures [23,25]. Expanding OERs geared toward rural physicians can better account for varying geographical contexts among learners. In 2016, Folkl et al. surveyed rural and urban Canadian EM physicians on their usage of EM resources and their self-reported confidence in EM domains [23]. The study identified that rural and urban emergency physicians perceive their own knowledge levels differently, particularly knowledge related to critical care. This difference may reflect less exposure of rural EM physicians to critically ill patients, or the need for a wider knowledge base among rural practitioners who may lack access to specialists. As such, OERs must address certain topics in greater detail to include rural learners. This observation should also apply to other healthcare professional populations. From this needs assessment, we infer that OERs should likely also cover seldom encountered clinical presentations.

Guideline 3 – Do: Use critical appraisal and tools to guide creation of OERs

As OERs have emerged as an important resource within medical education, several tools have been developed to evaluate their quality. Ting et al. (2020) published a rapid review that identified different quality assurance tools that were designed for these resources [4]. Table 2 provides an overview of the critical appraisal tools outlined in this review with the addition of the recently published revised Approved Instructional Resources (AIR) Score [26].

Table 2.

Overview of critical appraisal tools outlined in this review.


TOOL SUMMARY SOURCE

Quality checklists for blogs and podcasts [101] The quality checklists were the first of the tools published for medical education. They were derived from lists of quality indicators identified within Delphi studies. They could support both the review and creation of OERs. Paterson QS, Colmers IN, Lin M, Thoma B, Chan T. The quality checklists for health professions blogs and podcasts. The Winnower. 2015;1–7. Available at: https://thewinnower.com/papers/2641-the-quality-checklists-for-medical-education-blogs-and-podcasts

rMETRIQ Score [70] The METRIQ-5 and METRIQ-8 scores were derived from the same quality indicators as the quality checklists. The revised METRIQ (rMETRIQ) score is an optimized version that was clarified through a utilization study. It aims to provide a quality score that could be applied with little expertise [70]. Revised METRIQ score grading tool: https://metriqstudy.org/the-rmetriq-score-a-quality-assessment-tool-for-the-critical-appraisal-of-foam/

rAIR Score [26] The ALiEM AIR Score was developed by emergency medicine educators to rate OERs for their learners. The revised AIR (rAIR) score was optimized and simplified through a utilization study but serves the same purpose [26]. Revised AIR score grading tool: https://europepmc.org/articles/PMC8194147/figure/aet210601-fig-0001/

MEWQET, MEOW, and CCMEWQET These three scores have a common origin. MEWQET, developed by pathology educators, was modified into MEOW by otolaryngology educators. These two tools informed the creation of CCMEWQET by critical care educators. They provide a specialty-specific lens to the evaluation of OERs. MEWQET:
https://www.jpathinformatics.org/articles/2013/4/1/images/JPatholInform_2013_4_1_29_120729_sm11.jpg
MEOW:
https://journalotohns.biomedcentral.com/articles/10.1186/s40463-017-0220-4/tables/1
CCMEWQET:
https://journals.sagepub.com/na101/home/literatum/publisher/sage/journals/content/jica/2019/jica_34_1/0885066618759287/20181129/images/large/10.1177_0885066618759287-table1.jpeg

In the same way that quality evaluation tools inform the conduct of research, these tools can support OER developers in producing high-quality resources. By ensuring that their resources meet the quality indicators of the relevant tools, creators are likely to increase the quality of their work.

Guideline 4 – Do: Contribute to a virtual community of practice of educators interested in OER creation

A community of practice (CoP) is formed when a group of professionals who share an interest engage in a social context conducive to participatory learning [27]. Although CoPs differ widely, three unifying attributes are mutual engagement, joint enterprise, and shared repertoire [27,28,29,30]. Mutual engagement arises from the chemistry of social relationships that bind members together, whereas joint enterprise refers to the set of goals that they share. Members cultivate a shared repertoire as they develop mutually understood knowledge, techniques, ideas, and terminology. A virtual CoP shares many of the same features as a regular CoP, except that the primary mode of communication between members occurs in the virtual space, using online communication tools [28,29,30]. In recent years, virtual CoPs have gained increasing traction, and a growing number have started to emerge around OERs [31,32].

There are several compelling reasons to join a virtual CoP. CoPs unite individuals who share a passion. Additionally, establishing blended networks of faculty and trainees affords a structure for meaningful coaching, mentorship, and sponsorship relationships [31,33]. For trainees, connecting with mentors can help them optimize their social media use, avoid pitfalls such as professionalism lapses, and learn the basics of OER creation [32,34,35]. For educators, CoPs can galvanize collaborative scholarship, achieving a “critical mass” of motivated individuals with whom to share work and ideas [31,35]. Additionally, emerging evidence suggests that educators may use the virtual CoP in novel ways that facilitate high-quality scholarship, such as recruiting participants for research studies [36,37]. Trainees, in turn, can add further value to the CoP through bidirectional mentorship, sharing how they use modern technologies and programs in ways that can assist faculty in their creation of OERs.

Many blog-based OERs welcome submissions from their online audience of trainees for consideration for publication [38]. Some have introduced a modified peer-review process for submissions that seeks both to elevate the quality of the work and to provide a positive academic experience through transparent coaching from faculty physicians [34,38]. To facilitate further progression from peripheral to core membership within the CoP, organizations have created formal training curricula through apprenticeships that pair trainees with experienced mentors in OER production [32,34,35,38,39].

Guideline 5 – Do: Use video archival to enhance online learning and education

Educators should continue to use videoconferencing with archival (e.g. recording a Zoom webinar or talk and then sharing it via a social video streaming service such as YouTube or Vimeo) for the open dissemination of knowledge and for overcoming distance and access barriers, a need amplified by the physical distancing necessitated during the COVID-19 pandemic. Educators should invest in applying new technologies, since these can help in: 1) improving project collaboration; 2) creating virtual meetings; 3) fostering digital mentorship; 4) forming virtual communities of practice; and 5) advancing online learning in realms where remote e-work is unavoidable and where communication may previously have been hindered by a lack of in-person activities [40,41].

Guideline 6 – Do: Teach learners to develop OERs as scholarship

In Scholarship Reconsidered, Boyer argued that a narrow view of scholarship as experimental research is incomplete and that an expanded definition that includes integration, application, and teaching is needed to capture the work of academics [42]. Boyer’s expanded framework for scholarship paved the way for enhanced legitimacy of OERs as the widespread adoption of the internet followed in the 1990s and 2000s [43]. Thoma et al. describe examples of digital products that parallel traditional scholarly output, largely falling under the “teaching and learning” category: interactive resources (i.e., online discussion boards, social networks and wikis), independent study resources (i.e., e-mail, online courses, serious games, virtual reality, web-based and computer-assisted learning), audiovisual resources (i.e., podcasts, video podcasts, and instructional videos), point-of-care resources (i.e., applications), written resources (i.e., online textbooks, blogs, open access journals and websites), and resource repositories (i.e., online repositories and search engines) [44]. Sherbino and colleagues later expanded this definition to specifically examine social media-based scholarship [45]. Husain et al. define digital scholarship as “original content that is disseminated digitally, whether that content is research, teaching materials, enduring resources, commentaries, or other scholarly work” [46].

Learners can be supported by faculty to develop OERs as scholarship through formal curricular teaching methods. Murray et al. studied the effectiveness of teaching evidence-based medicine to medical students using Wikipedia [47]. First year medical students in small groups were tasked to choose a medical Wikipedia article to appraise and edit. Students excelled at identifying knowledge gaps and selecting appropriate literature for edits. Positioning medical learners as critical appraisers of existing digital resources actively engaged them in OER creation. However, students faced challenges including the technical aspect of editing Wikipedia and difficulty collaborating with the greater Wikipedia community. This highlights that although novel digital assignments and tools can facilitate medical learning, technical and social barriers accompany them.

Guideline 7 – Do: Develop infographics and graphical abstracts by following criteria for content, formatting, and style

Infographics and graphical abstracts (including visual abstracts) are powerful tools that not only prompt individuals to take notice of content, but also serve to frame the overarching message and takeaways [48]. The salience of these visually oriented tools underscores the need to adhere to best practice in content, formatting, and style, with an eye toward minimizing potential sources of bias. Scientists and scholars have increasingly leveraged these visual tools [49,50,51,52], a trend that parallels the overall accelerating adoption of technology in medical education [53]. Graphical abstracts are a relatively new entrant to the educational and scientific community. While no uniformly agreed-on standards exist, several styles have emerged.

The visual abstract style, introduced in 2016, is arguably most relevant to OER creation [49]. Visual abstract style infographics are used primarily by scholarly journals and consist of a title and key findings (text and visual icon), showcasing the most important data [52]. Visual abstracts are usually created with digital software and can increase engagement by healthcare professionals, particularly on social media [54]. The infographic style is less specific to medical journals and often professionally produced using specialized software; the intended audience may be academicians [55,56,57] or the general public [58]. The diagram style was developed decades prior for use in specialized fields [50]. Last, the comic style conveys research findings, combining humorous illustration with minimal text [59]. Among these, the visual abstract is most relevant to medical educators, as visual abstracts are straightforward to develop, useful for scholarly endeavors, and increasingly used on social networking platforms [49].

The visual abstract needs to accurately translate the written abstract without distortion, so the need for standardization is more stringent than for other infographics. Visual abstracts should give context to the study; explicitly state the quality of the evidence; and minimize reporting bias where applicable. To maximize engagement, the layout should be clear and organized, with easy-to-read fonts and high-resolution images, and the images should be reproducible in grayscale. The source of the data should be transparent, including author names, degrees, and full citation. Consumers often prefer visual abstracts because they impose less cognitive load, although visual abstracts do not necessarily improve delayed information retention [57].

Infographics and visual abstracts are alluring to readers due to their imagery and succinctness, but these attributes also make them inherently susceptible to misinterpretation. This is especially important because accelerated dissemination of research via social networks may improve uptake of an article, but this rapidity also makes it challenging to rein in misconceptions [55,56].

The bias toward positive results, a tendency long documented in the scientific literature [60,61], is of particular concern with visual abstracts. Care must be taken to avoid oversimplification [62] and overgeneralization. Out of necessity, visual abstracts tend to be pared down and often lack key statistical measures of uncertainty, such as confidence intervals. Furthermore, visual abstracts are often created outside of the peer-review process. A few practices can help mitigate these risks of bias. For example, conducting both an internal and an external review before posting on social media (e.g., Twitter) provides layers of quality control [63].

Guideline 8 – Do: Create and use podcasts as learning resources, knowing that trainees may be concurrently dual tasking

Students and educators increasingly utilize podcasts [64]. Podcasts can also effectively disseminate information and learning objectives across various regional learning institutions.

Podcasts are easy to use and engaging, enabling both broad exposure to content and targeted learning [7]. The ability to multitask, listening to podcasts while doing other activities, is a unique advantage of the format: learners can use their time productively [7,65]. A recent randomized controlled trial showed that listening to podcasts while driving did not significantly decrease retention, either 30 minutes after listening (initial recall) or 30 days later (delayed recall), compared with undistracted listening [66].

Educators should examine podcast usage patterns to guide specifics of content development, such as episode length. Most listeners use podcasts in sessions of less than 30 minutes [65,67] and for less than 2 hours per week [68]. Educators should also be aware of what motivates learners to use podcasts (e.g., learning content and staying up to date were the main motivators) [65,67,68].

Based on the current evidence, podcasts should be less than 30 minutes long and cover up-to-date information. Podcasts generally do not incorporate active learning and thus necessitate complementary resources. Gestalt ratings from approximately 20 health professionals are required to reliably assess podcast quality [17], so consulting broadly and gathering wide opinions may be worthwhile prior to OER release.

Guideline 9 – Do: Consider creation of OERs that may be suitable for use at the point-of-care

OERs can serve as valuable “point of care” (POC) resources, defined as “any reference material used in the provision of medical care directly at the bedside and may include clinical problem-solving, patient care, patient education, or learner education” [15]. Educators and students alike can use them to answer clinical questions in real time, enhancing patient care and education. For those creating POC resources, Patocka et al. generated a conceptual framework for how EM providers use POC resources that describes four main purposes: deep-dive, advanced clinical decision making, teaching patients, and teaching learners [15]. Junior learners tend to prioritize increasing their depth of knowledge (“deep dive”), whereas more senior learners seek to answer specific clinical questions through small bursts of knowledge-seeking prompted by scenarios [15]. Additionally, senior learners and practicing physicians tend to use POC resources not only for themselves but also to disseminate knowledge to others, including both patients and learners, thereby freeing up time for other tasks [15]. OER creators can draw on this literature to better tailor a resource to its use and user.

Guideline 10 – Don’t: underestimate the uptake and influence of OERs in trainee education

The recent expansion of asynchronous OERs has spurred a movement away from traditional textbooks and synchronous classroom-based activities, particularly among more recent generations of learners. As early as 2015, a survey of Canadian EM residents found that over 90% of respondents had used OERs for general EM education, procedural skills, and diagnostic test interpretation, with the most commonly used resources including wikis, file-sharing websites, textbooks, and podcasts [6]. A 2014 survey of American EM residents found that roughly 98% engaged in at least an hour of educational activities outside of traditional residency curricula, with listening to podcasts reported more commonly than time reading textbooks [16]. Most residents perceived podcasts to have the greatest benefit over other resources such as textbooks, journals, and Google [6]. Global uptake of asynchronous OERs has also occurred, although more slowly. A 2013 global survey of 44 trainees demonstrated that 82% were aware of blogs, 80% of websites, 75% of podcasts, and 61% of Twitter as EM educational resources, with trainees in lower income settings generally less aware of specific resources, despite lack of internet access not appearing to be a major barrier to use [69]. This, however, is not universal across all modalities: podcasts were noted to have less uptake in low- and middle-income contexts [24].

Guideline 11 – Don’t: overlook the need to encourage learners to critically appraise OERs (even your own)

Although asynchronous OERs come with many advantages, Mallin et al. identified a concerning finding: trainees using asynchronous OERs reported rarely evaluating the primary sources or the quality of the evidence [16]. Again, generational differences have been noted, with program directors more likely than residents to access primary references [6]. While trainee use of online educational resources is inevitable, these findings highlight the need for educators to teach critical appraisal, including quality indicators such as the revised METRIQ score or the AIR score [26,70,71]. Some literature also suggests it may be worthwhile to explore new ways of teaching critical appraisal of these OERs alongside the original peer-reviewed papers [72]. While many have written about the role that OER producers must play in encouraging active critical appraisal of the peer-reviewed literature [73,74,75], some literature reminds us that critical appraisal of OERs themselves must also be incorporated into readers’ skills. This skill should be encouraged by educators and advocated by those who serve as OER creators, curators, or editors [4,8,72,76].

Guideline 12 – Don’t Know: Best practices for cross-language and cross-cultural sharing of OERs

One key advantage of OERs is the easy dissemination and accessibility of resources across the globe. However, the paucity of multilingual OER repositories represents a barrier to access. Moreover, the differing contexts and cultures of health globally may necessitate more than simple language translation to achieve the same relevance of a given OER for different audiences. Although OERs allow for the creation of content that appeals to learners in many countries speaking many languages, it remains unclear how best to improve the accessibility of content and the cross-language sharing of resources [3,24]. It is also unclear how OERs fare in varying cross-cultural contexts and whether they reinforce a certain way of thinking or introduce biases. While OERs are often used in the Global South [77], we must be mindful that in other medical education systems there have been barriers to engagement [78], and attending to these barriers in OERs will be a persistent challenge.

Guideline 13 – Don’t Know: Efficacy of various types of OERs and best practices for OERs to optimize learning

While there is some evidence that OERs of different formats (e.g., blogs vs podcasts) can achieve similar learning outcomes [65,79,80,81], additional research must clarify best practices for optimizing learning. Some research has explored what listeners feel makes podcasts most effective [17,18,67]. For instance, a recent study showed that interpolated questions within a podcast may improve knowledge retention [18]. However, additional empirical studies are needed to clarify which attributes enhance learner experience and outcomes. Interestingly, research can sometimes show counterintuitive findings: several studies have shown that participants often dual-task while listening to podcasts [17,65,67,82]; however, at least one randomized controlled trial has shown that dual-tasking with a driving simulator has little effect on learning outcomes [66].

Guideline 14 – Don’t Know: The ideal way to incorporate OERs into existing curricula

The approach that learners take when digesting OER content, such as podcasts, often deviates from traditional active learning behaviors like notetaking and repetition [17]. Traditional learning occurs in environments where active learning behaviors such as note-taking can be exhibited, whereas OERs like podcasts are often consumed in situations like driving or exercising where such behaviors are not easily observed. It thus remains unclear how OERs might be best harnessed to augment, complement, or replace traditional learning activities, and how medical educators can best integrate OERs into traditional medical curricula.

While there is some promising evidence that OERs may be better than textbooks in some circumstances [81], this has not held true across all studies and all comparisons of OERs with traditional formats [83], suggesting that more nuanced research is required to understand when OERs may be most appropriate for learning. With increasing strain and burnout among faculty and trainees alike, further research to determine which modalities might be equivalent to lectures or textbooks could help decrease the burden of synchronous learning experiences. It is important to understand the place of traditional resources versus OERs in learning and to investigate how best to integrate OERs into an educational strategy. These questions are important when considering needs-based OER creation.

Guideline 15 – Don’t Know: Ethical considerations in use of OERs that are developed with industry support

Partnerships between industry and academic centers have played an important role in bridging the gap between discovery and clinical implementation, but industry funding for the development of such resources is fraught with the potential for bias [84,85,86]. Whereas most refereed academic journals require disclosure of funding, such requirements are less standardized in OERs and often absent [86]. The authors advocate for transparency around funding sources and other potential sources of bias in informational resources, but strategies to ensure this transparency are still evolving. Best practice is for individuals creating such content to disclose financial or other potential conflicts relevant to the information provided. When OERs involve industry support, unrestricted educational grants are preferable, as they minimize the likelihood of sponsor influence on content. Educators should use due diligence in ensuring that material recommended to learners is free of commercial bias and fully transparent.

Discussion

We have presented 15 guidelines (9 Do’s, 2 Don’ts, and 4 Don’t Knows) for scaffolding the creation of OERs within medical education based on a critical reading of the literature. There are still many unanswered questions within this burgeoning new area of scholarship, and we hope that the “Don’t Knows” inspire others from diverse backgrounds to ask key questions about how OERs might be incorporated into medical education. Please see Figure 2 for a summary of our findings.

Figure 2. Summary graphic: Guidelines and background evidence for OER creation. A graphical representation summarizing the Do’s, Don’ts and Don’t Knows from this guideline.

As guidelines for the production of OERs are developed, these learning formats have become an increasingly legitimized form of educational scholarship. Sherbino et al. outlined four criteria, adaptable to these resources, for social media to be considered scholarship: it must 1) reflect original content; 2) advance the field of health professions education by building on theory, research or best practice; 3) be archived and disseminated; and 4) provide the health professions education community with the ability to comment on and provide feedback in a transparent fashion that informs wider discussion [45]. Additionally, while not developed specifically for digital scholarship, Glassick’s criteria for evaluating scholarship provide a valuable starting point: scholarship must demonstrate clear goals, adequate preparation, appropriate methods, significant results, effective presentation and reflective critique [87]. Creators of OERs can use these criteria to scaffold their processes and ensure that these resources are rigorous.

There are ever-looming threats to the OER movement. A recent study showed that there may be a decline in the number of OER producers [88]. The sustainability of OER production may well depend on how academics and/or readers come to value their continued existence [89,90]. If OERs are considered scholarship, it becomes important to quantify their value. Husain et al. provide recommendations for presenting digital scholarship to promotion and tenure committees, including demonstrating scholarship criteria, providing external evidence of impact through some of the previously described metrics, including digital peer-review roles, citing digital scholarship consistently, crafting a digital scholarship mission statement, and using traditional frameworks such as the teaching portfolio [46]. Cabrera et al. also provide recommendations for both scholars and academic institutions on preparing and interpreting promotion packets that include digital scholarship [91]. The formal acknowledgement of OERs as scholarship could help producers earn academic credit towards promotion, which could in turn improve the sustainability of OER production.

Unfortunately, when it comes to evaluating the impact and quality of OERs, metrics applied to traditional scholarship such as journal impact factor and number of citations do not translate well to digital scholarship [91,92,93,94]. Alternative impact metrics for educational scholarship could include page views, time spent on a page, reactions (e.g., likes, dislikes, favorites), impressions, dissemination, unique users, geographic reach, followers on professional social media accounts, the Social Media Index [95,96], Alexa Ranking, and Altmetrics [46,97,98]. Alternative quality metrics have also been developed including the METRIQ score [70,99], the Social Media Index [95,96], Approved Instructional Resources (AIR) score [26,71,100], and the Quality Checklists for Health Professions Blogs and Podcasts [101]. These tools could be used by promotion and tenure committees to adapt their criteria [46,102] in a way that encompasses a wider view of what can be considered scholarship [103].
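For creators who want to track such alternative metrics programmatically, a minimal sketch is shown below. It assumes Altmetric's free, rate-limited public v1 API (api.altmetric.com/v1/doi/<doi>) remains available as publicly documented; the endpoint, fields, and helper function are illustrative and not drawn from this guideline.

```python
# Minimal sketch: querying Altmetric's public v1 API for the attention score
# of a single article by DOI. Endpoint and response fields follow Altmetric's
# public documentation (an assumption), not this guideline.
import requests

def altmetric_score(doi: str):
    """Return the Altmetric attention score for a DOI, or None if untracked."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:  # DOI not tracked by Altmetric
        return None
    resp.raise_for_status()
    return resp.json().get("score")

# Example: look up this guideline's own DOI.
print(altmetric_score("10.5334/pme.817"))
```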

Limitations

This guideline has several limitations to consider. Firstly, while it seeks to aggregate the evidence regarding OER creation, this is an evolving field, and new articles will undoubtedly appear prior to publication. Secondly, because our review focused on the guiding literature in medicine about OER creation, we may have missed guideline literature from surgical or other diagnostic fields. Thirdly, there is a bias within the guideline towards EM citations and literature, which may have several causes: 1) there is a preponderance of published literature by the EM community around OERs, owing to the naming of the Free Open Access Medical education movement at an EM conference and the coincidental founding of a specialty-related journal; 2) several of the authors, as well as published consulting experts, identify as EM physicians, which may have skewed our awareness of certain bodies of literature; and 3) members of the EM community (including members of this authorship team) may preferentially use the term OER in their publications, which may have biased our search findings. In future guidelines, dedicated searches of certain modalities such as blogs, podcasts, and specific social media platforms may help reduce this bias.

Conclusion

This article consolidates key evidence into a guideline for the creation of OERs. We hope that the field will continue to evolve its practices by addressing the Don’t Knows, bringing further clarity to the Do’s and Don’ts of the field.

Funding Statement

The open access publishing fee was funded by a generous grant from McMaster University received by TMC (University Scholar Award).

Ethics and consent

There are no human or animal subjects in this article and hence ethics approval was not required.


Competing Interests

FK, MW, MRCH, MB, DKT, and YMK have no conflicts to declare. YY is the recipient of the TUBITAK Postdoctoral Fellowship grant. BT has received stipends from the University of Saskatchewan for teaching and research and from the Royal College of Physicians and Surgeons for teaching and administrative work. TMC reports honoraria and a research grant from McMaster University for her education research work with the McMaster Education Research, Innovation, and Theory (MERIT) group, and an administrative stipend for her role as associate dean via the McMaster Faculty of Health Sciences Office of Continuing Professional Development. She also discloses that she has received various grants from governmental sources (Government of Ontario, Virtual Learning Strategy eCampus Ontario program).

Author Contributions

All authors contributed to the 4 criteria of the International Committee of Medical Journal Editors: criteria 1 (Substantial contributions to the conception or design of the work; or the acquisition, analysis, or interpretation of data for the work), criteria 2 (Drafting the work or revising it critically for important intellectual content), criteria 3 (Final approval of the version to be published), and criteria 4 (Agreement to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved).

Faran Khalid and Michael Wu are Joint first author.

References

1. World Open Educational Resources Congress | United Nations Educational, Scientific and Cultural Organization [Internet]. [cited 2022 Apr 4]. Available from: https://webarchive.unesco.org/20160807000909/http://www.unesco.org/new/en/communication-and-information/events/calendar-of-events/events-websites/world-open-educational-resources-congress/.
2. Cadogan M, Thoma B, Chan TM, Lin M. Free Open Access Meducation (FOAM): the rise of emergency medicine and critical care blogs and podcasts (2002–2013). Emerg Med J. 2014; 31: e76–7. DOI: 10.1136/emermed-2013-203502
3. Chan TM, Stehman C, Gottlieb M, Thoma B. A Short History of Free Open Access Medical Education. The Past, Present, and Future. ATS Scholar. 2020. DOI: 10.34197/ats-scholar.2020-0014PS
4. Ting DK, Boreskie P, Luckett-Gatopoulos S, et al. Quality Appraisal and Assurance Techniques for Free Open Access Medical Education (FOAM) Resources: A Rapid Review. Semin Nephrol. 2020; 40: 309–19. DOI: 10.1016/j.semnephrol.2020.04.011
5. Naeem SB, Bhatti R, Khan A. An exploration of how fake news is taking over social media and putting public health at risk. Health Inf Libr J. 2021; 38: 143–9. DOI: 10.1111/hir.12320
6. Purdy E, Thoma B, Bednarczyk J, Migneault D, Sherbino J. The use of free online educational resources by Canadian emergency medicine residents and program directors. CJEM. 2015; 17: 101–6. DOI: 10.1017/cem.2014.73
7. Riddell J, Robins L, Brown A, Sherbino J, Lin M, Ilgen JS. Independent and Interwoven: A Qualitative Exploration of Residents’ Experiences with Educational Podcasts. Acad Med. 2020; 95: 89–96.
8. Azim A, Beck-Esmay J, Chan TM. Editorial Processes in Free Open Access Medical Educational (FOAM) Resources. AEM Educ Train. 2018; 2: 204–12. DOI: 10.1002/aet2.10097
9. Gisondi MA, Michael S, Li-Sauerwine S, et al. The Purpose, Design, and Promise of Medical Education Research Labs. Acad Med. 2022; 97: 1281–8. DOI: 10.1097/ACM.0000000000004746
10. Lefroy J, Watling C, Teunissen PW, Brand P. Guidelines: the do’s, don’ts and don’t knows of feedback for clinical education. Perspect Med Educ. 2015; 4: 284–99. DOI: 10.1007/s40037-015-0231-7
11. Chou CL, Kalet A, Costa MJ, Cleland J, Winston K. Guidelines: The dos, don’ts and don’t knows of remediation in medical education. Perspect Med Educ. 2019; 8: 322–38. DOI: 10.1007/s40037-019-00544-5
12. Forestell B, Beals L, Shah A, Chan TM. Developing ClerkCast: An Emergency Medicine Clerkship Needs Assessment Project. Cureus [Internet]. [cited 2021 Jan 17]; 12. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7188007/. DOI: 10.7759/cureus.7459
13. Chan TM, Jo D, Shih AW, et al. The Massive Online Needs Assessment (MONA) to inform the development of an emergency haematology educational blog series. Perspect Med Educ. 2018; 219–23. DOI: 10.1007/s40037-018-0406-0
14. Tseng EK, Jo D, Shih AW, Wit KD, Chan TM. Window to the Unknown: Using Storytelling to Identify Learning Needs for the Intrinsic Competencies Within an Online Needs Assessment. AEM Educ Train. 2019; 3: 179–87. DOI: 10.1002/aet2.10315
15. Patocka C, Lin M, Voros J, Chan T. Point-of-care Resource Use in the Emergency Department: A Developmental Model. AEM Educ Train. 2018; 2: 221–8. DOI: 10.1002/aet2.10101
16. Mallin M, Schlein S, Doctor S, Stroud S, Dawson M, Fix M. A survey of the current utilization of asynchronous education among emergency medicine residents in the United States. Acad Med. 2014; 89: 598–601. DOI: 10.1097/ACM.0000000000000170
17. Roland D, Thoma B, Tagg A, Woods J, Chan TM, Riddell J. What Are the Real-World Podcast-Listening Habits of Medical Professionals? Cureus [Internet]. 2021 [cited 2022 Apr 4]; 13. Available from: https://www.cureus.com/articles/62158-what-are-the-real-world-podcast-listening-habits-of-medical-professionals. DOI: 10.7759/cureus.16240
18. Weinstock M, Pallaci M, Aluisio AR, et al. Effect of Interpolated Questions on Podcast Knowledge Acquisition and Retention: A Double-Blind, Multicenter, Randomized Controlled Trial. Ann Emerg Med. 2020; 76: 353–61. DOI: 10.1016/j.annemergmed.2020.01.021
19. Zhang E, Trad N, Corty R, Zohrob D, Trivedi S, Rodman A. How podcasts teach: A comprehensive analysis of the didactic methods of the top hundred medical podcasts. Med Teach. 2022; 44: 1146–50. DOI: 10.1080/0142159X.2022.2071691
20. Grock A, Chan W, Aluisio AR, Alsup C, Huang D, Joshi N. Holes in the FOAM: An Analysis of Curricular Comprehensiveness in Online Educational Resources. AEM Educ Train [Internet]. [cited 2020 Dec 23]. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1002/aet2.10556. DOI: 10.1002/aet2.10556
21. Stuntz R, Clontz R. An Evaluation of Emergency Medicine Core Content Covered by Free Open Access Medical Education Resources. Ann Emerg Med. 2016; 67: 649–53.e2. DOI: 10.1016/j.annemergmed.2015.12.020
22. Khan T, Chan TM. Intrinsic CanMEDS Competencies Expected of Medical Students During Emergency Medicine Core Rotation: A Needs Assessment. Cureus. 2018; 10. DOI: 10.7759/cureus.3316
23. Folkl A, Chan T, Blau E. Use of Free, Open Access Medical Education and Perceived Emergency Medicine Educational Needs Among Rural Physicians in Southwestern Ontario. Cureus. 2016; 8: e796. DOI: 10.7759/cureus.796
24. Rodman A, Abrams HR, Watto M, et al. Medical Podcasting in Low- and Middle-Income Countries: A Needs Assessment and Vision for the Future. Teach Learn Med. 2021; 33: 416–22. DOI: 10.1080/10401334.2021.1875834
25. Rath A, Salamon V, Peixoto S, et al. A systematic literature review of evidence-based clinical practice for rare diseases: what are the perceived and real barriers for improving the evidence and how can they be overcome? Trials. 2017; 18: 556. DOI: 10.1186/s13063-017-2287-7
26. Grock A, Jordan J, Zaver F, et al. The revised Approved Instructional Resources score: An improved quality evaluation tool for online educational resources. AEM Educ Train. 2021; 5: e10601.
27. Lave J, Wenger E. Situated Learning. Cambridge, UK: Cambridge University Press; 1991. DOI: 10.1017/CBO9780511815355
28. Bourhis A, Dubé L, Jacob R. The Success of Virtual Communities of Practice: The Leadership Factor. J Knowl Manag. 2005; 3: 23–34.
29. Dube L, Bourhis A, Jacob R. Towards a Typology of Virtual Communities of Practice. Interdiscip J Inf Knowl Manag. 2006; 1: 69–93. DOI: 10.28945/115
30. Dubé L, Bourhis A, Jacob R. The impact of structuring characteristics on the launching of virtual communities of practice. J Organ Change Manag. 2005; 18: 145–66. DOI: 10.1108/09534810510589570
31. Thoma B, Brazil V, Spurr J, et al. Establishing a Virtual Community of Practice in Simulation: The Value of Social Media. Simul Healthc.
32. Yarris LM, Chan TM, Gottlieb M, Juve AM. Finding Your People in the Digital Age: Virtual Communities of Practice to Promote Education Scholarship. J Grad Med Educ. 2019; 11: 1–5. DOI: 10.4300/JGME-D-18-01093.1
33. Colbert GB, Topf J, Jhaveri KD, et al. The Social Media Revolution in Nephrology Education. Kidney Int Rep. 2018; 3: 519–29. DOI: 10.1016/j.ekir.2018.02.003
34. Ting DK, Thoma B, Luckett-Gatopoulos S, et al. CanadiEM: Accessing a Virtual Community of Practice to Create a Canadian National Medical Education Institution. AEM Educ Train. 2019; 3: 86–91. DOI: 10.1002/aet2.10199
35. Shah S, Topf J. Mentorship in the digital age: Nephrology social media collective internship. Clin J Am Soc Nephrol. 2019; 14: 294–6. DOI: 10.2215/CJN.09970818
36. Thoma B, Paddock M, Purdy E, et al. Leveraging a virtual community of practice to participate in a survey-based study: A description of the METRIQ Study Methodology. AEM Educ Train. 2017; 1: 110–3. DOI: 10.1002/aet2.10013
37. Dong JK, Saunders C, Wachira BW, Thoma B, Chan TM. Social media and the modern scientist: a research primer for low- and middle-income countries. Afr J Emerg Med. 2020; 10: S120–4. DOI: 10.1016/j.afjem.2020.04.005
38. Sidalak D, Purdy E, Luckett-Gatopoulos S, Thoma B, Chan TM. Coached Peer Review: Developing the Next Generation of Authors. Acad Med. 2017; 92: 201–4. DOI: 10.1097/ACM.0000000000001224
39. Caners K, Baylis J, Heyd C, Chan T. Sharing is caring: How EM Sim Cases (EMSimCases.com) has created a collaborative simulation education culture in Canada. CJEM. 2020; 22: 819–21. DOI: 10.1017/cem.2020.392
40. He S, Lai D, Mott S, et al. Remote e-Work and Distance Learning for Academic Medicine: Best Practices and Opportunities for the Future. J Grad Med Educ. 2020; 12: 256–63. DOI: 10.4300/JGME-D-20-00242.1
41. Gottlieb M, Landry A, Egan DJ, et al. Rethinking Residency Conferences in the Era of COVID-19. AEM Educ Train [Internet]. 2020 [cited 2020 Apr 25]; 4. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1002/aet2.10449. DOI: 10.1002/aet2.10449
42. Boyer EL, Moser D, Ream TC, Braxton JM. Scholarship reconsidered: Priorities of the professoriate. Lawrenceville: Princeton University Press; 1990.
43. Leiner BM, Cerf VG, Clark DD, et al. A brief history of the internet. ACM SIGCOMM Comput Commun Rev. 2009; 39: 10. DOI: 10.1145/1629607.1629613
44. Thoma B, Chan T, Benitez J, Lin M. Educational scholarship in the digital age: a scoping review and analysis of scholarly products. The Winnower. 2014; 1: 1–13. DOI: 10.15200/winn.141827.77297
45. Sherbino J, Arora VM, Van Melle E, Rogers R, Frank JR, Holmboe ES. Criteria for social media-based scholarship in health professions education. Postgrad Med J. 2015; 91: 551–5. DOI: 10.1136/postgradmedj-2015-133300
46. Husain A, Repanshek Z, Singh M, et al. Consensus Guidelines for Digital Scholarship in Academic Promotion. West J Emerg Med Integrating Emerg Care Popul Health [Internet]. 2020 [cited 2020 Jul 9]. Available from: https://escholarship.org/uc/item/44f3v8f1.
47. Murray H, Walker M, Dawson J, Simper N, Maggio LA. Teaching Evidence-Based Medicine to Medical Students Using Wikipedia as a Platform. Acad Med. 2020; 95: 382–6. DOI: 10.1097/ACM.0000000000003085
48. Ramos E, Concepcion BP. Visual Abstracts: Redesigning the Landscape of Research Dissemination. Semin Nephrol. 2020; 40: 291–7. DOI: 10.1016/j.semnephrol.2020.04.008
49. Ibrahim AM. Seeing is Believing: Using Visual Abstracts to Disseminate Scientific Research. Am Coll Gastroenterol. 2018; 113: 459–61. DOI: 10.1038/ajg.2017.268
50. Ibrahim AM, Bradley SM. Adoption of Visual Abstracts at Circulation CQO: Why and How We’re Doing It. Circ Cardiovasc Qual Outcomes [Internet]. 2017; 10. Available from: http://www.ncbi.nlm.nih.gov/pubmed/28302648. DOI: 10.1161/CIRCOUTCOMES.117.003684
51. Brownlee SA, Ibrahim AM. Disseminating Research and #VisualAbstracts. In: Dimick JB, Lubitz CC, editors. Health Serv Res [Internet]. Cham: Springer International Publishing; 2020 [cited 2022 Apr 11]. p. 271–81. DOI: 10.1007/978-3-030-28357-5_23
52. Gloviczki P, Lawrence PF. Visual abstracts bring key message of scientific research. J Vasc Surg. 2018; 67: 1319–20. DOI: 10.1016/j.jvs.2018.04.003
53. Chan TM, Dzara K, Dimeo SP, Bhalerao A, Maggio LA. Social media in knowledge translation and education for physicians and trainees: a scoping review. Perspect Med Educ. 2020; 9: 20–30. DOI: 10.1007/s40037-019-00542-7
54. Chapman SJ, Grossman RC, FitzPatrick MEB, Brady RRW. Randomized controlled trial of plain English and visual abstracts for disseminating surgical research via social media. Br J Surg. 2019; 106: 1611–6. DOI: 10.1002/bjs.11307
55. Huang S, Martin LJ, Yeh CH, et al. The effect of an infographic promotion on research dissemination and readership: A randomized controlled trial. Can J Emerg Med. 2018; 20: 826–33. DOI: 10.1017/cem.2018.436
56. Thoma B, Murray H, Huang SY, et al. The impact of social media promotion with infographics and podcasts on research dissemination and readership. Can J Emerg Med. 2018; 20: 300–6. DOI: 10.1017/cem.2017.394
57. Martin LJ, Turnquist A, Groot B, et al. Exploring the Role of Infographics for Summarizing Medical Literature. Health Prof Educ. 2018; 5: 48–57. DOI: 10.1016/j.hpe.2018.03.005
58. McCrorie A, Donnelly C, McGlade K. Infographics: Healthcare Communication for the Digital Age. Ulster Med J. 2016; 85: 71–5.
59. Farris GE. Annals Graphic Medicine – Dr. Mom: My Favorite Thing to Say on Rounds. Ann Intern Med. 2021; 174: W81–2. DOI: 10.7326/G21-0053
60. Olson CM, Rennie D, Cook D, et al. Publication bias in editorial decision making. JAMA. 2002; 287: 2825–8. DOI: 10.1001/jama.287.21.2825
61. Dickersin K. The existence of publication bias and risk factors for its occurrence. JAMA. 1990; 263: 1385–9. DOI: 10.1001/jama.1990.03440100097014
62. Ibrahim AM, Lillemoe KD, Klingensmith ME, Dimick JB. Visual Abstracts to Disseminate Research on Social Media: A Prospective, Case-control Crossover Study. Ann Surg. 2017; 266: e46. DOI: 10.1097/SLA.0000000000002277
63. Dyjur P, Li L. Learning 21st Century Skills by Engaging in an Infographics Assessment [Internet]. University of Calgary; 2015 [cited 2022 Apr 11]. Available from: https://prism.ucalgary.ca/handle/1880/50860
64. Kelly JM, Perseghin A, Dow AW, Trivedi SP, Rodman A, Berk J. Learning Through Listening: A Scoping Review of Podcast Use in Medical Education. Acad Med. 2022; 97: 1079–85. DOI: 10.1097/ACM.0000000000004565
65. Chin A, Helman A, Chan T. Podcast Use in Undergraduate Medical Education. Cureus [Internet]. 2017; 9. Available from: https://www.cureus.com/articles/9395-podcast-use-in-undergraduate-medical-education. DOI: 10.7759/cureus.1930
66. Gottlieb M, Riddell J, Cooney R, King A, Fung C-C, Sherbino J. Maximizing the Morning Commute: A Randomized Trial Assessing the Effect of Driving on Podcast Knowledge Acquisition and Retention. Ann Emerg Med. 2021; 78: 416–24. DOI: 10.1016/j.annemergmed.2021.02.030
  • 67.Riddell J, Swaminathan A, Lee M, Mohamed A, Rogers R, Rezaie SR. A Survey of Emergency Medicine Residents’ Use of Educational Podcasts. West J Emerg Med. 2017; 18: 229–34. DOI: 10.5811/westjem.2016.12.32850 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68.Ghiathi C, Seitz K, Kritek P. How to Create and Evaluate a Resident-Led Audio Program: Six Clinical Podcasts for Medicine House Staff. MedEdPORTAL J Teach Learn Resour. 2020; 16: 11062. DOI: 10.15766/mep_2374-8265.11062 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 69.Thurtle N, Banks C, Cox M, Pain T, Furyk J. Free Open Access Medical Education resource knowledge and utilisation amongst Emergency Medicine trainees: A survey in four countries. Afr J Emerg Med [Internet]. 2015; Available from: http://www.sciencedirect.com/science/article/pii/S2211419X15001330. [DOI] [PMC free article] [PubMed]
  • 70.Colmers-Gray IN, Krishnan K, Chan TM, et al. The Revised METRIQ Score: A Quality Evaluation Tool for Online Educational Resources. AEM Educ Train. 2019; 3: 387–92. DOI: 10.1002/aet2.10376 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 71.Chan TM, Grock A, Paddock M, Kulasegaram K, Yarris LM, Lin M. Examining Reliability and Validity of an Online Score (ALiEM AIR) for Rating Free Open Access Medical Education Resources. Ann Emerg Med. 2016; 68: 729–35. DOI: 10.1016/j.annemergmed.2016.02.018 [DOI] [PubMed] [Google Scholar]
  • 72.Ponce DPM, Tomlinson S, Sobolewski B. FOAM Club: A spin on the traditional journal club format focused on blogs and podcasts. AEM Educ Train [Internet]. [cited 2020 Aug 11];n/a. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1002/aet2.10516. [DOI] [PMC free article] [PubMed]
  • 73.Lin M, Joshi N, Hayes BD, Chan TM. Accelerating Knowledge Translation: Reflections From the Online ALiEM-Annals Global Emergency Medicine Journal Club Experience. Ann Emerg Med. 2017; 69: 469–74. DOI: 10.1016/j.annemergmed.2016.11.010 [DOI] [PubMed] [Google Scholar]
  • 74.Topf JM, Sparks MA, Phelan PJ, et al. The Evolution of the Journal Club: From Osler to Twitter. Am J Kidney Dis. 2017; 69: 827–36. DOI: 10.1053/j.ajkd.2016.12.012 [DOI] [PubMed] [Google Scholar]
  • 75.Topf JM, Hiremath S. Social media, medicine and the modern journal club. Int Rev Psychiatry. 2015; 27: 147–54. DOI: 10.3109/09540261.2014.998991 [DOI] [PubMed] [Google Scholar]
  • 76.Ting DK, Bailey BH, Scheuermeyer FX, Harris DR, Chan TM. The Journal Club 3.0: A qualitative, multisite study examining a new educational paradigm in the era of open educational resources. AEM Educ Train. 2022; 6: e10723. DOI: 10.1002/aet2.10723 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 77.de Oliviera Neto JD, Pete J, Daryono D, Cartmill T. OER use in the Global South: A baseline survey of higher education instructors [Internet]. African Minds, International Development Research Centre & Research on Open Educational Resources for Development; 2017. [cited 2022 Nov 24]. Available from: https://open.uct.ac.za/handle/11427/26405. [Google Scholar]
  • 78.Maggio LA, Costello JA, Ninkov AB, Frank JR, Artino AR, Jr. The voices of medical education scholarship: Describing the published landscape. Med Educ [Internet]. [cited 2022 Nov 24]; n/a. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1111/medu.14959. DOI: 10.1111/medu.14959 [DOI] [PMC free article] [PubMed]
  • 79.Roth J, Chang A, Ricci B, Hall M, Mehta N. Why Not a Podcast? Assessing Narrative Audio and Written Curricula in Obstetrical Neurology. J Grad Med Educ. 2020; 12: 86–91. DOI: 10.4300/JGME-D-19-00505.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 80.Lien K, Chin A, Helman A, Chan TM. A Randomized Comparative Trial of the Knowledge Retention and Usage Conditions in Undergraduate Medical Students Using Podcasts and Blog Posts. Cureus. 2018; 10: 1–12. DOI: 10.7759/cureus.2065 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 81.Back DA, von Malotky J, Sostmann K, Hube R, Peters H, Hoff E. Superior Gain in Knowledge by Podcasts Versus Text-Based Learning in Teaching Orthopedics: A Randomized Controlled Trial. J Surg Educ. 2017; 74: 154–60. DOI: 10.1016/j.jsurg.2016.07.008 [DOI] [PubMed] [Google Scholar]
  • 82.Thoma B, Goerzen S, Horeczko T, et al. An international, interprofessional investigation of the self-reported podcast listening habits of emergency clinicians: A METRIQ Study. Can J Emerg Med. 2020; 22: 112–7. DOI: 10.1017/cem.2019.427 [DOI] [PubMed] [Google Scholar]
  • 83.Schreiber BE, Fukuta J, Gordon F. Live lecture versus video podcast in undergraduate medical education: A randomised controlled trial. BMC Med Educ. 2010; 10: 68. DOI: 10.1186/1472-6920-10-68 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 84.Tao D, Boothby A, McLouth J, Prasad V. Financial conflicts of interest among hematologist-oncologists on twitter. JAMA Int Med. 2017; 177: 425–7. DOI: 10.1001/jamainternmed.2016.8467 [DOI] [PubMed] [Google Scholar]
  • 85.DeAngelis CD, Fontanarosa PB, Flanagin A. Reporting Financial Conflicts of Interest and Relationships Between Investigators and Research Sponsors. JAMA. 2001; 286: 89–91. DOI: 10.1001/jama.286.1.89 [DOI] [PubMed] [Google Scholar]
  • 86.Krimsky S. Do Financial Conflicts of Interest Bias Research?: An Inquiry into the “Funding Effect” Hypothesis. Sci Technol Hum Values. 2013; 38: 566–87. DOI: 10.1177/0162243912456271 [DOI] [Google Scholar]
  • 87.Glassick CE. Boyer’s Expanded Definitions of Scholarship, the Standards for Assessing Scholarship, and the Elusiveness of the Scholarship of Teaching. Acad Med. 2000; 75: 877–80. DOI: 10.1097/00001888-200009000-00007 [DOI] [PubMed] [Google Scholar]
  • 88.Lin M, Phipps M, Yilmaz Y, Nash CJ, Gisondi MA, Chan TM. A Fork in the Road for Emergency Medicine and Critical Care Blogs and Podcasts: Cross-sectional Study. JMIR Med Educ. 2022; 8: e39946. DOI: 10.2196/39946 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 89.Yilmaz Y, Ruan B, Thomas P, Tran V, Chan TM. Reframing organizations in the digital age: A qualitative study exploring institutional social media adoption involving emergency physicians and other researchers. F1000Research. 2021; 10: 1048. DOI: 10.12688/f1000research.73439.2 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 90.Chan TM, Ruan B, Lu D, Lee M, Yilmaz Y. Systems to support scholarly social media: a qualitative exploration of enablers and barriers to new scholarship in academic medicine. Can Med Educ J. 2021; 12: 14–27. DOI: 10.36834/cmej.72490 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 91.Cabrera D, Vartabedian BS, Spinner RJ, Jordan BL, Aase LA, Timimi FK. More Than Likes and Tweets: Creating Social Media Portfolios for Academic Promotion and Tenure. J Grad Med Educ. 2017; 9: 421–5. DOI: 10.4300/JGME-D-17-00171.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 92.Cabrera D. Mayo Clinic includes Social Media Scholarship Activities in Academic Advancement [Internet]. MCSMN Blog; 2016. [cited 2016 Jun 5]. Available from: https://socialmedia.mayoclinic.org/discussion/mayo-clinic-includes-social-media-scholarship-activities-in-academic-advancement/.
  • 93.Khan MS, Shahadat A, Khan SU, Ahmed S, Doukky R, Michos ED, et al. The Kardashian Index of Cardiologists. JACC Case Rep. 2020; 2: 2019–21. DOI: 10.1016/j.jaccas.2019.11.068 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 94.Hall N. The Kardashian Index: A Measure of Discrepant Social Media Profile for Scientists. Genome Biol. 2015; 15: 424. DOI: 10.1186/s13059-014-0424-0 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 95.Thoma B, Sanders J, Lin M, Paterson Q, Steeg J, Chan T. The Social Media Index: Measuring the impact of emergency medicine and critical care websites. West J Emerg Med. 2015; 16: 242–9. DOI: 10.5811/westjem.2015.1.24860 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 96.Thoma B, Chan TM, Kapur P, Sifford D, Siemens M, Paddock M, et al. The Social Media Index as an Indicator of Quality for Emergency Medicine Blogs: A METRIQ Study. Ann Emerg Med. 2018; 72: 696–702. DOI: 10.1016/j.annemergmed.2018.05.003 [DOI] [PubMed] [Google Scholar]
  • 97.Trueger NS, Thoma B, Hsu CH, Sullivan D, Peters L, Lin M. Altmetric Score – A New Measure for Article-Level Dissemination and Impact. Ann Emerg Med. 2015; 66: 549–53. DOI: 10.1016/j.annemergmed.2015.04.022 [DOI] [PubMed] [Google Scholar]
  • 98.Trueger NS, Thoma B, Hsu CH, Peters L, Lin M. The Altmetric Score – A Better Impact Factor? Ann Emerg Med. 2015; 66: 548–553. DOI: 10.1016/j.annemergmed.2015.04.022 [DOI] [PubMed] [Google Scholar]
  • 99.Chan TM, Thoma B, Krishnan K, et al. Derivation of two critical appraisal scores for trainees to evaluate online educational resources: A METRIQ Study. West J Emerg Med. 2016; 17: 574–84. DOI: 10.5811/westjem.2016.6.30825 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 100.Lin M, Joshi N, Grock A, et al. Approved Instructional Resources Series: A National Initiative to Identify Quality Emergency Medicine Blog and Podcast Content for Residency Education. J Grad Med Educ. 2016; 2: 219–25. DOI: 10.4300/JGME-D-15-00388.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 101.Paterson QS, Colmers IN, Lin M, Thoma B, Chan T. The quality checklists for health professions blogs and podcasts. The Winnower. 2015; 1–7. Available at: https://thewinnower.com/papers/2641-the-quality-checklists-for-medical-education-blogs-and-podcasts.
  • 102.Schimanski LA, Alperin JP. The evaluation of scholarship in academic promotion and tenure processes: Past, present, and future. F1000Research. 2018; 7: 1605. DOI: 10.12688/f1000research.16493.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 103.Ting DK, Bigham BL, Mehta S, Stiell I. Adding value to scholarship in residency: Supporting and inspiring future emergency medicine research in Canada. Can J Emerg Med. 2018; 20: 318–20. DOI: 10.1017/cem.2018.395 [DOI] [PubMed] [Google Scholar]
