Abstract
Introduction
Although ECG interpretation is an essential skill in clinical medicine, medical students and residents often lack ECG competence. Novel teaching methods are increasingly being implemented and investigated to improve ECG training. Computer-assisted instruction is one such method under investigation; however, its efficacy in achieving better ECG competence among medical students and residents remains uncertain.
Methods and analysis
This article describes the protocol for a systematic review and meta-analysis that will compare the effectiveness of computer-assisted instruction with other teaching methods used for the ECG training of medical students and residents. Only studies with a comparative research design will be considered. Articles will be searched for in electronic databases (PubMed, Scopus, Web of Science, Academic Search Premier, CINAHL, PsycINFO, Education Resources Information Center, Africa-Wide Information and Teacher Reference Center). In addition, we will review citation indexes and conduct a grey literature search. Data will be extracted from articles that meet the predefined eligibility criteria. A descriptive analysis of the different teaching modalities will be provided, and their educational impact will be assessed in terms of effect size and the modified version of the Kirkpatrick framework for the evaluation of educational interventions. This systematic review aims to provide evidence as to whether computer-assisted instruction is an effective teaching modality for ECG training. It is hoped that the information garnered from this systematic review will assist in future curricular development and improve ECG training.
Ethics and dissemination
As this research is a systematic review of published literature, ethical approval is not required. The results will be reported according to the Preferred Reporting Items for Systematic Review and Meta-Analysis statement and will be submitted to a peer-reviewed journal. The protocol and systematic review will be included in a PhD dissertation.
PROSPERO registration number
CRD42017067054; Pre-results.
Keywords: electrocardiogram (ecg), computer-assisted instruction, web-based learning, medical student(s), resident(s)
Strengths and limitations of this study
In the face of inadequate ECG competence among graduating medical students and residents worldwide, it is important to review how electrocardiography is taught.
This systematic review will evaluate the effectiveness of computer-assisted instruction compared with other teaching methods used in the ECG training of medical students and residents.
The protocol describes a comprehensive search strategy and eligibility criteria that have no geographical or language restrictions.
The systematic review might be limited by the presence of selection and/or performance bias inherent in some of the selected studies.
A meta-analysis will only be possible in the absence of heterogeneous data among included studies.
Introduction
The electrocardiogram (ECG) remains one of the most frequently performed diagnostic procedures in clinical practice.1 2 ECG interpretation is, therefore, considered an essential learning outcome in undergraduate medical curricula.3 Incorrect interpretation of an ECG, however, can lead to inappropriate clinical decisions with serious adverse outcomes, especially in the realms of arrhythmias and myocardial infarction.4 5 Previous studies have found that the majority of medical students lack confidence when interpreting ECGs, as they find it a difficult skill to master and retain.6–10 Of greater concern is the finding that graduating medical students are often unable to accurately interpret ECGs, particularly when dealing with life-threatening conditions such as complete heart block and atrial fibrillation.7–10 Suboptimal ECG competence has also been shown in residents in cardiology, internal medicine and emergency medicine, all of which are specialties where the ECG is considered a core skill of daily practice.11–16
‘ECG analysis’ refers to the detailed examination of an ECG tracing, which requires the measurement of intervals and the evaluation of the rhythm and each waveform, whereas ‘ECG interpretation’ refers to the conclusion reached after careful ECG analysis, that is, making a diagnosis of an arrhythmia, ischaemia and so on.17 ‘ECG competence’ refers to the ability to accurately analyse and interpret the ECG,7 18 whereas ‘ECG knowledge’ refers to the understanding of ECG concepts, for example, knowing that transmural ischaemia or pericarditis can cause ST-segment elevation.6 19
It is well known that ECG analysis and interpretation are difficult and require significant training.20 The reasons for this are manifold. First, students are required to have sound prior knowledge of the anatomy and physiology of the cardiac conduction system before they can begin to study ECGs.21 ECG analysis also requires a good understanding of vectors and how these are influenced by lead placement and pathology.17 21 Furthermore, ECG interpretation requires two types of reasoning: the non-analytical pattern recognition of abnormal waveforms and rhythms, and the analytical, systematic analysis of the entire 12-lead ECG.22 23 The best clinical results are attained when non-analytical pattern recognition and analytical, systematic analysis of the ECG are used simultaneously; however, most medical students and postgraduate trainees find this overwhelming.22 23
Although a great deal of experience in ECG interpretation depends on clinical exposure,24 clinical exposure alone does not improve ECG diagnostic accuracy unless it is supplemented by a structured form of teaching.25 In undergraduate and postgraduate courses, ECGs are commonly taught by means of large group teaching,2 7 20 25 26 where a teacher or expert transfers ECG knowledge to a group of learners in the format of a lecture.27–29 Lectures are an effective and cost-efficient method of tuition, as they allow for large groups of students to be taught at once.30 31 However, large group teaching facilitates passive learning, as didactic lectures often offer students little opportunity for interactive discussion with the lecturer.27 30–32 ECG teaching also frequently occurs in the small group setting, that is, during ward rounds and bedside tutorials.29 31 Small group teaching allows for free communication and interaction between the learner and the teacher, or among the learners themselves.33
Alternative teaching methods are increasingly being implemented and investigated to improve ECG training, and the following are some examples of these. The ‘flipped classroom’ refers to the teaching method where students are required to watch short video lectures or study written material at their own pace, before attending a classroom lecture.34 35 Instead of didactic tuition, lecture time is devoted to a more interactive discussion between the student and lecturer, which allows for problem solving and knowledge application in the classroom.35 36 ‘Peer teaching’ refers to the teaching method in which students are taught by fellow students of the same academic year, whereas ‘near-peer teaching’ refers to the teaching method in which students are taught by more senior students from the same curriculum.37 ‘Reciprocal peer teaching’ allows for students to alternate between the roles of tutor and learner.38 The tutoring role promotes self-learning by teaching others,37 38 whereas the learner role has been shown to be as effective as instruction by lecturers.38 39 ‘Problem-based learning’ refers to the student-centred teaching method where a clinical problem is assigned to students, who then need to identify what they need to learn from the clinical case and apply their knowledge to solve a clinical problem.40 Apart from the face-to-face tuition by experts or peers, ECG knowledge can also be acquired by means of self-directed learning (SDL), which refers to the independent study of textbooks or other designated study material.41
‘Computer-assisted instruction’ (CAI) has been used as an ECG teaching modality since the 1960s.42 CAI or ‘computer-assisted learning’ (CAL) refers to any teaching method that uses a digital platform as an SDL technique, which includes both online and offline learning opportunities.43 Although CAI is the broadest term as it encompasses both online and offline modalities, newer terminology specifically referring to online learning modalities includes terms such as ‘web-based learning’, ‘web-based training’ and ‘e-learning’.44–47 CAI or web-based learning typically provides the student with text, illustrations and other multimedia material to study. Additional educational features such as practice fields and test-enhanced learning (eg, online multiple choice questions with immediate feedback) can also be provided by the digital platform.43 47–49 CAI is increasingly being used as a possible solution to the growing number of medical students that lecturers need to teach and the insufficient time allocated for ECG instruction in undergraduate and postgraduate curricula.50–53 Web-based learning allows for flexibility in learning, as the student can access the material wherever and whenever convenient, outside the constraints of time allocated for formal instruction.46–48
It is worth reviewing the value of computer-based training in medical education, as the current generation of medical students and residents, who are known as ‘millennials’, are computer literate and often seek technologically enhanced means of education.54–56 These students and residents grew up during the advent of the world wide web, smartphones and social media and are used to obtaining immediate access to unlimited information through mobile devices and desktop computers.56 57 Although today’s medical student prefers podcasts and interactive multimedia to conventional classroom teaching and textbooks,56 there is not enough evidence to suggest that the digital platform should replace traditional teaching methods. Although a meta-analysis showed that web-based learning was as effective as conventional teaching methods in health professionals,58 more recent subject-specific systematic reviews in anatomy and orthopaedics favoured CAI, especially in the setting of blended learning.43 59 However, it cannot be extrapolated that the effectiveness of CAI in other domains holds true for teaching ECG.
The objective measure of a teaching method’s effectiveness is the assessment of students’ competence after being exposed to the educational intervention.43 ECG competence is measured by assessing the student’s ECG analysis and/or interpretation skills. An assessment shortly after an educational intervention tests the acquisition of ECG competence, whereas delayed testing assesses the retention of ECG competence.46 More comprehensively, the modified Kirkpatrick model is a widely accepted method of appraising an educational intervention’s outcome, as it measures learners’ views on the learning experience (level 1), modification of learners’ perception of the intervention (level 2a), modification of knowledge or skills (level 2b), transfer of learning to the workplace (level 3), change in organisational practice (level 4a) and benefits to patients (level 4b).60–63
However, the effectiveness of an instructional method should not be interpreted in isolation, as there are several educational approaches that also have a significant impact on learning. The learning environment (ie, whether instruction occurs in the classroom, computer laboratory or clinical setting)64 and the spacing of instructional events (ie, massed vs distributed instruction)65–67 should be borne in mind when assessing the efficacy of instructional methods. It should also be considered whether provision was made for deliberate practice (eg, paper-based or computer-based ECG analysis)68–70 and whether the instruction included any formative or summative assessment with feedback.71 72
A distinction should be made between the method of instruction (how knowledge is transferred from the expert to the learner) and learning theories (how knowledge is acquired and assimilated by the learner).73 Different learning theories underpin a range of instructional methods.74 Although there is some overlap, learning theories can be categorised as:
instrumental learning theories,74 which include behaviourism (learning through practice, feedback and reinforcement),73–75 cognitivism (learning with demonstrations and explanations, understanding concepts),74–76 constructivism (critical thinking and elaboration)75–78 and experiential learning (learning through experience)64 74 79 80;
humanistic learning theories,74 which include andragogy (adult learning driven by intrinsic instead of extrinsic motivation)74 81 82 and SDL (self-regulated learning, where the learner plans and monitors their own learning)83 84;
the transformative learning theory (critical reflection)74;
social learning theories,74 which include collaborative learning (interaction with peers and tutors)85 86 and contextual learning (with case scenarios or multiple examples with different perspectives).87 88
In the face of inadequate ECG competence among graduating medical students and residents worldwide,7–10 it is time to review the way that ECG analysis and interpretation are taught. Are conventional ECG teaching methods achieving the necessary ECG competence? Are teaching methods on the digital platform better than the ways that ECGs have traditionally been taught? Or should a blended learning strategy (ie, the combination of CAI and other teaching modalities) be implemented for ECG teaching? And which learning theories underpin computer-assisted ECG instruction? To the best of our knowledge, there is no systematic review of the effectiveness of CAI compared with other teaching methods used in the ECG training of medical students and residents.
Objectives
The objectives of this systematic review are to:
establish whether CAI (on its own or in a blended learning setting) achieves better acquisition of ECG competence among medical students and residents than other non-CAI ECG teaching methods do;
establish whether CAI (on its own or in a blended learning setting) achieves better retention of ECG competence among medical students and residents than other non-CAI ECG teaching methods do;
establish whether there is a difference in the effectiveness of computer-assisted ECG instruction between medical students and residents enrolled for specialty training;
identify the types of learning material or activities (eg, reading material, case scenarios, illustrations, videos, test-enhanced learning tools, etc) in which CAI is delivered for ECG teaching, and to establish which CAL material or activities are associated with better outcomes;
identify the educational approaches that are possible with computer-assisted ECG instruction, and to establish which of these are associated with better outcomes;
identify the learning theories that may underpin computer-assisted ECG instruction.
Methods and design
In accordance with the PRISMA-P (Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols) guidelines,89 this systematic review protocol was registered with the International Prospective Register of Systematic Reviews (PROSPERO) on 6 July 2017 with registration number CRD42017067054.
Criteria for considering studies for this review
A study will be deemed eligible to be included in this systematic review only if it fulfils all inclusion criteria and does not meet any of the exclusion criteria, as outlined in table 1.
Table 1.
Criteria to assess a study’s eligibility to be included in this systematic review
| | Inclusion criteria | Exclusion criteria |
| Population | Medical students or residents enrolled for specialty training | |
| Intervention | Computer-assisted instruction (CAI), in either an online or an offline format, used to teach the conventional 12-lead ECG | Teaching modalities not primarily and solely used to teach ECGs, or teaching not directed at the conventional 12-lead ECG |
| Comparator | Any other teaching method to which CAI was compared | |
| Outcome | Educational intervention's effectiveness: quantitative measurement of the acquisition and/or retention of ECG competence | |
| Study | Any comparative research design: randomised controlled trial, cohort study, case–control study, before-and-after study or cross-sectional research | Any non-comparative research design |
Types of studies
All studies with a comparative research design, that is, randomised controlled trial, cohort study, case–control study, before-and-after study or cross-sectional research will be included.
Types of participants
We will include studies in which the participants were medical students or residents enrolled for specialty training (eg, cardiology, internal medicine, emergency medicine, family medicine, paediatrics and anaesthetics). In studies where the participants were not limited to medical students or residents, only data pertaining to the medical students and residents will be extracted.
Types of interventions
Studies must include CAI as an educational intervention, in either an online or an offline format. The comparator educational intervention may be any other teaching method with which CAI was compared. We will exclude studies in which teaching modalities were not primarily and solely used to teach ECGs, or in which the subject of teaching was not the conventional 12-lead ECG.
Types of outcome measures
Results must include quantitative data in which ECG competence was measured. We will include assessments of the acquisition of ECG competence (measured shortly after educational intervention) and/or assessments of the retention of ECG competence (delayed testing after educational intervention).
Language and years of publication
All articles published before July 2017 will be included. Publications in languages other than English will be translated, wherever possible.
Primary outcomes
The primary outcome of this systematic review is to determine whether CAI, on its own or in a blended learning setting, is more effective than non-CAI teaching methods in achieving acquisition and retention of ECG competence among medical students and residents.
ECG competence will be measured by extracting the mean scores and SD of assessments before and after exposure to CAI and non-CAI teaching methods, as well as the P values, CIs and effect sizes (Cohen's d). If Cohen's d is not reported in the study, it will be calculated as the mean difference between the groups exposed to CAI and non-CAI teaching methods, divided by the SD of the group exposed to non-CAI teaching methods90 91:
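Written out, the calculation described above is

\[ d = \frac{\bar{x}_{\text{CAI}} - \bar{x}_{\text{non-CAI}}}{SD_{\text{non-CAI}}} \]

where \(\bar{x}\) denotes the mean assessment score of each group.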
An effect size of greater than 0.8 will be considered of significant practical importance, whereas effect sizes of 0.5 and 0.2 will be considered of moderate and negligible practical importance, respectively.90 91
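As an illustration of how the extracted summary statistics could be converted into effect sizes and classified during data synthesis, the following sketch implements the calculation and thresholds described above; the function names, variable names and numbers are illustrative only and are not part of the protocol.

```python
def cohens_d(mean_cai, mean_comparator, sd_comparator):
    """Cohen's d as described above: the mean difference between the CAI and
    non-CAI groups divided by the SD of the non-CAI (comparator) group."""
    return (mean_cai - mean_comparator) / sd_comparator


def practical_importance(d):
    """Classify the effect size using the thresholds stated in the protocol
    (>0.8 significant, ~0.5 moderate, ~0.2 negligible); the exact category
    boundaries used here are an assumption."""
    d = abs(d)
    if d > 0.8:
        return "significant practical importance"
    if d >= 0.5:
        return "moderate practical importance"
    return "negligible practical importance"


# Hypothetical post-test scores from a single included study
d = cohens_d(mean_cai=78.2, mean_comparator=71.5, sd_comparator=9.4)
print(round(d, 2), "-", practical_importance(d))  # 0.71 - moderate practical importance
```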
The effect of the different ECG teaching modalities will also be scored according to a modified version of the Kirkpatrick framework for the evaluation of educational interventions, as shown in table 2.60–63
Table 2.
The modified Kirkpatrick framework for the evaluation of educational interventions
| Level 1 | Participants' reactions |
| Level 2a | Modification of attitudes and perceptions |
| Level 2b | Acquisition of knowledge and skills |
| Level 3 | Change in behaviour |
| Level 4a | Change in organisational practice |
| Level 4b | Benefits to patients or students |
Secondary outcomes
The secondary outcomes of this study are to:
determine whether there is a difference in the effectiveness of computer-assisted ECG instruction between medical students and residents enrolled for specialty training;
identify the types of learning material or activities that are possible with computer-assisted ECG instruction (eg, annotated ECGs, text, illustrations, videos, case scenarios, worked examples, deliberate practice tools) and to establish which CAI learning material or activities were associated with better outcomes;
identify the educational approaches (combined or implemented separately) that are possible with computer-assisted ECG instruction and to establish whether these are more successful when used with CAI, conventional teaching methods or in a blended learning setting;
identify the learning theories that underpin the methods of ECG instruction, that is, CAI and other methods used for ECG teaching.
Search methods for identification of studies
The lead reviewer (CAV) and an expert librarian (MS) from the University of Cape Town’s Faculty of Health Sciences will conduct an extensive search for peer-reviewed articles.
Electronic searches
The following electronic databases will be searched for this systematic review: PubMed, Scopus, Web of Science, Academic Search Premier, CINAHL, PsycINFO, Education Resources Information Center, Africa-Wide Information, Teacher Reference Center and Google Scholar. A combination of Medical Subject Heading terms and free-text terms will be used to search for articles. Table 3 shows the main search strategy that we will use.
Table 3.
PubMed search strategy, modified as needed for other electronic databases
Population: medical students/residents enrolled for specialty training | ||
1 | MeSH terms: | Education, Medical [MeSH] OR Students, Medical [MeSH] |
2 | Free text: | fellow OR fellowship OR graduate OR medical student OR postgraduate OR residency OR resident OR registrar OR registrarship OR specialty OR specialties OR undergraduate |
3 | 1 OR 2 | |
Intervention: computer-assisted instruction | ||
4 | MeSH terms: | Computer-assisted Instruction [MeSH] OR Computer Simulation [MeSH] OR Educational Technology [MeSH] OR Internet [MeSH] |
5 | Free text: | app OR application OR ‘blended learning’ OR computer OR computer-assisted OR digital OR e-learning OR e-modules OR ‘flipped classroom’ OR Internet OR multimedia OR online OR software OR technology OR virtual OR web OR web-aided OR web-assisted OR web-based OR web-supported OR web-enhanced OR webCT OR web 2.0 OR YouTube |
6 | 4 OR 5 | |
Comparator: any other teaching method used | ||
7 | MeSH terms: | Cardiology/Education [MeSH] OR Education/Methods [MeSH] OR Electrocardiography/Education [MeSH] OR Models, Educational [MeSH] OR Problem-based Learning [MeSH] OR Teaching/Methods [MeSH] OR Teaching Rounds [MeSH] |
8 | Free text: | activity OR activities OR bedside OR blackboard OR class OR classroom OR clinical OR competency-based OR conventional OR course OR didactic OR educational method OR educational techniques OR instruction OR instructional method OR instructional techniques OR interactive OR ‘large group’ OR lecture OR lecture-based OR near peer OR outcome-based OR PBL OR pedagogy OR pedagogical OR peer facilitated OR peer led OR peer teaching OR peer tutorial OR peer tutoring OR problem-based OR rounds OR self-directed OR self-instruction OR self-study OR seminar OR simulation OR simulator OR ‘small group’ OR teaching method OR teaching techniques OR test-enhanced learning OR traditional OR training OR tutorial OR tutoring OR ward OR ‘worked example’ OR workshop |
9 | 7 OR 8 | |
Outcome: efficacy in acquiring ECG knowledge or skills | ||
10 | MeSH terms: | Electrocardiography [MeSH] |
11 | Free text: | ECG OR EKG OR electrocardiogram OR electrocardiographic OR electrocardiography |
12 | 10 OR 11 | |
13 | MeSH terms: | Clinical Competence [MeSH] OR Cognition [MeSH] OR Learning [MeSH] |
14 | Free text: | accuracy OR analysis OR assessment OR cognition OR cognitive OR competence OR competency OR comprehension OR diagnosis OR diagnostic OR effectiveness OR efficacy OR examination OR interpretation OR insight OR knowledge OR learning OR measurement OR memory OR participation OR performance OR practice OR problem-solving OR proficiency OR reasoning OR recall OR reinforcement OR retention OR score OR self-assessment OR self-efficacy OR skills OR test OR understanding |
15 | 13 OR 14 | |
16 | 12 AND 15 | |
17 | 3 AND 6 AND 9 AND 16 |
MeSH, Medical Subject Heading.
Searching other sources
Citation indexes and reference lists of all articles found through the database search will be reviewed for any articles that were not identified during the database search. A grey literature search will also be conducted.
Data collection and analysis
The screening process and study selection will be done according to the guidelines of the Cochrane Handbook for Systematic Reviews of Interventions.92
Selection of studies
Two reviewers (CAV and RSM) will independently screen all articles identified by the search. The reviewers will complete a standardised coding sheet that will indicate whether an article meets all the inclusion criteria or what the reason for exclusion is.
Duplicate publications of articles will be removed. Where duplicate publications report the same data, the more recent publication with the most complete dataset will be used.
The screening process will occur in two phases:
Phase 1: screening of title and abstract
All titles and abstracts of articles identified in the search will be screened for eligibility. If it is not apparent from the title or abstract whether an article meets the eligibility criteria, or if the article is not excluded by both reviewers (CAV and RSM), the full text of the article will be reviewed.
Phase 2: screening of full-text article
The full text of all potentially eligible articles will be reviewed. A kappa coefficient will be calculated to measure the consistency between the reviewers (CAV and RSM).30 Any discrepancies between the reviewers will be discussed with a third reviewer (VB), who will act as an adjudicator. Reasons for exclusion will be documented and presented in a table of excluded studies.
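As a minimal sketch of how the kappa coefficient could be computed from the two reviewers' include/exclude decisions (the screening data shown are hypothetical):

```python
from collections import Counter


def cohens_kappa(reviewer1, reviewer2):
    """Chance-corrected agreement between two reviewers' screening decisions."""
    assert len(reviewer1) == len(reviewer2)
    n = len(reviewer1)
    observed = sum(a == b for a, b in zip(reviewer1, reviewer2)) / n
    # Agreement expected by chance, from each reviewer's marginal proportions
    c1, c2 = Counter(reviewer1), Counter(reviewer2)
    expected = sum(c1[cat] * c2[cat] for cat in set(c1) | set(c2)) / n ** 2
    return (observed - expected) / (1 - expected)


# Hypothetical screening decisions for six articles
r1 = ["include", "exclude", "include", "exclude", "exclude", "include"]
r2 = ["include", "exclude", "exclude", "exclude", "exclude", "include"]
print(round(cohens_kappa(r1, r2), 2))  # 0.67
```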
Data extraction and management
References will be managed using EndNote V.X8 software (Clarivate Analytics).93 Two reviewers (CAV and RSM) will independently extract data from all articles meeting the eligibility criteria. The reviewers will use a standardised electronic data collection form on Research Electronic Data Capture (REDCap),94 a secure online database manager hosted at the University of Cape Town. Collected data will be exported from the REDCap database to Stata V.14.2 (StataCorp) for statistical analysis.
Data extraction will include, but will not be limited to:
citation information
study design
total study duration
study population
ECGs used during teaching
teaching methods (CAI vs other teaching methods)
digital learning material
educational approaches in study
learning theories underpinning instructional methods
ECG competencies measured
testing times
results
validity and reliability of results
psychometric properties of the assessment tools (eg, Cronbach’s α coefficient).95
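For reference, Cronbach's α for an assessment consisting of k items takes the standard form

\[ \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right) \]

where \(\sigma^{2}_{Y_i}\) is the variance of the scores on item i and \(\sigma^{2}_{X}\) is the variance of the total test scores.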
A more detailed data extraction set is included in the online supplementary material.
bmjopen-2017-018811supp001.docx (114.7KB, docx)
Quality assessment
The Medical Education Research Study Quality Instrument (MERSQI) will be used to assess the quality of studies in this systematic review.96 Designed to evaluate the quality of experimental, quasi-experimental and observational studies, the MERSQI is a validated quality assessment tool in medical education.97
Assessment of risk of bias
Two reviewers (CAV and RSM) will independently assess each included study for risk of bias92:
Selection bias, that is, different baseline characteristics among the different groups.
Performance bias, that is, different exposure to factors other than intervention that may have influenced outcome among different groups.
Attrition bias, that is, differences between groups in withdrawal of participants.
Detection bias, that is, differences between groups in how outcomes are determined.
Reporting bias, that is, differences in outcome reporting.
Measures of effectiveness of educational intervention
The practical significance of the educational interventions will be determined by reviewing their effect sizes. The effectiveness of the ECG teaching modalities used in the articles will also be scored according to a modified version of the Kirkpatrick framework for the evaluation of educational interventions, which is the internationally preferred framework for evaluating educational interventions.60–62 The framework comprises four levels, with levels 2 and 4 each subdivided, as shown in table 2.
Dealing with missing data
Corresponding authors will be contacted in the event of absent or incomplete evidence in the included studies. A delay of 6 weeks will be allowed to receive a response following two email attempts.
Data synthesis
Systematic review
We will provide a descriptive analysis of CAI and the comparator teaching modalities used for teaching ECGs. The educational impact of the different teaching modalities used for ECG training (CAI and other methods) will be evaluated using the modified version of the Kirkpatrick framework for the evaluation of educational interventions, as shown in table 2.60–62
Meta-analysis
Heterogeneity of the data will be tested by means of the I2 and χ2 tests, as well as by visual inspection of the forest plot. Where found, the possible reasons for any heterogeneity will be explored, and if unexplainable, findings will be reported in a narrative review. In the absence of heterogeneity, the effects of different teaching modalities will be quantitatively analysed. The relative risk and/or the OR will be used to determine the strength of effects among dichotomous variables, and weighted mean difference will be calculated for continuous variables. The statistical significance will be evaluated through inspection of the 95% CIs.
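Assuming that a standardised effect estimate and its variance can be extracted for each included study, a minimal sketch of the heterogeneity statistics referred to above (Cochran's Q, the basis of the χ2 test, and I2) is given below; the study values shown are hypothetical.

```python
import numpy as np


def cochran_q_and_i2(effects, variances):
    """Cochran's Q and the I^2 statistic for between-study heterogeneity,
    using fixed-effect inverse-variance weights."""
    effects = np.asarray(effects, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    pooled = np.sum(weights * effects) / np.sum(weights)  # pooled fixed-effect estimate
    q = np.sum(weights * (effects - pooled) ** 2)         # Cochran's Q (chi-squared, df = k - 1)
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # I^2 as a percentage
    return q, i2


# Hypothetical standardised mean differences and their variances from four studies
q, i2 = cochran_q_and_i2([0.62, 0.45, 0.90, 0.30], [0.04, 0.05, 0.06, 0.05])
print(f"Q = {q:.2f}, I^2 = {i2:.1f}%")  # Q = 3.61, I^2 = 16.9%
```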
We will do subanalyses of the efficacy of CAI and conventional teaching methods on medical students versus residents. In addition, we will consider subanalyses of studies in terms of teaching methods (ie, CAI, non-CAI and blended learning), different learning material or activities used by CAI (where sufficient data exist), different educational approaches and different learning theories underpinning CAI and other teaching methods.
Mapping review
A mapping review will be done to characterise the quality, quantity and focus of current medical education literature on CAI of ECGs.
Sensitivity analysis
A sensitivity analysis will be undertaken to evaluate the effect of the risk of bias score on the overall result.92 Should any further arbitrary or unclear characteristics arise from the data extraction, a sensitivity analysis will also be applied.
Presenting and reporting of results
Results will be discussed in the text and summarised in table format, an example of which is given in table 4.
Table 4.
Results will be summarised in table format
| Study | Study design | Participants | Computer-assisted instruction (CAI) | Comparator teaching methods (not using CAI) |
| Study | ECG knowledge that was tested | Baseline ECG knowledge | Acquired ECG knowledge | Retention of ECG knowledge |
| Study | Modified Kirkpatrick model60–63 | Quality assessment (MERSQI) | Risk of bias | Significance of study results |
MERSQI, Medical Education Research Study Quality Instrument.
Discussion
Expected significance of the study
This systematic review aims to explore the pedagogical value of CAI compared with other instructional methods used in the teaching of ECGs. The findings of this systematic review will be important in the review of medical curricula. If gaps are identified in the literature, this will inform future research in the field of ECG teaching. The goal is to provide evidence of best teaching practices, as patient care will ultimately benefit from improved ECG competence among graduating medical students and residents.
Ethics and dissemination
This research does not require ethical approval, as the study is a systematic review of published literature. Any change to the current protocol will be considered a protocol amendment and will be communicated to the journal, along with a motivation and justification for the amendment. The status of the systematic review will be updated regularly in PROSPERO. We aim to submit the results of this systematic review to a peer-reviewed journal. The protocol and systematic review will be included in a PhD dissertation.
Acknowledgments
The authors wish to thank Dr Nicholas Simpson, Dr Mahmoud Werfalli and Ms Kathryn Manning from the University of Cape Town for their valuable support and input during the preparation of this protocol.
Footnotes
Contributors: CAV is a PhD student. RSM and VB are his supervisors. CAV conceived of the review and undertook the drafting of the manuscript. CAV and MS undertook a scoping search and developed the search strategy. CAV, RSM and VB will be involved in data acquisition. CAV and MEE will analyse the data and participate in the interpretation of the results. All authors have read the manuscript and have given their approval for publication.
Competing interests: RSM is a lecturer and host of the AO Memorial Advanced ECG and Arrhythmia Course, and receives an honorarium from Medtronic Africa.
Provenance and peer review: Not commissioned; externally peer reviewed.
References
1. Okreglicki A, Scott Millar R. ECG: PQRST morphology – clues and tips. A guide to practical pattern recognition. SA Heart Journal 2006;3:27–36.
2. Auseon AJ, Schaal SF, Kolibash AJ, et al. Methods of teaching and evaluating electrocardiogram interpretation skills among cardiology fellowship programs in the United States. J Electrocardiol 2009;42:339–44. doi:10.1016/j.jelectrocard.2009.01.004
3. Jablonover RS, Stagnaro-Green A. ECG as an entrustable professional activity: CDIM survey results, ECG teaching and assessment in the third year. Am J Med 2016;129:226–30. doi:10.1016/j.amjmed.2015.10.034
4. Bogun F, Anh D, Kalahasty G, et al. Misdiagnosis of atrial fibrillation and its clinical consequences. Am J Med 2004;117:636–42. doi:10.1016/j.amjmed.2004.06.024
5. Masoudi FA, Magid DJ, Vinson DR, et al. Implications of the failure to identify high-risk electrocardiogram findings for the quality of care of patients with acute myocardial infarction: results of the Emergency Department Quality in Myocardial Infarction (EDQMI) study. Circulation 2006;114:1565–71. doi:10.1161/CIRCULATIONAHA.106.623652
6. Kopeć G, Magoń W, Hołda M, et al. Competency in ECG interpretation among medical students. Med Sci Monit 2015;21:3386–94. doi:10.12659/MSM.895129
7. Jablonover RS, Lundberg E, Zhang Y, et al. Competency in electrocardiogram interpretation among graduating medical students. Teach Learn Med 2014;26:279–84. doi:10.1080/10401334.2014.918882
8. McAloon C, Leach H, Gill S, et al. Improving ECG competence in medical trainees in a UK district general hospital. Cardiol Res 2014;5:51–7. doi:10.14740/cr333e
9. Lever NA, Larsen PD, Dawes M, et al. Are our medical graduates in New Zealand safe and accurate in ECG interpretation? N Z Med J 2009;122:9–15.
10. Little B, Mainie I, Ho KJ, et al. Electrocardiogram and rhythm strip interpretation by final year medical students. Ulster Med J 2001;70:108–10.
11. Salerno SM, Alguire PC, Waxman HS. Competency in interpretation of 12-lead electrocardiograms: a summary and appraisal of published evidence. Ann Intern Med 2003;138:751–60. doi:10.7326/0003-4819-138-9-200305060-00013
12. Sibbald M, Davies EG, Dorian P, et al. Electrocardiographic interpretation skills of cardiology residents: are they competent? Can J Cardiol 2014;30:1721–4. doi:10.1016/j.cjca.2014.08.026
13. Eslava D, Dhillon S, Berger J, et al. Interpretation of electrocardiograms by first-year residents: the need for change. J Electrocardiol 2009;42:693–7. doi:10.1016/j.jelectrocard.2009.07.020
14. Berger JS, Eisen L, Nozad V, et al. Competency in electrocardiogram interpretation among internal medicine and emergency medicine residents. Am J Med 2005;118:873–80. doi:10.1016/j.amjmed.2004.12.004
15. de Jager J, Wallis L, Maritz D. ECG interpretation skills of South African emergency medicine residents. Int J Emerg Med 2010;3:309–14. doi:10.1007/s12245-010-0227-3
16. Novotny T, Bond RR, Andrsova I, et al. Data analysis of diagnostic accuracies in 12-lead electrocardiogram interpretation by junior medical fellows. J Electrocardiol 2015;48:988–94. doi:10.1016/j.jelectrocard.2015.08.023
17. Hurst JW. Methods used to interpret the 12-lead electrocardiogram: pattern memorization versus the use of vector concepts. Clin Cardiol 2000;23:4–13. doi:10.1002/clc.4960230103
18. Kadish AH, Buxton AE, Kennedy HL, et al. ACC/AHA clinical competence statement on electrocardiography and ambulatory electrocardiography: a report of the ACC/AHA/ACP-ASIM task force on clinical competence (ACC/AHA Committee to develop a clinical competence statement on electrocardiography and ambulatory electrocardiography) endorsed by the International Society for Holter and Noninvasive Electrocardiology. Circulation 2001;104:3169–78.
19. Burke JF, Gnall E, Umrudden Z, et al. Critical analysis of a computer-assisted tutorial on ECG interpretation and its ability to determine competency. Med Teach 2008;30:e41–8. doi:10.1080/01421590801972471
20. Hurst JW. Current status of clinical electrocardiography with suggestions for the improvement of the interpretive process. Am J Cardiol 2003;92:1072–9. doi:10.1016/j.amjcard.2003.07.006
21. Dong R, Yang X, Xing B, et al. Use of concept maps to promote electrocardiogram diagnosis learning in undergraduate medical students. Int J Clin Exp Med 2015;8:7794–801.
22. Eva KW, Hatala RM, Leblanc VR, et al. Teaching from the clinical reasoning literature: combined reasoning strategies help novice diagnosticians overcome misleading information. Med Educ 2007;41:1152–8. doi:10.1111/j.1365-2923.2007.02923.x
23. Ark TK, Brooks LR, Eva KW. Giving learners the best of both worlds: do clinical teachers need to guard against teaching pattern recognition to novices? Acad Med 2006;81:405–9. doi:10.1097/00001888-200604000-00017
24. Alinier G, Gordon R, Harwood C, et al. 12-lead ECG training: the way forward. Nurse Educ Today 2006;26:87–92. doi:10.1016/j.nedt.2005.08.004
25. Devitt P, Worthley S, Palmer E, et al. Evaluation of a computer based package on electrocardiography. Aust N Z J Med 1998;28:432–5. doi:10.1111/j.1445-5994.1998.tb02076.x
26. Barthelemy FX, Segard J, Fradin P, et al. ECG interpretation in emergency department residents: an update and e-learning as a resource to improve skills. Eur J Emerg Med 2017;24:149–56. doi:10.1097/MEJ.0000000000000312
27. Cantillon P. Teaching large groups. BMJ 2003;326:437. doi:10.1136/bmj.326.7386.437
28. Raupach T, Brown J, Anders S, et al. Summative assessments are more powerful drivers of student learning than resource intensive teaching formats. BMC Med 2013;11:1–10. doi:10.1186/1741-7015-11-61
29. O’Brien KE, Cannarozzi ML, Torre DM, et al. Training and assessment of ECG interpretation skills: results from the 2005 CDIM survey. Teach Learn Med 2009;21:111–5. doi:10.1080/10401330902791255
30. Luscombe C, Montgomery J. Exploring medical student learning in the large group teaching environment: examining current practice to inform curricular development. BMC Med Educ 2016;16:184. doi:10.1186/s12909-016-0698-x
31. Fent G, Gosai J, Purva M. Teaching the interpretation of electrocardiograms: which method is best? J Electrocardiol 2015;48:190–3. doi:10.1016/j.jelectrocard.2014.12.014
32. Moffett J, Berezowski J, Spencer D, et al. An investigation into the factors that encourage learner participation in a large group medical classroom. Adv Med Educ Pract 2014;5:65–71. doi:10.2147/AMEP.S55323
33. Walton H. Small group methods in medical teaching. Med Educ 1997;31:459–64. doi:10.1046/j.1365-2923.1997.00703.x
34. Gillispie V. Using the flipped classroom to bridge the gap to generation Y. Ochsner J 2016;16:32–6.
35. Chen F, Lui AM, Martinelli SM. A systematic review of the effectiveness of flipped classrooms in medical education. Med Educ 2017;51:585–97. doi:10.1111/medu.13272
36. Rui Z, Lian-Rui X, Rong-Zheng Y, et al. Friend or foe? Flipped classroom for undergraduate electrocardiogram learning: a randomized controlled study. BMC Med Educ 2017;17:53. doi:10.1186/s12909-017-0881-8
37. Ten Cate O, Durning S. Dimensions and psychology of peer teaching in medical education. Med Teach 2007;29:546–52. doi:10.1080/01421590701583816
38. Manyama M, Stafford R, Mazyala E, et al. Improving gross anatomy learning using reciprocal peer teaching. BMC Med Educ 2016;16:95. doi:10.1186/s12909-016-0617-1
39. Gregory A, Walker I, McLaughlin K, et al. Both preparing to teach and teaching positively impact learning outcomes for peer teachers. Med Teach 2011;33:e417–e422. doi:10.3109/0142159X.2011.586747
40. Hwang SY, Kim MJ. A comparison of problem-based learning and lecture-based learning in an adult health nursing course. Nurse Educ Today 2006;26:315–21. doi:10.1016/j.nedt.2005.11.002
41. Mahler SA, Wolcott CJ, Swoboda TK, et al. Techniques for teaching electrocardiogram interpretation: self-directed learning is less effective than a workshop or lecture. Med Educ 2011;45:347–53. doi:10.1111/j.1365-2923.2010.03891.x
42. Fincher RM, Abdulla AM, Sridharan MR, et al. Teaching fundamental electrocardiography to medical students: computer-assisted learning compared with weekly seminars. Res Med Educ 1987;26:197–202.
43. Losco CD, Grant WD, Armson A, et al. Effective methods of teaching and learning in anatomy as a basic science: a BEME systematic review: BEME guide no. 44. Med Teach 2017;39:234–43. doi:10.1080/0142159X.2016.1271944
44. Pourmand A, Tanski M, Davis S, et al. Educational technology improves ECG interpretation of acute myocardial infarction among medical students and emergency medicine residents. West J Emerg Med 2015;16:133–7. doi:10.5811/westjem.2014.12.23706
45. Montassier E, Hardouin JB, Segard J, et al. e-Learning versus lecture-based courses in ECG interpretation for undergraduate medical students: a randomized noninferiority study. Eur J Emerg Med 2016;23:108–13. doi:10.1097/MEJ.0000000000000215
46. Rolskov Bojsen S, Räder SB, Holst AG, et al. The acquisition and retention of ECG interpretation skills after a standardized web-based ECG tutorial – a randomised study. BMC Med Educ 2015;15:36. doi:10.1186/s12909-015-0319-0
47. Nilsson M, Bolinder G, Held C, et al. Evaluation of a web-based ECG-interpretation programme for undergraduate medical students. BMC Med Educ 2008;8:25. doi:10.1186/1472-6920-8-25
48. Cook DA. Web-based learning: pros, cons and controversies. Clin Med 2007;7:37–42. doi:10.7861/clinmedicine.7-1-37
49. Garde S, Heid J, Haag M, et al. Can design principles of traditional learning theories be fulfilled by computer-based training systems in medicine: the example of CAMPUS. Int J Med Inform 2007;76:124–9. doi:10.1016/j.ijmedinf.2006.07.009
50. Hurst JW. The interpretation of electrocardiograms: pretense or a well-developed skill? Cardiol Clin 2006;24:305–7. doi:10.1016/j.ccl.2006.03.001
51. Raupach T, Harendza S, Anders S, et al. How can we improve teaching of ECG interpretation skills? Findings from a prospective randomised trial. J Electrocardiol 2016;49:7–12. doi:10.1016/j.jelectrocard.2015.10.004
52. Nilsson M, Östergren J, Fors U, et al. Does individual learning styles influence the choice to use a web-based ECG learning programme in a blended learning setting? BMC Med Educ 2012;12:5. doi:10.1186/1472-6920-12-5
53. Paul B, Baranchuk A. Electrocardiography teaching in Canadian family medicine residency programs: a national survey. Fam Med 2011;43:267–71.
54. Chan T, Sennik S, Zaki A, et al. Studying with the cloud: the use of online web-based resources to augment a traditional study group format. CJEM 2015;17:192–5. doi:10.2310/8000.2014.141425
55. Roberts DH, Newman LR, Schwartzstein RM. Twelve tips for facilitating Millennials’ learning. Med Teach 2012;34:274–8. doi:10.3109/0142159X.2011.613498
56. Hopkins L, Hampton BS, Abbott JF, et al. To the point: medical education, technology, and the millennial learner. Am J Obstet Gynecol 2017. doi:10.1016/j.ajog.2017.06.001
57. Desy JR, Reed DA, Wolanskyj AP. Milestones and Millennials: a perfect pairing – competency-based medical education and the learning preferences of generation Y. Mayo Clin Proc 2017;92:243–50. doi:10.1016/j.mayocp.2016.10.026
58. Cook DA, Levinson AJ, Garside S, et al. Internet-based learning in the health professions: a meta-analysis. JAMA 2008;300:1181–96. doi:10.1001/jama.300.10.1181
59. Tarpada SP, Morris MT, Burton DA. E-learning in orthopedic surgery training: a systematic review. J Orthop 2016;13:425–30. doi:10.1016/j.jor.2016.09.004
60. Hammick M, Dornan T, Steinert Y. Conducting a best evidence systematic review. Part 1: from idea to data coding. BEME Guide No. 13. Med Teach 2010;32:3–15. doi:10.3109/01421590903414245
61. Curran VR, Fleet L. A review of evaluation outcomes of web-based continuing medical education. Med Educ 2005;39:561–7. doi:10.1111/j.1365-2929.2005.02173.x
62. Harden RM, Grant J, Buckley G, et al. BEME guide no. 1: best evidence medical education. Med Teach 1999;21:553–62. doi:10.1080/01421599978960
63. Lameris AL, Hoenderop JG, Bindels RJ, et al. The impact of formative testing on study behaviour and study performance of (bio)medical students: a smartphone application intervention study. BMC Med Educ 2015;15:72. doi:10.1186/s12909-015-0351-0
64. Spencer J. ABC of learning and teaching in medicine: learning and teaching in the clinical environment. BMJ 2003;326:591–4. doi:10.1136/bmj.326.7389.591
65. Custers EJ. Long-term retention of basic science knowledge: a review study. Adv Health Sci Educ Theory Pract 2010;15:109–28. doi:10.1007/s10459-008-9101-y
66. Son LK, Simon DA. Distributed learning: data, metacognition, and educational implications. Educ Psychol Rev 2012;24:379–99. doi:10.1007/s10648-012-9206-y
67. Carpenter SK, Cepeda NJ, Rohrer D, et al. Using spacing to enhance diverse forms of learning: review of recent research and implications for instruction. Educ Psychol Rev 2012;24:369–78. doi:10.1007/s10648-012-9205-z
68. Hatala RM, Brooks LR, Norman GR. Practice makes perfect: the critical role of mixed practice in the acquisition of ECG interpretation skills. Adv Health Sci Educ Theory Pract 2003;8:17–26. doi:10.1023/A:1022687404380
69. Moulaert V, Verwijnen MG, Rikers R, et al. The effects of deliberate practice in undergraduate medical education. Med Educ 2004;38:1044–52. doi:10.1111/j.1365-2929.2004.01954.x
70. McGaghie WC, Issenberg SB, Cohen ER, et al. Medical education featuring mastery learning with deliberate practice can lead to better health for individuals and populations. Acad Med 2011;86:e8–9. doi:10.1097/ACM.0b013e3182308d37
71. Burch VC, Seggie JL, Gary NE. Formative assessment promotes learning in undergraduate clinical clerkships. S Afr Med J 2006;96:430–3.
72. Raupach T, Hanneforth N, Anders S, et al. Impact of teaching and assessment format on electrocardiogram interpretation skills. Med Educ 2010;44:731–40. doi:10.1111/j.1365-2923.2010.03687.x
73. Hung D. Theories of learning and computer-mediated instructional technologies. EMI Educ Media Int 2001;38:281–7. doi:10.1080/09523980110105114
74. Taylor DC, Hamdy H. Adult learning theories: implications for learning and teaching in medical education: AMEE Guide No. 83. Med Teach 2013;35:e1561–72. doi:10.3109/0142159X.2013.828153
75. Ertmer PA, Newby TJ. Behaviorism, cognitivism, constructivism: comparing critical features from an instructional design perspective. Performance Improvement Quarterly 1993;6:50–72. doi:10.1111/j.1937-8327.1993.tb00605.x
76. Torre DM, Daley BJ, Sebastian JL, et al. Overview of current learning theories for medical educators. Am J Med 2006;119:903–7. doi:10.1016/j.amjmed.2006.06.037
77. West DC, Pomeroy JR, Park JK, et al. Critical thinking in graduate medical education: a role for concept mapping assessment? JAMA 2000;284:1105–10.
78. Dolmans DH, De Grave W, Wolfhagen IH, et al. Problem-based learning: future challenges for educational practice and research. Med Educ 2005;39:732–41. doi:10.1111/j.1365-2929.2005.02205.x
79. Yardley S, Teunissen PW, Dornan T. Experiential learning: transforming theory into practice. Med Teach 2012;34:161–4. doi:10.3109/0142159X.2012.643264
80. Maudsley G, Strivens J. Promoting professional knowledge, experiential learning and critical thinking for medical students. Med Educ 2000;34:535–44. doi:10.1046/j.1365-2923.2000.00632.x
81. Kaufman DM. ABC of learning and teaching in medicine: applying educational theory in practice. BMJ 2003;326:213–6. doi:10.1136/bmj.326.7382.213
82. Ryan RM, Deci EL. Intrinsic and extrinsic motivations: classic definitions and new directions. Contemp Educ Psychol 2000;25:54–67. doi:10.1006/ceps.1999.1020
83. Young JQ, Van Merrienboer J, Durning S, et al. Cognitive Load Theory: implications for medical education: AMEE Guide No. 86. Med Teach 2014;36:371–84. doi:10.3109/0142159X.2014.889290
84. Neufeld VR, Barrows HS. The ‘McMaster Philosophy’: an approach to medical education. J Med Educ 1974;49:1040–50.
85. Warschauer M. Computer-mediated collaborative learning: theory and practice. The Modern Language Journal 1997;81:470–81. doi:10.1111/j.1540-4781.1997.tb05514.x
86. Ruiz JG, Mintzer MJ, Leipzig RM. The impact of E-learning in medical education. Acad Med 2006;81:207–12. doi:10.1097/00001888-200603000-00002
87. Hatala R, Norman GR, Brooks LR. Impact of a clinical scenario on accuracy of electrocardiogram interpretation. J Gen Intern Med 1999;14:126–9. doi:10.1046/j.1525-1497.1999.00298.x
88. Bergman EM, Sieben JM, Smailbegovic I, et al. Constructive, collaborative, contextual, and self-directed learning in surface anatomy education. Anat Sci Educ 2013;6:114–24. doi:10.1002/ase.1306
89. Shamseer L, Moher D, Clarke M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ 2015;349:g7647. doi:10.1136/bmj.g7647
90. Hojat M, Xu G. A visitor’s guide to effect sizes: statistical significance versus practical (clinical) importance of research findings. Adv Health Sci Educ Theory Pract 2004;9:241–9. doi:10.1023/B:AHSE.0000038173.00909.f6
91. Sullivan GM, Feinn R. Using effect size – or why the P value is not enough. J Grad Med Educ 2012;4:279–82. doi:10.4300/JGME-D-12-00156.1
92. Higgins JP, Green S. Cochrane handbook for systematic reviews of interventions. USA: John Wiley & Sons, 2011.
93. Peters MD. Managing and coding references for systematic reviews and scoping reviews in EndNote. Med Ref Serv Q 2017;36:19–31. doi:10.1080/02763869.2017.1259891
94. Harris PA, Taylor R, Thielke R, et al. Research electronic data capture (REDCap) – a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009;42:377–81. doi:10.1016/j.jbi.2008.08.010
95. Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika 1951;16:297–334. doi:10.1007/BF02310555
96. Reed DA, Cook DA, Beckman TJ, et al. Association between funding and quality of published medical education research. JAMA 2007;298:1002–9. doi:10.1001/jama.298.9.1002
97. Sharma R, Gordon M, Dharamsi S, et al. Systematic reviews in medical education: a practical approach: AMEE guide 94. Med Teach 2015;37:108–24. doi:10.3109/0142159X.2014.970996