European Heart Journal. Digital Health. 2021 Feb 22;2(2):202–214. doi: 10.1093/ehjdh/ztab027

Utility of mobile learning in Electrocardiography

Charle André Viljoen 1,2,3, Rob Scott Millar 1,2, Julian Hoevelmann 3,4, Elani Muller 3, Lina Hähnle 3, Kathryn Manning 2, Jonathan Naude 2, Karen Sliwa 3, Vanessa Celeste Burch 2
PMCID: PMC9707875  PMID: 36712390

Abstract

Aims

Mobile learning refers to the acquisition of knowledge through information accessed on a mobile device. Although increasingly implemented in medical education, research on its utility in Electrocardiography remains sparse. In this study, we explored the effect of mobile learning on the accuracy of electrocardiogram (ECG) analysis and interpretation.

Methods and results

The study comprised 181 participants (77 fourth- and 69 sixth-year medical students, and 35 residents). Participants were randomized to analyse ECGs with a mobile learning strategy [either searching the Internet or using an ECG reference application (app)] or not. For each ECG, they provided their initial diagnosis, key supporting features, and final diagnosis consecutively. Two weeks later, they analysed the same ECGs, without access to any mobile device. ECG interpretation was more accurate when participants used the ECG app (56.0%), as compared to searching the Internet (50.3%) or neither (43.5%, P = 0.001). Importantly, mobile learning supported participants in revising their initial incorrect ECG diagnosis (ECG app 18.7%, Internet search 13.6%, no mobile device 8.4%, P < 0.001). However, whilst this was true for students, there was no significant difference amongst residents. Internet searches were only useful if participants identified the correct ECG features. The app was beneficial when participants searched by ECG features, but not by diagnosis. Using the ECG reference app required less time than searching the Internet (7:44 ± 4:13 vs. 9:14 ± 4:34 min:s, P < 0.001). Mobile learning gains were not sustained after 2 weeks.

Conclusion

Whilst mobile learning contributed to increased ECG diagnostic accuracy, the benefits were not sustained over time.

Keywords: App, Electrocardiography, ECG, Internet, Mobile learning, Medical education

Graphical Abstract

Introduction

The current generation of medical students and residents, commonly referred to as ‘millennials’, often seek technologically enhanced means of education.1–3 Mobile learning, an educational method in which knowledge is acquired through information accessed on a mobile device,4,5 is increasingly being deployed in medical education.6 Medical students and healthcare professionals no longer carry bags of heavy, potentially outdated textbooks around the hospital. Instead, they now have access to the same educational resources, and much more, on their personal handheld mobile devices, be it a smartphone or tablet.7 Mobile technology allows for the delivery of educational content that is easy to access,8 up to date,9 and often enriched with graphics and multimedia.10 In this regard, information is accessed through applications or ‘apps’ that are downloaded onto the smartphone or tablet. These applications are tailor-made, with a specific purpose and use.10

One way of accessing information on a mobile device is by using an app that functions as a browser to search the world wide web.7,11 A typical example of a search engine used for this purpose is Google.12 The advantage of browsing the Internet from a mobile device is that users can access a vast amount of knowledge through search terms and websites of their own choice. The downside, however, is that the mobile device requires a connection to the Internet, either by Wi-Fi or cellular connection, for the search engine to access the information.13 This is a potential limitation of mobile learning, especially for users in low- and middle-income countries. Furthermore, a potential pitfall of using search engines for mobile learning is that students may access information that is not necessarily peer-reviewed.10

Alternatively, applications that serve as a point of reference for a certain subject matter can be downloaded onto mobile devices.10,14 The information contained in these applications is accessed with or without Wi-Fi or cellular connection to the Internet. Although some of these apps are free, many are purchased or have in-app purchases, to access their full educational content. Although mobile health applications that are used for patient monitoring undergo regulatory testing prior to release in app stores, it is an ongoing concern that medical education apps are not routinely peer-reviewed to ensure that the educational content is accurate and up to date.15

Although mobile learning is gaining ground, research on its utility in medical education remains sparse.5 Moreover, to the best of our knowledge, there is no prior study assessing the impact of mobile learning on achieving more accurate electrocardiogram (ECG) analysis and interpretation amongst medical students or residents.16 There is no evidence to suggest that educational apps with curated content are better than unguided Internet searches in aiding novice clinicians with the process of making accurate ECG diagnoses.

In this study, we aimed to (i) assess whether medical students or residents were more accurate with their ECG interpretation when using an ECG reference app or searching the Internet, or neither; (ii) establish whether medical students or residents revised their initial diagnosis after identification of ECG features in support of the diagnosis, and whether this was influenced by having access to an ECG reference app or searching the Internet, or neither; (iii) determine the search strategies on the Internet and the ECG reference app, and how this impacted on the accuracy of ECG interpretation; (iv) determine the time differences between using an ECG reference app as compared to searching the Internet or neither; and (v) ascertain mobile learning preferences in Electrocardiography amongst undergraduate and postgraduate trainees.

Methods

We invited undergraduate and postgraduate students from the University of Cape Town (UCT) to participate in this study. Undergraduate participants included fourth- and sixth-year medical students. The fourth-year Internal Medicine clerkship is their first exposure to clinical medicine, whereas the sixth-year Internal Medicine clerkship is the final clinical clerkship before graduating as a medical doctor. Hereafter, the fourth- and sixth-year medical students are referred to as junior and senior students, respectively. Postgraduate students comprised residents from the Department of Medicine, with at least 4 years of clinical work experience.

Ethical approval was obtained from the Human Research Ethics Committee (HREC) at the Faculty of Health Sciences (HREC reference number 111/2018), as well as institutional permission from the Department of Student Affairs at UCT. All participants signed informed consent prior to enrolment to the study. Participation did not contribute to their year marks.

Study design

As shown in Figure 1, we employed a mixed methods research strategy, which included two quantitative ECG tests (to determine ECG diagnostic accuracy) and a qualitative survey (designed to elicit feedback on mobile learning preferences in Electrocardiography):

Figure 1. Study flow.

  • Test 1, conducted on entry to the study, was designed to measure the utility of mobile learning (i.e. comparing ECG diagnostic accuracy with access to an ECG app, searching the Internet or using neither);

  • Test 2, conducted 2 weeks later, was designed to determine retention of knowledge after the initial mobile learning exposure (i.e. subsequent ECG interpretation without access to the ECG app or the Internet); and

  • The survey was completed immediately after Test 2.

Mobile platforms tested as point of reference

The mobile learning platforms (educational interventions) that were compared to using no mobile device (control) were:

  • A search engine (e.g. ‘Google’) for Internet browsing: participants were not guided in the use of search terms or the websites they accessed during ECG analysis and interpretation. All participants had free access to Wi-Fi (i.e. ‘eduroam’), which is available to all enrolled undergraduate and postgraduate students at the University.

  • An algorithm-based ECG reference app, ECG APPtitude, developed by the Division of Cardiology at UCT. ECG APPtitude is a comprehensive reference guide to the systematic analysis and interpretation of the 12-lead ECG. The content is structured according to normal and abnormal features on the ECG for both rhythm abnormalities (Figure 2A) and morphological abnormalities (Figure 2B), as well as ECG diagnoses (Figure 2C). The text is accompanied by annotated ECGs and illustrations, which explain the mechanisms of normal and abnormal rhythms and morphological patterns. The diagnostic features are shown for each ECG. Participants downloaded the app, free of charge, from the App Store (in the case of iOS handheld devices) or Google Play (in the case of Android handheld devices) at enrolment to the study.

Figure 2. The content in the app is organized by features and by diagnoses. The section on Rhythm Analysis (A) provides approaches and differential diagnoses for different rate and rhythm abnormalities, whereas the section on Waveform Analysis (B) provides the normal parameters, as well as the differential diagnoses for abnormal waveforms. The section on ECG Interpretation (C) provides a list of ECG diagnoses, where the key ECG features are given for each diagnosis.

Test 1 (measuring utility of mobile learning)

The first ECG test took place at study entry and consisted of three parts:

  • For the first set of three ECGs, the participants were not permitted to use any mobile devices to assist them with the ECG analysis and interpretation. These ECGs served as the control for ECGs that were subsequently analysed with the help of a mobile device (Figure 3A).

  • For the second set of three ECGs, participants were asked to browse the Internet by means of a search engine (e.g. ‘Google’) from their phone or tablet to assist them with the ECG analysis and interpretation, after they had provided their initial ECG diagnosis. Participants could access any website of their choice. They were asked to indicate, for each ECG, which website they accessed and what search terms they used. These ECGs served as part of the educational intervention (mobile learning), to assess the value of unguided Internet searches, where content is not necessarily peer-reviewed (Figure 3B).

  • For the third set of three ECGs, participants were required to use the ECG reference app (‘ECG APPtitude’) downloaded onto their mobile device during their ECG analysis and interpretation, after they had provided their initial ECG diagnosis. Participants were asked to indicate which section of the app they accessed for each ECG [i.e. section on ECG features (rhythms, waveforms) or section on ECG diagnoses]. These ECGs served as part of the educational intervention (mobile learning), to assess the value of accessing curated content in a customized algorithm-based app (Figure 3C).

Figure 3. Participants were asked to provide an initial diagnosis (spot diagnosis) for each ECG. Once the answer was submitted, participants were asked to provide the key features on the ECG that supported the diagnosis. They could then proceed to provide their final diagnosis. Analysis and interpretation of the first three ECGs occurred without access to any mobile device (A). For the subsequent sets of three ECGs, participants could search the Internet freely (B) or access the ECG reference app (C), respectively, from their mobile devices during ECG analysis and interpretation.

As shown in Figure 1, participants were randomized to three groups. All groups analysed and interpreted the same set of nine ECGs, but a randomization process determined the order in which they analysed the ECGs (Table 1). For each participant, the order in which the ECGs were analysed determined which ECGs would be analysed without any mobile device (Figure 3A), or with a mobile device from which they could either search and browse the Internet freely (Figure 3B) or use the ECG reference app (Figure 3C). The randomization ensured that each of the nine ECGs was analysed and interpreted by participants who had access to mobile learning [browsing the Internet from a search engine (‘Google’), or the ECG reference app (‘ECG APPtitude’)] or neither; a minimal code sketch of this counterbalanced rotation follows Table 1.

Table 1. Each ECG was analysed by a group of participants with or without access to a mobile device. When access to a mobile device was allowed, the randomization process determined whether they could search the Internet freely or access the ECG reference app.

        No mobile device   Internet search   ECG app
ECG 1   Group A            Group B           Group C
ECG 2   Group A            Group B           Group C
ECG 3   Group A            Group B           Group C
ECG 4   Group C            Group A           Group B
ECG 5   Group C            Group A           Group B
ECG 6   Group C            Group A           Group B
ECG 7   Group B            Group C           Group A
ECG 8   Group B            Group C           Group A
ECG 9   Group B            Group C           Group A

ECG 1, third degree AV block (complete heart block); ECG 2, inferior ST-segment elevation myocardial infarction (STEMI); ECG 3, atrial flutter; ECG 4, hyperkalaemia; ECG 5, Mobitz type I second degree AV block; ECG 6, atrial fibrillation (AF) with uncontrolled rate; ECG 7, right ventricular hypertrophy (RVH); ECG 8, Mobitz type II second degree AV block; ECG 9, ventricular tachycardia (VT).
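The counterbalancing in Table 1 is a 3 × 3 Latin square: three groups rotate through three modalities across three blocks of three ECGs, so every ECG is analysed under every condition. The paper does not describe the software used for randomization; the following Python sketch, with names of our own choosing, is only a minimal illustration of how such a schedule can be generated and checked.

```python
import random

# Minimal sketch (assumed, not the authors' actual procedure) of the
# counterbalanced schedule in Table 1: three groups, three modalities,
# three blocks of three ECGs, rotated Latin-square style.
MODALITIES = ["No mobile device", "Internet search", "ECG app"]
GROUPS = ["A", "B", "C"]

def assign_group(rng: random.Random) -> str:
    """Randomly allocate a participant to one of the three groups."""
    return rng.choice(GROUPS)

def modality(group: str, ecg: int) -> str:
    """Modality a group uses for a given ECG (1-9), following Table 1."""
    block = (ecg - 1) // 3  # ECGs 1-3 -> block 0, 4-6 -> block 1, 7-9 -> block 2
    # Block 0: A -> no device, B -> Internet, C -> app; each later block
    # shifts every group one modality along.
    return MODALITIES[(GROUPS.index(group) + block) % 3]

print(assign_group(random.Random(42)))   # e.g. a participant lands in one group
for ecg in range(1, 10):                 # reproduce Table 1 row by row
    print(ecg, {modality(g, ecg): g for g in GROUPS})
```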

The ECG test was administered online at UCT’s Faculty of Health Sciences computer laboratories. The test was password-protected and invigilated to ensure that participants used (or refrained from using) the mobile learning platform specified by the randomization schedule. For easy recognition by the invigilators, and to ensure that the participants used the correct mobile platform for each ECG, the background screen was white for ECGs that were assessed without a mobile device. ECGs that were analysed with the help of browsing the Internet were displayed with a green background, and those analysed with the reference app had a blue background.

As illustrated in Figure 3A–C, participants were asked to provide an initial diagnosis (‘spot diagnosis’) for all ECGs in the test. Once this answer was submitted, participants were asked to provide the key features on the ECG, both normal and abnormal, that supported the ECG diagnosis. From this point onwards, they were allowed to search the Internet, consult the ECG reference app or use neither mobile learning strategy, as predetermined by the study randomization schedule. Participants could not go back to alter their initial diagnosis. Once they had identified the characteristic features supporting the ECG diagnosis, they were asked to provide their final ECG diagnosis.

Participants provided their answers in free text form on the online test. These typed answers were marked by the investigators (J.H., E.M., and L.H.) according to a marking memorandum containing the ECG diagnosis and supporting key features (Supplementary material online). The answers were checked (C.A.V.) and entered into a purpose-built database hosted on REDCap.17 The investigators agreed that the nine ECGs used in the ECG tests were unequivocal examples of the conditions tested. The investigators also reached full agreement on the supporting features of each ECG diagnosis, as listed in the marking memorandum (Supplementary material online).

The online tests registered the time from opening the question to submitting the final answer. This allowed the investigators to measure how long it took to analyse and interpret each ECG.
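As a simple illustration of this timing mechanism (the paper does not describe how its online test platform was implemented, so the structure and names below are assumptions), the duration can be taken as the difference between two timestamps:

```python
from datetime import datetime, timezone

# Hypothetical sketch: one timestamp when the ECG question is opened,
# another when the final answer is submitted; the difference is the
# time-on-task later summarized in Table 2.
def question_duration_seconds(opened_at: datetime, submitted_at: datetime) -> float:
    """Seconds from opening an ECG question to submitting the final answer."""
    return (submitted_at - opened_at).total_seconds()

opened = datetime(2018, 5, 14, 9, 0, 0, tzinfo=timezone.utc)
submitted = datetime(2018, 5, 14, 9, 7, 44, tzinfo=timezone.utc)
print(question_duration_seconds(opened, submitted))  # 464.0 s, i.e. 7:44
```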

Test 2 (assessing retention of knowledge)

Participants were asked to complete a second ECG test, 2 weeks after the first test, to test their retention of knowledge (Figure 1). The second test consisted of the same nine ECGs that participants analysed and interpreted in the first test. However, during the second test, participants could not use any mobile technology (for Internet searches or accessing the ECG reference app) to assist in the ECG analysis and interpretation. The correct answers for the nine ECGs used in both tests were only provided to participants after they submitted the second ECG test.

Survey on use of mobile learning in Electrocardiography

After completion of the second ECG test, participants were asked to complete a survey on mobile learning (Figure 1). The survey collected data on which mobile device (i.e. phone, tablet) participants preferred using to access information, and which operating system the device ran (e.g. Apple, Android). They were asked to indicate how often they studied Electrocardiography by means of mobile learning, i.e. searching the Internet from a mobile device (e.g. Wikipedia, other websites) or studying from apps (e.g. ECG APPtitude, other apps). Participants were also asked to comment on the limitations of mobile learning.

Statistical analysis

Data were exported from REDCap.17 Data exploration and analysis were done in Stata (V.14.2; Stata Corp, College Station, TX, USA). Descriptive statistics were used to summarize the ECG test scores, the time that it took to analyse and interpret each ECG, and the survey feedback. Continuous variables were described using means ± standard deviations (SDs). Categorical data were expressed as frequency and proportion and compared using the Chi-square test. Variables were compared between the ECG reference app, searching the Internet, and no access to a mobile device. Associations between correcting an initial incorrect ECG diagnosis and the search strategy used (searching the Internet or using the app) were assessed using odds ratios (ORs) determined by logistic regression. The differences in time spent on ECG analysis and interpretation with the assistance of the ECG app, searching the Internet or having no access to a mobile device were analysed by means of one-way analysis of variance. A P-value of <0.05 was considered statistically significant, and 95% confidence intervals (CIs) were used to determine the precision of estimates.
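The analyses were run in Stata; for readers who prefer an open-source equivalent, the sketch below illustrates the three named procedures (Chi-square test, logistic regression for odds ratios, one-way ANOVA) in Python with scipy and statsmodels. All input numbers are illustrative placeholders, not study data.

```python
import numpy as np
from scipy import stats
import statsmodels.api as sm

# 1) Chi-square test on correct/incorrect final diagnoses per modality
#    (counts are illustrative placeholders, not study data).
counts = np.array([[236, 307],   # no mobile device: correct, incorrect
                   [273, 270],   # Internet search
                   [304, 239]])  # ECG app
chi2, p_chi2, dof, expected = stats.chi2_contingency(counts)

# 2) Logistic regression: odds of revising an incorrect initial diagnosis,
#    with a binary predictor (e.g. searched using the correct ECG features).
y = np.array([1, 1, 1, 0, 1, 0, 0, 0, 1, 0])  # revised to a correct diagnosis?
x = np.array([1, 1, 1, 1, 0, 0, 0, 0, 1, 0])  # searched by correct features?
fit = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
odds_ratio = np.exp(fit.params[1])
ci_low, ci_high = np.exp(fit.conf_int()[1])

# 3) One-way ANOVA on time-on-task in seconds; means/SDs loosely echo
#    Table 2 (6:36 = 396 s, 9:14 = 554 s, 7:44 = 464 s).
rng = np.random.default_rng(0)
t_none, t_web, t_app = (rng.normal(m, s, 181) for m, s in
                        [(396, 238), (554, 274), (464, 253)])
f_stat, p_anova = stats.f_oneway(t_none, t_web, t_app)

print(f"chi2 P={p_chi2:.3f}, OR={odds_ratio:.1f} "
      f"(95% CI {ci_low:.1f}-{ci_high:.1f}), ANOVA P={p_anova:.2g}")
```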

Results

Our study comprised 181 participants, of whom 77 (42.6%) were junior medical students, 69 (38.1%) were senior medical students and 35 (19.3%) were residents. All participants completed the first test and 135 (74.6%) completed the second test [i.e. 62/77 (80.5%) junior students, 45/69 (65.2%) senior students, 28/35 (80%) residents].

Accuracy of final ECG interpretation with different mobile learning platforms

In Test 1, final ECG diagnoses were significantly less accurate amongst participants who did not have access to any mobile learning strategy (43.5%), as compared to those who used the ECG app (56.0%, P < 0.001) or searched the Internet (50.3%, P = 0.030). However, there was no significant difference in diagnostic accuracy when comparing use of the app to an Internet search (P = 0.068) (Figure 4).

Figure 4. Accuracy of final ECG interpretation after having had access to no mobile device, searching the Internet or accessing the ECG reference app. The results of Test 1 are shown for junior and senior medical students, medical residents and all participants. Two weeks later, during Test 2, the same ECGs were analysed without access to a mobile device. The results are shown for junior and senior medical students, medical residents and all participants, categorized according to whether the ECGs had been analysed 2 weeks prior without access to a mobile device, or with access to a mobile device, either searching the Internet or accessing the ECG reference app. Data were expressed as proportions and compared using the Chi-square test. Significant differences between subgroups are indicated as follows: ***P < 0.001, **P < 0.01, and *P < 0.05.

Subgroup analyses of the Test 1 data showed that the diagnostic accuracy of senior medical students was significantly influenced by the use of mobile learning strategies. As compared to ECG interpretation without mobile learning support (32.7%), diagnostic accuracy was significantly better when using the app (51.8%, P < 0.001) or searching the Internet (48.7%, P = 0.001). Although the gains were more pronounced with the app, these were not significantly better than with Internet searches. In contrast, junior medical students did not benefit from using the Internet, as compared to unsupported ECG analysis. Their diagnostic accuracy was significantly better when using the app (54.8%) as compared to searching the Internet (45.3%, P = 0.048) or not having access to these mobile learning strategies (45.2%, P = 0.044). Residents did not benefit from the use of either mobile learning strategy (Figure 4, Test 1).

Repeat testing 2 weeks later, using the same ECGs but with no access to the Internet or the ECG app for any of the ECGs, showed no difference in diagnostic accuracy amongst any of the cohorts (Figure 4, Test 2).

Revision of initial ECG interpretation with different mobile learning platforms

As shown in Figure 5, participants made a correct initial and final diagnosis for a third of the ECGs, regardless of whether they had access to the ECG app, could search the Internet, or neither. Overall, the diagnostic accuracy of residents was greater than that of senior and junior medical students (P < 0.001).

Figure 5. The proportion of ECGs for which the participants gave a correct or incorrect initial diagnosis and subsequently changed to a correct or incorrect final diagnosis, or not. The ECGs are categorized according to whether the participants had no mobile device, could search the Internet, or could access the ECG app. Whilst junior (A) and senior medical students (B) were influenced by searching the Internet or accessing the ECG app when changing an incorrect initial diagnosis to a correct final diagnosis, this was not true for medical residents (C). Overall (D), the greatest gains were seen for ECGs for which participants had access to the ECG reference app.

Figure 5 also shows that access to the ECG app or searching the Internet contributed to a greater proportion of ECGs for which participants changed their incorrect initial diagnosis to a correct final diagnosis. The greatest gains were seen for ECGs analysed with the ECG reference app (18.7% with the ECG app, 13.6% when searching the Internet, 8.4% without any mobile learning strategy, P < 0.001). Subgroup analyses showed that whilst this was true for senior and junior medical students, the residents did not show any benefit of accessing the app, or searching the Internet, over having no access to a mobile device.
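As a rough check on the size of this effect, an unadjusted odds ratio can be computed directly from the two extreme proportions above (the ORs in Figure 6 come from logistic regression on search strategies, so they address a different contrast and need not match):

\[ \mathrm{OR}_{\text{app vs. no device}} = \frac{0.187/(1-0.187)}{0.084/(1-0.084)} = \frac{0.230}{0.092} \approx 2.5 \]

That is, the odds of rescuing an incorrect initial diagnosis were roughly two and a half times higher with the app than without any mobile device.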

Mobile learning search strategies and their impact on diagnostic accuracy

Life in the Fast Lane (https://litfl.com) was the website most frequently accessed by participants in this study [327 (56.5%) ECGs]. Participants used the images that appeared in the Google search results as their point of reference for 82 (14.2%) ECGs and Wikipedia (en.wikipedia.org) for 16 (2.8%) ECGs. Figure 6 shows that trainees improved their diagnostic accuracy if they searched the Internet using the correct ECG features (OR 2.24, 95% CI 1.34–3.76) or ECG diagnosis (OR 2.38, 95% CI 1.42–3.99). This was not true if they used the wrong ECG features or diagnoses as search terms. Indeed, if participants reported any wrong features on the ECG, they were unlikely to revise an incorrect initial diagnosis when searching the Internet (OR 0.24, 95% CI 0.12–0.49).

Figure 6. There was an association between searching the correct ECG features or diagnosis on the web and correcting an initial incorrect ECG diagnosis. When using the ECG reference app, searching by ECG diagnosis was not associated with changing an incorrect initial diagnosis, whereas searching by features was.

When using the ECG reference app, participants preferred to look up ECGs by their diagnosis (49.1%) (Figure 2C), rather than their features [31.6% rhythm features (Figure 2A), 32.3% waveform features (Figure 2B)]. However, as depicted in Figure 6, trainees only improved their diagnostic accuracy if they searched the app for abnormal waveforms (OR 2.62, 95% CI 1.23–5.53) or rhythm features (OR 2.43, 95% CI 1.18–5.00 for bradyarrhythmias; OR 2.12, 95% CI 1.42–4.40 for tachyarrhythmias). This benefit was lost if they reported any wrong features on the ECG (OR 0.12, 95% CI 0.05–0.26). Searching the app by ECG diagnosis did not improve their diagnostic accuracy.

Time differences between using different mobile learning platforms

Table 2 shows that there was a significant difference in the time (measured in minutes:seconds, displayed as mean ± SD) taken to analyse and interpret the ECGs in Test 1, depending on whether participants browsed the Internet (9:14 ± 4:34), accessed the ECG reference app (7:44 ± 4:13) or used no mobile device (6:36 ± 3:58, P < 0.001). This was true for all participants; subgroup analyses confirmed the same findings in residents and students. Students spent more time with the app (junior students 8:22 ± 4:37; senior students 7:34 ± 3:58) than residents (6:44 ± 3:34), whereas residents spent more time searching the Internet (10:55 ± 5:02) from their mobile devices than students (junior students 8:59 ± 4:23; senior students 8:37 ± 4:18).

Table 2. Difference in the time taken to analyse and interpret the ECGs, depending on whether the participants searched the Internet from their phone, accessed the ECG reference app or did not have access to their mobile device.

                             No mobile device   Internet search   ECG app       P-value
Junior students (n = 77)     6:03 ± 3:30        8:59 ± 4:23       8:22 ± 4:37   <0.001
Senior students (n = 69)     6:13 ± 3:48        8:37 ± 4:18       7:34 ± 3:58   <0.001
Medical residents (n = 35)   8:29 ± 4:39        10:55 ± 5:02      6:44 ± 3:34   <0.001
All participants (n = 181)   6:36 ± 3:58        9:14 ± 4:34       7:44 ± 4:13   <0.001

Time is shown as minutes:seconds; all values are mean ± SD.

Mobile learning preferences in Electrocardiography

As shown in Table 3, smartphones were the preferred device for mobile learning. The most important barrier to the use of mobile devices for educational purposes was the cost of data. Students were more unsure than residents about which medical educational websites to access and which apps to download and use. Nonetheless, most medical students reported that they searched the Internet or used apps at least once a week to study ECGs, whether at hospital or at home (Table 4). Residents, however, made less use of these mobile learning strategies when studying ECGs.

Table 3. Self-reported mobile learning behaviour and preferences of participants.

                                                Junior students   Senior students   Medical residents
                                                (n = 69)a         (n = 59)b         (n = 34)c
Operating system on mobile device
 Android                                        37 (53.6)         31 (52.5)         15 (44.1)
 Apple                                          32 (46.4)         28 (47.5)         19 (55.9)
Device used for mobile learning
 Smartphone                                     54 (78.3)         41 (69.5)         23 (67.7)
 Tablet                                         5 (7.3)           5 (8.5)           4 (11.8)
 Smartphone and tablet                          10 (14.5)         13 (22.0)         7 (20.6)
Barriers to the use of mobile devices for educational purposes
 Apps are expensive                             11 (15.9)         9 (15.3)          8 (23.5)
 Data are expensive                             38 (55.1)         42 (71.2)         20 (58.8)
 Not sure which apps to use                     17 (24.6)         13 (22.0)         2 (5.9)
 Not sure which web sites to access             17 (24.6)         19 (32.2)         4 (11.8)
 Mobile phone is distracting                    14 (20.3)         6 (10.2)          3 (8.8)

Values are expressed as N (%).

a Sixty-nine of the 77 junior students who participated in the study completed the survey.

b Fifty-nine of the 69 senior students who participated in the study completed the survey.

c Thirty-four of the 35 medical residents who participated in the study completed the survey.

Table 4. Frequency of mobile learning in Electrocardiography as reported by medical students and residents.

                              Internet searches                                ECG apps
                              Junior          Senior          Medical          Junior          Senior          Medical
                              students        students        residents        students        students        residents
                              (n = 69)a       (n = 59)b       (n = 34)c        (n = 69)a       (n = 59)b       (n = 34)c
ECG learning in the clinical setting
 At least once a day          9/62 (14.5)     7/49 (14.3)     9/32 (28.1)      6/62 (9.7)      1/47 (2.1)      4/25 (16.0)
 At least every second day    14/62 (22.6)    11/49 (22.5)    4/32 (12.5)      15/62 (24.2)    5/47 (10.6)     4/25 (16.0)
 At least once a week         31/62 (50.0)    23/49 (46.9)    12/32 (37.5)     30/62 (48.4)    19/47 (40.4)    5/25 (20.0)
 At least once a month        8/62 (12.9)     8/49 (16.3)     7/32 (21.9)      11/62 (17.7)    22/47 (46.8)    12/25 (48.0)
ECG learning at home
 At least once a day          5/61 (8.2)      5/51 (9.8)      6/30 (20.0)      4/58 (6.9)      0/45 (0.0)      2/24 (8.3)
 At least every second day    15/61 (24.6)    10/51 (19.6)    3/30 (10.0)      10/58 (17.2)    6/45 (13.3)     2/24 (8.3)
 At least once a week         31/61 (50.8)    20/51 (39.2)    12/30 (40.0)     37/58 (63.8)    18/45 (40.0)    6/24 (25.0)
 At least once a month        10/61 (16.4)    16/51 (31.4)    9/30 (30.0)      7/58 (12.1)     21/45 (46.7)    14/24 (58.3)

Values are expressed as n/N (%), where N is the number of participants who responded to each item.

a Sixty-nine of the 77 junior students who participated in the study completed the survey.

b Fifty-nine of the 69 senior students who participated in the study completed the survey.

c Thirty-four of the 35 medical residents who participated in the study completed the survey.

Discussion

This study explored the effect of mobile learning on the accuracy of ECG analysis and interpretation by undergraduate and postgraduate medical trainees. To the best of our knowledge, this is the first study evaluating the utility of two mobile learning strategies (i.e. Internet searches and an algorithm-based reference app) in Electrocardiography. We found that mobile learning improved the accuracy of ECG interpretation of junior and senior medical students, but not that of residents. Mobile learning specifically aided students in the process of revising and correcting an incorrect initial ECG diagnosis. The benefit was greater when using a diagnostic algorithm-based ECG reference app, as compared to unguided Internet searches. Over and above assisting users in achieving better diagnostic accuracy, the ECG reference app also required less time than searching the Internet. However, an interval analysis showed that the ability to make a correct ECG diagnosis was not sustained 2 weeks after engaging in the mobile learning activities.

An essential aspect of this study was the evaluation of whether participants reconsidered their ECG diagnoses, and whether this was influenced by mobile learning or not. We found that diagnostic accuracy improved when undergraduate students used a reference app or searched the Internet. However, a significant proportion of ECG diagnoses remained incorrect, despite an opportunity to revise the diagnosis after reflecting on the characteristic findings on the ECG. In this regard, residents were not influenced by mobile learning. A potential explanation for this observation may be the influence of premature closure on diagnostic accuracy. With premature closure, a wrong initial diagnosis is perpetuated due to the lack of consideration of alternative diagnoses.18 Clinicians who rely on pattern recognition without adequate reflective practice are at risk of premature closure in the diagnostic decision-making process.19 This phenomenon has previously been described for residents,20 i.e. clinicians with more experience than undergraduate students (complete novices). It is especially worrying since residents made a correct initial diagnosis in only half of the ECGs. Their reluctance to reconsider and revise their final ECG diagnoses, with or without the assistance of mobile learning strategies, led to significant diagnostic error.

In this study, we found that mobile learning helped to correct diagnostic error. However, this was only possible if participants could identify the correct characteristic ECG features in support of the diagnosis. Indeed, if they reported any wrong features (i.e. features that were not present on the ECG or did not support the diagnosis), they were unlikely to provide the correct final diagnosis. Although this has not previously been described in the ECG literature, faulty data gathering and synthesis have been identified as important reasons for diagnostic error in other domains of Medicine.21 Furthermore, this study also showed that the use of incorrect search terms was associated with a failure to correct a wrong initial ECG diagnosis. Thus, for mobile learning to be of any benefit, it is crucial that ECG training should focus on the correct identification of features that support ECG diagnoses.

It is possible that the benefits of using the ECG reference app relate to a process of systematic ECG analysis, as users were provided with diagnostic algorithms for rhythm and waveform analyses. This supports previous literature reporting that applications that encourage algorithm-based diagnostic processes improve diagnostic accuracy.22 In contrast, unguided Internet searches may require a higher level of understanding of Electrocardiography to choose search terms that are likely to lead the user to the correct diagnoses. Another potential limitation of Internet searches is that they yield large amounts of unfocused information, which then requires time-consuming review. For example, an Internet search on Google for an ECG feature such as ‘ST elevation’ yielded >200 million hits in 0.9 s, and a diagnosis such as ‘STEMI’ about 3.5 million hits in 0.5 s. Access to these sites is also dependent on the order in which the search engines present them. Users are likely to access the first few websites, especially if they are faced with time constraints in the workplace and wish to rapidly access information.

We showed that it was more useful to search for ECG features than for ECG diagnoses on the ECG reference app. This is an important finding, as the ECG novice might recognize features on the ECG, but still struggle to put the constellation of ECG features together to make an accurate ECG diagnosis. For example, students might not know that they are looking at an ECG with atrial fibrillation and, therefore, fail to search for atrial fibrillation as a diagnosis per se. However, if they correctly identify features such as an irregular tachycardia or the absence of discernible P waves, and look up an algorithmic approach to either of these ECG features, this could assist them in making an accurate ECG diagnosis. In this regard, the differential diagnoses and diagnostic algorithms offered by mobile learning platforms should be kept reasonably simple in order to be useful. Complicated diagnostic algorithms are time consuming and can be confusing.23–25
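To make the feature-first approach concrete, the toy sketch below (our own deliberate oversimplification, not ECG APPtitude’s actual logic) shows how recognized features such as an irregular tachycardia without discernible P waves can be mapped to a differential diagnosis:

```python
# Toy illustration of a feature-first diagnostic algorithm for a
# narrow-complex rhythm; a simplification of the kind of branching an
# algorithm-based reference app can offer.
def rhythm_differential(tachycardic: bool, regular: bool, p_waves_visible: bool) -> list[str]:
    """Map three recognized rhythm features to a crude differential diagnosis."""
    if tachycardic and not regular and not p_waves_visible:
        return ["atrial fibrillation"]
    if tachycardic and regular:
        # Needs further analysis of atrial activity and rate.
        return ["sinus tachycardia", "atrial flutter", "other supraventricular tachycardia"]
    return ["further waveform and interval analysis required"]

# The worked example from the text: irregular tachycardia, no discernible P waves.
print(rhythm_differential(tachycardic=True, regular=False, p_waves_visible=False))
```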

Mobile learning facilitates flexible and asynchronous learning, as mobile devices can be used anywhere and anytime.26,27 Rapid access to information from a mobile device allows students to maximize their learning by acquiring new knowledge ‘on the go’ and not only when in the classroom or studying in the library.28 In the clinical setting, mobile learning allows for contextualized learning, i.e. the learning is immediate and relevant to the clinical context (e.g. looking up the correct dose of treatment at the time of its prescription and/or administration).4,29 Our study adds to the literature in that we could show that students and residents spent less time with ECG analysis and interpretation when they used a dedicated ECG reference app, than they did searching the Internet from their mobile devices. This is important, because overall, the reference app was associated with shorter analysis time and greater diagnostic accuracy than searching the Internet freely from a mobile device.

In this study, we found that mobile learning gains were not retained when tested 2 weeks later. This finding is not surprising and could be explained by the fact that the educational intervention was a once-off exposure at study entry.30 Deliberate practice with feedback, which has previously been shown to enhance learning,31–34 was also not part of the learning process. As an educational method, mobile learning is, therefore, better classified as informal learning than formal learning,4 and should not replace, but rather supplement, formal methods of ECG instruction.35,36

Our study provides valuable feedback on the perceptions of millennial medical students and residents on mobile learning in Electrocardiography. We found that both undergraduate and postgraduate students made frequent use of mobile learning modalities such as reference apps and Internet searches to study ECGs. In our setting, there was equal use of Apple and Android platforms, which may differ from other parts of the world.5,28 Participants preferred using their smartphones over tablet devices for mobile learning purposes. However, potential caveats to mobile learning included the expense of data and of purchasing apps, which may be a major limiting factor in the developing world. Although they found their mobile devices less distracting than previously reported,8,9 participants were often uncertain about which mobile apps to use and which educational websites to visit. As mobile devices are increasingly used for rapid access to information, clinician educators should aim to teach students and residents how to use mobile learning to assist them with diagnostic reasoning in the classroom and in the clinical setting, and to guide trainees with regard to which resources to use.36–38

Study limitations

The participants were split into three groups (junior students, senior students, and medical residents) to remove the confounding effect of experience in ECG analysis and interpretation. However, this reduced the sample size and the statistical power to detect differences between the modalities for each group separately. Furthermore, as only three-quarters of participants completed Test 2, the results of this assessment are subject to attrition bias and should therefore be interpreted with caution. Although we found that the benefit of mobile learning was not retained, we did not evaluate the effect of repeated exposure to mobile learning on retention of knowledge. In real life, trainees are likely to access reference apps or search the Internet whenever they analyse and interpret ECGs.

The increased diagnostic accuracy that was found in the second test (assessing retention of knowledge) could possibly be explained by performance bias.39 As our study was conducted during clinical clerkships when ECGs are taught, we could not control for exposure to learning opportunities other than the educational intervention in the first test (mobile learning) between the two tests.

In our study, there was a lack of feedback to the mobile technology user. It is well known that diagnostic reasoning is enhanced when diagnostic error is pointed out.40 Due to the interactive capabilities of mobile technology, feedback mechanisms through mobile learning could potentially improve diagnostic accuracy and should be studied in future.

Conclusion

This study found that mobile learning improves the diagnostic accuracy of ECG interpretation of undergraduate medical trainees. Mobile learning specifically aided students in the process of revising and correcting an incorrect initial ECG diagnosis. However, mobile learning was only beneficial when the user could correctly identify the characteristic features in support of an ECG diagnosis. For diagnostic accuracy, an algorithm-based ECG reference app was better than unguided Internet searches and required less time. Interestingly, the benefit of mobile learning appears to be limited largely to novice clinicians, who are known to be less prone to premature closure. It is important to note that the benefits of once-off mobile learning experiences were not sustained over time.

Glossary terms

Mobile learning: method of instruction where the student assimilates knowledge by accessing information by means of a handheld or mobile device (i.e. smartphone, tablet), which allows the user to connect to the Internet and browse websites, communicate via e-mail and text messages, access social media and use ‘apps’ downloaded onto the mobile device.5,28

Mobile browser: a programme or software installed on a mobile device (e.g. ‘Google’) that allows the user to connect to the Internet and access web content. Mobile browsers or search engines require a connection to the Internet, either by cellular connection or Wi-Fi.13

Applications, or ‘apps’: are programmes that are installed on mobile devices, and which can be used with or without an Internet connection.28 In medical education, mobile apps are often tailored as reference guides and calculators.41

ECG analysis: the detailed examination of an ECG tracing, which requires the measurement of intervals and the evaluation of the rhythm and each waveform.42

ECG interpretation: the conclusion reached after careful ECG analysis, that is, making a diagnosis of an arrhythmia, ischaemia, etc.42

Supplementary Material

ztab027_Supplementary_Material

Acknowledgements

The authors would like to thank Dr Rory Leisegang for hosting the app on the App Store and Google Play during the time of the study, and Moegamat Johnson for his assistance with the app maintenance. We are grateful to Sylvia Dennis for her excellent help with the manuscript preparation, and to Associate Professor Ashley Chin, Professor Mpiko Ntsekhe and Professor Ntobeko Ntusi from UCT for their continued advice and support. Finally, we would like to thank our students and residents for their dedication in taking part in the study and for their valuable feedback, which helps to improve ECG training.

Funding

ECG APPtitude was developed by the Division of Cardiology at the University of Cape Town (UCT). Development was funded by the Technology Innovation Agency (TIA) and Research Contracts and Innovation (RC&I) at UCT. Access to ECG APPtitude was free. Article processing charges were funded by the Hippocrate Fund.

Data availability

The datasets used and/or analysed during the current study are available in the ‘Utility of mobile learning in Electrocardiography’ repository, which can be accessed at https://doi.org/10.25375/uct.13317941.v1.

Conflict of interest: C.A.V. reports receiving sponsorship from the Cardiac Arrhythmia Society of Southern Africa (CASSA) during the conduct of the study. R.S.M. is a lecturer and host of the AO Memorial Advanced ECG and Arrhythmia Course and receives an honorarium from Medtronic Africa. None of the authors received remuneration from ECG APPtitude. The authors alone are responsible for the content and writing of the article.

References

  • 1. Chan T, Sennik S, Zaki A, Trotter B. Studying with the cloud: the use of online Web-based resources to augment a traditional study group format. CJEM 2015;17:192–195.
  • 2. Roberts DH, Newman LR, Schwartzstein RM. Twelve tips for facilitating Millennials' learning. Med Teach 2012;34:274–278.
  • 3. Hopkins L, Hampton BS, Abbott JF, Buery-Joyner SD, Craig LB, Dalrymple JL, Forstein DA, Graziano SC, McKenzie ML, Pradham A, Wolf A, Page-Ramsey SM. To the point: medical education, technology, and the millennial learner. Am J Obstet Gynecol 2018;218:188–192.
  • 4. Traxler J. Defining, discussing and evaluating mobile learning: the moving finger writes and having writ. Int Rev Res Open Distributed Learning 2007;8:1–12.
  • 5. Briz-Ponce L, Juanes-Mendez JA, Garcia-Penalvo FJ, Pereira A. Effects of mobile learning in medical education: a counterfactual evaluation. J Med Syst 2016;40:136.
  • 6. Desy JR, Reed DA, Wolanskyj AP. Milestones and millennials: a perfect pairing-competency-based medical education and the learning preferences of Generation Y. Mayo Clin Proc 2017;92:243–250.
  • 7. Payne KB, Wharrad H, Watts K. Smartphone and medical related App use among medical students and junior doctors in the United Kingdom (UK): a regional survey. BMC Med Inform Decis Mak 2012;12:121.
  • 8. Dimond R, Bullock A, Lovatt J, Stacey M. Mobile learning devices in the workplace: ‘as much a part of the junior doctors’ kit as a stethoscope’? BMC Med Educ 2016;16:207.
  • 9. Chase TJG, Julius A, Chandan JS, Powell E, Hall CS, Phillips BL, Burnett R, Gill D, Fernando B. Mobile learning in medicine: an evaluation of attitudes and behaviours of medical students. BMC Med Educ 2018;18:152.
  • 10. Bhatheja S, Fuster V, Chamaria S, Kakkar S, Zlatopolsky R, Rogers J, Otobo E, Atreja A, Sharma SK, Kini AS. Developing a mobile application for global cardiovascular education. J Am Coll Cardiol 2018;72:2518–2527.
  • 11. Guarino S, Leopardi E, Sorrenti S, De Antoni E, Catania A, Alagaratnam S. Internet-based versus traditional teaching and learning methods. Clin Teach 2014;11:449–453.
  • 12. Hanafi HF, Samsudin K. Mobile learning environment system (MLES): the case of Android-based learning application on undergraduates' learning. International Journal of Advanced Computer Science and Applications 2012;3(3).
  • 13. Fu L, Salvendy G. The contribution of apparent and inherent usability to a user's satisfaction in a searching and browsing task on the Web. Ergonomics 2002;45:415–424.
  • 14. Albrecht UV, Folta-Schoofs K, Behrends M, von Jan U. Effects of mobile augmented reality learning compared to textbook learning on medical students: randomized controlled pilot study. J Med Internet Res 2013;15:e182.
  • 15. Barton AJ. The regulation of mobile health applications. BMC Med 2012;10:46.
  • 16. Viljoen CA, Scott Millar R, Engel ME, Shelton M, Burch V. Is computer-assisted instruction more effective than other educational methods in achieving ECG competence amongst medical students and residents? A systematic review and meta-analysis. BMJ Open 2019;9:e028800.
  • 17. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009;42:377–381.
  • 18. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003;78:775–780.
  • 19. Krupat E, Wormwood J, Schwartzstein RM, Richards JB. Avoiding premature closure and reaching diagnostic accuracy: some key predictive factors. Med Educ 2017;51:1127–1137.
  • 20. Monteiro SD, Sherbino J, Patel A, Mazzetti I, Norman GR, Howey E. Reflecting on diagnostic errors: taking a second look is not enough. J Gen Intern Med 2015;30:1270–1274.
  • 21. Naude JM, Burch VC. Checklist of cognitive contributions to diagnostic errors: a tool for clinician-educators. Afr J Health Prof Educ 2018;10:153–158.
  • 22. Vinny P, Gupta A, Modi M, Srivastava M, Lal V, Sylaja PN, Narasimhan L, Dwivedi SN, Nair PP, Iype T, Vishnu VY. Head to head comparison between neurology residents and a mobile medical application for diagnostic accuracy in cognitive neurology. QJM 2019;112:591–598.
  • 23. Scordo KA. Differential diagnosis: correctly putting the pieces of the puzzle together. AACN Adv Crit Care 2014;25:230–236.
  • 24. Sousa PA, Pereira S, Candeias R, de Jesus I. The value of electrocardiography for differential diagnosis in wide QRS complex tachycardia. Rev Port Cardiol 2014;33:165–173.
  • 25. Chin A, Vezi B, Namane M, Weich H, Scott-Millar R. An approach to the patient with a suspected tachycardia in the emergency department. S Afr Med J 2016;106:246–250.
  • 26. El-Hussein MOM, Cronje JC. Defining mobile learning in the higher education landscape. Educ Technol Soc 2010;13:12–21.
  • 27. Martin R, McGill T, Sudweeks F. Learning anywhere, anytime: student motivators for m-learning. J Inf Technol Educ Res 2013;12:51–67.
  • 28. Wallace S, Clark M, White J. ‘It's on my iPhone’: attitudes to the use of mobile computing devices in medical education, a mixed-methods study. BMJ Open 2012;2:e001099.
  • 29. Walsh K. Mobile learning in medical education: review. Ethiop J Health Sci 2015;25:363–366.
  • 30. Monteiro S, Melvin L, Manolakos J, Patel A, Norman G. Evaluating the effect of instruction and practice schedule on the acquisition of ECG interpretation skills. Perspect Med Educ 2017;6:237–245.
  • 31. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27:10–28.
  • 32. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Instructional design variations in internet-based learning for health professions education: a systematic review and meta-analysis. Acad Med 2010;85:909–922.
  • 33. Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev 1993;100:363.
  • 34. Viljoen CA, Millar RS, Manning K, Burch VC. Effectiveness of blended learning versus lectures alone on ECG analysis and interpretation by medical students. BMC Med Educ 2020;20:488.
  • 35. Motiwalla LF. Mobile learning: a framework and evaluation. Comput Educ 2007;49:581–596.
  • 36. Viljoen CA, Millar RS, Manning K, Burch VC. Determining electrocardiography training priorities for medical students using a modified Delphi method. BMC Med Educ 2020;20:431.
  • 37. Patel VL, Kaufman DR, Kannampallil TG. Diagnostic reasoning and decision making in the context of health information technology. Rev Hum Fact Ergon 2013;8:149–190.
  • 38. Ozuah PO. Undergraduate medical education: thoughts on future challenges. BMC Med Educ 2002;2:8.
  • 39. Higgins JP, Green S. Cochrane Handbook for Systematic Reviews of Interventions. Chichester, UK: John Wiley & Sons; 2011.
  • 40. Veloski J, Boex JR, Grasberger MJ, Evans A, Wolfson DB. Systematic review of the literature on assessment, feedback and physicians' clinical performance: BEME Guide No. 7. Med Teach 2006;28:117–128.
  • 41. Oehler RL, Smith K, Toney JF. Infectious diseases resources for the iPhone. Clin Infect Dis 2010;50:1268–1274.
  • 42. Viljoen CA, Scott Millar R, Engel ME, Shelton M, Burch V. Is computer-assisted instruction more effective than other educational methods in achieving ECG competence among medical students and residents? Protocol for a systematic review and meta-analysis. BMJ Open 2017;7:1–11.
