2019 Feb 25;21(2):e13269. doi: 10.2196/13269

Table 2.

Summary of findings for online digital education as compared to face-to-face learning. Patient or population: postregistration medical doctors. Setting: universities, hospitals, and primary care. Intervention: online and local area network–based digital education. Comparison: face-to-face learning.

Outcome: Knowledge (assessed with multiple-choice questions; follow-up ranged from posttest to 18 months)
Participants (RCTs^a): 1202 (9)
Quality of evidence (GRADE^b): Very low^c,d,e,f
Direction of effects: Two studies [81,82] reported that ODE^g was significantly more effective than face-to-face learning in improving physicians' knowledge scores (very low certainty evidence). Six studies [83-88] found that ODE was as effective as face-to-face learning in improving knowledge scores (very low certainty evidence). One study [89] reported that face-to-face learning was significantly more effective than ODE.

Outcome: Skills (assessed with OSCE^h, diagnostic assessment, examination, questionnaires, and surveys; follow-up ranged from posttest to 12 months)
Participants (RCTs^a): 291 (7)
Quality of evidence (GRADE^b): Low^c,d,i
Direction of effects: Six studies [84,87,90-93] reported that ODE was as effective as face-to-face learning in improving physicians' skills (low certainty evidence). In one study [94], data were missing.

Outcome: Attitude (assessed with questionnaires; follow-up ranged from posttest to 18 months)
Participants (RCTs^a): 220 (2)
Quality of evidence (GRADE^b): Low^c,d
Direction of effects: Two studies [82,95] reported that ODE was as effective as face-to-face learning in improving physicians' attitude (low certainty evidence).

Outcome: Satisfaction (assessed with questionnaires; follow-up ranged from posttest to 12 weeks)
Participants (RCTs^a): 260 (4)
Quality of evidence (GRADE^b): Low^c,d
Direction of effects: Two studies [83,87] reported that ODE was significantly more effective than face-to-face learning in improving physicians' satisfaction (low certainty evidence). Two studies [81,84] reported that ODE was as effective as face-to-face learning in improving satisfaction (low certainty evidence).

^a RCT: randomized controlled trial.

^b GRADE: Grading of Recommendations, Assessment, Development and Evaluations.

^c Rated down by one level for study limitations: the risk of bias in most studies was judged unclear or high, largely due to a lack of reported information.

^d Rated down by one level for inconsistency: effect sizes varied widely (both very large and very small effects were observed).

^e Rated down by one level for publication bias: the effect estimates were asymmetrical, suggesting possible publication bias.

^f Very low quality (+ – – –): we have very little confidence in the effect estimate; the true effect is likely to be substantially different from the estimate of effect.

^g ODE: online and local area network–based digital education.

^h OSCE: objective structured clinical examination.

^i Low quality (+ + – –): our confidence in the effect estimate is limited; the true effect may be substantially different from the estimate of the effect.