2017 Nov 2;3(2):e21. doi: 10.2196/mededu.8100

Table 2.

Overview of errors that can be detected by the tool in case the learner has submitted a final diagnosis that differs from that of the expert.

Type of error: Premature closure (accepting a diagnosis before it has been fully confirmed)
Detection: The learner submits a final diagnosis at an early stage, after which the expert has added findings or tests connected to that final diagnosis.
Data required: Findings and tests of the learner and the expert (including stage); connections to the expert's final diagnosis; submission stage.

Type of error: Availability bias (what has been seen recently is more likely to be diagnosed again later)
Detection: The learner has worked on or accessed a virtual patient with a related final diagnosis (one Medical Subject Headings hierarchy level up or down) within the last 5 days.
Data required: Previously created concept maps (date of last access and final diagnoses).

Type of error: Confirmation bias (the tendency to look only for evidence that confirms a diagnosis)
Detection: The learner has not added disconfirming findings, or has not added "speaks against" connections between a disconfirming finding and the final diagnosis.
Data required: Findings of the learner and the expert; connections between findings and differential diagnoses.

Type of error: Representativeness (focusing on the prototypical features of a disease)
Detection: The learner has connected nonprototypical findings to the correct final diagnosis as "speaks against" findings.
Data required: Findings of the learner and the expert; nonprototypical findings (additional information in the expert map).

Type of error: Base rate neglect (ignoring the true prevalence of a disease)
Detection: A rare final diagnosis has been submitted instead of the more prevalent correct final diagnosis.
Data required: Differential diagnoses of the learner and the expert; prevalence of diagnoses (additional information in the expert map).
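The detection rules above can be read as simple predicates over the learner's and the expert's concept-map data. The following is a hypothetical sketch of two of them (premature closure, simplified to a stage comparison, and base rate neglect); all class, field, and function names are assumptions for illustration, not the tool's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class ConceptMap:
    """Minimal stand-in for a learner's or expert's concept map (assumed structure)."""
    final_diagnosis: str
    findings: set                 # findings added to the map
    submission_stage: int         # stage at which the final diagnosis was submitted
    prevalence: dict = field(default_factory=dict)  # diagnosis -> prevalence (expert map only)

def premature_closure(learner: ConceptMap, expert: ConceptMap) -> bool:
    """Learner submitted a different diagnosis at an earlier stage than the expert
    (simplification: the table also checks for expert findings/tests added after
    the learner's submission that connect to the final diagnosis)."""
    return (learner.final_diagnosis != expert.final_diagnosis
            and learner.submission_stage < expert.submission_stage)

def base_rate_neglect(learner: ConceptMap, expert: ConceptMap) -> bool:
    """A rarer final diagnosis was submitted instead of the more prevalent correct one."""
    if learner.final_diagnosis == expert.final_diagnosis:
        return False
    submitted = expert.prevalence.get(learner.final_diagnosis, 0.0)
    correct = expert.prevalence.get(expert.final_diagnosis, 0.0)
    return submitted < correct
```

The remaining rules (availability, confirmation bias, representativeness) would follow the same pattern, comparing the learner's findings, connections, and previously accessed maps against the additional information stored in the expert map.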