Abstract
This study investigated the effectiveness of different feedback modalities in improving the knowledge, attitudes, and skills of medical students compared to students receiving no feedback or unstructured feedback. A systematic review and meta-analysis of randomized controlled trials was conducted based on a search of the Cochrane, ERIC, PubMed, Scopus, and Web of Science databases. A total of 26 studies were included in the systematic review and 13 in the meta-analysis. The meta-analysis revealed that the use of feedback was associated with better results than in control groups (SMD = 0.80 [0.56–1.04], p < 0.001), a finding that held when only high-quality studies were included (SMD = 0.86 [0.56–1.16], p < 0.001). Our findings revealed high heterogeneity in the use of feedback in medical education. Nevertheless, the results of most of the studies and of the meta-analysis were positive, showing that feedback had a positive influence on the students’ teaching–learning process. PROSPERO registration: CRD42018112688.
Keywords: Feedback, Medical students, Medical education, Systematic review, Meta-analysis
Introduction
Feedback is a term widely used in medical education since the 1980s [1] and can be defined as the “return of information on the outcome of a process or task” or as “a response able to return specific information to the trainee to accomplish a satisfactory standard of learning” [1–3]. The use of feedback has been associated with improved learning in medicine, generally favoring the acquisition of knowledge, skills, and attitudes [3–5].
For feedback to be effective and result in improved performance, several of its attributes must be considered: it should be focused and concise, applied during or at the end of the task, descriptive, contextualized, and constructive, and it should avoid information overload [6]. The way feedback is received can be affected by multiple factors, such as student maturity and self-awareness; the nature of the information provided; the preparation and environment in which the feedback is given; inherent characteristics of the teacher; communication conditions; and the learning environment and institutional culture [3, 7]. The use of feedback in the teaching–learning process is underpinned by different educational theories, such as behaviorism (i.e., learning is the acquisition of new behavior) and constructivism (i.e., learning is the search for meaning) [8], and also by three key psychosocial theories: sociocultural theory (i.e., the roles of the recipient, the provider, and the context in the process), politeness theory (i.e., the tension between an individual’s need to be appreciated by others and their need for freedom of action), and self-determination theory (i.e., students regulate behaviors autonomously, take on challenges, and learn through intrinsic rather than extrinsic motivation) [9].
The structuring and use of feedback in medical education remains challenging, and, in order to be successful, factors like the nature, structure, and timing of feedback should be considered [2]. In recent years, a body of evidence has pointed to the effectiveness of the use of feedback throughout medical training, where this feedback can be given by teachers [10], virtual patients [11], other students [12], physicians [13], simulators [14–16], and simulated patients [17, 18]. This tool has been most applied in teaching scenarios within skills laboratories and real-life environments [19].
Currently, there are other systematic reviews addressing the effectiveness of feedback for medical education in the scientific literature [3, 20–22]. However, some problems are identified in these reviews, such as including non-randomized controlled trials (e.g., quasi-experimental studies) and including heterogeneous samples in the same analysis (i.e., merging students, residents, and physicians altogether), thereby reducing the power of the evidence found.
Similarly, although various clinical trials indicate the effectiveness of feedback in medical education, to our knowledge, there are no systematic reviews or meta-analyses focused solely on medical students that include only randomized controlled trials. This is an important gap that this study aims to bridge. It is noteworthy that the culture of feedback in graduate and post-graduate medical education programs has become substantially more formally structured along milestones, probably due to the elaboration of entrustable professional activities for practice-contextualized workplace competency-based frameworks [23]. Therefore, investigating the effectiveness of feedback in undergraduate medical students is needed, since the formalization of structured feedback across undergraduate medical education programs remains much more heterogeneous.
Examining the results of the use of this teaching tool can help guide the application of the instrument in undergraduate medical education and help shed light on the characteristics of effective feedback for medical students and on the optimal timing for its application. The present review can contribute to the educational process and improve student learning by showing teachers and managers the possibilities and benefits of its use, favoring a culture of feedback in institutions.
Thus, the objective of the present study was to investigate the effectiveness of the use of different feedback modalities in improving the knowledge, attitudes, and skills of medical students compared to students receiving no feedback or unstructured feedback.
Methods
A systematic review and meta-analysis of randomized controlled trials was carried out. A qualitative analysis of the randomized controlled trials was performed according to the PRISMA recommendation (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) (see Supplementary material) [24]. The study was registered on the PROSPERO platform under registration CRD42018112688.
Eligibility Criteria
The following criteria were applied for inclusion of studies in the review: involved feedback to medical students, with no restrictions on course year; were randomized controlled trials; had a control group (i.e., students who received no feedback or unstructured feedback); and tested participants both before and after the intervention. Exclusion criteria were: studies involving feedback to residents or physicians; inclusion of students from other courses in the health area; non-randomized designs (e.g., quasi-experimental studies, observational studies, and other systematic reviews or meta-analyses); and articles not relevant to the topic of interest or that addressed biofeedback.
Search Strategy
The literature search was carried out on the Cochrane, ERIC, PubMed, Scopus, and Web of Science databases, and only articles reporting randomized controlled trials were included. Keywords were derived from meetings of the researchers and based on literature reviews. The keywords were grouped using the Boolean expression:
(feedback* OR feed back*) AND ("medical student*" OR "medical education" OR "medical teaching" OR "medical training") AND (trial OR controlled OR randomized)
This expression was adjusted for the respective database, and searches were performed in May 2018. There were no restrictions on the publication date or language of the articles.
Selection of Studies
In stage 1, two investigators, independently and concomitantly, commenced the search of articles on the 5 databases using the Boolean expression described above. All duplicate articles were excluded using Endnote software. The same two researchers then independently analyzed titles, abstracts, type of study, and publication to check whether they fulfilled the eligibility criteria.
In stage 2, the remaining articles were read in full, and only randomized controlled trials on feedback to medical students involving different scenarios, feedback types, and sources (teacher, manikin, peers, physicians, or device) were retained. Control groups comprised students who received no feedback or unstructured feedback (e.g., congratulations and/or praise or negative feedback). Lastly, a search for other relevant reviews using the references from the articles already included and Open Grey literature sources was carried out to check for other eligible trials.
In stage 3, the studies were categorized independently by two reviewers according to the following data: author, periodical, study type, objectives, content assessed and feedback type, way of providing feedback, follow-up, outcome assessed, and results compared to the control group. The descriptive stage also included a table describing the feedback concept adopted, the level of feedback according to Hattie and Timperley [6], the level of effectiveness of the feedback according to Kirkpatrick [25], the timing of reassessment after feedback, the type and source of feedback, the study setting, the area of education evaluated, whether a chance to learn was provided, and the country and year of the study. The quality of each study was also assessed at this stage, as outlined below.
Quality of the Studies
Two reviewers, specialists in the medical education area, independently assessed the quality of the studies included using the Cochrane Back Review Scale of Methodological Quality [26]. This tool allows the assessment of the randomization method, allocation concealment, blinding of patients, providers and assessors, drop-out rate, co-intervention avoided, intention-to-treat, timing of outcome assessment, and acceptable compliance. A third researcher was responsible for analyzing cases in the event of disagreement. Studies attaining a score > 6 were deemed “high quality.”
Meta-analysis
The software program RevMan 5.2 (Cochrane) was employed for the meta-analysis. The analysis encompassed articles that included a comparison of the use of different feedback modalities versus students receiving no feedback or unstructured feedback. Only studies reporting full data on mean, standard deviation, and sample size for each group were included.
Likewise, although some studies assessed several different outcomes, in this meta-analysis, we decided to include only the most important (i.e., primary) outcomes as described in the objectives of each article. For studies that had more than one intervention group (e.g., verbal feedback in one group and written feedback in another), more than one outcome group (e.g., skills and examination skills), or that assessed different timepoints (e.g., immediate outcome or outcome after 2 months), analysis was carried out separately for each of these items, where studies were labeled with a letter in parentheses (e.g., b) for each of these comparisons.
Effect size was based on the mean, standard deviation, and sample size of the intervention and control groups for each comparison. For the meta-analyses that compiled different scales, the effect size was calculated by the standardized mean difference (SMD) with its 95% confidence interval (CI). This approach enabled the inclusion of different outcome measures in the same synthesis.
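In practice, the SMD reported by RevMan is Hedges’ adjusted g: the difference in group means divided by the pooled standard deviation, with a small-sample correction. The following is a minimal sketch of that computation; the function name and the numbers are purely illustrative and are not data from any included study.

```python
import math

def smd_with_ci(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Hedges' adjusted g) with an
    approximate 95% CI, following the standard Cochrane/RevMan formulas."""
    # Pooled standard deviation across intervention and control groups
    sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                    # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)      # small-sample correction factor
    g = j * d                            # Hedges' adjusted g
    # Approximate standard error of g, then a normal-theory 95% CI
    se = math.sqrt((n1 + n2) / (n1 * n2) + g ** 2 / (2 * (n1 + n2)))
    return g, (g - 1.96 * se, g + 1.96 * se)

# Hypothetical feedback vs. control comparison on a 0-100 skills checklist
g, ci = smd_with_ci(m1=78.0, sd1=10.0, n1=30, m2=70.0, sd2=12.0, n2=30)
print(f"SMD = {g:.2f} [{ci[0]:.2f}-{ci[1]:.2f}]")
```

Each comparison’s g is computed this way before inverse-variance weighting, which is what allows outcomes measured on different scales to enter the same synthesis.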
Some scales, such as those measuring “force reduction” or “number of trials to reach proficiency,” are scored inversely (i.e., their values decrease as outcomes improve); such scales were adjusted by multiplying by − 1 [27].
The meta-analysis was conducted for all studies that had full data, while sub-analyses were performed for high-quality studies and were separated according to skill type (surgery, clinical, basic life support, and communication), outcome timing (immediate or not immediate), student groups (junior or senior medical students), and feedback type (electronic or non-electronic).
A p-value < 0.05 was adopted as significant, and heterogeneity was determined using the I2 statistic.
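I2 expresses the percentage of total variation across comparisons attributable to heterogeneity rather than chance, and is derived from Cochran’s Q. Below is a hedged sketch of Q, I2, and the DerSimonian–Laird random-effects pooling that typically accompanies them; the function name and input values are illustrative only, not the review’s data.

```python
import math

def random_effects_pool(effects, ses):
    """Cochran's Q, I², and a DerSimonian-Laird random-effects pooled
    estimate with 95% CI (a textbook sketch, not RevMan itself)."""
    w = [1 / se ** 2 for se in ses]                      # inverse-variance weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # I² as a percentage
    # Between-study variance tau², then re-weight and pool
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0
    w_re = [1 / (se ** 2 + tau2) for se in ses]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se_p = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se_p, pooled + 1.96 * se_p), i2

# Three hypothetical SMDs with their standard errors
pooled, ci, i2 = random_effects_pool([0.5, 1.2, 0.8], [0.2, 0.25, 0.3])
print(f"SMD = {pooled:.2f} [{ci[0]:.2f}-{ci[1]:.2f}], I2 = {i2:.0f}%")
```

An I2 above roughly 75% is conventionally read as considerable heterogeneity, which motivates both the random-effects model and the sub-analyses reported below.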
Results
The flow diagram of the article selection process is depicted in Fig. 1. The initial search yielded 3596 studies, of which 820 were excluded as duplicates across the databases. In stage 1, of the remaining 2776 articles, a further 2647 were eliminated (most commonly for involving other health professionals, such as resident physicians, physicians, and post-graduate students, or for using a methodology other than a randomized controlled trial). In stage 2, a total of 129 articles were read in full. Of these, 103 were subsequently excluded for employing non-randomized methodologies, involving other populations, being theses or abstracts, or comparing types of feedback between groups. This gave a final total of 26 articles for inclusion in the systematic review [11, 13–15, 18, 19, 28–47]. Of these 26 articles, 13 contained all the requisite data and were included in the meta-analysis [11, 13, 15, 28, 31, 33, 38–43, 45].
Fig. 1.
PRISMA flow diagram
Description of the Studies
The 26 studies included in the systematic review are given in Table 1, together with a general description, listing author, number of participants and year of medical school course, type of feedback provided, reassessment period, skill investigated, and results.
Table 1.
General description of articles in systematic review (n = 26)
| Author (Year) | N | Year of course | Feedback type | Control group | Skill assessed | Reassessment period | Results |
|---|---|---|---|---|---|---|---|
| Ahlborg et al. [14] | 16 | 5th year | Verbal, individualized, and simulator | No feedback received | Surgery | Immediate | Positive |
| Ahn et al. [28] | 40 | N/A | Visual and tactile | No feedback received | Basic life support | Immediate | Positive |
| Beckers et al. [29] | 218 | 1st year | Verbal | No feedback received | Basic life support | Short term | Positive |
| Bjerrum et al. [30] | 91 | 4th, 5th and 6th years | Verbal and simulator | No instructor for feedback | Surgery | Long term | No difference |
| Blake et al. [31] | 52 | 4th year | Verbal | No feedback received | Communication | Long term | Positive |
| Boehler et al. [32] | 33 | 2nd and 3rd years | Verbal | No feedback received | Surgery | Immediate | Positive |
| Denadai et al. [13] | 16 | 2nd year | Verbal | No feedback received | Surgery | Immediate | Positive |
| Díez et al. [33] | 43 | 2nd year | Verbal and manikin | Only instruction received | Basic life support | Immediate | Positive |
| Foster et al. [11] | 70 | 1st year | Visual | No feedback received | Communication | Immediate | Positive |
| Garner et al. [34] | 33 | 3rd year | Verbal | No feedback received | Clinical | Immediate | Positive |
| Judkins et al. [35] | 12 | N/A | Visual | No feedback received | Surgery | Short term | Positive |
| Judkins et al. [36] | 30 | 1st and 2nd years | Visual | No feedback received | Surgery | Short term | Positive |
| Kannappan et al. [37] | 25 | 1st year | Positive verbal feedback | Negative verbal feedback | Surgery | Immediate | No difference |
| Li et al. [38] | 40 | 3rd year | Verbal and Visual | No feedback received | Basic life support | Immediate | Positive |
| Li et al. [39] | 330 | 3rd year | Verbal and Visual | No feedback received | Basic life support | Long term | Positive |
| Mohammed et al. [15] | 69 | 1st and 2nd years | Verbal, audible alarm, visual stimulus | No feedback received | Surgery | Immediate | Positive |
| Park et al. [40] | 36 | Final year | Verbal and written | No feedback received | Clinical | Long term | Positive |
| Pavo et al. [18] | 326 | 3rd year | Audiovisual parameter, verbal | No feedback received | Basic life support | Short term | Positive |
| Rodrigues et al. [41] | 72 | 1st to 6th years | Visual | No feedback received | Surgery | Medium term | Positive |
| Scheidt et al. [42] | 105 | 3rd year | Verbal and written | No feedback received | Communication | Immediate | Positive |
| Schmidt et al. [43] | 142 | 4th year | Verbal, by video | No feedback received | Communication | Medium term | Positive |
| Sox et al. [19] | 870 | Clerkship | Verbal | No feedback received | Communication | Short term | Positive |
| Sultan et al. [44] | 30 | 4th year | Recorded (video) | No feedback received | Surgery | Immediate | Positive |
| van de Ridder et al. [45] | 74 | 1st year | Positive verbal feedback | Negative verbal feedback | Clinical | Short term | Positive |
| Walsh et al. [46] | 55 | Final year | Recorded (video) | No feedback received | Clinical | Short term | Positive |
| Xeroulis et al. [47] | 60 | 1st year | Recorded (video) and verbal | No feedback received | Surgery | Short term | Positive |
N/A not available
The detailed characteristics and results for each study are given in Table 2. In terms of general characteristics of the trials, 19 studies (73.1%) were published from 2011 onwards, while only three were published in 2000 or earlier. The country with the greatest research output in the area was the USA (38.5%), followed by Canada, China, and South Korea.
Table 2.
Characteristics of randomized trials on feedback
| n = 26 | n | % |
|---|---|---|
| Country | ||
| USA | 10 | 38.5 |
| China | 2 | 7.7 |
| Canada | 2 | 7.7 |
| South Korea | 2 | 7.7 |
| Others | 10 | 38.5 |
| Publication period | ||
| Up to 1990 | 1 | 3.8 |
| 1991–2000 | 2 | 7.7 |
| 2001–2010 | 4 | 15.4 |
| 2011–2018 | 19 | 73.1 |
| Population | ||
| Students from 1st year | 5 | 19.2 |
| Students from 2nd year | 2 | 7.7 |
| Students from 3rd year | 5 | 19.2 |
| Students from 4th year | 3 | 11.5 |
| Students from 5th year | 1 | 3.8 |
| Students from 6th year | 2 | 7.7 |
| Students from mixed periods | 6 | 23.1 |
| No information available | 2 | 7.7 |
| Sample size | ||
| ≤ 25 | 4 | 15.4 |
| 26–50 | 8 | 30.8 |
| 51–75 | 7 | 26.9 |
| > 75 | 7 | 26.9 |
| Area of education assessed | ||
| Communication | 5 | 19.2 |
| Surgical skills | 11 | 42.3 |
| Basic life support (BLS) | 6 | 23.1 |
| Clinical skills (physical exam, procedures, anamnesis) | 4 | 15.4 |
| Results | ||
| Better than control groups | 24 | 92.3 |
| Similar to control groups | 2 | 7.7 |
| Feedback type | ||
| Verbal | 9 | 34.6 |
| Verbal and written | 2 | 7.7 |
| Verbal and visual (video) | 3 | 11.5 |
| Verbal and simulator parameter | 4 | 15.4 |
| Visual | 4 | 15.4 |
| Visual and simulator parameter | 2 | 7.7 |
| Video | 2 | 7.7 |
| Feedback concept | ||
| Jack Ende (1983) | 6 | 23.1 |
| Monica Van de Ridder (2008) | 2 | 7.7 |
| No concept adopted in study | 18 | 69.2 |
| Feedback level according to Hattie and Timperley | ||
| Task | 14 | 53.8 |
| Process | 6 | 23.1 |
| Task and process | 6 | 23.1 |
| Timing of reassessment after feedback | ||
| Immediate | 12 | 46.1 |
| Short term | 8 | 30.8 |
| Medium term | 2 | 7.7 |
| Long term | 4 | 15.4 |
Of the 26 studies included, 24 (92%) revealed significantly greater learning of the students given feedback compared to controls, while 2 studies (8%) showed similar performance of experimental and control groups. The two studies in question investigated skills for performing surgical procedures, one with immediate reassessment and the other long term.
Concerning the feedback concept underpinning the application of the tool, only 8 (30.8%) studies drew on the concepts of Van de Ridder [3] (i.e., “Specific information on comparing a student’s observed performance and a proposed standard with the intention of improving”) and Ende [1] (i.e., “information that describes the performance of students or of an activity that is intended to guide their future performance in that same or a related activity”). Regarding the levels of self-regulation and self, as defined by Hattie and Timperley [6] (i.e., “the student is able to self-monitor, that is, self-evaluate and carry out the direction and regulation of learning actions”), none of the studies attained these levels of feedback.
In over half (53.8%) of the studies, feedback was given for a specific task, i.e., after practicing a skill. For 71.4% of the studies, the reassessment period (the interval between pre- and post-tests) was shorter than 4 weeks, while only 4 (15.4%) involved periods longer than 90 days.
With regard to the populations participating in the studies, only 6 studies (23.1%) involved senior medical students. In terms of sample size, 7 (26.9%) studies had more than 75 students, 7 (26.9%) involved between 51 and 75 students, 8 (30.8%) had between 26 and 50 students, whereas only 4 (15.4%) included 25 or fewer students.
The skills trained and investigated were split into 4 main groups:
Surgical skills: 11 (42.3%) of the randomized clinical trials pertained to surgical skills. Of these, 5 (45.5%) involved the manipulation of laparoscopy devices and 6 (54.5%) suture procedures. The feedback ranged from verbal or visual only to verbal plus another resource, such as simulator parameters, auditory alarms, visual cues, or written feedback. Reassessment was performed in the medium or long term in only 2 (18.2%) of the articles, while the effect of feedback was assessed immediately after training in most of the studies (54.5%).
Communication skills: 5 of the 26 (19.2%) articles assessed the students’ performance in communication skills. In these articles, feedback was given by the instructor with the aid of a manikin, virtual patient, or simulated patient. Regarding assessment timing, two studies reassessed students immediately, one in the short term, one in the medium term, and one in the long term. Feedback had a positive effect on medical student performance in all 5 studies.
Basic life support (BLS) skills: 6 (23%) studies assessed basic life support (BLS) skills. These focused mainly on compressions and ventilations. The feedback was given solely by the instructor in one article, while 4 articles involved feedback from the instructor and manikin. The sixth and final article involved visual and tactile feedback. The adoption of different forms of feedback mechanisms did not affect the positive outcomes of the studies in enhancing student learning.
Clinical skills: 4 (15.4%) studies focused on clinical skills taught at different stages of medical training, such as physical examination and technical procedures. In most of these studies, the assessment of these skills was immediate and resulted in greater learning for the students.
Quality: Risk of Bias in Individual Studies
The methodological quality of the articles, together with the final scores, are presented in the Supplementary Material. None of the articles attained a score of over 9 points. However, 14 (53.8%) scored over 6 points, indicating high methodological quality.
Meta-analysis
The meta-analysis revealed that the use of feedback promoted better results in experimental groups than in control groups (SMD = 0.80 [0.56–1.04], p < 0.001, I2 = 84%) across all studies (13 studies included, with 37 comparisons) (Fig. 2). The same result persisted when only the 7 high-quality studies (17 comparisons) were included (SMD = 0.86 [0.56–1.16], p < 0.001, I2 = 80%) or just the 6 low-quality studies (SMD = 0.76 [0.40–1.12], p < 0.001, I2 = 86%).
Fig. 2.
Meta-analysis of experimental (feedback) groups versus control groups for all studies
The sub-analyses (Table 3) revealed that results were better in feedback groups than in control groups for surgical skills (SMD = 0.53 [0.26–0.79], p < 0.001, I2 = 54%), clinical skills (SMD = 1.46 [0.69–2.24], p < 0.001, I2 = 89%), communication (SMD = 0.67 [0.10–1.24], p < 0.001, I2 = 90%), and basic life support (SMD = 0.90 [0.46–1.34], p < 0.001, I2 = 84%). Similarly, outcomes associated with the use of feedback were significant for immediate assessments (SMD = 1.16 [0.79–1.54], p < 0.001, I2 = 87%), non-immediate assessments (SMD = 0.40 [0.20–0.61], p = 0.001, I2 = 60%), and also for both junior students (SMD = 0.84 [0.53–1.15], p < 0.001, I2 = 86%) and senior students (SMD = 0.20 [0.12–0.29], p < 0.001, I2 = 96%). Feedback also significantly improved learning whether delivered by electronic (SMD = 0.63 [0.26–0.99], p < 0.001, I2 = 83%) or non-electronic (SMD = 0.92 [0.61–1.23], p < 0.001, I2 = 85%) means.
Table 3.
Meta-analysis of sub-analyses of feedback group versus control group
| | Studies (n) | Comparisons (n) | Feedback (n) | Control (n) | SMD | 95% CI | p | I2 |
|---|---|---|---|---|---|---|---|---|
| All studies | 13 | 37 | 1090 | 967 | 0.80 | 0.56–1.04 | < 0.001 | 84% |
| Study quality | ||||||||
| Low-quality studies | 6 | 20 | 554 | 471 | 0.76 | 0.40–1.12 | < 0.001 | 86% |
| High-quality studies | 7 | 17 | 536 | 496 | 0.86 | 0.56–1.16 | < 0.001 | 80% |
| Skills | ||||||||
| Surgical skills | 3 | 12 | 310 | 242 | 0.53 | 0.26–0.79 | < 0.001 | 54% |
| Clinical skills | 2 | 7 | 189 | 154 | 1.46 | 0.69–2.24 | < 0.001 | 89% |
| Communication | 4 | 8 | 302 | 255 | 0.67 | 0.10–1.24 | < 0.001 | 90% |
| Basic life support | 4 | 10 | 289 | 290 | 0.90 | 0.46–1.34 | < 0.001 | 84% |
| Assessments | ||||||||
| Immediate | 9 | 21 | 583 | 502 | 1.16 | 0.79–1.54 | < 0.001 | 87% |
| Non-immediate | 5 | 16 | 507 | 465 | 0.40 | 0.20–0.61 | 0.001 | 60% |
| Medical students | ||||||||
| Junior | 8 | 23 | 729 | 622 | 0.84 | 0.53–1.15 | < 0.001 | 86% |
| Senior | 5 | 14 | 361 | 345 | 0.20 | 0.12–0.29 | < 0.001 | 96% |
| Feedback type | ||||||||
| Electronic | 6 | 15 | 436 | 372 | 0.63 | 0.26–0.99 | < 0.001 | 83% |
| Non-electronic | 7 | 22 | 654 | 595 | 0.92 | 0.61–1.23 | < 0.001 | 85% |
Discussion
Our findings revealed the diverse use of feedback tools in medical education. The results were heterogeneous in terms of individual response, types of feedback adopted, scenarios included, study quality, and underlying theory. However, the results found by most of the trials and the meta-analysis were largely positive, suggesting that appropriate feedback can improve students’ learning. The results of the sub-analyses proved most robust for clinical skills, immediate outcomes, non-electronic feedback, and junior medical students.
The positive impact of feedback found in this study corroborates the findings of previous systematic reviews [20, 22, 48]. Veloski et al. [48] conducted a search for articles spanning the period from 1966 to 2003 and found that most studies reported a positive impact of feedback on physicians’ performance. Hatala et al. [22] conducted a meta-analysis on the use of feedback in simulation and found feedback to be moderately effective in procedural skills training for students, residents, and physicians. Bing-You [20] conducted a scoping review and found 650 articles addressing feedback published between 1980 and 2015, showing that feedback improved the performance of students, residents, and physicians.
These reviews yield important information for the literature, but incorporated studies with different designs and/or aimed at different populations, including physicians and residents [48]. The present study sought to provide a novel perspective on the effectiveness of feedback in medical education by including only randomized controlled trials and focusing on undergraduate students. In addition, the information was statistically compiled using meta-analysis, providing more robust results on the effectiveness of introducing feedback to students.
The effectiveness of feedback in promoting learning depends on a number of important factors, including the student receiving the feedback, its type, and the level of difficulty of the task [7]. In our study, surgical skills had lower effect sizes than clinical skills and basic life support. This finding can be explained by the fact that the clinical skills tested in the studies included in our meta-analysis involved single physical examination tests (e.g., the neurological exam and the Rinne and Weber tests), which are specific and tend to involve fewer procedural tasks. On the other hand, the surgical skills included in this review (e.g., suturing, laparoscopy, and surgery) involved more tasks and more planning and assessed more cognitive and psychomotor domains [49].
In fact, the complexity of the task may influence the quality and effectiveness of the feedback. According to van de Ridder et al. [7], tasks of higher complexity show lower agreement among feedback providers’ ratings because they are difficult to observe and comprise a number of sub-tasks that are difficult to assess. In the same direction, a previous meta-analysis by Kluger and DeNisi [50] found that feedback interventions on complex tasks yielded weaker effect sizes than those on simpler tasks. According to the authors, feedback facilitates performance on simple, familiar, and objective tasks, while hindering it on complex and subjective ones. This could be explained by the fact that, in demanding complex tasks, the information assimilated from each attempt is more difficult to apply constructively in subsequent efforts [51].
With regard to the timing of outcome reassessment after the use of feedback, the majority of the studies carried out immediate assessment of outcomes. This finding was noted in a previous systematic review [22] in which most of the studies assessed outcomes immediately, and positive results were observed in the short term. In the present review, although only a few studies with long-term assessment were found, our results showed a greater effect of feedback on immediate performance compared with delayed. Thus, the effectiveness of feedback seems to be time-dependent, i.e., the sooner the student is reassessed, the better the performance [8, 11, 44]. An explanation for this finding is the decline in information retention which takes place over time in students [52]. For this reason, continuous, multi-source feedback is more recommended during undergraduate training [53].
Throughout their courses, students are exposed to a high load of information and practical training in both simulated environments and real-life scenarios, necessary for developing the multiple skills required for professional practice. Drawing on cognitive load theory, this continuous, progressive demand during training may explain one of the intriguing results of this study: senior students exhibited a smaller effect of feedback than junior students. This difference may be due to the need to process multiple items of information, with higher cognitive overload and consequently lower performance [20]. Another factor to bear in mind is that low initial skill can be associated with greater effects of feedback. One hypothesis is that junior students are less likely to have experienced a given skill beforehand and therefore will not have attained the minimum base level of competence, with a consequently more marked improvement in performance [7]. Another hypothesis is lower expectations on the part of evaluators, who may be more lenient with junior students.
In this study, several factors were found to influence the effects of feedback, and these need to be taken into account when applying feedback in medical education. The reception of the feedback, its frequency, and the way it is communicated, besides its form and content, are all elements that can influence its effect on learning [7, 21]. Feedback is more readily accepted and has a more positive impact when it comes from a reliable source [48]. This is exemplified in the present study by the greater effect of non-electronic feedback, which can be explained by the direct contact between feedback provider and recipient. This contact allows the provider to observe the student’s receptiveness through the verbal and non-verbal communication that takes place in this relationship.
Corroborating the results of most of the studies reviewed, van de Ridder et al. [7] highlighted that, to elicit better responses, feedback should be informative, non-judgmental, and constructive, besides being objective, specific, and documented, and should support learning based on goals to be accomplished [7, 53]. Instructors should know in advance what level of feedback they intend to deliver and be deliberate in their delivery. The present results reveal that most of the studies provided feedback at the task level, i.e., students were informed about their performance upon carrying out a task. This is the most common type and is often called corrective feedback [6]. Other evidence shows that this method is essential for the teaching of skills, allowing students to improve their performance. This review corroborates other systematic reviews performed over recent years in finding that most studies of feedback are conducted without grounding in a specific construct, besides showing great heterogeneity in the way feedback is given [2, 7, 20, 22, 48].
Taken together, the findings of the present and previous reviews show that feedback should be used properly, by trained individuals, and preferably in a continuous, longitudinal manner throughout the whole educational process, as part of an institutional culture [9, 54]. Feedback is inherent to formative assessment and should promote reflection by the student, be descriptive in nature, and be linked to performance [54]. For feedback to be effective, self-regulation and interaction should be promoted, and the student's autonomy, self-control, and need for orientation/guidance respected [6]. These findings can support educators, administrators, and institutions in their goal of incorporating a culture of feedback into their curricula.
This review has some limitations which should be taken into consideration. First, there may be publications not indexed in the databases searched and, although the grey literature was also searched, some relevant articles may not have been found. Second, although the Boolean expression was broad, some terms may not have been used in all articles. Third, it was decided to include studies which reported using feedback even when the feedback construct had not been clearly defined. Fourth, we decided to separate feedback characteristics, generating several sub-analyses. This division was necessarily somewhat arbitrary and was based on a review of the literature, separating factors that may influence feedback effectiveness, such as task subject, feedback timing, students' skill level (novice or advanced), and feedback source [7]. Thus, these findings should be interpreted with caution. Despite that, we believe these sub-analyses have important implications for educators and clinicians, who may thereby better understand the specificity of the feedback intervention. Finally, some articles lacked the data needed to perform a meta-analysis (i.e., mean, standard deviation, and sample size) and were therefore included only in the systematic review. Notwithstanding, the present review has several noteworthy strengths, such as the inclusion of only randomized controlled trials (articles with a high level of evidence) and of studies targeting a homogeneous population (i.e., medical students).
Conclusion
In conclusion, the randomized controlled trials selected for this study showed the effectiveness of feedback for the acquisition of knowledge and for the teaching–learning process of medical students. The plethora of feedback types, sources, and settings, together with the heterogeneity of the populations investigated, reveals the need for further studies to elucidate the effectiveness of this tool and its appropriate use in medical education.
Abbreviations
- BLS
Basic life support
- CI
95% Confidence interval
- ERIC
Education Resources Information Center
- I2
Percentage of variation across studies that is due to heterogeneity rather than chance
- PRISMA
Preferred Reporting Items for Systematic Reviews and Meta-analysis
- PROSPERO
The International Prospective Register of Systematic Reviews
- RevMan
Review Manager
- SMD
Standardized mean difference
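For readers less familiar with the two summary statistics abbreviated above, their conventional definitions are sketched below (the Cohen's d form of the SMD and the Higgins–Thompson I² formula; shown as a standard reference, not as a description of this article's specific computations, which were performed in RevMan):

```latex
\mathrm{SMD} \;=\; \frac{\bar{x}_{E} - \bar{x}_{C}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} \;=\; \sqrt{\frac{(n_{E}-1)\,s_{E}^{2} + (n_{C}-1)\,s_{C}^{2}}{n_{E}+n_{C}-2}},
\qquad
I^{2} \;=\; \max\!\left(0,\; \frac{Q - \mathrm{df}}{Q}\right) \times 100\%
```

Here \(\bar{x}_{E}, \bar{x}_{C}\) are the intervention and control group means, \(s_{E}, s_{C}\) and \(n_{E}, n_{C}\) the corresponding standard deviations and sample sizes, and \(Q\) is Cochran's heterogeneity statistic with \(\mathrm{df}\) degrees of freedom (number of studies minus one).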
Author Contribution
MABC made substantial contributions to the conception and design of the work; the acquisition, analysis, and interpretation of data; and has drafted the work. RLMA contributed to the acquisition, analysis, and interpretation of data and has substantively revised it. ALGL made substantial contributions to the conception and design of the work; analysis and interpretation of data; and has substantively revised it. SHCT made substantial contributions to the conception of the work; interpretation of data; and has substantively revised it. OSE made substantial contributions to the conception and design of the work; analysis and interpretation of data; and has substantively revised it. GL made substantial contributions to the conception and design of the work; the acquisition, analysis, and interpretation of data; and has drafted the work. All authors have read and approved the manuscript.
Availability of Data and Materials
The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.
Declarations
Ethics Approval and Consent to Participate
Not applicable. This is a systematic review.
Consent for Publication
Not applicable.
Competing Interests
The authors declare no competing interests.
Footnotes
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1.Ende J. Feedback in clinical medical education. JAMA. 1983;250(6):777–781. doi: 10.1001/jama.1983.03340060055026.
- 2.Archer JC. State of the science in health professional education: effective feedback. Med Educ. 2010;44(1):101–108. doi: 10.1111/j.1365-2923.2009.03546.x.
- 3.van de Ridder JM, Stokking KM, McGaghie WC, ten Cate OT. What is feedback in clinical education? Med Educ. 2008;42(2):189–197. doi: 10.1111/j.1365-2923.2007.02973.x.
- 4.Lai MMY, Roberts N, Mohebbi M, Martin J. A randomised controlled trial of feedback to improve patient satisfaction and consultation skills in medical students. BMC Med Educ. 2020;20(1):277. doi: 10.1186/s12909-020-02171-9.
- 5.Bastos ECMA, Lucchetti ALG, Tibiriçá SHC, da Silva EO, Lucchetti G. Use of feedback on medium-term blood pressure measurement skills in medical students: a randomized controlled trial. Blood Press Monit. 2020;25(3):147–154. doi: 10.1097/mbp.0000000000000433.
- 6.Hattie J, Timperley H. The power of feedback. Rev Educ Res. 2007;77(1):81–112. doi: 10.3102/003465430298487.
- 7.van de Ridder JM, McGaghie WC, Stokking KM, ten Cate OT. Variables that affect the process and outcome of feedback, relevant for medical training: a meta-review. Med Educ. 2015;49(7):658–673. doi: 10.1111/medu.12744.
- 8.Khalil MK, Elkhider IA. Applying learning theories and instructional design models for effective instruction. Adv Physiol Educ. 2016;40(2):147–156. doi: 10.1152/advan.00138.2015.
- 9.Ramani S, Konings KD, Ginsburg S, van der Vleuten CP. Feedback redefined: principles and practice. J Gen Intern Med. 2019;34(5):744–749. doi: 10.1007/s11606-019-04874-2.
- 10.Al-Jundi W, Elsharif M, Anderson M, Chan P, Beard J, Nawaz S. A randomized controlled trial to compare e-feedback versus "standard" face-to-face verbal feedback to improve the acquisition of procedural skill. J Surg Educ. 2017;74(3):390–397. doi: 10.1016/j.jsurg.2016.11.011.
- 11.Foster A, Chaudhary N, Kim T, Waller JL, Wong J, Borish M, et al. Using virtual patients to teach empathy: a randomized controlled study to enhance medical students' empathic communication. Simul Healthc. 2016;11(3):181–189. doi: 10.1097/sih.0000000000000142.
- 12.Lean LL, Hong RYS, Ti LK. End-task versus in-task feedback to increase procedural learning retention during spinal anaesthesia training of novices. Adv Health Sci Educ Theory Pract. 2017;22(3):713–721. doi: 10.1007/s10459-016-9703-8.
- 13.Denadai R, Saad-Hossne R, Oshiiwa M, Bastos EM. Training on synthetic ethylene-vinyl acetate bench model allows novice medical students to acquire suture skills. Acta Cir Bras. 2012;27(3):271–278. doi: 10.1590/S0102-86502012000300012.
- 14.Ahlborg L, Weurlander M, Hedman L, Nisel H, Lindqvist PG, Fellander-Tsai L, et al. Individualized feedback during simulated laparoscopic training: a mixed methods study. Int J Med Educ. 2015;6:93–100. doi: 10.5116/ijme.55a2.218b.
- 15.Al Fayyadh MJ, Hassan RA, Tran ZK, Kempenich JW, Bunegin L, Dent DL, et al. Immediate auditory feedback is superior to other types of feedback for basic surgical skills acquisition. J Surg Educ. 2017;74(6):e55–e61. doi: 10.1016/j.jsurg.2017.08.005.
- 16.Oestergaard J, Bjerrum F, Maagaard M, Winkel P, Larsen CR, Ringsted C, et al. Instructor feedback versus no instructor feedback on performance in a laparoscopic virtual reality simulator: a randomized educational trial. BMC Med Educ. 2012;12:7. doi: 10.1186/1472-6920-12-7.
- 17.Macdonald M, MacCuspie J, Mann K, Blake K. Improving medical student's confidence regarding adolescent interviewing. Pediat Therapeut. 2014;4(4):218.
- 18.Pavo N, Goliasch G, Nierscher FJ, Stumpf D, Haugk M, Breckwoldt J, et al. Short structured feedback training is equivalent to a mechanical feedback device in two-rescuer BLS: a randomised simulation study. Scand J Trauma Resusc Emerg Med. 2016;24:70. doi: 10.1186/s13049-016-0265-9.
- 19.Sox CM, Tenney-Soeiro R, Lewin LO, Ronan J, Brown M, King M, et al. Efficacy of a web-based oral case presentation instruction module: multicenter randomized controlled trial. Acad Pediatr. 2018;18(5):535–541. doi: 10.1016/j.acap.2017.12.010.
- 20.Bing-You R, Hayes V, Varaklis K, Trowbridge R, Kemp H, McKelvy D. Feedback for learners in medical education: what is known? A scoping review. Acad Med. 2017;92(9):1346–1354. doi: 10.1097/acm.0000000000001578.
- 21.Kornegay JG, Kraut A, Manthey D, Omron R, Caretta-Weyer H, Kuhn G, et al. Feedback in medical education: a critical appraisal. AEM Educ Train. 2017;1(2):98–109. doi: 10.1002/aet2.10024.
- 22.Hatala R, Cook DA, Zendejas B, Hamstra SJ, Brydges R. Feedback for simulation-based procedural skills training: a meta-analysis and critical narrative synthesis. Adv Health Sci Educ Theory Pract. 2014;19(2):251–272. doi: 10.1007/s10459-013-9462-8.
- 23.ten Cate O. An updated primer on entrustable professional activities (EPAs). Rev Bras Educ Med. 2020;43:712–720.
- 24.Moher D, Shamseer L, Clarke M, Ghersi D, Liberati A, Petticrew M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4:1. doi: 10.1186/2046-4053-4-1.
- 25.Johnston S, Coyer FM, Nash R. Kirkpatrick's evaluation of simulation and debriefing in health care education: a systematic review. J Nurs Educ. 2018;57(7):393–398. doi: 10.3928/01484834-20180618-03.
- 26.Berger VW, Alperson SY. A general framework for the evaluation of clinical trial quality. Rev Recent Clin Trials. 2009;4(2):79–88. doi: 10.2174/157488709788186021.
- 27.Cochrane Database Syst Rev. 2016;4(4):CD012106.
- 28.Ahn C, Lee J, Oh J, Song Y, Chee Y, Lim TH, et al. Effectiveness of feedback with a smartwatch for high-quality chest compressions during adult cardiac arrest: a randomized controlled simulation study. PLoS ONE. 2017;12(4):e0169046. doi: 10.1371/journal.pone.0169046.
- 29.Beckers SK, Biermann H, Sopka S, Skorning M, Brokmann JC, Heussen N, et al. Influence of pre-course assessment using an emotionally activating stimulus with feedback: a pilot study in teaching Basic Life Support. Resuscitation. 2012;83(2):219–226. doi: 10.1016/j.resuscitation.2011.08.024.
- 30.Bjerrum F, Maagaard M, Led Sorensen J, Rifbjerg Larsen C, Ringsted C, Winkel P, et al. Effect of instructor feedback on skills retention after laparoscopic simulator training: follow-up of a randomized trial. J Surg Educ. 2015;72(1):53–60. doi: 10.1016/j.jsurg.2014.06.013.
- 31.Blake K, Mann KV, Kaufman DM, Kappelman M. Learning adolescent psychosocial interviewing using simulated patients. Acad Med. 2000;75(10 Suppl):S56–S58. doi: 10.1097/00001888-200010001-00018.
- 32.Boehler ML, Rogers DA, Schwind CJ, Mayforth R, Quin J, Williams RG, et al. An investigation of medical student reactions to feedback: a randomised controlled trial. Med Educ. 2006;40(8):746–749. doi: 10.1111/j.1365-2929.2006.02503.x.
- 33.Díez N, Rodríguez-Díez MC, Nagore D, Fernández S, Ferrer M, Beunza JJ. A randomized trial of cardiopulmonary resuscitation training for medical students: voice advisory mannequin compared to guidance provided by an instructor. Simul Healthc. 2013;8(4):234–241. doi: 10.1097/SIH.0b013e31828e7196.
- 34.Garner MS, Gusberg RJ, Kim AW. The positive effect of immediate feedback on medical student education during the surgical clerkship. J Surg Educ. 2014;71(3):391–397. doi: 10.1016/j.jsurg.2013.10.009.
- 35.Judkins TN, Oleynikov D, Stergiou N. Real-time augmented feedback benefits robotic laparoscopic training. Stud Health Technol Inform. 2006;119:243–248.
- 36.Judkins TN, Oleynikov D, Stergiou N. Enhanced robotic surgical training using augmented visual feedback. Surg Innov. 2008;15(1):59–68. doi: 10.1177/1553350608315953.
- 37.Kannappan A, Yip DT, Lodhia NA, Morton J, Lau JN. The effect of positive and negative verbal feedback on surgical skills performance and motivation. J Surg Educ. 2012;69(6):798–801. doi: 10.1016/j.jsurg.2012.05.012.
- 38.Li Q, Ma EL, Liu J, Fang LQ, Xia T. Pre-training evaluation and feedback improve medical students' skills in basic life support. Med Teach. 2011;33(10):e549–e555. doi: 10.3109/0142159x.2011.600360.
- 39.Li Q, Zhou RH, Liu J, Lin J, Ma EL, Liang P, et al. Pre-training evaluation and feedback improved skills retention of basic life support in medical students. Resuscitation. 2013;84(9):1274–1278. doi: 10.1016/j.resuscitation.2013.04.017.
- 40.Park JH, Son JY, Kim S, May W. Effect of feedback from standardized patients on medical students' performance and perceptions of the neurological examination. Med Teach. 2011;33(12):1005–1010. doi: 10.3109/0142159x.2011.588735.
- 41.Rodrigues SP, Horeman T, Sam P, Dankelman J, van den Dobbelsteen JJ, Jansen FW. Influence of visual force feedback on tissue handling in minimally invasive surgery. Br J Surg. 2014;101(13):1766–1773. doi: 10.1002/bjs.9669.
- 42.Scheidt PC, Lazoritz S, Ebbeling WL, Figelman AR, Moessner HF, Singer JE. Evaluation of system providing feedback to students on videotaped patient encounters. J Med Educ. 1986;61(7):585–590. doi: 10.1097/00001888-198607000-00006.
- 43.Schmidt M, Freund Y, Alves M, Monsel A, Labbe V, Darnal E, et al. Video-based feedback of oral clinical presentations reduces the anxiety of ICU medical students: a multicentre, prospective, randomized study. BMC Med Educ. 2014;14:103. doi: 10.1186/1472-6920-14-103.
- 44.Farjad Sultan S, Iohom G, Shorten G. Effect of feedback content on novices' learning ultrasound guided interventional procedures. Minerva Anestesiol. 2013;79(11):1269–1280.
- 45.van de Ridder JM, Peters CM, Stokking KM, de Ru JA, ten Cate OT. Framing of feedback impacts student's satisfaction, self-efficacy and performance. Adv Health Sci Educ Theory Pract. 2015;20(3):803–816. doi: 10.1007/s10459-014-9567-8.
- 46.Walsh RA, Sanson-Fisher RW, Low A, Roche AM. Teaching medical students alcohol intervention skills: results of a controlled trial. Med Educ. 1999;33(8):559–565. doi: 10.1046/j.1365-2923.1999.00378.x.
- 47.Xeroulis GJ, Park J, Moulton CA, Reznick RK, Leblanc V, Dubrowski A. Teaching suturing and knot-tying skills to medical students: a randomized controlled study comparing computer-based video instruction and (concurrent and summary) expert feedback. Surgery. 2007;141(4):442–449. doi: 10.1016/j.surg.2006.09.012.
- 48.Veloski J, Boex JR, Grasberger MJ, Evans A, Wolfson DB. Systematic review of the literature on assessment, feedback and physicians' clinical performance: BEME Guide No. 7. Med Teach. 2006;28(2):117–128. doi: 10.1080/01421590600622665.
- 49.Costa G, Rocha HAL, Moura Júnior LG, Medeiros FDC. Taxonomy of educational objectives and learning theories in the training of laparoscopic surgical techniques in a simulation environment. Rev Col Bras Cir. 2018;45(5):e1954. doi: 10.1590/0100-6991e-20181954.
- 50.Kluger AN, DeNisi A. Feedback interventions: toward the understanding of a double-edged sword. Curr Dir Psychol Sci. 1998;7(3):67–72. doi: 10.1111/1467-8721.ep10772989.
- 51.Mahmood T, Darzi A. The learning curve for a colonoscopy simulator in the absence of any feedback: no feedback, no learning. Surg Endosc. 2004;18(8):1224–1230. doi: 10.1007/s00464-003-9143-4.
- 52.Custers E. Long-term retention of basic science knowledge: a review study. Adv Health Sci Educ Theory Pract. 2010;15(1):109–128. doi: 10.1007/s10459-008-9101-y.
- 53.Ramani S, Krackov SK. Twelve tips for giving feedback effectively in the clinical environment. Med Teach. 2012;34(10):787–791. doi: 10.3109/0142159x.2012.684916.
- 54.Jug R, Jiang XS, Bean SM. Giving and receiving effective feedback: a review article and how-to guide. Arch Pathol Lab Med. 2019;143(2):244–250. doi: 10.5858/arpa.2018-0058-RA.