Journal of Graduate Medical Education. 2018 Dec;10(6):676–677. doi:10.4300/JGME-D-18-00817.1

Should Trainees Get Paid to Submit Patient Safety Reports?

Rebecca L. Volpe,1 Steve Mrozowski,1 Michael J. Green1

In the December 2018 issue of the Journal of Graduate Medical Education, Turner and colleagues1 describe a financial incentive program aimed at increasing the number of patient safety reports submitted by residents and fellows. The researchers found that implementing this program, which included a new online reporting system, an educational initiative, and financial incentives, resulted in an increase in trainee reports of safety events. Before the incentive, less than 0.5% of submitted safety reports came from trainees; after the new program was implemented, 7% originated from this group. At first glance, these outcomes are tantalizing, yet they raise questions that warrant further scrutiny.

First, it may be necessary to question the underlying premise of the study—that more reporting by residents and fellows improves the quality of patient care. Turner and colleagues1 acknowledge that this might not be the case, noting that “an increase in safety event reporting by itself will not achieve the desired impact on patient safety unless reporting is paired with robust feedback and demonstrable changes in practice.” We believe this is a critical issue, but the authors do not include data showing a link between reporting and quality. There is evidence that in US states where reporting of adverse events is mandatory, reports submitted by individuals (as contrasted with automated reporting) have improved safety and reduced harm at the institutional level after root cause analysis.2 However, this positive outcome was found in the context of (1) voluntary, unincentivized reporting by individuals, and (2) a root cause analysis conducted in response to the report. We wonder whether voluntary reporting is comparable to incentivized reporting, and whether root cause analyses were conducted after the incentivized safety reports were received.

It is also conceivable that increasing the number of reports by trainees may not result in patient safety improvements because the events trainees report may already have been reported by someone else. Turner and colleagues1 indicate that in 2014–2015, trainees submitted 1288 reports. It would be interesting to know what percentage of these reports described events not previously submitted by another health professional. If the bulk of the trainee reports concern events already reported by others, it is unclear what value duplicate reports hold.

We also wonder whether the $200 per year an individual trainee can receive for meeting the patient safety report metric may create an incentive to overreport or to submit trivial reports. Is it known whether all trainee-reported events warranted reporting, or whether duplicate reports were entered to increase the numbers? More detail in this area would help assuage concerns that trainee reports could be motivated by factors other than the goal of improving patient care. A follow-up study might assess trainees' ability to define and describe which event types and scenarios are reportable under the institution's standards.

Finally, we want to note a study interpretation issue: correlation is not causation. The authors do acknowledge this in their limitations section, but it does not stop them from drawing causal conclusions such as “an incentive program . . . can substantially increase GME reporting of patient safety events.” We do not believe the methodology supports such a conclusion. Most critical is that the study bundled several interventions. Rates of safety event reporting increased after residents and fellows completed required educational training, after a link to the reporting site was added to a resident web page, and after trainees were told about the incentive program. Any of these factors could have contributed to the rise in reporting rates. It is also possible that the institutional culture shifted during the study period to become more accepting of open discussion and reporting of safety events—which also could have affected the number of reports. Although it is plausible that the incentive program played an important role in increasing reporting rates, with so many variables at play we do not know whether, or to what extent, this was the case.

The reported cost of the intervention was $197,000 for a single year, which is substantial. Providing this level of financial support in the form of incentives to residents and fellows makes sense only if there is evidence that patient care improves when trainees report patient safety events. Since a significant barrier to voluntary reporting is the absence of feedback to the reporter, organizations should consider whether they would be better served by investing in robust feedback mechanisms that relay the outcomes of safety event reports rather than by incentivizing event reporting itself.

References

1. Turner DA, Bae J, Cheely G, et al. Improving resident and fellow engagement in patient safety through a graduate medical education incentive program. J Grad Med Educ. 2018;10(6):671–675. doi:10.4300/JGME-D-18-00281.1
2. Steen S, Jaeger C, Price L, et al. Increasing patient safety event reporting in an emergency medicine residency. BMJ Qual Improv Rep. 2017;6(1):u223876.w5716. doi:10.1136/bmjquality.u223876.w5716

