Abstract
Real-time feedback is a growing trend in patient- and family experience (PFE) work because it allows for immediate service recovery, though it typically requires a significant investment of time and financial resources. We describe a partnership with our “edutainment” system to administer an automated daily experience question (the “Daily Pulse Measure” [DPM]) that allowed targeted just-in-time responses to low scores with minimal administrative cost. Through a series of Plan-Do-Study-Act cycles guided by family feedback, the question was created and modified, and its use spread to all hospital units. The response rate was 23%, similar to our Hospital Consumer Assessment of Healthcare Providers and Systems survey response rate of 24% during the study period. Though the DPM did not have a consistent impact on the results of the 2 PFE survey questions we evaluated, units with improved PFE scores after the DPM roll-out tended to have more robust service recovery than units whose scores declined.
Keywords: patient feedback, technology, leadership rounding, pediatrics, survey data
Key Points
Real-time feedback is a useful tool in patient- and family experience reporting as it provides an opportunity for in-the-moment service recovery.
Partnering with current inpatient information technology is a cost-effective way to solicit real-time feedback.
Existing technology can be leveraged to administer survey questions and then provide immediate, automated notification of a report of poor experience to key stakeholders.
Introduction
The Hospital Consumer Assessment of Healthcare Providers and Systems survey tool (HCAHPS) is the most commonly used survey tool for reporting patient experience, though it is significantly limited in the actionability of the data collected.1–5 Depending on the mode of administration, survey responses can be received 48 hours to 3 months after hospital discharge. This inconsistency and delay make service recovery at the level of the individual patient impossible. To overcome this, many institutions use alternate technology to obtain real-time feedback, mostly in the form of surveys on kiosks placed in waiting rooms or on hand-held tablets.3,6,7 Response rates to these surveys are higher when staff remind patients to complete them or when surveys are verbally facilitated,6–8 but this requires an investment of staff time to administer surveys. Technological limitations (internet connectivity, equipment function, and kiosk physical placement) also affect survey completion.6 Finally, cost is a barrier to implementation, especially the purchase of tablets or kiosks.7 Benefits of real-time feedback include obtaining more detailed complaint data, the ability to act on service problems early, decreased recall and nonresponse biases, and showing patients that their views matter.2–4,9,10
Description of the Intervention
In 2015, our children's hospital identified patient- and family experience (PFE) as a 5-year strategic plan priority while also targeting cost-effectiveness. We aimed to use our already-established “edutainment” system, the GetWellNetwork™ (GWN), to obtain real-time feedback in the form of a “Daily Pulse Measure” (DPM) question. The GWN was available for use with interactive remote controls on televisions in every patient room. Because the GWN was the primary hub for inpatient technology-based entertainment, we capitalized on patient and family engagement with it, creating a pop-up box that displayed the DPM question on the screen with a prompt to answer before normal programming resumed. We used Plan-Do-Study-Act (PDSA) cycles11 to optimize the process on pilot units before implementing it hospital-wide in July 2016. In 2017, we retrospectively analyzed the data we obtained to better understand the impact the DPM had on select PFE scores for hospital acute care units.
The DPM was based on The Idealized Design of Clinical Office Practices model,12 an Institute for Healthcare Improvement initiative to redesign clinical practice in the themes of access, interaction, reliability, and vitality such that families would say, “They give me exactly the help I want (and need) exactly when I want (and need) it.” We altered the statement to be more specific: “Today, this hospital gave us exactly the care we wanted and needed.” Integration of the DPM question pop-up box and data collection capability into GWN occurred through collaboration between the GWN vendor and an internal hospital GWN Steering Committee. The plan for the DPM was presented by hospital PFE leadership to our Family Advisory Council in June 2015, with feedback incorporated into the first pilot. PDSAs were completed throughout project development and refinement and are detailed in Figure 1.
Figure 1.
PDSA cycles in the creation of the DPM.
Abbreviations: PDSA, Plan-Do-Study-Act; DPM, Daily Pulse Measure.
The first iteration of the DPM question was piloted on 3 medical units in December 2015. We used a star-rating scale to leverage familiar societal rating norms. For the DPM pop-up box to show on the screen (a “launch”), the television in the room needed to be on and a patient needed to be assigned to the room. The box remained on the television screen either until a response was selected or for 20 minutes in the case of no response. When a family gave a negative response (a response of “never” or 2 stars), they were prompted with 5 additional questions to clarify how their needs were not being met.
Negative responses were summarized and sent in real-time via email to unit leaders. Unit leaders were encouraged to use the alerts for real-time service recovery but were not required to do so; there was no mandated or standardized recommendation for response to negative feedback. Both positive and negative responses were captured by the GWN system and were available immediately for review if desired and in a monthly summary to unit leaders.
Early structured interviews with staff and families revealed that the design of the pop-up box prompted false-negative responses because the cursor defaulted to “never” when the statement was displayed. A patient or family member quickly clicking to remove the box, or accidentally clicking while using other GWN features, would thus inadvertently select “never.” In January 2016, the option “Remind Me Later” was moved to this default position. Interviews also revealed that families who had been admitted only a few hours before the DPM launch preferred the option of answering the question later in the day, so a third launch was added: if “Remind Me Later” was selected, GWN would send the question again later, up to 3 times daily.
Findings from the 5-unit pilot phase were shared with the Family Advisory Council in April 2016. Recommendations for improvement included: (1) remove the third launch to avoid too many interruptions to entertainment, (2) move the second launch later in the evening (8:00 pm) to catch more parents and families in their rooms, (3) switch from the star scale to a “smiley face” scale to align with hospital pain rating scales, (4) decrease the DPM question frequency to every other day on units with longer hospital stays, and (5) provide a place for comments.
In May 2016, all units involved in pilot testing employed the first 4 recommendations. House-wide testing of the improved DPM occurred in June 2016 with subsequent formal data collection. Units with an average length of stay <7 days launched the DPM twice daily, while those with average stays ≥7 days launched it 3 times weekly. We also made changes to the custom data collection algorithms in GWN as issues were identified. For example, GWN initially used the number of launches per day as the denominator to calculate the response rate. Over our test period, the team recognized that, because a single patient could receive multiple launches per day, this understated the response rate; new rules were created such that GWN more appropriately used the denominator of “patients in a bed at 2:00 pm” for automated response rate calculations.
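As an illustration only (the computation below is ours, not GWN's, and the record format is hypothetical), a small sketch shows why the choice of denominator matters: counting every launch understates the response rate whenever a patient receives more than one launch per day, whereas a census-based denominator counts each patient once.

```python
# Hypothetical sketch of the two response-rate denominators (not GWN code).
# Each record: (patient_id, launches_today, responded_today)
day_log = [
    ("pt1", 2, True),   # launched twice ("Remind Me Later"), answered once
    ("pt2", 2, False),  # launched twice, never answered
    ("pt3", 1, True),   # launched once, answered
]

def rate_by_launches(log):
    """Initial rule: responses / total launches (understates engagement)."""
    responses = sum(1 for _, _, responded in log if responded)
    launches = sum(n for _, n, _ in log)
    return responses / launches

def rate_by_census(log):
    """Revised rule: responding patients / patients in a bed at 2:00 pm."""
    responses = sum(1 for _, _, responded in log if responded)
    return responses / len(log)

print(rate_by_launches(day_log))  # 2 responses / 5 launches = 0.4
print(rate_by_census(day_log))    # 2 responses / 3 patients ≈ 0.67
```

With the same day of data, the launch-based rule reports 40% while the census-based rule reports 67%, which is the direction of correction described above.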
Evidence of Impact
During our study period (July 1, 2016–December 31, 2017), the average response rate to our HCAHPS survey was 24.2%. Across all units over the study period, a total of 25,856 patients received DPM launches. The total response rate for all units was 43.1% (41,026 responses, 25,856 patients). Of the total responses, 23.0% (21,856 responses, 11,781 patients) were scaled numerical responses (strongly disagree–strongly agree, or 1–5), with the remainder being “Remind Me Later.”
Finally, we attempted to link the use of the DPM to changes in patient experience as measured by our hospital PFE survey. Examining the relationship between DPM initiation and all survey questions was deemed too onerous, so we chose 2 questions to analyze. The first was the question we felt was most likely to show improvement, a custom hospital survey question: “During this hospital stay, do you think your child got all the care he/she needed?” The second was the HCAHPS global care question: “Using any number from 0 to 10 where 0 is the worst hospital possible and 10 is the best hospital possible, what number would you use to rate this hospital during your child's stay?” We used segmented regression13 to compare data from the 2 years prior to implementation (August 2014–July 2016) to data from our study period.
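For readers unfamiliar with the method, a minimal sketch of segmented (interrupted time series) regression follows. The monthly scores, intervention month, and effect sizes are synthetic and purely illustrative, not the authors' data or analysis; the model estimates a level change and a trend change at the intervention point.

```python
import numpy as np

# Hypothetical example: 48 months of unit PFE scores, with the
# intervention (DPM roll-out) at month 24.
# Model: score = b0 + b1*month + b2*post + b3*months_since_intervention
months = np.arange(48, dtype=float)
post = (months >= 24).astype(float)             # 1 after roll-out
since = np.where(post == 1, months - 24, 0.0)   # months since roll-out

# Synthetic scores: baseline level 80, flat pre-trend, then a +3 level
# jump and a +0.5/month slope change (noise-free for clarity).
scores = 80.0 + 0.0 * months + 3.0 * post + 0.5 * since

# Ordinary least squares via the design matrix.
X = np.column_stack([np.ones_like(months), months, post, since])
coef, *_ = np.linalg.lstsq(X, scores, rcond=None)
b0, b1, b2, b3 = coef
print(f"level change: {b2:.2f}, trend change: {b3:.2f}")
```

The fitted `b2` and `b3` recover the simulated level and slope changes; with real survey data, their confidence intervals indicate whether scores shifted after the intervention beyond the pre-existing trend.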
The majority of our inpatient acute care units showed no change in scores for these 2 questions after DPM initiation. One unit showed an improved score on our custom survey question and 1 unit showed a worse score. Scores for the HCAHPS global care question improved on 1 unit and worsened on 2 units.
Post-hoc interviews with nursing leadership on the 2 units showing improved scores after DPM initiation revealed that these units had invested significant time into responding to negative responses, rounding on families who provided negative feedback within a few hours of receiving the feedback, and following up until the issue could be entirely resolved. On the 3 units that showed decreased scores, unit leadership either did not receive negative responses or did not have a process in place to respond when notified of one.
Discussion
We were able to successfully partner with GWN to create and integrate a mechanism for real-time feedback into our preexisting patient room media service. While the process was initially time-intensive, it required little maintenance by the end of the study period. Although our 23% rate of graded responses was similar to our HCAHPS response rate over the same period, it was potentially more valuable because real-time feedback allowed the opportunity for immediate service recovery. Though the DPM had no impact on experience survey results for the 2 questions we evaluated, we noted that the few units with improved PFE scores after the DPM roll-out actively addressed the concerns raised, whereas units with decreased scores had no such plan. We lacked a standardized approach to negative DPM feedback, making it difficult to interpret which parts of the intervention were most helpful. This study supports the growing body of evidence that PFE data need to be acted upon to be impactful.2,9,14
Acknowledgments
We thank Terri Byczkowski, MBA, PhD for her guidance in study design, Anne Boat, MD, for her conceptualization and actualization of the Daily Pulse Measure process, Jesse Hawke, PhD for his assistance with statistical analysis, and GetWellNetwork™ for their partnership in creating the DPM.
Footnotes
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Ethical Approval: Our institution does not require ethical approval for reporting individual cases or case series.
Statement of Informed Consent: Informed consent for patient information to be published in this article was not obtained because consent to use the reported data was implied by family completion of the survey.
Funding: The authors received no financial support for the research, authorship, and/or publication of this article.
ORCID iD: Lisa M Remer https://orcid.org/0000-0001-6760-9679
References
- 1. DiRocco DN, Day SC. Obtaining patient feedback at point of service using electronic kiosks. Am J Manag Care. 2011;17(7):e270-e276.
- 2. Russell S. Patients’ experiences: top heavy with research. 2013. Accessed July 5, 2018. https://www.research-matters.com.au/publications/PatientsExperiencesReview.pdf
- 3. Wofford JL, Campos CL, Jones RE, Stevens SF. Real-time patient survey data during routine clinical activities for rapid-cycle quality improvement. JMIR Med Inform. 2015;3(1):e13. DOI: 10.2196/medinform.3697
- 4. Zakare-Fagbamila RT, Howell E, Choi AY, et al. Clinic satisfaction tool improves communication and provides real-time feedback. Neurosurgery. 2019;84(4):908-918. DOI: 10.1093/neuros/nyy137
- 5. The HCAHPS survey-frequently asked questions. Accessed July 6, 2018. https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Downloads/HospitalHCAHPSFactSheet201007.pdf
- 6. Kasbauer S, Cooper R, Kelly L, King J. Barriers and facilitators of a near real-time feedback approach for measuring patient experiences of hospital care. Health Policy Technol. 2017;6(1):51-58. DOI: 10.1016/j.hlpt.2016.09.003
- 7. Wright C, Davey A, Elmore N, et al. Patients’ use and views of real-time feedback technology in general practice. Health Expect. 2017;20(3):419-433. DOI: 10.1111/hex.12469
- 8. DiRocco D, Day S. Obtaining patient feedback at point of service using electronic kiosks. Am J Manag Care. 2011;17(7):e270-e276.
- 9. Graham C, Kasbauer S, Cooper R, et al. An evaluation of a near real-time survey for improving patients’ experiences of the relational aspects of care: a mixed-methods evaluation. Health Serv Deliv Res. 2018;6(15).
- 10. Larsen D, Peters H, Keast J, Devon R. Using real time patient feedback to introduce safety changes. Nurs Manage. 2011;18(6):27-31.
- 11. Langley GL, Moen R, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. 2nd ed. Jossey-Bass Publishers; 2009.
- 12. Idealized design of clinical office practices. Institute for Healthcare Improvement. Accessed November 15, 2017. http://www.ihi.org/Engage/Initiatives/Completed/IDCOP/Pages/default.aspx
- 13. Wong EC, Chen P, Hung D. Analyzing phased interventions with segmented regression and stepped wedge designs. lexjansen.com; 2014. Accessed May 4, 2018. https://www.lexjansen.com/wuss/2014/74_Final_Paper_PDF.pdf
- 14. Coulter A, Locock L, Ziebland S, Calabrese J. Collecting data on patient experience is not enough: they must be used to improve care. Br Med J. 2014;348:g2225. DOI: 10.1136/bmj.g2225

