Abstract
Objective:
Most US hospitals conduct patient experience surveys by mail or telephone after discharge to assess patient/family centeredness of care. Pediatric response rates are usually very low, especially for black, Latino, and low-income respondents. We investigated whether day of discharge surveying using tablets improves response rates and respondent representativeness.
Methods:
This was a quasi-experimental study of parents of patients discharged from 4 units of a children’s hospital. Parents were assigned to receive the Child Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) Survey via an audio-enabled tablet before discharge or via mail at approximately 1 week postdischarge. Intervention and control conditions alternated by week. We compared response rates, child/respondent characteristics, and mean top-box scores between the tablet and mail-only arms.
Results:
Administering Child HCAHPS on a tablet was administratively feasible and did not interfere with the discharge process (median completion time, 12.4 minutes). The response rate was 71.1% (424 of 596) for tablet versus 16.3% (96 of 588) for mail only. Although the tablet response rate was higher in every subgroup, tablet respondents were more likely to be fathers (20.4% vs 6.4%; P = .006), more likely to have a high school education or less (17.5% vs 8.4%; P = .002), less likely to be white (56.8% vs 71.9%; P = .006), and more likely to be publicly insured (31.4% vs 19.8%; P = .02). Tablet scores were significantly higher than mail-only scores for 3 of 17 measures.
Conclusions:
The response rate for day of discharge tablet survey administration was >4-fold higher than with single-wave mail-only administration, with greater participation of hard-to-reach groups. These findings suggest that tablet administration before discharge shows great promise for real-time feedback and quality improvement (QI) and may transform the field of inpatient survey administration.
Keywords: patient experience, quality improvement, survey administration
Patient and family experiences are critical components of health care quality, with positive experiences associated with improved health outcomes and reduced disparities.1–6 Nearly all hospitals in the United States survey patients or families about their experience of inpatient care, typically by mail or telephone after discharge. Survey results are often used to identify quality improvement (QI) targets, and interventions aimed at improving patient experience have increased patient experience scores.7,8 Patient/family experience measures are being increasingly used to measure performance across sites. For instance, the Adult Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) Survey is included as a measure in the Centers for Medicare & Medicaid Services’ Hospital Value-Based Purchasing Program and is reported on Medicare’s Hospital Compare website.
Adult HCAHPS is administered nationally to measure experience in adult inpatient settings, while the recently developed Child HCAHPS Survey is being used by a growing number of hospitals to measure family experience of pediatric inpatient care. Although the Child HCAHPS implementation guidelines recommend standardized administration procedures (ie, 2 waves of mail, or telephone, or mail with telephone follow-up [mixed mode]),9 currently most hospitals use a single-wave mail protocol for pediatric surveys. Partly for this reason, survey response rates in pediatrics are generally low (~15%–25%), especially for black and Latino caregivers or those with lower socioeconomic status or low literacy,10–12 leaving the experiences and needs of many patients and families unheard. Low response rates for experience surveys make it challenging for many hospitals to obtain the minimum number of completed surveys needed to achieve target reliability for all items (ie, 300 completed surveys per hospital) and to compare performance across hospitals and within their own hospital over time.9 Low response rates also make it hard for hospitals to reach the larger sample sizes needed to compare results by service line (eg, medicine, surgery), unit, or patient characteristics; this deficiency makes it challenging for hospitals to identify areas in particular need of improvement and to be able to demonstrate change in response to local interventions. As a result, hospitals are interested in alternative patient/family experience survey strategies that will increase response rates and representativeness of respondents.
Among alternative strategies for survey administration, the use of handheld tablets shows promise. In health care settings, tablet surveys are administered at the point and time of service. Tablets are easy to use and integrate well into various clinical and nonclinical settings for administering surveys on topics such as general health screening and patient-reported quality of life.13,14 A particular appeal is that tablet surveys can significantly increase response rates.13–16 In addition, they can incorporate audio computer-assisted self-interview (ACASI) software, which reads questions aloud to the respondent and facilitates completion by those with low literacy.17 How these strategies could be applied to inpatient experience of care surveys has not been extensively examined. Use of tablets changes the interface by which the respondent completes the survey (ie, on a screen vs on paper), the location (ie, in person in the hospital vs at home), and the timing (ie, on the day of discharge vs after discharge). These differences have the potential to increase response rates and the representativeness of respondents, but they also might change how respondents answer the questions.
We compared administration of Child HCAHPS using 2 methods—providing the survey on tablets with ACASI software18 on the day of discharge and providing it by mail postdischarge—to determine the effects on response rate, representativeness of respondents, and measurement scores.
Methods
Setting and Participants
The study was conducted at a tertiary care children’s hospital on 2 medical and 2 surgical units. Eligible respondents were parents/guardians (hereinafter, parents) of patients age <18 years whose hospitalization included at least 1 overnight stay. Parents were excluded if they could not complete the survey in English or Spanish or if they met other standard exclusion criteria.19
Child HCAHPS Survey
The Center of Excellence for Pediatric Quality Measurement at Boston Children’s Hospital recently developed Child HCAHPS to measure family experience of pediatric inpatient care.19 The 62-item survey has 18 composite and single-item measures. The tablet instrument used 51 items (constituting 17 measures). The remaining 11 items (constituting 1 measure) that covered the discharge process (eg, did providers ask if parents had concerns about their child’s readiness to leave?) were administered after discharge using a nontablet protocol. Survey scores were case mix adjusted, using both parent (age, relationship to child, education, preferred language) and child (age, parent-reported global health status) characteristics. Additional covariates obtained from administrative data included child race/ethnicity, child insurance status, and visit characteristics (length of stay and service).
Intervention and Study Design
In this quasi-experimental study, patients were assigned to the novel tablet arm (intervention group) or the standard mail arm (control group) based on the unit from which the patient was discharged, with intervention and control conditions alternating by week. Each week (Monday to Friday), 1 medical unit and 1 surgical unit were assigned to the tablet arm, and 1 medical unit and 1 surgical unit were assigned to the mail arm; the units alternated between the 2 arms weekly. Because assignment depended on the discharge unit and study week rather than on the day of the week, this design removed any possible bias owing to associations between survey variables and the day of the week of discharge. Assignment was effectively driven by the natural variation in the day of discharge, which was determined by the clinical team and was not influenced in any manner by study staff. Parents were told that they could choose not to answer questions or to end their participation at any time.
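For illustration only, the following is a minimal sketch of this alternating week-by-unit assignment scheme; the unit labels, the anchor Monday, and the pairing of units are hypothetical placeholders rather than the study’s actual units or calendar.

```python
from datetime import date

# Hypothetical unit labels: the study used 2 medical and 2 surgical units.
MEDICAL_UNITS = ["med_A", "med_B"]
SURGICAL_UNITS = ["surg_A", "surg_B"]

# Hypothetical anchor Monday used to count study weeks.
STUDY_START = date(2015, 3, 30)

def assigned_arm(unit: str, discharge_date: date) -> str:
    """Return 'tablet' or 'mail', alternating the unit pairs weekly.

    In even-numbered study weeks, med_A and surg_A are in the tablet arm and
    med_B and surg_B are in the mail arm; odd-numbered weeks swap the pairs.
    """
    week_index = (discharge_date - STUDY_START).days // 7
    in_first_pair = unit in (MEDICAL_UNITS[0], SURGICAL_UNITS[0])
    if week_index % 2 == 0:
        return "tablet" if in_first_pair else "mail"
    return "mail" if in_first_pair else "tablet"

# Example: a discharge from med_A during the second study week falls in the mail arm.
print(assigned_arm("med_A", date(2015, 4, 8)))  # -> mail
```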
In the tablet arm, we administered Child HCAHPS on the day of discharge using a handheld tablet computer preloaded with ACASI software in English and Spanish. Questions and response options could be read on the screen and heard through earphones; participants could answer questions without waiting for the audio to finish if they desired. The software incorporated automatic skips following branching questions. At the end of the survey, we asked about the tablet’s ease of use.
Once the clinical team determined that a patient was going to be discharged that day, administrative staff were notified. Hospital or study staff approached all eligible parents on the day of discharge using a standardized introduction script, brought tablets (with disposable headphones) attached to wheeled stands into patient rooms, and later returned to collect tablets. Tablets were sanitized after each use. Approximately 2 days after discharge, staff administered the 11 discharge items to parents first by e-mail, then by mail, and then by telephone using a standardized protocol (when contact information was available).
For the mail arm, the hospital’s existing survey vendor used its standard protocol, mailing the survey to the home once, approximately 1 week after discharge. Parents mailed completed surveys to the survey vendor.
Statistical Analysis
We compared response rates and characteristics of respondents between the 2 arms using the chi-square test. Within each arm, we compared respondents and nonrespondents using multivariate logistic regression models. Overall P values for the respondent and nonrespondent multivariate logistic regression models are from Wald tests of the global null hypothesis with all of the foregoing characteristics included. For these multivariate models only, nonwhite race/ethnicity was parameterized as yes, no, or missing. We also report response rates for the discharge items.
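As a concrete illustration of these comparisons (the study analyses themselves were conducted in SAS version 9.4), a minimal sketch in Python follows; the chi-square counts come from the reported response rates, while the simulated data frame, variable names, and use of scipy/statsmodels are illustrative assumptions rather than the study code.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
import statsmodels.api as sm

# Chi-square test comparing response rates between arms, using the counts
# reported in the Results (tablet: 424 of 596; mail: 96 of 588).
counts = np.array([[424, 596 - 424],
                   [96, 588 - 96]])
chi2_stat, p_value, dof, _ = chi2_contingency(counts)
print(f"chi-square = {chi2_stat:.1f}, df = {dof}, P = {p_value:.3g}")

# Within-arm comparison of respondents vs nonrespondents via multivariate
# logistic regression; `arm_df` is a simulated stand-in for one arm's
# patient-level data, with a 0/1 response indicator and administrative covariates.
rng = np.random.default_rng(0)
arm_df = pd.DataFrame({
    "responded": rng.integers(0, 2, 500),
    "female": rng.integers(0, 2, 500),
    "public_insurance": rng.integers(0, 2, 500),
    "medical_service": rng.integers(0, 2, 500),
})
X = sm.add_constant(arm_df[["female", "public_insurance", "medical_service"]])
fit = sm.Logit(arm_df["responded"], X).fit(disp=False)

# Global Wald test of the null hypothesis that all covariate coefficients are
# zero (analogous to the overall P values reported for each arm).
covariates = [name for name in X.columns if name != "const"]
wald = fit.wald_test(", ".join(f"{name} = 0" for name in covariates), scalar=True)
print(f"Wald chi-square = {wald.statistic:.2f}, P = {wald.pvalue:.3f}")
```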
We compared responses to the Child HCAHPS measures in the 2 arms, adjusting for the Child HCAHPS case mix adjustment variables. Top-box item scores were defined as the percentage of respondents selecting the most positive response option, such as “always,” “yes, definitely,” or a rating of 9 or 10 on the 0-to-10 overall hospital rating scale. Scores for the 17 measures were defined as the mean of the case mix–adjusted hospital-level scores of their component items. For these comparisons, data were weighted to account for differences in the distributions of characteristics of tablet and mail respondents. These weights were generated from a multivariate logistic regression model for response by tablet (vs by mail) that included the case mix adjustors, race/ethnicity, and public insurance status.
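The scoring and weighting steps might look like the following sketch; the simulated data, column names, and the specific weighting formula (weighting tablet respondents by the odds of being a mail respondent) are assumptions for illustration and not necessarily the study’s or the survey vendor’s exact algorithm.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated respondent-level data standing in for the analytic file: survey
# mode, one example item, and two adjustors standing in for the full case mix
# adjustor set (all names are illustrative).
rng = np.random.default_rng(1)
n = 520
resp = pd.DataFrame({
    "tablet": rng.integers(0, 2, n),          # 1 = tablet respondent, 0 = mail
    "item_response": rng.integers(1, 5, n),   # 1-4 coding of never..always
    "parent_education": rng.integers(1, 7, n),
    "public_insurance": rng.integers(0, 2, n),
})

# Top-box indicator: 1 if the most positive option ("always", coded 4) was chosen.
resp["top_box"] = (resp["item_response"] == 4).astype(int)

# Logistic regression for response by tablet (vs mail) given respondent
# characteristics, used to derive the mode weights.
X = sm.add_constant(resp[["parent_education", "public_insurance"]])
p_tablet = sm.Logit(resp["tablet"], X).fit(disp=False).predict(X)

# One propensity-style choice (an assumption, not necessarily the study's exact
# scheme): weight tablet respondents by the odds of being a mail respondent so
# that their weighted mix resembles the mail respondents'; mail weights stay 1.
resp["weight"] = np.where(resp["tablet"] == 1, (1 - p_tablet) / p_tablet, 1.0)

# Weighted top-box score (%) for each arm.
for arm, grp in resp.groupby("tablet"):
    score = 100 * np.average(grp["top_box"], weights=grp["weight"])
    print(f"{'tablet' if arm == 1 else 'mail'} top-box: {score:.1f}%")
```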
Statistical analyses were conducted using SAS version 9.4 (SAS Institute, Cary, NC). The Boston Children’s Hospital Institutional Review Board approved the study.
Results
Response Rates
During the tablet arm intervention period (April 13, 2015 to July 31, 2015) and the mail arm control period (March 30, 2015 to August 14, 2015), a total of 1690 patients were discharged, 1184 of whom were eligible to have their parents complete the survey. The response rate was 71.1% (424 of 596) for the tablet arm and 16.3% (96 of 588) for the mail arm. Families that left before they could be approached accounted for 69.2% of the nonrespondents in the tablet arm. The median tablet survey completion time was 12.4 minutes. Ninety-six percent of respondents reported that the tablet was “very easy” or “somewhat easy” to use. No tablets were damaged, lost, or stolen.
Among parents who completed the tablet survey, 340 (80.2%) also completed the postdischarge items, by e-mail (27.4% of these respondents), mail (23.5%), or telephone (49.1%), yielding a 57.0% (340 of 596) response rate for the discharge items across the entire sample assigned to the tablet arm.
Respondent Characteristics and Representativeness
Tablet arm participants were more similar to the eligible population than were mail arm participants with regard to child age, insurance status, and service (Table 1). For instance, 32.1% of all eligible discharged patients had public insurance, as did 31.4% of the tablet respondents. Significantly fewer of the mail respondents had public insurance (19.8%; P = .02).
Table 1. Characteristics of Tablet and Mail Arm Respondents and of All Eligible Patients

Characteristic | Tablet (n = 424) | Mail (n = 96) | Tablet vs Mail, P Value* | All Eligible Patients (n = 1184) |
---|---|---|---|---|
Child characteristics, n (%) | | | | |
Female sex | 212 (50.0) | 44 (45.8) | .46 | 563 (47.6) |
Age, y | | | .05 | |
  0–4 | 157 (37.0) | 29 (30.2) | | 463 (39.1) |
  5–12 | 151 (35.6) | 31 (32.3) | | 430 (36.3) |
  13–17 | 116 (27.4) | 36 (37.5) | | 291 (24.6) |
Race/ethnicity | | | .006 | |
  Black/non-Latino | 37 (8.7) | 3 (3.1) | | 99 (8.4) |
  Latino | 47 (11.1) | 5 (5.2) | | 147 (12.4) |
  White/non-Latino | 241 (56.8) | 69 (71.9) | | 685 (57.9) |
  Other/non-Latino | 23 (5.4) | 9 (9.4) | | 57 (4.8) |
  Missing | 76 (17.9) | 10 (10.4) | | 196 (16.6) |
Parent-reported health status | | | .11 | N/A |
  Excellent | 136 (33.0) | 39 (41.1) | | |
  Very good | 133 (32.3) | 32 (33.7) | | |
  Good | 93 (22.6) | 15 (15.8) | | |
  Fair/poor | 50 (12.1) | 9 (9.5) | | |
Public insurance (vs private) | 133 (31.4) | 19 (19.8) | .02 | 379 (32.1) |
Respondent characteristics, n (%) | | | | |
Relationship to child | | | .006 | N/A |
  Mother | 320 (77.9) | 86 (91.5) | | |
  Father | 84 (20.4) | 6 (6.4) | | |
  Other | 7 (1.7) | 2 (2.1) | | |
Age, y | | | .33 | N/A |
  <25 | 22 (5.4) | 4 (4.3) | | |
  25–34 | 104 (25.4) | 18 (19.2) | | |
  35–44 | 160 (39.0) | 42 (44.7) | | |
  45+ | 124 (30.2) | 30 (31.9) | | |
Education | | | .002 | N/A |
  High school graduate or less | 71 (17.5) | 8 (8.4) | | |
  Some college | 120 (29.6) | 16 (16.8) | | |
  College graduate or more | 215 (53.0) | 71 (74.7) | | |
Preferred language | | | .42 | |
  English | 390 (92.0) | 89 (95.7) | | 1006 (85.0) |
  Spanish | 29 (6.8) | 3 (3.2) | | 77 (6.5) |
  Other | 5 (1.2) | 1 (1.1) | | 101 (8.5) |
Visit characteristics, n (%) | | | | |
Length of stay, d | | | .14 | |
  0–1 | 148 (34.9) | 31 (32.3) | | 430 (36.3) |
  2 | 93 (21.9) | 28 (29.2) | | 270 (22.8) |
  3–4 | 98 (23.1) | 26 (27.1) | | 255 (21.5) |
  5–9 | 56 (13.2) | 8 (8.3) | | 146 (12.3) |
  10+ | 29 (6.8) | 3 (3.1) | | 83 (7.0) |
Medical service (vs surgical) | 204 (48.1) | 35 (36.5) | .04 | 583 (49.2) |
N/A indicates not applicable. Values in the Tablet and Mail columns primarily reflect survey responses; values in the All Eligible Patients column are based solely on administrative data.
* P value from the chi-square test. For ordinal characteristics (age and length of stay), the Mantel-Haenszel chi-square P value is reported.
Patients whose parents responded in the tablet arm were less likely than those in the mail arm to be white (56.8% vs 71.9%; P = .006) and more likely to have been discharged from the medical service (48.1% vs 36.5%; P = .04). Tablet respondents were more likely than mail respondents to be the patient’s father (20.4% vs 6.4%; P = .006) and to have a high school education or less (17.5% vs 8.4%; P = .002).
We assessed child characteristics available from administrative data for all eligible parents (respondents and nonrespondents): sex, age, race/ethnicity, insurance status, preferred language, length of stay, and clinical service. Within the tablet arm, we found no significant differences between respondents and nonrespondents for any individual characteristic, and the global null hypothesis of no association between response and the full set of characteristics was not rejected (Table 2). Within the mail arm, respondents were more likely than nonrespondents to be white (71.9% vs 57.5%; P < .001), less likely to have public insurance (19.8% vs 33.9%; P = .007), and less likely to have been discharged from the medical service (36.5% vs 52.9%; P = .003). The children of mail arm respondents were also significantly older than those of nonrespondents (P < .001), and respondents and nonrespondents differed significantly in the overall model for the mail arm (P = .006).
Table 2. Characteristics of Respondents and Nonrespondents Within Each Survey Arm

Characteristic | Tablet Respondents (n = 424) | Tablet Nonrespondents (n = 172) | P Value* | Mail Respondents (n = 96) | Mail Nonrespondents (n = 492) | P Value* |
---|---|---|---|---|---|---|
Overall model† | | | .47 | | | .006 |
Child characteristics, n (%) | | | | | | |
Female sex | 212 (50.0) | 75 (43.6) | .16 | 44 (45.8) | 232 (47.2) | .81 |
Age, y | | | .12 | | | <.001 |
  0–4 | 157 (37.0) | 72 (41.9) | | 29 (30.2) | 205 (41.7) | |
  5–12 | 151 (35.6) | 63 (36.6) | | 31 (32.3) | 185 (37.6) | |
  13–17 | 116 (27.4) | 37 (21.5) | | 36 (37.5) | 102 (20.7) | |
Race/ethnicity | | | .90 | | | <.001 |
  Black/non-Latino | 37 (8.7) | 19 (11.0) | | 3 (3.1) | 40 (8.1) | |
  Latino | 47 (11.1) | 19 (11.0) | | 5 (5.2) | 76 (15.5) | |
  White/non-Latino | 241 (56.8) | 92 (53.5) | | 69 (71.9) | 283 (57.5) | |
  Other/non-Latino | 23 (5.4) | 9 (5.2) | | 9 (9.4) | 16 (3.3) | |
  Missing | 76 (17.9) | 33 (19.2) | | 10 (10.4) | 77 (15.7) | |
Public insurance (vs private) | 133 (31.4) | 61 (35.7) | .32 | 19 (19.8) | 166 (33.9) | .007 |
Respondent characteristics, n (%) | | | | | | |
Preferred language | | | .32 | | | .28 |
  English | 362 (85.4) | 146 (84.9) | | 86 (89.6) | 412 (83.7) | |
  Spanish | 25 (5.9) | 15 (8.7) | | 3 (3.1) | 34 (6.9) | |
  Other | 37 (8.7) | 11 (6.4) | | 7 (7.3) | 46 (9.4) | |
Visit characteristics, n (%) | | | | | | |
Length of stay, d | | | .71 | | | .22 |
  0–1 | 148 (34.9) | 61 (35.5) | | 31 (32.3) | 190 (38.6) | |
  2 | 93 (21.9) | 38 (22.1) | | 28 (29.2) | 111 (22.6) | |
  3–4 | 98 (23.1) | 36 (20.9) | | 26 (27.1) | 95 (19.3) | |
  5–9 | 56 (13.2) | 23 (13.4) | | 8 (8.3) | 59 (12.0) | |
  10+ | 29 (6.8) | 14 (8.1) | | 3 (3.1) | 37 (7.5) | |
Medical service (vs surgical) | 204 (48.1) | 84 (48.8) | .87 | 35 (36.5) | 260 (52.9) | .003 |
† Overall P values are from Wald tests of the global null hypothesis in multivariate logistic regression models including all characteristics. For these multivariate models only, nonwhite race/ethnicity was parameterized as yes, no, or missing.
* P value from the chi-square test. For ordinal characteristics (age and length of stay), Mantel-Haenszel chi-square P values are reported.
Child HCAHPS Scores
Top-box scores were significantly higher for the tablet arm for 3 of the 17 composite and single-item measures (Table 3). Respondents in the tablet arm were more likely than those in the mail arm to report top-box scores for “keeping you informed about your child’s care in the emergency room” (92.5% vs 79.5%; P = .02), “how well nurses communicate with your child” (82.2% vs 67.6%; P = .004), and “helping your child feel comfortable” (72.5% vs 62.1%; P = .002). A test of overall differences in scores across all items by mode was nonsignificant (78.1% vs 74.6%; P = .11). No significant difference between arms was found in scores for the discharge composite (82.6% vs 80.9%; P = .60).
Table 3. Case Mix–Adjusted Top-Box Scores for Child HCAHPS Measures, by Survey Arm
Measure | Tablet (n = 424), % | Mail (n = 96), % | Tablet – Mail Difference | P Value for Difference |
---|---|---|---|---|
Communication with parent | ||||
Communication between you and your child’s nurses† | 88.9 | 86.9 | 2.0 | .49 |
Communication between you and your child’s doctors† | 88.3 | 89.9 | −1.5 | .59 |
Communication about your child’s medicines* | 82.3 | 77.1 | 5.1 | .11 |
Keeping you informed about your child’s care† | 75.9 | 74.5 | 1.4 | .73 |
Privacy when talking with doctors, nurses, and other providers† | 81.6 | 87.5 | −5.9 | .14 |
Keeping you informed about your child’s care in the emergency room* | 92.5 | 79.5 | 13.0 | .02 |
Communication with child | ||||
How well nurses communicate with your child† | 82.2 | 67.6 | 14.6 | .004 |
How well doctors communicate with your child† | 74.8 | 71.3 | 3.5 | .48 |
Involving teens in their care*,† | 70.6 | 68.8 | 1.8 | .76 |
Attention to safety and comfort | ||||
Preventing mistakes and helping you report concerns*,† | 57.5 | 57.0 | 0.5 | .84 |
Responsiveness to the call button† | 63.4 | 52.6 | 10.8 | .06 |
Helping your child feel comfortable*,† | 72.5 | 62.1 | 10.3 | .002 |
Paying attention to your child’s pain* | 84.8 | 83.4 | 1.3 | .76 |
Hospital environment | ||||
Cleanliness of hospital room† | 57.7 | 65.2 | −7.5 | .17 |
Quietness of hospital room† | 51.2 | 52.7 | −1.5 | .79 |
Global rating | ||||
Overall rating‡ | 81.9 | 83.9 | −2.0 | .63 |
Recommend hospital§ | 92.9 | 90.1 | 2.8 | .37 |
Case mix adjusted for child health status (5 levels), relationship to child (3 levels), parent age (4 levels), education (6 levels), preferred language (3 levels), and patient age (5 levels). Tablet scores incorporated propensity weights derived from a logistic regression model for response by tablet (vs mail) that included the aforementioned case mix adjustors, race/ethnicity (6 levels), and public insurance (vs no public insurance).
* Yes, definitely/yes, somewhat/no.
† Never/sometimes/usually/always.
‡ Scale from 0 (worst hospital possible) to 10 (best hospital possible).
§ Definitely no/probably no/probably yes/definitely yes.
Discussion
Our study showed that day of discharge survey administration on a tablet in the hospital is a feasible alternative to the single-wave mail survey typically used by vendors to measure family experience of pediatric inpatient care and may obtain dramatically higher response rates. The response rate in the tablet arm was more than 4 times that attained with the single-wave mailing (71.1% vs 16.3%; P < .001). Three measures had significantly more positive responses by tablet than by mail, but we found no systematic tendency toward higher or lower scores across the 17 measures.
Our study demonstrates the potential for tablet survey administration on the day of discharge to improve the representation of typically underrepresented groups of respondents and their children. The tablet arm included a significantly greater share of respondents who were fathers and who had lower levels of education. The latter finding may be owing in part to the tablet’s ACASI software, which reads questions aloud and could facilitate survey completion by respondents with lower literacy. The tablet arm also obtained responses from significantly greater proportions of patients who were nonwhite or had public insurance. As is typically the case, nonrespondents in the mail arm differed systematically from respondents on a number of characteristics. In contrast, we found no evidence that nonrespondents to the tablet survey differed from respondents on any of the assessed characteristics. It is therefore likely that the tablet not only increased the response rate but also improved representativeness. Increasing the representativeness of respondents will provide hospitals with a more complete understanding of the experiences of their patients. This increased representativeness is particularly valuable as hospitals attempt to assess potential disparities in patient/family experience and target potential areas for improvement.
Alternative approaches such as those recommended for Child HCAHPS (ie, 2-wave mail, 2-wave telephone, or mail/telephone mixed mode) may result in higher response rates than the single wave mailing used by most hospitals. For example, a 2007 Adult HCAHPS study achieved a 42% response rate using mail/telephone mixed mode (mail with telephone follow-up of nonrespondents).20 More recent Adult HCAHPS studies found higher response rates for mixed mode (42%) and telephone only mode (32%) than for mail mode (22%).21 Our findings add tablets to the list of effective mode options.
Our results are consistent with other studies that found higher response rates for point-of-care tablet survey administration than for postdischarge mail administration in a variety of settings.13,22 Studies using tablets have shown that most respondents prefer tablets to paper or web-based surveys and find tablet surveys easy to use.22–26 Furthermore, previous work has found that respondents have less confidence in the confidentiality of pen-and-paper surveys and believe that tablets can better ensure the confidentiality of their responses.27
We did not attempt to differentiate the effects of survey interface (tablet vs paper), location (in person in the hospital vs at home), and timing (day of discharge vs after discharge) on response rates, representativeness, and participant responses, because the tablet survey could not realistically be administered postdischarge in the home and paper surveys are considered ill-advised in the hospital owing to respondent fears that staff will see their individual results. This experiment demonstrates the potential for this package of survey interface, location, and timing to generate high response rates and representative responses.
In addition to the tablet interface, other aspects of our intervention could have contributed to our results. The tablet was handed to the respondent directly by a member of hospital staff, which might engage the caregiver more than receiving the survey by mail. Families’ motivation to respond might be higher in the tablet arm, as they are just finishing their hospitalization and their experiences might be more salient. The day of discharge often includes extended periods of waiting for discharge arrangements to be finished, and so completing a survey may be competing with fewer other activities than completing it at home. Furthermore, day of discharge administration allows for broader representation than alternatives such as web-based approaches that are constrained by limited access to email addresses.28 Another consideration is that responses may be more positive because of the auditory component.20 Studies also have found that timing can systematically affect scores, with surveys administered closer to the time of care demonstrating higher scores.29,30 However, our study did not find an overall tendency for higher or lower scores. Importantly, we did not find that potential social desirability bias or other concerns regarding administering patient or family experience surveys in the hospital led to extremely positive scores.
A better understanding of the effect of survey mode (in this case, in-person day of discharge tablet vs mailed paper), obtained through a large-scale, multihospital randomized study, would make it possible to appropriately adjust scores when comparing across modes (as is the standard approach when comparing surveys administered by mail vs by telephone). It is possible that response rates for certain groups could be even higher than what we found. For instance, a hospital could build survey administration directly into the discharge process by requiring that the survey be offered to parents before discharge, which would decrease the number of families leaving before being offered the survey (departure before being offered the survey accounted for the vast majority of nonrespondents in our study). Another important question concerns the cost of tablet data collection relative to alternatives. Given the low response rates of mailed surveys, it is possible that tablet administration could be as cost-effective as mail if it were integrated efficiently into the discharge process.
In general, survey response rates have declined over the past 2 decades,31 possibly owing to survey fatigue as the number of surveys offered to people has increased dramatically.32,33 Inaccurate contact information may also contribute to low response rates for mail or telephone surveys.34 Adult HCAHPS response rates have likewise declined over the last decade, with the Centers for Medicare & Medicaid Services reporting an average state-level response rate of 34% in 2013.21 Young adults are the most challenging group to reach, and most parents are young adults.12,21 Owing in part to declining response rates, hospitals are actively searching for alternative means of measuring patient/family experience, and some are already using point-of-care surveying for patient experience measurement. To our knowledge, however, this is the first study to formally examine this approach.
Hospitals often lack sufficient data to recommend actionable QI initiatives because of the lag in receiving survey results and because low response rates often limit the ability to provide unit-level data. Increasing Child HCAHPS response rates and the representation of hard-to-reach groups through our novel method of administration would help hospitals reach the minimum number of completed surveys needed to compare performance. Moreover, it could provide more representative information to direct QI efforts, yielding a more accurate assessment of the patient- and family-centeredness of a hospital and ensuring that QI efforts are targeted to the potentially different needs of patient subgroups.33,35 Tablets also might help close the gap between when data are collected and when hospitals can use the data to inform QI efforts. Because day of discharge tablet surveys significantly increase response rates and offer feedback in near real time, without delays for data entry from paper surveys, this method can provide hospitals with the data needed to accelerate the pace of QI initiatives. Moreover, by increasing the number of completed surveys, hospitals will be better able to use the data to evaluate local QI interventions and demonstrate real-time impact on patient experience.
We believe that the high response rate for the discharge items is related to our multimode approach (ie, e-mail, mail, and phone call), plus any priming effect of the tablet phase of the study. Future studies should evaluate simpler follow-up approaches for the discharge items to determine what is optimal.
Our study has some limitations. We conducted the study with children hospitalized at a single tertiary care children’s hospital, and our findings might not generalize to patients and families cared for at other hospitals. Further research through a large-scale, multihospital randomized study should incorporate alternative modes, such as mixed mode (ie, mail and telephone), to permit investigation of questions such as the impact of in-hospital versus after-discharge data collection on the demographic composition of respondents and on their responses.
Conclusion
Patient experience measurement is a standard element of quality measurement across the United States. To understand, track, and improve the experiences of patients and families, hospitals and local leaders need reliable, representative, and timely information. Our findings suggest that day of discharge administration of Child HCAHPS on tablets has great potential to transform inpatient pediatric survey administration, enabling real-time feedback and rapid QI. The implications potentially go beyond the pediatric context to surveys of adult patients and could increase response rates for all health care experience surveys.
What’s New.
The response rate with day of discharge tablet survey administration was >4 times as high as with single-wave mail-only surveys. Tablet administration appears promising for real-time feedback and quality improvement and may transform inpatient survey administration.
Acknowledgments
We thank Marcie Brostoff, MS, RN, NE-BC, Jean Gouthro, RN, CPN, Gail Hockman, BA, Amy Morrison, MS, Kellie Needham, Susan Shaw, MSN, MS, RN, NEA-BC, the administrative and nursing staff on the Boston Children’s Hospital (BCH) medical and surgical units that participated in the pilot, and Ajeet Singh in the BCH Information Services Department and Laura Neuenschwander at Press Ganey Associates, for their data support. We thank Scott Sughrue and his team at the Tufts Department of Public Health and Community Medicine for programming the ACASI software. We also thank the staff of the Center of Excellence for Pediatric Quality Measurement at BCH, especially Sepideh Ashrafzadeh, BS, Joseph Boltri, BS, Ari Coopersmith, BA, Susan Landon, BA, Noah Nsangou, Lee Ann Song, AB, and Matthew Westfall, BA.
Financial disclosure: Support for this work was provided by the US Department of Health and Human Services Agency for Healthcare Research and Quality and Centers for Medicare & Medicaid Services, CHIPRA Pediatric Quality Measures Program Centers of Excellence (Grants U18 HS 020513 and U18 HS 025299; PI, M.A.S.). The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding agencies. The funding sources had no role in the design and conduct of the study; collection, management, analysis, and interpretation of data; preparation, review, or approval of the manuscript; or decision to submit the manuscript for publication.
Footnotes
The authors have no conflicts of interest to disclose.
References
1. National Quality Forum. Person- and family-centered care. Available at: https://www.qualityforum.org/Topics/Person-_and_Family-Centered_Care.aspx. Accessed February 27, 2018.
2. Measure Applications Partnership. MAP Families of Measures: Safety, Care Coordination, Cardiovascular Conditions, Diabetes. Washington, DC: National Quality Forum; 2013.
3. Glickman SW, Boulding W, Manary M, et al. Patient satisfaction and its relationship with clinical quality and inpatient mortality in acute myocardial infarction. Circ Cardiovasc Qual Outcomes. 2010;3:188–195.
4. Boulding W, Glickman SW, Manary MP, et al. Relationship between patient satisfaction with inpatient care and hospital readmission within 30 days. Am J Manag Care. 2011;17:41–48.
5. Isaac T, Zaslavsky AM, Cleary PD, et al. The relationship between patients’ perception of care and measures of hospital quality and safety. Health Serv Res. 2010;45:1024–1040.
6. Doyle C, Lennox L, Bell D. A systematic review of evidence on the links between patient experience and clinical safety and effectiveness. BMJ Open. 2013;3:e001570.
7. Mitchell MD, Lavenberg JG, Trotta RL, et al. Hourly rounding to improve nursing responsiveness: a systematic review. J Nurs Adm. 2014;44:462–472.
8. Goldsack J, Bergey M, Mascioli S, et al. Hourly rounding and patient falls: what factors boost success? Nursing (Lond). 2015;45:25–30.
9. Hospital Consumer Assessment of Healthcare Providers and Systems. HCAHPS hospital survey. Available at: http://www.hcahpsonline.org/home.aspx. Accessed February 27, 2018.
10. Fredrickson DD, Jones TL, Molgaard CA, et al. Optimal design features for surveying low-income populations. J Health Care Poor Underserved. 2005;16:677–690.
11. Weech-Maldonado R, Morales LS, Spritzer K, et al. Racial and ethnic differences in parents’ assessments of pediatric care in Medicaid managed care. Health Serv Res. 2001;36:575–594.
12. Klein DJ, Elliott MN, Haviland AM, et al. Understanding nonresponse to the 2007 Medicare CAHPS survey. Gerontologist. 2011;51:843–855.
13. Gurland B, Alves-Ferreira PC, Sobol T, et al. Using technology to improve data capture and integration of patient-reported outcomes into clinical care: pilot results in a busy colorectal unit. Dis Colon Rectum. 2010;53:1168–1175.
14. Aktas A, Hullihen B, Shrotriya S, et al. Connected health: cancer symptom and quality-of-life assessment using a tablet computer: a pilot study. Am J Hosp Palliat Care. 2015;32:189–197.
15. Anand V, McKee S, Dugan TM, et al. Leveraging electronic tablets for general pediatric care: a pilot study. Appl Clin Inform. 2015;6:1–15.
16. Jandee K, Lawpoolsri S, Taechaboonsermsak P, et al. Customized-language voice survey on mobile devices for text and image data collection among ethnic groups in Thailand: a proof-of-concept study. JMIR Mhealth Uhealth. 2014;2:e7.
17. Djawe K, Brown EEJ, Gaul Z, et al. Community-based electronic data collections for HIV prevention research with black/African-American men in the rural, Southern USA. AIDS Care. 2014;26:1309–1317.
18. Tufts University School of Medicine. ACASI, CASI, and CAPI software systems: easy electronic data collection for your research study. Available at: http://acasi.tufts.edu/. Accessed May 22, 2017.
19. Toomey SL, Zaslavsky AM, Elliott MN, et al. The development of a pediatric inpatient experience of care measure: Child HCAHPS. Pediatrics. 2015;136:360–369.
20. Elliott MN, Zaslavsky AM, Goldstein E, et al. Effects of survey mode, patient mix, and nonresponse on CAHPS hospital survey scores. Health Serv Res. 2009;44(2 Pt 1):501–518.
21. Centers for Medicare & Medicaid Services. HCAHPS update training. Available at: http://www.hcahpsonline.org/Files/2017_Update_Training_Slides.pdf. Published March 2017. Accessed December 1, 2017.
22. Parker MJ, Manan A, Urbanski S. Prospective evaluation of direct approach with a tablet device as a strategy to enhance survey study participant response rate. BMC Res Notes. 2012;5:605.
23. Martin P, Brown MC, Espin-Garcia O, et al. Patient preference: a comparison of electronic patient-completed questionnaires with paper among cancer patients. Eur J Cancer Care (Engl). 2015;25:334–341.
24. Abernethy AP, Herndon JE, Wheeler JL, et al. Feasibility and acceptability to patients of a longitudinal system for evaluating cancer-related symptoms and quality of life: pilot study of an e/Tablet data-collection system in academic oncology. J Pain Symptom Manage. 2009;37:1027–1038.
25. Abernethy AP, Herndon JE, Wheeler JL, et al. Improving health care efficiency and quality using tablet personal computers to collect research-quality, patient-reported data. Health Serv Res. 2008;43:1975–1991.
26. Suzuki E, Mackenzie L, Sanson-Fisher R, et al. Acceptability of a touch screen tablet psychosocial survey administered to radiation therapy patients in Japan. Int J Behav Med. 2016;23:485–491.
27. Edgman-Levitan S, Brown J, Fowler FJ Jr, et al. Feedback loop: testing a patient experience survey in the safety net. California HealthCare Foundation. Available at: http://www.chcf.org/publications/2011/10/patient-experience-safety-net-clinics. Accessed February 27, 2018.
28. Elliott MN, Brown JA, Lehrman WG, et al. A randomized experiment investigating the suitability of speech-enabled IVR and Web modes for publicly reported surveys of patients’ experience of hospital care. Med Care Res Rev. 2013;70:165–184.
29. Jensen HI, Ammentorp J, Kofoed PE. User satisfaction is influenced by the interval between a health care service and the assessment of the service. Soc Sci Med. 2010;70:1882–1887.
30. Bjertnaes OA. The association between survey timing and patient-reported experiences with hospitals: results of a national postal survey. BMC Med Res Methodol. 2012;12:1–6.
31. Pew Research Center. Collecting survey data. Available at: http://www.pewresearch.org/methodology/u-s-survey-research/collecting-survey-data/. Accessed February 27, 2018.
32. National Science Foundation. Declining response rates, rising costs. Available at: https://www.nsf.gov/news/special_reports/survey/index.jsp?id=question. Accessed February 27, 2018.
33. Coulter A, Locock L, Ziebland S, et al. Collecting data on patient experience is not enough: they must be used to improve care. BMJ. 2014;348:g2225.
34. Office of Inspector General, United States Postal Service. Declines in US Postal Service mail volume vary widely across the United States. 2015. Available at: https://about.usps.com/who-we-are/postal-facts/decade-of-facts-and-figures.htm. Accessed February 27, 2018.
35. Medicare.gov. Hospital Compare. Available at: https://www.medicare.gov/hospitalcompare/search.html?. Accessed February 27, 2018.