Abstract
Introduction
Patient-reported outcome measures (PROMs) are data capture tools that collect information directly from patients. Several large research studies provide evidence that the use of PROMs in routine care improves mortality and morbidity outcomes in medical oncology patients. Despite this, implementation of PROMs into daily clinical routine remains slow and challenging.
Methods and analysis
This study will use a stepped-wedge design to assess the implementation of a PROM intervention in highly frequented medical oncology outpatient clinics. During a lead-in period of 4 weeks, control data will be collected. The intervention will then be implemented for 4 weeks in Clinic 1 initially, then in Clinic 2 for another 4 weeks. A total of 500 patient encounters will be measured over the 12 weeks. The process of implementation will be informed and evaluated using the Medical Research Council Guidelines for Implementing Complex Interventions. The study will be guided by the integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework approach to implementation. The intervention and implementation outcomes will be measured using qualitative and quantitative data.
Ethics and dissemination
Ethical approval has been obtained from the Royal Brisbane and Women’s Hospital Human Research Ethics Committee (approval number HREC/16/QRBW/100). Results will be disseminated in peer-reviewed journals and at scientific meetings.
Trial registration
ACTRN12618000398202. Trial Status: Opened on 25 March 2018 and will continue until 12 months after the last PROMs reporting encounter.
Keywords: PROMs, implementation, complex intervention, PRO-CTCAE, iPARIHS
Strengths and limitations of this study
One non-blinded researcher will implement the intervention, and collect and analyse the data.
Response bias and social desirability bias (of both health professionals and patients who choose to participate).
Bias by the Hawthorne effect whereby clinics being observed during the pre-implementation phase may start to change practice.
A stepped-wedge design ensures an incremental implementation into clinical practice.
Prospective use of an implementation framework will ensure that enablers and barriers in the setting are captured and reported, allowing the findings from this study to inform future integration of patient-reported outcomes into routine clinical care.
Introduction
What are patient-reported outcome measures?
The US Food and Drug Administration defines patient-reported outcome measures (PROMs) as ‘any report of the status of a patient’s health condition that comes directly from the patient, without interpretation of the patient’s response by a clinician or anyone else’.1 Revicki et al 2 describe PROMs as validated self-reporting assessment tools that capture the patient experience. PROMs have been extensively evaluated for their sensitivity, specificity, overall accuracy and predictive value. They are now regarded as having excellent precision, similar to many other widely used clinical assessment tools, including pathological tests or medical imaging reports.3 PROMs can provide an overview of a patient’s physical, emotional, functional or overall health status, or can be used to assess specific treatment outcomes or symptoms.4
PROMs in clinical practice
PROMs are commonly used as outcome measures in research. More recently, however, evidence has emerged that their real-time application in clinical practice can enhance clinical interactions and improve patient experience. Several studies have shown that using PROMs in routine care leads to improved quality of life (QOL)3 5 as well as improved communication, decision-making, care planning and patient satisfaction.6–8 Two recent studies demonstrated improvements in patient mortality and morbidity when technology-facilitated PROMs data collection was incorporated into oncology care.5 9 10
Given these evidence-based benefits, the next required step in the implementation cycle is to translate these findings into practice by integrating PROMs into routine clinical care.
The complexities of implementing PROMs into the clinical setting
A number of systematic reviews3 11 12 reported that multiple organisational, technical and clinical barriers need to be overcome before introducing PROMs. In particular, a lack of engagement from healthcare professionals, concerns about the workflow of generating and filing PROM reports, and a lack of clearly defined approaches for responding to PROM data that indicate a patient need (eg, elevated pain or depression) have been identified as barriers to successful implementation. The International Society for Quality of Life Research (ISOQOL) advocates a stepwise approach to implementing PROMs and provides a User’s Guide,13 which was updated in 2018. Klinkhammer-Schalke et al identified that a stepwise approach was most useful when integrating a PROM intervention into routine care, as it allows cycles of iterative learning during the implementation.7
Incorporating PROMs into clinical practice should be considered a complex intervention, with many elements impacting on the intervention, and vice versa.14 Given these complexities, it has been recommended to use an implementation framework to increase the likelihood of success when aiming to integrate PROMs into routine care.15 Use of a framework approach can help to consider both the processes and intended outcomes of implementation. The integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework appears well suited, as it highlights elements for consideration within the context (eg, the features of the particular clinic in which PROMs are to be integrated), the stakeholders (eg, patients, clinicians, administrative staff) impacted by the intervention and the evidence surrounding the intervention (eg, how much stakeholders value the new PROM information presented to them).16 A unique feature of i-PARIHS is that it stresses the central importance of a facilitator, who works with the local stakeholders to adapt the evidence-based intervention to the local context. Antunes et al’s 3 systematic review provided evidence for the important role of a facilitator in the implementation process, with greater uptake when one was present.17 18 For example, Baskerville et al 17 showed that medical practices were 2.76 times more likely to adopt evidence-based guidelines when a facilitator was working in the local context.
Besides the implementation framework, the Medical Research Council (MRC) Guidelines for Implementation of Complex Interventions can provide guidance on how best to incorporate prespecified process measures. The Guidelines ‘can be used to assess fidelity and quality of implementation, clarify causal mechanisms and identify contextual factors associated with variation in outcomes’.18 The MRC approach ensures active evaluation throughout the implementation and highlights how to mitigate the impact that the introduction of new workflows has on the context, the participants and the intervention.
In summary, the aim of this implementation study is to investigate the implementation of a symptom-reporting PROMs system in the outpatient oncology setting. The objective of the intervention will be to increase detection of symptoms by clinicians using the PROMs data. The implementation objectives include the successful engagement of clinicians to use PROMs in clinical practice, the successful use of technology to obtain PROMs data from patients and present reports to clinicians, and the identification of appropriate local strategies to respond to PROM information.
Methods and analysis
Study design
This mixed-methods study will use a stepped-wedge cluster design. PROMs will be introduced sequentially into two independent clinics, and all intervention and implementation outcomes will be prospectively evaluated. The stepped-wedge approach has been chosen as it is a pragmatic solution for the systematic introduction of a complex intervention19 and has been successfully used in a number of studies related to service delivery improvements.20 21 Another advantage of this study design is that it limits bias by randomly assigning the clinics to the intervention in sequential order. There are key elements that require attention with this study design including the consideration of timing of study time points, cluster equivalence within the setting and intervention uptake assessed by process measures.22 23
The first clinic will be observed during a 4-week lead-in period of current standard practice, after which the Integrating Patient Reported Outcomes in a Medical Oncology Setting (iPROMOS) intervention will be introduced, whereas the other clinic will continue with current standard practice and await implementation of iPROMOS. Data collection and intervention time points are presented in table 1.
Table 1. Data collection and intervention time points
Time point | T1 (weeks 0–4) | T2 (weeks 4–8) | T3 (weeks 8–12) |
Clinic 1 | Control data | Intervention | Intervention |
Clinic 2 | Control data | Control data | Intervention |
This protocol was co-designed with clinicians, academics and patient representatives. The iPROMOS intervention was informed by pre-implementation data collected from health professionals and relevant local stakeholders (table 2). Reporting will follow the Standards for Reporting Implementation Studies.24
Table 2.
Aim | Data collected | Description of findings | Implementation strategies |
To engage health professionals and patients | Physical environment mapped. Field notes. Focus groups/interviews with multidisciplinary team members and patient representatives about enablers and barriers. Staff survey of knowledge modelled on Rouette et al,37 assessing knowledge about PROMs, PROMs data format, and the enablers of and barriers to their use. Items are scored on a Likert scale, for example ‘My understanding of PROs is…(very poor, poor, fair, good, very good)’ and ‘My lack of understanding of PROs is a barrier to using them in clinical practice (almost never, rarely, sometimes, often, almost always)’. |
The physical environment is busy but movement of patients, staff and medical records is well established. There are many established treatment pathways for patient care based on disease group, stage of disease and treatment regimen. Previous interventions have been unsuccessful due to a lack of collaboration with staff and patients. Knowledge about PROMs and the current evidence differs across health discipline groups. |
Touchscreen computers will be positioned for easy access by patients as they enter the clinic area. PROMs reports will be made available to staff prior to the patient encounter. PROMs data entry design and equipment were sourced in collaboration with consumer representatives. Information resources were developed in collaboration with staff and patient representatives, including posters, information sheets, staff brochures and in-service material. |
To effectively incorporate technology | Field notes. Map of Information Technology Systems that interact with patient care, including the physical environment. |
Many electronic medical record systems interact with patients and staff but not with each other. If PROMs data become a report, it can be stored as such in the patient’s medical record. Paper-based reports can be more easily integrated into patient records. Development of a system specific to each individual health service is expensive and time-consuming, and it is unclear whether this would be integrated into current IT systems or become another log-on for staff, which reduces their likelihood of engagement. No ready-made system could be identified for purchase. |
A simple electronic data capture system (REDCap) will be used to collect PROMs data and generate reports. A simple set-up provides the flexibility needed for integration and implementation while ensuring the fidelity of the intervention. Developing/funding a more sophisticated platform for collecting PROMs from patients can be informed by the successful implementation process. |
To manage and respond to PROMs data | Focus groups/interviews and field notes to map referral and communication pathways; i-PARIHS context assessments of clinical areas.15 | Reports can inform referrals in the form of documentation in the medical record, verbal communication or email. The best approach needs to be identified with the relevant clinical team/area. Symptom assessment by clinicians uses CTCAE v4.0 as standard practice. CTCAE is the Common Terminology Criteria for Adverse Events, developed by the US Department of Health and Human Services, which offers universal assessment and grading of symptoms of disease and treatment. Allied health and specialist nurse roles are in place for management of specific symptoms. |
Alerts based on prespecified criteria will be sent directly to the appropriate specialist nurse and allied health team member to integrate into their practice. PROMs reports will be used to inform assessment and clinical decision-making. |
i-PARIHS, integrated Promoting Action on Research Implementation in Health Services; PROMs, patient-reported outcome measures.
Patient and public involvement
The process of consumer engagement throughout protocol development informed the research question and study protocol. Consumer representatives within the health services, and on a research advisory group, were approached to discuss the project. They confirmed a need for patient self-reporting of symptoms integrated into routine care, with reports available to staff so that patient concerns could be actioned. During the development of the protocol, consumer representatives were involved in the development of patient resources and the collection of pre-implementation data. They also assessed the anticipated burden of the intervention on patients, and this will continue to be evaluated with consumer input throughout the study, through Plan, Do, Study, Act (PDSA) cycle evaluation of the qualitative data collected and ongoing consumer representative input.
Results will be disseminated on information boards in the health service, and reported back to Consumer Representative Forums.
Key features of the intervention
Based on the published evidence5 and data from local clinicians as summarised in table 2, the PRO-Common Terminology Criteria for Adverse Events (PRO-CTCAE) was selected as the PROM to be implemented, as it was developed to extend assessment by clinicians using the CTCAE25 and has been demonstrated to provide significant benefits for patient care and outcomes.10 PRO-CTCAE is a validated (119 of 124 items met at least one construct validity criterion) symptom-reporting PROM that has been demonstrated to be reliable (test–retest reliability was 0.7 or greater for 39 of 49 prespecified terms) and responsive (item changes corresponded to the Quality of Life Questionnaire-Core 30 scale).26 A number of studies have demonstrated that the PRO-CTCAE is acceptable to patients from differing cancer populations internationally.27 28 This PROM allows patients to report how much they experience each symptom, and its impact on their daily activities, on a five-point Likert scale (ranging from ‘none’ to ‘very much’). The core set of questions includes anorexia, constipation, dyspnoea, diarrhoea, fatigue, nausea, pain, sensory neuropathy, vomiting, cough, low mood and anxiety. Basch et al’s 5 study used a weekly completion schedule on an app, with alerts sent to clinicians in real time. However, use of apps for patient reporting was not compatible with the health service’s patient confidentiality policy. The intervention was therefore adapted to include PROM reporting only during scheduled attendances for outpatient clinic appointments. Thus, reporting to clinicians will occur in line with existing clinic visits, which may be weekly or less frequent depending on cancer diagnosis, stage and treatment regimen. PROMs reports will be made available for health professionals to view and respond to. Responses could include referring the patient to allied health or supportive care, counselling, or additional pharmacological support (eg, adjusting pain medications). PROMs will be added in paper format to the patient chart, in keeping with local practice, and then scanned into the electronic medical record at a later date.
In summary, the iPROMOS intervention consists of (a) patients self-reporting symptoms (PRO-CTCAE PROM) using a touchscreen computer, with data captured in a custom-built REDCap database; (b) reports of this information generated in real time; (c) these reports made available to all healthcare team members and filed in the patient’s medical record and (d) a copy of the report also provided to the patient. Usual care is clinician assessment of symptoms without the additional use of a PROM.
Through the co-design process, and drawing on the broader research evidence investigated to support clinicians’ recommendations, a reported symptom of grade 2 or higher for nausea, vomiting or anorexia, and of grade 3 or higher for all other symptoms, is considered significant.5 An increase in a symptom of more than 2 points from the previous visit will also trigger a referral by established pathways to the relevant allied health professional.
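To make this referral-trigger rule concrete, a minimal sketch follows. The symptom set, the assumption of a 0–4 grading scale and the triggers_referral helper are illustrative only; they are not the study’s actual REDCap alert configuration.

```python
# Illustrative encoding of the referral-trigger rule described above.
# Symptom names, the 0-4 grading assumption and the helper name are
# hypothetical; the study's actual alert configuration is not specified here.
from typing import Optional

LOW_THRESHOLD_SYMPTOMS = {"nausea", "vomiting", "anorexia"}  # significant at grade >= 2
DEFAULT_THRESHOLD = 3  # all other symptoms significant at grade >= 3


def triggers_referral(symptom: str, current_grade: int,
                      previous_grade: Optional[int] = None) -> bool:
    """Return True if the reported grade should trigger a referral alert."""
    threshold = 2 if symptom in LOW_THRESHOLD_SYMPTOMS else DEFAULT_THRESHOLD
    if current_grade >= threshold:
        return True
    # An increase of more than 2 points since the previous visit also triggers a referral.
    if previous_grade is not None and (current_grade - previous_grade) > 2:
        return True
    return False


# Examples: grade 2 nausea triggers an alert; grade 2 pain alone does not.
assert triggers_referral("nausea", 2)
assert not triggers_referral("pain", 2)
assert triggers_referral("pain", 3)
```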
Setting of the implementation
This project will be conducted in a tertiary teaching/quaternary referral hospital located in South-East Queensland, Australia. The health service for this centre is the largest in Australia, with the oncology outpatients’ department running up to 14 clinics in a single day. Each of these clinics is oncologist-specific, providing treatment, surveillance and follow-up for the patients in their care.
Contextual pre-implementation information has revealed key factors for successful integration of the intervention (table 2). Most importantly, the intervention needs to engage all members of the multidisciplinary team and the staff who will have access to the PROM information to address symptoms, disease management and treatment. To maximise the likelihood of engagement, the facilitator will aim to integrate PROM collection and reporting as much as possible into the workflow processes already in place at the clinic. Evidence shows that workflows differ greatly between hospitals, and even between clinics within a hospital, and that staff are reluctant to change anything that interrupts established practice, given the very complex environment they are managing.29 Staff are only willing to take on a new intervention when the benefits and processes for patient care are tangible and clear. For successful implementation, it is necessary to integrate with existing patient care pathways and technological infrastructure, rather than impose another layer, which would likely be met with resistance.29
Participants
This study will collect data from two main groups of participants: (a) patients and (b) the clinicians caring for them.
- Patients who attend the randomised medical oncology outpatients’ clinics for treatment, medical review, active surveillance or routine follow-up, with sufficient English knowledge to read the questionnaires. Patients with significant cognitive impairment, visual difficulties or from a non-English-speaking background who might have difficulty in completing the forms will be excluded from the study.
Patient screening and recruitment: patients attending selected clinics will be invited to use a touchscreen computer to complete PROM information. The first page of the PROM collection form provides a patient information sheet and consent form. Potential participants will need to read the information and accept in order to enter the PROM-reporting platform. If they do not wish to participate, they can decline. Patient information will also be visible on a poster displayed in the clinical waiting area.
- Staff who care for these patients, including nursing and medical staff, pharmacists, dietitians, welfare workers, social workers, psychologists, speech therapists, physiotherapists and other allied health workers, are eligible.
Staff participation: an opt-out approach to staff consent has been approved by the ethics committee. Multidisciplinary staff will be contacted through various communication channels: directly by the facilitator–researcher to collect pre-implementation information, and through distribution of information brochures and posters developed in collaboration with the clinical teams.
Methods of evaluation
Process measures used for implementation evaluation
In accordance with the MRC Guidelines for Complex Interventions, the iterative implementation will be evaluated using both quantitative and qualitative process measures as described in table 3.
Table 3.
Process measuring tool | Method of collection | Approach to analysis |
Context | Facilitator field notes and site journal. | Qualitative: content analysis for a structured analysis.
Feasibility | Counts. Data from data-capture program. Self-report by staff. Field notes. | Quantitative: descriptive statistics. Qualitative: content analysis for a structured analysis.
Fidelity | Counts. Case report form data. Field notes. | Quantitative: descriptive statistics. Qualitative: content analysis for a structured analysis.
Reach | Counts. Case report form data. Field notes. | Quantitative: descriptive statistics. Qualitative: content analysis for a structured analysis.
PROM, patient-reported outcome measure.
Following the i-PARIHS framework, data will be collected by the facilitator, who works closely within the context. In this protocol, the facilitator will collect and use process measures, with protocol-specified data collected at prespecified time points (table 4).
Table 4.
Outcome measure | Method of data collection | Approach to analysis |
% Patients completing PROM form | Numerator: completed PROMs in electronic data capture; denominator: booking schedule of patients who attended clinic; facilitator field notes of reasons for any missing data. | Quantitative: descriptive statistical analysis; longitudinal analyses of % change. Qualitative: content analysis. |
% Staff acknowledging PROM data | Case report forms; facilitator field notes. | Quantitative: descriptive statistical analysis; longitudinal analyses of % change. Qualitative: content analysis. |
% PROMs in medical record | Communication in the medical record; completed PROMs in electronic data capture; referral data. | Quantitative: descriptive statistical analysis. Qualitative: content analysis. |
Acceptability of PROM reporting for staff and patients | Staff survey. Focus groups, interviews and field notes. | Quantitative: descriptive statistical analysis. Qualitative: content analysis to identify themes and interpret. |
PROMs, patient-reported outcome measures.
PDSA cycles will be performed every 21 days as an interim data analysis to evaluate progress and to report findings to clinicians, so that collaborative strategies can be established to maximise implementation. The purpose of each PDSA cycle is to summarise and reflect on the implementation process and improve it for the next cycle.16
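As an illustration of the kind of interim summary each 21-day PDSA review could draw on, the sketch below computes PROM completion rates per cycle from a data export. The file name and column names (encounter_date, prom_completed) are hypothetical assumptions, not the study’s actual REDCap data model.

```python
# Minimal sketch of an interim completion-rate summary per 21-day PDSA cycle.
# The CSV export and its column names are illustrative assumptions only.
import pandas as pd

encounters = pd.read_csv("clinic_encounters.csv", parse_dates=["encounter_date"])

# Group encounters into consecutive 21-day PDSA cycles from the study start date.
study_start = encounters["encounter_date"].min()
cycle = ((encounters["encounter_date"] - study_start).dt.days // 21) + 1

summary = (
    encounters.assign(pdsa_cycle=cycle)
    .groupby("pdsa_cycle")["prom_completed"]      # prom_completed assumed to be 0/1
    .agg(completed="sum", encounters="count")
)
summary["completion_rate_pct"] = 100 * summary["completed"] / summary["encounters"]
print(summary)
```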
Outcomes of the implementation
The primary outcome of interest is successful implementation, operationalised as ‘PROM reports are made available to clinicians in 85% of encounters, 70% of clinicians respond to PROM data, and of those 50% of responses are noted in the patients’ medical record’. These thresholds were selected because other studies reported that clinicians and patients are satisfied at such a level of service, when use is identified as feasible and acceptable.30 31
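The sketch below simply restates these prespecified thresholds as a check over hypothetical counts; the input values are illustrative only and do not represent study data.

```python
# Illustrative check of the prespecified implementation-success criteria.
# Only the thresholds (85%, 70%, 50%) come from the protocol; the counts are made up.
def implementation_successful(reports_available, encounters,
                              clinicians_responding, clinicians_total,
                              responses_documented, responses_total):
    available_ok = reports_available / encounters >= 0.85
    responding_ok = clinicians_responding / clinicians_total >= 0.70
    documented_ok = responses_documented / responses_total >= 0.50
    return available_ok and responding_ok and documented_ok

# Example with made-up counts: 88% availability, 75% responding, 60% documented.
print(implementation_successful(440, 500, 15, 20, 60, 100))  # True
```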
Secondary outcomes will measure patient and staff acceptance. Staff surveys will be distributed at the end of the PROMs data collection period to capture change from baseline in staff knowledge and in identified facilitators and barriers.
Outcomes of the intervention
The primary outcome measure of the intervention will be counts of health professional notes in the patients’ charts about a symptom being of concern (eg, pain), as well as the recorded responses to such symptoms (eg, referral to a pain specialist).
Secondary outcomes will be an improvement in patient QOL, measured as a clinically significant reduction in reported symptoms. A more detailed explanation of the outcome measures is provided in table 5.
Table 5.
Outcome measure | Methods of collection | Approach to analysis |
Symptoms assessment by clinicians | Medical record entries, case report forms. | Comparison of proportion of patients with symptom assessment between intervention and control group using χ2 test. |
Response to symptom information | Medical record entries, case report forms. | Proportion of patients referred for supportive care interventions compared between intervention and control groups using χ2 test. |
Change in symptom reporting and responding from pre-intervention to during intervention | Medical record entries, case report forms, PROM electronic data capture. | Proportion of patients compared between the pre-intervention and intervention periods using χ2 analysis and statistical process control analysis. |
Presentations to the emergency department | Medical record entries. | Proportion of patients compared between the pre-intervention and intervention periods using χ2 analysis and statistical process control analysis. |
Hospital admissions | Medical record entries. | Proportion of patients compared between the pre-intervention and intervention periods using χ2 analysis and statistical process control analysis. |
PROM, patient-reported outcome measure.
Sample size
Berry et al 32 conducted a randomised controlled trial comparing symptom reports between clinics using an electronic reporting tool, assessing both processes and outcomes of care. Their trial was used to guide the sample size calculation because it measured the identification of symptoms under usual care versus a symptom-PROMs intervention, and therefore provides an estimate of the minimum number of observations needed in each cluster of this study. Berry et al 32 identified that a PROMs intervention increased symptom detection by 10%. Using these findings, with 80% power and a baseline detection level of 0.75, 500 participant encounters would be needed to show an improvement of 10% or more.
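For illustration, a naive two-sample comparison of these proportions can be checked as sketched below (eg, with statsmodels); this simple calculation does not account for the clustering or repeated cross-sections of the stepped-wedge design, so it returns a smaller figure than the protocol’s total of 500 encounters.

```python
# Minimal sketch of a two-sample power check for the proportions quoted above
# (baseline detection 0.75 vs 0.85, 80% power, two-sided alpha 0.05). It ignores
# the stepped-wedge clustering and design effect, so it is indicative only.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

effect_size = proportion_effectsize(0.85, 0.75)  # Cohen's h for the two proportions
n_per_group = NormalIndPower().solve_power(effect_size=effect_size,
                                           power=0.80, alpha=0.05,
                                           alternative="two-sided")
print(f"Approx. {n_per_group:.0f} encounters per group before any design-effect inflation")
```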
Methods of analysis
Quantitative analyses
Quantitative measures have been designed for the process measures of the implementation evaluation, the outcome measures of the implementation and the outcome measures of the intervention. Descriptive statistics including counts, frequencies and proportions will be used to summarise the data collected. Other statistical analyses will include χ2 analysis for comparing proportions, linear mixed models for longitudinal analyses and statistical process control analysis to identify trends over time.
Data from both clusters will be analysed using inverse variance weighting so that the difference can be estimated across all patient encounters. This analysis can be used to adjust for cancer type or clustering by clinician.33 It will also provide an estimate of the intracluster correlation, which can then be used for power calculations in future larger studies.34
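As a minimal illustration of the planned χ2 comparison of proportions (eg, encounters with a documented symptom assessment, control versus intervention), the sketch below uses made-up counts; it is not study data and does not implement the inverse variance weighting or mixed models described above.

```python
# Minimal sketch of a chi-square comparison of proportions between control and
# intervention encounters. The counts are illustrative placeholders only.
import numpy as np
from scipy.stats import chi2_contingency

# rows: control, intervention; columns: symptom assessed, symptom not assessed
table = np.array([[150, 50],
                  [190, 30]])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
```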
Qualitative data
The facilitator site journal will be used to record observations and will be content analysed to identify key themes, as part of each 21-day PDSA cycle.
The analysis of the facilitator site field notes will be used to triangulate other research findings and to highlight aspects in need of further investigation. The function of field notes is to identify processes in a given situation and describe how participants contribute to, and impact, these.35 Extracted data will be interpreted in keeping with Miles et al’s36 approach to field notes, which proposes systematic word-by-word coding and visual presentation of the data to identify patterns.
Data monitoring
Data monitoring will ensure high data quality and rigour, and mitigate biases.
Data monitoring will be done through the following three processes:
Quantitative data will be double entered for a random sample of 10% of records, and all records will be double entered should the error rate be greater than 5% (a minimal check of this rule is sketched after this list).
Monthly meetings with expert facilitators who are not involved with the project to reflect on the implementation and evaluation of the project.
Supervision and oversight by the study team not directly involved in the process of implementation.
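A minimal sketch of the double-entry check described in the first item follows; the file and column structure are hypothetical assumptions, and only the 10% sample and the 5% threshold come from the protocol.

```python
# Illustrative sketch of the double-entry quality check: compare first and
# second entry for a 10% random sample and compute the field-level error rate.
# File names and columns are hypothetical; assumes identical layout and no
# missing values (NaN comparisons would count as mismatches).
import pandas as pd

entry1 = pd.read_csv("entry_first.csv", index_col="record_id")
entry2 = pd.read_csv("entry_second.csv", index_col="record_id")

sample_ids = entry1.sample(frac=0.10, random_state=1).index
mismatches = (entry1.loc[sample_ids] != entry2.loc[sample_ids]).to_numpy().sum()
total_fields = entry1.loc[sample_ids].size
error_rate = mismatches / total_fields

print(f"Error rate: {error_rate:.1%}")
if error_rate > 0.05:
    print("Error rate exceeds 5%: double enter all records.")
```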
Safety considerations
The main purpose of the secondary outcome measures of the intervention is to assess the safety of this implementation approach. A potential safety issue is that when patients complete the PROMs, they expect that staff will act on that information. If the implementation is not successful, staff may not do this in a timely fashion, or at all, and patients who report symptoms may not receive suitable treatment. Any issues where a PROMs report was not acted on will be noted and described using the project’s data collection tools. The facilitator will escalate any issues where patient safety is at risk.
Data deposition and curation
All de-identified data will be stored on a REDCap database, on a secure university server. Patient information will be stored on their medical record and hospital-based servers that are password protected. Data will be stored for 5 years. A formal data management plan has been developed and approved by the Queensland University of Technology Research Unit.
Dissemination of results
Results will be disseminated in peer-reviewed publications and presented at national and international scientific meetings.
Discussion
This study proposes that successful implementation of PROMs requires careful attention to the local clinical setting and existing clinical workflows, and that barriers previously experienced in other settings can be overcome by following a prespecified implementation approach with an experienced facilitator. It is important to investigate implementation strategies because clinical trials have demonstrated significant benefits for patients but have also reported the difficulties of using PROMs in complex health systems outside the highly structured context of a clinical trial. Systematic reviews recommend a structured implementation approach that considers the many elements present in the health system into which PROMs are introduced. The use of the i-PARIHS framework together with the MRC Guidelines for Implementation of Complex Interventions, built on the work of ISOQOL, offers an implementation strategy that addresses the issues identified in the research to date. This study offers an opportunity to measure implementation scientifically, to introduce PROMs into clinical practice rapidly, and to inform future research and clinical practice.
Acknowledgments
I would like to acknowledge the supervision and knowledge of my PhD supervisors MJ, KA and DW, and the ongoing expertise and support of AM. Appreciation also goes to the Metro North Research Unit and the Queensland University of Technology. Thanks must go to the staff of the healthcare team, particularly Jenni Leutenegger, Therese Hayes and Annette Cubitt from Cancer Care Services at the Royal Brisbane and Women’s Hospital (RBWH), who contributed their time, knowledge and expertise to the development of this protocol. Thanks also to the Consumer Representatives at RBWH for their time in protocol development and pre-implementation data collection, in particular Gary Power and Anita McGrath, who gave their time contributing to the development of patient education and patient information sheets.
Footnotes
Contributors: NR and MJ drafted the protocol for this publication. AM contributed significantly to the drafting of this publication, particularly with expertise in implementation science and multidisciplinary care. KA contributed expertise regarding nursing care, symptom management and PROMs. DW contributed expertise regarding specialist medical care and health services management.
Funding: This work is supported by the Royal Brisbane and Women’s Hospital Postgraduate Scholarship and a QUT APA Scholarship. Computer equipment and infrastructure are funded by the ‘Susan Sudak Day to End Cancers’ Diamond Care Grant.
Competing interests: None declared.
Ethics approval: This project has received ethical approval from the Royal Brisbane and Women’s Hospital Human Research Ethics Committee (HREC/16/QRBW/100).
Provenance and peer review: Not commissioned; externally peer reviewed.
Patient consent for publication: Not required.
References
- 1. Coons SJ, Gwaltney CJ, Hays RD, et al. Recommendations on evidence needed to support measurement equivalence between electronic and paper-based patient-reported outcome (PRO) measures: ISPOR ePRO Good Research Practices Task Force report. Value Health 2009;12:419–29. doi:10.1111/j.1524-4733.2008.00470.x
- 2. Revicki DA, Osoba D, Fairclough D, et al. Recommendations on health-related quality of life research to support labeling and promotional claims in the United States. Qual Life Res 2000;9:887–900. doi:10.1023/A:1008996223999
- 3. Antunes B, Harding R, Higginson IJ; EUROIMPACT. Implementing patient-reported outcome measures in palliative care clinical practice: a systematic review of facilitators and barriers. Palliat Med 2014;28:158–75. doi:10.1177/0269216313491619
- 4. Sharma P, Dunn RL, Wei JT, et al. Evaluation of point-of-care PRO assessment in clinic settings: integration, parallel-forms reliability, and patient acceptability of electronic QOL measures during clinic visits. Qual Life Res 2016;25:575–83. doi:10.1007/s11136-015-1113-5
- 5. Basch E, Deal AM, Kris MG, et al. Symptom monitoring with patient-reported outcomes during routine cancer treatment: a randomized controlled trial. J Clin Oncol 2016;34:557–65. doi:10.1200/JCO.2015.63.0830
- 6. Velikova G, Keding A, Harley C, et al. Patients report improvements in continuity of care when quality of life assessments are used routinely in oncology practice: secondary outcomes of a randomised controlled trial. Eur J Cancer 2010;46:2381–8. doi:10.1016/j.ejca.2010.04.030
- 7. Klinkhammer-Schalke M, Koller M, Steinger B, et al. Direct improvement of quality of life using a tailored quality of life diagnosis and therapy pathway: randomised trial in 200 women with breast cancer. Br J Cancer 2012;106:826–38. doi:10.1038/bjc.2012.4
- 8. Basch E, Deal AM, Dueck AC, et al. Overall survival results of a trial assessing patient-reported outcomes for symptom monitoring during routine cancer treatment. JAMA 2017;318:197–8. doi:10.1001/jama.2017.7156
- 9. Denis F, Lethrosne C, Pourel N, et al. Randomized trial comparing a web-mediated follow-up with routine surveillance in lung cancer patients. J Natl Cancer Inst 2017;109. doi:10.1093/jnci/djx029
- 10. Basch E, Deal AM, Dueck AC, et al. Overall survival results of a trial assessing patient-reported outcomes for symptom monitoring during routine cancer treatment. JAMA 2017;318:197–8. doi:10.1001/jama.2017.7156
- 11. Porter I, Gonçalves-Bradley D, Ricci-Cabello I, et al. Framework and guidance for implementing patient-reported outcomes in clinical practice: evidence, challenges and opportunities. J Comp Eff Res 2016;5:507–19. doi:10.2217/cer-2015-0014
- 12. Duncan EA, Murray J. The barriers and facilitators to routine outcome measurement by allied health professionals in practice: a systematic review. BMC Health Serv Res 2012;12:96. doi:10.1186/1472-6963-12-96
- 13. Snyder CF, Aaronson NK, Choucair AK, et al. Implementing patient-reported outcomes assessment in clinical practice: a review of the options and considerations. Qual Life Res 2012;21:1305–14. doi:10.1007/s11136-011-0054-x
- 14. Craig P, Petticrew M. Developing and evaluating complex interventions: reflections on the 2008 MRC guidance. Int J Nurs Stud 2013;50:585–7. doi:10.1016/j.ijnurstu.2012.09.009
- 15. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci 2015;10:53. doi:10.1186/s13012-015-0242-0
- 16. Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci 2016;11:33. doi:10.1186/s13012-016-0398-2
- 17. Baskerville NB, Liddy C, Hogg W. Systematic review and meta-analysis of practice facilitation within primary care settings. Ann Fam Med 2012;10:63–74. doi:10.1370/afm.1312
- 18. Moore GF, Audrey S, Barker M, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ 2015;350:h1258. doi:10.1136/bmj.h1258
- 19. Hemming K, Girling A. A menu-driven facility for power and detectable-difference calculations in stepped-wedge cluster-randomized trials. Stata J 2014;14:363–80. doi:10.1177/1536867X1401400208
- 20. Fuller C, Michie S, Savage J, et al. The Feedback Intervention Trial (FIT)–improving hand-hygiene compliance in UK healthcare workers: a stepped wedge cluster randomised controlled trial. PLoS One 2012;7:e41617. doi:10.1371/journal.pone.0041617
- 21. Campbell M, Fitzpatrick R, Haines A, et al. Framework for design and evaluation of complex interventions to improve health. BMJ 2000;321:694–6. doi:10.1136/bmj.321.7262.694
- 22. Hughes JP, Granston TS, Heagerty PJ. Current issues in the design and analysis of stepped wedge trials. Contemp Clin Trials 2015;45(Pt A):55–60. doi:10.1016/j.cct.2015.07.006
- 23. Liao X, Zhou X, Spiegelman D. A note on “Design and analysis of stepped wedge cluster randomized trials”. Contemp Clin Trials 2015;45(Pt B):338–9. doi:10.1016/j.cct.2015.09.011
- 24. Pinnock H, Barwick M, Carpenter CR, et al. Standards for Reporting Implementation Studies (StaRI) Statement. BMJ 2017;356:i6795. doi:10.1136/bmj.i6795
- 25. National Institutes of Health. Common Terminology Criteria for Adverse Events (CTCAE) Version 4.0. US Department of Health and Human Services, 2009.
- 26. Dueck AC, Mendoza TR, Mitchell SA, et al. Validity and reliability of the US National Cancer Institute’s Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events (PRO-CTCAE). JAMA Oncol 2015;1:1051–9. doi:10.1001/jamaoncol.2015.2639
- 27. Hagelstein V, Ortland I, Wilmer A, et al. Validation of the German patient-reported outcomes version of the common terminology criteria for adverse events (PRO-CTCAE™). Ann Oncol 2016;27:2294–9. doi:10.1093/annonc/mdw422
- 28. Baeksted C, Pappot H, Nissen A, et al. Feasibility and acceptability of electronic symptom surveillance with clinician feedback using the Patient-Reported Outcomes version of Common Terminology Criteria for Adverse Events (PRO-CTCAE) in Danish prostate cancer patients. J Patient Rep Outcomes 2017;1:1. doi:10.1186/s41687-017-0005-6
- 29. Braithwaite J. Changing how we think about healthcare improvement. BMJ 2018;361:k2014. doi:10.1136/bmj.k2014
- 30. Bainbridge D, Seow H, Sussman J, et al. Multidisciplinary health care professionals’ perceptions of the use and utility of a symptom assessment system for oncology patients. J Oncol Pract 2011;7:19–23. doi:10.1200/JOP.2010.000015
- 31. Detmar SB, Muller MJ, Schornagel JH, et al. Health-related quality-of-life assessments and patient-physician communication: a randomized controlled trial. JAMA 2002;288:3027–34.
- 32. Berry DL, Hong F, Halpenny B, et al. The electronic self report assessment and intervention for cancer: promoting patient verbal reporting of symptom and quality of life issues in a randomized controlled trial. BMC Cancer 2014;14:513. doi:10.1186/1471-2407-14-513
- 33. Hooper R, Teerenstra S, de Hoop E, et al. Sample size calculation for stepped wedge and other longitudinal cluster randomised trials. Stat Med 2016;35:4718–28. doi:10.1002/sim.7028
- 34. Grayling MJ, Wason JM, Mander AP. Stepped wedge cluster randomized controlled trial designs: a review of reporting quality and design features. Trials 2017;18:33. doi:10.1186/s13063-017-1783-0
- 35. Silverman D. Interpreting Qualitative Data. London: Sage, 2006.
- 36. Miles M, Huberman A, Saldana J. Qualitative Data Analysis: A Methods Sourcebook. California: Sage, 2014.
- 37. Rouette J, Blazeby J, King M, et al. Integrating health-related quality of life findings from randomized clinical trials into practice: an international study of oncologists’ perspectives. Qual Life Res 2015;24:1317–25. doi:10.1007/s11136-014-0871-9