Abstract
Objective:
Patient-reported outcomes (PROs) contain valuable information that can be leveraged by providers to perform timely interventions and improve quality of life and survival. However, the implementation of electronic PROs (ePROs) remains a challenge from technical and evaluation perspectives. Our objective was to construct a robust electronic health record (EHR)-integrated ePRO information infrastructure founded on RE-AIM (reach-effectiveness-adoption-implementation-maintenance) principles.
Materials and Methods:
We used Epic Systems as our EHR platform to build the MD Anderson Symptom Index Head and Neck Module (MDASI-HN) for release to all patients undergoing evaluation and/or treatment in our HN Radiation Oncology clinics. RE-AIM metrics were established and used to design patient, provider, and implementing facilitator information tools.
Results:
From January 2021 to July 2024, our ePRO program collected 13,156 patient-submitted ePROs from 3,497 unique HN patients, with a 12-month sustained ePRO compliance rate of 82%. We also propose a dynamic 2-cycle implementation model that can be used to continuously (re)define, build, and adapt ePRO information tools for patients, providers, and program facilitators.
Discussion:
Our ePRO framework has several benefits, including integrated clinical data for enhanced decision-making, potential scalability, and use of a common EHR system. Formative (ie, mid-phase) evaluations were essential to our program, allowing for timely optimization of ePRO compliance, ePRO usage by clinical staff, and secondary use of high-quality ePRO data.
Conclusion:
In this article, we provide a valuable roadmap toward developing a comprehensive, EHR-based ePRO information infrastructure simultaneously optimized for clinical utility, informatics operations, and implementation evaluation founded on RE-AIM principles.
Keywords: implementation research, information infrastructure, head and neck cancer, patient-reported outcomes, RE-AIM
Introduction
The value of patient-reported outcomes (PROs) in oncology has been well established in health care literature. PROs are formally defined as the status of a patient’s health condition that comes directly from the patient.1 PROs can characterize a spectrum of multidimensional domains, including overall quality of life, functional status (eg, swallowing function post-radiotherapy for head and neck cancer [HNC]), symptom burden, and patient experience.2 When collected prospectively and longitudinally throughout cancer survivors’ lifetimes, dynamic patterns of early and late PRO burden emerge and can be investigated to identify high-risk populations and develop more effective, data-driven toxicity mitigation strategies.3–7 Clinically, PROs have also been associated with enhanced multidisciplinary care, patient quality of life, symptom management, and patient-provider communication.8–10
Despite the abundance of support for adopting PROs, high PRO compliance can be challenging to achieve even for heavily resourced, multicenter cancer clinical trials. Independent predictors of PRO completion include institutional size, patient age, and treatment group.11 Additional barriers to PRO implementation in clinical practice include issues with organizational workflows, limited stakeholder engagement (ie, clinical staff and patients), time management, and information technology infrastructures that are underdeveloped for PRO collection, visualization, and efficient clinical reporting.12–14
Electronic health record (EHR) systems present a unique opportunity to construct and implement integrated, electronic PROs (ePROs) with structured metadata (ie, who, when, and how were ePROs captured). Moreover, relevant clinical data can be embedded within the same ePRO information system (ie, imaging, laboratory, and treatment data) to provide additional context to reported symptoms. Given rising interest in EHR-based ePRO integration, the Patient Centered Outcomes Research Institute (PCORI) developed guidelines for health care providers interested in ePRO adoption.15 However, these recommendations are generalized to enable versatile applications in different settings, subsequently leaving knowledge gaps on how to monitor and ensure the maintenance, or institutionalization, of ePROs.
Several implementation evaluation frameworks, such as RE-AIM, have been well established in public health assessments of community-based interventions and are being increasingly applied in cancer settings.16–21 The RE-AIM framework, for example, conceptualizes the impact of implementation efforts as a product of 5 domains. These include reach (proportion of target population participation and risk characteristics), effectiveness (positive and negative outcomes of an intervention and variability across subgroups), adoption (representativeness of the settings and personnel that adopt the intervention), implementation (fidelity to and/or adaptations to the intended interventions), and maintenance (extent to which interventions are sustained over time or institutionalized).21
Reports on RE-AIM–based ePRO implementation efforts or the technical build of EHR-based ePRO systems exist.22–25 However, these tend to be smaller-scale studies or discrete publications without discussions of information tools that can be generated to support the various stages of an implementation program. To address this knowledge gap, we systematically outline our technical work in designing an EHR (Epic)-based ePRO information infrastructure. We also describe our novel 2-cycle ePRO information architecture model that supports an ongoing ePRO implementation program driven by the RE-AIM framework in a large cancer center.
Materials and Methods
The ePRO Implementation Program
An Epic-based ePRO implementation program using the RE-AIM evaluation framework was initiated on January 28, 2021, at MD Anderson Cancer Center’s Department of Radiation Oncology, Head and Neck Section. The target population to receive ePROs was patients with a diagnosis of HNC. The target setting was outpatient visits in the Head and Neck Radiation Oncology (HNRO) clinic. The selected ePRO was the MD Anderson Symptom Index Head and Neck Module (MDASI-HN), a 28-item, validated questionnaire designed to measure general and HNC-specific symptom burden.26 The MDASI-HN is rated on a numeric scale ranging from 0 (not present) to 10 (as bad as you can imagine) and has a 24-hour recall period. Additional stakeholders for this implementation study included all clinical staff involved in HNRO patient care, including schedulers, medical assistants, nurses, advanced practice providers, and faculty. ACM served as the site facilitator and worked closely with the Epic analytics team (AP, TS, KS, SA) to develop the ePRO-driven information tools.
From January 28, 2021, to July 31, 2024, 13,156 patient-submitted ePROs were collected from 3,497 unique HNC patients. A critical all-staff meeting occurred in February 2023 that formally launched the RE-AIM information infrastructure. As a result, ePRO compliance accelerated from 35% to 87.4% with a sustained monthly ePRO completion average of 82% over 12 months. The RE-AIM framework and proposed metrics for this study are described below. This study had Institutional Review Board approval under protocol 2024–0002.
The RE-AIM Framework
An overview of the framework and interactive RE-AIM planning tool can be found at re-aim.org. The target population for this initiative (ie, reach) has been described above. The ePRO program was approved by the institution’s Patient Survey Informatics Committee, which provides oversight of ePRO builds and proposed workflows. The program was then promoted internally through the dissemination of clinical staff educational materials (Supplement S1), quarterly ePRO meetings, and patient-facing educational materials (Supplement S2). Effectiveness was defined as total ePRO compliance, calculated as the percentage of completed ePROs out of the total assigned ePROs. Patient adherence to completing longitudinal ePROs throughout their cancer journey from consultation to on-treatment weekly see visits (WSVs) and/or follow-ups was also examined. Key characteristics to review the degree of ePRO adoption included the clinical setting (ie, the main center or the proton therapy center) and individual provider clinics (ie, the assigned physician and their staff). The feasibility of ePRO implementation was expected to be high because EHR-integrated ePROs can be automatically assigned to eligible visit orders (ie, a future follow-up) and automatically delivered to patients via Epic MyChart in preparation for their future visit(s). Several staff-facing information tools were also designed to facilitate staff-specific responsibilities in the designed ePRO clinical workflow. Proposed adjustments or modifications to the ePRO program were recorded during monthly-to-quarterly meetings with our analytics team and ePRO Committee, the latter inclusive of all HNRO clinical staff members. Changes to the EHR-based criteria could be audited by examining changes to the rule-based logic.
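The effectiveness metric described above reduces to a simple ratio of completed to assigned ePROs. A minimal sketch (illustrative only, not the production report logic) is:

```python
def epro_compliance(completed: int, assigned: int) -> float:
    """Effectiveness metric: completed ePROs as a percentage of all
    assigned ePROs, rounded to one decimal place."""
    if assigned == 0:
        return 0.0  # no ePROs assigned in the period being measured
    return round(100.0 * completed / assigned, 1)

# Example: 82 completed out of 100 assigned ePROs
print(epro_compliance(82, 100))  # -> 82.0
```

In practice this ratio would be computed per month, per provider, or per clinic site, as described in the reporting section below.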
Maintenance of EHR-based ePRO integration was routinely monitored through automated reports developed using dashboarding tools (SAP BusinessObjects Web Intelligence, or WebI) available at our institution and connected to our EHR’s data warehouse.29,30
Results
ePRO Information Architecture Design
Figure 1 illustrates our novel information infrastructure designed to specifically support ePRO implementation programs in clinical practice. This 2-cycle implementation model, while founded in RE-AIM principles, highlights critical components of an ePRO clinical program such as stakeholder engagement for developing user-specific clinical tools and guiding technical adaptations to the ePRO distribution and workflow. When ePROs are directly linked to an EHR system as in our study, there are additional opportunities for automating reports of ePRO compliance by connecting analytical software to the EHR’s ePRO database.
Figure 1.

Information architecture for electronic health record (EHR)-based electronic patient-reported outcome (ePRO) integration. Our novel ePRO information architecture is represented as a 2-cycle implementation model. In the clinical phase, stakeholder-specific (patient vs staff) information tools have been developed and deployed to enhance ePRO adoption and compliance. EHR-based ePROs are stored as structured flowsheets in the EHR database, which is linked to analytical software for near real-time, automated reporting of ePRO program-specific RE-AIM metrics. These reports provide quantitative data to inform formative evaluations with stakeholders and iterative refinement of ePRO information tools used in the clinic.
ePRO Build and Release
Upon defining the target population, setting, and ePRO tool (ie, MDASI-HN), we collaborated with EHR analysts to build structured ePRO flowsheets. Each row in the ePRO flowsheet was assigned to a unique survey item, and all questions were reviewed by the facilitator for potential designation as a high alert value (HAV) question. HAVs are defined as patient-entered scores above a certain threshold that should trigger a rule-based action linked to a best practice advisory (BPA). For example, if a patient rated the question “your shortness of breath at its worst” as 8 or greater, the proposed BPA was to route these HAVs to a nursing staff pool in-basket for prompt triaging. All nurses were educated on how to manage incoming HAVs before implementation. A total of 5 items in the MDASI-HN were assigned as HAV-containing questions (Q1, Q5, Q6, Q16, Q19). In addition, high distress scores were routed to a designated social worker.
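The HAV routing rule described above amounts to a per-item threshold check on patient-entered scores. In the sketch below, the 8-point threshold comes from the shortness-of-breath example in the text; the thresholds assigned to the remaining HAV items are illustrative placeholders, not the institutional configuration:

```python
# Illustrative HAV rule sketch. The 5 MDASI-HN items designated as
# HAV-containing questions are Q1, Q5, Q6, Q16, and Q19 (per the text);
# all thresholds here are placeholders except that the text gives >= 8
# as the shortness-of-breath alert threshold.
HAV_THRESHOLDS = {"Q1": 8, "Q5": 8, "Q6": 8, "Q16": 8, "Q19": 8}

def triggered_havs(responses: dict) -> list:
    """Return the HAV-designated items whose patient-entered score meets
    or exceeds the alert threshold. In the EHR, these would fire a best
    practice advisory (BPA) routing the response to the nursing in-basket
    pool for triage."""
    return [item for item, threshold in HAV_THRESHOLDS.items()
            if responses.get(item, 0) >= threshold]

# A submission rating one HAV item as 9 triggers routing for that item only
print(triggered_havs({"Q1": 3, "Q5": 9, "Q16": 2}))  # -> ['Q5']
```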
After the ePRO flowsheet was approved, logic or rules within the EHR system were developed to define when ePROs should be released to patients. For simplicity, ePROs were linked to a series of approved visit types, including new patient, consult, WSV, and follow-up. Telehealth visit codes were also included, as this implementation study was initiated during the COVID-19 pandemic. The ePRO release timeframe was based on the validated questionnaire’s recall period (ie, 24 hours).
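The visit-linked release logic can be sketched as two conditions: the visit type must belong to the approved set, and the release must fall within the questionnaire's 24-hour recall window before the visit. The visit-type labels below are illustrative stand-ins for the EHR's visit codes:

```python
from datetime import datetime, timedelta

# Illustrative stand-ins for the approved visit-type codes in the EHR rule
APPROVED_VISIT_TYPES = {"new patient", "consult", "wsv", "follow-up",
                        "telehealth follow-up"}
RECALL_WINDOW = timedelta(hours=24)  # MDASI-HN's validated recall period

def should_release(visit_type: str, visit_time: datetime,
                   now: datetime) -> bool:
    """Release an ePRO only for approved visit types and only within the
    24-hour window preceding the scheduled visit."""
    if visit_type.lower() not in APPROVED_VISIT_TYPES:
        return False
    return timedelta(0) <= visit_time - now <= RECALL_WINDOW

visit = datetime(2024, 7, 1, 9, 0)
# 23 hours before a follow-up: within the recall window, so release
print(should_release("Follow-up", visit, datetime(2024, 6, 30, 10, 0)))
```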
Patient-Facing Tools for ePRO Completion
Activated ePROs can be completed by patients through MyChart, Epic’s proprietary patient web portal and mobile application.27 MyChart enables automated notifications to patients of upcoming visits with their clinical providers. When logged into the patient portal, patients can access visit-linked ePROs by 1) clicking on the Menu button and then Questionnaires or 2) clicking on their scheduled visit and finding the section titled “Prepare for Your Visit” (Figure 2). Another route for completing ePROs is through use of a tablet at the time of check-in. Epic’s Welcome application was installed on iPads that were handed out by schedulers to patients checking in for appointments who had not completed assigned ePROs beforehand. This tablet application is also advantageous in allowing schedulers to add new ePROs in real time for various reasons (ie, when providers request ePROs for a visit type not part of the ePRO logic).
Figure 2.

Electronic health record–based electronic patient-reported outcome (ePRO) patient access points. Two patient-facing options for ePRO completion are presented in this figure and include access to questionnaires via MyChart’s primary menu (option 1) or visit-based “Get Ready” dashboard (option 2).
Clinical Staff Tools for Monitoring and Reinforcing ePRO Completion
Continuous, real-time monitoring of ePRO completion status by clinical stakeholders is critical for improving and maintaining an ePRO implementation program. Clinic schedule properties are customizable in Epic, as shown in Figure 3. Users can right-click on their schedule to review its properties, search for the ePRO status column (labeled per facilitator/Epic team preference) in the Available Columns section, add it to their schedule (green box), and move its position using the up and down arrows (blue box) in the Selected Columns section. Figure 3B shows the MDASI-HN RO ePRO status column located on the staff’s schedule with various values available, including Assigned (not started), Started at Home (partially completed), and Completed. This schedule column was integrated into all staff schedules to facilitate daily screening of ePRO compliance. Schedulers checking patients into clinic were then tasked with reminding patients with an Assigned status to complete their ePROs on a mobile phone or the Welcome tablet prior to being roomed. Of note, patients with a status of “Completed, Completed” had 2 ePROs assigned and completed for that visit (ie, the MDASI-HN and OHIP questionnaires, the latter of which evaluates oral health function).28
Figure 3.

Schedule properties (A) and electronic patient-reported outcome (ePRO) status column (B). As part of clinical staff–facing tools for promoting ePRO compliance, Epic clinic schedule properties can be customized to include an ePRO column that specifies the ePRO status (assigned, partially completed, or completed) per patient.
For patients having issues with technology or lacking a smart mobile device, and for clinics without the Welcome application, designated clinical staff can assist with entering ePROs manually through the flowsheet tab in Epic Hyperspace (Figure 4). The flowsheet tab within an open patient encounter can display a variety of flowsheets, including the activated ePRO labeled here as Symptom Inventory MDASI-HN. Patient-entered data (ie, via MyChart or the Welcome application) will display a human icon to the left of the value. Missed questions or incomplete ePROs can be quickly identified and reviewed with the patient for manual completion by clinical staff, as highlighted in Figure 4, where a dash appears to the left of the newly entered value.
Figure 4.

Electronic health record–integrated electronic patient-reported outcome (ePRO) flowsheets. Structured flowsheets of ePRO data are shown in this figure with a human icon to denote patient-entered data versus a blue rectangle for staff-entered data.
Clinical Staff Tools for ePRO Integration in Clinical Decisions and Documentation
EHR-based ePROs provide versatile options for analyzing and using ePRO data during clinical assessments, symptom management, and documentation (Figure 5). Automated ePRO in-basket messages are valuable for timely and prioritized patient triaging as they are immediately sent to designated staff (ie, nursing pool) when HAVs are entered by a patient. Figure 5A shows an in-basket message displaying the time of ePRO completion. The highlighted values in black are linked to HAV-enabled questions, and the highlighted red values denote when an HAV threshold has been met. Self-service reporting tools in Epic that allow for customizable clinical data exploration (SlicerDicer, Figure 5B)29 can be leveraged by clinical staff to develop dashboards with aggregate statistics on ePROs using selected parameters (eg, the number of HAV-containing responses per visit type). The customizable, interactive Synopsis display (Figure 5C) is perhaps one of the most powerful tools for providers as it can integrate longitudinal ePRO data with relevant clinical data. Per feedback from our HNRO clinical team, the HNRO Synopsis was designed to visualize vital signs first followed by ePRO data, laboratory data, systemic therapy records (ie, agent, dose, and delivery dates), and imaging data. It also allows for trending of items such as weight and pain scores over time. This enriched ePRO context allows for quicker pattern recognition and more data-driven, proactive symptom management during and after cancer therapy.
Figure 5.

Clinical tools for analyzing electronic patient-reported outcomes (ePROs). High alert value (HAV) in-basket messaging systems (A) for ePROs are critical for alerting clinical staff of abnormal symptoms, while SlicerDicer dashboards (B) and the Synopsis View (C) are customizable tools for visualizing aggregated clinic-level or patient-level data, respectively.
The ability to build ePRO SmartText is another Epic feature available to clinical staff when ePROs are designed as unique flowsheets. SmartText phrases are customizable, templated text blocks that extract structured data from Epic and quickly add this information to clinical documentation. For example, Supplemental Figure 1 highlights the SmartText phrase, “MDASIHNRO,” which was added to a follow-up clinical note to automatically document patient-specific ePRO data from the most recent 3 appointments.
Designing Customized Reports for Implementation Evaluation
While most implementation studies focus on end-of-program summative evaluations, formative evaluations enable mid-course modifications or adaptations to a program to improve the intervention.30 As part of the ePRO information infrastructure, automated ePRO compliance reports were developed to provide quantitative data for formative evaluations between implementation facilitators and stakeholders. Specifically, we developed multiple reports in WebI detailing the reach of the program by linking flowsheet data to patient demographics, the effectiveness by tracking ePRO compliance, the adoption by filtering reports by location and providers, and the implementation-maintenance phase by tracking the sustainability of ePRO compliance over longer periods of time.
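The roll-up behind these compliance reports can be sketched with standard-library Python. The record fields below are hypothetical stand-ins for the flowsheet and scheduling data joined in WebI, not the actual warehouse schema:

```python
from collections import defaultdict

def monthly_compliance(records, provider=None):
    """Aggregate ePRO assignment records into monthly completion rates,
    optionally filtered by provider (mirroring the report's filter panel).

    records: iterable of dicts with hypothetical keys 'month' (e.g.,
    '2023-02'), 'provider', and 'completed' (1 if the ePRO was completed,
    0 otherwise)."""
    assigned = defaultdict(int)
    completed = defaultdict(int)
    for r in records:
        if provider is not None and r["provider"] != provider:
            continue
        assigned[r["month"]] += 1
        completed[r["month"]] += r["completed"]
    return {m: round(100.0 * completed[m] / assigned[m], 1) for m in assigned}

records = [
    {"month": "2023-02", "provider": "A", "completed": 1},
    {"month": "2023-02", "provider": "A", "completed": 0},
    {"month": "2023-03", "provider": "B", "completed": 1},
]
print(monthly_compliance(records))  # -> {'2023-02': 50.0, '2023-03': 100.0}
```

The same grouping extended by clinic site or visit type yields the adoption and implementation-maintenance views described above.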
Figure 6 shows our ePRO completion rate report with various filters on the left (ie, month-year, visit type, provider). This report was shared monthly with our clinical stakeholders and reviewed quarterly with the ePRO committee to identify and address barriers to ePRO compliance. For example, ePRO completion rates within the first 6 months increased after we identified a problem with the ePRO logic whereby certain visit types (ie, follow-ups) had multiple identifying codes in Epic, but not all had been included in the logic for automated ePRO linkage. Differences in ePRO completion rates across clinic sites, attributed to variations in available staffing, were iteratively reviewed with stakeholders and addressed through adaptive modifications to staff roles. When the Welcome tablets were launched later in the program, in March 2024 (main center) and June 2024 (proton therapy center), there was a need to evaluate their contributions to ePRO compliance. An EHR-based workbench report (Supplemental Figure 2) was generated for customizable timeline reporting of such data and demonstrated that nearly a quarter of ePROs were being completed using the tablets in the clinic.
Figure 6.

WebI report on electronic patient-reported outcome (ePRO) compliance. This is an example of an automated ePRO compliance report generated from ePRO flowsheet data that have been stored in the electronic health record (EHR) database and retrieved using WebI, a dashboarding software tool. On the left are customizable filters based on clinical variables such as date of ePRO completion and visit type. On the right, the graph shows a trend of monthly ePRO completion rates. The all-staff meeting that occurred in February 2023 to launch the RE-AIM and EHR-based ePRO program is highlighted in red font. A pre- and post-implementation ePRO trend can be visualized as an absolute count of completed ePROs (blue bars) or as a percentage (line graph).
Discussion
There remains a critical need to disseminate scalable methodologies for building, maintaining, and sustaining health care information architectures that support all phases of ePRO implementation. To our knowledge, this is the first report describing a comprehensive, RE-AIM–based ePRO infrastructure fully integrated into one of the world’s most widely adopted EHR systems. With over 325 million patient records in Epic Systems (Epic Systems Corporation),31 an Epic-based ePRO platform offers substantial advantages for scalability, interoperability across disciplines, and integration of ePROs into real-time clinical workflows within a unified data environment.
When designing an ePRO implementation program, the delivery mechanism should align with available resources and scalability goals. Previous feasibility studies32 highly reliant on tablet-based delivery of ePROs (ie, Tonic) have shown that these systems can efficiently transfer ePRO data to the EHR but are highly dependent on equipment availability and personnel oversight—factors that increase operational burden and limit scalability. In contrast, our program leveraged Epic’s native functionality to automate visit-based ePRO deployment through MyChart, enabling patients to complete surveys remotely at any time during the recall period, including prior to check-in. This model minimizes dependence on staff and physical resources while maintaining high data quality. To enhance accessibility, our Welcome tablet workflow added a complementary in-clinic option with minimal staff assistance, improving ePRO compliance by approximately 22%. Based on these findings, we recommend using tablet-based delivery as a supplemental—not primary—method for ePRO collection, while emphasizing patient education within patient portals and pre-treatment information sessions with staff to promote engagement and compliance.
Effective visualization of ePRO data remains essential for clinical utility. Poorly designed interfaces lacking patient summaries, temporal graphics, or clear abnormality indicators have been identified as major barriers to ePRO adoption.14 Our integrated Epic Synopsis view consolidates longitudinal ePROs with relevant clinical and treatment data, allowing for interactive filtering, highlighting, and trend visualization. Automated provider alerts for HAVs and embedded SmartText templates further streamline triage and documentation, thereby reinforcing ePRO use in clinical decision-making. As part of an ePRO program, who receives triage alert–level ePROs and how they are to be managed should be carefully considered and predefined as clinical operating procedures disseminated to clinical staff. These protocols will vary depending on available staffing and established clinical workflows, and continuous formative evaluations with clinical stakeholders are recommended to mitigate role confusion and ensure patient safety.
Our proposed ePRO information architecture, visualized as a 2-cycle implementation model in Figure 1 and anchored in RE-AIM principles, integrates dynamic feedback loops between clinical operations and informatics optimization. Structured ePRO data collected during the clinical implementation cycle and deposited daily into the EHR database enables near real-time generation of ePRO program–tailored RE-AIM metrics via linked analytical software (ie, WebI). These automated dashboards support the implementation evaluation cycle by identifying new or persistent barriers and guiding responsive workflow refinements.
Although this work was conducted in an Epic environment, the underlying ePRO information architecture can be EHR-agnostic and adaptable. Other vendors, such as Oracle,33 now offer integrated ePRO modules with patient-level clinical data or avenues to integrate applications of computerized adaptive testing, which have been shown to reduce patient-related survey fatigue.34 For low-resource or non-Epic settings, similar infrastructures can be established using open-source or modular tools. Secure web applications such as REDCap35 with automated email or SMS survey links can be used to design and automatically release ePROs to patients. REDCap’s Clinical Data Interoperability Services module can connect to an EHR’s Fast Health Interoperability Resources–compliant application programming interface, which serves as a standardized gateway for clinical data exchange to pull clinical data from an EHR for pairing with ePRO data.36 While we designed our implementation reports using WebI, Microsoft’s Power BI or R Shiny dashboards37,38 can be leveraged to design interactive information tools that support various phases of an ePRO implementation program. These adaptations preserve the same RE-AIM framework while enabling sustainable deployment across diverse environments. Inclusion of and early engagement with IT analysts are highly recommended to enhance the design and rollout of these tools tailored to stakeholder needs and available software resources.
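As a concrete sketch of the FHIR-based alternative mentioned above, the snippet below constructs a standard FHIR R4 search request for patient-submitted questionnaire data. The QuestionnaireResponse resource and its patient, questionnaire, and authored search parameters are defined in the FHIR specification; the server base URL and identifiers are hypothetical placeholders:

```python
from urllib.parse import urlencode

def questionnaire_response_url(base_url: str, patient_id: str,
                               questionnaire: str, since: str) -> str:
    """Build a FHIR R4 search URL retrieving a patient's ePRO submissions
    (QuestionnaireResponse resources) authored on or after 'since'."""
    params = urlencode({
        "patient": patient_id,
        "questionnaire": questionnaire,
        "authored": f"ge{since}",  # FHIR 'ge' prefix = greater-or-equal
    })
    return f"{base_url.rstrip('/')}/QuestionnaireResponse?{params}"

# Hypothetical server and questionnaire canonical URL for illustration
print(questionnaire_response_url(
    "https://fhir.example.org/r4", "12345",
    "https://example.org/Questionnaire/mdasi-hn", "2024-01-01"))
```

A dashboarding layer (eg, Power BI or R Shiny, as noted above) could issue such requests on a schedule to reproduce the automated RE-AIM reporting cycle outside of Epic.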
This study has several limitations. First, although we demonstrated the feasibility of Epic-based ePROs, Epic’s extensive configurability and the resulting local variations in ePRO flowsheet builds could affect interoperability and reproducibility across sites. To address this issue, future work should focus on disseminating consensus-based ePRO flowsheet build standards, ideally as common data elements developed and shared via the National Institutes of Health’s Common Data Elements repository.39 Epic’s patient-facing ePRO visualization features, in contrast, tend to be less customizable than those of other digital patient journey platforms such as Visiontree,40 which may impact patient satisfaction and compliance. Second, our ePROs were developed only in English; additional language translations should be made available to improve patient participation (ie, reach). Third, while the primary scope of this paper is to provide a technical outline toward building ePRO information architectures, relevant clinical implementation metrics (ie, adoption surveys) have not been presented but will be discussed in greater detail in a subsequent manuscript on the clinical impact of RE-AIM.
Conclusion
Our work provides a practical roadmap for developing and maintaining an EHR-based ePRO information infrastructure that balances clinical utility and implementation evaluation founded on RE-AIM principles. Future directions include reporting of the clinical impact of our ePRO implementation program and multidisciplinary expansion of this framework across oncology subspecialties with differing surveillance schedules. Moreover, our secondary aim will be to leverage the resultant high-quality, structured data for predictive toxicity modeling and algorithmic symptom management pathways to optimize survivorship care.
Supplementary Material
Funding:
This work was supported directly or in part by funding/resources from the National Institutes of Health (NIH) National Institute for Dental and Craniofacial Research (K01DE030524, P01CA285249–01A1); NIH National Cancer Institute (K12CA088084); and The University of Texas MD Anderson Cancer Center Charles and Daneen Stiefel Center for Head and Neck Cancer Oropharyngeal Cancer Research Program. The content is solely the responsibility of the authors and does not necessarily represent the official view of the National Institutes of Health.
Footnotes
Conflict of Interest Statement: ACM serves as a member of the AMIA Clinical Research Informatics Working Group. CDF has received travel, speaker honoraria, and/or registration fee waivers unrelated to this project from Siemens Healthineers/Varian, Elekta AB, Philips Medical Systems, The American Association for Physicists in Medicine, The American Society for Clinical Oncology, The Royal Australian and New Zealand College of Radiologists, Australian & New Zealand Head and Neck Society, The American Society for Radiation Oncology, The Radiological Society of North America, and The European Society for Radiation Oncology.
Ethics Approval Statement: The IRB of The University of Texas MD Anderson Cancer Center approved this research under protocol 2024–0002.
CRediT Statement: In accordance with the Contributor Role Taxonomy (CRediT, https://credit.niso.org), the contributing authors have designated responsibilities and individual author attribution: The corresponding author (ACM) assumes responsibility for role assignment, and all contributors have been given the opportunity to review and confirm assigned roles: Conceptualization: ACM, AL, CDF, DIR, JP, KH, and GBG; Methodology: ACM, CDF, DIR, KH, GBG; Validation: all co-authors; Resources: all co-authors; Data curation: ACM; Funding acquisition: ACM and CDF; Investigation: ACM; Project administration: ACM; Supervision: ACM, CDF, DIR, KH, and GBG; Writing – original draft: ACM; Writing - review & editing: all co-authors.
ICMJE Author Statement: In accordance with International Committee of Medical Journal Editors (ICMJE, https://www.icmje.org) recommendations, all authors affirm qualification for authorship via the following criteria: “Substantial contributions to the conception or design of the work; or the acquisition, analysis, or interpretation of data for the work; AND Drafting the work or reviewing it critically for important intellectual content; AND Final approval of the version to be published; AND Agreement to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.”
Contributor Information
Amy C. Moreno, Department of Radiation Oncology, Division of Radiation Oncology, The University of Texas MD Anderson Cancer Center.
Angela Peek, Department of Clinical Access Revenue System, The University of Texas MD Anderson Cancer Center.
Toshiko S. Stein, Department of Clinical Access Revenue System, The University of Texas MD Anderson Cancer Center.
Kevin R. Shook, Department of Enterprise Data Engineering and Analytics, The University of Texas MD Anderson Cancer Center.
Sara M. Ali, Department of Enterprise Data Engineering and Analytics, The University of Texas MD Anderson Cancer Center.
Laia Humbert-Vidan, Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center.
Aileen Chen, Department of Thoracic Radiation Oncology, Division of Radiation Oncology, The University of Texas MD Anderson Cancer Center.
Miriam Lango, Department of Head & Neck Surgery, The University of Texas MD Anderson Cancer Center.
Anna Lee, Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center.
Michael Spiotto, Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center.
William H. Morrison, Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center.
Adam S. Garden, Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center.
Jack Phan, Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center.
Steven J. Frank, Department of GU Radiation Oncology, The University of Texas MD Anderson Cancer Center.
Katherine Hutcheson, Department of Head & Neck Surgery, The University of Texas MD Anderson Cancer Center.
David I. Rosenthal, Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center.
Clifton D. Fuller, Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center.
G. Brandon Gunn, Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center.
Preprint Availability Statement:
In accordance with NIH Policy NOT-OD-17-050, Reporting Preprints and Other Interim Research Products, we have deposited a pre–peer review version of this submitted manuscript at medRxiv (https://doi.org/10.1101/2025.03.31.25324980).
Data Availability Statement:
All data produced in the present work regarding information infrastructure design are contained in the manuscript.
