Qual Life Res. 2020 Jul 10;30(11):3015–3033. doi: 10.1007/s11136-020-02564-9

Table 4.

Implementation science metrics for evaluating PROM implementation initiatives in routine care settings

For each implementation science construct, metrics are grouped under the table's two columns: Evaluating perception of the innovation (PROMs) and Evaluating the implementation strategies.
Acceptability

Evaluating perception of the innovation (PROMs):

Patients and clinicians

• % willing to recommend PROMs to other patients

• % reporting PROMs helpful in discussing symptoms/symptom management

• % reporting ease of use and comprehensibility for PROMs and technology systems

Evaluating the implementation strategies:

• Stakeholder perceptions of acceptability of implementation strategies (e.g., PROM training session is appropriate length)

• Barriers and enablers for implementing PROMs

• Related contextual factor: organizational readiness for change

Appropriateness

Evaluating perception of the innovation (PROMs):

• PROM fit with patient population (e.g., literacy level, technology comfort, language(s), font size, culturally appropriate, meaningful for clinical condition)

• PROM fit for clinic team (e.g., PROM easy to interpret, meaningful for clinical care, integrated in electronic health record system, linked clinical decision support)

• PROM fit with clinic culture and values

• Perceived relative advantage of PROMs vs. usual care

• Leadership support for PROMs

Evaluating the implementation strategies:

• Stakeholder perceptions of clinic needs and resources for implementing PROMs

• Fit of potential implementation strategies for specific clinics, their needs and resources, clinic team members, and patient population

• Leadership support for implementation strategies (e.g., providing space and time for clinic team to receive training)

Feasibility

Evaluating perception of the innovation (PROMs):

• Extent to which technology or electronic health record can be developed or modified to administer PROMs and visualize results in a meaningful way for clinicians

• If collecting PROMs from home, feasibility testing considers underserved patient groups’ needs, internet access, and technology habits (or whether alternative data collection methods, such as interactive voice response, are offered)

• Consent rate > 70% (if applicable)

• How many and which items are missed or skipped (and identifiable patterns)

• Length of time for patients to complete the PROM, and its comprehensibility

• Rates of technical issues

• Dropout rate for patients

• PROM characteristics (e.g., literacy demand, number of items, preliminary psychometric properties if used in new population, validity and reliability evidence for population)
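
The item-level feasibility checks in this list (how many and which items are missed or skipped, identifiable skip patterns, and completion time) reduce to simple descriptive summaries. A minimal sketch in Python, assuming a hypothetical long-format table of item responses; the column names (patient_id, item, response, seconds_to_complete) and the numbers are illustrative, not taken from the source.

```python
import pandas as pd

# Hypothetical long-format item responses: one row per patient per PROM item.
# A missing value in "response" means the item was skipped.
responses = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "item": ["pain", "fatigue", "sleep"] * 3,
    "response": [3, None, 2, 4, 4, None, 2, 1, None],
    "seconds_to_complete": [210, 210, 210, 185, 185, 185, 340, 340, 340],
})

# Which items are skipped most often (possible wording or relevance problems)?
skip_rate_by_item = (
    responses.assign(skipped=responses["response"].isna())
             .groupby("item")["skipped"].mean()
             .sort_values(ascending=False)
)

# How long does completion take? One duration per patient per survey.
completion_seconds = responses.drop_duplicates("patient_id")["seconds_to_complete"]

print(skip_rate_by_item)
print(f"Median completion time: {completion_seconds.median() / 60:.1f} minutes")
```

Per-patient inspection of the same table can then reveal identifiable patterns, such as all items after a certain point left blank (suggesting drop-off rather than item-specific problems).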

Evaluating the implementation strategies:

• “Action, actor, context, target, time (AACTT)” framework [62]: describe who needs to do what differently, and select fit-for-purpose strategies

• % clinics completing at least one implementation activity or phase (and/or all activities and implementation phases)

• Rates of technical issues for clinics

• Stakeholder perceptions of which implementation strategies are possible

• Stakeholder perceptions of what to include in PROM training session

• Pilot study or rapid-cycle testing to determine whether an implementation strategy is possible (e.g., whether a specific workflow change is possible in a clinic)

• Which implementation activities were completed vs. skipped

Adoption

Evaluating perception of the innovation (PROMs):

• % of clinics advancing to administering PROMs routinely

• Representativeness of clinics willing to initiate PROMs

• Underserved patient groups (e.g., older patients) complete PROMs at similar rates to the clinic average

• Dropout rate for clinics

Evaluating the implementation strategies:

• Representativeness of clinics completing implementation activities

• Stakeholder perceptions and observations on which implementation support strategies were/were not effective in a clinic, and why

• How and why clinics operationalized implementation strategies

• Minor changes made to implementation strategies to fit local conditions or context (if major changes, see fidelity below)

• StaRI reporting guidelines for implementation strategies [61]

Reach/penetration

Evaluating perception of the innovation (PROMs):

• % of patient panel completing ≥ 1 PROM during defined time interval (denominator chosen appropriately: all patients with an in-person visit during time interval, etc.)

• % of missing data during defined time interval (with appropriate denominator)

• Informed missingness (correlated with patient demographics)

• Average # PROMs completed per patient during interval
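
Each reach metric above hinges on an explicitly chosen denominator, so it can help to compute them side by side from the same visit-level data. A minimal sketch, assuming a hypothetical table with one row per in-person visit during the interval and a flag for whether a PROM was completed at that visit; the column names, the age flag used to probe informed missingness, and all numbers are illustrative.

```python
import pandas as pd

# Hypothetical visit-level data for one clinic over a defined time interval:
# one row per in-person visit, flagged for whether a PROM was completed.
visits = pd.DataFrame({
    "patient_id":     [1, 1, 2, 3, 3, 3, 4, 5],
    "age_65_or_over": [True, True, False, False, False, False, True, False],
    "prom_completed": [True, False, True, False, True, True, False, False],
})

# Denominator: all patients with >= 1 in-person visit during the interval.
per_patient = visits.groupby("patient_id").agg(
    any_prom=("prom_completed", "any"),
    n_proms=("prom_completed", "sum"),
    older=("age_65_or_over", "first"),
)

reach = per_patient["any_prom"].mean()           # % of panel completing >= 1 PROM
missing = 1 - visits["prom_completed"].mean()    # % of visits with missing PROM data
avg_per_patient = per_patient["n_proms"].mean()  # average # PROMs per patient

# Crude look at whether missingness tracks a patient characteristic
# (here, an illustrative age flag).
reach_by_age = per_patient.groupby("older")["any_prom"].mean()

print(f"Reach: {reach:.0%}, visit-level missingness: {missing:.0%}, "
      f"avg PROMs/patient: {avg_per_patient:.1f}")
print(reach_by_age)
```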

Evaluating the implementation strategies:

• % of clinic team participating in implementation strategies

• % of clinic team attending training

• % of clinic team reporting that training helped them understand their new role and how to implement PROMs in their workflow

• Clinicians: % reporting self-efficacy for using PROMs after training

Fidelity

Evaluating perception of the innovation (PROMs):

• Consistency of PROMs completed by patients (e.g., an 80% PROM completion rate for the clinic; see the sketch below)

• % of clinicians who review PROMs with patients during visits

• How and why clinics adapted the innovation (e.g., changed PROM timeframe for items)

• FRAME framework for reporting adaptations to interventions [49]

• FIDELITY framework [50]: report on five implementation fidelity domains (study design, training, delivery, receipt, and enactment)

Evaluating the implementation strategies:

• How and why clinics or support personnel adapted implementation strategies (e.g., changed the PROM training format or content)

• % of clinics completing all implementation activities
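
This is the sketch referenced in the first Fidelity bullet: the clinic-level completion-rate target and the share of clinicians who review results with patients are plain proportions. It assumes hypothetical per-visit records (whether a PROM was completed and whether the clinician reviewed it), with the 80% target taken from the example above; all names and numbers are illustrative.

```python
# Hypothetical per-visit fidelity records for one clinic.
visits = [
    {"clinician": "A", "prom_completed": True,  "reviewed_with_patient": True},
    {"clinician": "A", "prom_completed": True,  "reviewed_with_patient": False},
    {"clinician": "B", "prom_completed": False, "reviewed_with_patient": False},
    {"clinician": "B", "prom_completed": True,  "reviewed_with_patient": True},
    {"clinician": "C", "prom_completed": True,  "reviewed_with_patient": True},
]

TARGET_COMPLETION = 0.80  # illustrative target from the example above

completion_rate = sum(v["prom_completed"] for v in visits) / len(visits)

# A clinician counts as "reviewing PROMs" here if they reviewed results in at
# least one visit where a PROM was completed (a deliberately simple rule).
clinicians = {v["clinician"] for v in visits}
reviewers = {v["clinician"] for v in visits
             if v["prom_completed"] and v["reviewed_with_patient"]}

print(f"Completion rate: {completion_rate:.0%} "
      f"(target met: {completion_rate >= TARGET_COMPLETION})")
print(f"Clinicians reviewing PROMs with patients: {len(reviewers)}/{len(clinicians)}")
```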

Cost

Evaluating perception of the innovation (PROMs):

• Financial, personnel, and time costs to administer and review PROMs on a routine basis

• Technology costs

Evaluating the implementation strategies:

• Financial, personnel, technology, and time costs to implement PROMs

• Cost of Implementing New Strategies (COINS) [64]

Sustainability

Evaluating perception of the innovation (PROMs):

• Extent to which PROMs become normalized and routinized in a clinic’s workflow

• Stakeholder perceptions

• Periodically assess whether updates to PROMs are needed

Evaluating the implementation strategies:

• Routine data-informed feedback to clinic on PROM completion rates, missing data, and informed missingness

• Provide additional implementation support to identify and overcome new or ongoing barriers (if needed)

• Retraining or “booster” training, or training new staff (if needed)

The grouping of metrics under each construct reflects the important distinction between evaluating perceptions of the innovation (PROMs/PREMs) and evaluating the implementation strategies

ePROM, electronic patient-reported outcome measure; AACTT, Action, Actor, Context, Target, Time framework; StaRI, Standards for Reporting Implementation Studies guidelines; FRAME, Framework for Reporting Adaptations and Modifications-Enhanced; COINS, Cost of Implementing New Strategies scale