Med Care. 2023 Sep 7;61(10):689–698. doi: 10.1097/MLR.0000000000001897

The Costs of Implementing a Conversation Aid for Uterine Fibroids in Multiple Health Care Settings

Stephanie C Acquilano 1, Rachel C Forcino 1, Danielle Schubbe 1, Jaclyn Engel 1, Marisa Tomaino 1, Lisa C Johnson 1, Marie-Anne Durand 1, Glyn Elwyn 1
PMCID: PMC10478675  PMID: 37943524

Abstract

Background:

Health care organizations considering adopting a conversation aid (CA), a type of patient decision aid innovation, need information about the costs of implementation.

Objectives:

The aims of this study were to: (1) calculate the costs of introducing a CA in a study of supported implementation in 5 gynecologic settings that manage individuals diagnosed with uterine fibroids and (2) estimate the potential costs of future clinical implementation efforts in hypothetical settings.

Research Design:

We used time-driven activity-based costing to estimate the costs of CA implementation at multiple steps: integration with an electronic health record, preimplementation, implementation, and sustainability. We then estimated costs for 2 disparate hypothetical implementation scenarios.

Subjects and Data Collection:

We conducted semistructured interviews with participants and examined internal documentation.

Results:

We interviewed 41 individuals and analyzed 51 documents and over 100 emails. Overall total implementation costs over ∼36 months of activities varied significantly across the 5 settings, ranging from $14,157 to $69,134. Factors influencing costs included size/complexity of the setting, urban/rural location, practice culture, and capacity to automate patient identification. Initial investments were substantial, comprising mostly personnel time. Settings that embedded CA use into standard workflows and automated identification of appropriate patients had the lowest initial investment and sustainability costs. Our estimates of the costs of sustaining implementation were much lower than initial investments and mostly attributable to CA subscription fees.

Conclusion:

Initiation and implementation of the interventions require significant personnel effort. Ongoing costs to maintain use are much lower and are a small fraction of overall organizational operating costs.

Key Words: cost analysis, gynecology, implementation research, patient-centered care, shared decision-making, women’s health, patient education


Introducing patient-centered innovations into health care services is a policy priority, yet progress towards this goal is slow, likely because health systems face multiple challenges and increasing costs. Implementation research provides guidance about strategies for successfully introducing clinical innovations,1,2 but organizations have very limited information about the costs of introducing such innovations. Cost studies of implementation research are a promising option for producing such information, but these studies often have methodologic problems that limit their usefulness. Prominent among these problems are the lack of mixed-methods approaches3,4 and a focus on costs of the intervention alone, completely missing preimplementation activities (eg, building awareness and buy-in, planning workflow, training).4–6 Within the context of a larger study, we had the opportunity to use mixed methods to examine costs associated with implementing a conversation aid (CA) for uterine fibroids in health care settings, including all aspects of the implementation process.

Patient-facing decision-support tools are designed to facilitate shared decision-making (SDM) between a clinician and patient7–10 during clinical visits10 and can also be used before and after visits.8 There is strong and consistent evidence that patient decision aids are effective, especially in improving patient knowledge, understanding of risks, and participation in decision-making.11 People often use “patient decision aids” and “conversation aids” interchangeably, but we conceptualize CAs as a special form of patient decision aid that is typically briefer, tailored to address low health literacy, and designed for use within the clinical visit.7,10,12 CAs have been shown to be effective in the same ways as patient decision aids.7,10,12 Patient decision aids and CAs face multiple implementation challenges,10,11,13 one of which is the lack of information about the cost of adoption.14–16

There are short-term, immediate costs to plan and initiate the use of CAs in health care systems, followed by recurring costs to sustain their use. Information about the nature and extent of these costs is limited and inconclusive. Some research suggests that implementing CAs significantly increases costs, while other studies find that costs increase only slightly.15,17 Other research indicates possible substantial cost-savings, while some finds little to none.11,15,17,18 Researchers face a multitude of challenges in conducting cost studies; methodological quality is highly variable and findings are inconclusive.11,15,17,18 More information about initial investment, recurring costs, impact on practice, and long-term cost-savings is imperative.

To address this need, we conducted a study of the costs incurred in an implementation project. We aimed to: (1) evaluate the costs of implementing CAs in the supported implementation project; and (2) estimate costs for future independently initiated implementation of similar tools in similar settings.

METHODS

This study was an added component of a project that aimed to implement a CA at 5 gynecologic settings to help women choose treatments for symptomatic uterine fibroids.19 The settings are described in Table 1 and were selected through targeted networking with colleagues. We wanted to recruit OB/GYN settings with a keen interest in SDM that included a mix of urban and rural locales in a few different geographic regions. We contacted several colleagues to ask about their interest and to elicit potential names of other interested physicians.

TABLE 1.

Implementation Setting Characteristics

| Characteristic | Setting 1 | Setting 2 | Setting 3 | Setting 4 | Setting 5 |
| Location and type of services | Northeast, general care | Northeast, specialty care | Midwest, specialty care | Northeast, general care | Midwest, general care |
| Clinical champion has experience using conversation aids | Yes | Yes | No | No | No |
| Epic EHR system has a place for patient materials* | Yes | No | Yes | No | No |
| Clinic sends previsit materials to patients | Rarely | Routinely | Routinely | Never | Never |
| Patient eligibility (ie, visit is for fibroids) is automated | No | No | Yes | No | No |
| No. staffed beds | 419 | 871 | 1314 | 1280 | 1406 |
| Annual outpatient visits | 1,134,232 | 1,576,228 | 450,535 | 3,373,318 | 727,904 |
| No. departments involved in the project | 1 | 1 | 1 | 3 | 2 |
| No. clinicians involved in the project | 20 | 8 | 10 | 22 | 12 |
| Estimated monthly volume of patients with fibroids | 5 | 4 | 23 | 20 | 6 |
*All settings used the Epic EHR system.

EHR indicates electronic health record.

The study was conceptualized in phases: (1) electronic health record (EHR) integration; (2) preimplementation; (3) implementation; and (4) sustainability. Figure 1 details the activities required in each phase. All the settings began to work on EHR integration in January 2019. Preimplementation was staggered: the first setting began preimplementation in August 2019 and the last setting in November 2020. All settings experienced extended delays in various phases of the project from March 2020 onwards, due to the COVID-19 pandemic.

FIGURE 1. Model of the implementation process. Depicts the major phases of implementation and activities required in each phase. Time estimates are projections of how long each step may take in an independent implementation. Durations of project phases were artificially inflated due to the COVID-19 pandemic. EHR indicates electronic health record.

We used principles of the Cost of Implementing New Strategies (COINS) framework to organize our thinking about the types of data to gather.20 COINS requires collecting information about both the cost of the intervention itself and the cost of implementing the intervention, the latter of which is often missed by other methods. COINS then requires gathering details about all resources necessary to successfully organize and implement the intervention (eg, which personnel, for how much time, training costs, fees, supplies, and so on). These principles fit well with the recommendations, noted above, for improving cost studies of implementation research and guided the design of our qualitative interview topics. We used time-driven activity-based costing (TDABC) methods to further identify the specific data points needed and to conduct the analyses. The structure of TDABC aligns well with the COINS principles, making the two a robust combination.

TDABC is a micro-costing method that provides detailed, transparent cost information that can help guide future investments.21 An adaptation of work by Kaplan et al,22 TDABC relies on the following steps: (1) creation of a process map that describes the specific activities required; (2) details about the frequency of activities; (3) estimates of time to complete activities; (4) calculation of total time per activity (frequency multiplied by time per activity); (5) estimates of cost per unit of time; and (6) calculation of the total cost (cost per time unit multiplied by total activity time). In short, TDABC enabled us to gather detailed data about the implementation activities performed by each individual, during each phase, at each setting. Because we implemented a CA within an existing clinical interaction, there were no added indirect or overhead costs; hence, we excluded these from our analysis.
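Put as a single formula (our compact restatement of steps 1 through 6; the symbols are introduced here only for illustration), the total cost sums, over all implementation activities, each activity's frequency times the time per occurrence times the cost per unit of time of the personnel performing it:

$$C_{\text{total}} = \sum_{a} f_a \times t_a \times r_a$$

where, for each activity $a$, $f_a$ is its frequency, $t_a$ the time one occurrence requires, and $r_a$ the cost per unit of personnel time.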

Participants

Participants in the cost study were employees at the settings who were involved in the implementation project (key informants). We identified key informants through lists provided by the site principal investigators (PIs) and research coordinators, as well as examination of process maps created in the early phases of the larger study. These process maps were designed to outline potential methods for previsit delivery of the CAs at each site. While these reflected the “ideal” process sites hoped to create (not the actual final process implemented at each site), the personnel involved did not change, making the process maps helpful in identifying staff to interview.

Data Collection

Data collection involved conducting semistructured interviews with key informants, reviewing internal documents, meeting as a team to discuss and confirm data, and obtaining salary estimates for employees involved in the project.

Interview Guide

Our initial interview guide covered the data elements required for TDABC. Feedback from our team resulted in revisions and a final set of questions and probes. In addition to eliciting details about activities specifically related to implementation efforts, we asked about other or new staff involved in the efforts and about costs associated with continuing use of the CAs. Interviews were conducted and recorded between October 2021 and June 2022, and professionally transcribed.

Salary Estimates

We provided each site PI with a list of key informants and their roles and obtained average salary estimates for each role (rather than actual salary details). If we could not obtain this information, we derived estimates of average salaries by reviewing posted job positions at that setting and performing online searches (eg, via Glassdoor, ZipRecruiter, Salary.com).

Analytic Plan

TDABC calculations across implementation phases were undertaken using spreadsheet formulas designed to assess both one-time (eg, initial programming to build CAs into the EHR) and recurring (eg, sending out previsit CAs) activities. For each individual involved, we multiplied the number of hours spent in a phase by the average pay rate plus fringe to obtain a total cost per person. We then summed the total hours and costs for that phase. Data entry and calculations were undertaken by S.C.A. and checked for accuracy (by J.E. or M.T.).
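The spreadsheet logic can be sketched in a few lines of code. The snippet below is a minimal illustration only; the roles, hours, pay rates, and fringe rate are hypothetical placeholders, not study data:

```python
# Minimal sketch of the per-phase TDABC cost calculation described above.
# Roles, hours, hourly rates, and the fringe rate are hypothetical placeholders.

FRINGE_RATE = 0.30  # assumed fringe benefit rate (not the study's actual rate)

def person_cost(hours: float, hourly_rate: float, fringe: float = FRINGE_RATE) -> float:
    """Hours spent in a phase multiplied by the average pay rate plus fringe."""
    return hours * hourly_rate * (1 + fringe)

# Hypothetical effort for one phase at one setting: (role, hours, average hourly rate)
phase_effort = [
    ("site PI", 10.0, 120.0),
    ("research coordinator", 30.0, 35.0),
    ("clinician", 5.0, 110.0),
]

total_hours = sum(hours for _, hours, _ in phase_effort)
total_cost = sum(person_cost(hours, rate) for _, hours, rate in phase_effort)
print(f"Phase total: {total_hours:.2f} hours, ${total_cost:,.2f}")
```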

Estimating Costs at Hypothetical Settings

To provide an approximation of the range of costs that different settings might incur during implementation, we created 2 disparate hypothetical settings and implementation scenarios. We designed the details of the scenarios based on the most influential factors so that the 2 scenarios varied greatly in complexity and cost. For expenses that were also borne by settings in our project, we used our data to inform and estimate costs. We estimated other costs (those covered by an external entity in our project) based on information from our research partners, colleagues, experts, and resources available online. For the simpler hypothetical setting, we extrapolated from our 2 lowest cost settings. To estimate costs for the larger, more complex hypothetical setting, we examined our 2 highest cost settings and then increased this by ∼50% to account for the additional size and complexity of the health system in our scenario.
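A rough sketch of this extrapolation logic follows. It is illustrative only: the figures are placeholders rather than study data, averaging the reference settings is our assumption (the exact adjustment is not specified above), and the published scenario totals in Table 3 also include externally provided items (training, printing, CA subscriptions) that this fragment does not model.

```python
# Illustrative sketch of the scenario extrapolation described above.
# All figures are placeholders; averaging is an assumed simplification.

def simple_scenario(lowest_cost_sites: list[float]) -> float:
    """Simpler hypothetical setting: extrapolate from the two lowest-cost settings."""
    return sum(lowest_cost_sites) / len(lowest_cost_sites)

def complex_scenario(highest_cost_sites: list[float], uplift: float = 0.5) -> float:
    """Larger hypothetical system: start from the two highest-cost settings,
    then increase by ~50% for additional size and complexity."""
    return (sum(highest_cost_sites) / len(highest_cost_sites)) * (1 + uplift)

print(simple_scenario([15_000.0, 25_000.0]))   # placeholder inputs
print(complex_scenario([55_000.0, 70_000.0]))  # placeholder inputs
```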

RESULTS

We conducted 41 semistructured interviews and reviewed 51 internal documents and over 100 emails. Key informants interviewed comprised 5 site PIs/clinicians, 11 additional clinicians, 10 EHR specialists, 9 coordinators, 2 clinical fellows, and 4 others. Internal documents included 27 EHR integration meeting minutes, 10 training visit itineraries, 5 initial site visit itineraries, 5 documents containing both emails and summaries of planning meetings, and 4 others.

Implementation Costs

Results reflect costs incurred by participating settings and do not include costs for services provided by the study funder. Dartmouth (coordinating site) provided settings with training and printed copies of the CAs. EBSCO (CA vendor) provided access to the CAs at no cost during the study. Of note, because the clinical activities required to integrate and implement the CAs were absorbed by existing frontline staff, no new personnel had to be hired; the personnel costs reported here reflect the time existing staff devoted to the project.

Costs of project phases varied extensively across settings.

EHR integration: Total hours required for EHR integration ranged from 23.75 hours at Setting 3 to 120 hours at Setting 2. Total personnel costs for EHR integration varied from $1916 at Setting 3 to $7060 at Setting 4.

Preimplementation: The costs associated with preimplementation ranged from 60.5 hours at Setting 3 ($8165) to 291.75 hours at Setting 5 ($19,464).

Implementation: Setting 3 required the fewest hours and had the lowest cost at 29.25 hours ($3938). Setting 4 spent the most time and had the highest cost at 659.4 hours ($38,165).

Sustainability: Staff at Setting 3 spent <1 hour per month during sustainability at a cost of $46 per month, while staff at Setting 4 spent 43 hours per month ($1660/mo).

Table 2 lists details about the length of implementation phases, personnel involved, amount of effort, and costs by setting and phase. Figure 2 displays duration, effort, and costs graphically to enable comparisons between sites across phases.

TABLE 2.

Duration, Detailed Effort, and Costs by Phase and Setting

Phase Setting 1 Setting 2 Setting 3 Setting 4 Setting 5
EHR integration phase*
 Length of phase (mo) 7 18 13 13 14
 Detailed effort Hours People Hours People Hours People Hours People Hours People
  Site PI 0.75 1 1 1 2.25 1 0.75 1 1 1
  RC 26 2 1.25 2 0.75 1 1 1
  Providers 0.75 1 0.25 1 1 2
  Epic staff 38.5 4 93 12 20 5 83 7 61.75 21
  Other staff 5 1 0.25 1 1 2 0.5 1
 Total effort 45 7 120 15 23.75 9 85.75 12 65.25 26
Total cost $3530 $6451 $1916 $7060 $4016
Preimplementation phase
 Length of phase (mo) 7 4 5 12 10
 Detailed effort Hours People Hours People Hours People Hours People Hours People
  Site PI 16.25 1 69 1 9.75 1 43 1 43 1
  RC 137.5 2 30.25 2 1.5 1 224.5 2
  Providers 27.5 9 7 4 14.75 7 9.75 9 21.5 14
  Other staff 32.25 16 5.75 4 12.75 8 2.75 2
 Total effort 76 26 213.5 7 60.5 14 67 19 291.75 19
Total cost $9918 $13,235 $8165 $10,631 $19,464
Implementation phase
 Length of phase (mo) 15 14 13 12 12
 Detailed effort Hours People Hours People Hours People Hours People Hours People
  Site PI 52 1 31.5 1 13 1 77.4 1 51.6 1
  RC/other 16.25 1 472.5 3 16.25 2 582 1 424.8 2
 Total effort 68.25 2 504 4 29.25 3 659.4 2 476.4 3
Total cost $9817 $16,903 $3938 $38,165 $24,481
Sustainability phase
 Length of phase (mo) 13 4 3 8 12
 Detailed effort Hours People Hours People Hours People Hours People Hours People
  Site PI 6.5 1 18 1
  RC/other 16.25 1 78 2 2.25 1 344 1 54.6 1
 Total effort 22.75 2 78 2 2.25 1 344 1 72.6 2
Total cost $1723 $2033 $138 $13,278 $5286
Total: all phases $24,988 $38,622 $14,157 $69,134 $53,248
*All settings used the Epic EHR system.

This setting did not have a coordinator. The Dartmouth team provided minimal support, while the site PI performed most of these functions.

All site PIs were clinicians participating in the project. “Other” staff members could include medical assistants, nurses, licensed nursing assistants, scheduler/receptionists, administrators, quality improvement staff, leadership, patient advocates, etc. Epic staff included programmers, analysts, managers, etc. involved in the process of obtaining approvals and building code for integrating the conversation aid into the EHR.

Activities during all phases were absorbed by existing staff and did not require hiring new personnel.

EHR indicates electronic health record; PI, principal investigator; RC, research coordinator.

FIGURE 2. Length of phases, personnel hours, and costs by phase and site. The length of time each setting spent in each phase (top), number of hours spent by personnel in each phase (middle), and the total cost of each phase (bottom). All are highly variable across settings. *General care; **Specialty care. EHR indicates electronic health record; MW, midwest; NE, northeast.

Impact of Salary Differences

As an indicator of how much differences in salaries across sites drove implementation costs, we conducted a sensitivity analysis using salaries for equivalent positions from the 2022 Bureau of Labor Statistics and 2021 AAMC Faculty Salary reports (see Document, Supplemental Digital Content 1, http://links.lww.com/MLR/C688, which shows standard salaries used as well as salaries for each setting). We re-calculated implementation costs by phase and setting (Document, Supplemental Digital Content 2, http://links.lww.com/MLR/C689) and identified which settings had the lowest and highest cost, per phase. The findings were essentially unchanged from our original analyses. For preimplementation, implementation, sustainability, and total costs the lowest and highest cost settings remained the same. For EHR integration, the lowest cost setting was the same but the highest cost setting changed.
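As a minimal sketch of this sensitivity check (the hours and rates below are hypothetical placeholders, not the BLS, AAMC, or study values), the same phase costs are recomputed with standardized salaries and the lowest- and highest-cost settings are compared:

```python
# Illustrative sensitivity analysis: recompute phase costs with standardized
# salaries and check whether the lowest- and highest-cost settings change.
# All hours and rates are hypothetical placeholders.

phase_hours = {  # hours by role for one phase, per setting
    "Setting A": {"site PI": 10, "coordinator": 40},
    "Setting B": {"site PI": 30, "coordinator": 200},
}
site_rates = {  # site-specific loaded hourly rates
    "Setting A": {"site PI": 150.0, "coordinator": 40.0},
    "Setting B": {"site PI": 120.0, "coordinator": 30.0},
}
standard_rates = {"site PI": 130.0, "coordinator": 35.0}  # standardized rates

def phase_cost(hours_by_role: dict, rates: dict) -> float:
    return sum(h * rates[role] for role, h in hours_by_role.items())

for label, rate_lookup in [("site-specific", lambda s: site_rates[s]),
                           ("standardized", lambda s: standard_rates)]:
    costs = {site: phase_cost(hours, rate_lookup(site)) for site, hours in phase_hours.items()}
    lowest = min(costs, key=costs.get)
    highest = max(costs, key=costs.get)
    print(f"{label}: lowest={lowest}, highest={highest}")
```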

Factors Associated With Implementation Costs

We identified several factors that influenced implementation costs, primarily related to setting characteristics, practice culture, EHR capacity, integration with patient health information (PHI), and depth of adoption. Setting characteristics such as institution size, implementation complexity (eg, number of departments involved), and location (eg, rural or urban) all had an impact on costs. As size and complexity increase, duration and effort needed to accomplish preimplementation and implementation increase. Salaries and other resources tend to be more expensive in urban areas than in rural ones. Our highest cost settings tended to be larger, were located in urban areas, and involved multiple departments.

Practice culture is highly influential but often harder to assess. The degree to which clinicians: (1) deliver patient-centered care; (2) engage in SDM; (3) use patient educational materials; and (4) provide materials before clinical visits had a significant impact on the length of time and amount of effort needed for preimplementation and implementation, as well as the type and level of training required. In general, the more a practice culture embraced patient-centered care, SDM, and educating patients, the lower the cost of preimplementation and implementation. Clinicians in practices not strongly aligned with patient-centered care required more preimplementation efforts and needed more intensive training addressing both SDM and use of the CA. Our lowest cost setting had a rich culture of SDM and routinely provided patients with educational materials before visits. The impact of prior experience with patient educational materials and CAs was less clear. We observed that, while some clinicians familiar with preexisting patient-facing materials easily accepted the use of the CA, others were much less willing to adopt new materials, extending the time and effort needed for preimplementation.

The EHR integration phase of an implementation can be a high-cost endeavor, depending on system customizability and other capabilities, as well as support from the EHR vendor and team. All of our settings used the Epic medical record system, so there were no differences in EHR characteristics across our settings. Generally, an EHR system that is easily customized, already includes patient materials, can preidentify specific patients, and has ongoing support from the vendor will require little time and effort to incorporate a CA. Our settings that were able to create an automated EHR process for identifying patients with uterine fibroids needed much less personnel time in implementation and sustainability than those who had to manually identify patients.

Costs of EHR integration are also driven extensively by the degree of CA integration with secure PHI. Implementation projects that include integrating the CA with PHI require much more time and effort than those simply embedding a link in the EHR. PHI integration requires more time and skill from EHR analysts, lengthy and complicated security reviews at multiple institutional levels, extensive testing and troubleshooting, and close collaboration between CA and EHR vendors.

Depth of intervention adoption was a key factor, especially in sustainability costs. The extent to which CA use was embedded into daily workflow and incorporated into routine personnel responsibilities had a substantial impact on costs. Settings where clinical staff (vs. researchers) led delivery and use of the CA and embedded these activities into the standard workflow had the lowest implementation and sustainability costs. These activities became part of the clinical routine and were not seen as added effort. Further, settings that automatically delivered previsit CAs as part of daily workflow had lower costs than those manually identifying patients and mailing out hard copies.

Model of Costs for Future Clinical Implementation

The steps of implementing a CA in other settings will mirror those in our project, plus an initial decision to implement a CA. We did not collect data on this step, which was determined as part of site recruitment for the study, and we do not include it in our hypothetical scenarios below. However, we can estimate that the decision phase will entail mostly personnel costs resulting from meetings held at leadership and senior management levels. Costs associated with the decision phase are likely to be highly variable, depending on setting characteristics and the origin of the idea to adopt a CA.

Presumed Training and Printing Costs

The cost of training in SDM and use of CAs varies by intensity and delivery organization (eg, train-the-trainer workshop over several days vs. brief training sessions). We estimated costs ranging from ∼$3000 for a short half-day event to ∼$10,000 per day for more extensive facilitator-led workshops. Printing costs were estimated using an online printing service and based on documents that were 5–8 pages.

Estimated Conversation Aid Subscription Costs

We contacted a number of CA vendors, including the vendor involved in our study, to estimate the cost of commercial subscriptions. It was difficult to obtain information, and we were informed that subscriptions vary by size of the institution, number of providers or patients served, and complexity of software integration. Our best estimates of subscriptions for a typical health system are ∼$50,000 a year without PHI integration and $75,000 or more with PHI integration, depending on the size and complexity of the setting. In selecting a CA vendor, decision-makers will need to consider its compatibility with their EHR. The customizability, flexibility, and features of the EHR may impact which CA will be the best fit for the institution.

Hypothetical Setting Scenarios

Characteristics and estimated costs for the 2 disparate scenario settings we created are shown in Table 3. Estimates of the implementation costs for these hypothetical settings vary extensively. The total initial investment at the single-setting scenario is estimated at $26,788, while the multiple-setting scenario is $126,388. Once implemented, recurring annual sustainability costs are estimated at $52,500 for the single-setting scenario and $143,088 for the multiple-setting scenario.

TABLE 3.

Estimated Costs for Hypothetical Scenario Settings

(Table 3 is presented as an image in the original publication.)

*EHR integration and preimplementation occur roughly simultaneously. This is estimated at 6 months for the single-setting and 18 months for the multiple-setting system.

Implementation is estimated at 3 months for the single-setting and 6 months for the multiple-setting system.

There are 2 versions of the conversation aid, text-only and picture-enhanced.

CA indicates conversation aid; EHR, electronic health record; PHI, patient health information; SDM, shared decision-making.

DISCUSSION

Principal Findings

Observed implementation costs varied significantly across settings based on factors associated with characteristics of the setting, practice culture, EHR capacity, PHI integration, and depth of adoption. Variability in costs was greatest during the implementation phase, where total costs ranged from $3938 to $38,165. There appeared to be 3 key factors that caused this variation: size and complexity of the setting, automation of patient identification, and delivery of previsit materials. The setting with the lowest implementation costs involved a single clinic, was able to automate patient identification via EHR fields, and had established procedures for previsit delivery of patient materials. This resulted in almost no additional personnel time for use of the CA. The highest cost setting was large, involved 3 separate departments, had personnel manually identifying patients, and had no process for delivering previsit patient materials. As a result, personnel effort to coordinate across departments, identify patients, and distribute CAs was high. EHR integration had the lowest variability in costs, as all settings used the same EHR system, none were integrating with PHI, and all had dedicated time available from EHR analysts. The cost variability that did occur was largely driven by the extent to which the EHR had to be adapted to accommodate access to the CA. The 2 lowest cost settings already had this infrastructure, while the 3 higher cost settings did not.

The majority of total implementation costs was attributable to the initiation of the innovation and arose mainly from the use of staff and clinical time to introduce the innovation, train colleagues, modify workflows, and initiate the work. It is worth recognizing that there were no additional capital costs to implement CAs; the required infrastructure and clinical and administrative personnel already existed. Our results indicate that after preparation and initiation, the costs of sustaining the use of CAs are much lower. This is especially true if the task of identifying patients eligible to receive the CAs can be automated, at least to a degree, to minimize the use of personnel. Similarly, if patient identification results in automated previsit delivery of the CA, nearly all additional personnel time is eliminated. The remaining ongoing costs would be a subscription to the CA and the need to maintain staff awareness and training. An important distinction to note: Because our intervention involved enhancing existing clinical care, the associated tasks fit well with existing staff responsibilities and could be absorbed seamlessly. More complicated innovations would likely have a larger impact on staff workload and could require hiring additional staff.

Strengths and Weaknesses of Study Methods

The original implementation study did not include a focus on costs. The data for the cost analysis were therefore collected in the final year of the work and required staff to recall and estimate time spent on tasks earlier in the project. Retrospective data estimations are likely to be less accurate than contemporaneous assessments. We were able to mitigate this issue by analyzing documented meetings and discussions to supplement interview data. We chose to obtain salary data from specific settings when available to make analyses as accurate per setting as possible. Because we were not able to obtain pay rates from all settings, we estimated some pay rates using online resources, which likely reduces the accuracy of our estimates for those settings. Finally, the onset of COVID-19 in March 2020 caused substantial disruption to our study and all settings experienced delays in progressing through some phases of the project. These delays artificially inflated the duration of different phases at different settings, rendering comparison of duration across settings less meaningful.

Our study had a number of strengths worth noting. First and foremost, we analyzed costs associated with preimplementation activities in addition to the costs of delivering the intervention. Very few studies have examined preimplementation costs in this manner. Our use of the TDABC micro-costing methodology produced a level of detail that elucidates not only the total hours and costs of the implementation process but also the effort and cost for each phase, along with the specific personnel roles involved and the amount of time they spent. This provides decision-makers with critically important information as they consider CA innovations. Finally, the number of interviews conducted and the variety of stakeholders involved enabled a comprehensive assessment and corroboration of estimated costs incurred.

Results in Context of Other Studies

While the last decade has seen an increase in cost studies relating to implementation research, findings are inconsistent and the studies cover a wide range of health issues, interventions, and implementation strategies. Most studies focus on the cost of delivering an intervention to evaluate whether it increases or decreases overall costs. Some also include evaluation of immediate and short-term consequences of the intervention, such as whether there appeared to be any cost-savings generated by the intervention. While helpful, such findings are insufficient; decision-makers need more detailed information about the entire implementation process, that is, the costs associated with all activities leading up to the launch of the intervention.5,6,23,24 For example, only 6 of the 30 studies in a recent systematic review of economic evaluation studies in implementation science included assessments of some costs associated with preparatory, planning, or administrative activities conducted before launching the intervention.24 Decision-makers need access to cost data regarding the implementation in addition to data about the intervention, but the former is rarely available or complete.5

Unfortunately, research specific to the implementation of CAs also demonstrates a lack of consistency in findings, a wide range of patient settings and conditions, and a dearth of information regarding the full spectrum of implementation costs. Research in this area has inconsistent quality,11,17,18 reports mixed findings regarding the impact of CAs on visit length11,17 as well as overall cost,11,15,18 and estimates of costs vary widely, depending on many factors.15 Evidence shows that there is an initial cost increase to implement use of CAs, but results on immediate and short-term cost impact are mixed, and there is not enough data on long-term cost impact.15,17 Finally, there is almost no information for decision-makers about the cost of implementation activities in the early phases of the process.15

The studies in this topical collection represent some of the only comprehensive assessments of clinical innovation implementation costs across the full implementation process. The study by Saigal and colleagues25 of implementing a patient decision aid for prostate cancer provides the closest comparison to our project. They also employed TDABC, and while the patients and medical issues were different, they also found that personnel effort was the largest contributor to costs and that EHR integration was a critical means of reducing personnel effort, improving efficiency, and lowering costs. Despite numerous differences among the remaining studies in this collection, several common themes emerged. Generally, costs are highly variable26,27 and likely not prohibitive within an overall operational budget.26,28 Initial costs are increased as preparatory and implementation activities occur, but once launched, ongoing costs may be quite low.26,28 EHR integration and functionality are critical to the efficiency, feasibility, and success of a clinical innovation implementation.26,28 All of the studies used micro-costing methods, demonstrating that these methods are feasible and produce the type of detailed information crucial for decision-makers.

Implications of the Results

For Decision-makers

Our results provide decision-makers with important information for considering the adoption of a health care innovation such as CAs. The initial investment is the largest and most variable, depending on the size and complexity of the setting and efficiency of planning efforts and EHR integration. Recurring costs to sustain the intervention are lower and represent a small fraction of a health care setting’s overall budget. Sustainability costs may be negligible, especially if patient identification is automated and delivery of CAs is embedded into the standard workflow.

There are also ways to maximize the impact of the initial investment to implement an intervention. Any EHR programming or infrastructure created during preimplementation is an investment for the future. Innovations going forward will have that infrastructure upon which to build. Most CA subscriptions include a bundle of products, so expanding use of CAs in other departments will be easier and will increase the impact of the initial investment. Integrating training in CA use and SDM into onboarding procedures for new staff helps maintain the intervention and reduces the need for scheduling extra training over time. Further, using a video recording of the initial training for onboarding eliminates costs of future training and provides an efficient mode of conducting refresher training for existing staff.

For Research

This study is based on an implementation project where settings identified eligible patients using staff partially supported by external funds. In addition, the external project team had regular contact with the implementation settings. Our results, therefore, may not be an accurate representation of the costs that would be borne by an independent implementation effort. Although we estimated such costs, future research should consider more “naturally” occurring efforts. Any research must involve micro-costing methods and evaluate activities across the entire implementation spectrum (decision through sustainability), including incorporating the innovation into standard workflow and onboarding procedures. These strategies will produce the kind of detail-rich results that are so integral to decision-makers. One of the primary challenges future research must tackle is how to accomplish prospective data collection without creating undue burden on clinical staff.

Supplementary Material

SUPPLEMENTARY MATERIAL
mlr-61-689-s001.pdf (105.7KB, pdf)
mlr-61-689-s002.pdf (109.9KB, pdf)

ACKNOWLEDGMENTS

The authors extend their deep appreciation to all of the patients who participated in the project. Without them, none of this would have been possible. They are grateful to all of their collaborators at the implementation sites for their dedication, commitment, and hard work. Special thanks to Christopher Jacobs for his expert editing and feedback.

Footnotes

The supported implementation project and cost analysis study were approved by the Dartmouth College Committee for the Protection of Human Subjects and the Washington University Institutional Review Board. Participants in the cost analysis study received a Study Information Sheet and provided verbal consent to be interviewed.

These data were presented at the 15th Annual Conference on the Science of Dissemination and Implementation in Health, December 13, 2022, in Washington DC.

This project was funded through a Patient-Centered Outcomes Research Institute (PCORI) Dissemination and Implementation Award (SDM-2017C2-8507). The statements in this article are solely the responsibility of the authors and do not necessarily represent the views of the Patient-Centered Outcomes Research Institute (PCORI), its Board of Governors, or Methodology Committee.

M.-A.D. has contributed to the development of the Option Grid patient decision aids, which are licensed to EBSCO Health. She receives consulting income from EBSCO Health and royalties. G.E. is the Director of &think LLC, which owns the registered trademark for the Option Grids. He has contributed to the development of the Option Grid conversation aids, which are licensed to EBSCO Health. He receives consulting income from EBSCO Health and royalties. The remaining authors declare no conflict of interest.

Supplemental Digital Content is available for this article. Direct URL citations are provided in the HTML and PDF versions of this article on the journal's website, www.lww-medicalcare.com.

Contributor Information

Stephanie C. Acquilano, Email: stephanie.acquilano@dartmouth.edu.

Rachel C. Forcino, Email: Rachel.Forcino@dartmouth.edu.

Danielle Schubbe, Email: Danielle.C.Schubbe@dartmouth.edu.

Jaclyn Engel, Email: Jaclyn.A.Engel.MED@dartmouth.edu.

Marisa Tomaino, Email: marisatomaino@gmail.com.

Lisa C. Johnson, Email: Lisa.C.Johnson@dartmouth.edu.

Marie-Anne Durand, Email: Marie-Anne.Durand@dartmouth.edu.

Glyn Elwyn, Email: glynelwyn@gmail.com.

REFERENCES

1. Bauer MS, Damschroder L, Hagedorn H, et al. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3:32.
2. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1:1–3.
3. Dopp AR, Mundey P, Beasley LO, et al. Mixed-method approaches to strengthen economic evaluations in implementation research. Implement Sci. 2019;14:2.
4. O’Leary MC, Hassmiller Lich K, Frerichs L, et al. Extending analytic methods for economic evaluation in implementation science. Implement Sci. 2022;17:27.
5. Saldana L, Ritzwoller DP, Campbell M, et al. Using economic evaluations in implementation science to increase transparency in costs and outcomes for organizational decision-makers. Implement Sci Commun. 2022;3:40.
6. Bowser DM, Henry BF, McCollister KE. Cost analysis in implementation studies of evidence-based practices for mental health and substance use disorders: a systematic review. Implement Sci. 2021;16:26.
7. Durand M-A, Yen RW, O’Malley AJ, et al. What matters most: randomized controlled trial of breast cancer surgery conversation aids across socioeconomic strata. Cancer. 2021;127:422–436.
8. Elwyn G, Lloyd A, Joseph-Williams N, et al. Option Grids: shared decision making made easier. Patient Educ Couns. 2013;90:207–212.
9. Dobler CC, Sanchez M, Gionfriddo MR, et al. Impact of decision aids used during clinical encounters on clinician outcomes and consultation length: a systematic review. BMJ Qual Saf. 2019;28:499–510.
10. Scalia P, Durand M-A, Berkowitz JL, et al. The impact and utility of encounter patient decision aids: systematic review, meta-analysis and narrative synthesis. Patient Educ Couns. 2019;102:817–841.
11. Stacey D, Légaré F, Lewis K, et al. Decision aids for people facing health treatment or screening decisions. Cochrane Database Syst Rev. 2017;4:CD001431.
12. Wyatt KD, Branda ME. Peering into the black box: a meta-analysis of how clinicians use decision aids during clinical encounters. 2014. Accessed May 18, 2023. https://implementationscience.biomedcentral.com/articles/10.1186/1748-5908-9-26.
13. Légaré F, Adekpedjou R, Stacey D, et al. Interventions for increasing the use of shared decision making by healthcare professionals. Cochrane Database Syst Rev. 2018;7:CD006732.
14. Cidav Z, Mandell D, Pyne J, et al. A pragmatic method for costing implementation strategies using time-driven activity-based costing. Implement Sci. 2020;15:28.
15. Trenaman L, Bryan S, Bansback N. The cost-effectiveness of patient decision aids: a systematic review. Healthc (Amst). 2014;2:251–257.
16. Lehman V, Siegel J, Chiang E. The price of practice change: assessing the cost of integrating research findings into clinical practice. Med Care. 2023.
17. Ankolekar A, Dekker A, Fijten R, et al. The benefits and challenges of using patient decision aids to support shared decision making in health care. JCO Clin Cancer Inform. 2018;2:1–10.
18. Scalia P, Barr PJ, O’Neill C, et al. Does the use of patient decision aids lead to cost savings? A systematic review. BMJ Open. 2020;10:e036834.
19. Scalia P, Durand M-A, Forcino RC, et al. Implementation of the uterine fibroids Option Grid patient decision aids across five organizational settings: a randomized stepped-wedge study protocol. Implement Sci. 2019;14:88.
20. Saldana L, Chamberlain P, Bradford WD, et al. The Cost of Implementing New Strategies (COINS): a method for mapping implementation resources using the stages of implementation completion. Child Youth Serv Rev. 2014;39:177–182.
21. Keel G, Savage C, Rafiq M, et al. Time-driven activity-based costing in health care: a systematic review of the literature. Health Policy. 2017;121:755–763.
22. Kaplan RS, Witkowski M, Abbott M, et al. Using time-driven activity-based costing to identify value improvement opportunities in healthcare. J Healthc Manag. 2014;59:399–412.
23. Powell BJ, Fernandez ME, Williams NJ, et al. Enhancing the impact of implementation strategies in healthcare: a research agenda. Front Public Health. 2019;7:3.
24. Roberts SLE, Healey A, Sevdalis N. Use of health economic evaluation in the implementation and improvement science fields—a systematic literature review. Implement Sci. 2019;14:1–13.
25. Ho D, Kaplan R, Bergman J, et al. Health system perspective on cost for delivering a decision aid for prostate cancer using time-driven activity-based costing. Med Care. 2023.
26. Spees LP, Young LA, Rees J, et al. A cost analysis of re-think the strip: de-implementing a low value practice in primary care. Med Care. 2023.
27. Smith NR, Simione M, Farrar-Muir H, et al. Costs to implement a pediatric weight management program across three distinct contexts. Med Care. 2023.
28. Elwy AR, Wasan A, Taubenberger S, et al. Costs of implementing context factor assessments in pain clinic settings. Med Care. 2023.
