Medical Care. 2023 Sep 7;61(10):708–714. doi: 10.1097/MLR.0000000000001899

A Cost Analysis of Rethink the Strip

De-implementing a Low-value Practice in Primary Care

Lisa P Spees, Laura A Young, Jennifer Rees, Kathleen Mottus, Jennifer Leeman, Marcella H Boynton, Erica Richman, Maihan B Vu, Katrina E Donahue
PMCID: PMC10478673  PMID: 37943526

Abstract

Background:

Routine self-monitoring of blood glucose is a low-value practice that provides limited benefit for patients with non–insulin-treated type 2 diabetes mellitus.

Objectives:

We estimated the costs of Rethink the Strip (RTS), a multistrategy approach to the de-implementation of self-monitoring of blood glucose in primary care.

Research Design:

RTS was conducted among 20 primary care clinics in North Carolina. We estimated the non–site-based and site-based costs of the 5 RTS strategies (practice facilitation, audit and feedback, provider champions, educational meetings, and educational materials) from the analytic perspective of an integrated health care system over 12- and 27-month time horizons. Material costs were tracked through project records, and personnel costs were assessed using activity-based costing. We used nationally based wage estimates.

Results:

Total RTS costs equaled $68,941 for 12 months. Non–site-based costs totaled $16,560, most of which ($11,822) came from the foundational programming and coding updates to the electronic health record data to develop the audit and feedback reports. The non–site-based costs of educational meetings, practice facilitation, and educational materials were substantially lower, ranging between ~$400 and $1000. Total 12-month site-based costs equaled $2569 for a single clinic (or $52,381 for 20 clinics). Educational meetings were the most expensive strategy, averaging $1401 per clinic. The site-based costs of the 4 other implementation strategies were markedly lower, ranging from $51 for educational materials to $555 for practice facilitation per clinic.

Conclusions:

This study provides detailed cost information for implementation strategies used to support evidence-based programs in primary care clinics.

Key Words: cost and cost analysis, de-implementation, implementation strategies, diabetes


Health care spending accounted for ~20% of the US gross domestic product in 2020.1 Contributing to these rapidly rising health care costs is overtreatment, or the provision of health care that has little or no benefit to patients. Estimates suggest that overtreatment costs the US health care system $226 billion annually.2 Consequently, movements such as the Choosing Wisely campaign3 have identified practices considered to be overtreatment.

One practice identified as contributing to overtreatment is the self-monitoring of blood glucose (SMBG) among patients with non–insulin-treated type 2 diabetes mellitus.4 Not only is this a low-value practice, but it is also costly to practices: it takes away from provider time with patients and diverts resources from activities known to improve diabetes management, such as diet, physical activity, and medication adherence. Several randomized trials have demonstrated no differences in patient outcomes between those who did and did not conduct daily SMBG.5–8 In addition, the largest US randomized trial found no differences in blood glucose control, health-related quality of life, or adverse events between patients who engaged in daily SMBG and those who did not, further calling into question the clinical utility of routine SMBG.7

Implementation science has been critical to enhancing the uptake of evidence-based practices, such as the reduction of SMBG, and has systematically identified strategies to promote the implementation of evidence-based practices.9 A growing literature describes the use of these strategies to de-implement low-value practices.10 In particular, multistrategy interventions have been found to be more effective in de-implementing low-value practices than single-strategy interventions.11 However, to evaluate these multistrategy interventions, decision-makers need to estimate not only the effectiveness of the implementation strategies but also the cost to determine their financial feasibility.11 Not only do cost analyses provide the cost of employing implementation strategies in a specific setting but they also inform the replication costs of these implementation strategies in other settings.12 Currently, there is a dearth of literature on implementation strategy costs, particularly from the health care system perspective; a recent scoping review found that among the 52 studies included, only 8 described implementation strategy costs from the perspective of the health care system.13 It is essential to estimate the costs of these strategies for health care systems to bridge the research-practice gap in using implementation strategies.

We conducted a cost analysis of Rethink the Strip (RTS), a multistrategy approach to the de-implementation of SMBG in primary care; specifically, we estimated the costs of 5 implementation strategies: practice facilitation, audit and feedback, provider champions, educational meetings, and educational materials. The overall goal of RTS was to reduce prescribing rates of diabetes testing supplies (ie, test strips and/or lancets). Among intervention practices, newly diagnosed patients had approximately 10% fewer strip/lancet prescriptions than established patients across the 12-month baseline and 24-month post-test periods.14 This cost analysis serves to inform the resource allocation decisions of health care systems and clinics as they seek to improve primary care quality.

METHODS

Rethink the Strip

RTS was conducted from February 2019 to April 2022 among 20 primary care clinics located in North Carolina. Although clinics were intended to participate in RTS for 12 months, they ultimately participated for 15 to 27 months because of the coronavirus disease 2019 (COVID-19) pandemic. Implementation strategies used in RTS were chosen based on discussions with an advisory board that included key stakeholders from organizations such as the National Diabetes Education Program, the American Association of Diabetes Educators, and the American Diabetes Association. Strategies included in RTS had to meet 3 criteria: (1) address barriers to SMBG de-implementation; (2) have strong evidence of effectiveness; and (3) be low-touch, requiring limited time and resources from the primary care clinics. The strategies were further refined based on findings from a 1-year pilot study in 3 clinics.

A brief description of each of the implementation strategies is provided in Table 1. In-depth descriptions of the strategies and supportive materials are available on the RTS website (https://rethinkthestrip.sites.unc.edu). While each strategy may be used separately or in various combinations, all implementation strategies were used throughout RTS.

TABLE 1.

RTS De-implementation Strategies

Educational meetings: The RTS study team (including 2 physicians with training in family medicine and endocrinology, the practice facilitator, and the research assistant) led a lunch-time educational meeting for providers and staff at each clinic, as well as for UNC Health Care System diabetes educators.
Practice facilitation: Practice facilitation is often delivered by a health care professional with training in facilitation or quality improvement. In RTS, in-person or virtual facilitation was provided on at least a monthly basis throughout the intervention. The practice facilitator was a registered nurse experienced in practice facilitation who had established relationships with the majority of clinics participating in RTS.
Audit and feedback: EHR data were analyzed to assess changes in prescription rates for blood glucose monitoring strips at the clinic and provider levels. Performance reports, with T2DM prescribing rates stratified by provider, were provided on a quarterly basis to each clinic participating in RTS.
Educational materials: Educational materials were developed by the RTS study team and distributed to participating clinics. A waiting room flyer, clinic room posters, and a handout were developed for patients and providers describing the evidence supporting the discontinuation of SMBG (see example educational materials in Appendices A, B, and C).
Provider champions: Practice managers and primary care providers were identified within each clinic to serve as, respectively, the primary contact person for the RTS study team and the champion of SMBG de-implementation. In particular, the "champion" was tasked with promoting SMBG de-implementation using their informed knowledge of their clinic.

EHR indicates electronic health record; RTS, Rethink the Strip; SMBG, self-monitoring of blood glucose; T2DM, type 2 diabetes mellitus.

Analytic Perspective

Costs were collected from the perspective of an integrated health care system. Integrated health care systems have system-wide linked electronic health records, which facilitate the sharing of patient information across clinics and allow data to be extracted across the health care system. Currently, over half of US clinicians and almost three-fourths (72%) of hospitals are part of integrated health care systems.15

Costs were analyzed over 12-month and 27-month time horizons. The 27-month horizon reflected the longest period that clinics participated in RTS. We also provide 12-month cost estimates, as this would most likely be the length of RTS if adopted by future health care systems.

Data Sources

Material costs were tracked through project records and receipts. Personnel costs were estimated using activity-based costing16 and hourly wage estimates from the US Bureau of Labor Statistics17 (Supplemental Table 1, Supplemental Digital Content 1, http://links.lww.com/MLR/C684). Throughout RTS, the practice facilitator tracked their time spent on all non–research-related activities (ie, traveling to clinics and contacting clinic personnel). RTS research assistants tracked their time performing tasks, including updating educational materials and assembling audit and feedback reports, for the last 7 months of RTS. Provider champions and practice managers at 3 clinics recorded their time spent on RTS activities on a daily basis for 3 consecutive months; they were compensated $50 for the first 2 months and $100 for the third month for their time.
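To make the activity-based costing approach concrete, the sketch below shows one way it could be operationalized: tracked minutes per activity are multiplied by role-specific hourly wages and summed. All role names, wage values, and log entries here are hypothetical placeholders for illustration, not study data.

```python
# Minimal activity-based costing sketch (hypothetical values, not study data).

HOURLY_WAGES = {
    # Placeholder nationally based wage assumptions, in USD per hour
    "practice_facilitator": 38.50,
    "research_assistant": 27.50,
}

# Each entry: (role, activity description, minutes spent)
activity_log = [
    ("practice_facilitator", "travel to clinic", 45),
    ("practice_facilitator", "meeting with clinic personnel", 60),
    ("research_assistant", "assemble audit and feedback report", 22),
]

def personnel_cost(log, wages):
    """Sum (minutes / 60) x hourly wage across all logged activities."""
    return sum(minutes / 60.0 * wages[role] for role, _desc, minutes in log)

print(f"Total personnel cost: ${personnel_cost(activity_log, HOURLY_WAGES):.2f}")
```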

Cost Inputs and Analyses

For each implementation strategy, costs were separated into non–site-based and site-based costs. Non–site-based costs refer to those accumulated by the health care system and are not associated with a specific clinic. In contrast, site-based costs were linked to a specific clinic.
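In notation (our restatement of the definitions above, with n denoting the number of participating clinics), the total cost of a strategy decomposes as:

```latex
C_{\text{total}}(n) \;=\; C_{\text{non-site}} \;+\; \sum_{i=1}^{n} C_{\text{site},\,i},
\qquad
\bar{C}_{\text{clinic}}(n) \;=\; \frac{C_{\text{non-site}}}{n} \;+\; \bar{C}_{\text{site}}
```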

Educational Meetings

Non–site-based educational meeting costs included those for (1) a 1-hour kick-off continuing medical education seminar with registered dieticians and (2) updates and revisions to presentation materials. Part of the RTS study team introduced RTS to the UNC Health Care System's registered dieticians; this meeting ensured that patients did not receive conflicting recommendations regarding SMBG. We accounted for the dieticians' meeting attendance and the RTS study team's meeting preparation and attendance. The RTS research assistant made updates and edits to the educational meeting presentation and tracked the time spent doing so.

For each clinic, site-based educational meeting costs included the time the RTS study team spent preparing for, traveling to, and presenting at the meeting. For each clinic, we recorded the number of meeting attendees and their respective positions within the clinic. Finally, receipts were collected to identify material costs (eg, lunch).

Practice Facilitation

Practice facilitation costs included the facilitator's time spent on non–site-based activities, such as clinic recruitment, and on site-based activities, such as traveling to a clinic, preparing for and conducting meetings with clinic personnel, and informally communicating with clinic personnel (ie, email or phone calls with practice managers or provider champions). Site-based material costs included car rentals, gas, food, and promotional RTS products.

Provider Champions

Provider champions and practice managers recorded time spent on activities such as distributing RTS education materials in their clinic and educating new clinic personnel on RTS. Because champions could be physician assistants, nurse practitioners, or physicians, we used wage estimates that corresponded with each champion's clinic position.

Audit and Feedback Reports

Information for the audit and feedback reports was pulled from the Carolina Data Warehouse for Health (CDW-H), a repository containing electronic health record (EHR) data from the UNC Health Care System.18 Non–site-based costs included the person-hours that CDW-H staff spent programming the EHR data, comprising the initial coding to automate the creation of the audit and feedback reports and subsequent recoding to fix data errors or inconsistencies. Site-based costs included the time the RTS research assistant spent creating each clinic's personalized audit and feedback report.

Educational Materials

Non–site-based educational material costs included the time the RTS team spent developing and revising the content of the educational materials. Site-based material costs included the printed products (eg, waiting room flyers, clinic room posters, and patient and provider handouts) distributed to each clinic.

RESULTS

For the 12-month and 27-month time horizons, total RTS costs for all 20 clinics equaled $68,941 and $96,790, respectively. For a single clinic, total RTS costs were $3447 for 12 months or $4839 for 27 months.
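As a quick consistency check against the component figures reported below, the 12-month total decomposes as:

```latex
\underbrace{\$16{,}560}_{\text{non--site-based}} \;+\; \underbrace{\$52{,}381}_{\text{site-based, 20 clinics}} \;=\; \$68{,}941,
\qquad
\$68{,}941 / 20 \approx \$3{,}447 \text{ per clinic}
```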

Non–Site-based Costs

Table 2 includes the total non–site-based costs for each RTS implementation strategy. Across the 4 strategies with non–site-based costs, totals equaled $16,560 for the 12-month time horizon and $20,564 for the 27-month time horizon. No non–site-based costs were associated with provider champions, as they performed RTS activities only for their particular clinic. The largest non–site-based cost ($11,822) was the 140 hours of foundational EHR programming of the CDW-H data for the audit and feedback reports. On average, 2.6 hours per month (~$213 per month) were spent on data coding updates, accounting for the additional audit and feedback report costs throughout RTS. Overall, the audit and feedback reports accounted for over 85% of total non–site-based costs for both time horizons.

TABLE 2.

Non–site-based Costs

Non–site-based activity | 12 mo total costs ($) | 27 mo total costs ($)
Educational meetings
 Kick-off CME with registered dieticians | 825 | 825
 Meeting updates* | 260 | 584
Practice facilitation
 Clinic recruitment† | 709 | 709
Audit and feedback reports
 Initial data programming | 11,822 | 11,822
 Data coding updates‡ | 2,552 | 5,743
Educational materials
 Material updates* | 391 | 881

*Personnel tracked time spent on educational meeting updates and educational material updates for 7 consecutive months; average monthly cost estimates were $22 for educational meeting updates and $33 for educational materials.

†Includes costs of time spent by the practice facilitator recruiting clinics to participate in Rethink the Strip.

‡Refers to meetings with data programmers to discuss coding corrections and data updates for audit and feedback reports. Average monthly costs are estimated to be $213.

CME indicates continuing medical education.

The non–site-based costs of educational meetings, practice facilitation, and educational materials were substantially lower, ranging between ~$400 and $1000 for the 12-month time horizon and between ~$700 and $1400 for the 27-month time horizon. The practice facilitator spent ~17.8 hours on all recruitment activities. The 1-hour kick-off continuing medical education seminar included the time of 16 registered dieticians ($505), as well as the preparation time ($51) and meeting attendance ($270) of the RTS team members (ie, a nurse, an endocrinologist, and a family medicine physician). The time spent updating and revising educational meeting materials averaged 48 minutes per month (range: 0–270 min) and cost ~$22 per month. Similarly, the time spent editing and updating educational materials averaged 72 minutes per month (range: 0–450 min) and cost ~$33 per month.
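These two monthly averages are mutually consistent; both imply an effective hourly rate of roughly $27.50 for the staff member performing the updates (an inference from the reported figures, not a rate stated by the authors):

```latex
\frac{\$22/\text{mo}}{(48/60)\ \text{h/mo}} \;\approx\; \frac{\$33/\text{mo}}{(72/60)\ \text{h/mo}} \;\approx\; \$27.50/\text{h}
```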

Site-based Costs

Table 3 includes site-based costs for the 5 RTS implementation strategies. For a single clinic, average site-based costs equaled $2569 for the 12-month time horizon and $3811 for the 27-month time horizon. Across all 20 clinics, site-based costs totaled $52,381 for the 12-month time horizon and $76,226 for the 27-month time horizon.

TABLE 3.

Site-based Costs, Total and Per Clinic

Site-based activity | 12 mo cost per clinic ($) | 12 mo total cost* ($) | 27 mo cost per clinic ($) | 27 mo total cost* ($)
Educational meetings†
 Average cost | 1,401 | 28,012 | 1,401 | 28,012
 Minimum cost | 642 | 12,841 | 642 | 12,841
 Maximum cost | 2,508 | 50,151 | 2,508 | 50,151
Practice facilitation
 Average cost | 555 | 11,092 | 1,043 | 20,866
 Minimum cost | 217 | 4,338 | 435 | 8,705
 Maximum cost | 859 | 17,172 | 1,710 | 34,208
Audit and feedback reports‡
 Average cost | 120 | 2,401 | 270 | 5,402
 Minimum cost | 68 | 1,357 | 153 | 3,052
 Maximum cost | 176 | 3,527 | 397 | 7,936
Educational materials
 Average cost | 51 | 1,019 | 101 | 2,020
 Minimum cost | 41 | 815 | 81 | 1,616
 Maximum cost | 61 | 1,223 | 121 | 2,424
Provider champion§
 Average cost | 443 | 8,856 | 996 | 19,927
 Minimum cost | 355 | 7,096 | 798 | 15,966
 Maximum cost | 616 | 12,312 | 1,385 | 27,702

*Costs for all 20 clinics participating in Rethink the Strip.

†Only one educational meeting per clinic was conducted, so 12-month and 27-month costs are the same.

‡Average monthly audit and feedback costs per clinic were $10.00 (range: $5.65–$14.70).

§Average monthly provider champion costs per clinic were $36.90 (range: $29.57–$51.30).

Across both time horizons, the educational meeting was the most expensive implementation strategy, costing on average $1401 per clinic, although these costs varied widely across clinics (range: $642–$2508). For each educational meeting, average preparation and presentation times were, respectively, 33 minutes (range: 0–120 min) and 65 minutes (range: 60–90 min). Material costs for the educational meetings were ~$230 per clinic (range: $131–$366). The RTS study team members who participated in the educational meetings (a nurse, an endocrinologist, a family medicine physician, and a research assistant) traveled in a single vehicle. The number of clinic staff and providers in attendance ranged from 4 to 16, and the total cost of clinic personnel time ranged from $191 to $780; this was one of the largest contributors to the variation in site-based educational meeting costs. Educational meeting costs also varied depending on whether the meeting was conducted in person or virtually. For meetings conducted in person, the average round-trip travel time was 73 minutes (range: <5–240 min), and 1-way mileage was 33 miles (range: 0.7–83 miles). Because the COVID-19 pandemic started while RTS was ongoing, in-person educational meetings transitioned to virtual meetings, eliminating the time spent traveling; consequently, site-based educational meeting costs decreased by ~$528 per clinic.

The site-based costs of the 4 other implementation strategies were markedly lower than those of the educational meetings. For the 12-month time horizon, practice facilitation cost on average $555 per clinic, of which $164 was material costs; the practice facilitator's personnel time averaged $33 per month per clinic. Provider champions cost $443 per clinic, based on an average personnel time cost of $37 per month. Although the largest non–site-based cost was for analyzing and preparing the EHR data for the audit and feedback reports, the corresponding site-based costs were only $120 per clinic for the 12-month time horizon; on average, the RTS research assistant spent 22 minutes (range: 13–33 min) per month creating a clinic's personalized report. Finally, educational materials had the lowest site-based costs, averaging $51 per clinic for the 12-month time horizon ($101 for 27 months).

Figure 1 depicts the total site-based costs for the implementation strategies, stratified by clinic. Although actual clinic participation in RTS ranged from 15 to 27 months, we assumed a 12-month time horizon across clinics to make clinic costs comparable. Total costs per clinic ranged from $1601 for clinic #18 to $4005 for clinic #6. Across all 20 clinics, the implementation strategy with the largest site-based cost was educational meetings. The least expensive site-based cost was educational materials for 16 clinics and audit and feedback reports for 4 clinics (specifically, clinics #12, #14, #17, and #19).

FIGURE 1.

Twelve-month site-based costs for each implementation strategy, by RTS clinic. RTS indicates Rethink the Strip.

DISCUSSION

From the health care system perspective, the total cost of implementing and conducting RTS among 20 primary care clinics over a 12-month time horizon was $68,941. Among non–site-based costs, the most expensive implementation strategy was audit and feedback at $14,374, accounting for over 86% of non–site-based costs. Among site-based costs, the most expensive strategy was educational meetings, at $28,012 for the 20 clinics. Site-based costs per clinic ranged from a minimum of $1601 to a maximum of $4005.

A recent scoping review of implementation strategy costs found that multicomponent interventions ranged from $7288 to $3 million.13 This wide range can be partly attributed to the varying scales, contexts, and settings of the interventions, as well as the complexity of the intervention strategies. Among multicomponent interventions focused on primary care clinics, costs ranged from $7570 to $65,335 per clinic.19–21 At a total cost of $3447 per clinic, RTS falls below the lower end of this reported range. These lower costs may be partly explained by economies of scale, since previous studies using multicomponent strategies were implemented in substantially fewer clinics than RTS's 20 clinics.

In our study, audit and feedback was the most expensive non–site-based implementation strategy. Other studies have also found audit and feedback to be among the most expensive implementation strategies;19,20,22 in particular, studies in primary care settings have previously estimated audit and feedback to cost on average $26,000.19 In one multicomponent study, audit and feedback was 3 times as costly as the other implementation strategies.22 In RTS, although the non–site-based audit and feedback costs were substantial, site-based audit and feedback costs averaged only $10 a month for a single clinic. Although it may not be feasible to lower non–site-based costs, employing audit and feedback in a large, integrated health care system makes these costs more justifiable: because the non–site-based costs are largely fixed, adding more clinics lowers the total audit and feedback cost (ie, site-based plus non–site-based) per clinic. In other words, if a future health care system had 40 clinics instead of 20 participating in RTS for a 12-month time horizon, total audit and feedback costs per clinic would decrease.
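Using the study's 12-month figures (non–site-based audit and feedback costs of $14,374 and site-based costs of ~$120 per clinic), a simple back-of-the-envelope model of the per-clinic cost as a function of the number of clinics n illustrates this point:

```latex
\bar{C}_{\text{A\&F}}(n) \;=\; \frac{\$14{,}374}{n} + \$120
\quad\Longrightarrow\quad
\bar{C}_{\text{A\&F}}(20) \approx \$839,
\qquad
\bar{C}_{\text{A\&F}}(40) \approx \$479
```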

With the COVID-19 pandemic, the RTS team transitioned from in-person to virtual educational meetings, and changing the delivery of this implementation strategy decreased its costs. Virtual meetings do have limitations: studies among medical students found virtual training to be less effective than in-person training for developing practical, hands-on skills.23,24 In terms of medical knowledge and information gain, however, virtual training has been shown to be as effective as in-person training for medical education.25–27 Because RTS educational meetings focused on disseminating information and the most recent evidence on the utility of SMBG, virtual and in-person meetings were likely equally effective, although this is an area in which further research is still needed.

A few limitations should be noted. First, our findings may have limited generalizability, as RTS was conducted only in North Carolina. However, we used nationally based salary estimates (instead of the reported salaries of the RTS study team and clinic providers and staff) when accounting for personnel time. Second, only provider champions and practice managers at 3 clinics tracked their time on RTS activities, and only for a limited, 3-month period. Thus, the costs of the provider champion implementation strategy may be more variable than estimated. At the same time, because these data were collected alongside RTS, we were able to mitigate recall bias, which has the potential to distort estimates of labor-related costs.14 Finally, RTS was in progress at the beginning of the COVID-19 pandemic; consequently, as noted above, educational meetings transitioned from in-person to virtual, which influenced their costs. Even with the ongoing pandemic, however, the same implementation strategies were used and delivered in the same manner as before the pandemic.

Our study addresses key gaps in the implementation strategy cost literature. First, the individual costs of each implementation strategy are not always reported.20,28–30 In the current study, we go beyond the existing literature by not only itemizing the discrete costs of each implementation strategy but also showing that these costs can vary within a single implementation strategy; for example, across 20 primary care clinics, the site-based costs of conducting educational meetings varied by almost $1900. Second, across interventions using the same implementation strategies, costs can vary widely due to differences in the delivery and content of the strategies. For example, practice facilitation is broadly defined as activities and tasks, delivered by a single individual or a team, that support the implementation of evidence-based care.31 In RTS, a single practice facilitator prospectively tracked their tasks and time on non–site-based activities (ie, recruiting clinics to participate in RTS) and site-based activities (ie, clinic visits and communication with clinic personnel by email, phone, or in person). Detailing these activities helps explain how the cost of an implementation strategy used in RTS may differ when the strategy is used in another intervention.

This cost analysis assesses the financial feasibility of a multicomponent approach using implementation strategies within primary care settings. Estimating the cost of RTS from the perspective of the health care system will aid efficient resource allocation. Providing estimates for multiple time periods, particularly the 12-month time horizon, will be important for decision-makers as they set priorities within the confines of a health care system's annual budget. Finally, these real-world results demonstrate the feasibility and affordability of using implementation strategies to de-implement SMBG among patients with type 2 diabetes mellitus.

Supplementary Material

SUPPLEMENTARY MATERIAL
mlr-61-708-s001.docx (15.7KB, docx)

Footnotes

Research reported in this publication was funded through a Patient-Centered Outcomes Research Institute Award DI-2018C1-10853. Further support was provided by the National Center for Advancing Translational Sciences, National Institutes of Health, through Grant Award Number UL1TR002489.

L.P.S. has received salary support from AstraZeneca paid to her institution for unrelated work. The remaining authors declare no conflict of interest.

Supplemental Digital Content is available for this article. Direct URL citations are provided in the HTML and PDF versions of this article on the journal's website, www.lww-medicalcare.com.

Contributor Information

Lisa P. Spees, Email: lspees21@live.unc.edu.

Laura A. Young, Email: laura_young@med.unc.edu.

Jennifer Rees, Email: jennifer_rees@med.unc.edu.

Kathleen Mottus, Email: kmottus@email.unc.edu.

Jennifer Leeman, Email: jleeman@email.unc.edu.

Marcella H. Boynton, Email: marcella_boynton@med.unc.edu.

Erica Richman, Email: elr@email.unc.edu.

Maihan B. Vu, Email: maihan.vu@unc.edu.

Katrina E. Donahue, Email: katrina_donahue@med.unc.edu.


