Abstract
Background
Decision aids (DAs) are evidence-based tools to improve patient-centered care, but their use in routine care is limited. The purpose of this project was to work with orthopedic practices to deliver DAs.
Methods
Eligible sites needed to identify an administrative and clinical champion and have access to DAs for treatment of hip, knee, and/or spine conditions. The implementation strategies included an Orthopaedic Learning Collaborative (OLC), external facilitation, and audit and feedback. The project was conducted over 15 mo with 5 OLC sessions, individual monthly meetings, and monthly data reports. Clinicians and staff completed a baseline survey prior to the start of the project. Sites provided details on their DA workflow and number of DAs delivered. We calculated adoption (the number of specialists who used DAs) and estimated reach (percentage of eligible patients who received DAs). We calculated descriptive statistics and explored predictors of reach.
Results
Twelve participating sites had an average annual orthopedic surgical volume of 550, more than half (7/12) were academic medical centers, and some (4/12, 33%) had prior experience with orthopedic DAs. Adoption was 76% (60/79 physicians). Sites distributed 9,626 DAs and reached 44% of eligible patients (range 7%–100%). Sites that indicated at baseline that DA delivery was a high priority for staff had higher reach (60% for high v. 47% for moderate v. 9% for low priority, P = 0.21). Sites with no prior experience with DAs had higher reach than those with prior experience (60% v. 38%, P = 0.26, d = 0.71).
Conclusions
Participating sites were able to implement workflows that reached about half of eligible patients. Establishing DA delivery as a staff priority at the outset appears important for reach, whereas prior DA experience does not appear necessary.
Highlights
The 12 sites were able to reach, on average, 44% of eligible patients with decision aids in routine care, demonstrating the feasibility of distribution.
The study and associated implementation toolkit provide concrete examples of workflows for orthopedic practices interested in incorporating decision aids into routine care.
A bundle of implementation strategies, including a learning collaborative, external facilitation, and audit and feedback, helped most sites meet targets for decision aid implementation.
Keywords: decision making, shared, decision support techniques, quality improvement, learning collaborative, total hip arthroplasty, total knee arthroplasty, spine, lumbar
Clinical guidelines, including those for the treatment of hip and knee osteoarthritis and lumbar spine issues, emphasize the importance of engaging patients in shared decision making (SDM) to determine the appropriate treatment.1–3 Decision aids (DAs) are evidence-based educational tools that promote SDM. Previous studies have found that orthopedic DAs improved decision quality, reduced decisional conflict, and increased surgeon satisfaction with the consultation compared with usual care.4,5 Further, studies suggest that orthopedic DAs may also increase trust and improve quality of life and patient experience.6–10 Taken together, this evidence establishes DAs as evidence-based interventions that improve patient-centered care.
The significant benefits of DAs will be realized only if they are integrated in routine care. However, studies have documented barriers to SDM and the use of patient DAs. For instance, a Cochrane systematic review of studies across different clinical settings identified common barriers to SDM implementation including time constraints, perceived lack of applicability, and lack of clinician support.11,12 Organizational and system-level barriers include lack of team-based culture, limited leadership support, and misaligned financial incentives.13–15 Implementation studies need to be aware of and proactively address these potential barriers.
Implementation science focuses on the development and evaluation of strategies to promote the uptake of evidence-based practices. Most studies employ multiple implementation strategies to achieve desired implementation outcomes.16 Learning collaboratives are one implementation strategy to support the dissemination of evidence-based interventions.17–20 A key feature of learning collaboratives is the opportunity to teach and learn from others: participating sites share barriers and facilitators and discuss solutions, leveraging collective wisdom to improve outcomes.21,22 A recent systematic review of learning collaboratives found that 83% of the included studies showed improvement on at least 1 of the collaborative’s primary outcome measures.20 Facilitation is another implementation strategy that uses experts (internal or external to the site) to provide flexible, problem-solving support to promote the adoption of evidence-based practices.23,24 These implementation strategies are effective and have the potential to address some of the known barriers to the implementation of DAs.
The main purpose of this project was to engage orthopedic sites to deliver DAs. The primary outcome was reach (defined as the average percentage of eligible patients given a DA). Secondary outcomes included adoption (defined as the percentage of specialists using DAs) and implementation (sites that reached at least 50% of patients with DAs for 6 or more months). Exploratory analyses examined factors associated with reach.
Methods
Although the learning collaborative was just one of the strategies, we refer to the overall project as the Orthopedic Learning Collaborative (OLC).
Conceptual Frameworks
We used the Consolidated Framework for Implementation Research (CFIR) to guide the implementation activities and the RE-AIM framework (Reach, Effectiveness, Adoption, Implementation, and Maintenance) to guide the evaluation activities.25,26 CFIR is an explanatory model derived from several theories in implementation science, designed to help understand the implementation of complex interventions.26,27 The framework has 5 domains that we used to guide the design of the learning collaborative sessions and to inform some of the data collection activities. For example, one CFIR domain is “intervention characteristics”; in the baseline needs assessment survey, we measured how clinicians perceived the advantages, complexity, and barriers to use of DAs in their practice. We then used the results to identify topics to cover in the learning collaborative sessions (e.g., addressing issues and barriers that were common across sites) as well as topics to focus on in the individual facilitation sessions. The RE-AIM framework was used to define the reach, adoption, and implementation metrics selected to evaluate the impact of the project. In addition, we conducted exploratory analyses to determine whether elements of the different CFIR domains predicted reach outcomes.
Setting and Participants
Core project team
The core team leading the implementation strategies included 2 decision scientists (K.S., K.V.), a physician advisor (R.W.), a project manager (F.M.), and a senior research coordinator (H.V.), each with extensive experience with SDM, DA implementation, and continuous quality improvement methods. Two team members (K.S., R.W.) had prior experience leading and participating in learning collaboratives.
Internal sites
Three of 4 sites affiliated with the core project team’s health system agreed to have their hip/knee centers and spine centers participate in the OLC.
External sites
We identified a convenience sample of 15 sites through study advisors, colleagues, and co-investigators. We e-mailed the chief of orthopedic surgery at each of the 15 sites with an invitation to join the OLC. For the 10 sites that indicated interest, we followed up with a Zoom meeting to discuss roles and responsibilities. We required sites to provide a Letter of Commitment from their leadership identifying both a clinical (e.g., surgeon or physician) and an administrative (e.g., practice manager or nurse) champion, confirming institutional support for the champions to participate in the OLC activities, and identifying the DA they planned to use. Nine of the 10 interested sites completed the required materials to join the OLC.
Site team members
Internal and external sites had an administrative champion and a clinical champion who attended the OLC and monthly meetings. The site champions then identified members of their local implementation team (such as nurses, front desk staff, or others) to carry out the DA delivery workflow.
Project advisors
We had 3 orthopedic patient advisors, an implementation science expert, a health insurance quality director, and a representative from the American Academy of Orthopaedic Surgeons who worked with the core project team on overall study design and participated in the OLC meetings (e.g., leading panel discussions, sharing insights for sites).
Interventions
The interventions were DAs for patients considering treatment of hip or knee osteoarthritis, lumbar spinal stenosis, or lumbar herniated disc. Nine sites had access to DAs through their institution from 4 different vendors (Healthwise, EBSCO, OM1Joint, and Wisercare). All of the DAs could be accessed online and integrated into the electronic medical record; only the Healthwise DAs could also be delivered via PDF or paper. Three sites obtained access to Healthwise DAs (PDFs and Web links) through their participation in this project. All of the DAs met the International Patient Decision Aids Standards criteria, and 3 vendors’ (Healthwise, EBSCO, and OM1Joint) orthopedic DAs had evidence of effectiveness from clinical trials.6,7,28
Implementation Strategies
The descriptions of the 3 main implementation strategies used in this project follow the reporting guidelines from Proctor et al.16
Learning collaborative: We conducted 4 virtual sessions and 1 in-person session over a 15-mo period. The virtual sessions were 90 min in length, and the in-person session was a dinner followed by a full day (∼7 h of content). The early sessions focused on team building and quality improvement techniques, while later sessions focused on DA delivery workflow adaptations, electronic health record (EHR) integration, and strategies to overcome barriers (see Table 1). The clinical and administrative champions attended, along with the core team and patient advisors. The learning collaborative strategy was selected based on evidence of positive impact.20
External facilitation: A core team member was assigned to each site and hosted a 30-min monthly check-in call with the clinical and administrative champions. The facilitator reviewed baseline needs assessment data to identify strengths and potential barriers, supported the site champions in designing the initial workflow, and then, once the site launched, reviewed rates of DA delivery, discussed any challenges and successes, and addressed any questions. The main goal was to problem-solve and tailor support to individual sites’ needs. This strategy was selected based on evidence of effectiveness and because the participating sites had varying levels of background experiences.23,24
Audit and feedback: An online dashboard was available to OLC participants that summarized monthly DA delivery across all sites and individually for each site by clinician. The data reports were shared and reviewed at the monthly facilitation meetings. The focused data helped to identify issues (e.g., lower than expected numbers, variability across surgeons) and successes (e.g., high-performing clinicians). This strategy was selected based on the core team’s prior experience of its positive impact on decision aid use29 and general evidence of effectiveness.30
Table 1.
Learning Collaborative Session Topics
| Activity | Main Topics |
|---|---|
| Virtual session 1 (90 min) | • Introductions and ice breaker • Baseline needs assessment results and insights • Workflow design templates and elements • Start-up strategy example plans |
| Virtual session 2 (90 min) | • DA distribution review and site updates • Breakout groups: building motivation and effective pitches for leadership, surgeons, and staff • Panel discussion: EHR integration (sites 1–3 present on their experiences) |
| Virtual session 3 (90 min) | • DA distribution review and site updates • Panel discussion: DA integration into EHR (sites 5, 6, 9, and 12 share experiences) • Expanding from DAs to SDM |
| In-person session (7 h) | • DA distribution review and site updates • Small-group discussions: Making the Case for SDM with Data (tracking DA distribution, collecting decision quality, automating measurement, and using data for improvement) • Panel discussion: Creating Quality and Financial Incentives for DAs and SDM • Panel discussion: Strategies for Sustainability and Spread • SDM skills training • Cost analysis for different workflows |
| Virtual session 4 (90 min) | • Starting strong (reflections on launching the project) • Transitions discussion on how to sustain change • Implementation toolkit review • Cost analysis project update • Awards and accomplishments |
DA, decision aid; EHR, electronic health record; SDM, shared decision making.
There were no fees for sites to participate in the OLC. External sites received a small stipend (about $1,500) to support data collection activities.
Data Collection
Baseline needs assessment survey
A survey assessed different core CFIR elements, including attitudes toward SDM and DAs, perceived internal and external support, as well as potential barriers and resources for implementation of DAs. At the internal sites (sites 1–3), all surgeons completed a 15-item survey in January to March 2021. At the external sites (sites 4–12), staff (e.g., schedulers, medical assistants) and clinicians (nurse practitioners, physician assistants, and surgeons) completed slightly adapted versions of the survey from August 2021 to December 2021. The items are included in Supplemental Table 1.
DA delivery data
Sites tracked the number of DAs delivered from November 2021 through January 2023 and sent monthly reports to the OLC core team. For each DA delivered, sites tracked the surgeon, delivery timing (previsit, day-of-visit, or after visit), DA language, and, when possible, patient demographics (age, sex, primary language, and race/ethnicity).
Statistical Analysis
We summarized the baseline survey data and DA data using descriptive statistics. From the RE-AIM framework, we focused on the reach, adoption, and implementation metrics. We defined reach as the percentage of eligible patients who received a DA over the 15-mo learning collaborative. The numerator was the total number of DAs delivered. For the denominator, we estimated the number of potentially eligible patients by multiplying the site’s annual surgical volume by 2 (prior work at one site found that about half of new hip/knee patients went on to have surgery, i.e., a 50% surgical yield) and then by 1.25 (to account for the 15 mo of the learning collaborative). The 50% yield estimate is in the middle of the range reported in a meta-analysis of orthopedic studies (including hips and knees), which found yields of 23% to 70% depending on the referring clinician,31 and is plausible for centers with a mix of referrals. For spine, published surgical yields of 42% to 52% suggest the estimate is also appropriate.32,33 Each site could adjust the surgical yield estimate as needed to reflect its practice; only 1 site requested an adjustment. Site 12 provided an estimate of monthly eligible patient volume (not surgical volume), which was multiplied by 15 for its denominator. For the reach metric, we divided the total DAs delivered by the estimated number of eligible patients, and the resulting percentage was truncated at 100% as needed.
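The denominator calculation above can be sketched in code. This is an illustrative implementation of the stated arithmetic (volume × 2 × 1.25, with the cap at 100%), not the study's actual analysis script; the function names and example numbers are hypothetical.

```python
def estimated_eligible_patients(annual_surgical_volume, surgical_yield=0.5, months=15):
    """Estimate eligible patients over the collaborative period.

    Eligible patients per year = surgical volume / surgical yield
    (a 50% yield implies twice as many eligible patients as surgeries).
    The result is scaled to the study window (15 months = 1.25 years).
    """
    return annual_surgical_volume / surgical_yield * (months / 12)


def reach(das_delivered, annual_surgical_volume, surgical_yield=0.5, months=15):
    """Reach = DAs delivered / estimated eligible patients, truncated at 100%."""
    eligible = estimated_eligible_patients(annual_surgical_volume, surgical_yield, months)
    return min(das_delivered / eligible, 1.0)


# A site with 550 annual surgeries has an estimated 550 * 2 * 1.25 = 1,375
# eligible patients over 15 months.
print(estimated_eligible_patients(550))  # 1375.0
print(round(reach(605, 550), 2))         # 0.44
```

Note how the `surgical_yield` parameter mirrors the site-level adjustment described above: a site could supply its own yield in place of the 50% default.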
We defined adoption as the percentage of physicians at each site who had a patient receive at least 1 DA.
For the implementation metric, we were interested in measuring the consistency of delivery of the DAs. We defined implementation as being met if sites reached at least 50% of eligible patients for 6 or more months.
We explored predictors of reach (the percentage of eligible patients reached) for each practice, including site characteristics (e.g., prior DA experience, patient volume), clinician and staff characteristics (e.g., attitudes toward DAs, priority level for staff), and process characteristics (e.g., workflow model, EHR integration). We used correlations to explore the relationships of surgical volume with the number of DAs delivered and with reach. One-way analyses of variance (ANOVAs) with follow-up t tests were used to explore the relationships of reach with DA delivery priority, prior experience with DAs, and EHR integration. Corresponding effect sizes were reported for each analysis: Cohen’s d for t tests (with 0.2, 0.5, and 0.8 indicating small, moderate, and large effects, respectively) and omega squared for ANOVAs (indicating the amount of variance accounted for).34,35
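For readers unfamiliar with the effect-size metric, Cohen's d for two independent groups is the difference in means divided by the pooled standard deviation. A minimal sketch follows; the input values are hypothetical illustrations, not the study data.

```python
import math


def cohens_d(group1, group2):
    """Cohen's d for two independent groups using the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    # Sample variances (n - 1 in the denominator)
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd


# Hypothetical site-level reach proportions for two groups of sites:
no_prior_experience = [0.55, 0.62, 0.70, 0.48]
prior_experience = [0.30, 0.42, 0.38, 0.44]
print(round(cohens_d(no_prior_experience, prior_experience), 2))
```

Values of roughly 0.2, 0.5, and 0.8 are conventionally read as small, moderate, and large effects, matching the thresholds cited above.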
Human subjects
The protocol was reviewed by the MassGeneral Brigham Institutional Review Board and was deemed exempt (protocol 2021P001463). Data use agreements were set up with participating sites, as needed, to cover data-sharing activities.
Results
Of the 4 internal sites invited, 3 agreed to participate, each with both hip/knee and spine practices involved. Of the 15 external sites invited, 10 indicated interest and 9 agreed to join the OLC. The 9 external sites each had 1 practice participate (hip/knee center [n = 8] and spine center [n = 1]). Of the 12 participating sites, more than half (58%, 7/12) were academic medical centers, including 2 safety-net hospitals; the average annual hip or knee arthroplasty or spine surgical volume was about 550 cases (range 104–1,300). The 3 internal sites and 1 external site (33%, 4/12) had prior experience implementing orthopedic DAs, a few sites (17%, 2/12) had experience implementing DAs in other clinical areas, and the rest (50%, 6/12) had no prior experience with DAs. Table 2 describes the participating sites.
Table 2.
Participating Sites and Implementation Outcomes
| Site | Topics | Annual Surgical Volume for Target Conditions | Timing of DA Delivery (Primary Model) | Prior Experience with DAs | EHR Integration | Number of Physicians | Reach: Decision Aids Sent | Implementation: 6 or More Months of ≥50% Reach |
|---|---|---|---|---|---|---|---|---|
| 1 (internal) | Hip, knee, and spine | >1,000 | Previsit | Yes | Full | 15 | 1,392 | Yes |
| 2 (internal) | Hip, knee, and spine | >1,000 | Hip and knee: previsit; Spine: day of visit | Yes | Full | 14 | 2,825 | Yes |
| 3 (internal) | Hip, knee, and spine | >1,000 | Previsit | Yes | Full | 10 | 493 | Yes |
| 4 | Hip and knee | <500 | Day of visit | No | Some | 8 | 220 | Yes |
| 5 | Hip and knee | 500–999 | Day of visit | Yes | Full | 8 | 558 | No |
| 6 | Hip and knee | 500–999 | Previsit | Yes, not in orthopedics | Full | 4 | 1,250 | Yes |
| 7 | Hip and knee | <500 | Day of visit | No | None | 3 | 951 | Yes |
| 8 | Spine only | <500 | Day of visit | No | None | 1 | 235 | Yes |
| 9 | Hip and knee | <500 | Day of visit | No | None | 2 | 73 | No |
| 10 | Hip and knee | 500–999 | Day of visit | No | None | 2 | 1,080 | Yes |
| 11 | Knee only | <500 | Day of visit | No | Some | 2 | 107 | Yes |
| 12 | Knee only | >1,000 a | Previsit | Yes, not in orthopedics | Full | 10 | 442 | No |
DA, decision aid; EHR, electronic health record.
a Estimate of new patient volume (not surgical volume) for this site.
From the internal sites, we received 49 of 63 (78%) responses from surgeons to the baseline needs assessment survey. From the external sites, we received 72 of 107 (67%) responses to the baseline survey from 34 clinicians (e.g., surgeons, nurse practitioners, and physician assistants) and 38 clinic staff (e.g., medical assistants, schedulers, office managers). On average, 90% reported that they always or usually engage patients in SDM (range 33%–100% by site), and 54% were very positive about patients getting DAs (range 0%–100% by site). Respondents reported strong support for SDM from practice leadership (88%; range 78%–100% by site) and from surgeons (88%; range 50%–100% by site), with less perceived support from nonclinical staff (53%; range 0%–100%). Overall, only 45% (range 0%–100% by site) reported being extremely or very familiar with DAs, and few (19%; range 0%–60% by site) reported that getting DAs to patients was part of their job. Some surgeons (36%) reported being at least moderately concerned about the DAs lengthening the visit. Only a few (13%) reported it was very easy to get DAs to patients before a visit. Full baseline data are in Supplemental Table 1.
Sites adopted 2 main workflows, either a previsit DA delivery (5/12) or day-of-visit DA delivery (7/12) model. Figure 1 illustrates an example of each workflow with the main steps: identify eligible patients, deliver the DA, and document delivery and follow-up. In the previsit workflow at the hip and knee center, a nurse reviewed the clinic schedules and determined eligibility based on the program criteria. The nurse sent the DA to eligible patients electronically via the patient portal about a week before the visit. The center also set up a kiosk in the waiting room that had printed copies of the DAs available to patients upon check-in. During the visit, the surgeon reviewed the treatment options and answered any questions about the DA. The center created a smart phrase to support documentation of SDM and DA delivery. In the spine day-of-visit workflow, the clinic modified their check-out form to include the spine DAs. Clinicians (either physicians or nurses) who consulted with patients assessed their eligibility for the DA and indicated whether it should be given by marking it on the check-out form. The clinic staff checking out the patient then provided the DA for the patient to take home and review. There was no formal follow-up, but patients were encouraged to call the office and set up an appointment (if they did not already have a follow-up scheduled) if they had additional questions after reviewing the DA.
Figure 1.
Two workflows illustrating implementation with and without electronic health record integration.
EHR, electronic health record; DA, decision aid; PA, physician assistant.
Sites had varying levels of EHR integration. Half (50%, 6/12) had full integration, in which the DA was delivered, accessed, and tracked via the EHR and/or patient portal. A few (17%, 2/12) had some integration; for example, the DA could be accessed in the patient portal, but access was not tracked in the EHR. The remaining sites (33%, 4/12) had no EHR integration and used paper or PDF copies of the DA with their patients. Figure 1 illustrates 2 different workflows, 1 with and 1 without EHR integration.
During the 15 mo of the learning collaborative, sites distributed a total of 9,626 DAs, or about 642 DAs per month. Patients who received DAs were on average 65 y old, female (51%), and White (56%), and 6% were Hispanic. A small percentage of the DAs disseminated (4%, n = 338) were in languages other than English. Overall, reach at sites was 44% (range 7%–100%). Most sites (75%, 9/12) met the implementation target as they reached at least 50% of patients for 6 or more months. Overall adoption by specialists was 76% (60/79) across the sites. Table 2 shows reach and implementation outcomes, and Figure 2 shows reach and adoption for each participating site.
Figure 2.
Reach and adoption across the 12 sites.
Exploratory analyses examining predictors of reach found that sites that indicated at baseline that DA delivery was a high priority for clinical staff had higher reach (60% reach for high priority v. 47% for moderate priority v. 9% for low priority, P = 0.21, omega squared = 0.22). Surgical volume was correlated with the number of DAs delivered (r = 0.58, P = 0.02) but not with reach (r = −0.17, P = 0.54). Reach varied by prior experience with DAs (P = 0.15, omega squared = 0.01). Sites with no prior experience with DAs had higher reach (60%) than those with some (nonorthopedic) experience (38%; P = 0.58, d = 0.55) or prior orthopedic DA experience (38%; P = 0.26, d = 0.71). We found a similar pattern of results when assessing the relationships of reach with EHR integration and workflow use. As sites with prior experience tended to have full EHR integration and used a previsit model, the findings were similar across these factors (32% reach for full v. 62% for some v. 60% for no EHR integration, P = 0.25, omega squared = 0.06; 35% reach for previsit v. 51% reach for day-of-visit workflows, P = 0.36, d = 0.52).
Discussion
The study provides important new evidence of effective models to guide implementation activities for hospitals and orthopedic practices interested in incorporating DAs into routine care. Most of the participating sites were able to integrate DAs for patients with hip and knee osteoarthritis, lumbar herniated disc, or lumbar spinal stenosis. Most surgeons (76%) at the participating practices adopted the intervention. The sites reached an average of 44% of eligible patients with DAs over the 15-mo OLC. Most sites, 75%, met the implementation target of reaching at least 50% of eligible patients for 6 or more months. Importantly, the participating sites were geographically diverse, included 2 safety-net hospitals, and were of varying size.
In the baseline survey, we found strong acceptability of SDM and DAs among surgeons across the participating sites. We did not identify strong concern about increased visit time, which is often cited as a common barrier to SDM and DA implementation. However, there was low familiarity with DA content, and in prior studies we have found clinicians are much more likely to use and promote DAs when they are familiar with and confident in the content.29 Further, very few felt it would be easy to integrate DAs into practice. Generally, most sites did not need much work convincing their surgeons and staff that DAs were a good thing; rather, the baseline survey identified needs for logistical support and concrete workflows. The early learning collaborative and external facilitation sessions focused on developing strategies for sites to build on the strengths identified in their baseline surveys and to target ways to mitigate potential barriers (e.g., by presenting streamlined workflows for sites to use or adapt).
Reach is a key implementation metric, reflecting the ability to deliver the intervention to eligible patients. The participating sites had a wide range of reach, from 7% to 100%. Two prior publications in orthopedics, both single-center studies, provide some benchmarks for comparison. One study at Group Health (now Kaiser Permanente Washington), published in 2012, reported reaching 41% of hip and 28% of knee patients with DAs.36 A quality improvement study at Massachusetts General Hospital from 2018 reported reach of 56% for lumbar spine patients and 69% for hip and knee patients.37 Another single-center study, in a primary care setting, reported reaching 10% of eligible patients with DAs for low-back pain.38 With a mean of 44%, the average reach of the OLC sites compares favorably with published rates from primary care integration and is similar to what was achieved in published single-center orthopedic studies.
The OLC included 4 sites that had prior experience with orthopedic DAs. At the start of the OLC, these sites did not have reliable DA use and were restarting implementation efforts. They had faced major disruptions to existing workflows from changing DA vendors, changing EHR platforms (which affected the DA-ordering process), and COVID-related staffing shortages. The experienced sites were eager to rebuild their own DA usage and were able to share advice and strategies with the newer sites. Typically, learning collaboratives recruit a cohort of naïve practices, but this project suggests that a learning collaborative may also be a valuable strategy to promote maintenance and sustainability for sites with prior experience. Somewhat surprisingly, the new sites achieved higher reach than did sites with prior experience with DA delivery (60% v. 38%). This may reflect, in part, the unique challenges of maintaining or sustaining a practice compared with starting a new one.39,40 It may also be that the topics covered in the learning collaborative sessions, which were derived from the participating sites (e.g., electronic delivery, which all of the sites with prior experience were already employing), did not address the issues most pressing to the few experienced sites. While data are limited, one systematic review found that only 60% of sites that participated in a learning collaborative sustained the implementation of the evidence-based practice after 1 y.41 One study that examined the sustained use of decision aids specifically found that 49% (46/93) of sites that signed up to use DAs reported sustained use and that lack of clinician support was a barrier to sustained use.42 It will be important to follow sites over time to determine whether DA delivery is maintained after the formal learning collaborative ends and whether this differs for experienced versus naïve sites.
The sites had a wide range of reach, and the exploratory analyses attempted to shed some light on the barriers and facilitators to implementing DAs. Common barriers to SDM implementation in the literature include lack of clinician support,11,12 lack of team-based culture, limited leadership support, and misaligned financial incentives.13–15 The baseline survey attempted to measure some of these factors and did find that some attitudes were associated with reach. Most sites had strong leadership support; this feature was by design, as it was a requirement for joining the OLC. Sites whose staff perceived DA delivery as a high priority were more successful in reaching patients than those who perceived it as a low priority (e.g., 60% reach for high priority v. 9% reach for low priority). This finding underscores the importance of taking time at the beginning of an implementation project to ensure that clinic staff, who are often a critical part of the DA delivery workflow, understand the importance of SDM and DAs. Surgeons and leadership can help by clearly communicating that DA delivery is a priority for the clinic. The previsit workflows, prior experience, and full EHR integration were all associated with lower reach. As these factors were highly correlated, it is unclear which of them might be contributing to the lower reach. A prior study in orthopedics found that patients preferred to receive the information before the visit (compared with after the visit) and that previsit delivery resulted in higher DA usage37; as a result, we strongly encouraged sites to adopt a previsit model (or to move toward one if currently using day-of-visit delivery). EHR integration is very helpful in implementing a previsit model, as most sites no longer mail out materials in advance of visits and many DAs are optimized for electronic viewing. Further research into the effectiveness of various workflows will be important moving forward.
As orthopedic sites transition to new payment models, such as the government’s Merit-based Incentive Payment System (MIPS), having reliable documentation of the delivery of DAs will be important. Under MIPS, the “use of evidence-based DAs to support shared decision making” is identified as one of the improvement activities for the MIPS Value Pathways.43 To support widespread use, particularly for high-volume sites where manual delivery and tracking may be burdensome, comprehensive EHR integration to support DA delivery and documentation will be important. Six sites had full EHR integration, meaning the DA could be ordered, sent, and tracked electronically. One site that had the ability to send DAs to patients via the patient portal also developed a paper workflow to ensure they were capturing all patients and not increasing care disparities for those who may be less comfortable with technology. EHR integration can facilitate implementation and documentation of DAs and help practices meet requirements for new payment models.
There has recently been considerable debate on the relationship of SDM, DAs, and implementation science. Clayman et al. (2024)44 argued that SDM should be considered the evidence-based practice and that decision coaching and DAs should be considered implementation strategies to achieve SDM. In contrast, Matlock and Scherer (2024)45 argued that DAs and decision coaching were more appropriately considered the evidence-based practice and thus the target of implementation activities. Our study follows this latter approach, with the DA as the evidence-based practice and the implementation strategies selected to address known barriers to use (such as limited time, lack of familiarity, and lack of acceptability). Pieterse and Van Bodegom-Vos (2024)46 argued that SDM did not have sufficient evidence of positive impact on patient outcomes to warrant its consideration as an evidence-based practice; instead, they encouraged more hybrid implementation-effectiveness studies before moving directly to implementation. Given that there is considerable evidence of a positive impact of DAs generally and in orthopedics specifically, we felt that the intervention met the criteria for an evidence-based practice.4,5 This study adds important evidence to the discussion of this emerging relationship between these fields.
The main strengths of the study include the diverse group of participating sites, the 4 different DAs that were implemented, and the clear documentation of DA delivery. Of note, both safety-net hospitals (sites 4 and 7) were highly successful in reaching patients, providing important evidence of generalizability to sites serving vulnerable patient populations. The investigators distilled insights from the project into an implementation toolkit for clinical and administrative champions who are interested in integrating DAs into routine care. The toolkit includes tips, templates, and resources to help sites generate leadership support, map out DA delivery workflows, scale up efforts, and monitor results. The toolkit also includes short case write-ups of workflows from participating OLC sites that describe how they engaged clinic staff and used the EHR to facilitate delivery and documentation. The toolkit materials are available on the MGH Health Decision Sciences Center’s website (www.mghdecisionsciences.org).
There are a few limitations that are important to consider. First, due to very limited funds to support data collection, we did not collect data from patients on DA usage or patient-reported outcomes. We felt that focusing on reach, adoption, and implementation would provide a more meaningful contribution, as there is extensive existing literature on the effectiveness of DAs on patient-reported outcomes for orthopedic patients.6,7,9 Second, to calculate reach, we used a denominator estimated from surgical volume. This estimate was reviewed and approved by sites but may over- or underestimate reach depending on the sites’ actual surgical yield (percentage of new patient visits that end up having surgery). Further, we calculated reach at the site level and were not able to calculate it at the individual clinician level, nor were we able to examine the relationship between prior experience with DAs at the clinician level and reach. Third, the sites had to identify clinical and administrative champions, have access to a DA, and document leadership support prior to being accepted into the OLC; as a result, these findings may not be generalizable to all orthopedic practices.
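The reach arithmetic above can be sketched in a few lines. The exact denominator formula is not fully specified in the text, so this is an illustration only: it assumes eligible patients are approximated by new-patient visits inferred from annual surgical volume and surgical yield, prorated over the project period. The function name and all numbers are hypothetical, not the study's actual values.

```python
def estimated_reach(das_delivered, annual_surgical_volume, surgical_yield, months=15):
    """Illustrative reach estimate: DAs delivered / estimated eligible patients.

    Eligible patients are approximated from surgical volume and yield
    (percentage of new-patient visits that end up having surgery),
    prorated over the project period. This is a sketch of the idea,
    not the study's actual denominator formula.
    """
    # Estimated new-patient visits over the project period
    eligible = (annual_surgical_volume / surgical_yield) * (months / 12)
    return das_delivered / eligible

# Hypothetical site: 800 DAs delivered, annual surgical volume of 550,
# 40% surgical yield, over the 15-mo project.
reach = estimated_reach(800, 550, 0.40, months=15)
print(f"Estimated reach: {reach:.0%}")
```

As the limitation notes, the estimate is sensitive to the assumed yield: holding delivery constant, a lower yield implies more eligible patients and therefore a lower estimated reach.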
While there have been a few single-center reports of integration of DAs in routine orthopedic care, this report provides new evidence of effective implementation across a dozen sites. The current study counters the pervasive narrative that it is difficult to implement DAs in routine care. This diverse set of sites distributed close to 10,000 DAs over the 15-mo project without formal funding, demonstrating the feasibility of delivery. Orthopedic practices and care teams should be encouraged to implement DAs, as it is achievable with modest effort. Clinical teams should take advantage of the resources in the implementation toolkit, including contact information for the participating site champions who were willing to talk with others. There are many different implementation strategies, and multiple strategies are often needed to successfully change practice patterns. The strong relationships among sites, built through the learning collaborative model, helped sustain interest in the project as well as motivate action to be accountable to their peers. It was also fun, as the lead center, to have the group and one-on-one interactions with the sites and to help them achieve their goals. Researchers interested in the implementation of patient DAs should consider using learning collaboratives with one-on-one facilitation to promote use in routine care. Finally, it is critical that DA developers design tools that can be integrated easily into routine care: tools with flexibility in the timing and mode of administration and with straightforward eligibility criteria (ideally, such that nonclinical staff and/or patients can determine eligibility).
We have hundreds of trials demonstrating the positive impact of DAs, and to make meaningful improvements in patient-centered care, we need to build up the evidence base on implementation to ensure that DAs get into the hands of patients who need them, when they need them. With modest effort, most orthopedic practices were able to consistently reach a meaningful portion of patients with DAs. DAs are an evidence-based strategy to achieve value-based health care, and sites that integrate them into routine care will benefit from their positive impact, including higher decision quality, better patient experience, and increased trust.
Supplemental Material
Supplemental material, sj-docx-1-mdm-10.1177_0272989X251405892 for Patient Decision Aids into Routine Orthopedic Care: Results from an Implementation Study at 12 Sites by Karen Sepucha, Ha Vo, Felisha Marques, Kathrene D. Valentine, Ayesha Abdeen, Hany Bedair, Antonia F. Chen, Jesse Eisler, David Freccero, Prakash Jayakumar, Emily Kropfl, Kathleen Paul, Benjamin Ricciardi, Daniel Vigil, Richard Wexler, Theresa Williamson, Adolph Yates and Thomas Cha in Medical Decision Making
Acknowledgments
This work was presented at the 2023 Society for Medical Decision Making’s annual meeting in Philadelphia, Pennsylvania, and at the 2024 International Shared Decision Making Conference in Lausanne, Switzerland.
Footnotes
The authors declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: KS reports grant funding from the Patient-Centered Outcomes Research Institute (PCORI) and a leadership role on the International Patient Decision Aids Steering Committee. HV, FM, KV, and SB report grant funding from PCORI. TC reports grant funding from PCORI and consulting fees from Globus, Kuros Biosciences, and Stryker. HB reports grants from PCORI and Zimmer-Biomet; has royalties and received consulting fees from Exactech and Smith & Nephew; and has stock in Exactech. AA reports consulting fees from Depuy, Teladoc, and Smith & Nephew; has a leadership role on the AAOS OITE exam writing committee and AAHKS Quality Committee; and has stocks in Brixton Biosciences. AC reports grants from the Clinical Process Improvement Leadership Program, Foundation for Arthroplasty Research and Education, National Institutes of Health/NIAMS, The Knee Society, Vela Foundation, RJOS/Zimmer Biomet Clinical/Basic Science Research Grant, Orthopaedic Trauma Association, Agency for Healthcare Research and Quality and PCORI Large Conference Grant Program, AAOS BOS Quality and Patient Safety Action Fund, and CMS; received royalties from Stryker, SLACK Incorporated, and UpToDate; received consulting fees from Adaptive Phage Therapeutics, Avanos, BICMD, Convatec, Ethicon, GLG, Guidepoint, Heraeus, IrriMax, Peptilogics, Pfizer, Stryker, Smith & Nephew, and TrialSpark; has a leadership role at AAOS, AJRR, and AAHKS; has stock options in Hyalex, IrriMax, Osteal Therapeutics, Sonoran, and IlluminOss; and is on the editorial boards for the Journal of Bone and Joint Surgery, Journal of Arthroplasty, Journal of Bone and Joint Infection, and Arthroplasty Today. DF reports leadership roles in AAHKS Committee Member and AAOS Committee Member, research support from Conformis and DePuy, and equity in ROMTech. 
PJ reports grants from the Bass Family Foundation, Kozmetsky Family Foundation, The Commonwealth Fund, PCORI, and West Health Foundation; received consulting fees from Proximie Ltd and Pacific Business Group on Health; received payments from the AO Foundation, University of Minnesota Grand Rounds, Optum Health Grand Rounds, and Cedar Sinai Health System; has patents under Nudge Innovations; and has stocks in Ora Medical. BR is on the editorial board for Clinical Orthopaedics and Related Research. AY reports meeting support from the American Academy of Orthopaedic Surgeons and is on the board of directors of the American Academy of Orthopaedic Surgeons. All other authors (JE, EK, KP, DV, RW and TW) did not have interests to report. The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Financial support for this study was provided entirely by a Patient-Centered Outcomes Research Institute (PCORI) Award (DI-2019C2-17151). The funding agreement ensured the authors’ independence in designing the study, interpreting the data, writing, and publishing the report. The views in this article are solely the responsibility of the authors and do not necessarily represent the views of PCORI, its Board of Governors, or its Methodology Committee.
Ethical Considerations: The protocol was reviewed by the Massachusetts General Brigham Institutional Review Board and was deemed exempt, protocol 2021P001463 Learning Collaborative to Implement Patient Decision Aids for Elective Orthopedic Surgery. Data use agreements were set up with participating sites, as needed, to cover data-sharing activities.
Consent to Participate: Written or verbal consent was not required by the Massachusetts General Brigham Institutional Review Board because this was a minimal-risk study. The clinicians and staff invited to complete the baseline survey received an information sheet, and consent was implied by completion of the survey.
Consent for Publication: Not applicable.
ORCID iDs:
Karen Sepucha: https://orcid.org/0000-0002-3762-3880
Ha Vo: https://orcid.org/0000-0002-9450-3412
Kathrene D. Valentine: https://orcid.org/0000-0001-6349-5395
Ayesha Abdeen: https://orcid.org/0000-0001-7180-8259
Antonia F. Chen: https://orcid.org/0000-0003-2040-8188
David Freccero: https://orcid.org/0000-0002-2444-7099
Richard Wexler: https://orcid.org/0009-0007-5671-2620
Adolph Yates: https://orcid.org/0000-0002-2091-7743
Data Availability: Data, analytic methods, and/or study materials will be made available to other researchers upon request to the corresponding author. Researchers interested in the data will need to obtain appropriate human subject approvals and execute a data-sharing agreement.
Contributor Information
Karen Sepucha, Harvard Medical School, Boston, MA, USA; Health Decision Sciences Center, Massachusetts General Hospital, Boston, MA, USA.
Ha Vo, Health Decision Sciences Center, Massachusetts General Hospital, Boston, MA, USA.
Felisha Marques, Health Decision Sciences Center, Massachusetts General Hospital, Boston, MA, USA.
Kathrene D. Valentine, Harvard Medical School, Boston, MA, USA; Health Decision Sciences Center, Massachusetts General Hospital, Boston, MA, USA.
Ayesha Abdeen, Department of Orthopaedic Surgery, Boston Medical Center, Boston, MA, USA.
Hany Bedair, Department of Orthopaedic Surgery, Massachusetts General Hospital, Boston, MA, USA.
Antonia F. Chen, Department of Orthopaedics, Brigham and Women’s Hospital, Boston, MA, USA.
Jesse Eisler, Bone and Joint Institute, Hartford Healthcare, Hartford, CT, USA.
David Freccero, Department of Orthopaedic Surgery, Boston Medical Center, Boston, MA, USA.
Prakash Jayakumar, The Musculoskeletal Institute, The University of Texas at Austin, Dell Medical School, Austin, TX, USA.
Emily Kropfl, Department of Orthopaedic Surgery, University of Pittsburgh School of Medicine, Pittsburgh, PA, USA.
Kathleen Paul, Kaiser Permanente Washington.
Benjamin Ricciardi, Department of Orthopaedic Surgery, University of Rochester School of Medicine, Rochester, NY, USA.
Daniel Vigil, David Geffen School of Medicine at UCLA, Department of Family Medicine, Department of Orthopaedic Surgery, Divisions of Primary Care Sports Medicine (Chief), Los Angeles, CA, USA.
Richard Wexler, Consultant, Portland, ME, USA.
Theresa Williamson, Department of Neurosurgery, Massachusetts General Hospital, Boston, MA, USA.
Adolph Yates, Department of Orthopaedic Surgery, University of Pittsburgh School of Medicine, Pittsburgh, PA, USA.
Thomas Cha, Department of Orthopaedic Surgery, Massachusetts General Hospital, Boston, MA, USA.
References
1. Chou R, Loeser JD, Owens DK, et al. Interventional therapies, surgery, and interdisciplinary rehabilitation for low back pain: an evidence-based clinical practice guideline from the American Pain Society. Spine. 2009;34(10):1066–77. DOI: 10.1097/BRS.0b013e3181a1390d
2. Jevsevar DS. Treatment of osteoarthritis of the knee: evidence-based guideline, 2nd edition. J Am Acad Orthop Surg. 2013;21(9):571–6. DOI: 10.5435/JAAOS-21-09-571
3. Bannuru RR, Osani MC, Vaysbrot EE, et al. OARSI guidelines for the non-surgical management of knee, hip, and polyarticular osteoarthritis. Osteoarthritis Cartilage. 2019;27(11):1578–89. DOI: 10.1016/j.joca.2019.06.011
4. Pacheco-Brousseau L, Charette M, Poitras S, Stacey D. Effectiveness of patient decision aids for total hip and knee arthroplasty decision-making: a systematic review. Osteoarthritis Cartilage. 2021;29(10):1399–411. DOI: 10.1016/j.joca.2021.07.006
5. Stacey D, Lewis KB, Smith M, et al. Decision aids for people facing health treatment or screening decisions. Cochrane Database Syst Rev. 2024;1(1):CD001431. DOI: 10.1002/14651858.CD001431.pub6
6. Sepucha K, Bedair H, Yu L, et al. Decision support strategies for hip and knee osteoarthritis: less is more: a randomized comparative effectiveness trial (DECIDE-OA study). J Bone Joint Surg Am. 2019;101(18):1645–53. DOI: 10.2106/JBJS.19.00004
7. Jayakumar P, Moore MG, Furlough KA, et al. Comparison of an artificial intelligence–enabled patient decision aid vs educational material on decision quality, shared decision-making, patient experience, and functional outcomes in adults with knee osteoarthritis: a randomized clinical trial. JAMA Netw Open. 2021;4(2):e2037107. DOI: 10.1001/jamanetworkopen.2020.37107
8. Brodney S, Sepucha K, Chang Y, Moulton B, Barry MJ. Patients who reviewed a decision aid prior to major orthopaedic surgery reported higher trust in their surgeon. JBJS Open Access. 2022;7(1):e21.00149. DOI: 10.2106/JBJS.OA.21.00149
9. Sepucha K, Atlas SJ, Chang Y, et al. Patient decision aids improve decision quality and patient experience and reduce surgical rates in routine orthopaedic care: a prospective cohort study. J Bone Joint Surg Am. 2017;99(15):1253–60. DOI: 10.2106/JBJS.16.01045
10. Sepucha KR, Atlas SJ, Chang Y, et al. Informed, patient-centered decisions associated with better health outcomes in orthopedics: prospective cohort study. Med Decis Making. 2018;38(8):1018–26. DOI: 10.1177/0272989X18801308
11. Légaré F, Ratté S, Gravel K, Graham ID. Barriers and facilitators to implementing shared decision-making in clinical practice: update of a systematic review of health professionals’ perceptions. Patient Educ Couns. 2008;73(3):526–35. DOI: 10.1016/j.pec.2008.07.018
12. Elwyn G, Scholl I, Tietbohl C, et al. “Many miles to go…”: a systematic review of the implementation of patient decision support interventions into routine clinical practice. BMC Med Inform Decis Mak. 2013;13(S2):S14. DOI: 10.1186/1472-6947-13-S2-S14
13. Scholl I, LaRussa A, Hahlweg P, Kobrin S, Elwyn G. Organizational- and system-level characteristics that influence implementation of shared decision-making and strategies to address them—a scoping review. Implement Sci. 2018;13(1):40. DOI: 10.1186/s13012-018-0731-z
14. Munro S, Manski R, Donnelly KZ, et al. Investigation of factors influencing the implementation of two shared decision-making interventions in contraceptive care: a qualitative interview study among clinical and administrative staff. Implement Sci. 2019;14(1):95. DOI: 10.1186/s13012-019-0941-z
15. Lloyd A, Joseph-Williams N, Edwards A, Rix A, Elwyn G. Patchy “coherence”: using normalization process theory to evaluate a multi-faceted shared decision making implementation program (MAGIC). Implement Sci. 2013;8(1):102. DOI: 10.1186/1748-5908-8-102
16. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139. DOI: 10.1186/1748-5908-8-139
17. de Silva D. Improvement Collaboratives in Health Care: Evidence Scan. Health Foundation; 2014. Available from: https://www.health.org.uk/publications/improvement-collaboratives-in-health-care [Accessed 20 August, 2023].
18. Lindenauer PK. Effects of quality improvement collaboratives. BMJ. 2008;336(7659):1448–9. DOI: 10.1136/bmj.a216
19. Shaw EK, Chase SM, Howard J, Nutting PA, Crabtree BF. More black box to explore: how quality improvement collaboratives shape practice change. J Am Board Fam Med. 2012;25(2):149–57. DOI: 10.3122/jabfm.2012.02.110090
20. Wells S, Tamir O, Gray J, Naidoo D, Bekhit M, Goldmann D. Are quality improvement collaboratives effective? A systematic review. BMJ Qual Saf. 2018;27(3):226–40. DOI: 10.1136/bmjqs-2017-006926
21. Institute for Healthcare Improvement. The breakthrough series: IHI’s collaborative model for achieving breakthrough improvement. Diabetes Spectr. 2004;17(2):97–101. DOI: 10.2337/diaspect.17.2.97
22. Kilo CM. A framework for collaborative improvement: lessons from the Institute for Healthcare Improvement’s Breakthrough Series. Qual Manag Health Care. 1998;6(4):1–14. DOI: 10.1097/00019514-199806040-00001
23. Stetler CB, Legro MW, Rycroft-Malone J, et al. Role of “external facilitation” in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci. 2006;1(1):23. DOI: 10.1186/1748-5908-1-23
24. Harvey G, Loftus-Hills A, Rycroft-Malone J, et al. Getting evidence into practice: the role and function of facilitation. J Adv Nurs. 2002;37(6):577–88. DOI: 10.1046/j.1365-2648.2002.02126.x
25. Glasgow RE, Harden SM, Gaglio B, et al. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Public Health. 2019;7:64. DOI: 10.3389/fpubh.2019.00064
26. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50. DOI: 10.1186/1748-5908-4-50
27. Butler M, Epstein RA, Totten A, et al. AHRQ series on complex intervention systematic reviews—paper 3: adapting frameworks to develop protocols. J Clin Epidemiol. 2017;90:19–27. DOI: 10.1016/j.jclinepi.2017.06.013
28. Elwyn G, Pickles T, Edwards A, et al. Supporting shared decision making using an option grid for osteoarthritis of the knee in an interface musculoskeletal clinic: a stepped wedge trial. Patient Educ Couns. 2016;99(4):571–7. DOI: 10.1016/j.pec.2015.10.011
29. Sepucha KR, Simmons LH, Barry MJ, Edgman-Levitan S, Licurse AM, Chaguturu SK. Ten years, forty decision aids, and thousands of patient uses: shared decision making at Massachusetts General Hospital. Health Aff (Millwood). 2016;35(4):630–6. DOI: 10.1377/hlthaff.2015.1376
30. Jamtvedt G, Young J, Kristoffersen D, Thomson O’Brien M, Oxman A. Audit and feedback: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2003;CD000259.
31. Marks D, Pearce-Higgins J, Frost T, Fittock J, Rathbone E, Hing W. The referrer matters. Musculoskeletal surgical conversion rates: a systematic review with meta-analysis. Health Serv Insights. 2024;17:11786329241304615. DOI: 10.1177/11786329241304615
32. Pennington Z, Michalopoulos GD, Biedermann AJ, et al. Positive impact of the pandemic: the effect of post–COVID-19 virtual visit implementation on departmental efficiency and patient satisfaction in a quaternary care center. Neurosurg Focus. 2022;52(6):E10. DOI: 10.3171/2022.3.FOCUS2243
33. French ZP, Hundal RS, McGee AC, Winzenried AE, Williams SK. Spine surgeon triage of new patient referrals: quantification of surgical conversion rate by clinic referral source. Spine J. 2024;24(8):1478–84. DOI: 10.1016/j.spinee.2024.03.009
34. Lakens D. Calculating and reporting effect sizes to facilitate cumulative science: a practical primer for t-tests and ANOVAs. Front Psychol. 2013;4:863. DOI: 10.3389/fpsyg.2013.00863
35. Olejnik S, Algina J. Generalized eta and omega squared statistics: measures of effect size for some common research designs. Psychol Methods. 2003;8(4):434–47. DOI: 10.1037/1082-989X.8.4.434
36. Arterburn D, Wellman R, Westbrook E, et al. Introducing decision aids at Group Health was linked to sharply lower hip and knee surgery rates and costs. Health Aff. 2012;31(9):2094–104. DOI: 10.1377/hlthaff.2011.0686
37. Mangla M, Cha TD, Dorrwachter JM, et al. Increasing the use of patient decision aids in orthopaedic care: results of a quality improvement project. BMJ Qual Saf. 2018;27(5):347–54. DOI: 10.1136/bmjqs-2017-007019
38. Lin GA, Halley M, Rendle KAS, et al. An effort to spread decision aids in five California primary care practices yielded low distribution, highlighting hurdles. Health Aff. 2013;32(2):311–20. DOI: 10.1377/hlthaff.2012.1070
39. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8(1):117. DOI: 10.1186/1748-5908-8-117
40. Shelton RC, Cooper BR, Stirman SW. The sustainability of evidence-based interventions and practices in public health and health care. Annu Rev Public Health. 2018;39(1):55–76. DOI: 10.1146/annurev-publhealth-040617-014731
41. Scheirer MA. Is sustainability possible? A review and commentary on empirical studies of program sustainability. Am J Eval. 2005;26(3):320–47. DOI: 10.1177/1098214005278752
42. Feibelmann S, Yang TS, Uzogara EE, Sepucha K. What does it take to have sustained use of decision aids? A programme evaluation for the Breast Cancer Initiative. Health Expect. 2011;14:85–95. DOI: 10.1111/j.1369-7625.2010.00640.x
43. Quality Payment Program. Improving care for lower extremity joint repair. 2024. Available from: https://qpp.cms.gov/mips/explore-mips-value-pathways/2024/G0058 [Accessed 6 June, 2024].
44. Clayman ML, Elwy AR, Vassy JL. Reframing SDM using implementation science: SDM is the intervention. Med Decis Making. 2024;44(8):859–61. DOI: 10.1177/0272989X241285418
45. Matlock DD, Scherer L. Shared decision making “ought” to be done, but definitions need simplicity: response to “reframing SDM using implementation science: SDM is the intervention.” Med Decis Making. 2024;44(8):865–6. DOI: 10.1177/0272989X241286880
46. Pieterse AH, Van Bodegom-Vos L. Shared decision making is in need of effectiveness-implementation hybrid studies. Med Decis Making. 2024;44(8):862–4. DOI: 10.1177/0272989X241286516