Abstract
Evidence is needed about how to effectively support health care providers in implementing screening for social risks (adverse social determinants of health) and providing related referrals meant to address identified social risks. This need is greatest in underresourced care settings. The authors tested whether an implementation support intervention (6 months of technical assistance and coaching study clinics through a five-step implementation process) improved adoption of social risk activities in community health centers (CHCs). Thirty-one CHC clinics were block-randomized to six wedges that occurred sequentially. Over the 45-month study period from March 2018 to December 2021, data were collected for 6 or more months preintervention, the 6-month intervention period, and 6 or more months postintervention. The authors calculated clinic-level monthly rates of social risk screening results documented at in-person encounters and rates of social risk-related referrals. Secondary analyses measured impacts on diabetes-related outcomes. Intervention impact was assessed by comparing outcomes in the preintervention period with those in the intervention and postintervention periods, across clinics that had versus had not yet received the intervention. Five clinics withdrew from the study for various bandwidth-related reasons. Of the remaining 26, a total of 19 fully or partially completed all 5 implementation steps, and 7 fully or partially completed at least the first 3 steps. Social risk screening was 2.45 times (95% confidence interval [CI], 1.32–4.39) higher during the intervention period than during the preintervention period; this impact was not sustained postintervention (rate ratio, 2.16; 95% CI, 0.64–7.27). No significant difference was seen in social risk referral rates during the intervention or postintervention periods. The intervention was associated with greater blood pressure control among patients with diabetes and lower rates of diabetes biomarker screening postintervention. All results must be interpreted considering that the Covid-19 pandemic began midway through the trial, which affected care delivery generally and patients at CHCs particularly. The study results show that adaptive implementation support was effective at temporarily increasing social risk screening. It is possible that the intervention did not adequately address barriers to sustained implementation or that 6 months was not long enough to cement this change. Underresourced clinics may struggle to participate in support activities over longer periods without adequate resources, even if lengthier support is needed. As policies start requiring documentation of social risk activities, safety-net clinics may be unable to meet these requirements without adequate financial and coaching/technical support.
Social risks — the downstream material manifestations of adverse social determinants of health — are associated with a higher risk of chronic diseases such as type 2 diabetes, and hamper individuals’ ability to effectively self-manage these diseases.1–9 Clinical providers seeking to help mitigate the impacts of social risks (e.g., by referring patients to social services such as food banks) must know about patients’ risks. One strategy to increase provider awareness of social risks involves conducting social risk screening and documenting screening results in the electronic health record (EHR). Such screening is foundational to subsequent activities for mitigating the health impacts of social risks, such as adjusting care plans, connecting patients with community services, and conducting related advocacy efforts.
Numerous national initiatives now recommend incorporating social risk screening10–13 into health care settings, and rates of such screening are increasing. This is particularly true in community health centers (CHCs),14,15 which primarily serve low-income populations that disproportionately experience poverty-associated barriers to health promotion. Many CHCs were early adopters of social risk screening, but some have found it difficult to conduct systematic screening with EHR documentation of those screening results or to sustain or expand screening efforts.16,17 From 2016 to 2018, while 67% of 107 CHCs conducted any social risk screening, only 2% of all patients at these clinics had documented results of such screening.18 This pattern is not unique to CHCs — in the few prior articles on social risk screening in other care settings, screening rates have similarly varied widely.19
Increasing systematic social risk screening and referrals — and then ensuring that screening results and referrals made are documented in the EHR — will require addressing the multiple implementation barriers described in a 2019 National Academies of Sciences, Engineering, and Medicine report20 and other publications.21–23 Implementation barriers include obtaining leadership and staff buy-in,20 especially if follow-up service referrals are not feasible; determining which patients to screen for which social risks and how often; configuring the EHR to easily ingest screening data and present them in a useful manner; developing and adopting effective workflows (e.g., determining which staff will conduct screening at which step during or before a clinical encounter, and whether staff expected to enter such data can access the appropriate EHR interfaces); deciding when and how documented social risks will be reviewed and whether and how the clinic will act on reported risks; maintaining up-to-date lists of social service agencies, if referrals to such agencies are planned; staffing and resource limitations; and others.20
Given these challenges, multifaceted implementation support may be necessary to enhance adoption of social risk screening, referral-making, and documentation of screening results and referrals in the EHR. Empirical evidence is needed about which practices for supporting this implementation will be most useful for CHCs, which are chronically underresourced. This trial (R18DK114701) examined whether an implementation support intervention — 6 months of tailored technical assistance and coaching — improved EHR documentation of social risks and associated referral-making in CHCs. It also assessed the intervention’s impact on diabetes outcomes. To our knowledge, no previous studies have rigorously tested a multicomponent implementation support intervention designed to facilitate the adoption of social risk screening and related referral-making in any health care setting. Trial results should inform the work of health care organizations aiming to implement social care initiatives in underresourced care settings.
Methods
Data Source and Trial Design
OCHIN, Inc. (a nonprofit health care innovation center based in Portland, Oregon) serves a national network of locally controlled organizations that provide independent, community-based care to patients across a variety of health care settings; as of 2021, OCHIN supported more than 21,000 providers, who reach more than 6 million patients, across nearly 1,000 community health care sites in 45 states.
At the beginning of our study, in 2018, there were 593 OCHIN member clinic sites in 16 states that shared a single instance of the Epic EHR. Our stepped-wedge trial included 31 CHC clinics recruited from OCHIN’s member CHCs; these 31 clinics are located in California, Georgia, Massachusetts, Montana, Ohio, Oregon, Washington, and Wisconsin. We included no more than two clinics from a given CHC organization. Recruitment occurred in two waves (14 in spring 2018 and 17 in fall 2019) to ensure that no recruited clinics waited more than 1 year for the intervention. Each set of clinics was block-randomized to wedges 1–3 and 4–6, respectively. A stepped-wedge design was chosen because it suits interventions that cannot be rolled out to all sites simultaneously while still allowing every site to eventually receive the intervention; outcomes before implementation are compared with outcomes during and after the intervention. Clinics were eligible to participate if they were interested in implementing or expanding social risk screening and/or referral activities. They had to commit to identifying staff members to serve as a Clinician Champion and/or Operational Champion for the project and allowing those staff to participate in intervention activities (≥2 hours per month interacting with the implementation support team).
Intervention
The intervention details and conceptual frameworks underlying this study have been described previously.24 In brief, study clinics received 6 months of technical assistance in the use of relevant EHR tools (within the shared Epic EHR) and practice coaching in how to use these tools in clinic workflows, both tailored to the individual clinic’s needs. (The relevant tools supported: identifying patients due for social risk screening; customizing which patients were considered due; documenting and reviewing screening results; and ordering social service referrals.25,26) At the outset, one person served as both EHR trainer and practice coach. By wedge 3, we modified the intervention staff structure by bringing on a separate practice coach. Thereafter, the EHR trainer and practice coach worked as a team to support clinics (e.g., preparing for, leading, and debriefing after clinic meetings and further supporting clinics as needed by email or calls).
The trainer/coach team guided each wedge of study clinics through a five-step implementation process:27 (1) secure leadership buy-in; (2) set goals; (3) develop workflows; (4) orient staff; and (5) implement and iterate. Throughout the 6-month intervention, the dedicated trainer/coach team met with clinic representatives two to three times monthly and tracked clinic progress in completing these steps. All meetings were conducted via video conferencing with one clinic at a time, a feature intended to enhance the intervention’s potential scalability. The coach and trainer each spent about 3–6 hours per month per clinic meeting with and preparing to meet with the study clinics. The intervention was designed to address barriers to social risk screening/referral implementation identified in a prior pilot study (R18DK105463) and as reported in the literature. Prior evidence suggested that each intervention component — practice coaching/facilitation, technical assistance, interdisciplinary support teams, tailored support, staff training, feedback data, goal identification, leadership engagement, peer-to-peer learning, orientation materials, and how-to guides — had the potential to effectively support practice changes in primary care settings.28–35
Study Period
The study period was March 2018 through December 2021 and included six 6-month intervention wedges. The first 6-month intervention (wedge 1) began in September 2018, and the last (wedge 6) began in January 2021. This allowed for at least 6 months of data collection before each wedge started and after the last wedge ended. Thus, all months before the 6-month intervention periods were considered preintervention, the 6 months of the intervention were the intervention phase, and all months from the intervention period’s end through December 2021 were postintervention. Of note, wedge 4 began in February 2020, as the Covid-19 pandemic began affecting care delivery in the study clinics.
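To illustrate how this classification can be operationalized, the sketch below assigns each clinic-month to a study phase based on its wedge’s start month. This is an illustrative sketch only; the function and variable names are our own assumptions rather than the study’s analytic code, and the wedge start months are taken from Table 1.

```python
import pandas as pd

# Illustrative wedge start months (from Table 1); each clinic inherits its wedge's start date.
WEDGE_START = {1: "2018-09", 2: "2019-03", 3: "2019-09",
               4: "2020-02", 5: "2020-08", 6: "2021-01"}

def classify_phase(month: pd.Period, wedge: int) -> str:
    """Label a clinic-month as pre-, during-, or postintervention relative to its wedge."""
    start = pd.Period(WEDGE_START[wedge], freq="M")
    if month < start:
        return "preintervention"
    if month < start + 6:          # the 6-month hands-on intervention window
        return "intervention"
    return "postintervention"

# Example: a wedge-4 clinic observed in May 2020 falls in the intervention phase.
print(classify_phase(pd.Period("2020-05", freq="M"), wedge=4))
```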
Outcome Measures
Patient- and encounter-level data were aggregated to the clinic level, limited to people 18 years of age or older. Patients seen only for Covid-19 vaccination/testing were excluded (n = 3,720; 0.7% of the total study sample). Primary analyses centered on social risk screening and related referrals and included two outcome measures.
The first outcome measure was the monthly clinic rate of social risk screening, measured as the number of patients with documented social risk screening results entered at a face-to-face clinical encounter in the measurement period (excluding encounters only for Covid-19 testing/vaccination, because many people who received these services at the study sites were not otherwise patients at these clinics). Domains of social risk screening included child/family care insecurity, education, employment, financial strain, food insecurity, health insurance, health literacy, housing instability, inadequate physical activity, relationship safety, social isolation, stress, transportation needs, and utilities insecurity. Because the shared EHR enabled clinics to select from several commonly used social risk screening tools, the use of any of the questions from any of these tools was counted.
The second outcome was the monthly clinic rate of provision of social risk-related referrals, measured as the number of patients with a documented referral among all patients seen in the measurement period (regardless of whether social risk screening was documented). This outcome included referrals internal (e.g., to a social worker) or external (e.g., to housing services) to the clinic. Procedure and diagnosis codes were used to identify such referrals. Some codes are specific enough that they were considered to indicate a social risk-related referral on their own. Other codes, such as referrals to a social worker, were more ambiguous and thus were only considered a social risk-related referral in the presence of a related positive social risk screening result before or on the same date as the referral code.
The EHR enabled documenting when patients declined to answer social risk screening questions or declined offered referrals. Documented declinations were considered to indicate that screening or referral actions were taken and were included in the numerators described above.
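The outcome logic described above can be illustrated with a brief sketch. This is hypothetical: the column names, helper functions, and referral code lists below are illustrative assumptions, not the study’s actual EHR extract or code sets.

```python
import pandas as pd

# Hypothetical referral code lists; the study's actual procedure/diagnosis code sets are not shown here.
SPECIFIC_REFERRAL_CODES = {"FOOD-PANTRY-REF", "HOUSING-SVC-REF"}  # specific enough to count on their own
AMBIGUOUS_REFERRAL_CODES = {"SOCIAL-WORK-REF"}                    # count only alongside a positive screen

def is_social_risk_referral(code: str, referral_date, positive_screen_dates) -> bool:
    """Classify a referral code per the rule above: ambiguous codes count as social
    risk-related only if a positive screen was documented on or before the referral date."""
    if code in SPECIFIC_REFERRAL_CODES:
        return True
    if code in AMBIGUOUS_REFERRAL_CODES:
        return any(d <= referral_date for d in positive_screen_dates)
    return False

def monthly_screening_rate(encounters: pd.DataFrame) -> pd.DataFrame:
    """Per clinic-month: patients with any documented screening result (documented
    declinations count) divided by all patients seen that month."""
    df = encounters.copy()
    df["month"] = df["date"].dt.to_period("M")
    seen = df.groupby(["clinic_id", "month"])["patient_id"].nunique().rename("patients_seen")
    screened = (
        df[df["screen_documented"]]
        .groupby(["clinic_id", "month"])["patient_id"]
        .nunique()
        .rename("patients_screened")
    )
    out = pd.concat([seen, screened], axis=1).fillna(0)
    out["screening_rate"] = out["patients_screened"] / out["patients_seen"]
    return out
```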
Given the known association between social risks and diabetes outcomes, secondary analyses assessed intervention impacts on diabetes control and receipt of relevant diabetes care. Patients with an encounter during the study period and established diabetes before the second month of their clinic’s baseline period comprised this subpopulation cohort (excluding pregnant women). Guideline-concordant diabetes-related care was considered monthly for this cohort and included whether patients were up to date on receipt of their: (1) annual lipid panel; and (2) biannual hemoglobin A1c (HbA1c) screening. Three diabetes-control measures were assessed monthly among patients screened that month: (1) blood pressure (BP; <130/80 mm Hg); (2) HbA1c (<7.0%); and (3) low-density lipoprotein (LDL; <100 mg/dL).
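For clarity, the control thresholds used in the secondary analyses can be expressed as simple checks. The helper below is hypothetical and assumes that BP control requires both the systolic and diastolic values to fall below their respective thresholds.

```python
def diabetes_control_status(systolic: float, diastolic: float, hba1c: float, ldl: float) -> dict:
    """Apply the three control thresholds used in the secondary analyses."""
    return {
        "bp_controlled": systolic < 130 and diastolic < 80,  # BP <130/80 mm Hg
        "hba1c_controlled": hba1c < 7.0,                     # HbA1c <7.0%
        "ldl_controlled": ldl < 100.0,                       # LDL <100 mg/dL
    }
```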
Baseline Covariates
The baseline period was defined as the 6 months before each clinic’s wedge began. Analyses accounted for clinic-level baseline measures: number of years since the clinic began using their current EHR; whether the clinic conducted screening at or above the 50th percentile for all study clinics (to capture prior experience with such screening); and patient characteristics aggregated to the clinic level (Table 1).
Table 1.
Wedge Characteristics in the 6-Month Baseline Period
 | Wedge 1 | Wedge 2 | Wedge 3 | Wedge 4 | Wedge 5 | Wedge 6
---|---|---|---|---|---|---
Clinics | N = 4 | N = 5 | N = 5 | N = 5 | N = 6 | N = 6 |
Intervention started | 9/2018 | 3/2019 | 9/2019 | 2/2020 | 8/2020 | 1/2021
Urban location, N | 4 | 5 | 4 | 5 | 4 | 5 |
Accountable Health Communities participant,* N | 2 | 2 | 1 | 1 | 0 | 1 |
Clinic characteristics | ||||||
Years using OCHIN Epic EHR, EpicCare | ||||||
Median (range)** | 1 (0–6) | 3 (1–10) | 4 (−2 to 6) | 2 (−2 to 11) | 2 (−1 to 3) | 1 (−1 to 5) |
Encounters | ||||||
Median | 3,532 | 14,586 | 4,449 | 16,486 | 23,682 | 41,591 |
Range | 579–23,827 | 1,755–108,301 | 520–14,574 | 7,539–63,521 | 13,151–32,088 | 4,502–147,310
Telehealth encounters, % (SD) | 0.0 (0.0) | 0.0 (0.0) | 0.3 (0.4) | 0.1 (0.2) | 11.5 (8.0) | 13.8 (9.3) |
Patients (≥18 years of age) | ||||||
Median | 601 | 3,285 | 1,505 | 2,345 | 4,473 | 9,088 |
Range | 161–5,628 | 695–20,977 | 264–3,428 | 1,653–10,765 | 2,576–5,856 | 1,724–11,901 |
Age, median (range), y | 36 (24–47) | 44 (43–50) | 45 (29–46) | 45 (39–55) | 44 (35–53) | 45 (41–57)
Female, % (SD) | 58.2 (20.6) | 60.6 (2.9) | 61.1 (21.2) | 55.0 (12.1) | 52.3 (11.2) | 57.5 (6.3) |
Race/ethnicity, % (SD) | ||||||
Hispanic | 9.9 (9.7) | 24.2 (21.1) | 30.4 (21.5) | 15.2 (18.7) | 19.1 (21.0) | 40.9 (34.8) |
Non-Hispanic Black | 21.7 (38.1) | 4.5 (5.2) | 34.1 (40.4) | 10.0 (12.1) | 15.0 (21.5) | 5.7 (6.8) |
Non-Hispanic, non-Black, and nonwhite | 24.6 (42.8) | 21.5 (34.7) | 3.3 (1.6) | 3.3 (1.9) | 3.6 (4.0) | 3.9 (3.2) |
Non-Hispanic white | 41.2 (38.5) | 46.5 (31.6) | 27.6 (14.2) | 61.0 (20.4) | 54.4 (23.7) | 42.1 (35.7) |
Not documented in EHR | 2.5 (2.5) | 3.4 (1.2) | 4.7 (4.2) | 10.6 (2.5) | 7.9 (4.8) | 7.3 (3.2) |
Primary language, % (SD) | ||||||
English | 71.7 (42.9) | 64.1 (33.4) | 77.9 (17.0) | 87.2 (18.4) | 85.8 (16.9) | 73.2 (23.4) |
Not documented in EHR | 1.9 (3.3) | 0.1 (0.1) | 0.1 (0.2) | 1.1 (1.0) | 0.4 (0.3) | 1.1 (0.7) |
Federal poverty level, % (SD) | ||||||
≤200% | 91.4 (4.9) | 91.9 (2.8) | 92.6 (3.2) | 76.9 (21.4) | 71.0 (24.4) | 78.2 (13.9) |
Not documented in EHR | 1.3 (1.1) | 0.6 (0.5) | 1.9 (1.7) | 10.7 (14.2) | 16.1 (21.5) | 10.1 (6.1) |
Insurance status, % (SD) | ||||||
Public | 44.0 (26.4) | 65.5 (8.2) | 37.1 (10.8) | 62.9 (12.3) | 51.5 (14.8) | 61.7 (8.9) |
Uninsured | 38.5 (28.1) | 20.1 (3.7) | 53.5 (14.9) | 19.2 (9.0) | 26.6 (15.1) | 20.5 (7.7) |
Patients screened for social risk, median (range) | 2 (0–751) | 57 (0–882) | 360 (1–563) | 23 (0–332) | 67 (10–311) | 430 (2–2,771) |
Patients with social risk referrals, median (range) | 1 (0–74) | 28 (0–135) | 19 (0–113) | 3 (0–22) | 13 (1–108) | 89 (7–1,072) |
Patients with diabetes, range | 8–944 | 187–3,984 | 22–829 | 343–3,618 | 352–923 | 495–2,981 |
EHR = electronic health record, SD = standard deviation.
*Participated in Accountable Health Communities program of the U.S. Centers for Medicare & Medicaid Services.
**The negative numbers in the range indicate that the clinic did not begin using EpicCare until 1 or 2 years after the baseline period. Source: The authors
We also accounted for whether the clinic was concurrently involved in the U.S. Centers for Medicare & Medicaid Services (CMS) Innovation Center’s Accountable Health Communities (AHC) Model,36 a large national demonstration project targeting implementation of social risk screening and navigation services. Participants involved in this demonstration received modest financial incentives but only minimal implementation support from CMS.
Statistical Analysis
Clinic-level outcomes were measured monthly from March 2018 through December 2021 (totaling 1,384 monthly time points across 31 clinics). Generalized linear mixed models (GLMMs) were used to assess the intervention effect by comparing outcomes during time periods in clinics that had versus had not yet participated in the intervention. GLMMs were used to account for a general time trend and to flexibly model the intervention effect over time postintervention. Negative binomial mixed-effects modeling was used to evaluate the primary outcomes; mixed-effects linear regression was used to evaluate secondary outcomes. Each GLMM fit flexible time effects by treating time as a categorical variable, included random effects for clinics, adjusted for baseline covariates, and used robust standard errors. Average differences are reported comparing the preintervention period versus: (1) the 6-month intervention period; and (2) the postintervention period. Rate ratios for the primary outcomes, rate differences for the secondary outcomes, and corresponding 95% confidence intervals (CIs) are reported. A more detailed description of the GLMM is provided in Exhibit 1 of the Appendix. Analyses were performed by using Stata 15 (StataCorp); hypothesis tests were two-sided with a type I error of 0.05.
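As an illustration, a simplified form of such a model for the screening outcome is written out below. This is a sketch based on the description above, not the exact specification (which appears in Exhibit 1 of the Appendix); in particular, the offset term and the representation of the intervention and postintervention effects as single average terms are our simplifying assumptions.

$$
\log \mathrm{E}\!\left[Y_{it}\mid u_i\right] \;=\; \log N_{it} \;+\; \beta_0 \;+\; \sum_{m}\gamma_m\,\mathbf{1}\{t=m\} \;+\; \theta_1\,\mathbf{1}\{\text{intervention}_{it}\} \;+\; \theta_2\,\mathbf{1}\{\text{postintervention}_{it}\} \;+\; \mathbf{x}_i^{\top}\boldsymbol{\beta} \;+\; u_i
$$

Here $Y_{it}$ is the number of patients with documented screening at clinic $i$ in study month $t$, $N_{it}$ is the number of patients seen, the $\gamma_m$ are categorical calendar-time effects, $\mathbf{x}_i$ denotes baseline covariates, $u_i \sim N(0,\sigma_u^2)$ is a clinic-level random intercept, and $Y_{it}\mid u_i$ is assumed to follow a negative binomial distribution; $\exp(\theta_1)$ and $\exp(\theta_2)$ correspond to the reported rate ratios for the intervention and postintervention periods relative to preintervention.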
Note on Protocol Deviation
The GLMM specifications written at the time of trial registration were based on the then-flagship article for analysis of stepped-wedge clinic-randomized trials.37 This approach used continuous variables for modeling the general time trend of the outcomes without intervention and estimating the added intervention effect that accrues over time after intervention initiation. This approach originally provided an estimate for the intervention effect that appears in the first time period (month) of intervention. Since trial registration, advances have been made in the analysis of such study designs that improve the ability to estimate intervention effects.38–40 We updated the GLMM specifications in these analyses to reflect these advances by using the more flexible, categorical time variables. In addition, we dropped the stand-alone estimator for intervention effect in the first month of intervention, as it is unlikely that the effect was immediate and constant; instead, this estimator was absorbed into the categorized estimator for added intervention effects. Ultimately, these modifications resulted in changes to our reported outcomes. Instead of reporting the intervention effect in the first month of intervention and the added intervention effect beyond the first month until study end, we report the average intervention effect for the 6 months of hands-on intervention and for the postintervention period. We only report estimates from the updated models in the main results. A comparison of estimates using both the original and updated models is provided in Exhibit 2 of the Appendix.
Results
Of 31 clinics enrolled in the study, five withdrew before intervention initiation due to various bandwidth-related issues (three after March 2020). Of the remaining 26 clinics, 7 fully completed all 5 implementation process steps, 12 fully or partially completed all steps (i.e., all steps were started, but not all were completed), and 7 fully or partially completed at least the first 3 steps, including 3 that completed the last step but skipped an intermediate step. All 31 recruited clinics were included in analyses for intention-to-treat assessments. Five clinics (including 2 of the 5 that withdrew from the study) did not have preintervention data from 6 months before the start of wedge 1 (due to activating their EHR after that date), but all 31 clinics had ≥6 months of data before the start of their wedge. Exhibit 3 in the Appendix presents months of observation and denominators for all outcomes.
Table 1 presents characteristics of study clinics and their patients according to wedge in the 6 months before a given wedge’s intervention began. Variation was seen across wedges in clinic patients’ median age, distribution according to race/ethnicity, primary language, poverty level, and insurance status; regression models adjusted for these variables. Notably, there was variability in the extent to which study clinics conducted social risk screening and referrals in the 6-month baseline period. Seven study clinics were involved in the CMS Innovation Center’s AHC initiative.
Figure 1 shows clinic screening rate patterns over the study period according to wedge. In March 2020, at month 24 of the study, the Covid-19 pandemic began to severely affect clinic operations; this time period coincides with when the intervention began for wedge 4. As shown in Figure 1, the study sites had very different preintervention rates of social risk screening. These screening rates rarely increased in a linear or sustained manner but instead varied substantially over time and by clinic site. Exhibit 4 of the Appendix presents the variation in social risk screening in all study clinics over the study period; of note, what may be a secular trend of increased social risk screening in months 1–23 appears to flatten when the pandemic began. Exhibit 5 of the Appendix presents patterns of social risk referral rates in each wedge’s clinics over time.
FIGURE 1. Patterns of Screening Rates over Time, According to Study Clinics Within Study Wedges.
Screening rates rarely increased in a linear manner and varied widely across study sites. Notes: The y-axis denotes the percentage of patients screened for social risk. The x-axis denotes the study period, from month 1 (March 2018) to month 45 (December 2021). The colored lines represent each of the 31 individual clinics. Wedge 1 has four clinics; wedges 2, 3, and 4 have five clinics; and wedges 5 and 6 have six clinics each. The solid vertical line denotes the start of the implementation for that wedge. The dashed vertical line at month 24 is March 2020, when the Covid-19 pandemic began affecting U.S. clinics.
Source: The authors
Social Risk Screening and Referral Outcomes
Results of adjusted regression analyses assessing intervention effects on social risk screening and referral rates are presented in Table 2.
Table 2.
Effects of Implementation Support on Social Risk Screening Rates and Referrals
Outcome | During 6-Month Intervention vs. Preintervention, RR (95% CI) | Postintervention vs. Preintervention, RR (95% CI)
---|---|---
All patients | ||
Social risk screening | **2.45 (1.32–4.39)** | 2.16 (0.64–7.27)
Social risk referral | 1.33 (0.73–2.43) | 0.89 (0.18–1.93) |
Documented need | 0.79 (0.46–1.36) | 0.56 (0.21–1.48) |
No documented need | 1.11 (0.60–2.04) | 0.40 (0.12–1.34) |
Social risk screening rates were significantly higher during the 6-month intervention period compared with the preintervention period. Estimates are added effects of the intervention from the updated adjusted model, derived by using mixed-effects negative binomial regression with random effects for clinic, adjusted for baseline characteristics. Note: Boldface data are considered statistically significant at P < .05. RR = rate ratio, CI = confidence interval. Source: The authors
The rate of social risk screening was 2.45 times (95% CI, 1.32–4.39) higher during the 6 intervention months compared with preintervention rates. This impact was not sustained in the postintervention period; although the effect size was similar in magnitude (rate ratio, 2.16; 95% CI, 0.64–7.27), it lacked statistical significance. No significant difference was seen in rates of social risk referrals during or postintervention, regardless of patients having documented social risks.
Diabetes Outcomes
In brief, analyses showed little intervention impact on diabetes outcomes (Table 3) in the intervention and postintervention periods compared with the preintervention period, with a few exceptions.
Table 3.
Association Between Implementation Support Intervention and Diabetes Biomarkers and Receipt of Recommended Biomarker Screening
Outcome | During 6-Month Intervention vs. Preintervention, Absolute Percent Change (95% CI) | Postintervention vs. Preintervention, Absolute Percent Change (95% CI)
---|---|---
All patients with diabetes | ||
HbA1c screen, % up to date | **−9.92 (−15.91 to −3.94)** | **−30.51 (−50.20 to −10.83)**
LDL screen, % up to date | **−10.02 (−15.75 to −4.30)** | **−33.20 (−54.01 to −12.39)**
BP, % controlled | −0.90 (−3.37 to 1.56) | 1.57 (−6.21 to 9.36) |
HbA1c, % controlled | −2.05 (−5.26 to 1.17) | −6.46 (−13.52 to 0.61) |
LDL, % controlled | 4.63 (−0.05 to 9.30) | 5.61 (−2.99 to 14.20) |
Patients with diabetes screened for social risks (subset) | ||
HbA1c screen, % up to date | **−7.50 (−14.51 to −0.49)** | −23.26 (−48.13 to 1.62)
LDL screen, % up to date | **−8.53 (−15.03 to −2.02)** | **−29.95 (−54.58 to −5.32)**
BP, % controlled | 1.35 (−1.96 to 4.65) | **11.26 (1.51 to 21.00)**
HbA1c, % controlled | −2.01 (−7.50 to 3.47) | −6.74 (−15.98 to 2.49)
LDL, % controlled | 4.19 (−1.86 to 10.25) | 7.33 (−2.61 to 17.27) |
Significant decreases in guideline-concordant diabetes care may reflect decreased in-person encounters resulting from the Covid-19 pandemic. Notes: Results are given for all patients with diagnosed diabetes on or before the first baseline month and for the subset of patients screened for social risk. Estimates are added effects associated with the intervention, derived by using mixed-effects linear regression with random effects for clinic, adjusted for baseline characteristics. Boldface data indicate statistical significance at P < .05. CI = confidence interval, HbA1c = hemoglobin A1c, LDL = low-density lipoprotein, BP = blood pressure. Source: The authors
In the intervention and postintervention periods, the monthly average of patients with up-to-date diabetes-related screenings declined significantly from the preintervention period among all patients with diabetes and the subset who were screened for social risks. (The percentage of those screened for social risks who had up-to-date HbA1c screening was not significantly lower postintervention but trended in the same direction.) Among patients with biomarker screenings in a given month, no patterns were seen in the percentage with controlled HbA1c or LDL. There was a significant increase in the percentage of patients with diabetes who had controlled BP in the postintervention period among those screened for social risks. To assess the influence of the Covid-19 pandemic on these outcomes, we ran these analyses stratified according to wedges occurring before and after March 2020. (Both sets of clinics had postintervention data from after March 2020, but all postintervention data from wedges 4–6 occurred in this period; results not shown.) Outcomes related to being up to date on diabetes-related screenings showed a larger effect size in the later three wedges, suggesting that practice changes made in response to the pandemic had influenced these outcomes; hypotheses about this pattern are given in the Discussion.
Discussion
During a 6-month tailored implementation support intervention, CHC clinics’ social risk screening rates were more than twice as high as screening rates during the preintervention period (P < .01). In the postintervention period, screening rates trended in the same direction as in the intervention period, but were not significantly different from preintervention rates. Rates of social service referrals did not increase regardless of patients’ social risk status. The intervention was associated with greater BP control among patients with diabetes, and lower receipt of recommended diabetes biomarker screening, in the postintervention period.
These results suggest that intensive, adaptive implementation support was effective at temporarily increasing social risk screening rates. The increase in screening seen during the intervention period may be lower than what would have occurred without the impact of the Covid-19 pandemic on clinics’ ability to adopt new practices, as discussed below. Furthermore, most study sites chose to test and iterate screening workflows in a subset of their patients, as suggested in step 3 (develop workflows) of the five-step implementation process. The changes seen in this study may, therefore, indicate success in meeting those goals, but as clinic-defined targets changed frequently, it was not possible to limit analyses to each clinic’s target population; this study limitation is discussed below. In addition, although this study measured rates of EHR-documented social risk screening and referrals, CHCs have been providing contextualized, person-centered care since their inception. It is possible that the implementation support furthered this work but did not always result in measurable screening/referral outcomes. Conversely, the increases in social risk screening may reflect a secular trend, as a growing national emphasis on such screening occurred during the study period. Others’ research concurs; a 2022 report,41 for example, suggests that screening rates are rising nationally. Policy changes may be relevant: in data from 2019,14 higher social risk screening rates were reported by clinics in states with Medicaid accountable care organizations. These factors may have influenced the results seen here.
Multiple components of the intervention — including staff training, using small tests of change, having a champion, leadership support, and flexibility in screening implementation — have been proven effective in other circumstances.41–45 This may explain the intervention’s initial impact. The fact that this result was not consistent across study sites aligns with prior research showing variable impacts of multifaceted interventions targeting practice change.46–49 Furthermore, although the intervention was designed based on results of a pilot study, limited prior evidence on social risk screening adoption, and evidence on methods for supporting practice change in general,41–45 it may not have adequately addressed key barriers to sustained screening/referral implementation in underresourced clinics. The timing of the reported effects was surprising, as we expected that screening rates would increase after rather than during the 6-month intervention period. This finding suggests that improvements may have occurred as a result of clinic staff engaging in implementation efforts with outside support, rather than because of specific intervention elements. Qualitative analyses now underway should help explain these findings.
Results also suggest the possibility that the 6-month intervention did not last long enough; the implementation science literature on effective maintenance of change adoption postimplementation support is nascent.50 However, it may be challenging for underresourced clinics to participate in such support activities over a longer period without resources to cover staff time spent on such efforts. Research is needed to explore whether incentive structures enhance the implementation of sustained social risk screening. (As noted above, 7 of the 31 study clinics concurrently participated in a national initiative through which they received modest reimbursement for conducting screening and referrals; our team is now analyzing the relative impacts of reimbursement versus hands-on support in these clinics.) It is critical to understand the resources that CHCs need to effectively implement and sustain social risk screening as state Medicaid policies begin to require documentation of such screening and related interventions. These requirements might motivate targeted clinics to focus on social risk screening, but the results presented in this article and by others14,22,51 suggest that without adequate support — both financial and coaching/technical assistance — underresourced clinics may struggle to meet these requirements.
Another potential driver of these results is the limited evidence on best practices for social risk efforts in clinical environments (e.g., who should be screened, how often, and with which screening instruments) or on the effectiveness of interventions meant to address identified social risks.16,17,20,52–54 It may be easier to implement and sustain new processes that have clear protocols and/or solid evidence on the expected health impacts of such processes. Research is needed to provide this evidence.
Although social risk screening increased significantly during the 6-month implementation support period, no such increase occurred in related referrals. Previous research suggests possible explanations. Standards for documenting formal referrals are evolving,55,56 and our methods may not have captured some referrals. For example, as documented referrals to a clinic’s social worker did not always specify whether that referral was for behavioral health or social service navigation support, they were not considered social risk referrals here unless a concurrent positive social risk screening was documented, as described in the Methods. In addition, some patients with identified social needs likely declined such referrals.57,58 Care team members might have provided patients with relevant information via written materials, without documenting this action in the EHR. It is also possible that clinics experienced challenges in connecting patients with social services, such as a lack of referral resources.52,59 Prior research found that care team members may be reluctant to screen for social risk factors if they feel unable to address identified needs;60,61 if increased social risk screening was not coupled with an increased ability to make referrals, it may have diminished enthusiasm for continuing screening postintervention. Finally, although the intervention was designed to help clinics start making such referrals, doing so may not have been a priority for every clinic, and/or the support provided may not have been enough to overcome the challenges to making these referrals described above.
We posit that the association seen between the intervention and decreased provision of guideline-concordant diabetes care is attributable to the Covid-19 pandemic. In the pandemic’s first year (our study months 24–35), in-person clinical encounters decreased dramatically, which certainly affected clinics’ ability to conduct HbA1c or LDL tests. There are several possible explanations for the significant and substantial postintervention improvement in BP control among patients with diabetes who were screened for social risks. It is possible that having social risks documented drove care teams to provide social service referrals (documented or not) to patients with reported risks or to make care plan adjustments for these patients to enhance their ability to follow recommended care. Alternatively, social risk documentation may have been used by clinics to prioritize which patients were contacted via outreach during the pandemic. Another possibility is that patients whose care involved titrating BP medications had additional encounters and thus more opportunities to receive social risk screening. However, it is also possible that CHCs focused outreach efforts on patients with the highest documented BP, who may also be those with social risks, in which case social risk documentation did not play a role.
The effect size for these diabetes-related outcomes was greater among clinics for which all data collection occurred in the post–Covid-19 pandemic period. This supports the proposed explanation that changes seen in diabetes-related screening up-to-date status and in rates of BP control reflect changes in clinic processes made in response to the pandemic. This influence likely extended to overall study outcomes because, given the stepped-wedge design, all study clinics had some follow-up data during the pandemic.
Limitations
All study results must be interpreted with caution given that the Covid-19 pandemic began midway through the trial and almost certainly affected outcomes. The pandemic dramatically increased financial insecurity among CHC patients and also heightened CHCs’ interest in social risk screening; thus, it may have both increased clinics’ motivation to conduct social risk screening and referrals and affected their ability to do so. It is clear that the pandemic profoundly disrupted primary care clinics’ capacity, workflows, staffing, and ability to implement non–pandemic-related practice changes. It also affected the capacity of social service organizations.
Several additional limitations must be considered in interpreting these findings, some of which were mentioned earlier. First, in this pragmatic trial, each study clinic targeted different groups of patients for their initial screening efforts, and it was not feasible to limit analyses to their target populations. Results, therefore, reflect clinic-wide rather than population-specific changes, even though only certain patient populations were targeted for screening. Second, some screenings and referrals may have been documented in EHR text notes and not captured in analyses. This would bias results toward the null, and because the intervention’s goal was to improve documentation in discrete data fields, this limitation is noted but is not a major concern. Third, some of the study clinics’ concurrent participation in the AHC initiative may have affected study results. As noted, we adjusted for this in analytic models, and analyses now underway are assessing the potential interplay between these projects. Last, recruitment bias may affect the generalizability of these findings. Clinics that agreed to take part in this study were motivated to implement social risk screening; many study sites had clearly attempted to implement such screening in the past and may have struggled to do so effectively. Study results should be interpreted as generalizable to clinics that are eager to implement or expand their social risk screening efforts.
Looking Ahead
Although social risk screening is increasingly emphasized by national health care leaders and payers, many primary care clinics face complex barriers to implementing social risk screening and related referral-making. Substantial and ongoing investment and support are needed to enable implementation of this practice change. This is especially important for safety-net CHCs given that social risk screening documentation is becoming a requirement in many state Medicaid programs. A publicly available implementation guide based on study findings may be a useful resource for clinics seeking to adopt or expand social risk screening and referral-making efforts.27
Supplementary Material
Acknowledgments
We express our deep gratitude to the OCHIN member clinics that participated in this study. We also thank the study’s expert advisory committee, whose guidance was invaluable: Laura Mae Baldwin, Deborah Cohen, Yuriko de la Cruz, Jennifer DeVoe, Arvin Garg, Nancy Gordon, Amber Haley, Danielle Marie Hessler-Jones, Christian Hill, Hilary Placzek, Bryon Powell, Michelle Proser, and Thomas J. Schuch.
Footnotes
Disclosures: Rachel Gold, Jorge Kaufmann, Erika K. Cottrell, Arwen Bunce, Christina R. Sheppler, Megan Hoopes, Molly Krancari, Laura M. Gottlieb, Meg Bowen, Julianne Bava, Ned Mossman, Nadia Yosuf, and Miguel Marino have nothing to disclose. The Kaiser Permanente Northwest Institutional Review Board (FWA #00002344, IRB #00000405) approved the study (Project 1394354) and continues to review study activities and monitor progress. All clinics in the study consented to participate. Study data and materials are available by request. The study was funded by the National Institute of Diabetes and Digestive and Kidney Diseases grant 1R18DK114701. Trial registration: clinicaltrials.gov, NCT03607617 (https://clinicaltrials.gov/ct2/show/NCT03607617; registration date: July 31, 2018 — retrospectively registered).
Appendix
Contributor Information
Rachel Gold, Lead Research Scientist, OCHIN, Portland, Oregon, USA; Senior Investigator, Kaiser Permanente Center for Health Research, Portland, Oregon, USA.
Jorge Kaufmann, Biostatistician, Department of Family Medicine, Oregon Health & Science University, Portland, Oregon, USA.
Erika K. Cottrell, Senior Investigator, OCHIN, Portland, Oregon, USA; Research Associate Professor, Department of Medical Informatics and Clinical Epidemiology, School of Medicine, Oregon Health & Science University, Portland, Oregon, USA.
Arwen Bunce, Qualitative Research Scientist, OCHIN, Portland, Oregon, USA.
Christina R. Sheppler, Research Associate III, Kaiser Permanente Center for Health Research, Portland, Oregon, USA.
Megan Hoopes, Manager of Research Analytics, OCHIN, Portland, Oregon, USA.
Molly Krancari, Research Associate, OCHIN, Portland, Oregon, USA.
Laura M. Gottlieb, Professor of Family and Community Medicine, School of Medicine, University of California San Francisco, San Francisco, California, USA.
Meg Bowen, Practice Coach, OCHIN, Portland, Oregon, USA.
Julianne Bava, Trainer, OCHIN, Portland, Oregon, USA.
Ned Mossman, Director of Social and Community Health, OCHIN, Portland, Oregon, USA.
Nadia Yosuf, Project Manager III, Fred Hutchinson Cancer Research Center, Seattle, Washington, USA.
Miguel Marino, Assistant Professor, Department of Family Medicine, Oregon Health & Science University, Portland, Oregon, USA.
References
- 1. Patel MR. Social determinants of poor management of type 2 diabetes among the insured. Curr Diab Rep 2020;20:67. https://doi.org/10.1007/s11892-020-01354-4
- 2. Lee W, Lloyd JT, Giuriceo K, Day T, Shrank W, Rajkumar R. Systematic review and meta-analysis of patient race/ethnicity, socioeconomics, and quality for adult type 2 diabetes. Health Serv Res 2020;55:741–72. https://doi.org/10.1111/1475-6773.13326
- 3. Hill-Briggs F, Adler NE, Berkowitz SA, et al. Social determinants of health and diabetes: a scientific review. Diabetes Care 2020;44:258–79. https://doi.org/10.2337/dci20-0053
- 4. Gibson DM. Food insecurity, eye care receipt, and diabetic retinopathy among US adults with diabetes: implications for primary care. J Gen Intern Med 2019;34:1700–2. https://doi.org/10.1007/s11606-019-04992-x
- 5. Solomon EM, Wing H, Steiner JF, Gottlieb LM. Impact of transportation interventions on health care outcomes: a systematic review. Med Care 2020;58:384–91. https://doi.org/10.1097/MLR.0000000000001292
- 6. Walker RJ, Williams JS, Egede LE. Pathways between food insecurity and glycaemic control in individuals with type 2 diabetes. Public Health Nutr 2018;21:3237–44. https://doi.org/10.1017/S1368980018001908
- 7. Berkowitz SA, Baggett TP, Wexler DJ, Huskey KW, Wee CC. Food insecurity and metabolic control among U.S. adults with diabetes. Diabetes Care 2013;36:3093–9. https://doi.org/10.2337/dc13-0570
- 8. Berkowitz SA, Karter AJ, Corbie-Smith G, et al. Food insecurity, food “deserts,” and glycemic control in patients with diabetes: a longitudinal analysis. Diabetes Care 2018;41:1188–95. https://doi.org/10.2337/dc17-1981
- 9. Cottrell EK, O’Malley JP, Dambrun K, et al. The impact of social and clinical complexity on diabetes control measures. J Am Board Fam Med 2020;33:600–10. https://doi.org/10.3122/jabfm.2020.04.190367
- 10. Council on Community Pediatrics. Poverty and child health in the United States. Pediatrics 2016;137:e20160339. https://doi.org/10.1542/peds.2016-0339
- 11. American Academy of Family Physicians. Social Determinants of Health: Family Physicians’ Leadership Role. 2019. Accessed September 14, 2021. https://www.aafp.org/pubs/afp/issues/2019/0415/p476.html
- 12. Agency for Healthcare Research and Quality. About the National Quality Strategy. 2019. Accessed July 19, 2020. https://www.ahrq.gov/workingforquality/about/index.html
- 13. Wyatt R, Laderman M, Botwinick L, Mate K, Whittington J. Achieving health equity: a guide for health care organizations. Cambridge, MA: Institute for Healthcare Improvement, 2016. https://www.ihi.org/resources/Pages/IHIWhitePapers/Achieving-Health-Equity.aspx
- 14. Cole MB, Nguyen KH, Byhoff E, Murray GF. Screening for social risk at federally qualified health centers: a national study. Am J Prev Med 2022;62:670–8. https://doi.org/10.1016/j.amepre.2021.11.008
- 15. Fraze TK, Brewster AL, Lewis VA, Beidler LB, Murray GF, Colla CH. Prevalence of screening for food insecurity, housing instability, utility needs, transportation needs, and interpersonal violence by US physician practices and hospitals. JAMA Netw Open 2019;2:e1911514. https://doi.org/10.1001/jamanetworkopen.2019.11514
- 16. Gottlieb LM, Wing H, Adler NE. A systematic review of interventions on patients’ social and economic needs. Am J Prev Med 2017;53:719–29. https://doi.org/10.1016/j.amepre.2017.05.011
- 17. De Marchis EH, Torres JM, Benesch T, et al. Interventions addressing food insecurity in health care settings: a systematic review. Ann Fam Med 2019;17:436–47. https://doi.org/10.1370/afm.2412
- 18. Cottrell EK, Dambrun K, Cowburn S, et al. Variation in electronic health record documentation of social determinants of health across a national network of community health centers. Am J Prev Med 2019;57(Suppl 1):S65–73. https://doi.org/10.1016/j.amepre.2019.07.014
- 19. Cartier Y, Gottlieb L. The prevalence of social care in US health care settings depends on how and whom you ask. BMC Health Serv Res 2020;20:481. https://doi.org/10.1186/s12913-020-05338-8
- 20. National Academies of Sciences, Engineering, and Medicine. Integrating social care into the delivery of health care: moving upstream to improve the nation’s health. Washington, DC: National Academies Press, 2019. Accessed February 13, 2023. https://www.nationalacademies.org/our-work/integrating-social-needs-care-into-the-delivery-of-health-care-to-improve-the-nations-health
- 21. Davidson KW, McGinn T. Screening for social determinants of health: the known and unknown. JAMA 2019;322:1037–8. https://doi.org/10.1001/jama.2019.10915
- 22. Palakshappa D, Scheerer M, Semelka CT, Foley KL. Screening for social determinants of health in free and charitable clinics in North Carolina. J Health Care Poor Underserved 2020;31:382–97. https://doi.org/10.1353/hpu.2020.0029
- 23. Murray GF, Rodriguez HP, Lewis VA. Upstream with a small paddle: how ACOs are working against the current to meet patients’ social needs. Health Aff (Millwood) 2020;39:199–206. https://doi.org/10.1377/hlthaff.2019.01266
- 24. Gold R, Bunce A, Cottrell E, et al. Study protocol: a pragmatic, stepped-wedge trial of tailored support for implementing social determinants of health documentation/action in community health centers, with realist evaluation. Implement Sci 2019;14:9. https://doi.org/10.1186/s13012-019-0855-9
- 25. Gold R, Bunce A, Cowburn S, et al. Adoption of social determinants of health EHR tools by community health centers. Ann Fam Med 2018;16:399–407. https://doi.org/10.1370/afm.2275
- 26. Gold R, Cottrell E, Bunce A, et al. Developing electronic health record (EHR) strategies related to health center patients’ social determinants of health. J Am Board Fam Med 2017;30:428–47. https://doi.org/10.3122/jabfm.2017.04.170046
- 27. Social Interventions Research & Evaluation Network (SIREN). Guide to Implementing Social Risk Screening and Referral-Making. 2022. Accessed October 30, 2022. https://sirenetwork.ucsf.edu/guidei-mplementing-social-risk-screening-and-referral-making
- 28. Powell BJ, Beidas RS, Lewis CC, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res 2017;44:177–94. https://doi.org/10.1007/s11414-015-9475-6
- 29. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci 2015;10:21. https://doi.org/10.1186/s13012-015-0209-1
- 30. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q 2004;82:581–629. https://doi.org/10.1111/j.0887-378X.2004.00325.x
- 31. Taylor EF, Machta RM, Meyers DS, Genevro J, Peikes DN. Enhancing the primary care team to provide redesigned care: the roles of practice facilitators and care managers. Ann Fam Med 2013;11:80–3. https://doi.org/10.1370/afm.1462
- 32. Scott VC, Jillani Z, Malpert A, Kolodny-Goetz J, Wandersman A. A scoping review of the evaluation and effectiveness of technical assistance. Implement Sci Commun 2022;3:70. https://doi.org/10.1186/s43058-022-00314-1
- 33. Baker R, Camosso-Stefinovic J, Gillies C, et al. Tailored interventions to address determinants of practice. Cochrane Database Syst Rev 2015;2015:CD005470. https://doi.org/10.1002/14651858.CD005470.pub3
- 34. Wensing M. The Tailored Implementation in Chronic Diseases (TICD) project: introduction and main findings. Implement Sci 2017;12:5. https://doi.org/10.1186/s13012-016-0536-x
- 35. Paina L, Peters DH. Understanding pathways for scaling up health services through the lens of complex adaptive systems. Health Policy Plan 2012;27:365–73. https://doi.org/10.1093/heapol/czr054
- 36. U.S. Centers for Medicare & Medicaid Services. Accountable Health Communities Model. 2022. Updated January 26, 2023. Accessed October 30, 2022. https://innovation.cms.gov/innovation-models/ahcm
- 37. Hussey MA, Hughes JP. Design and analysis of stepped wedge cluster randomized trials. Contemp Clin Trials 2007;28:182–91. https://doi.org/10.1016/j.cct.2006.05.007
- 38. Hemming K, Taljaard M, Forbes A. Analysis of cluster randomised stepped wedge trials with repeated cross-sectional samples. Trials 2017;18:101. https://doi.org/10.1186/s13063-017-1833-7
- 39. Nickless A, Voysey M, Geddes J, Yu LM, Fanshawe TR. Mixed effects approach to the analysis of the stepped wedge cluster randomised trial — investigating the confounding effect of time through simulation. PLoS One 2018;13:e0208876. https://doi.org/10.1371/journal.pone.0208876
- 40. Voldal EC, Xia F, Kenny A, Heagerty PJ, Hughes JP. Random effect misspecification in stepped wedge designs. Clin Trials 2022;19:380–3. https://doi.org/10.1177/17407745221084702
- 41. De Marchis E, Brown E, Aceves B, et al. State of the Science of Screening in Healthcare Settings. San Francisco: Social Interventions Research & Evaluation Network, Summer 2022. Accessed October 30, 2022. https://sirenetwork.ucsf.edu/sites/default/files/2022-06/final%20SCREEN%20State-of-Science-Report%5B55%5D.pdf
- 42. Greenwood-Ericksen M, DeJonckheere M, Syed F, Choudhury N, Cohen AJ, Tipirneni R. Implementation of health-related social needs screening at Michigan Health Centers: a qualitative study. Ann Fam Med 2021;19:310–7. https://doi.org/10.1370/afm.2690
- 43. Morgenlander MA, Tyrrell H, Garfunkel LC, Serwint JR, Steiner MJ, Schilling S. Screening for social determinants of health in pediatric resident continuity clinic. Acad Pediatr 2019;19:868–74. https://doi.org/10.1016/j.acap.2019.02.008
- 44. Theiss J, Regenstein M. Facing the need: screening practices for the social determinants of health. J Law Med Ethics 2017;45:431–41. https://doi.org/10.1177/1073110517737543
- 45. Gruß I, Bunce A, Davis J, Dambrun K, Cottrell E, Gold R. Initiating and implementing social determinants of health data collection in community health centers. Popul Health Manag 2021;24:52–8. https://doi.org/10.1089/pop.2019.0205
- 46. Willis TA, Collinson M, Glidewell L, et al. An adaptable implementation package targeting evidence-based indicators in primary care: a pragmatic cluster-randomised evaluation. PLoS Med 2020;17:e1003045. https://doi.org/10.1371/journal.pmed.1003045
- 47. Glidewell L, Hunter C, Ward V, et al.; ASPIRE programme team. Explaining variable effects of an adaptable implementation package to promote evidence-based practice in primary care: a longitudinal process evaluation. Implement Sci 2022;17:9. https://doi.org/10.1186/s13012-021-01166-4
- 48. Foy R, Willis T, Glidewell L, et al. Developing and evaluating packages to support implementation of quality indicators in general practice: the ASPIRE research programme, including two cluster RCTs. Programme Grants Appl Res 2020;8(4). https://doi.org/10.3310/pgfar08040
- 49. Presseau J, Mackintosh J, Hawthorne G, et al. Cluster randomised controlled trial of a theory-based multiple behaviour change intervention aimed at healthcare professionals to improve their management of type 2 diabetes in primary care. Implement Sci 2018;13:65. https://doi.org/10.1186/s13012-018-0754-5
- 50. Hailemariam M, Bustos T, Montgomery B, Barajas R, Evans LB, Drahota A. Evidence-based intervention sustainability strategies: a systematic review. Implement Sci 2019;14:57. https://doi.org/10.1186/s13012-019-0910-6
- 51. Kreuter MW, Thompson T, McQueen A, Garg R. Addressing social needs in health care settings: evidence, challenges, and opportunities for public health. Annu Rev Public Health 2021;42:329–44. https://doi.org/10.1146/annurev-publhealth-090419-102204
- 52. Eder M, Henninger M, Durbin S, et al. Screening and interventions for social risk factors: technical brief to support the US Preventive Services Task Force. JAMA 2021;326:1416–28. https://doi.org/10.1001/jama.2021.12825
- 53. Hessler D, Bowyer V, Gold R, Shields-Zeeman L, Cottrell E, Gottlieb LM. Bringing social context into diabetes care: intervening on social risks versus providing contextualized care. Curr Diab Rep 2019;19:30. https://doi.org/10.1007/s11892-019-1149-y
- 54. Bibbins-Domingo K. Integrating social care into the delivery of health care. JAMA 2019;322:1763–4. https://doi.org/10.1001/jama.2019.15603
- 55. Open Referral. 2022. Accessed November 15, 2022. https://openreferral.org/
- 56. HL7 International. SDOH Clinical Care. HL7 Implementation Guide. 2022. Accessed October 20, 2022. https://build.fhir.org/ig/HL7/fhir-sdoh-clinicalcare/index.html
- 57. Fichtenberg CM, De Marchis EH, Gottlieb LM. Understanding patients’ interest in healthcare-based social assistance programs. Am J Prev Med 2022;63(Suppl 2):S109–15. https://doi.org/10.1016/j.amepre.2022.04.026
- 58. De Marchis EH, Alderwick H, Gottlieb LM. Do patients want help addressing social risks? J Am Board Fam Med 2020;33:170–5. https://doi.org/10.3122/jabfm.2020.02.190309
- 59. Steeves-Reece AL, Totten AM, Broadwell KD, Richardson DM, Nicolaidis C, Davis MM. Social needs resource connections: a systematic review of barriers, facilitators, and evaluation. Am J Prev Med 2022;62:e303–15. https://doi.org/10.1016/j.amepre.2021.12.002
- 60. Garg A, Boynton-Jarrett R, Dworkin PH. Avoiding the unintended consequences of screening for social determinants of health. JAMA 2016;316:813–4. https://doi.org/10.1001/jama.2016.9282
- 61. Garg A, Homer CJ, Dworkin PH. Addressing social determinants of health: challenges and opportunities in a value-based model. Pediatrics 2019;143:e20182355. https://doi.org/10.1542/peds.2018-2355