Author manuscript; available in PMC 2024 Jan 22. Published in final edited form as: J Med Syst. 2023 Jul 22;47(1):78. doi: 10.1007/s10916-023-01975-8

Non-response bias in social risk factor screening among adult emergency department patients

Joshua R Vest 1, Olena Mazurenko 2,*
PMCID: PMC10439727  NIHMSID: NIHMS1922222  PMID: 37480515

Abstract

Healthcare organizations increasingly use screening questionnaires to assess patients’ social factors, but non-response may contribute to selection bias. This study assessed differences between respondents and those refusing participation in a social factor screening. We used a cross-sectional approach with logistic regression models to measure the association between subject characteristics and social factor screening questionnaire participation. The study subjects were patients from a mid-western state safety-net hospital’s emergency department. Subjects’ inclusion criteria were: 1) ≥ 18 years old, 2) spoke English or Spanish, and 3) able to complete a self-administered questionnaire. We classified subjects who consented and answered the screening questionnaire in full as respondents. All others were non-respondents. We linked all subjects’ participation status to demographic characteristics, clinical data, an area-level deprivation measure, and social risk factors extracted from clinical notes using natural language processing. We found that nearly 6 out of every 10 subjects approached (59.9%) consented and completed the questionnaire. Subjects with prior documentation of financial insecurity were 22% less likely to respond to the screening questionnaire (marginal effect = −22.40; 95% confidence interval (CI) = −41.16, −3.63; p = 0.019). No other factors were significantly associated with response. This study uniquely contributes to the growing social determinants of health literature by confirming that selection bias may exist within social factor screening practices and research studies.

Keywords: Emergency department, social determinants of health, bias, surveys

Introduction

Social factors encompass patients’ numerous nonclinical, economic, contextual, and psychosocial characteristics1–3. These factors are important drivers of morbidity, mortality, disparities, healthcare utilization, and costs4,5. Consequently, healthcare organizations are increasingly screening patients to ascertain the presence of multiple social factors, mostly with questionnaires6,7. Available screening questionnaires include those developed by healthcare organizations8, collaborative organizations9,10, payers11, or electronic health record (EHR) vendors12. Questionnaire-based social factor screening has the advantage of flexible deployment: clinical staff can ask for patient responses, patients can self-complete questionnaires on paper, or patients can complete them electronically through EHR patient portals13. Moreover, once collected, social factor screening results can guide referrals to social services14 or inform population health management activities15.

While healthcare organizations, payers, and researchers may want patient-level social factor data, such screening is not risk-free16. Patients have multiple reasons for not completing social factor screening questionnaires. For example, patients may fear that disclosing social risks could change the nature of their treatment, lead to stigmatization, or result in inappropriate disclosure to other parties17,18. Additionally, patients may not view social factors as relevant to healthcare and therefore opt not to answer screening questions19. Importantly, patient views of the appropriateness of social factor screening may vary by race and ethnicity20. Concerns about social risk screening may be particularly salient among underrepresented groups with a long history of mistreatment by healthcare organizations21. Thus, some patients may be less likely to complete screening questionnaires and become non-respondents, contributing to selection bias in the data collection process22.

The longstanding epidemiological assumption is that respondents to questionnaires are likely different from non-respondents23. Differences due to non-response bias may affect care delivery and population health management initiatives. First, non-response may prevent social factor screening interventions from effectively identifying and referring all patients with social needs. Patients may refuse screening out of a desire not to disclose a sensitive topic24, and thus will not be offered services or referrals designed to address their social needs. Second, incomplete identification of social needs can undermine population-based risk stratification modeling25. If such selection biases are left unaccounted for, risk stratification approaches may exacerbate or create further inequities in healthcare delivery26. Finally, non-response can produce inaccurate estimates of the prevalence of social needs among patients, complicating needs assessment planning for hospitals and health systems.

In light of the potential challenges posed by non-response and the lack of quantifiable information on those refusing social factor screening, this study sought to assess differences in patient characteristics between respondents and those refusing participation in a social factor screening study conducted in the emergency department (ED) of one safety-net hospital located in a mid-western state. Identifying screening response rates and potential differences between respondents and non-respondents will help set organizational expectations for the effectiveness of social screening programs and identify risks of bias in using social factor screening data for analyses.

METHODS

Within a larger survey study on social risk factors, we nested a cross-sectional study comparing adult ED patients who responded to, or refused to complete, a social factor screening questionnaire. This analysis is part of a larger project aimed at developing risk stratification models in the ED.

Setting & subjects

We recruited adults who sought care at Eskenazi Health’s ED between July and November 2021. Eskenazi Health is a 315-bed safety-net hospital in a mid-western state. Potential subjects were approached for social factor screening if they met the following criteria: 1) ≥ 18 years old, 2) spoke English or Spanish, and 3) were able to complete a self-administered social factor screening questionnaire. Patients with altered mental states, trauma, or acute behavioral health issues were excluded. We did not recruit confirmed or suspected COVID-19 patients out of safety concerns. We approached 429 eligible ED patients for participation. The sample size was not pre-determined but was based on the pragmatic choice to evaluate potential response bias within the first 6 months of data collection.

Recruitment

During day, night, and weekend shifts, trained data collection staff reviewed the ED’s EHR to identify eligible study subjects seeking care in the ED. Research staff then invited all eligible subjects to complete a survey about ED patients’ social risks, explaining that the study’s purpose was to better understand the occurrence of social risks and to help healthcare organizations implement more effective referral services. Recruitment occurred within patient rooms or pre-discharge waiting areas. All subjects were offered a $5 gift card as an incentive for completing the survey. The social factor screening questionnaire was self-administered on a tablet27,28. The Indiana University IRB approved this study (2011558232).

Participation measures

Subjects that consented to study inclusion and answered the social factor screening questionnaire in full were classified as respondents. Non-respondents included subjects who declined participation and those who verbally agreed to participate in the study but subsequently declined to sign consent forms or failed to complete the questionnaire after consent.

Data & covariates

We linked participation status to demographic and clinical structured data from the safety-net hospital’s EHR for all subjects. These data included subjects’ age, race/ethnicity, gender, preferred language, date and time of visit, insurance status, and discharge diagnosis. We grouped primary discharge diagnoses using AHRQ’s Clinical Classification Software29. The NYU-ED algorithm identified non-emergency visits30.

We combined two additional data sources to comprehensively assess differences in social factors between respondents and non-respondents. First, we characterized the subjects’ zip code of residence using a modified Townsend Index31, an overall summary measure of socioeconomic disadvantage. We categorized this continuous measure into quartiles. Second, we identified subjects with any mention of housing instability, income insecurity, or unemployment in clinical textual documents (e.g., clinical notes and reports) in the prior 12 months using natural language processing (NLP). The development and validation of the NLP algorithm, which included notes from ED visits, inpatient admissions, and outpatient visits, have been reported previously32.
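To illustrate how these pieces could fit together, the sketch below categorizes a continuous area-level deprivation measure into quartiles and derives crude keyword-based flags from note text. It is a minimal example under assumed data structures: the column names (subject_id, townsend_index, note_text) and the keyword lists are hypothetical, and the keyword matching is only a stand-in for, not a reproduction of, the validated NLP algorithm described previously32.

```python
# Minimal sketch (not the study's actual pipeline): derive area-level
# deprivation quartiles and simple keyword-based social-factor flags.
# Column names (subject_id, townsend_index, note_text) are hypothetical.
import pandas as pd

subjects = pd.DataFrame({
    "subject_id": [1, 2, 3, 4],
    "townsend_index": [-2.1, 0.4, 1.8, 3.5],   # modified Townsend Index by zip code
    "note_text": [
        "patient reports trouble paying rent",
        "no social concerns documented",
        "currently unemployed, seeking work",
        "worried about money for medications",
    ],
})

# Categorize the continuous deprivation measure into quartiles (1 = least disadvantaged).
subjects["deprivation_quartile"] = pd.qcut(
    subjects["townsend_index"], q=4, labels=[1, 2, 3, 4]
)

# Crude keyword flags as a stand-in for the validated NLP algorithm (reference 32).
keywords = {
    "housing_instability": ["eviction", "homeless", "trouble paying rent"],
    "financial_insecurity": ["money for", "cannot afford", "financial"],
    "unemployment": ["unemployed", "lost job"],
}
for factor, terms in keywords.items():
    subjects[factor] = subjects["note_text"].str.lower().apply(
        lambda text: any(term in text for term in terms)
    )

print(subjects[["subject_id", "deprivation_quartile"] + list(keywords)])
```

In the actual study, the quartiled deprivation measure and the NLP-derived indicators were then linked to each subject’s participation status for analysis.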

Analyses

We described the sample using percentages. We compared subjects’ characteristics by participation status using Chi-square tests. We measured the association between subject characteristics and social factor screening questionnaire participation using logistic regression models controlling for all covariates. To facilitate interpretation, we expressed regression coefficients as marginal effects, i.e., the changes in the predicted probability of being a respondent for each characteristic.
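The following sketch mirrors this analytic sequence with assumed variable names (respondent, financial_insecurity, and so on) and an abbreviated covariate list; it is not the study’s code, but it shows how chi-square comparisons and average marginal effects from a logistic regression could be produced with standard Python tooling.

```python
# Sketch of the analytic approach (hypothetical column names, not the study code):
# chi-square tests by participation status, then a logistic regression whose
# coefficients are expressed as average marginal effects.
import pandas as pd
from scipy.stats import chi2_contingency
import statsmodels.formula.api as smf

df = pd.read_csv("screening_subjects.csv")  # hypothetical analytic file

# Bivariate comparison: e.g., financial insecurity documentation by response status.
table = pd.crosstab(df["financial_insecurity"], df["respondent"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"Chi-square p-value: {p_value:.3f}")

# Logistic regression for responding, adjusted for covariates (abbreviated here).
model = smf.logit(
    "respondent ~ C(gender) + C(age_group) + C(race_ethnicity) + C(insurance)"
    " + financial_insecurity + housing_instability + unemployed",
    data=df,
).fit()

# Average marginal effects: the change in the predicted probability of responding
# per characteristic; multiply by 100 to report percentage points as in Table 2.
marginal_effects = model.get_margeff(at="overall", method="dydx")
print(marginal_effects.summary())
```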

RESULTS

Reflective of an ED patient population at a safety-net hospital, the overall sample was diverse (Table 1). More than half of approached subjects were female (59.2%) and covered by Medicaid (56.4%). Subjects aged 45–64 (38.7%) and Black non-Hispanic subjects (45.2%) were the largest age and race/ethnicity groups, respectively. Most subjects were English-speaking (80.4%). The most common diagnoses were related to pain and headache. Nearly 6 out of every 10 subjects approached (59.9%) consented and completed the social factor screening questionnaire. Subjects took, on average, approximately eight minutes (mean = 8.1, SD = 5.2) to complete the questionnaire.

Table 1.

Comparison of subjects approached for a social risk factor screening questionnaire by response status.

Demographics Respondents (N=257; 59.9%) % Non-respondents (N=172; 40.1%) % Total (N=429) N (%) P-value

Gender 0.867
 Male 40.5 41.3 175 (40.8)
 Female 59.5 58.7 254 (59.2)
Age 0.080
 18–34 31.5 32.0 136 (31.7)
 35–44 21.8 19.8 90 (21.0)
 45–64 40.9 35.5 166 (38.7)
 >65 5.8 12.8 37 (8.6)
Race and ethnicity 0.503
 White non-Hispanic 28.4 26.2 118 (27.5)
 Black non-Hispanic 42.4 49.4 194 (45.2)
 Hispanic 26.5 21.5 105 (24.5)
 Other / unknown 2.7 2.9 12 (2.8)
Preferred language
 English 79.4 82.0 345 (80.4) 0.506
 Other than English 20.6 18.0 84 (19.6)
Insurance status 0.320
 Commercial 12.8 12.8 55 (12.8)
 Medicare 15.2 29.3 75 (17.5)
 Medicaid 56.4 56.4 242 (56.4)
 Self-pay 14.4 8.7 52 (12.1)
 Other/Unknown 1.2 1.2 5 (1.2)
Discharge diagnosis 0.707
 Pain & headache1 35.4 39.0 158 (36.8)
 Injury 12.1 13.4 54 (12.6)
 Urinary tract infection 5.1 3.5 19 (4.4)
 Nausea & vomiting 3.5 1.7 12 (2.8)
 Respiratory illness & conditions 7.4 5.2 28 (6.5)
 Other 36.6 37.2 158 (36.8)
Non-emergency condition 19.8 21.5 88 (20.5) 0.675
Time of visit 0.229
 AM 20.6 25.6 97 (22.6)
 PM 79.4 74.4 332 (77.4)
Day of visit 0.447
 Weekend 16.3 19.2 75 (17.5)
 Weekday 83.7 80.8 354 (82.5)
Area level deprivation
 1 (least disadvantaged) 28.2 27.8 119 (28.1) 0.979
 2 23.1 21.9 96 (22.6)
 3 26.7 28.4 116 (27.4)
 4 (most disadvantaged) 22.0 21.9 93 (21.9)
Documented in clinical notes 2
Housing instability 6.2 4.7 24 (5.6) 0.487
Financial insecurity 4.3 9.9 28 (6.5) 0.021
Unemployed 16.0 22.7 80 (18.7) 0.080
1 Includes: abdominal, musculoskeletal, spondylopathies & spondyloarthropathy, chest pain, headache
2 In the past 12 months

Subject participation in the social factor screening questionnaire did not vary significantly by structured data elements in the EHR, such as age, gender, race/ethnicity, payer status, preferred language, discharge diagnosis, timing of the visit, and whether the visit was for a non-emergency condition (see Table 1). Likewise, the area-level social determinants of health measure did not vary significantly by participation status. However, financial insecurity documentation in clinical notes did vary by participation: 9.9% of non-respondents had such documentation in the past 12 months, compared with 4.3% of those who completed the screening questionnaire.

In the fully adjusted model (Table 2), subjects with prior documentation of financial insecurity were 23% less likely to respond to the social factor screening questionnaire than those without (marginal effect = −22.90; 95% confidence interval = −41.76, −4.05; p = 0.017). No other factors were significantly associated with response.

Table 2.

Association between patient characteristics and nonresponse to social factor screening (N=424 ED patients)1

Demographics Non-response2 (Marginal effect; 95% Confidence Interval) P-value

Gender
 Male Reference
 Female 3.64 (-6.15, 13.42) 0.467
Age
 18–34 Reference
 35–44 7.67 (−5.48, 20.82) 0.253
 45–64 9.50 (−2.2, 21.2) 0.112
 >65 −13.77 (−35.36, 7.83) 0.211
Race and ethnicity
 White non-Hispanic Reference
 Black non-Hispanic −3.43 (−14.89, 8.04) 0.558
 Hispanic 6.51 (−14.6, 27.62) 0.546
 Other / unknown 1.99 (−26.91, 30.9) 0.893
Preferred language
 English Reference
 Other than English −5.83 (−27.73, 16.06) 0.602
Insurance status
 Commercial Reference
 Medicare 3.68 (−15.88, 23.24) 0.712
 Medicaid 3.26 (−11.78, 18.29) 0.671
 Self-pay 14.89 (−4.12, 33.91) 0.125
 Other/Unknown 7.48 (−35.25, 50.21) 0.732
Primary diagnosis
 Pain & headache3 −2.38 (−13.12, 8.36) 0.664
 Injury −2.52 (−17.94, 12.89) 0.748
 Urinary tract infection 7.54 (−14.79, 29.88) 0.508
 Nausea & vomiting 16.74 (−8.44, 41.92) 0.193
 Respiratory illness & conditions 5.20 (−14.41, 24.81) 0.603
 All others Reference
Non-emergency condition −5.06 (−17.66, 7.53) 0.431
Time of visit
 AM Reference
 PM 9.58 (−1.91, 21.07) 0.102
Day of visit
 Weekday Reference
 Weekend −6.37 (−18.87, 6.13) 0.318
Area level deprivation 4
 1 (least disadvantaged) Reference
 2 3.82 (−9.26, 16.9) 0.567
 3 −2.70 (−15.66, 10.26) 0.683
 4 (most disadvantaged) 1.88 (−11.38, 15.14) 0.781
Documented in clinical notes 5
Housing instability 14.47 (−3.72, 32.66) 0.119
Financial insecurity −22.90 (−41.76, −4.05) 0.017
Unemployed −8.90 (−21.77, 3.97) 0.175
1 Reference category is responded to survey
2 Adjusted for gender, age, race & ethnicity, preferred language, and insurance type
3 Includes: abdominal, musculoskeletal, spondylopathies & spondyloarthropathy, chest pain, headache
4 Subject zip code
5 In the past 12 months

DISCUSSION

In an adult ED patient population, nearly 6 out of 10 approached subjects responded to a social factor screening questionnaire, a response rate typical of ED social risk screening efforts17,21. While respondents and non-respondents were similar in demographic characteristics, they differed in their history of documented financial insecurity. Our finding that respondents and non-respondents differ by a risk factor is expected, as it is well established in the literature that reporting social needs is not risk-free for patients16. This study uniquely contributes to the growing social determinants of health literature by confirming that selection bias may exist within social factor screening practices and research studies.

A critical concern from both the practice and research perspectives is that patients who refuse screening participation may be at higher risk. We observed that those with prior documented financial insecurity were less likely to respond to the social factor screening questionnaire. As financial insecurity is a critical determinant of health and an underlying driver of other social risks33–35, this finding indicates a potentially significant difference between respondents and non-respondents. A history of documented financial insecurity may reflect more acute social needs, because a provider considered it worth documenting. Consistent with our findings, research among pediatric caregivers reported that those who declined social screening were potentially at higher social risk36.

We can speculate about potential reasons for the observed differences in documented financial insecurity. Screening questionnaires and clinical documentation represent two very different data collection methods and contexts, which may drive the observed differences. For example, sharing personal social information may be easier with a healthcare provider with an assumed or historical level of trust19,37. Such trust might explain why a patient who divulged financial difficulties in the past would not respond to a research survey. Additionally, our questionnaire administration was not part of the clinical care process. Patients may have refused to fill out our survey because they did not see how disclosing social information was relevant or potentially beneficial. Organizations may face similar non-response challenges if employing modalities outside the patient encounter, such as sending social factor screening questionnaires through patient portals or personal health records8,38.

Critically, the differences in prior financial insecurity documentation stand in stark contrast to the absence of any significant difference in measured and observable characteristics, including the area-level social determinants of health measure. Checking for differences in response rates across patient demographics or area-level social determinants of health measures may therefore not be sufficient to assess the presence of bias in social factor screening. Thus, referral programs to social service interventions and population health analytics efforts should consider alternative approaches for dealing with unscreened patients. Additionally, the absence of differences in common structured data elements may hinder analytic approaches to addressing missing information. For example, a recent review on EHR-based social factor data quality recommended leveraging imputation methods to deal with missingness39. While imputation can be achieved using numerous techniques, these findings indicate that such efforts may require additional data sources or advanced information extraction methods like NLP.
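As a rough illustration of the imputation idea discussed above, the sketch below applies scikit-learn’s IterativeImputer to a toy set of structured fields with hypothetical column names; it is not an approach used in this study, and, consistent with the findings here, imputing from structured data alone may miss differences that only appear in clinical notes.

```python
# Illustrative only: impute missing social-factor indicators from structured fields.
# Column names are hypothetical; this does not imply that structured data alone
# suffice, since note-derived (NLP) features may be needed as predictors.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

df = pd.DataFrame({
    "age": [24, 51, 67, 38, 45],
    "medicaid": [1, 1, 0, 1, 0],
    "deprivation_quartile": [3, 4, 2, 1, 3],
    "financial_insecurity": [0, 1, np.nan, np.nan, 0],  # unscreened patients -> missing
})

imputer = IterativeImputer(random_state=0)
imputed = pd.DataFrame(imputer.fit_transform(df), columns=df.columns)

# Imputed values for the unscreened rows; adding NLP-derived note flags as
# predictors could improve these estimates, per the discussion above.
print(imputed.round(2))
```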

Social factor screening is most effective when consistently applied to the entire target population40. Whether that target population should be all patients or the subset of patients with known risk factors is an open question41,42. Still, multiple healthcare organizations have attempted to screen all their patients14,43, CMS’ Accountable Health Communities model requires universal screening44, and screening all patients avoids perpetuating biases and stigmatization16. Currently, screening often occurs selectively and inconsistently: screening is often based on the patient’s appearance18,45 or demographic characteristics46; different screening questionnaires are used concurrently within the same organization6; or patients may be excluded because data are collected in a single language46. Closing the gap between current response rates and near-universal screening will require integration within EHR portals for out-of-office data collection, mining clinical notes using NLP47, or risk stratification modeling to identify those most likely in need of additional screening48.

Limitations

The generalizability of our findings to clinical practice warrants further exploration. The social factor screening questionnaire was administered as part of a research study, which included interaction with nonclinical staff, written informed consent, and incentives. Interest in responding to social factor screening may differ if it is administered by the patient’s healthcare provider in the context of care. These findings may not generalize to other care settings, non-adult patients, or provider types. The current study focused on response to a social factor screening questionnaire as a whole; prior work on social screening among pediatric caregivers has also identified substantial differences in non-response to individual questionnaire items36. Finally, these data were collected during the COVID-19 pandemic (after vaccinations became widely available); the stresses felt by patients and the healthcare environment may have influenced recruitment.

CONCLUSIONS

Respondents and non-respondents to a social factor screening questionnaire differed in a history of prior financial insecurity. Selection bias may exist within social factor screening.

ACKNOWLEDGEMENTS

The authors thank the Regenstrief Institute Data Core, Ms. Amber Blackmon, and Mr. Harold Kooreman for their assistance.

FUNDING

This work was supported by the Agency for Healthcare Research & Quality (1R01HS028008–01 PI: Vest).

Footnotes

COMPETING INTERESTS

Joshua Vest is a founder and equity holder in Uppstroms, Inc, a health information technology company. Olena Mazurenko has no conflicts to declare.

ETHICAL APPROVAL

The Indiana University IRB approved this study (2011558232).

Contributor Information

Joshua R Vest, Department of Health Policy & Management, Indiana University Richard M. Fairbanks School of Public Health – Indianapolis, Center for Biomedical Informatics, Regenstrief Institute.

Olena Mazurenko, Department of Health Policy & Management, Indiana University Richard M. Fairbanks School of Public Health – Indianapolis.

AVAILABILITY OF DATA AND MATERIALS

The data for this study are not available.

REFERENCES

1. Green K, Zook M. When Talking About Social Determinants, Precision Matters. Health Affairs Blog. October 29, 2019. Accessed December 3, 2019. doi: 10.1377/hblog20191025.776011
2. Alderwick H, Gottlieb LM. Meanings and Misunderstandings: A Social Determinants of Health Lexicon for Health Care Systems. Milbank Q. 2019;97(2):1–13. doi: 10.1111/1468-0009.12390
3. Woolf S, Aron L, eds. U.S. Health in International Perspective: Shorter Lives, Poorer Health. National Academies Press; 2013. doi: 10.17226/13497
4. Commission on Social Determinants of Health. Closing the Gap in a Generation: Health Equity through Action on the Social Determinants of Health. World Health Organization; 2008.
5. Pruitt Z, Emechebe N, Quast T, Taylor P, Bryant K. Expenditure Reductions Associated with a Social Service Referral Program. Popul Health Manag. 2018;21(6):469–476. doi: 10.1089/pop.2017.0199
6. Lee J, Korba C. Social Determinants of Health: How Are Hospitals and Health Systems Investing in and Addressing Social Needs? 2017. https://www2.deloitte.com/us/en/pages/life-sciences-and-health-care/articles/addressing-social-determinants-of-health-hospitals-survey.html
7. Cartier Y, Gottlieb L. The prevalence of social care in US health care settings depends on how and whom you ask. BMC Health Serv Res. 2020;20(1):481. doi: 10.1186/s12913-020-05338-8
8. LaForge K, Gold R, Cottrell E, et al. How 6 Organizations Developed Tools and Processes for Social Determinants of Health Screening in Primary Care: An Overview. J Ambulatory Care Manage. 2018;41(1):2–14. doi: 10.1097/jac.0000000000000221
9. Health Leads. The Health Leads Screening Toolkit. 2018. Accessed February 5, 2020. https://healthleadsusa.org/resources/the-health-leads-screening-toolkit/
10. National Association of Community Health Centers. PRAPARE. 2019. http://www.nachc.org/research-and-data/prapare/
11. Centers for Medicare & Medicaid Services. The Accountable Health Communities Health-Related Social Needs Screening Tool. 2018. Accessed February 5, 2020. https://innovation.cms.gov/Files/worksheets/ahcm-screeningtool.pdf
12. Kim Cohen J. Stuck in first gear: Hospitals struggle to get past the initial phase of tending to social determinants of health. Mod Healthc. 2020;50(7):14.
13. Cottrell EK, Dambrun K, Cowburn S, et al. Variation in Electronic Health Record Documentation of Social Determinants of Health Across a National Network of Community Health Centers. Am J Prev Med. 2019;57(6 Suppl 1):S65–S73. doi: 10.1016/j.amepre.2019.07.014
14. Meyer D, Lerner E, Phillips A, Zumwalt K. Universal Screening of Social Determinants of Health at a Large US Academic Medical Center, 2018. Am J Public Health. 2020;110(S2):S219–S221. doi: 10.2105/AJPH.2020.305747
15. Gottlieb LM, Sandel M, Adler NE. Collecting and Applying Data on Social Determinants of Health in Health Care Settings. JAMA Intern Med. 2013;173(11):1017. doi: 10.1001/jamainternmed.2013.560
16. Garg A, Boynton-Jarrett R, Dworkin PH. Avoiding the Unintended Consequences of Screening for Social Determinants of Health. JAMA. 2016;316(8):813–814. doi: 10.1001/jama.2016.9282
17. Byhoff E, De Marchis EH, Hessler D, et al. Part II: A Qualitative Study of Social Risk Screening Acceptability in Patients and Caregivers. Am J Prev Med. 2019;57(6 Suppl 1):S38–S46. doi: 10.1016/j.amepre.2019.07.016
18. Wallace AS, Luther BL, Sisler SM, Wong B, Guo JW. Integrating social determinants of health screening and referral during routine emergency department care: evaluation of reach and implementation challenges. Implement Sci Commun. 2021;2(1):114. doi: 10.1186/s43058-021-00212-y
19. De Marchis EH, Hessler D, Fichtenberg C, et al. Part I: A Quantitative Study of Social Risk Screening Acceptability in Patients and Caregivers. Am J Prev Med. 2019;57(6 Suppl 1):S25–S37. doi: 10.1016/j.amepre.2019.07.010
20. Rogers AJ, Hamity C, Sharp AL, Jackson AH, Schickedanz AB. Patients’ Attitudes and Perceptions Regarding Social Needs Screening and Navigation: Multi-site Survey in a Large Integrated Health System. J Gen Intern Med. Published online January 2, 2020. doi: 10.1007/s11606-019-05588-1
21. Institute of Medicine. Fostering Rapid Advances in Health Care: Learning from System Demonstrations. Corrigan JM, Greiner A, Erikson SM, eds. National Academy Press; 2003.
22. Delgado-Rodriguez M, Llorca J. Bias. J Epidemiol Community Health. 2004;58(8):635–641. doi: 10.1136/jech.2003.008466
23. Galea S, Tracy M. Participation Rates in Epidemiologic Studies. Ann Epidemiol. 2007;17(9):643–653. doi: 10.1016/j.annepidem.2007.03.013
24. Nielsen KDB, Dyhr L, Lauritzen T, Malterud K. “You can’t prevent everything anyway”: A qualitative study of beliefs and attitudes about refusing health screening in general practice. Fam Pract. 2004;21(1):28–32. doi: 10.1093/fampra/cmh107
25. Nong P, Adler-Milstein J. Socially situated risk: challenges and strategies for implementing algorithmic risk scoring for care management. JAMIA Open. 2021;4(3):ooab076. doi: 10.1093/jamiaopen/ooab076
26. Tan M, Hatef E, Taghipour D, et al. Including Social and Behavioral Determinants in Predictive Models: Trends, Challenges, and Opportunities. JMIR Med Inform. 2020;8(9). doi: 10.2196/18084
27. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—A metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42(2):377–381. doi: 10.1016/j.jbi.2008.08.010
28. Harris PA, Taylor R, Minor BL, et al. The REDCap consortium: Building an international community of software platform partners. J Biomed Inform. 2019;95:103208. doi: 10.1016/j.jbi.2019.103208
29. Agency for Healthcare Research & Quality. Clinical Classifications Software Refined (CCSR). HCUP: Healthcare Cost and Utilization Project. 2021. Accessed February 23, 2022. https://www.hcup-us.ahrq.gov/toolssoftware/ccsr/ccs_refined.jsp
30. NYU Center for Health and Public Service Research. NYU ED Algorithm. 2016. Accessed June 6, 2022. http://wagner.nyu.edu/faculty/billings/nyued-background
31. Schwartz BS, Stewart WF, Godby S, et al. Body Mass Index and the Built and Social Environments in Children and Adolescents Using Electronic Health Records. Am J Prev Med. 2011;41(4):e17–e28. doi: 10.1016/j.amepre.2011.06.038
32. Allen K, Hood D, Cummings J, Kasthurirathne S, Embi P, Vest J. Extracting Social Variables from Clinical Documentation to Better Facilitate Response to Patient Need. Presented at: AMIA Annual Fall Symposium; 2021; San Diego, CA.
33. Weida EB, Phojanakong P, Patel F, Chilton M. Financial health as a measurable social determinant of health. PLOS ONE. 2020;15(5):e0233359. doi: 10.1371/journal.pone.0233359
34. Sinclair RR, Cheung JH. Money Matters: Recommendations for Financial Stress Research in Occupational Health Psychology. Stress Health. 2016;32(3):181–193. doi: 10/f8xzsh
35. Institute of Medicine. Capturing Social and Behavioral Domains in Electronic Health Records: Phase 1. The National Academies Press; 2014.
36. Ray KN, Gitz KM, Hu A, Davis AA, Miller E. Nonresponse to Health-Related Social Needs Screening Questions. Pediatrics. 2020;146(3). doi: 10.1542/peds.2020-0174
37. Brochier A, Messmer E, Garg A. Physicians and Social Determinants of Health. JAMA. 2020;324(12):1215. doi: 10.1001/jama.2020.12106
38. Gold R, Cottrell E, Bunce A, et al. Developing Electronic Health Record (EHR) Strategies Related to Health Center Patients’ Social Determinants of Health. J Am Board Fam Med. 2017;30(4):428–447. doi: 10.3122/jabfm.2017.04.170046
39. Cook LA, Sachs J, Weiskopf NG. The quality of social determinants data in the electronic health record: a systematic review. J Am Med Inform Assoc. 2021;29(1):187–196. doi: 10.1093/jamia/ocab199
40. Dobrow MJ, Hagens V, Chafe R, Sullivan T, Rabeneck L. Consolidated principles for screening based on a systematic review and consensus process. CMAJ. 2018;190(14):E422–E429. doi: 10.1503/cmaj.171154
41. Davidson KW, Krist AH, Tseng CW, et al. Incorporation of Social Risk in US Preventive Services Task Force Recommendations and Identification of Key Challenges for Primary Care. JAMA. 2021;326(14):1410–1415. doi: 10.1001/jama.2021.12833
42. Eder M, Henninger M, Durbin S, et al. Screening and Interventions for Social Risk Factors: Technical Brief to Support the US Preventive Services Task Force. JAMA. 2021;326(14):1416–1428. doi: 10.1001/jama.2021.12825
43. Sundar KR. Universal Screening for Social Needs in a Primary Care Clinic: A Quality Improvement Approach Using the Your Current Life Situation Survey. Perm J. 2018;22:18-089. doi: 10.7812/tpp/18-089
44. Alley DE, Asomugha CN, Conway PH, Sanghavi DM. Accountable Health Communities — Addressing Social Needs through Medicare and Medicaid. N Engl J Med. 2016;374(1):8–11. doi: 10.1056/nejmp1512532
45. Chhabra M, Sorrentino AE, Cusack M, Dichter ME, Montgomery AE, True G. Screening for Housing Instability: Providers’ Reflections on Addressing a Social Determinant of Health. J Gen Intern Med. 2019;34(7):1213–1219. doi: 10.1007/s11606-019-04895-x
46. Wallace AS. Implementing a Social Determinants Screening and Referral Infrastructure During Routine Emergency Department Visits, Utah, 2017–2018. Prev Chronic Dis. 2020;17. doi: 10.5888/pcd17.190339
47. Feller DJ, Bear Don’t Walk OJ IV, Zucker J, Yin MT, Gordon P, Elhadad N. Detecting Social and Behavioral Determinants of Health with Structured and Free-Text Clinical Data. Appl Clin Inform. 2020;11(1):172–181. doi: 10.1055/s-0040-1702214
48. Vest JR, Menachemi N, Grannis SJ, et al. Impact of Risk Stratification on Referrals and Uptake of Wraparound Services That Address Social Determinants: A Stepped Wedged Trial. Am J Prev Med. 2019;56(4):e125–e133. doi: 10.1016/j.amepre.2018.11.009
