Journal of the American Medical Informatics Association (JAMIA). 2023 Mar 21;30(8):1456–1462. doi: 10.1093/jamia/ocad043

Integrating patient voices into the extraction of social determinants of health from clinical notes: ethical considerations and recommendations

Andrea L Hartzler,1 Serena Jinchen Xie,2 Patrick Wedgeworth,3 Carolin Spice,4 Kevin Lybarger,5 Brian R Wood,6 Herbert C Duber,7,8 Gary Hsieh,9 Angad P Singh,10,11 and the SDoH Community Champion Advisory Board2
PMCID: PMC10354781  PMID: 36944091

Abstract

Identifying patients’ social needs is a first critical step to address social determinants of health (SDoH)—the conditions in which people live, learn, work, and play that affect health. Addressing SDoH can improve health outcomes, population health, and health equity. Emerging SDoH reporting requirements call for health systems to implement efficient ways to identify and act on patients’ social needs. Automatic extraction of SDoH from clinical notes within the electronic health record through natural language processing offers a promising approach. However, such automated SDoH systems could have unintended consequences for patients, related to stigma, privacy, confidentiality, and mistrust. Using Floridi et al’s “AI4People” framework, we describe ethical considerations for system design and implementation that call attention to patient autonomy, beneficence, nonmaleficence, justice, and explicability. Based on our engagement of clinical and community champions in health equity work at University of Washington Medicine, we offer recommendations for integrating patient voices and needs into automated SDoH systems.

Keywords: social determinants of health, electronic health records, patient acceptance of health care, bioethical issues, natural language processing

INTRODUCTION


Addressing social determinants of health (SDoH)—nonmedical factors that affect health and health outcomes—is critical for improving patient care and achieving health equity.1 Unmet social needs, such as financial hardship, food insecurity, and housing instability, may account for 40%–90% of health outcomes.2 Because screening and referral are associated with improved outcomes,3 the National Academies of Sciences, Engineering, and Medicine recommends integrating SDoH into healthcare, with electronic health records (EHRs) playing a central role.4 The Centers for Medicare & Medicaid Services set forth guidelines that recommend voluntary reporting of positive screening rates for SDoH (including food insecurity, housing instability, transportation needs, utility difficulties, and interpersonal safety) in 2023, mandatory reporting in 2024, and use of these metrics to drive reimbursement by 2026.5 To meet these guidelines, health systems need efficient ways to capture and use SDoH data in the EHR without adding administrative burden.1,6

Although much effort has aimed to develop EHR-based screening and referral systems for SDoH,7–9 implementation barriers persist.10 Some patients express discomfort and confidentiality concerns about SDoH screening.10 Compared with traditional medical needs, some patients perceive social needs as less relevant to healthcare11 and of lower priority to bring up with busy clinicians.12 Patients with more social needs express more discomfort with documenting these data in the EHR.13 Further, patients vary in their perceptions about whether a health system should intervene on identified needs.14 The high burden of screening perceived by clinical staff also limits adoption.15–17 Although self-administered screening through patient portals can help,18,19 racial, ethnic, literacy, and other barriers to use persist20–25 despite gains in technology adoption over the past decade.26 Because patients with social needs may face barriers due to limited literacy and digital access,27 additional SDoH strategies are needed. Pressure for health systems to comply with the emerging reporting guidelines5 increases the appeal of automated strategies that extract SDoH from clinical notes over traditional SDoH screening strategies, particularly given the increased financial strain of the COVID-19 pandemic.28

The growing literature on extracting social needs from unstructured clinical notes using natural language processing (NLP) demonstrates potential to improve the utility of SDoH data in the EHR while reducing costs of traditional SDoH screening and documentation by clinical users.29 For example, Lybarger et al30 demonstrate how NLP can augment structured SDoH data in the EHR to provide a more comprehensive picture of patients’ social needs. Accurately and reliably extracted social needs could automatically populate structured data elements in the EHR that trigger computerized decision support and closed-loop referrals with community-based organizations.18,31 The biomedical informatics field is making extraordinary strides toward these new “automated SDoH systems” through initiatives like the National NLP Clinical Challenge on extracting SDoH.32 Given this progress, the desire to implement these advances in clinical operations will only grow.33,34 As social needs screening and referral become standard of care, the potential unintended consequences of automated SDoH systems for patients must be taken seriously, and ethical considerations must take precedence.
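
To make the note-mining step concrete, the minimal sketch below shows how extracted mentions might be staged as candidate structured data. It uses hypothetical keyword patterns purely for readability; the systems cited above29,30,32 rely on trained NLP models rather than keywords, and all names in the code are illustrative.

```python
import re
from dataclasses import dataclass

# Hypothetical keyword patterns for two SDoH domains. Real systems (eg, the
# n2c2 shared-task models cited in the text) use trained NLP models instead.
SDOH_PATTERNS = {
    "housing_instability": re.compile(r"\b(homeless|unhoused|evict\w*|shelter)\b", re.I),
    "food_insecurity": re.compile(r"\b(food (bank|stamps|insecur\w*)|skipping meals)\b", re.I),
}

@dataclass
class SdohCandidate:
    domain: str   # SDoH domain the match suggests
    snippet: str  # surrounding text kept as evidence for human review
    start: int    # character offset of the match within the note

def extract_sdoh(note_text: str) -> list:
    """Return candidate social-need mentions found in one clinical note."""
    candidates = []
    for domain, pattern in SDOH_PATTERNS.items():
        for match in pattern.finditer(note_text):
            lo, hi = max(0, match.start() - 40), match.end() + 40
            candidates.append(SdohCandidate(domain, note_text[lo:hi], match.start()))
    return candidates

note = "Pt reports staying in a shelter since eviction last month; uses food bank weekly."
for c in extract_sdoh(note):
    print(c.domain, "->", c.snippet)
```

Even in this toy form, the output is a set of candidates with supporting evidence rather than final EHR entries—a distinction that matters for the ethical considerations that follow.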

ETHICAL CONSIDERATIONS

Automated SDoH systems have significant potential to enhance healthcare, but efforts to extract social needs from clinical notes could also exacerbate patient concerns regarding privacy, stigma, and mistrust. Although many patients find SDoH screening and navigation acceptable,12–14 unintended consequences are well documented.35,36 Patients express concerns about the use of artificial intelligence (AI) in healthcare, such as cost, safety, security, data quality, and lack of personal choice, all of which impact patient acceptability.37–40 We focus on the importance of incorporating patient perspectives into the design and implementation of automated SDoH systems as an emerging type of AI. “AI4People” is an ethical framework comprising 5 principles to guide the development and adoption of AI technologies: patient autonomy, beneficence, nonmaleficence, justice, and explicability.41 The framework lays a foundation for “a Good AI society,” which researchers have used to characterize bias in clinical NLP.42 Using the AI4People framework, we examine ethical considerations of automated SDoH systems from the perspectives of patients.

Beneficence

Automated SDoH systems should benefit and empower people by promoting well-being and preserving dignity.41 SDoH interventions are only clinically useful if they are used by care teams to benefit patients.43 Patients may feel reluctant to share social needs they perceive as not relevant or beneficial to their care.11,12,40 Automatic extraction has the potential to alleviate patient discomfort associated with traditional social needs screening methods.10 By mining clinical notes, automated SDoH systems could increase the availability of contextual patient information to care teams, leading to fewer assumptions and more informed communication with patients.44,45 Although automated SDoH systems could mitigate some clinical burdens of traditional social needs screening, clinicians may still lack awareness of EHR-based SDoH documentation46 or the time to follow up on identified social risks.10 Time burdens of automated SDoH systems could be even greater given the potential for false positives, which may warrant a “human in the loop” to review the accuracy of extracted social needs before patient follow-up. The benefits of automated SDoH systems may be limited if the identified social needs are not actionable, calling into question the ethics of screening without the capacity for community referral.35 Unlike mining EHRs for health conditions that are addressed through medical interventions,47 the clinical utility of SDoH interventions relies on a fragile network of community organizations.43 Compared with traditional medical needs, health systems may be less equipped to respond to some social needs (eg, utility difficulties) than others (eg, interpersonal safety). While it may be technically feasible to efficiently extract a given social need through NLP,29,30 pathways are needed that will equip clinical staff and community-based organizations to effectively respond.

Nonmaleficence

Automated SDoH systems should avoid harm by preserving privacy, confidentiality, and security, and preventing data misuse,41 including potential harms from inaccurate data generated by AI.48 Repeatedly answering questions that some patients consider sensitive (eg, interpersonal violence) can trigger unpleasant or potentially traumatic memories and engender feelings of being judged.49 Clinicians may feel uncomfortable routinely inquiring about adverse social circumstances, particularly if they lack experience or training on eliciting and responding to social needs, which may leave patients frustrated when those needs are left unmet.35 Automated SDoH systems could reduce the potential for such discomfort by surfacing social needs already identified in clinical notes, and help the workforce respectfully listen and collaborate with patients to address SDoH.50 However, surfacing and sharing social needs mined from clinical notes that patients believe were discussed privately with their clinician could inhibit trust, communication, and therapeutic relationships. Automated screening could feel intrusive, especially if findings are incorrect. False positives can misguide conversations and lead to unnecessary efforts to link patients with social services, while false negatives fail to identify and meet patients’ needs.43 Compared with traditional clinical data, inaccurate SDoH data carry even greater potential for stigmatization, such as labeling a patient “homeless” versus “diabetic.” Inaccuracies could increase patient distrust, particularly in populations that have experienced historic marginalization51 and are at greater risk for social needs. Patients also express security concerns about SDoH data misuse.52 Although automated SDoH systems can surface social needs for clinicians that might otherwise be missed, we must weigh their potential benefits against these harms.
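
Because false positives and false negatives carry different harms, the tradeoff is worth quantifying. The short example below uses purely illustrative counts—not measurements from any real system—to show how precision and recall summarize the two error types discussed above.

```python
# Hypothetical error analysis for an extractor that flags housing instability.
# All counts are illustrative only, not measurements from any real system.
true_positives = 80    # notes correctly flagged
false_positives = 20   # notes wrongly flagged -> risk of stigmatizing mislabels
false_negatives = 40   # real needs missed -> patients left without follow-up

precision = true_positives / (true_positives + false_positives)  # 80/100 = 0.80
recall = true_positives / (true_positives + false_negatives)     # 80/120 ~ 0.67

print(f"precision={precision:.2f}, recall={recall:.2f}")
# Tuning the extractor to flag more aggressively raises recall (fewer missed
# needs) but lowers precision (more wrong, potentially stigmatizing labels);
# where the acceptable balance lies depends on the harm of each error type.
```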

Autonomy

Automated SDoH systems should provide individuals the freedom to make decisions for themselves, including how much agency to delegate to AI.41 Although many patients find social needs screening and navigation in healthcare acceptable,12–14 not all patients desire assistance.53–55 Few screening tools ask patients about their interest in assistance for identified social needs.43 As regulatory requirements place pressure on health systems to screen every patient for social needs,5 healthcare systems have the opportunity to implement SDoH screening in ways that are sensitive to patient preference and preserve autonomy. Automated SDoH systems should not assume that patients wish to have their social needs mined and acted upon without providing patients the autonomy to decide. These systems should ask patients for permission50 and obtain informed consent to use social needs mined from their clinical notes. AI tools that “operate surreptitiously in the background”37 can break public trust, particularly if patients who do not want the technology used in their care discover its use only after deployment. We need best practices for implementing SDoH screening and referral interventions,10,56–58 including automated SDoH systems that respect patient autonomy. Informatics techniques can provide patients with greater visibility into how social needs extracted by automated SDoH screening are shared.59,60 Patients could be given the freedom to opt out of automated SDoH screening, similar to national recommendations that help normalize other potentially stigmatizing services, like HIV screening.61,62 Another possible direction is for automated SDoH systems to offer features for patients to opt in to the use of extracted social needs in their care, similar to EHR-based research permissions.63
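
One way to operationalize this principle is to gate extraction on a recorded patient preference. The sketch below outlines one possible shape of such a gate under either an opt-in or opt-out policy; the names and fields are hypothetical rather than drawn from any deployed system.

```python
from enum import Enum

class SdohConsent(Enum):
    OPT_IN = "opt_in"        # patient agreed to automated extraction
    OPT_OUT = "opt_out"      # patient declined; their notes must not be mined
    UNDECIDED = "undecided"  # no recorded preference yet

def may_run_extraction(consent: SdohConsent, opt_out_model: bool) -> bool:
    """Gate automated SDoH extraction on the patient's recorded preference.

    Under an opt-out model (akin to normalized HIV screening), extraction
    runs unless the patient has declined; under an opt-in model, it runs
    only with explicit permission.
    """
    if consent is SdohConsent.OPT_OUT:
        return False
    if opt_out_model:
        return True  # undecided patients are included but may decline later
    return consent is SdohConsent.OPT_IN

# Undecided patients are mined under opt-out but not under opt-in.
assert may_run_extraction(SdohConsent.UNDECIDED, opt_out_model=True)
assert not may_run_extraction(SdohConsent.UNDECIDED, opt_out_model=False)
```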

Justice

Although automated SDoH systems should be fair and equitable by promoting prosperity and preserving solidarity,41 these systems have the potential to disempower patients when imperfect mining leads to the unintentional creation of biased SDoH data that can reinforce inequities.37,64 Use of stigmatizing language that is disapproving, discrediting, and stereotyping to describe patients65 can negatively impact clinicians’ attitudes toward patients and their clinical decision-making.66 Although researchers have described the prevalence of racially stigmatizing language in the EHR and called for changes in documentation practices,67,68 historical language in clinical notes could be extracted by automated SDoH systems and label patients in ways that stigmatize care and transmit bias. For example, systems could surface clinical text describing a patient as a “drug seeker”69 when extracting social needs related to the SDoH domain of substance use. Mislabeling patients with inaccurate social needs is a further concern. How patients are asked about social needs can impact the quality of responses received. Similarly, NLP performance relies on reducing the dimensionality of input data, affecting the quality of social needs extracted. For example, some social needs like polysubstance use may be expressed in complex and nuanced ways in clinical notes (eg, current use, past use, and multiple types of drugs), which impacts the fidelity of NLP.30 Although no automated SDoH system is infallible, inherent biases in data, algorithms, and system use can unintentionally exacerbate inequities.53,64 Unequal treatment based on extracted social needs may be particularly consequential for patients from groups that have experienced historical marginalization evidenced in the EHR, including Black individuals67,68 and transgender people.70 In addition to improving NLP performance,30 informatics efforts can guide the just implementation of automated SDoH systems. For example, rather than “auto-populating” SDoH data in the EHR, systems could “auto-suggest” social needs identified by NLP for human review first, as sketched below. Such implementation choices could help guard against false-positive or otherwise inaccurate SDoH data.
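
A minimal sketch of the “auto-suggest” pattern follows: NLP output is staged in a review queue, and only suggestions a human reviewer accepts become structured SDoH data. All class and function names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SdohSuggestion:
    patient_id: str
    domain: str                       # eg, "housing_instability"
    evidence: str                     # note snippet supporting the suggestion
    confirmed: Optional[bool] = None  # None until a human reviews it

@dataclass
class SuggestionQueue:
    """Staging area: nothing reaches the structured record unreviewed."""
    pending: list = field(default_factory=list)

    def propose(self, suggestion: SdohSuggestion) -> None:
        self.pending.append(suggestion)  # auto-suggest, never auto-populate

    def review(self, suggestion: SdohSuggestion, accept: bool, ehr: dict) -> None:
        suggestion.confirmed = accept
        if accept:  # only confirmed needs become structured SDoH data
            ehr.setdefault(suggestion.patient_id, []).append(suggestion.domain)

# A clinician reviews one NLP suggestion before it is written to the record.
queue, ehr = SuggestionQueue(), {}
s = SdohSuggestion("pt-001", "housing_instability", "staying in a shelter")
queue.propose(s)
queue.review(s, accept=True, ehr=ehr)
print(ehr)  # {'pt-001': ['housing_instability']}
```

The design point is that the review queue, not the extractor, owns the write path to the record.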

Explicability

Automated SDoH systems should exhibit transparency and accountability while supporting the 4 other traditional bioethics principles.41 To be beneficent and nonmaleficent, automated SDoH systems must enable patients to understand the potential for good as well as the potential for harm. For example, systems should inform patients about how these tools intend to help and about potential risks to privacy, data security, data quality, or system misuse. To support autonomy, automated SDoH systems must enable patients to decide whether and how their SDoH data are used. For example, systems should allow patients to choose whether to participate, whether assistance is desired, and from whom. To be just, the implications of using automated SDoH systems should be readily understandable to all patients regardless of their experience with AI. Limited digital and health literacy in some populations with high social needs makes it critical to find clear and effective means to explain the potential implications of using automated SDoH systems for informed consent. Just systems should not create or perpetuate bias—we should strive for antibias systems. Organizations deploying this technology should take accountability for unintended consequences to patients. All of these assurances require that patients have a voice in the design and implementation of automated SDoH systems.
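
As one illustration of explicability, each reviewed finding could be rendered as a plain-language notice that tells the patient what was found, that a person reviewed it, and how to decline. The sketch below is hypothetical wording, not validated patient-facing language.

```python
# Hypothetical patient-facing notice for one reviewed finding. The wording
# and function name are illustrative, not validated patient-facing language.
def explain_to_patient(domain: str, note_date: str, contact: str) -> str:
    plain_names = {
        "housing_instability": "a possible housing concern",
        "food_insecurity": "a possible concern about access to food",
    }
    found = plain_names.get(domain, "a possible social need")
    return (
        f"Our records system noticed {found} in a visit note from {note_date}. "
        "A member of your care team reviewed it before anything was added to "
        "your record. You can choose whether you would like help with this, "
        "and you can ask us to stop this automatic review at any time. "
        f"Questions? Contact {contact}."
    )

print(explain_to_patient("housing_instability", "January 12, 2023", "your clinic"))
```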

RECOMMENDATIONS

Patient acceptability of automated SDoH systems is contingent on mitigating patient concerns,37–40 particularly for individuals with social needs, whom automated SDoH systems might impact most. Ethical principles can help drive mitigation efforts, but patients must have a voice in these systems. At the University of Washington Medicine, we are leading an effort to accentuate patient voices concerning the clinical extraction of SDoH. Based on our partnership with clinical champions and an advisory board of community advocates (Table 1), we offer recommendations for improving and diversifying patient engagement to ensure that efforts to implement automated SDoH systems are informed by patient input and reflect ethical principles.

Table 1.

SDoH community champion advisory board

Kase Cragg is a transmasculine-nonbinary person with lived experience as a recipient and provider of mental healthcare, both of which inform their work. He earned dual master’s degrees from the University of Washington in public health and social work, where they focused on the SDoH that impact transgender and nonbinary peoples’ engagement with the healthcare system. They are a member of the “Birth Includes Us” research team, a study that examines queer and transgender experiences in pregnancy and birthing. He currently works in Seattle as an intensive outpatient therapist for youth and families.
Shoma Goomansingh advocates for all spectrums of underrepresented groups. She is a first-generation Caribbean American from Trinidad and Tobago with a background in computer science. She is a chef, small business consultant, and a certified peer counselor in Washington State. She has experience assisting people who live with serious mental health issues with social resources ranging from housing and employment to drug rehab, sex protection, and free small business consulting. She initiated Tech Chef Productions, a start-up company for entrepreneurs and artists. She is passionate about combining her love of research and technology to help make the system a better place for people experiencing mental health complications.
Searetha Simons is passionate about helping people who don’t have a voice to speak up, including people experiencing homelessness, people with mental health issues, and victims of sex trafficking. She has lived and professional experience providing outreach and group facilitation at a Seattle-based organization for prostitution survivors. She served as a resident member of the board for a Seattle-based housing group that provides permanent homes and comprehensive services for people experiencing chronic homelessness. A Seattle native, she loves music and was named a “Community Star” by the Starbucks Community Star program, a partnership between Starbucks and the Seattle Kraken ice hockey team that recognizes anchors in the Seattle metropolitan community who drive positive change.
J. J. Wong has lived with a congenital condition in 5 countries, across varied intersectional socioeconomic strata of healthcare systems, successfully advocating for himself and for those who are disenfranchised, marginalized, stigmatized, and discriminated against, particularly in matters directly related to basic living: healthcare, housing, and nutrition across culturo-lingo-technological-socioeconomic strata. He represents and advocates intersectionally for the underprivileged and underserved, to bring peace, justice, and wellness (ie, transcendence) to LGBTQQIP2SAA* people locally, nationally, and internationally, in body, mind, and spirit.
*LGBTQQIP2SAA: Lesbian, gay, bisexual, transgender, queer, questioning, intersex, pansexual, 2-spirit, asexual, and ally. “P” stands for pansexual: A term that describes a person who may have a physical, emotional, or romantic attraction to people of any gender.
Angeilea’ Yancey-Watson has a background in community advocacy and SDoH in her role as a lead program coordinator at a nonprofit organization that addresses health disparities and inequities in healthcare services for people of African descent. She is also a mental health first aid instructor. She earned dual degrees in community psychology and health administration, where she studied SDoH best practices. In her free time, she loves to travel and hike.

Engage patients inclusively

Efforts driving the use of SDoH data extracted from EHRs should include patients from traditionally marginalized or underrepresented communities. Patients from these groups express concerns and opinions that are critical to the implementation of automated SDoH systems. Patients from marginalized groups may be more likely to experience social needs, putting these groups at greater risk should automated SDoH systems make mistakes. Engagement should include individuals from diverse racial, ethnic, and cultural groups and genders, including individuals with limited English proficiency and those with limited literacy. As a critical part of our work, we have engaged a community advisory board of individuals with lived experience who advocate for the communities most affected by SDoH (Table 1). This advisory board reflects inclusive advocacy for diverse patient communities to inform automated SDoH systems.

Engage patients with transparency

Automated SDoH systems can be complex and difficult to describe to patients. Although many people experience AI in their everyday lives, individuals with social needs often face literacy and access barriers to technology, leading to disparities in the use of digital health innovations.27 To facilitate informed use of automated SDoH systems that promotes digital health equity, patient engagement should focus on ways to describe these systems as transparently and clearly as possible. In our work, we have partnered with clinical and community champions to explore techniques for helping patients with limited digital literacy understand, in everyday terms, how automated SDoH systems work and what their use implies. Examples include the use of storyboards, scenarios, vignettes, and analogies. We have worked diligently with this community champion advisory board to generate ideas for disseminating research findings about automated SDoH systems to communities of interest. Examples of community-generated ideas include dissemination through posters, video commercials, community pop-ups, webinars, and listening sessions.

Engage patients cooperatively

Patients should be engaged as partners with clinical stakeholders and system designers in guiding the design and implementation of automated SDoH systems. It is important for health systems to understand what patients want and do not want, and to offer them ways to express their needs, preferences, and autonomy. Through coproduction of healthcare services,71 patients lend tremendous insight into system features and workflows that promote patient experience; for example, vulnerable and underrepresented groups have long shaped care innovations at Kaiser Permanente’s Care Management Institute.72 Similar codesign methods could integrate patient perspectives into features and workflows for implementing automated SDoH systems, such as considering whether to “auto-suggest” rather than “auto-populate” social needs in the EHR. Including patient representatives as advisors in research and practice can elevate the perspectives of marginalized individuals and leverage their expert advice for health equity initiatives,73 but it requires consideration of compensation, inclusion, and democratic participation.

CONCLUSIONS

Automated SDoH systems offer health systems an efficient method to identify and act on social needs. However, to improve health outcomes, population health, and health equity, it is critical to consider unintended consequences for patients and to give patients a voice. The lens of AI4People provides ethical guidance for system design and implementation that is patient centric.

ACKNOWLEDGMENTS

We wish to acknowledge the significant contributions of Community Champion Advisory Board members Kase Cragg, Shoma Goomansingh, Searetha Simons, J.J. Wong, and Angeilea’ Yancey-Watson. We also wish to thank UW Medicine Clinical Champions Martine Pierre-Louis and Drs Leo Morales, Bessie Young, and Doreen Kiss for their support and contributions.

Contributor Information

Andrea L Hartzler, Biomedical Informatics and Medical Education, University of Washington, Seattle, Washington, USA.

Serena Jinchen Xie, Biomedical Informatics and Medical Education, University of Washington, Seattle, Washington, USA.

Patrick Wedgeworth, Biomedical Informatics and Medical Education, University of Washington, Seattle, Washington, USA.

Carolin Spice, Biomedical Informatics and Medical Education, University of Washington, Seattle, Washington, USA.

Kevin Lybarger, Department of Information Sciences and Technology, George Mason University, Fairfax, Virginia, USA.

Brian R Wood, Department of Medicine, University of Washington, Seattle, Washington, USA.

Herbert C Duber, Department of Health, Washington State, Olympia, Washington, USA; Department of Emergency Medicine, University of Washington, Seattle, Washington, USA.

Gary Hsieh, Human Centered Design and Engineering, University of Washington, Seattle, Washington, USA.

Angad P Singh, Biomedical Informatics and Medical Education, University of Washington, Seattle, Washington, USA; Department of Family Medicine, University of Washington, Seattle, Washington, USA.

SDoH Community Champion Advisory Board:

Kase Cragg, Shoma Goomansingh, Searetha Simons, J J Wong, and Angeilea’ Yancey-Watson

FUNDING

This work was supported by the University of Washington Population Health Initiative Tier 1 Pilot Grant and National Library of Medicine Training Grant T15LM007442 from the National Institutes of Health.

AUTHOR CONTRIBUTIONS

Each author made substantial intellectual contributions to elements of the conception or recommendations in this work.

CONFLICT OF INTEREST STATEMENT

None declared.

DATA AVAILABILITY

No new data were generated or analyzed in support of this research.

REFERENCES

1. Daniel H, Bornstein SS, Kane GC, et al.; Health and Public Policy Committee of the American College of Physicians. Addressing social determinants to improve patient care and promote health equity: an American College of Physicians position paper. Ann Intern Med 2018; 168 (8): 577–8.
2. Friedman NL, Banegas MP. Toward addressing social determinants of health: a health care system strategy. Perm J 2018; 22 (4S): 18-095.
3. Gottlieb LM, Wing H, Adler NE. A systematic review of interventions on patients’ social and economic needs. Am J Prev Med 2017; 53 (5): 719–29.
4. National Academies of Sciences, Engineering, and Medicine; Health and Medicine Division; Board on Health Care Services; Committee on Integrating Social Needs Care into the Delivery of Health Care to Improve the Nation’s Health. Integrating Social Care into the Delivery of Health Care: Moving Upstream to Improve the Nation’s Health. Washington, DC: National Academies Press (US); 2019. http://www.ncbi.nlm.nih.gov/books/NBK552597/. Accessed November 30, 2022.
5. The Centers for Medicare & Medicaid Services. FY 2023 Hospital Inpatient Prospective Payment System (IPPS) and Long-Term Care Hospital Prospective Payment System (LTCH PPS) Final Rule—CMS-1771-F. August 1, 2022. https://www.cms.gov/newsroom/fact-sheets/fy-2023-hospital-inpatient-prospective-payment-system-ipps-and-long-term-care-hospital-prospective
6. Cantor MN, Thorpe L. Integrating data on social determinants of health into electronic health records. Health Aff 2018; 37 (4): 585–90.
7. LaForge K, Gold R, Cottrell E, et al. How 6 organizations developed tools and processes for social determinants of health screening in primary care: an overview. J Ambul Care Manage 2018; 41 (1): 2–14.
8. Buitron de la Vega P, Losi S, Sprague Martinez L, et al. Implementing an EHR-based screening and referral system to address social determinants of health in primary care. Med Care 2019; 57 (Suppl 6 Suppl 2): S133–9.
9. Gold R, Bunce A, Cowburn S, et al. Adoption of social determinants of health EHR tools by community health centers. Ann Fam Med 2018; 16 (5): 399–407.
10. Eder M, Henninger M, Durbin S, et al. Screening and interventions for social risk factors: technical brief to support the US Preventive Services Task Force. JAMA 2021; 326 (14): 1416–28.
11. Kiles TM, Cernasev A, Leibold C, Hohmeier K. Patient perspectives of discussing social determinants of health with community pharmacists. J Am Pharm Assoc 2022; 62 (3): 826–33.
12. Langevin R, Berry A, Zhang J, et al. Implementation fidelity of chatbot screening for social needs: acceptability, feasibility, appropriateness [published online ahead of print February 14, 2023]. Appl Clin Inform. doi: 10.1055/a-2035-5342.
13. Albert SM, McCracken P, Bui T, et al. Do patients want clinicians to ask about social needs and include this information in their medical record? BMC Health Serv Res 2022; 22 (1): 1275.
14. Rogers AJ, Hamity C, Sharp AL, Jackson AH, Schickedanz AB. Patients’ attitudes and perceptions regarding social needs screening and navigation: multi-site survey in a large integrated health system. J Gen Intern Med 2020; 35 (5): 1389–95.
15. Herrera CN, Brochier A, Pellicer M, Garg A, Drainoni ML. Implementing social determinants of health screening at community health centers: clinician and staff perspectives. J Prim Care Community Health 2019; 10: 2150132719887260.
16. Davidson KW, McGinn T. Screening for social determinants of health: the known and unknown. JAMA 2019; 322 (11): 1037–8.
17. Schickedanz A, Hamity C, Rogers A, Sharp AL, Jackson A. Clinician experiences and attitudes regarding screening for social determinants of health in a large integrated health system. Med Care 2019; 57 (Suppl 6 Suppl 2): S197–201.
18. Rogers CK, Parulekar M, Malik F, Torres CA. A local perspective into electronic health record design, integration, and implementation of screening and referral for social determinants of health. Perspect Health Inf Manag 2022; 19 (Spring): 1g.
19. Tai-Seale M, Downing NL, Jones VG, et al. Technology-enabled consumer engagement: promising practices at four health care delivery organizations. Health Aff 2019; 38 (3): 383–90.
20. Ancker JS, Barrón Y, Rockoff ML, et al. Use of an electronic patient portal among disadvantaged populations. J Gen Intern Med 2011; 26 (10): 1117–23.
21. Goldzweig CL, Orshansky G, Paige NM, et al. Electronic patient portals: evidence on health outcomes, satisfaction, efficiency, and attitudes: a systematic review. Ann Intern Med 2013; 159 (10): 677–87.
22. Graetz I, Gordon N, Fung V, Hamity C, Reed ME. The digital divide and patient portals: internet access explained differences in patient portal use for secure messaging by age, race, and income. Med Care 2016; 54 (8): 772–9.
23. Wallace LS, Angier H, Huguet N, et al. Patterns of electronic portal use among vulnerable patients in a nationwide practice-based research network: from the OCHIN practice-based research network (PBRN). J Am Board Fam Med 2016; 29 (5): 592–603.
24. Anthony DL, Campos-Castillo C, Lim PS. Who isn’t using patient portals and why? Evidence and implications from a national sample of US adults. Health Aff 2018; 37 (12): 1948–54.
25. Lyles CR, Nelson EC, Frampton S, Dykes PC, Cemballi AG, Sarkar U. Using electronic health record portals to improve patient engagement: research priorities and best practices. Ann Intern Med 2020; 172 (11 Suppl): S123–9.
26. Vogels EA. Digital Divide Persists Even as Americans with Lower Incomes Make Gains in Tech Adoption. Washington, DC: Pew Research Center; 2021. https://www.pewresearch.org/fact-tank/2021/06/22/digital-divide-persists-even-as-americans-with-lower-incomes-make-gains-in-tech-adoption/. Accessed February 13, 2023.
27. Crawford A, Serhal E. Digital health equity and COVID-19: the innovation curve cannot reinforce the social gradient of health. J Med Internet Res 2020; 22 (6): e19361.
28. Khullar D, Bond AM, Schpero WL. COVID-19 and the financial health of US hospitals. JAMA 2020; 323 (21): 2127–8.
29. Patra BG, Sharma MM, Vekaria V, et al. Extracting social determinants of health from electronic health records using natural language processing: a systematic review. J Am Med Inform Assoc 2021; 28 (12): 2716–27.
30. Lybarger K, Yetisgen M, Uzuner O. The 2022 n2c2/UW shared task on extracting social determinants of health [published online ahead of print February 16, 2023]. J Am Med Inform Assoc 2023. doi: 10.1093/jamia/ocad012.
31. Berry C, Paul M, Massar R, Marcello RK, Krauskopf M. Social needs screening and referral program at a large US public hospital system, 2017. Am J Public Health 2020; 110 (S2): S211–4.
32. National NLP Clinical Challenges (n2c2): continuing the legacy of the i2b2 NLP shared tasks. https://n2c2.dbmi.hms.harvard.edu/2022-track-2
33. Afshar M, Phillips A, Karnik N, et al. Natural language processing and machine learning to identify alcohol misuse from the electronic health record in trauma patients: development and internal validation. J Am Med Inform Assoc 2019; 26 (3): 254–61.
34. Price R, Boyda J, Bobay K. Workshop—innovative natural language processing engines for unlocking the potential of free-text electronic health record narrative notes. In: Proceedings of the AMIA Clinical Informatics Conference; 2022; Houston, TX.
35. Garg A, Boynton-Jarrett R, Dworkin PH. Avoiding the unintended consequences of screening for social determinants of health. JAMA 2016; 316 (8): 813–4.
36. Butler E, Morgan A, Kangovi S. Screening for unmet social needs: patient engagement or alienation? NEJM Catal Innov Care Deliv 2020; 1 (4). https://catalyst.nejm.org/doi/full/10.1056/CAT.19.1037. Accessed February 13, 2023.
37. Richardson JP, Smith C, Curtis S, et al. Patient apprehensions about the use of artificial intelligence in healthcare. NPJ Digit Med 2021; 4 (1): 140.
38. Esmaeilzadeh P. Use of AI-based tools for healthcare purposes: a survey study from consumers’ perspectives. BMC Med Inform Decis Mak 2020; 20 (1): 1–9.
39. Aggarwal R, Farag S, Martin G, Ashrafian H, Darzi A. Patient perceptions on data sharing and applying artificial intelligence to health care data: cross-sectional survey. J Med Internet Res 2021; 23 (8): e26162.
40. Young AT, Amara D, Bhattacharya A, Wei ML. Patient and general public attitudes towards clinical artificial intelligence: a mixed methods systematic review. Lancet Digit Health 2021; 3 (9): e599–611.
41. Floridi L, Cowls J, Beltrametti M, et al. AI4People—an ethical framework for a good AI society: opportunities, risks, principles, and recommendations. Minds Mach 2018; 28 (4): 689–707.
42. Bear Don’t Walk OJ, Reyes Nieva H, Lee SSJ, Elhadad N. A scoping review of ethics considerations in clinical natural language processing. JAMIA Open 2022; 5 (2): ooac039.
43. Garg A, Sheldrick RC, Dworkin PH. The inherent fallibility of validated screening tools for social determinants of health. Acad Pediatr 2018; 18 (2): 123–4.
44. Aboumatar HJ, Cooper LA. Contextualizing patient-centered care to fulfill its promise of better health outcomes: beyond who, what, and why. Ann Intern Med 2013; 158 (8): 628–9.
45. Weiner SJ. Advancing health equity by avoiding judgmentalism and contextualizing care. AMA J Ethics 2021; 23 (2): E91–96.
46. Iott BE, Pantell MS, Adler-Milstein J, Gottlieb LM. Physician awareness of social determinants of health documentation capability in the electronic health record. J Am Med Inform Assoc 2022; 29 (12): 2110–6.
47. Wang Y, Wang L, Rastegar-Mojarad M, et al. Clinical information extraction applications: a literature review. J Biomed Inform 2018; 77: 34–49.
48. Ibrahim SA, Pronovost PJ. Diagnostic errors, health disparities, and artificial intelligence: a combination for health or harm? JAMA Health Forum 2021; 2 (9): e212430.
49. Theis RP, Blackburn K, Lipori G, et al. Implementation context for addressing social needs in a learning health system: a qualitative study. J Clin Transl Sci 2021; 5 (1): e201.
50. Schoenthaler A, Hassan I, Fiscella K. The time is now: fostering relationship-centered discussions about patients’ social determinants of health. Patient Educ Couns 2019; 102 (4): 810–4.
51. Bajaj SS, Stanford FC. Beyond Tuskegee—vaccine distrust and everyday racism. N Engl J Med 2021; 384 (5): e12.
52. Gottlieb LM, Francis DE, Beck AF. Uses and misuses of patient- and neighborhood-level social determinants of health data. Perm J 2018; 22: 18-078.
53. De Marchis EH, Alderwick H, Gottlieb LM. Do patients want help addressing social risks? J Am Board Fam Med 2020; 33 (2): 170–5.
54. Gruß I, Varga A, Brooks N, Gold R, Banegas MP. Patient interest in receiving assistance with self-reported social risks. J Am Board Fam Med 2021; 34 (5): 914–24.
55. Tong ST, Liaw WR, Kashiri PL, et al. Clinician experiences with screening for social needs in primary care. J Am Board Fam Med 2018; 31 (3): 351–63.
56. Magnan S. Social determinants of health 201 for health care: plan, do, study, act. NAM Perspect 2021. doi: 10.31478/202106c.
57. Garg A, Byhoff E, Wexler MG. Implementation considerations for social determinants of health screening and referral interventions. JAMA Netw Open 2020; 3 (3): e200693.
58. Gold R, Gottlieb L. National data on social risk screening underscore the need for implementation research. JAMA Netw Open 2019; 2 (9): e1911513.
59. Caine K, Hanania R. Patients want granular privacy control over health information in electronic medical records. J Am Med Inform Assoc 2013; 20 (1): 7–15.
60. Meslin EM, Alpert SA, Carroll AE, et al. Giving patients granular control of personal health information: using an ethics points to consider to inform informatics system designers. Int J Med Inform 2013; 82: 1136–43.
61. Branson BM, Handsfield HH, Lampe MA, et al. Revised recommendations for HIV testing of adults, adolescents, and pregnant women in health-care settings. MMWR Recomm Rep 2006; 55 (RR-14): 1–17.
62. Gebrezgi MT, Mauck DE, Sheehan DM, et al. Acceptance of opt-out HIV screening in outpatient settings in the United States: a systematic review and meta-analysis. Public Health Rep 2019; 134 (5): 484–92.
63. Marshall EA, Oates JC, Shoaibi A, et al. A population-based approach for implementing change from opt-out to opt-in research permissions. PLoS One 2017; 12 (4): e0168223.
64. Obermeyer Z, Powers B, Vogeli C, Mullainathan S. Dissecting racial bias in an algorithm used to manage the health of populations. Science 2019; 366 (6464): 447–53.
65. Park J, Saha S, Chee B, Taylor J, Beach MC. Physician use of stigmatizing language in patient medical records. JAMA Netw Open 2021; 4 (7): e2117052.
66. Goddu AP, O’Conor KJ, Lanzkron S, et al. Do words matter? Stigmatizing language and the transmission of bias in the medical record. J Gen Intern Med 2018; 33 (5): 685–91.
67. Himmelstein G, Bates D, Zhou L. Examination of stigmatizing language in the electronic health record. JAMA Netw Open 2022; 5 (1): e2144967.
68. Sun M, Oliwa T, Peek ME, Tung EL. Negative patient descriptors: documenting racial bias in the electronic health record. Health Aff 2022; 41 (2): 203–11.
69. Valdez A. Words matter: labelling, bias and stigma in nursing. J Adv Nurs 2021; 77 (11): e33–5.
70. Alpert AB, Mehringer JE, Orta SJ, et al. Experiences of transgender people reviewing their electronic health records, a qualitative study. J Gen Intern Med 2022; May: 1–8.
71. Batalden M, Batalden P, Margolis P, et al. Coproduction of healthcare service. BMJ Qual Saf 2016; 25 (7): 509–17.
72. Neuwirth EB, Bellows J, Jackson AH, Price PM. How Kaiser Permanente uses video ethnography of patients for quality improvement, such as in shaping better care transitions. Health Aff 2012; 31 (6): 1244–50.
73. Veinot TC, Clarke PJ, Romero DM, et al. Equitable research PRAXIS: a framework for health informatics methods. Yearb Med Inform 2022; 31 (1): 307–16.



