Author manuscript; available in PMC: 2017 Jul 1.
Published in final edited form as: J Empir Res Hum Res Ethics. 2016 Jul;11(3):214–219. doi: 10.1177/1556264616661611

Did I Tell You That? Ethical Issues Related to Using Computational Methods to Discover Non-Disclosed Patient Characteristics

Kenrick D Cato 1, Walter Bockting 1,2, Elaine Larson 1,3
PMCID: PMC4991620  NIHMSID: NIHMS801831  PMID: 27534587

Abstract

Background

The widespread availability of large data sets through warehousing of electronic health records, coupled with increasingly sophisticated information technology and related statistical methods, offers great potential for a variety of applications in health and disease surveillance, developing predictive models, and advancing decision support for clinicians. However, use of such ‘big data’ mining and discovery techniques has also raised ethical issues, such as how to balance privacy and autonomy with the wider public benefits of data sharing. More specifically, electronic data are increasingly being used to identify individual characteristics that can be useful for clinical prediction and management but were not previously disclosed to a clinician. This process, called electronic phenotyping in computer parlance, has a number of ethical implications.

Approach

Using the Belmont Report’s principles of respect for persons, beneficence, and justice as a framework, we examined the ethical issues posed by electronic phenotyping.

Findings

Ethical issues identified include the ability of the patient to consent for the use of their information, the ability to suppress pediatric information, ensuring that the potential benefits justify the risks of harm to patients, and acknowledging that the clinician’s biases or stereotypes, conscious or unintended, may also become a factor in the therapeutic interaction. We illustrate these issues with two vignettes, using the person characteristic of gender minority status (i.e., transgender identity) and the health history characteristic of substance abuse.

Conclusion

Big data mining has the potential to uncover patient characteristics previously obscured which can provide clinicians with beneficial clinical information. Hence, ethical guidelines must be updated to ensure that electronic phenotyping supports the principles of respect for persons, beneficence, and justice.

Keywords: clinical settings, electronic health record, data mining, LGBT

Vignette 1

Patient X is a 50-year-old transgender woman who typically receives her primary care at a community health center for lesbian, gay, bisexual, and transgender (LGBT) individuals. She had an emergency appendectomy at the University Hospital five years earlier, during which she disclosed that she was taking hormones often used by transgender women. This year, Patient X is again hospitalized in the same University Hospital. An electronic decision support tool designed to identify characteristics common to transgender individuals (a transgender phenotyping algorithm) prompts the clinician to ask her questions related to transgender health that he would not otherwise have asked.

Vignette 2

Patient Y is a 21-year-old man seeing his primary care clinician because of ongoing depression and suicidal ideation. When Patient Y was 16 years of age he was treated at an inpatient drug rehabilitation clinic that at the time was not part of the University Hospital Health System, but is now. When the clinician enters the information into the problem list of the electronic health record, an electronic algorithm designed as a decision support tool alerts her to possible drug use by this patient, causing the clinician to ask Patient Y questions that she would not have otherwise asked.

Introduction

The widespread availability of large data sets through warehousing of electronic health records (EHR), coupled with increasingly sophisticated information technology and statistical methods, offers great potential for a variety of applications in health and disease surveillance, developing predictive models, and advancing decision support for clinicians. Use of such ‘big data’ mining techniques has also raised ethical issues about how to balance privacy and autonomy with the wider public benefits of data sharing (White House, 2014). While much of clinical data mining to date has used anonymous or de-identified data, there are ongoing active discussions about the application of electronic phenotyping, a process used to identify individual characteristics that might be useful for clinical prediction, management, and decision making (Herland, Khoshgoftaar, & Wald, 2014). In other words, electronic phenotyping uses advanced statistically based methods to predict characteristics of a patient based on other patients’ de-identified data. Moreover, as Klitzman and colleagues (Klitzman, Appelbaum, & Chung, 2014) have pointed out, ethical issues arise when EHR data such as genetic testing results are available to determine differential access to life insurance.

In computer parlance, a phenotype refers to identifiable patient characteristics such as diagnoses, prescribed medications, and laboratory results that can be used to identify or characterize individuals because they are found more often in individuals with a particular disease or condition than in the general population. Hence, phenotyping offers promise as a surveillance method to identify persons with or at risk for certain conditions. A number of phenotyping applications have already been developed to discover characteristics of patients from their EHRs. For example, Holt and colleagues described methods to search EHRs for laboratory results and clinical diagnoses to classify diabetic patients (Holt, Gunnarsson, Cload, & Ross, 2014). Similarly, other researchers have developed phenotypes to identify patients at risk for suicide (Tran et al., 2014).
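A rule-based phenotype of the kind Holt and colleagues describe can be sketched as a simple filter over structured EHR fields. The sketch below is purely illustrative: the field names, record layout, and the use of ICD-10 diabetes codes with an HbA1c cutoff are assumptions for demonstration, not the published classification algorithm.

```python
# Illustrative sketch of a rule-based electronic phenotype.
# Field names and thresholds are hypothetical, not a validated algorithm.

DIABETES_ICD_PREFIXES = ("E10", "E11")   # ICD-10 code families for diabetes
HBA1C_THRESHOLD = 6.5                    # percent; a common diagnostic cutoff

def matches_diabetes_phenotype(record):
    """Flag a patient record whose diagnoses or labs suggest diabetes."""
    has_dx = any(code.startswith(DIABETES_ICD_PREFIXES)
                 for code in record.get("diagnosis_codes", []))
    has_lab = any(lab["name"] == "HbA1c" and lab["value"] >= HBA1C_THRESHOLD
                  for lab in record.get("labs", []))
    return has_dx or has_lab

record = {
    "diagnosis_codes": ["I10"],                 # hypertension only
    "labs": [{"name": "HbA1c", "value": 7.2}],  # elevated HbA1c
}
print(matches_diabetes_phenotype(record))  # True: the lab rule fires
```

Real phenotyping systems combine many such rules, or replace them with statistical models trained on de-identified records, but the ethical questions discussed here arise either way: the flag is raised whether or not the patient disclosed the condition.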

The aim of this paper is to discuss the ethical issues raised by the promise of discovering characteristics of a patient from their EHR that the patient did not explicitly disclose to a particular clinician. To frame our discussion, we will utilize the ethical principles of respect for persons, beneficence, and justice outlined in the 1979 Belmont Report (National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, 1978). While these principles were originally explicated for the purpose of protecting individuals involved in human subjects research, they are equally applicable to clinical practice.

Respect for Persons

The principle of Respect for Persons raises two related ethical concerns regarding the ability to discover certain characteristics of a patient from their EHR that they chose not to disclose. The first moral obligation associated with this principle requires that a person’s autonomy be acknowledged. This assumes that patients are capable of deliberating about the intersection of their personal goals and clinical choices (Cassell, 2000). Persons in this regard are social, moral, legal, and political entities with rights, to whom obligations are due. In the case of electronic phenotyping, the patient ought to be informed that, even though they might have chosen not to disclose a characteristic (e.g., transgender status, history of drug use or mental illness, or other potentially sensitive issues), software may still uncover it and present that information to their clinician. Therefore, there may be an ethical requirement to refrain from phenotyping if the patient is unaware of it and does not at least implicitly consent to it.

The second moral obligation regarding the principle of Respect for Persons is the requirement to protect patients who are incapable of exercising their own autonomy, whether because of age or changes in mental status. It is imperative that the use of phenotyping take into account a person’s ability to consent to having the technology applied to their EHR. Accomplishing this goal involves an additional level of complexity, because a person’s ability to consent can change over time. For example, clinical information entered into a minor’s record may have to be excluded from phenotyping software unless the adult patient consents to the use of such information once majority status is reached (Currie, 2013).

Unfortunately, the consent process for clinical use of patient records has not kept pace with the potential uses of electronic patient data, and the requisite processes and structures to assure adequate disclosure and consent are not yet in place. Furthermore, the consent process as it stands has been shown to be problematic at best, and clinicians and patients alike have voiced concerns; some clinicians have gone so far as to call the consent process “broken” (Hayden, 2012). For example, research has shown that patients have great difficulty understanding and recounting the research to which they have consented (Brehaut et al., 2012; Ghandour, Yasmine, & El-Kak, 2013; Lee, Lampert, Wilder, & Sowell, 2011; Tamariz, Palacio, Robert, & Marcus, 2013). In other words, clinicians are concerned that the informed consent process leaves research participants and patients uninformed about how their data will be used. Electronic phenotyping can nonetheless be performed with respect for persons, provided these gaps in disclosure and consent are addressed.

Beneficence

While the patient may greatly benefit from the provider’s attention to their identity characteristics, vulnerability for certain health conditions, and related needs, this also raises the ethical concern of potential harm. For example, a patient may not have disclosed their transgender identity or their vulnerability to substance abuse explicitly in order to maintain privacy and/or avoid attention to sensitive issues that the patient perceives as not particularly relevant to the presenting concern. Indeed, previous research among transgender patients has documented experiences of undue attention to their gender identity and genital status to the detriment of getting their immediate health concerns addressed (Sperber, Landers, & Lawrence, 2005).

The principle of Beneficence requires researchers to minimize the risk of harm and to maximize the potential benefits of their research, and mandates that researchers and institutional review boards conduct a careful assessment of the risks of harm and the potential benefits of the research to ensure that the potential benefits justify the risks. Applying Beneficence to the opportunities provided by patient profiling through the analysis of EHR data means that the potential benefits of identifying and alerting health providers to the vulnerabilities, risks, and needs of individual patients must be weighed against the potential harm. For example, since establishing and maintaining a trusting relationship with the patient is of paramount importance, researchers, providers, and policy makers should consider the potential impact of profiling on the provider-patient relationship. This may vary depending on the practice setting and the available services and treatments. For example, Patient X may perceive questions about her transgender identity as intrusive and not relevant to her presenting concern. Simply asking patients routinely about their gender identity (i.e., not only in response to an alert triggered by phenotyping) could achieve the same benefits of disclosure without running the risk of being perceived as intrusive and, as a result, jeopardizing the provider-patient relationship. It is possible that the patient is unaware of the risks involved in failing to share relevant information with a provider. Depending on the health care context, benefits of phenotyping could outweigh the risks of harm. For example, Patient Y’s ongoing depression and suicidal ideation may be related to his ongoing struggle with substance abuse and present a barrier to successful treatment. In that case an alert based on phenotyping of EHR data to initiate substance abuse screening could be lifesaving.
Beneficence therefore means that researchers, clinicians, and policy makers carefully weigh the potential benefits and harm of the additional information that electronic phenotyping affords them.

Justice

With regard to the issue of Justice, “injustice occurs when some benefit to which a person is entitled is denied without good reason or when some burden is imposed unduly” (http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html#xjust). Although it is sometimes appropriate to take into account variations in individuals’ age, competence, or other characteristics when making clinical decisions, it is also possible that stereotypes, preconceived notions and biases of the care provider can result in decisions that are unfair, place certain groups at a therapeutic disadvantage, and/or interfere with the clinician-patient interaction. As discussed in the Institute of Medicine Report, “Unequal Treatment” (Smedley, Stith, & Nelson, 2002), health care disparities are caused by a number of factors, one of which is the clinician-patient encounter. The Report summarizes research demonstrating that clinicians’ decisions about treatment modalities and even what options to offer patients are influenced by a variety of sociologic and demographic characteristics, and points out that bias against certain groups or beliefs about the health behaviors or conditions in these groups can lead to discriminatory patterns of care and mistrust about the clinician’s advice (Grant et al., 2010; Kosenko, Rintamaki, Raney, & Maness, 2013; Schwartz, Woloshin, & Welch, 1999; van Ryn, 2002; van Ryn & Burke, 2000).

While it may be rare indeed for a clinician to state or perhaps even recognize his/her own prejudices, it is precisely because they are subtle, often unconscious, and difficult to measure that biases can enter the clinical encounter in the form of nonverbal behavior, avoidance, or aversion. It is possible that patients choose not to disclose certain information because they sense such subtle bias or stereotyping from the clinician. In fact, patient perceptions of communication with their provider have been shown to be an important predictor of receipt of optimal care among patients with lung cancer (Cykert et al., 2010; Dalton et al., 2014). The Belmont principle of Justice means that such unfair treatment based on pre-conceptions of the clinician should be prevented as much as possible. Disclosure of information without the patient’s explicit agreement or knowledge could result in more comprehensive treatment, but there exists the concomitant risk that the clinician’s biases or stereotypes, conscious or unintended, will also become a factor in the therapeutic interaction. Hence, Justice means that clinicians must carefully and explicitly examine their own biases and prejudices to assure that data mining is used solely for the benefit of making useful and informed decisions for the patient’s care.

Discussion

To improve outcomes and increase safety, there has been rapid development in creating electronic phenotypes of patient characteristics. With the increasing availability of electronic patient data, the pace and breadth of phenotyping will increase (Hripcsak & Albers, 2012). Furthermore, the ethical considerations of research and clinical practice are closely related. Undoubtedly, the IRB-approved research to create the electronic phenotypes for the decision support in the two vignettes would have adhered to the privacy and confidentiality standards required by the Health Insurance Portability and Accountability Act (OP Brief, 2005) and the Common Rule (Code of Federal Regulations; http://www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html), utilizing de-identified electronic data. The ethical concern that we are addressing is the ability of predictive data analysis to infer characteristics of a patient in a clinical setting that the patient did not disclose to their clinician. While some have argued that a new “big data ethical code” is required (Richards & King, 2014), there are a number of existing options for increasing transparency to consider.

First, even though data mining is a rapidly evolving field in which it is hard to forecast novel uses of data, transparency of data usage is a necessity. Transparency about clinical data use in big data analysis is required by the ethical principle of respect for persons. Patients and providers should be informed of the ways in which electronic data are likely to be used, so that informed decisions can be made. Furthermore, it is often assumed that the algorithms that power big data analyses are valid and remain stable over time, but that is a false assumption. For example, Google Flu Trends (GFT), a big data flu prediction application, was a remarkable predictor of global flu patterns after its launch in 2008. However, in 2009 GFT underpredicted flu occurrences (Cook, Conrad, Fowlkes, & Mohebbi, 2011), and in 2013 it overpredicted flu rates (Butler, 2013). As the highly publicized errors in GFT demonstrate, transparency about big data algorithms allows external data scientists to identify possible errors (Lazer, Kennedy, King, & Vespignani, 2014).

Second, phenotyping should be built on a technical foundation that protects patient preferences and ensures consent for the use of their data for clearly defined purposes and in clearly delineated ways. For example, the phenotype in Vignette 2 used data from Patient Y’s chart from when he was a minor. It is important to have programming in place to prevent data from those who cannot consent (e.g., minors, individuals with altered mental status) from being used in phenotyping software.
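Such a safeguard can be sketched as a filter applied before any record reaches a phenotyping algorithm. The record layout, field names, and the age-of-majority threshold below are hypothetical illustrations of the idea, not an implemented system.

```python
# Illustrative sketch: exclude entries recorded while the patient was a
# minor, unless the now-adult patient has consented to their use.
from datetime import date

AGE_OF_MAJORITY = 18  # jurisdiction-dependent assumption

def phenotyping_input(entries, birth_date, adult_consented=False):
    """Return only the entries eligible for phenotyping.

    entries: list of dicts, each with an 'entered_on' date.
    Entries made while the patient was a minor are dropped unless the
    adult patient has explicitly consented to their use.
    """
    def age_at(d):
        # Whole years between birth_date and d.
        return (d.year - birth_date.year
                - ((d.month, d.day) < (birth_date.month, birth_date.day)))
    return [e for e in entries
            if age_at(e["entered_on"]) >= AGE_OF_MAJORITY or adult_consented]

entries = [
    {"entered_on": date(2011, 5, 1), "note": "inpatient rehab"},  # age 16
    {"entered_on": date(2016, 3, 1), "note": "primary care"},     # age 21
]
eligible = phenotyping_input(entries, birth_date=date(1995, 1, 15))
print(len(eligible))  # 1: the minor-era entry is excluded
```

In Vignette 2, a filter of this kind would have withheld Patient Y’s rehabilitation record from the alerting algorithm until he consented to its use as an adult.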

Third, methods like dynamic flexible consent and portable legal consent (PLC; http://weconsent.us/informed-consent/) may provide a solution to some of the aforementioned ethical challenges. Both methods rest on a workflow and information technology platform that allows patients to: 1) consent to new uses of their data with ease, or alter their consent preferences in real time as their circumstances change; 2) change contact information and personal preferences and find out how their data have been used; and 3) configure preferences regarding the information they receive, how often they receive it, and in what format (e.g., text messages, email, letters) (Kaye et al., 2014). The revolutions in genetic testing and genome sequencing, and the burgeoning of biobanks and data marts, allow researchers access to large volumes of data to ask new questions that might be beyond the aims of the original project for which a research participant gave consent. The current consent processes and regulations, however, have not kept pace with the availability of data, potentially leading to distrust on the part of the research participant and uncertainty about how to proceed on the part of the researcher (Asghar & Russello, 2012; Erdmann, 2013; Rahm, Wrenn, Carroll, & Feigelson, 2013). The dynamic flexible consent process makes it possible for participants to electronically control their consent over time and to obtain information about how their data are used. This more flexible and ongoing consent process has been recommended to empower patients and enhance participant-researcher trust and communication (Erlich et al., 2014; Kondylakis, Flouris, Fundulaki, & Tsiknakis; Teare, Morrison, Whitley, & Kaye, 2015; Williams et al., 2015).
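The core mechanism of dynamic consent, patient-editable preferences that are checked at the moment of use and logged for later review, can be sketched in a few lines. The class, purpose names, and audit scheme below are hypothetical simplifications of the platforms described by Kaye et al. (2014), not any actual system.

```python
# Hypothetical sketch of a dynamic-consent record: the patient can change
# preferences over time, and every data use is checked and logged.
from datetime import datetime

class ConsentRecord:
    def __init__(self):
        self.preferences = {}  # purpose -> allowed (bool)
        self.audit_log = []    # (timestamp, event) pairs the patient can review

    def set_preference(self, purpose, allowed):
        """Patient updates consent for a purpose in real time."""
        self.preferences[purpose] = allowed
        self.audit_log.append((datetime.now(), f"preference: {purpose}={allowed}"))

    def request_use(self, purpose):
        """A data use is permitted only if explicitly consented to."""
        allowed = self.preferences.get(purpose, False)  # default: deny
        self.audit_log.append((datetime.now(), f"use: {purpose} -> {allowed}"))
        return allowed

consent = ConsentRecord()
consent.set_preference("phenotyping", True)
print(consent.request_use("phenotyping"))       # True
consent.set_preference("phenotyping", False)    # circumstances change
print(consent.request_use("phenotyping"))       # False
print(consent.request_use("insurance_rating"))  # False: never consented
```

The deny-by-default check and the audit trail are the essential design choices: a new, unanticipated use of the data fails until the patient affirmatively consents, and the patient can always see how their data have been used.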

Similarly, the Portable Legal Consent for Common Genomics Research allows users to give their data to databases that remove specific patient details, to transport their consent between institutions, and to attach their consent preferences to discrete pieces of data (Hayden, 2012). For example, in Vignette 1, Patient X’s willingness to consent to disclosure of her transgender status may change over time and depend on the practice setting. Dynamic consent would also allow Patient X to change her preferred name and pronoun, key elements in the provision of culturally competent LGBT health care (Deutsch et al., 2013). Researchers are beginning to test digital and dynamic consent processes (Kuehn, 2013; Thiel et al., 2014), and a toolkit for developing such patient-centered consent forms is available at http://sagebase.org/e-consent/.

Such technologies may conflict with a clinician’s desire to have a complete medical chart and full access to patient’s health information. Further, the patient may not appreciate the potential consequences of choosing not to disclose certain characteristics relevant to their health. This conundrum raises the issue of whether patients have the right to exclude clinically relevant data from their medical record when doing so could compromise their health care and survival. Research has shown that individuals are more likely to withhold information when they have little say in how their medical records will be used (Agaku, Adisa, Ayo-Yusuf, & Connolly, 2014). This issue requires further discussion.

Finally, another approach to respecting individuals’ ethical rights and increasing transparency is community consultation. This method was used effectively by the Secretary’s Advisory Committee on Testing in the 1990s when deciding how to approach the ethical issues of mapping the genome (Koenig, 2014).

Regardless, data mining techniques like phenotyping of electronic patient data to discover patient characteristics are here to stay. It is incumbent upon us to address the resulting ethical challenges. We strongly recommend that researchers and ethical review boards ‘update’ their modes of obtaining informed consent and consider processes such as those described above which include more active participation and decision-making on the part of research participants, patients, and community members. This will facilitate the transition from passive and ineffective informed consent to dynamic communication, enhance trust, and fulfill the spirit of the principles of respect, beneficence, and justice.

Acknowledgments

Financial support. This study was funded in part by a grant from The Agency for Healthcare Research and Quality, 1R01HS022961.

Footnotes

Potential conflicts of interest. All authors report no conflicts of interest relevant to this article.

Contributor Information

Walter Bockting, Email: wb2273@cumc.columbia.edu.

Elaine Larson, Email: ell23@cumc.columbia.edu.

References

  1. Agaku IT, Adisa AO, Ayo-Yusuf OA, Connolly GN. Concern about security and privacy, and perceived control over collection and use of health information are related to withholding of health information from healthcare providers. Journal of the American Medical Informatics Association. 2014;21(2):374–378. doi: 10.1136/amiajnl-2013-002079. [DOI] [PMC free article] [PubMed] [Google Scholar]
  2. Asghar MR, Russello G. Flexible and Dynamic Consent-Capturing. In: Camenisch J, Kesdogan D, editors. Open Problems in Network Security: IFIP WG 11.4 International Workshop, iNetSec 2011, Lucerne, Switzerland, June 9, 2011, Revised Selected Papers. Berlin, Heidelberg: Springer Berlin Heidelberg; 2012. pp. 119–131. [Google Scholar]
  3. Beskow LM, Friedman JY, Hardy NC, Lin L, Weinfurt KP. Developing a simplified consent form for biobanking. PloS One. 2010;5(10):e13302. doi: 10.1371/journal.pone.0013302. [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Brehaut JC, Carroll K, Elwyn G, Saginur R, Kimmelman J, Shojania K, Fergusson D. Informed consent documents do not encourage good-quality decision making. Journal of Clinical Epidemiology. 2012;65(7):708–724. doi: 10.1016/j.jclinepi.2012.01.004. [DOI] [PubMed] [Google Scholar]
  5. Brief OP. Summary of the HIPAA Privacy Rule. Washington, DC: United States Department of Health and Human Services; 2005. [Google Scholar]
  6. Butler D. When Google got flu wrong. Nature. 2013;494(7436):155. doi: 10.1038/494155a. [DOI] [PubMed] [Google Scholar]
  7. Cassell EJ. The principles of the Belmont report revisited: How have respect for persons, beneficence, and justice been applied to clinical medicine? Hastings Center Report. 2000;30(4):12–21. [PubMed] [Google Scholar]
  8. Currie J. “Big Data” Versus “Big Brother”: On the Appropriate Use of Large-scale Data Collections in Pediatrics. Pediatrics. 2013;131(Supplement 2):S127–S132. doi: 10.1542/peds.2013-0252c. [DOI] [PMC free article] [PubMed] [Google Scholar]
  9. Code of Federal Regulations. [accessed 15 June 2016]; HHS.gov. http://www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html.
  10. Cook S, Conrad C, Fowlkes AL, Mohebbi MH. Assessing Google flu trends performance in the United States during the 2009 influenza virus A (H1N1) pandemic. PloS one. 2011;6(8):e23610. doi: 10.1371/journal.pone.0023610. [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Cykert S, Dilworth-Anderson P, Monroe MH, Walker P, McGuire FR, Corbie-Smith G, Bunton AJ. Factors associated with decisions to undergo surgery among patients with newly diagnosed early-stage lung cancer. JAMA. 2010;303(23):2368–2376. doi: 10.1001/jama.2010.793. [DOI] [PMC free article] [PubMed] [Google Scholar]
  12. Dalton AF, Bunton AJ, Cykert S, Corbie-Smith G, Dilworth-Anderson P, McGuire FR, Edwards LJ. Patient characteristics associated with favorable perceptions of patient-provider communication in early-stage lung cancer treatment. Journal of Health Communication. 2014;19(5):532–544. doi: 10.1080/10810730.2013.821550. [DOI] [PubMed] [Google Scholar]
  13. Deutsch MB, Green J, Keatley J, Mayer G, Hastings J, Hall AM, Cody MK. Electronic medical records and the transgender patient: recommendations from the World Professional Association for Transgender Health EMR Working Group. Journal of the American Medical Informatics Association. 2013 doi: 10.1136/amiajnl-2012-001472. amiajnl-2012-001472. [DOI] [PMC free article] [PubMed] [Google Scholar]
  14. Erdmann J. As personal genomes join big data will privacy and access shrink? Chem Biol. 2013;20(1):1–2. doi: 10.1016/j.chembiol.2013.01.008. [DOI] [PubMed] [Google Scholar]
  15. Erlich Y, Williams JB, Glazer D, Yocum K, Farahany N, Olson M, Kain RC. Redefining genomic privacy: trust and empowerment. PLoS Biol. 2014;12(11):e1001983. doi: 10.1371/journal.pbio.1001983. [DOI] [PMC free article] [PubMed] [Google Scholar]
  16. Ghandour L, Yasmine R, El-Kak F. Giving Consent without Getting Informed: A Cross-Cultural Issue in Research Ethics. Journal of Empirical Research on Human Research Ethics. 2013;8(3):12–21. doi: 10.1525/jer.2013.8.3.12. [DOI] [PubMed] [Google Scholar]
  17. Grant JM, Mottet LA, Tanis J, Herman J, Harrison J, Keisling M. National Center for Transgender Equality and National Gay and Lesbian Task Force. Washington, DC: 2010. National Transgender Discrimination Survey Report on health and health care; pp. 1–23. [Google Scholar]
  18. Hayden EC. A broken Contract. Nature (London) 2012;486(7403):312. doi: 10.1038/486312a. [DOI] [PubMed] [Google Scholar]
  19. Hayden EC. Open-data project aims to ease the way for genomic research. Nature. 2012 [Google Scholar]
  20. Herland M, Khoshgoftaar TM, Wald R. A review of data mining using big data in health informatics. Journal of Big Data. 2014;1(1):2. [Google Scholar]
  21. Holt TA, Gunnarsson CL, Cload PA, Ross SD. Identification of undiagnosed diabetes and quality of diabetes care in the United States: cross-sectional study of 11.5 million primary care electronic records. Canadian Medical Association Open Access Journal. 2014;2(4):E248–E255. doi: 10.9778/cmajo.20130095. [DOI] [PMC free article] [PubMed] [Google Scholar]
  22. White House. Big Data: Seizing opportunities, preserving values. Washington, DC: Executive Office of the President; 2014. [Google Scholar]
  23. Hripcsak G, Albers DJ. Next-generation phenotyping of electronic health records. Journal of the American Medical Informatics Association. 2012 doi: 10.1136/amiajnl-2012-001145. amiajnl-2012-001145. [DOI] [PMC free article] [PubMed] [Google Scholar]
  24. Kaye J, Whitley EA, Lund D, Morrison M, Teare H, Melham K. Dynamic consent: a patient interface for twenty-first century research networks. European Journal Human Genetics. 2014 doi: 10.1038/ejhg.2014.71. [DOI] [PMC free article] [PubMed] [Google Scholar]
  25. Klitzman R, Appelbaum PS, Chung WK. Should Life Insurers Have Access to Genetic Test Results? The Journal of the American Medical Association. 2014;312(18):1855–1856. doi: 10.1001/jama.2014.13301. [DOI] [PMC free article] [PubMed] [Google Scholar]
  26. Koenig BA. Have we asked too much of consent? Hastings Center Report. 2014;44(4):33–34. doi: 10.1002/hast.329. [DOI] [PMC free article] [PubMed] [Google Scholar]
  27. Kondylakis H, Flouris G, Fundulaki I, Tsiknakis M. Flexible Access to Patient Data through e-Consent [Google Scholar]
  28. Kosenko K, Rintamaki L, Raney S, Maness K. Transgender patient perceptions of stigma in health care contexts. Medical Care. 2013;51(9):819–822. doi: 10.1097/MLR.0b013e31829fa90d. [DOI] [PubMed] [Google Scholar]
  29. Kuehn BM. Groups experiment with digital tools for patient consent. JAMA. 2013;310(7):678–680. doi: 10.1001/jama.2013.194643. [DOI] [PubMed] [Google Scholar]
  30. Lazer D, Kennedy R, King G, Vespignani A. The parable of Google Flu: traps in big data analysis. Science. 2014 Mar 14;343 doi: 10.1126/science.1248506. [DOI] [PubMed] [Google Scholar]
  31. Lee R, Lampert S, Wilder L, Sowell AL. Subjects agree to participate in environmental health studies without fully comprehending the associated risk. International Journal of Environmental Research and Public Health. 2011;8(3):830–841. doi: 10.3390/ijerph8030830. [DOI] [PMC free article] [PubMed] [Google Scholar]
  32. Marsolo K, Corsmo J, Barnes MG, Pollick C, Chalfin J, Nix J, Ganta R. Challenges in creating an opt-in biobank with a registrar-based consent process and a commercial EHR. Journal of the American Medical Informatics Association. 2012;19(6):1115–1118. doi: 10.1136/amiajnl-2012-000960. [DOI] [PMC free article] [PubMed] [Google Scholar]
  33. Rahm AK, Wrenn M, Carroll NM, Feigelson HS. Biobanking for research: a survey of patient population attitudes and understanding. Journal of community genetics. 2013;4(4):445–450. doi: 10.1007/s12687-013-0146-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  34. Richards NM, King JH. Big data ethics. Wake Forest L. Rev. 2014;49:393. [Google Scholar]
  35. National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, Bethesda, MD. The Belmont report: Ethical principles and guidelines for the protection of human subjects of research. ERIC Clearinghouse; 1978. [PubMed] [Google Scholar]
  36. Schwartz LM, Woloshin S, Welch HG. Misunderstandings about the effects of race and sex on physicians' referrals for cardiac catheterization. New England Journal Medicine. 1999;341(4):279–283. doi: 10.1056/NEJM199907223410411. discussion 286-277. [DOI] [PubMed] [Google Scholar]
  37. Smedley B, Stith A, Nelson A. Unequal treatment: Confronting racial and ethnic disparities in health care. Washington DC: National Academy Press; 2002. [PubMed] [Google Scholar]
  38. Sperber J, Landers S, Lawrence S. Access to health care for transgendered persons: Results of a needs assessment in Boston. International Journal of Transgenderism. 2005;8(2–3):75–91. [Google Scholar]
  39. Tamariz L, Palacio A, Robert M, Marcus EN. Improving the informed consent process for research subjects with low literacy: a systematic review. Journal of General Internal Medicine. 2013;28(1):121–126. doi: 10.1007/s11606-012-2133-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
  40. Teare HJ, Morrison M, Whitley EA, Kaye J. Towards ‘Engagement 2.0’: Insights from a study of dynamic consent with biobank participants. Digital Health. 2015;1 doi: 10.1177/2055207615605644. 2055207615605644. [DOI] [PMC free article] [PubMed] [Google Scholar]
  41. Thiel DB, Platt J, Platt T, King SB, Fisher N, Shelton R, Kardia SL. Testing an online, dynamic consent portal for large population biobank research. Public Health Genomics. 2014;18(1):26–39. doi: 10.1159/000366128. [DOI] [PMC free article] [PubMed] [Google Scholar]
  42. Tran T, Luo W, Phung D, Harvey R, Berk M, Kennedy RL, Venkatesh S. Risk stratification using data from electronic medical records better predicts suicide risks than clinician assessments. BMC Psychiatry. 2014;14(1):76. doi: 10.1186/1471-244X-14-76. [DOI] [PMC free article] [PubMed] [Google Scholar]
  43. van Ryn M. Research on the provider contribution to race/ethnicity disparities in medical care. Medical Care. 2002;40(1 Suppl):I140–I151. doi: 10.1097/00005650-200201001-00015. [DOI] [PubMed] [Google Scholar]
  44. van Ryn M, Burke J. The effect of patient race and socio-economic status on physicians' perceptions of patients. Social Science and Medicine. 2000;50(6):813–828. doi: 10.1016/s0277-9536(99)00338-x. [DOI] [PubMed] [Google Scholar]
  45. Whitley EA, Kanellopoulou N, Kaye J. Consent and research governance in biobanks: evidence from focus groups with medical researchers. Public Health Genomics. 2012;15(5):232–242. doi: 10.1159/000336544. [DOI] [PubMed] [Google Scholar]
  46. Williams H, Spencer K, Sanders C, Lund D, Whitley EA, Kaye J, Dixon WG. Dynamic consent: a possible solution to improve patient confidence and trust in how electronic patient records are used in medical research. JMIR medical informatics. 2015;3(1) doi: 10.2196/medinform.3525. [DOI] [PMC free article] [PubMed] [Google Scholar]
