Abstract
Understanding how to govern emerging distributed research networks is essential to their success. Distributed research networks aggregate patient medical data from many institutions while leaving the data within each local provider’s security system. Although much is known about patients’ views on the secondary use of medical data for research, little is known about their views on the governance of research networks. We conducted six focus groups with patients from three medical centers across the U.S. to understand their perspectives on the privacy, consent, and ethical concerns of sharing their data as part of research networks. Participants positively endorsed sharing their health data with these networks, believing that doing so could advance healthcare knowledge. However, patients expressed several concerns regarding security and broader ethical issues such as commercialism, public benefit, and social responsibility. We suggest that network governance guidelines move beyond strict technical requirements and address wider socio-ethical concerns by fully including patients in governance processes.
Introduction
Distributed research networks (DRNs) are currently proliferating across the United States. These innovative structures harness electronic health records (EHRs) and other technologies for large-scale health research. DRNs allow researchers to aggregate patient data for research while retaining local control: access is coordinated through a central hub that distributes data for research purposes according to user requests1. To achieve acceptance and use, DRNs must adhere to various state and federal laws as well as address emergent ethical and social issues raised by their innovation. Federal policies such as the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and the Common Rule govern how research data can be accessed and used by clinicians and researchers. State laws also apply but vary in scope; many are relevant to the individual institutions that comprise DRNs, yet state laws are not directed explicitly toward research networks themselves. The design and implementation of research networks such as DRNs therefore require navigating a complex patchwork of federal and state laws, institutional policies, and numerous technical, regulatory, and ethical issues that together constitute the landscape of technoscientific governance. Effective and transparent governance policies are needed to ensure public protection, build public trust, and encourage patients to allow their health information to be aggregated and shared in these networks.
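To make the hub-and-spoke idea concrete, the sketch below is a simplified, hypothetical illustration (not SCANNER’s actual implementation; all class and field names are invented) of one common DRN pattern: a coordinating hub distributes a query to local sites, patient-level records stay behind each site’s own security controls, and only aggregate results are returned and pooled.

```python
# Minimal sketch of a hub-and-spoke distributed query. Patient-level records
# never leave each site; only aggregate counts are returned to the hub.
# Hypothetical names throughout; for illustration only.
from dataclasses import dataclass


@dataclass
class PatientRecord:
    patient_id: str          # identifier meaningful only within the local site
    diagnosis: str
    ldl_cholesterol: float


class LocalSite:
    """A participating institution; holds its own records behind its own controls."""

    def __init__(self, name: str, records: list[PatientRecord]):
        self.name = name
        self._records = records  # kept local: never transmitted to the hub

    def run_count_query(self, diagnosis: str, ldl_threshold: float) -> dict:
        """Execute the query locally and return only an aggregate result."""
        count = sum(
            1 for r in self._records
            if r.diagnosis == diagnosis and r.ldl_cholesterol >= ldl_threshold
        )
        return {"site": self.name, "eligible_patients": count}


class CoordinatingHub:
    """Central hub: distributes a query to each site and pools the aggregates."""

    def __init__(self, sites: list[LocalSite]):
        self.sites = sites

    def distribute_query(self, diagnosis: str, ldl_threshold: float) -> dict:
        site_results = [s.run_count_query(diagnosis, ldl_threshold) for s in self.sites]
        total = sum(r["eligible_patients"] for r in site_results)
        return {"per_site": site_results, "network_total": total}


if __name__ == "__main__":
    site_a = LocalSite("Site A", [PatientRecord("a1", "diabetes", 145.0),
                                  PatientRecord("a2", "diabetes", 110.0)])
    site_b = LocalSite("Site B", [PatientRecord("b1", "diabetes", 160.0)])
    hub = CoordinatingHub([site_a, site_b])
    print(hub.distribute_query(diagnosis="diabetes", ldl_threshold=130.0))
```

In this pattern, governance questions attach to each step: who may submit a query to the hub, what each site agrees to return, and how the pooled results may be used.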
The term governance signals that the usual distinctions among science, technology, and politics as separate, non-overlapping categories are not clear boundaries but are instead co-produced through various co-relations. That is, the development and administrative control of technological systems are not narrowly matters of government and laws; they also include the activities of a wide range of actors occupying various social worlds, including industry, scientific organizations, public groups, consumers, and markets2. While many actors are involved with and affected by DRNs, patients and members of the public are not involved in the governance of these networks, and only rarely are their perspectives taken into consideration in design, implementation, and governance decisions. Yet patients are primary stakeholders: the “data” comprising DRNs are built from, and designed to analyze, patient health information. Even so, DRN governance practices to date have not included patient perspectives.
This paper begins with the design and governance of one DRN: the Scalable National Network for Effectiveness Research (SCANNER), a multi-site research study led by UC San Diego3. SCANNER’s goal is to develop and evaluate an electronic health information infrastructure that supports comparative effectiveness research (CER) addressing highly prevalent health conditions such as cardiovascular disease and diabetes. The larger SCANNER research study includes fine-tuning the design and governance of data-sharing policies to allow different levels of sharing depending on factors such as institutional differences in data-sharing policies, IRB guidelines, patient consent practices, and other variables. Our goal was to understand the ways designers and institutional actors are themselves part of a wide range of de-centralized networks and shifting assemblages of power that together constitute the field of technoscientific governance of DRNs.
SCANNER included the development of a privacy and security framework informed by applicable federal privacy and security laws and state health information exchange (HIE) guidelines. A systematic policy review was conducted, and a 7-member panel of policy and technical experts was convened to review the framework, provide expert input, and contribute to its design4. A second component of SCANNER, and the basis of our analysis, was a set of patient focus groups conducted within the project sites to understand patient perspectives on the use of their health information in data sharing in general and in DRNs in particular, so that these actors and their perspectives could be integrated into design and use practices. The focus group objective was to capture patient attitudes by staging discussion around three expanding case scenarios, beginning with EHRs in general and gradually moving to a concluding scenario of DRNs. The following three research questions together, we argue, inform how patient perspectives might be incorporated into governance structures and practices of DRNs:
What are the contours of patient knowledge about the use of health information in research and research networks?
What are patient preferences and concerns regarding the use of their health information in research networks? Specifically, what issues, concerns, and institutional actors and groups are considered in their opinions of their own health information in research networks?
Finally, what are patients’ perspectives on how to design informed consent practices and data access governance that address their concerns and preferences?
Background
Theories of governance in science and technology are largely concerned with the interrelationships among science, technology, and politics. Governance theories place a special emphasis on democratic engagement, the relationship between “scientific” and wider social concerns, and the resolution of political conflict and controversy, raising important questions about the process of scientific knowledge production, its interaction with the public realm, and policies regarding its use2. Governance theories broadly understand governance as a set of multi-dimensional power relations, proposing an interplay of expert and public social actors, and a mix of top-down, horizontal, and bottom-up policy initiatives2, 5. Governance policies, then, are applied specifically to clusters of technoscience and include a wide range of activities including but not limited to governance of clinical decision support6, governance of personal health records and clinical data sharing for Health Information Organizations (HIOs)7 and the Nationwide Health Information Network (NwHIN)8, as well as research networks themselves.
Frameworks for Governing Patient Data
Research governance is defined as a “system of administration and supervision through which research is managed, participants and staff are protected, and accountability is assured”9. A technical report on standards for DRNs by the Patient Centered Outcomes Research Institute (PCORI) recommends governance principles that include auditing of the users and uses of the data through a data access committee, network and data use agreements, and setting up governing bodies that include stakeholders1. Committees and workgroups should be created for critical network domains to assure collaboration with network participants and stakeholders, which include patients and members of the public. While the PCORI principles cover governance policy for comparative effectiveness research (CER), further policies are still needed to ensure legal, ethical, and socially responsible conduct of research networks10.
Patient Perspectives on Health Data Sharing
While little is known about patients’ perspectives on the governance of research networks, much is known about how patients view the use of their health information, including biomaterials, in research. This extant literature informs which patient concerns to include in research on, and practices around, the governance of DRNs. First, there is low awareness among the general public regarding the current use of and future plans for integrating patient health information into secondary research. As a result, patient education is needed about research and information-sharing practices11–14. Second, privacy, both for oneself and for one’s immediate and extended (future) family, is the most important concern raised by patients when asked about sharing their health data11, 15–17. Importantly, while the literature on medical research using health records suggests that privacy should be assured by researchers in order to gain patient trust, it remains unclear whether patients’ views of privacy are directly linked with their actual decisions to include their health information in medical research16, 18. Theoretical sharing preferences often do not match actual sharing behavior19. For patients, concerns over privacy are tightly linked to concerns over security, and some studies have suggested these concerns may be a barrier to consumer acceptance of both electronic health record research and electronic health record sharing20–22.
Third, when patients express favorable opinions about sharing of health information, they overwhelmingly do so in the context that societal gain might emerge from their participation15. As a result of these three areas of patient concern, a central question for the governance of research is how to balance individual privacy with social good12, 23–24. Fourth, studies show that a large majority of patients would like control over how their health records are used. While they may be willing to share their data for clinical care purposes, a majority want to be able to at least positively authorize the use of their records for research13, 17, 19, 25–30. This finding is consistent with research on consumer opinions of biobanks and genetic research: the majority of patients want to be able to consent at least once to having their data shared; however, preferences for specific, granular consent options often differ16, 31–32. Researchers have thus suggested that flexible consent models be built into electronic medical records13–14, 33. Finally, meaningful consent has emerged as an important consideration when future research purposes, as well as risks and benefits, are not known.
In all, addressing issues of patient education, privacy and security, concerns regarding social good (or what researchers define as beneficence and justice), and consent practices as well as consent limitations is critical not only to garnering public acceptance of and participation in DRNs, but also to establishing governance policies and practices that incorporate patients as stakeholders. Building on this literature, a deeper understanding of patient knowledge and preferences regarding the sharing of data in DRNs is essential to ensuring a patient-centered governance structure.
Methods
Researchers at San Francisco State University (SFSU) designed a qualitative component to elicit in-depth patient views on data networks, sharing of data for research, and perspectives on consent and access options. A semi-structured focus group guide was designed around three case scenarios that gradually moved the group discussion from perspectives on the use of data in clinical research to sharing across research sites to sharing in DRNs.
Recruitment and Sample
Focus groups were conducted at the three SCANNER project sites: Partners Healthcare in Massachusetts (two focus groups, 12 participants total), the San Diego Veterans Healthcare System (two focus groups, 14 participants total), and the University of California San Diego Medical Center (two focus groups, 10 participants total). Each focus group lasted between 90 and 120 minutes. Recruitment was coordinated by SCANNER site leaders in collaboration with physicians. Efforts were made to recruit a diverse group of patients using a combination of relationship-building and networking strategies. Eligibility criteria included: diagnosis of diabetes or cardiovascular disease; not in an inpatient/acute care, rehabilitation, or skilled nursing facility at the time of the focus group; 18 years of age or older; and able to converse in English.
Interested participants were given an information sheet about the project. Once volunteers were identified and agreed to participate, the SFSU research team sent a confirmation letter and scheduled a time for the focus group. Focus groups were conducted at the SCANNER sites in accessible and private conference rooms separate from any clinical encounter sites or physician office locations. Before the focus groups were conducted, the facilitator explained focus group procedures, including the importance of protecting the confidentiality of the group discussion, and asked the respondents to sign the informed consent forms. In order to ensure confidentiality, participants were assigned a participant ID # and all names were erased from the transcripts. IRB approval was obtained at all sites.
Table 1 summarizes the demographics of the 36 focus group participants across all three sites. The high percentage of men reflects the two focus groups conducted at the San Diego VA; in Boston, all but one of the participants were men. The majority of participants identified as white (69.4%) and were college educated (88.9%). Twenty-two percent of participants made less than $40,000 per year, while 50% made over $80,000 per year.
Table 1. Demographics of the 36 focus group participants across all three sites.

| Variable | n (%) | Variable | n (%) |
|---|---|---|---|
| Gender: Women | 7 (19.4%) | Age: Average | 63.4 years |
| Men | 29 (80.6%) | Range | 29–81 years |
| Ethnicity: White/Caucasian | 25 (69.4%) | Income: < $20,000 | 3 (8.3%) |
| Hispanic/Latino(a) | 4 (11.1%) | $20,000–$40,000 | 5 (13.9%) |
| Black | 3 (8.3%) | $40,000–$80,000 | 10 (27.8%) |
| Asian | 3 (8.3%) | > $80,000 | 18 (50.0%) |
| Native American | 1 (2.8%) | | |
| Diagnoses (not mutually exclusive): Diabetes | 21 (58.3%) | Education level: Did not graduate high school | 1 (2.8%) |
| Hypertension | 26 (72.2%) | High school graduate | 2 (5.6%) |
| Heart Disease | 15 (41.7%) | Some college or college graduate | 23 (63.9%) |
| | | Graduate degree | 9 (25.0%) |
| | | Declined to answer | 1 (2.8%) |
Focus Group Design and Analysis
Two researchers conducted the focus groups, which involved a structured set of scenarios designed to generate conversation around known themes from the literature, such as patient preferences for data sharing, informed consent, access to data and results, and organizational ethics. Three progressive scenarios were posed. Scenario 1 asked participants how they would feel about sharing their de-identified information for a study involving data from only their provider’s practice. In Scenario 2, de-identified data would be used in a research study involving patient data from several providers/hospitals. For Scenario 3, participants were asked their thoughts about sharing their de-identified health information in an electronic research network involving data from many patients across many institutions. Research questions were open-ended and repeated for each scenario. The three questions were: (1) How do you feel about participating in a study where your health information will be used in this way? (2) Do you have any questions or concerns about your health information being used in this way? (3) Do you think there might be any advantages to using your health information in this way?
The six focus groups were analyzed inductively following the principles of grounded theory methodology33–34. Two researchers independently coded the focus group transcripts to promote consistency of interpretation. Transcripts were coded for participants’ attitudes, opinions, and preferences within each focus group topic. We then looked for patterns that crosscut main topics, such as participants’ specific ethical and security concerns. Once the two researchers reached agreement on the interpretive codes and their operational meaning, codes were organized into conceptual memos, which served as the thematic basis for the findings in this paper.
Results
Focus group participants positively endorsed sharing their data in research networks. This perspective was driven by a belief that large-scale research could be used for public benefit by improving healthcare practices and the health of communities. Such endorsement, however, was accompanied by participants’ concerns over electronic security and a range of ethical and social concerns that increased from Scenario 1 to Scenario 3. In Scenario 1, for example, participants had almost no reservations about sharing their de-identified data as long as they were assured it would remain anonymous (even after explanations by the moderators, participants continued to conflate the terms “de-identified” and “anonymous”). Security concerns were minimal, based on the perception that when data stay within a single provider’s database, the risk of breaches is low. In this scenario, participants mentioned only minor ethical concerns, such as ensuring their family’s health records would not be involved without their consent.
As we moved to Scenario 2, a research study involving patient data across a few providers/hospitals, participants began voicing concerns around security, privacy, and ethics. The idea of transmitting data electronically across institutions led to the perception of increased potential for breaches and for misuse or mishandling of health records as additional staff and organizations became involved. Here, participants began voicing concerns about research administration, such as who would lead the research, who would have access to and benefit from the results, and what it might mean for relatives and future generations if privacy were not protected. Corporations, including health insurance companies, emerged as examples of entities some participants would want barred from having access to the data.
With the introduction of Scenario 3, participants’ concerns escalated around security and broad ethical issues. For example, participants across the focus groups raised concerns over security breaches, including hacking and theft of data. One participant stated: “It’s interesting how you put the computer system [network] in there because now we’re really getting into the ability for information to easily be stolen.” Another stated: “I would want to know what kind of security the central network is using. Are they using any type of encryption at all, who has access to the system? How do they maintain that type of access, you know, just general [questions].” In this scenario, participants also voiced ethical concerns related to social good and social justice. These primarily revolved around two related but distinct issues: corporate involvement in the research network and administration of the network. Both involved concerns about how findings would be used, for what groups, and for whose benefit. Participants considered these politically charged topics, and discussions included debates over “profit motives,” “unavoidable commercialism,” and organizational interests versus social good.
Trust as a Key Factor for Sharing Health Data
Most participants emphasized that their willingness to share their health data was, in part, driven by trust in their doctors to do the right thing for patients. For many, if their doctors asked or suggested they participate in a research study, they were inclined to do so. In fact, many of the participants in these focus groups joined as a result of their trust in their doctor. Probing this perception made clear that participants place their trust in actual people, such as their doctors, rather than in institutions or networks.
As the scenarios grew in scale, trust became more difficult to ensure. For example, participants reflected that while they can trust their doctor, it is hard to trust a system. Even if commercial entities are not involved in DRNs, participants spoke with hesitancy about the bureaucratic and often political nature of large institutions and how they are governed. However, participants were also realistic about the need to place trust in institutions, or else no large-scale research would ever get done. Participants related their feelings of trust to their options for authorizing sharing: if they can trust that their health information will be put to good use helping others and not simply used for commercial profits, they are willing to share data in DRNs (see Box 1).
Box 1: Trust in Physicians and Systems.
Trust in physicians:
“I was recommended by my physician and I have such high regard for him that if he had recommended me then you know I think that it would probably be worthwhile. He knows the value of time.”
“I am a Catholic and I say that because I went to confession one day and I had major surgery coming up and I was worried about it you know, and the priest said to me, You trust God don’t you and I said yes I do and he says, well, you have to learn to trust my doctor….and so I trust my doctor.”
“At some point you have to trust somebody in something if you’re going to want to participate. If you don’t want to participate then tell your doctor, ‘No I don’t want my information sent.’ Because then if you do find out it gets sent then again she’s going to fall into that same lawsuit of ‘I told you no, you don’t have my consent. Show me my signature somewhere.’”
Trust in a system:
“Yeah but then we’re just building another bureaucracy… you’re gonna have to have somebody take care of the operation you know I mean it’s not going to be – you’re talking about 300 million people you know. And that’s like putting it on the – in a little organization who has a lot of control.”
“Who is going to make those decisions?...controllability affects everybody’s life you know no matter what we talk about…this is going to be so big that it’s going to be political. We’re not talking about a local community…One way or another you’re going to build that bureaucracy.”
Socio-Ethical Concerns of DRNs
Beyond considerations of ethics in research (such as those enumerated in the Belmont Report35) lie broader socio-ethical concerns raised by participants. These included discussions of the conduct of powerful entities, such as corporations, in the use of and subsequent ownership over research data. Participants held multiple concerns related to ethical principles of justice, including concerns about the institutional stakeholders involved with DRNs. A unanimous concern surfaced across all the focus groups: who will have access to the data, for what purpose, and to whose benefit and/or exclusion? Discussion of these concerns often revolved around corporate involvement in research networks. The quotes in Box 2 speak to participants’ concerns that results of large-scale studies might only end up benefitting the corporations involved.
Box 2: Corporate Access.
Concerns over corporate involvement and access to the data
“As we get further and further away from the core research and it then goes to private corporations I get a little more hesitant but I’m generally okay as long as the anonymity is there… I don’t often feel like big corporations have anything other than their best interest at heart, and how they use that research is you know once it’s in their hands is I don’t know, I’m a little bit more resistant as it gets passed further and further beyond the true research participants.”
“If you raise the issue about how the information is to be used, you mentioned government, industry, universities, I have somewhat of a suspicion of large pharmaceutical companies that are pushing one particular med and they want to see efficacy because they want to sell that med whereas a competing company that might have a somewhat different kind of med but still is looking for the same results you’re not going to have the objectivity in that process.”
Additionally, questions of governance emerged in concerns about administrative control, management of bureaucracy, and what participants regarded as an uncertain political field of development and management. As one participant asked: what kind of “half-life” will the data have? How will it be captured and stored? Notably, participants did not directly state that they would refuse to share their clinical data if a corporation was involved in the production or dissemination of the study results, but did express concern over this possibility.
Public Justice: Balancing Profit and Public Good
While there was no overall agreement among participants about research results leading to a product or products with profit potential, many participants expressed concern about potential commercial involvement in DRNs. Some participants supported corporate profits if the company developed new or modified medical technologies that would help people and if such help could be distributed equitably. Conversations took place around how equity could be ensured and who would decide and ensure this goal. Others expressed ambivalence rooted in the negative associations people have with some commercial entities, in this context pharmaceutical and health insurance companies. Some participants felt that companies place profit before benefits to individuals. One participant said they do not trust pharmaceutical companies to offer patients what is in their best interest. However, some participants also felt that the profit motive was a powerful reason for health technologies to be improved, which would ultimately benefit patients.
For some, the public availability of knowledge and treatment offered through potential discoveries would outweigh concerns about financial gains to commercial entities; but if only a select few would benefit from this research, then participants were less enthusiastic about including their healthcare data in the network. Participants in one focus group discussed why DRN research findings need to benefit both the researchers and the public; it cannot be just one or the other: the researchers need incentives to do the research, but the data come from a mass database that involves both public and private health institutions. Box 3 illustrates participants’ concerns about corporate involvement and profit-making.
Box 3: Commercial and Profit-Making Motives.
Concerns over corporate involvement and commercialism:
“To me it really depends. I would be a lot more inclined to help out an educational facility than I would a biomedical company that’s trying to make a profit on a drug, just the motive would depend. They have a higher standard with which to prove to me why I should participate.”
“[Research] pursues a good end to benefit others. My concern is that certainly that’s not the philosophy of the insurance companies and they also have access or may have access to this information… They’re not looking for ways to lower their prices but they’re looking for ways to be more competitive.”
Ambivalence about profit-making:
“You can say that well if it goes to a corporation they’re going to develop a new drug, they’re going to market the heck out of that drug; they’re going to make millions of dollars, but on the other end if the drug is helping people I think that’s okay.”
“I think everybody’s in agreement that we have no problem sharing the information, we’re glad to contribute to the betterment of women and men in kind but the problem is how is this information going to be used? By whom? And will it have some negative results if it’s in the wrong control?”
Profit motives can be positive:
“[Sharing your data] is a good thing although you’d never know it, I mean you’d never really know what you provided, I mean you’re one of hundreds of thousands you know. But this happens all the time and it’s a good thing, it’s a good thing.”
“I personally wouldn’t [have a problem with the company profiting off the results of the research]. They’re going to get the information some way at some point from someone and you know I think it’s good to contribute and be a part of that.”
Assuring Social Responsibility
Balancing individual needs and social responsibility emerged as a theme of discussion for patients as they reflected on data sharing. We found that the participants who were supportive of commercial entities profiting from health research spoke directly or indirectly of “social responsibility.” This was frequently referenced when discussing affordable access to the medical technologies developed from DRNs. Participants asked whether companies would intend to create some way for those who cannot afford their medical products or drugs to have access to them. In our analysis, the long discussions about affordability pointed to the larger theme of social responsibility and justice: the idea that benefit should be equitably distributed across publics regardless of ability to pay, insurance status, citizenship, location, or other factors that often stratify access. Box 4 displays participants’ comments about social responsibility extending beyond commercial profits to include equitable and fair distribution of the knowledge and products that result from research.
Box 4: Social Responsibility.
Commitments to social responsibility:
“I think here there are two issues and the scenario is very interesting because it puts us in a very contradictory framework. Why contradictory? Because on one hand we have reached what we are pursuing here, the benefit of the patient, but on the other hand we see the concept of commercial, sold commercially. So then here is the thing so is this a contribution to everybody? Yes, contribution to benefit the patient. But also we’ve contributed to the benefit of a company that is selling this product. I don’t have problems with that but certainly these companies most of the times they’ve refused to release the patents after certain time which is the time that they should be releasing these patents.”
“This point…is a very interesting point: there are health issues in Africa that could be solved if the pharmaceutical companies could drop the prices of the medicines, and the real price of the medicine is low but because it’s the only company that sells that particular medicine they put in a very high price. And we don’t need to go as you said we don’t need to go to Africa, here we have troubles. So I would really put that emphasis that you mentioned like kind of a social responsibility into the use of the information. If there is this commitment it would be excellent.”
Authorization for Electronic Data Sharing
Participants were asked to discuss four data-sharing authorization options within the research network scenarios using de-identified data: 1) broad opt-in; 2) broad opt-out; 3) project-specific approval; and 4) no authorization required. In explaining each option to the participants, the facilitators acknowledged that forming a review board was not current practice and that project-specific approval by an “administrator” would have to be worked out by the DRN in accordance with its practice guidelines.
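To show how these four options differ in practice, the sketch below is a hypothetical illustration only; the option names, parameters, and function are invented for this example and are not part of SCANNER or any site’s actual consent system. It encodes each option as a simple policy check on whether a patient’s de-identified data may be used for a given project.

```python
# Minimal sketch of the four authorization options as a data-sharing policy
# check. Hypothetical model for illustration; not SCANNER's implementation.
from enum import Enum, auto


class AuthorizationModel(Enum):
    BROAD_OPT_IN = auto()       # patient must affirmatively consent once, up front
    BROAD_OPT_OUT = auto()      # data are shared unless the patient declines
    PROJECT_SPECIFIC = auto()   # patient approves each study individually
    NO_AUTHORIZATION = auto()   # de-identified data shared without patient authorization


def may_share(model: AuthorizationModel,
              opted_in: bool,
              opted_out: bool,
              project_approvals: set[str],
              project_id: str) -> bool:
    """Return True if a patient's de-identified data may be used for the given project."""
    if model is AuthorizationModel.BROAD_OPT_IN:
        return opted_in
    if model is AuthorizationModel.BROAD_OPT_OUT:
        return not opted_out
    if model is AuthorizationModel.PROJECT_SPECIFIC:
        return project_id in project_approvals
    return True  # NO_AUTHORIZATION: sharing permitted by default


if __name__ == "__main__":
    # A patient who gave broad opt-in consent at a first medical visit:
    print(may_share(AuthorizationModel.BROAD_OPT_IN, opted_in=True,
                    opted_out=False, project_approvals=set(), project_id="CER-001"))
```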
Overwhelmingly, though not unanimously, participants favored option 1, broad opt-in, under which they would be asked during their first medical visit whether they consented to having their health information included in future studies. One participant stated: “I don’t think consent should be assumed, I think there should have to be affirmative positive consent.” The issue of trust was raised again: participants’ feelings about consenting came back to whether they could trust that their health information would be put to good use helping others; if so, they were willing to consent to future studies. However, participants also asked how a one-time consent could capture specific information about future projects and thus questioned the very notion of informed consent. One participant who favored project-specific approval thought this would more accurately reflect real-time information, including potential risks and benefits.
Participants did not prefer option 2, broad opt-out, since data could be used without initial authorization. A few participants discussed how the importance of what they were signing (consent forms) could slip through the cracks: when they go for a medical visit, they are concerned about their personal health, and the last thing on their minds is trying to understand consent forms. One participant felt conflicted about being able to opt out, since their information is valuable for advancing medical knowledge (see Box 5 below). Most participants also did not like option 3, project-specific approval, since requiring authorization for every future study would be too much of a burden for both them and their doctors. Two participants did like option 3, however. One stated they would like to be able to learn about each study their information is used in, such as who is funding it and what will be done with the results. Another participant felt nervous about broad opt-in and wanted to be able to re-consent for each project, noting that the specifics would vary for each and every research study. Box 5 represents participants’ diverse thoughts on data-sharing authorization options.
Box 5: Conflict Between Risks and Benefits of Authorization.
Feelings of confliction over risks versus benefits of consent options:
“I’m defeating the purpose [of doing research with health data] by saying well I want to have the option to say no, I don’t want this particular thing used but because you know again that’s the purpose of this is to have all the data so I don’t know, it’s where I’m kind of torn or conflicted by it to some degree.”
Feelings of nervousness over broad opt-in only:
“I don’t like it. That’s just me because I mean it’s just like you sign the form once and you never see it again and then later on in life it ends up biting you in the ass ‘cause well you signed the form once and you never saw it again. But someone goes out and dusts off your records and says ‘Hey look here.’ I’m like, ‘Well Goddam I guess I did sign it.’ And you can’t do anything about it. There’s no option.”
Thoughts on trust and consent:
“But again it comes back to the issue of trust. Is the information going to be used basically for your benefit, is it going to be used more or less according to standards or appropriate research? And there comes a point where you just have to say I hope it’s going to work out.”
Discussion
Focus groups with patients reveal that governance, when it includes patient perspectives, can move beyond the narrower requirements of federal and state policies and institutional administrative and technical requirements to include broad ethical and social issues36. Based on empirical data on patient perspectives, we found that patients are driven by trust and, when asked to participate in research networks, hold socio-ethical concerns beyond those addressed by the regulation of human subjects research enacted in IRB practices. These included discussions of the conduct of powerful entities, such as corporations, in the use of and subsequent ownership over research data. These results may not be surprising in light of recent public attention to issues of research ethics and justice in high-profile books and cases, such as The Immortal Life of Henrietta Lacks37, court cases such as the Myriad Genetics suit over the patenting of the BRCA1 and BRCA2 genes that is now before the U.S. Supreme Court38, and the research case involving Arizona State University and data collected from the Havasupai tribe39. These cases, as well as many other historical cases of research ethics, have brought important public awareness to issues of human subjects protection, ethics in the use of personal health information and biomaterials, and considerations of social justice as well as ethical principles of justice in medical research. Participants in these focus groups engaged in discussions over the ethical use of research that similarly moved beyond the ethical principles captured by research policies.
Situating these findings within social scientific and medical frameworks of governance allows us to more thoroughly address public participation and trust in what is sure to be an expanding field of distributed research networks. Governance of DRNs must include systems of administration that ensure participants and staff are protected and accountability is assured. However, as this research shows, patient-informed governance would expand beyond these technical assurances to include broader assurances of public good, social responsibility, fairness, and equity if public trust is to be built and public participation assured. These are essential to the success of these networks.
Building public trust and engagement in DRNs will require work on many fronts, including patient and stakeholder education campaigns, transparent administrative processes, and open disclosure of engagement in the networks, especially by commercial entities. Public education campaigns could start with doctors’ offices and hospitals that use electronic health records, which could introduce patients to the concept of DRNs and discuss the specific networks in which the providers participate. Patients must be educated about the security measures taken to safeguard medical records and about how secondary use of data for research can benefit the public. Transparent administrative processes will also further engage the public in the governance of research networks. Administrators must be able to clearly communicate who runs the network, how it is funded, who controls access to the data, what entities and persons can gain access for research, and how results are disseminated. Patients in our focus groups seemed to favor a compromise between tightly controlled access to the data and open access, suggesting their attentiveness to the trade-off between data privacy and advancing healthcare knowledge. Finally, open disclosure of commercial involvement, including identification of any corporate social responsibility commitments, can facilitate public trust.
Governance along these lines might enhance public trust in systems and, thus, garner patient participation in research. Interestingly, as we found here and as others have found14, participants mostly discussed trust in reference to their doctors, which suggests that trust in their doctors may confer trust on the research networks. We do not assume naively that including patient views erases issues of power and politics. While doing so brings in multiple perspectives, attention to the processes of how and along what avenues so-called publics are engaged (e.g., through social movement activism, as corporate-designed patient groups, in public research projects) will inform whether and to what extent broad sets of interests and values are built into the ongoing management of research. Such an approach only holds the potential to check and balance the power of decision-making; it does not determine its fate. Thus, as Irwin stated, it is valuable to retain a critical awareness of “new scientific governance” models and theory: while this literature supports public engagement in all scientific endeavors, it is possible that in the end nothing may change and decision-making will still come from top-down administration only; network designers must remain aware of a top-down commitment to “bottom-up” processes40.
There are a number of limitations to this study. Our focus group findings are not generalizable to the U.S. public. In particular, our participants were highly educated, and men were substantially overrepresented (80.6% men vs. 19.4% women). Also, patients are not a homogeneous group. Different populations of patients will have different opinions and needs regarding sharing their data and supporting DRNs, and more studies are needed to reflect these differences. Focus groups elicit the underlying thoughts and feelings behind participants’ statements and beliefs, but they do not allow us to test hypotheses or to quantify participants’ answers to validate findings. Further, given that some of the issues presented, especially data authorization policies, were unfamiliar to participants, their responses were speculative. Future research should include large-scale surveys and other studies that would yield quantifiable data and allow empirical tests based on the themes elucidated in these focus groups as they relate to DRNs. Finally, studies should ground responses in actual practices and among patients who have themselves been asked to participate in a DRN, thereby seeking to further understand the relationship between patients’ thoughts on specific aspects of DRN governance and their willingness to actually share their data with these networks.
Conclusion
Our findings provide a basis for further thinking through what patient-centered governance in electronic research networks would look like. A challenge with respect to the ethical issues that arise from emerging technologies is that they can give rise to concerns that cannot be dealt with adequately by existing regulations or guidelines. Governance of new technologies is not a one-time event but an ongoing process. Much work needs to be done on the assessment of current regulations, the development of new insights into how ethical issues pertain to emerging technologies, and the consideration of public concerns in policymaking. Public understanding, dialogue, and engagement are already part of the science and technology conversation, but given the fast pace of innovation and change, new efforts must strive to create mutual benefits for all actors involved in innovation.
Acknowledgments
SCANNER and this study were supported by the Agency for Healthcare Research and Quality, grant R01 HS19913-01. We thank Zia Agha, Tania Zamora, Susan Robbins, Fred Resnic, Michele Day, Robert El Kareh, and Cindy Wong for assistance with patient recruitment.
References
- 1. Ohno-Machado L, Day ME, El-Kareh R, et al. Technical report: Standards in the use of collaborative or distributed data networks in patient centered outcomes research. Patient-Centered Outcomes Research Institute (PCORI); 2012. Available from: http://pcori.org/assets/pdfs/Standards%20Outcomes%20Research.pdf.
- 2. Irwin A. STS perspectives on scientific governance. In: Hackett EJ, Amsterdamska O, Lynch M, Wajcman J, editors. The handbook of science and technology studies. 3rd ed. 2008. p. 583–607.
- 3. SCANNER: Scalable National Network for Effectiveness Research. Available from: http://scanner.ucsd.edu/.
- 4. Kim KK, McGraw D, Mamo L, Ohno-Machado L. Development of a privacy and security policy framework for a multistate comparative effectiveness research network. Medical Care. 2013;51(8 Suppl 3):S66–72. doi: 10.1097/MLR.0b013e31829b1d9f.
- 5. Graham J, Amos B, Plumptre T, for the Institute on Governance. Principles of good governance in the 21st century: Policy brief no. 15. 2003.
- 6. Wright A, Sittig DF, Ash JS, Bates DW, Feblowitz J, Fraser G, et al. Governance for clinical decision support: case studies and recommended practices from leading institutions. J Am Med Inform Assoc. 2011;18:187–194. doi: 10.1136/jamia.2009.002030.
- 7. Reti SR, Feldman HJ, Safran C. Governance for personal health records: Viewpoint paper. J Am Med Inform Assoc. 2009;16:14–17. doi: 10.1197/jamia.M2854.
- 8. Dixon BE, Zafar A, Overhage JM. A framework for evaluating the costs, effort, and value of nationwide health information exchange. J Am Med Inform Assoc. 2010;17:295–301. doi: 10.1136/jamia.2009.000570.
- 9. Shaw S, Boynton PM, Greenhalgh T. Research governance: where did it come from, what does it mean? Journal of the Royal Society of Medicine. 2005;98(11):496–502. doi: 10.1258/jrsm.98.11.496.
- 10. Diamond CC. In: Olsen L, Grossman C, McGinnis JM, editors. Learning what works: Infrastructure required for comparative effectiveness research: Workshop summary. Institute of Medicine, The National Academies Press; 2011.
- 11. Medical Research Council. The use of personal health information in medical research: General public consultation [Internet]. 2007 [cited 2013 Jan 4]. Available from: http://www.ipsos-mori.com/Assets/Docs/Archive/Polls/mrc.pdf.
- 12. Robling MR, Hood K, Pill R, Fay J, Evans HM. Public attitudes towards the use of primary care patient record data in medical research without consent: A qualitative study. J Med Ethics. 2004;30:104–9. doi: 10.1136/jme.2003.005157.
- 13. Whiddett R, Hunter I, Engelbrecht J, Handy J. Patients’ attitudes towards sharing their health information. Int J Med Inform. 2006;75:530–41. doi: 10.1016/j.ijmedinf.2005.08.009.
- 14. Willison DJ, Steeves V, Charles C, Schwartz L, Ranford J, Agarwal G, et al. Consent for use of personal information for health research: Do people with potentially stigmatizing health conditions and the general public differ in their opinions? BMC Medical Ethics. 2009;10:10. doi: 10.1186/1472-6939-10-10.
- 15. Weitzman ER, Kaci L, Mandl KD. Sharing medical data for health research: The early personal health record experience. J Med Internet Res. 2010;12(2):e14. doi: 10.2196/jmir.1356.
- 16. Kaufman DJ, Murphy-Bollinger J, Scott J, Hudson KL. Public opinion about the importance of privacy in biobank research. Am J Hum Genet. 2009;85:643–54. doi: 10.1016/j.ajhg.2009.10.002.
- 17. Willison DJ, Schwartz L, Abelson J, Charles C, Swinton M, Northrup D, et al. Alternatives to project-specific consent for access to personal information for health research: What is the opinion of the Canadian public? J Am Med Inform Assoc. 2007;14:706–12. doi: 10.1197/jamia.M2457.
- 18. Stone MA, Redsell SA, Ling JT, Hay AD. Sharing patient data: Competing demands of privacy, trust and research in primary care. British Journal of General Practice. 2005 Oct:783–9.
- 19. Caine K, Hanania R. Patients want granular privacy control over health information in electronic medical records. J Am Med Inform Assoc. 2013;20:7–15. doi: 10.1136/amiajnl-2012-001023.
- 20. Chhanabhai P, Holt A. Consumers are ready to accept the transition to online and electronic records if they can be assured of the security measures. MedGenMed. 2007;9(1):8.
- 21. Patel VN, Abramson E, Edwards AM, Cheung MA, Dhopeshwarkar RV, Kaushal R. Consumer attitudes toward personal health records in a beacon community. Am J Manag Care. 2011;17(4):e104–e120.
- 22. Patel VN, Dhopeshwarkar RV, Edwards A, Barrón Y, Sparenborg J, Kaushal R. Consumer support for health information exchange and personal health records: A regional health information organization survey. J Med Syst. 2012;36:1043–52. doi: 10.1007/s10916-010-9566-0.
- 23. Asai A, Ohnishi M, Nishigaki E, Sekimoto M, Fukuhara S, Fukui T. Attitudes of the Japanese public and doctors towards use of archived information and samples without informed consent: Preliminary findings based on focus group interviews. BMC Medical Ethics. 2002;3:1. doi: 10.1186/1472-6939-3-1.
- 24. Lane J, Schur C. Balancing access to health data and privacy: A review of the issues and approaches for the future. Health Services Research. 2010;45(5p2):1456–67. doi: 10.1111/j.1475-6773.2010.01141.x.
- 25. Buckley BS, Murphy AW, MacFarlane AE. Public attitudes to the use in research of personal health information from general practitioners’ records: A survey of the Irish general public. J Med Ethics. 2011;37:50–5. doi: 10.1136/jme.2010.037903.
- 26. Damschroder LJ, Pritts JL, Neblo MA, Kalarickal RJ, Creswell JW, Hayward RA. Patients, privacy and trust: Patients’ willingness to allow researchers to access their medical records. Social Science & Medicine. 2007;64:223–35. doi: 10.1016/j.socscimed.2006.08.045.
- 27. Hunter IM, Whiddett RJ, Norris AC, McDonald BW, Waldon JA. New Zealanders’ attitudes towards access to their electronic health records: Preliminary results from a national study using vignettes. Health Informatics Journal. 2009;15(3):212–28. doi: 10.1177/1460458209337435.
- 28. King T, Brankovic L, Gillard P. Perspectives of Australian adults about protecting the privacy of their health information in statistical databases. Int J Med Inform. 2012;81:279–89. doi: 10.1016/j.ijmedinf.2012.01.005.
- 29. Page SA, Mitchell I. Patients’ opinions on privacy, consent and the disclosure of health information for medical research. Chronic Diseases in Canada. 2006;27(2):60–7.
- 30. Willison DJ, Keshavjee K, Nair K, Goldsmith C, Holbrook AM. Patient consent preferences for research uses of information in electronic medical records: Interview and survey data. BMJ. 2003;326:373. doi: 10.1136/bmj.326.7385.373.
- 31. Hoeyer K, Olofsson B, Mjorndal T, Lynoe N. Informed consent and biobanks: a population-based study of attitudes towards tissue donation for genetic research. Scand J Public Health. 2004;32:224–9. doi: 10.1080/14034940310019506.
- 32. Valle-Mansilla JI, Ruiz-Canela M, Sulmasy DP. Patients’ attitudes to informed consent for genomic research with donated samples. Cancer Investigation. 2010;28:726–34. doi: 10.3109/07357907.2010.494320.
- 33. Charmaz K. Constructing grounded theory: A practical guide through qualitative analysis. London: Sage; 2006.
- 34. Strauss AL, Corbin JM. Basics of qualitative research: Techniques and procedures for developing grounded theory. 2nd ed. Thousand Oaks: Sage Publications; 1998.
- 35. The Belmont Report [Internet]. Available from: http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html.
- 36. Mamo L, Fishman J. Why justice? Introduction to the special issue on entanglements of science, ethics, and justice. Science, Technology & Human Values. 2013;38(2):159–75.
- 37. Skloot R. The Immortal Life of Henrietta Lacks. New York: Macmillan Press; 2010.
- 38. Conley J. The ACLU v. Myriad Genetics suit: Legitimate challenge or publicity stunt? Genomics Law Report [Internet]. 2009 [cited 2013 Jan 5]. Available from: http://www.genomicslawreport.com/index.php/2009/06/04/aclu-v-myriad-genetics-suit-legitimate-challenge-or-publicity-stunt/.
- 39. Harmon A. Indian tribe wins fight to limit research of its DNA. The New York Times [Internet]. 2010 [cited 2013 Jan 5]. Available from: http://www.nytimes.com/2010/04/22/us/22dna.html?pagewanted=all&_r=0.
- 40. Irwin A. The politics of talk: Coming to terms with the ‘new’ scientific governance. Social Studies of Science. 2006;36(2):299–320.