Abstract
Background: Use of medical data for secondary purposes such as health research, audit, and service planning is well established in the UK. However, the governance environment, together with public opinion and understanding of this work, has lagged behind. We aimed to systematically review the literature on UK and Irish public opinions of medical data use in research, critically analysing such opinions through an established biomedical ethics framework, to draw out potential strategies for future good practice guidance and to inform ethical and privacy debates.
Methods: We searched three databases using terms such as patient, public, opinion, and electronic health records. Empirical studies were eligible for inclusion if they surveyed healthcare users, patients or the wider public in the UK and Ireland and examined attitudes, opinions or beliefs about the use of patient data for medical research. Results were synthesised into broad themes using a Framework Analysis.
Results: Out of 13,492 papers and reports screened, 20 papers or reports were eligible. While there was a widespread willingness to share EHRs for research for the common good, this very rarely led to unqualified support. The public expressed two generalised concerns through a variety of hypothetical examples. The first of these concerns related to a party’s competence in keeping data secure, while the second was associated with the motivation a party might have to use the data.
Conclusions: The public evaluates the trustworthiness of research organisations by assessing their competence in data-handling and their motivation for accessing the data. Public attitudes around data-sharing exemplified several principles which are also widely accepted in biomedical ethics. This provides a framework for understanding public attitudes, which should be considered in the development of any guidance for regulators and data custodians. We propose four salient questions which data guardians should address when evaluating proposals for the secondary use of data.
Keywords: privacy, patient data, Electronic Health Records, governance, public, engagement, ethics
Introduction
The use of medical data for secondary purposes such as health research, audit, and service planning is well established in the UK, and technological innovation in analytical methods for new discoveries using these data resources is developing quickly. Data scientists have developed, and are improving, many ways to extract and process information in medical records. This continues to lead to an exciting range of health-related discoveries, improving population health and saving lives. Nevertheless, as the development of analytic technologies accelerates, the decision-making and governance environment, as well as public opinion and understanding about this work, has been lagging behind 1.
Public opinion and data use
A range of small studies canvassing patient opinions, mainly in the USA, have found an overall positive orientation to the use of medical data for societal benefit 2–7. However, recent case studies, like NHS England’s ill-fated Care.data scheme, indicate that certain schemes for secondary data use can prove unpopular in the UK. Launched in 2013, Care.data aimed to extract and upload the whole population’s general practice patient records to a central database for prevalence studies and service planning 8. Despite the stated intention of Care.data to “make major advances in quality and patient safety” 8, this programme was met with widely reported public outcry leading to its suspension and eventual closure in 2016. Several factors may have been involved in this failure, from poor public communication about the project and a lack of social licence 9 to, as the pressure group MedConfidential suggests, dislike of the selling of data to profit-making companies 10. However, beyond these specific explanations for the project’s failure, what ignited public controversy was concern about the impact that its aim to collect and share data on a large scale might have on patient privacy. The case of Care.data indicates a reluctance on the part of the public to share their medical data, and it is still not wholly clear whether the public are willing to accept future attempts at extracting and linking large datasets of medical information. This picture of mixed opinion makes taking an evidence-based position, drawing on social consensus, difficult for legislators, regulators, and data custodians, who often respond instead to personal or media-generated perceptions of public opinion. However, despite the differing results of studies, we hypothesise that underlying ethical principles could be extracted from the literature on public opinion, which may provide guidance to policy-makers for future data-sharing.
Governance and legal framework of data use
The Data Protection Act (1998) is the main legislation governing the use of patients’ medical data in the UK, soon to be superseded by the General Data Protection Regulation (2018). This law covers personal, or patient-identifiable, data. Personally Identifiable Information (PII) is defined as information that can be used on its own, or with other information, to identify, contact or locate a single person, or to identify an individual in context. Where patient data is being used at scale for indirect patient care (that is, service planning or research), without explicit consent and while still patient-identifiable, its use must be approved by the Confidentiality Advisory Group (CAG) for England within the Health Research Authority.
Another potential route to the use of data for these purposes without individual patients’ consent, is the de-identification of data so that it is no longer classified as personal data. De-identification involves removal or replacement of personal identifiers so that it would be difficult to re-establish a link between the individual and their data 11: “The challenge is to balance the levels of de-identification that are acceptable to the patients, research participants, clinicians, researchers, institutions, and federal requirements” 11.
It should also be noted that privacy can be achieved through other means than de-identification of patient data such as controlling linkage with other sources of information, robust protection with computing security systems, and giving access only to trusted users 12. “De-identification should be considered a necessary but insufficient means of protecting health privacy. In accordance with this view, health information should be collected, maintained, disclosed, and used in the least identifiable form consistent with the purpose of the information” 13.
Striking a balance
While it is clearly important to make sure patient privacy is protected, it is also argued that the societal benefit of medical research using health big data, which may save lives, should be given ethical weight. This is argued on the basis that harms to patients may occur where these rich data sources are not used to improve our understanding of health conditions and treatments 14. While individual privacy and societal benefit are often portrayed as being in opposition, for the future of health big data research a way must be found to achieve both to the satisfaction of patients, legislators and researchers. Recent work has sought to identify the key issues in patient responses to data-sharing 15, but it remains to be established whether these issues are connected by any system of ethical values. The Four Principles approach of Beauchamp and Childress has become a canonical and widely used framework for evaluating the ethical aspects of medical practice and decision-making 16. The Four Principles are respect for autonomy, nonmaleficence, beneficence, and justice. In this study we aim to systematically review and thematically analyse UK and Irish studies exploring patient and public opinions on medical data being used for the secondary purpose of research, using the lens of the Four Principles to understand and map the results we find onto an established ethical framework. We aim to draw out potential strategies for future good practice guidelines around data privacy, to guide data custodians, the health data research community and the public.
Methods
We followed the PRISMA guidelines for the conduct and reporting of this review 17.
Search strategy
We searched PubMed, Web of Science, and Scopus between 03/10/16 and 11/10/16 using the following search string: (Public OR Patient OR People) AND (Attitudes OR Knowledge OR Opinions OR Views OR Perceptions) AND ("Care.data" OR "Electronic Health Record" OR "Electronic Health Data" OR "Electronic Medical Record" OR "Electronic Medical Data" OR "Personal Health Information" OR "Personal Health Record" OR "Electronic Patient Information" OR "Electronic Patient Data" OR "Electronic Patient Record" OR "Data linkage" OR "Data sharing") AND (Research). We restricted our search to publications from 2006–2016 inclusive. We also searched the grey literature using the search string: "public attitudes" AND "sharing" AND "health data" on Google (in June 2017). The first 20 results were selected and screened. The following inclusion criteria were then applied:
1. Empirical studies using any methods, reported as a full-length peer-reviewed manuscript or published report.
2. Healthcare users, patients or the wider public as participants.
3. Examining attitudes, opinions or beliefs about the topic of use of patient data for medical research.
4. Studies using a UK or Irish sample, written in English. We chose to keep our review to these two countries because of similarities in their socialised healthcare systems, and because of the well-established use of patient data within these jurisdictions.
Studies were excluded if they were:
1. Studies focused more broadly on digital technologies in health care where the focus was on use of digital methods or records rather than public attitudes.
2. Studies focused on patient and practitioner attitudes to analogous areas such as biorepositories, genetic testing and genomic research.
3. Non-empirical reviews of legislation, policy, ethical challenges etc.
Using these criteria, the articles retrieved by the literature search were screened first by title and then by abstract (by author JS); the final selection of full-text papers for the review was made by two authors (JS and EF).
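The screening workflow can be tracked with very simple tooling. The sketch below (Python, purely illustrative and not the software used in this review) shows one way records retrieved from the three databases could be deduplicated and carried through title, abstract and full-text decisions; the record fields, normalisation rule and example titles are invented assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    title: str
    year: int
    source: str                                    # "pubmed", "wos" or "scopus"
    decisions: dict = field(default_factory=dict)  # stage -> "include"/"exclude"

def dedupe(records):
    """Collapse duplicates across databases using a normalised title + year key."""
    seen = {}
    for r in records:
        key = ("".join(ch for ch in r.title.lower() if ch.isalnum()), r.year)
        seen.setdefault(key, r)
    return list(seen.values())

def screen(record, stage, include):
    """Log an include/exclude decision for one screening stage."""
    record.decisions[stage] = "include" if include else "exclude"
    return include

# Example flow mirroring the review: title screen, then abstract, then full text.
records = dedupe([
    Record("Public attitudes to sharing health data", 2013, "pubmed"),
    Record("Public Attitudes to Sharing Health Data.", 2013, "scopus"),
])
for rec in records:
    if screen(rec, "title", True) and screen(rec, "abstract", True):
        screen(rec, "full_text", True)  # final decision taken by two reviewers
print(len(records), records[0].decisions)
```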
Quality Assessment
Study quality was assessed using the Mixed Methods Appraisal Tool (MMAT) 18. This tool was designed for the appraisal of studies in mixed methods systematic reviews and appraises the quality of methodology rather than the quality of reporting. All studies meeting the inclusion criteria above were assessed against six criteria. The first two are the same for all studies: is there a clear research question or objective, and does the data collected address the research question or objective? A further four criteria are specific to the study type. Studies were given a score out of six depending on how many of the criteria they met, and were rejected if they did not meet at least the first two criteria. Two papers were excluded on the basis of scoring zero on all criteria.
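The scoring rule can be stated compactly. The sketch below is a hypothetical Python rendering of the rule as described above, not part of the MMAT itself: a study is scored out of six, and rejected outright if either of the two screening criteria fails.

```python
def mmat_score(criteria_met):
    """criteria_met: six booleans; the first two are the screening criteria
    (clear research question; collected data address that question)."""
    if len(criteria_met) != 6:
        raise ValueError("expected exactly six criteria")
    if not (criteria_met[0] and criteria_met[1]):
        return None           # study rejected, as for the two papers excluded here
    return sum(criteria_met)  # quality score out of 6

print(mmat_score([True, True, True, False, True, True]))        # -> 5
print(mmat_score([False, False, False, False, False, False]))   # -> None (rejected)
```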
Data extraction
We extracted author names, dates, location, type of study (qualitative or quantitative), methods used, number of participants, their backgrounds or roles, ages, genders, and the study findings which fitted into the themes relating to research questions reported below.
Synthesis of results
The full texts of eligible articles were read iteratively by two authors (JS and EF) with the aim of extracting coherent themes. In the first iteration of reading and coding the results of the papers, the authors focussed on nine questions.
1. Are patients/public aware of electronic health records (EHRs) and their secondary uses?
2. Are patients/public concerned about the privacy and security of their medical data?
3. Are patients/public willing to share their medical data for research, policy and planning?
4. What consent model do patients/public prefer?
5. How does data being identifiable or anonymised affect patient/public preferences?
6. Which organisations are most and least trusted with patients’/public’s data?
7. What are the reported perceptions of risks and benefits of sharing medical data for research?
8. Are there any other ways in which willingness to share could be increased?
9. Is there any one social group who are overly concerned with the sharing of EHRs?
A framework 19 was created with a column for each of the nine questions, and data was extracted from each study where it fitted into these categories. Following this data extraction, the two authors (EF and JS) discussed refining and combining the extracted data into a smaller number of themes. In a second iteration of data extraction, the authors re-read the articles and extracted data into seven themes. For interpretation and synthesis, a data-driven approach was taken, trying to make meaning from first-order data reported in the papers (i.e. statistics or participant quotes). Where themes were populated mainly by summaries of quantitative data, a straightforward report of the papers’ findings is given. Where contributing papers were mainly qualitative, e.g. in the Trust theme, we undertook a deeper analysis of meaning within findings, guided by both metasynthesis principles 20 and established principles of bioethics 9.
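The framework itself is simply a study-by-question matrix. The following Python sketch is a hypothetical illustration of that structure (the column names are shorthand for the nine questions above, and the example entries are invented), showing how extracted statistics or quotes could be filed and exported for discussion between reviewers.

```python
import csv

QUESTIONS = [
    "awareness", "privacy_concern", "willingness", "consent_model",
    "identifiability", "trusted_organisations", "risks_benefits",
    "increasing_willingness", "concerned_groups",
]

framework = {}  # study -> {question -> list of extracted statistics or quotes}

def extract(study, question, finding):
    """File one finding from a study under the matching framework column."""
    framework.setdefault(study, {q: [] for q in QUESTIONS})[question].append(finding)

# Invented placeholder entries, purely to show the shape of the matrix.
extract("Example et al. 2013", "willingness", "81% supported sharing for research")
extract("Example et al. 2013", "privacy_concern", "'who has access to my files?'")

# Write the matrix out so both reviewers can discuss and refine the themes.
with open("framework_matrix.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["study"] + QUESTIONS)
    for study, cells in framework.items():
        writer.writerow([study] + ["; ".join(cells[q]) for q in QUESTIONS])
```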
Results
A total of 13,472 peer-reviewed papers were found through the systematic search, as well as 20 reports found through the grey literature search. Of these, 20 UK- and Ireland-based papers and reports met the inclusion criteria and were included in the review 4, 21–39 (Supplementary File 2). Studies which reported time periods indicated that data were collected from 2004 to 2016, although seven studies published between 2011 and 2016 did not report the data collection period. Research participants included patients, service-users, lay persons, those living with chronic conditions, and the general public, ranging from 16 years of age to over 75. Five of the studies included the views of health researchers, health professionals, industry experts, NHS managers and other key stakeholders. Seven of the papers were quantitative, using surveys or structured questionnaires; ten were qualitative, using focus groups and one-to-one interviews; and three used mixed methods. Details of the studies are reported in Table 1.
Table 1. Included study characteristics.
Study no. | Reference | Method | Setting | Data Collection Period | Sample | Quality Score |
---|---|---|---|---|---|---|
21 | Audrey, S., et al., BMC Medical Ethics, 2016. 17(1): p. 53. | Qualitative: focus groups with participants from ALSPAC. | Bristol, England | Not stated | Total n=55; 56.4% female, 43.6% male; Ages: 17–19 | 5 |
22 | Baird, W., et al., Journal of Medical Ethics, 2009. 35(2): p. 92–96. | Qualitative: focus groups and interviews. | England and Northern Ireland | February and July 2006 | Total n=68; Focus groups: patients with MS and stakeholders (n=55); Interviews: health and social care professionals and academics (n=13) | 5 |
23 | Barrett, G., et al., British Medical Journal, 2006. 332(7549): p. 1068–1070. | Quantitative: survey run by the Office for National Statistics. | England, Wales and Scotland | March and April 2005 | Total n=2872; 46% male, 54% female; Ages: 16–44, 46%; 45–64, 35%; 65+, 20% | 6 |
24 | Buckley, B.S., A.W. Murphy, and A.E. MacFarlane, Journal of Medical Ethics, 2011. 37(1): p. 50–55. | Quantitative: postal electoral roll-based questionnaire survey. | Republic of Ireland | Not stated | Total n=1575; 27.6% male, 71.6% female; Ages: 18–75+ | 4 |
25 | Campbell, B., et al., Quality and Safety in Health Care, 2007. 16(6): p. 404–408. | Quantitative: postal questionnaire. | South-West England | October to December 2004 | Total n=166 patients recently discharged from the care of 78 bed-holding consultants across all specialties at the Royal Devon and Exeter Hospital | 5 |
26 | CM Insight and Wellcome Trust, Summary report of qualitative research into public attitudes to personal data and linking personal data. 2013. | Qualitative: focus groups and one-to-one telephone interviews. | London, Midlands and Norfolk, England | 29 April to 12 May 2013 | Total n=50; Ages: 18–70. Focus group respondents were recruited as owners of store loyalty cards, smart phones and social media users; telephone interviewees were recruited as especially pro-privacy or cautious about sharing personal data | 3 |
4 | Clerkin, P., et al., Family Practice, 2013. 30(1): p. 105–112. | Qualitative: focus groups. | West Republic of Ireland | Not stated | Total n=35; female (n=18), male (n=17); Ages: 18–35 (n=2), 36–55 (n=14), 56–70 (n=19) | 6 |
27 | Grant, A., et al., BMC Health Services Research, 2013. 13: p. 422. | Qualitative: focus groups and semi-structured interviews. | Tayside and Lothian, Scotland | Between February and June 2011 | Total n=64; Focus groups: patients (n=37), health services researchers (n=10); Interviews: GPs and practice managers (n=17) | 5 |
28 | Haddow, G., et al., Journal of Evaluation in Clinical Practice, 2011. 17(6): p. 1140–1146. | Qualitative: focus groups. | North East Scotland | May and June 2009 | Total n=19; female (n=12), male (n=6), unstated (n=1); Ages: <60 (n=1), 60–74 (n=15), 75+ (n=3) | 5 |
29 | Hays, R. and G. Daker-White, BMC Public Health, 2015. 15: p. 838. | Qualitative: using tweets. | – | Over 18 days during February and March 2014 | 3537 tweets containing the hashtag #caredata; 904 contributors | 6 |
30 | Hill, E.M., et al., BMC Medical Research Methodology, 2013. 13: p. 72. | Qualitative: focus groups. | England, Wales, Scotland and Northern Ireland | Not stated | Total n=19, 100% male; Ages: 54–69, mean age 61 | 4 |
31 | Ipsos Mori, Medical Research Council, The use of personal health information in medical research general public consultation. 2007. | Mixed methods. Qualitative: workshops and interviews; Quantitative: face-to-face survey. | England, Wales, Scotland and Northern Ireland | Quant: 14–18 Sept 2006; Qual: 29/7–5/8 2006 | Quant: 2106 people aged 15+; Qual: total n=69; Workshops: general public (n=63); Interviews: disabled people and people with chronic illnesses or their carers (n=6) | 3 |
32 | Ipsos Mori, Macmillan Cancer Support, Cancer Research UK, Perceptions of the Cancer Registry: Attitudes towards and awareness of cancer data collection. 2016. | Quantitative: online survey. | England | 13 June to 4 July 2016 | Total n=2,033; adults who have, or have had, cancer (PLWC) (n=1,033); adults from the general public (n=1,000); all 18+ | 5 |
33 | Ipsos Mori, Wellcome Trust, The one-way mirror: Public attitudes to commercial access to health data. 2016. | Mixed methods. Quantitative: face-to-face survey; Qualitative: workshops. | England, Wales, Scotland and Northern Ireland | September to December 2015 | Quant: n=2017, age 16+; Qual: n=247, comprising members of the general public, patients and ALSPAC cohort members (n=212) and GPs and hospital doctors (n=35) | 3 |
34 | Ipsos Mori, Wellcome Trust, Wellcome Trust monitor report wave 3: Tracking public views on science and biomedical research. 2016. | Quantitative: questionnaire using face-to-face Computer-Assisted Personal Interviewing (CAPI). | England, Wales, Scotland and Northern Ireland | 2 June to 1 November 2015 | Total n=1,524; Ages: 18+ | 5 |
35 | Luchenski, S.A., et al., Journal of Medical Internet Research, 2013. 15(8). | Quantitative: cross-sectional, self-completed questionnaire survey. | West London, England | Six weeks from 1 August 2011 | Total n=2857; 59.5% female, 40.5% male; Ages: 18–75+ | 5 |
36 | Papoutsi, C., et al., BMC Medical Informatics and Decision Making, 2015. 15(1): p. 124. | Mixed methods. Quantitative: questionnaire survey; Qualitative: focus groups and interviews. | West London, England | Between August 2011 and April 2013 | Quant: n=2761, 59.1% female; Qual: n=160, including patients (n=114) and health professionals and researchers (total not stated); Interviews: patients who did not wish to join group discussions (n=6) | 4 |
37 | Riordan, F., et al., International Journal of Medical Informatics, 2015. 84(4): p. 237–247. | Quantitative: cross-sectional, self-completed questionnaire survey. | West London, England | Six weeks from 1 August 2011 | Total n=3157; 60.4% female, 39.6% male | 5 |
38 | Spencer, K., et al., Journal of Medical Internet Research, 2016. 18(4): p. e66. | Qualitative: focus groups and interviews. | Salford, England | Not stated | Total n=40; 58% female, 43% male; Ages: 23–88 | 4 |
39 | Stevenson, F., et al., Family Practice, 2013. 30(2): p. 227–232. | Qualitative: focus groups and interviews. | Not stated | Not stated | Total n=57; patients (n=50), staff members (n=7) | 4 |
Quality assessment
Studies’ quality scores ranged from 3 to 6 out of a possible 6; the scores of individual studies are shown in Table 1. Two studies which otherwise met the inclusion criteria were rejected on the basis of quality and do not appear further in the results 40, 41.
Themes elicited from the studies
The seven themes identified in and elicited from the studies were: Knowledge and Awareness of Electronic Records; Willingness to Share; Privacy; Trust; De-identification and Consent Preferences; Increasing Trust; and Demographic Differences. The contribution of each study to each theme is shown in Table 2.
Table 2. Contribution of Studies to Themes.
Study | Knowledge and Awareness of Electronic Records | Willingness to Share Data | Privacy | Trust | De-identification and Consent | Increasing Trust | Demographic Differences |
---|---|---|---|---|---|---|---|
Audrey et al. 2016 | X | X | X | ||||
Baird et al. 2009 | X | X | X | X | |||
Barrett et al. 2006 | X | X | X | ||||
Buckley et al. 2011 | X | X | X | X | |||
Campbell et al. 2007 | X | ||||||
CM Insight and Wellcome Trust 2013 | X | X | X | X | X | |
Clerkin et al. 2013 | X | X | X | ||||
Grant et al. 2013 | X | X | X | X | |||
Haddow et al. 2011 | X | X | X | X | |||
Hays & Daker-White 2015 | X | X | X | X | X | X | |
Hill et al. 2013 | X | X | X | X | X | X | |
Ipsos Mori, MRC 2007 | X | X | X | X | X | X | X |
Ipsos Mori, MacMillan, CRUK 2016 | X | X | X | X | |||
Ipsos Mori, Wellcome Trust 2016 One Way Mirror | X | X | X | X | X | ||
Ipsos Mori, Wellcome Trust 2016 Monitor Report 3 | X | ||||||
Luchenski et al. 2013 | X | X | |||||
Papoutsi et al. 2015 | X | X | X | X | X | ||
Riordan et al. 2015 | X | X | X | ||||
Spencer et al. 2016 | X | X | X | ||||
Stevenson et al. 2013 | X | X | X | X |
Knowledge and awareness of electronic records
Generally, knowledge of the content and electronic collection of GP records among respondents was high. One quantitative study reported that a moderately high proportion of respondents (59%) had prior awareness of EHRs 37. Another quantitative study reported that levels of understanding of the information recorded by GPs were high, without giving exact numbers 24. One qualitative study reported that, across groups, participants had a good awareness of the kind of information that is usually held in general practice records 4. Nevertheless, participant awareness of specific uses of routinely collected patient data was low. For instance, two quantitative studies reported that 82% 23 and 80% 33 of the general public had not heard of the National Cancer Registry, while another study reported that patients were not only inadequately informed about Care.data, but were also unaware of the project 29. Two studies indicated that understanding of medical research using patient data was low 26, 31, while another suggested that participants were unaware of how their data was currently used 30. Another demonstrated a limited public grasp of a range of concepts related to patient information use, such as de-identification, data science, the benefits of aggregate data, and the role of private companies in the healthcare system. People with a lower understanding of these issues were more likely to have concerns about commercial access to health data 33.
Willingness to share
In many of the studies, participants expressed willingness to share their EHRs for secondary purposes such as research, policy and planning, despite the range of concerns discussed below. Among the quantitative studies, public support for a national EHR system was reported at 62.5% 36, 62.47% 35, and 81% 23, while support for sharing information in general was reported at 73% 32. Specifically, support for sharing data with researchers was reported at 68.7% 24, 83% 31, 77% 34, and 81.4% 36. In the qualitative studies, participants identified willingness to share their EHRs for secondary purposes with reference to the “common”, “greater” or “public good” 4, 21, 22, 26, 31; “social responsibility” 38, 39; “altruistic attitudes” 22; and “giving something back” 27 to “other people” or “future generations” 4, 22, 28–30, 38. For example, in one study it was stated:
I’m saying yes because I think there is a greater good. (Participant 1, Group 2, 35)
Such reasoning was largely predicated on the understanding that medical research using EHRs could lead to benefits such as the improvement of healthcare services, or innovations in the diagnosis and treatment of disease. For example:
I think if you are going to do something, eczema, allergies, something that affects one in five people you need the huge samples in order to do it. (Patient Interview L2, 39)
And:
. . . because you never know where research is going to go. You don’t know where some brilliant young scientist’s mind linking up different things, you know. And you cannot put a halt on that, a break on that. (Di, Focus Group 1, 28)
Moreover, it was also understood that using EHRs might be a better way of doing and facilitating research:
. . . I mean it’s a better system than it is at present, because you are going to get 100% response that way or near enough and the present system is that the GPs put out things on spec to people that may want to join this thing and they may get a very low return. (Male, Patient Focus Group 3, 27)
This suggests that, here, the “common good” consists of the collective public health benefits brought about by the improvement of the services, practices and methods of healthcare through secondary uses of data. Willingness to share appears connected to the idea of an individual having a personal responsibility, obligation or duty to help bring about this common good:
Once you have been in receipt of the excellent kind of care and treatment that I’ve had, I think you have a social responsibility that if you can help the next generation by having your information provided to the researchers to [do] some good. (Focus Group 3, 38)
Privacy
Despite the general willingness to share EHRs for secondary purposes, many qualifying concerns were raised by participants 21, 22, 26–33, 36, 38, 39. This suggests that although the sharing of EHRs is largely seen as being for the overall common good, participants believe that it also has the potential to create new risks and to increase existing ones. The various perceived risks involved in sharing EHRs were well described, with participants frequently citing hacking 29, 36; unintentional data leakage or loss 29; identity theft 36; unauthorised access 36; errors in medical records 36; unnecessary stigmatising judgements in clinical settings 36; consequences for employment, pension eligibility, or insurance costs 4; social discomfort and community embarrassment 4; re-identification 28; access without explicit consent 21; the use of EHRs for financial gain 30; aggregating data to a group’s disadvantage 28; and access, use and governance of data by the government 28. The breadth of this list demonstrates the structural complexity of the particular, concrete situations which study participants imagine may arise from the misuse of their data. Several studies connected these risks with the concept of privacy 4, 21, 22, 26, 33, 36, 38. Privacy was widely conceptualised as a process whereby an individual determines for themselves what happens with the information relating to them:
Seemingly radical idea: let PATIENTS control who can access their personal medical data! #caredata. (Twitter user, 29)
Participants frequently identified two key elements that could be determined in relation to their information. The first was whether information is revealed to, or accessed by another party:
My concern is exactly that: who has access to my files and how can we make sure that only those I want to have access would have access? (Focus Group 12, 36)
A second element concerned how this information should be used, or analysed, after it has been revealed to another party:
At the end of the day, it’s not who has access to it all, it’s how they use it, I think is the main concern for us, for everybody. . . how they use it. (Person with MS, Focus Group 7, 22)
These two factors were necessary components in identifying what was and was not acceptable when it came to unlocking the potential of health data.
Trust
Views on storing and using EHRs were linked to the kind of trust or distrust the public had in an organisation or individual using or accessing the data.
You have to trust people. (Fiona, Focus Group 2, 28)
Where participants distrusted organisations who would handle their data, this generally occurred along two lines:
1. Distrust of a party’s ability, or competence, to ensure data security.
2. Distrust of a party’s motivations.
In terms of a party’s competence, participants might agree that a particular party can store and use their EHR in principle, but be concerned that the party is not able to guarantee the level of security required for such personal data because of institutional incompetence. One such party was “the NHS” 30, 36. For example, in one study a large majority of respondents (71.3%) voiced doubts about the ability of the NHS to guarantee the security of EHRs, yet 53.5% of those respondents would nevertheless support the development of a national EHR 36. On the incompetence and inefficiency of the NHS, participants made statements such as:
I just have very little faith in the way that the NHS handles databases. I don’t think it’s got a very good record. . . (Focus Group 3, 36)
Always thought that [the NHS] would mess it up (Focus Group 11, 36)
#NHSPatientdata scheme handling a ‘masterclass in incompetence’ #CareData #NHS [link] [link]. (Twitter user, 29)
However, in the qualitative studies, participants expressed a generalised trust towards the NHS, especially in relation to GPs:
I mean I can trust the doctors and all . . . but other people, no. Once it leaves the NHS, I’d be wondering where it’s going and who’s looking at it. (Participant 19 38)
. . . once it goes out of the NHS, the NHS have no control over it whatsoever. (Di, Focus Group 1, 28)
Participants tended to say that the data would be safer in the hands of the NHS or a public sector or independent organisation, and that private companies were less likely to be as diligent in their handling of it 33.
When it came to an organisation’s motivation, there was a strong sense that any access and use of the data must be for the good of the individual patient or the common good of the public. Many studies indicated that any kind of data handling for private interests would be unacceptable 26–30, 36, 39. In terms of possible consequences, a recurring theme was that if a party had the wrong competences or motivations, this could lead to substantial harm at both an individual and a collective societal level. For instance, as the following quote illustrates, it was identified that the private profit motivations of insurance and marketing companies could lead to harms on an individual level:
One of my fears was if it somehow goes astray from there and somebody, for instance, like insurance companies, get a hold of it they could use it to their advantage and the patient’s disadvantage. (P2, Focus Group, 39)
However, direct harm to individuals is not a necessary factor in determining the wrongness of certain motivations. It was also indicated that even if no particular individual is disadvantaged, allowing those with private interests to access public data can constitute a collective harm. This is because there is a strong sense that data should only be used to benefit either individuals:
Financial gain comes into it then so why should you then let them look at your records? They are going to gain out of it and you’re not. . . (Participant 2, Group 2, 30)
Or, the public at large:
If there was a large commercial company. . . [that] had free and easy access to people’s medical records I don’t think that would be right. It would further their research into the particular drug or treatment, but it’d also further their profits that would be wrong. But if it was for medical research for everybody then that would be different. (Participant 6, Group 3, 30)
Despite this firm belief, several of the studies indicated a tension in the status of pharmaceutical companies, whose products are indispensable to medicine and the health of populations, but which ultimately operate in a profit-driven capacity 22, 27, 30, 31, 36. As Grant et al. 27 write, this leads some participants to see the involvement of pharmaceutical companies as a “necessary evil”.
This dimension was further discussed in the grey literature, which revealed a more nuanced picture regarding public opinion towards commercial uses of data. Support for commercial access to health data rose from 54% to 61% when the possibility of new treatments being discovered was taken into account 33, and participants were indifferent to who conducts research so long as the objective is to increase knowledge about the causes and cures of ill health 26. This suggests that participants recognise that not all commercial uses of data are driven by purely private interests, but can, at least in part, involve public-interested motivations too. In explaining the apparent reluctance of the public to accept certain private interests in order to ensure public benefits, one study identified that participants did not currently feel able to evaluate the motivations of commercial organisations using data, which left them with an unclear conception of what the public could stand to gain through these uses of data. As a result, participants tended to fall back on wider assumptions, personal beliefs and prejudices regarding private companies 33.
De-identification and consent preferences
In the quantitative studies, 67.5% 24 of respondents in 2011 and 91% 37 of respondents in 2015 were clear that, although it was acceptable for researchers to access their EHRs, they still expected to be asked for consent when their identifiable data was accessed for secondary purposes. However, there was less consensus over de-identified data, with 83.7% 24, 51% 25, and 49.3% 37 of respondents reporting willingness for their de-identified data to be shared or extracted without consent. Reasons for concern around de-identification also emerged in the qualitative studies, where participants questioned what would qualify as identifying information 36, whether de-identification could be achieved effectively 31, 36, and whether it was sufficient to remove the need for consent 21, 30; they also highlighted the risks of re-identifying individuals 26, 29.
Several studies also indicated substantial concerns about the opt-out rather than opt-in consent which was proposed in schemes such as Care.data 29, 39, while others noted that participants generally thought about consent along opt-in lines when asked for their opinions 21. Participants expressed worries about whether people would really understand the concept of opting-out 39. They also criticised opt-out on the basis that it was unethical and illegal 29. However, in one quantitative study 52% of the general public supported the opt-out method of collection for the National Cancer Registry 32, while a minority of participants in another study acknowledged that opt-out might be a better option given the impracticalities of opting-in 31.
The problem of selection bias and its connection with consent arrangements was explored in three studies 21, 30, 36. In two studies, some participants identified the potential for bias if the information which was gained was neither accurate nor balanced 21, 36:
If they’ve got mental health illness then. . . that might affect their willingness, so it might be hard to. . . gather enough information. I think that might be biased. . . (Male ID47, 21)
Participants also recognised that larger, more representative samples could be gained by an opt-out process:
You are going to get 100% response that way, or near enough and the present system is that the GPs put out things on spec to people that may want to join this thing and they may get a very low return. (Male, Patient focus group 3, 27)
This prompted discussion in one study about the importance of mitigating the requirement of consent by de-identifying information:
There is certain situations where you might be able to, it might be acceptable to ask or it might be acceptable just to go ahead and get it—as long as it wasn’t directly linked back to you as a person, it would be alright. . . (Female, ID6, 21)
In another study 30, after receiving a presentation about selection bias, participants recognised the difficulties faced by researchers. Interestingly, when asked whether this information had changed their opinion about using health data without consent, several participants in the group who had at first indicated reluctance reported that they had indeed changed their minds. A quantitative study showed that a substantial minority of respondents (20%) 31 believe that consent may not be needed if it is not practical to obtain.
Increasing trust
Across studies, participants identified several different infrastructure arrangements which could increase willingness to share EHRs for secondary purposes. Participants indicated that no single organisation should be responsible for deciding who could access and use their EHRs; rather, a committee of stakeholders was called for, including Caldicott Guardians, research consultants, members of the public, GPs, social services staff, charities, funders, patients etc. 22, 28. It was also felt that greater transparency was needed with regard to safeguarding processes and data-sharing arrangements 29, 38, including stiff penalties or fines for misuses of data 29, 33; the publication of results 33; clear guidelines and laws to regulate access and use of data 29; and regulators and parties accessing data being held to high standards 33. Several studies also indicated that participants wanted a better understanding of the nature of EHR initiatives and medical research 31, the purposes and benefits of using data 27, 31, de-identification and aggregation 33, and why in some situations consent might not be practical 33. More generally, participants wanted the security of records to be ensured 27, 33, private profit to be capped 33, and third-party access to be denied 33. In several studies, participants also indicated their preference to retain granular control over the data in their EHR through an explicit opt-in consent scheme, the right to withdraw at any time and the ability to tailor sharing preferences 22, 27, 29, 38.
Despite the breadth and diversity of participant suggestions for increasing willingness, it may be that no single strategy, or specific combination of strategies, will amount to a gold standard of acceptability or social licence. One study found that no particular safeguard made sharing data with commercial companies any more acceptable than any other 33. However, in the same study, participants were significantly less likely to endorse sharing data without any safeguards (49% agreed) than with safeguards (56–64% agreed, depending on the safeguard). This suggests that the precise nature of the safeguard is less important to improving willingness to share than knowing that there are safeguards in place.
Demographic differences
We aimed to ascertain whether the included studies indicated a level of heightened concern, worry or fear among one or more specific social groups, and we restricted this analysis to quantitative studies, which are designed to enable such contrasts. Although participants were asked a variety of different questions across each survey, we evaluated responses on the basis of whether they indicated an overall negative or positive attitude towards the sharing of EHRs for secondary purposes such as research. For example, in Papoutsi et al. 36, participants were asked if they would be more worried about the security of their information if it were part of a national EHR register, while Buckley et al. 24 asked if they would allow their EHRs to be provided to researchers without their explicit consent. Despite the differing approaches of these questions, we concluded that a response indicating more worry about security, and one indicating less likelihood of granting researchers access without explicit consent, were comparable insofar as they both represented a negative attitude towards the sharing of EHRs.
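In effect, each study-specific item was mapped onto a shared negative/positive polarity before any between-group comparison was made. The short Python sketch below illustrates that harmonisation step; the item wordings and the mapping are paraphrased assumptions, and the analytic judgement lies entirely in how the mapping is filled in.

```python
# Toy illustration of harmonising study-specific survey items to one polarity;
# not the coding actually used in the review.
ITEM_POLARITY = {
    ("Papoutsi et al. 2015", "more worried about security under a national EHR"): "negative",
    ("Buckley et al. 2011", "would allow researcher access without explicit consent"): "positive",
}

def attitude(study, item):
    """Return the harmonised attitude polarity for a study-specific item."""
    return ITEM_POLARITY.get((study, item), "unclassified")

print(attitude("Buckley et al. 2011",
               "would allow researcher access without explicit consent"))  # positive
```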
Within the quantitative studies, findings were reported across a whole range of demographic differences. Between studies, comparisons could only be made for age range, level of education, and ethnicity, and we found conflicting findings in all three of these categories. We found evidence that both younger and older people favour sharing their data, that people with lower levels of education were both more and less likely to agree to sharing without consent, and that people of non-white ethnicity were both more and less likely to support EHRs and to think of them as secure. For a full breakdown of the demographic results, see Table 3.
Table 3. Study findings on Demographic Differences.
Group | Indicative of Negative Attitude | Indicative of Positive Attitude |
---|---|---|
Age | Compared to those aged 25–34, respondents between the ages of 35–64 were more likely to report they would be worried about the security of their records as part of a national EHR 36. | An increase in age by each 10-year increment was significantly associated with an increased likelihood of reporting that any information can be provided to researchers without asking for consent 24. |
 | Compared to those aged 25–34, respondents over 35 years old were more likely to report less confidence in the ability of NHS security and were less likely to report that EHRs were equally or more secure than paper records 36. | Older people (55–64, 65+) were more likely than younger people (16–24, 25–34, 35–44, 45–54) to find a drug company conducting research into the unwanted side effects of a drug using de-identified data acceptable 33. |
 | Older people were increasingly more likely to report that they would not be in favour of a national EHR compared with 25–34 year olds 35. | Those aged 55–64 tended to agree that research should be conducted by commercial organisations if there is a possibility of new treatments being discovered, in comparison to 16–24s and 35–44s 33. |
 | | In the general public, support for the opt-out collection method was higher in over 55s (58%) than in 18–34s (49%) and 35–54s (49%) 32. Those over 55 were more likely to say that they would allow their data to be used for medical research compared to those aged 16–24 31. |
Education | Respondents with lower educational qualifications were more likely to expect to be asked for explicit consent before their de-identified records were accessed 37. | Compared with participants with higher degrees, individuals with no academic qualifications were less likely to say that they would worry about security if their record was part of a national EHR 36. |
 | | Compared with completion of third-level education, completion of only primary-level education was associated with an increased likelihood of reporting that any information can be provided to researchers without asking for consent 24. |
Socioeconomic Status | Those of a lower socioeconomic status were more likely to be concerned about privacy 23. | Those in the lower socioeconomic group DE (43%) were more likely to support companies using health data collected in the NHS to help target health products at different groups of people 33. |
 | Those in socioeconomic groups C2 and DE were less likely than those in AB and C1 to view the use of health data as having a potential benefit to society 26. | |
 | Those in the lower socioeconomic group DE were less likely than ABs to say they trusted a variety of people with their health data, that the advantages outweigh the disadvantages of using health data in research, and that researchers can use data without prior consent 31. | |
 | Those in socioeconomic groups C1 and C2 were less likely than ABs to allow their health data to be used 31. | |
 | Those in socioeconomic group DE (46%) were less likely to support commercial organisations undertaking health research with health data than ABs (62%) 33. | Those in socioeconomic group DE (26%) were less likely to support commercial organisations undertaking health research with health data than ABs (30%) 33. |
Ethnicity | Black British respondents were more likely to say they would not support the development of a national EHR system compared with White British respondents 35. | Compared with White British groups, White non-British, Asian, British Asian, Black-African, Caribbean, and British Black groups were more likely to say that EHRs are as secure as, or more secure than, paper records 36. |
 | Respondents identifying as belonging to an ethnic group other than White British were more likely to expect to be asked for explicit consent before their de-identified records were accessed 37. | |
 | Those whose ethnicity was not White British were more likely to be concerned about the invasion of privacy 23. | |
Discussion
We found that knowledge of the content and collection of electronic patient data was reasonably high, but knowledge about secondary uses, such as data sharing for research, was low. Nevertheless, when asked, participants were generally willing to share their data for the “common good”, subject to safeguards. Willingness was qualified by concerns about privacy and loss of control over personal information. Participants feared adverse outcomes less when they trusted both the motivation of research organisations to conduct research for the common good, and the competence of organisations to handle the data safely and without compromise. In evaluating opinions on consent mechanisms, our findings suggested that educational and deliberative research into public opinion may provide different answers from snapshot surveys. The results suggested a range of mechanisms to increase public trust, and the overarching theme here was transparency of motivation, data handling and data flow.
Beyond this, however, further analysis reveals how closely this public rationale maps to, and can be interpreted by, some of the foundational moral principles which Beauchamp and Childress 16 identify as paramount to governing biomedical practice. For instance, the included studies indicate that there is a widespread willingness to share EHRs for secondary purposes, in principle. This belief was held on the basis that using and accessing data in such a way can bring about benefits which are in the interests of all individuals, or in other words, the “common good”. This strongly suggests that what forms the basis of this belief is a general expectation that if members of the public can contribute to the welfare of each other by sharing data, then they feel a moral obligation to do so. This can be interpreted as a more specific formulation of the principle of beneficence, which urges us to act, where we can, to promote good. Interestingly, this shows that the public feel that they, and not just health professionals and researchers, have a moral duty to uphold within medical practice, and should give something back for the good of others where possible.
Nevertheless, willingness rarely led to unqualified support of the schemes designed to enable secondary use. This was withheld because, in practice, it was felt that key values would not, or could not, be ensured, thus bringing with it the risk of individual and collective harm. Two values in particular emerged as pertinent to conferring support: the first related to an organisation having the right expertise, or competencies, to ensure data security, and the second pertained to an organisation having the right motivations, i.e. those which serve the public, rather than private, interests. A party’s failure to show that it will meet either or both of these values led to the erosion of trust in that party and, ultimately, of support for sharing health data in general. The public might feel justified in objecting to irresponsible, or insecure, use of data because it is likely to cause individual harm; a direct violation of the principle of nonmaleficence. Similarly, the use of data for private gain may be said to be in violation of the principle of justice because it is generally unfair to exploit something for reasons other than what it was intended for. Finally, the use of patient data without consent may be seen to violate the principle of respect for autonomy. This is only a brief sketch of how such accepted ethical principles are reflected in public opinion about sharing health data; what it demonstrates, however, is the cogency of public ethical reasoning and its relationship to the Four Principles framework, which appears to be a constituent part of non-specialist thinking about the ethical practicalities of healthcare and medicine.
In concrete terms, this means that the public will not outright reject any research initiative, or project, on the sole basis that it requires the use of health data for purposes other than direct care. However, what will be demanded is that projects of this nature not only maintain a secure environment for health data, but also set research objectives which are primarily concerned with contributing to the common good. So long as these values are maintained, it is possible that the public may not object to companies outside the NHS accessing and using their data. In light of these findings, any researcher, regulator or company must ensure that the use of health data neither brings about risk of personal harm nor is conducted with the dominant aim of creating private profit. Consequently, any governmental safeguards that are put in place to protect data must seek to preserve both of these principles as a matter of course.
Strengths and limitations
We conducted a wide search and sifted a huge number of papers, including grey literature reports. The search was challenging because of the wide range of terms used within the literature for secondary data usage and for expressing the concept of public opinion and attitudes. We cast a wide net and spent time excluding papers, and we therefore believe this review encompasses all available research meeting our criteria up until the time the search was conducted. Our findings were deliberately limited to the UK and the Republic of Ireland to create a manageable, relevant and comparable body of literature. This enabled us to look for underlying principles for publics exposed to a particular type of healthcare system, but our findings are obviously only applicable within these contexts.
Synthesis of results was also challenging as there was a wide range of study types using different methods. The small samples and low response rates in the majority of studies are also likely to have introduced bias into the findings, as it is probable that only the members of the public most interested in the issues consented to take part in the research. This means that each study likely represents a narrow range of views. It is not clear how this might have affected results across the whole range of studies, but it is likely that the themes and views represented here are not a complete picture of the public’s opinions. Additionally, certain research questions of particular interest were not asked of participants, and therefore our understanding of public opinion is still limited. One example is whether the use of medical text (in contrast to structured data in medical records) elicits specific privacy concerns for the public.
Our analysis was informed and influenced by our respective backgrounds in philosophy, psychology and epidemiology. While attempting to be data-led, we must acknowledge that we may not have been wholly neutral in approach. However, our review highlights similar themes to Aitken et al., 15 suggesting a cross-over with other syntheses in this area.
Future directions
Our research clearly demonstrates the relevance of the widely accepted Four Principles approach to public deliberation on EHR research. We recommend that frameworks for regulators and data custodians deciding on access to data for secondary purposes encourage consideration of the following questions:
1. Do the methods of data collection in the proposal respect individual patient autonomy? (Respect for Autonomy)
Given that public concern about data sharing is currently widespread, it is essential that individuals have the option to opt out of any data-collecting scheme such as Care.data. This is of particular importance while public understanding and transparency of data sharing are low, since these factors often contribute to the overall feeling that there is not enough information on which to base decisions. Giving individuals the right to opt out, and ensuring such a process is as transparent and streamlined as possible, is vital for maximising public trust in an initiative.
2. Could granting access to the data, or granting a particular use of the data, lead to individual harm? (Nonmaleficence)
Regarding the access and use of data for research purposes, data custodians and regulators must develop sophisticated methods to identify any underlying risk to individuals in submitted proposals. Possible risks to consider are articulated earlier in the results section of this paper. Mitigations of risk may include high standards for data storage security, restrictions on data linkage where necessary, and evaluation of analytical methods. Depending on the dataset in question, and those involved, parties may need to submit further applications if they wish to use data they have already been granted access to for new purposes.
3. Are the objectives and the intended outputs primarily concerned with contributing to the public good? Do they have clear scientific value? (Beneficence)
In order to maintain a minimum standard of justice, ethical bodies and regulators must also evaluate proposals on the basis of their intended aims and whether they significantly contribute towards the common good. As part of this process, such bodies must identify what private value may be unlocked for the party involved, and assess whether any reported objectives and intended outputs are reasonably achievable in the proposal.
4. Related to 3: is any agreement between the NHS and organisations providing analytics (private or public) fair and just? Will it lead to an outcome which will provide benefit to the key stakeholders of the data, i.e. the patients? (Justice)
The engagement of industry and private companies to provide data analytics will be crucial to maximise benefits from patient data in the future. However, agreements with industry must be transparent and critically, must be fair, representing benefit or gain for both parties, and for the public. Failure to achieve fairness or transparency in data-sharing agreements may result in a collective harm, that is, a loss of public trust in the endeavours of research.
Conclusions
We have shown that public thinking about the privacy issues around sharing patient data for research maps onto established biomedical ethical principles. These can be used to frame guidance for data custodians, regulators and researchers when planning or approving research using patient data.
Funding Statement
This work was supported by the Wellcome Trust [202133].
The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
[version 1; referees: 2 approved with reservations]
Supplementary material
Supplementary File 1: PRISMA checklist.
Supplementary File 2: PRISMA flowchart, showing the number of records identified, included and excluded.
References
- 1. Academy of Medical Sciences: Personal data for public good: using health information in medical research. London, UK: Academy of Medical Sciences; 2006.
- 2. Kim KK, Joseph JG, Ohno-Machado L: Comparison of consumers’ views on electronic data sharing for healthcare and research. J Am Med Inform Assoc. 2015;22(4):821–30. 10.1093/jamia/ocv014
- 3. Botkin JR, Rothwell E, Anderson R, et al.: Public attitudes regarding the use of electronic health information and residual clinical tissues for research. J Community Genet. 2014;5(3):205–13. 10.1007/s12687-013-0175-8
- 4. Clerkin P, Buckley BS, Murphy AW, et al.: Patients’ views about the use of their personal information from general practice medical records in health research: a qualitative study in Ireland. Fam Pract. 2013;30(1):105–12. 10.1093/fampra/cms036
- 5. Grande D, Mitra N, Shah A, et al.: Public preferences about secondary uses of electronic health information. JAMA Intern Med. 2013;173(19):1798–806. 10.1001/jamainternmed.2013.9166
- 6. Teschke K, Marino S, Chu R, et al.: Public opinions about participating in health research. Can J Public Health. 2010;101(2):159–64.
- 7. Weitzman ER, Kaci L, Mandl KD: Sharing medical data for health research: the early personal health record experience. J Med Internet Res. 2010;12(2):e14. 10.2196/jmir.1356
- 8. NHS England: NHS England sets out the next steps of public awareness about Care.Data. 2013 [cited 2017 6th September].
- 9. Carter P, Laurie GT, Dixon-Woods M: The social licence for research: why Care.Data ran into trouble. J Med Ethics. 2015;41(5):404–9. 10.1136/medethics-2014-102374
- 10. Ramesh R: £140 could buy private firms data on NHS patients. The Guardian; 2013 [cited 2017 26 September].
- 11. Kushida CA, Nichols DA, Jadrnicek R, et al.: Strategies for de-identification and anonymization of electronic health record data for use in multicenter research studies. Med Care. 2012;50(Suppl):S82–S101. 10.1097/MLR.0b013e3182585355
- 12. Ohm P: Broken promises of privacy: Responding to the surprising failure of anonymization. UCLA Law Review. 2010;57:1703–77.
- 13. Rothstein MA: Is deidentification sufficient to protect health privacy in research? Am J Bioeth. 2010;10(9):3–11. 10.1080/15265161.2010.494215
- 14. Jones KH, Laurie G, Stevens L, et al.: The other side of the coin: Harm due to the non-use of health-related data. Int J Med Inform. 2017;97:43–51. 10.1016/j.ijmedinf.2016.09.010
- 15. Aitken M, de St Jorre J, Pagliari C, et al.: Public responses to the sharing and linkage of health data for research purposes: a systematic review and thematic synthesis of qualitative studies. BMC Med Ethics. 2016;17(1):73. 10.1186/s12910-016-0153-x
- 16. Beauchamp TL, Childress JF: Principles of Biomedical Ethics. 6th ed. New York: Oxford University Press; 2009.
- 17. Moher D, Liberati A, Tetzlaff J, et al.: Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151(4):264–9, W64. 10.7326/0003-4819-151-4-200908180-00135
- 18. Pace R, Pluye P, Bartlett G, et al. : Testing the reliability and efficiency of the pilot Mixed Methods Appraisal Tool (MMAT) for systematic mixed studies review. Int J Nurs Stud. 2012;49(1):47–53. 10.1016/j.ijnurstu.2011.07.002 [DOI] [PubMed] [Google Scholar]
- 19. Smith J, Firth J: Qualitative data analysis: the framework approach. Nurse Res. 2011;18(2):52–62. 10.7748/nr2011.01.18.2.52.c8284 [DOI] [PubMed] [Google Scholar]
- 20. Noblit GW, Hare RD: Meta-ethnography: Synthesizing qualitative studies. Sage;1988. Reference Source [Google Scholar]
- 21. Audrey S, Brown L, Campbell R, et al. : Young people’s views about the purpose and composition of research ethics committees: findings from the PEARL qualitative study. BMC Med Ethics. 2016;17(1):53. 10.1186/s12910-016-0133-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22. Baird W, Jackson R, Ford H, et al. : Holding personal information in a disease-specific register: the perspectives of people with multiple sclerosis and professionals on consent and access. J Med Ethics. 2009;35(2):92–6. 10.1136/jme.2008.025304 [DOI] [PubMed] [Google Scholar]
- 23. Barrett G, Cassell JA, Peacock JL, et al. : National survey of British public’s views on use of identifiable medical data by the National Cancer Registry. BMJ. 2006;332(7549):1068–72. 10.1136/bmj.38805.473738.7C [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24. Buckley BS, Murphy AW, MacFarlane AE: Public attitudes to the use in research of personal health information from general practitioners’ records: A survey of the Irish general public. J Med Ethics. 2011;37(1):50–5. 10.1136/jme.2010.037903 [DOI] [PubMed] [Google Scholar]
- 25. Campbell B, Thomson H, Slater J, et al. : Extracting information from hospital records: what patients think about consent. Qual Saf Health Care. 2007;16(6):404–8. 10.1136/qshc.2006.020313 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26. Summary report of qualitative research into public attitudes to personal data and linking personal data. CM Insight, Wellcome Trust,2013. Reference Source [Google Scholar]
- 27. Grant A, Ure J, Nicolson DJ, et al. : Acceptability and perceived barriers and facilitators to creating a national research register to enable 'direct to patient' enrolment into research: the Scottish Health Research Register (SHARE). BMC Health Serv Res. 2013;13:422. 10.1186/1472-6963-13-422 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28. Haddow G, Bruce A, Sathanandam S, et al. : 'Nothing is really safe': a focus group study on the processes of anonymizing and sharing of health data for research purposes. J Eval Clin Pract. 2011;17(6):1140–6. 10.1111/j.1365-2753.2010.01488.x [DOI] [PubMed] [Google Scholar]
- 29. Hays R, Daker-White G: The care.data consensus? A qualitative analysis of opinions expressed on Twitter. BMC Public Health. 2015;15:838. 10.1186/s12889-015-2180-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30. Hill EM, Turner EL, Martin RM, et al. : "Let’s get the best quality research we can": public awareness and acceptance of consent to use existing data in health research: a systematic review and qualitative study. BMC Med Res Methodol. 2013;13:72. 10.1186/1471-2288-13-72 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 31. The Use of Personal Health Information in Medical Research General Public Consultation. Ipsos Mori, Medical Research Council,2007. Reference Source [Google Scholar]
- 32. Perceptions of the Cancer Registry: Attitudes towards and awareness of cancer data collection. Ipsos Mori, Macmillan Cancer Support, Cancer Research UK,2016. Reference Source [Google Scholar]
- 33. The one-way mirror: Public attitudes to commercial access to health data. Ipsos Mori, Wellcome Trust,2016. Reference Source [Google Scholar]
- 34. Wellcome Trust monitor report wave 3: Tracking public views on science and biomedical research. Ipsos Mori, Wellcome Trust,2016. Reference Source [Google Scholar]
- 35. Luchenski SA, Reed JE, Marston C, et al. : Patient and public views on electronic health records and their uses in the United Kingdom: Cross-sectional survey. J Med Internet Res. 2013;15(8):e160. 10.2196/jmir.2701 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36. Papoutsi C, Reed JE, Marston C, et al. : Patient and public views about the security and privacy of Electronic Health Records (EHRs) in the UK: results from a mixed methods study. BMC Med Inform Decis Mak. 2015;15(1):86. 10.1186/s12911-015-0202-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37. Riordan F, Papoutsi C, Reed JE, et al. : Patient and public attitudes towards informed consent models and levels of awareness of Electronic Health Records in the UK. Int J Med Inform. 2015;84(4):237–47. 10.1016/j.ijmedinf.2015.01.008 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 38. Spencer K, Sanders C, Whitley EA, et al. : Patient Perspectives on Sharing Anonymized Personal Health Data Using a Digital System for Dynamic Consent and Research Feedback: A Qualitative Study. J Med Internet Res. 2016;18(4):e66. 10.2196/jmir.5011 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 39. Stevenson F, Lloyd N, Harrington L, et al. : Use of electronic patient records for research: Views of patients and staff in general practice. Fam Pract. 2013;30(2):227–32. 10.1093/fampra/cms069 [DOI] [PubMed] [Google Scholar]
- 40. Robinson G, Dolk H: Public attitudes to data sharing in Northern Ireland. Administrative Data Research Centre, Northern Ireland,2016. Reference Source [Google Scholar]
- 41. Stevenson F: The use of electronic patient records for medical research: conflicts and contradictions. BMC Health Serv Res. 2015;15(1):124. 10.1186/s12913-015-0783-6 [DOI] [PMC free article] [PubMed] [Google Scholar]