Abstract
Introduction:
There is increased interest in transitioning to a “learning health care system” (LHCS). While this transition brings the potential for significant benefits, it also presents several ethical considerations. Identifying the ethical issues faced by institutions in this transition is critical for realizing the goals of learning health care so that these issues can be anticipated and, where possible, resolved.
Methods:
Twenty-five semi-structured telephone interviews were conducted with 29 leaders at 25 health care institutions. Respondents were recruited using purposive sampling, targeting institutions considered to be leaders in the transition toward an LHCS. All interviews were audio-recorded and transcribed. NVivo 10 software was used to support qualitative analysis.
Results:
Respondents described seven ethical challenges: (1) ethical oversight of learning activities; (2) transparency of learning activities to patients; (3) potential tensions between improving quality and reducing costs; (4) data sharing and data management; (5) lag time between discovery and implementation; (6) transparency to patients about quality; and (7) randomization for quality improvement initiatives.
Discussion:
To move towards LHCS, several ethical considerations require further attention, including: the continued appropriateness of the research-treatment distinction; policy frameworks for privacy and data sharing; informing patients about learning activities; obligations to share data on quality; and the potential for trade-offs between quality improvement and cost control.
Conclusion:
To our knowledge, this is the first project to ask leaders from health care systems committed to ongoing learning about the ethical issues they have faced in this effort. Their experiences can provide guidance on relevant ethical issues, and what might be done to resolve them.
Keywords: Learning Health Care, Ethics, Comparative Effectiveness Research, Organizational Innovation
Introduction
Recently, there has been increased interest in the “learning health care system” (LHCS). Defined by the National Academy of Medicine (NAM; formerly the IOM) as a system in which “science and informatics, patient-clinician partnerships, incentives, and culture are aligned to promote and enable continuous and real-time improvement in both the effectiveness and efficiency of care,”1 the LHCS promises to speed the translation of research to improve patient care.
While this transition brings the potential for substantial benefits, it also presents several ethical considerations. In particular, integrating research with clinical practice challenges a longstanding paradigm in research ethics, codified in the Federal Policy for the Protection of Human Subjects (generally known as the “Common Rule”), that views research and clinical practice as distinct and independent endeavors.2 Efforts to substantially enhance the integration of research with clinical care thus raise concerns regarding ethics and regulatory oversight. For example, debate exists regarding whether and when review by an institutional review board (IRB) and patient informed consent should be required for “learning activities,” including quality improvement (QI) and research on standard medical practices, such as comparative effectiveness research (CER).3,4,5,6 While ethical oversight can help ensure the protection of individuals as patients and research subjects, it has been criticized for creating logistical and regulatory hurdles that may, paradoxically, stymie efforts to improve patient care.7
Other aspects of the LHCS may create additional ethical challenges. For example, achieving continuous and real-time improvement often relies upon collecting and analyzing vast amounts of patient data, raising potential issues for privacy and data security. Further, comparative studies may reveal that one therapy is marginally more effective but considerably more expensive than another, raising concerns that patient trust may be undermined if patients perceive learning activities as primarily aimed at cost control, rather than at improving quality.8,9 Finally, this transformation occurs against the backdrop of a broader shift toward patient engagement, requiring health systems to examine whether and how patients should be involved in decision-making about which learning activities should be introduced and how they should be conducted.
Identifying and understanding the ethical issues faced by institutions in the transition toward LHCS is critical for realizing the goals of learning health care so that these issues can be anticipated and, where possible, ameliorated or resolved. In partnership with the NAM’s Leadership Consortium for Value and Science-Driven Health Care (LCVSHC), we interviewed leaders from United States health care systems thought to be at the forefront of the transition to learning health care. Our goal was to understand their transition toward becoming an LHCS, and to identify the ethics and regulatory challenges they have faced along the way.
In this paper, we describe the ethical issues associated with transitioning to an LHCS model, as understood by health care system leaders. Hearing what these leaders have confronted, as well as how they have responded, may be valuable to other institutions as they consider moving toward becoming an LHCS.
Methods
Sample Selection and Recruitment
We used purposive sampling to recruit respondents, targeting institutions considered to be at the forefront of the transition toward an LHCS. We sought to include the perspectives of individuals with different institutional responsibilities—including clinical and executive leadership, as well as those from operations, research, strategy, and QI.
Colleagues from NAM proposed candidate institutions and respondents, drawing from lists of attendees and speakers at NAM meetings relevant to the topic of learning health care. Additional target respondents were identified by several “thought leaders” (experts) associated with the LCVSHC. Invitation letters sent from NAM described the project and noted that it was being conducted by colleagues at the Berman Institute of Bioethics.
Study Procedures
We conducted hour-long, semi-structured telephone interviews with institutional leaders at 25 health care institutions. One of the authors (SM) led the interviews, using detailed interview guides (available from the authors on request), which focused on two broad areas: (1) the origin and implementation of the transition toward an LHCS model, and (2) the ethics and regulatory issues encountered in this transition. In this manuscript, we report on the ethical issues identified by health system leaders. All interviews were audio-recorded and transcribed. The Johns Hopkins Bloomberg School of Public Health Institutional Review Board classified this project as non–human subjects research.
Analysis
We took an integrated approach to developing the coding structure.10 A priori codes, drawn from our interview guide, served as the organizing framework for our analysis. One investigator (SM) reviewed all transcripts for accuracy and to identify subthemes. Subthemes were grouped within the main a priori codes to develop our codebook, which one investigator (SM) then applied to all transcripts using the NVivo 10 software package. Memos were written for each a priori code, describing each subtheme and its frequency, and presenting quotations exemplifying those themes. Another author (NK) then reviewed the memos. Any differences of opinion about the meaning of specific quotations were discussed and resolved through an iterative process of discussion and comparison to the raw data.
Results
Sample Characteristics
We conducted 25 interviews between October 2014 and February 2015, involving a total of 29 institutional leaders. Most interviews were with one institutional leader, but a few respondents requested the involvement of 1–2 additional colleagues from the same institution. The sample included institutional leaders whose primary roles were in quality and safety, research, clinical care, operations, overall leadership (CEO), and strategy (Table 1). A list of participating institutions is provided in Table 2.
Table 1.
Primary Role | # of Respondents (N=29) |
---|---|
Quality, Safety | 7 |
Research | 6 |
Clinicians | 5 |
Operations | 5 |
CEO, Leadership | 4 |
Strategy | 2 |
Note: Self-reported data from respondents about their primary role within their institution.
Table 2.
Participating Institutions | |
---|---|
Advocate Health Care | Hospital Corporation of America (HCA) |
Baylor Scott & White Health | HealthPartners |
Bellin Health | Health Share |
Bon Secours Health System | Intermountain Healthcare |
Boston Children’s Hospital | Kaiser Permanente Colorado |
Carolinas HealthCare | Marshfield Clinic Health System |
Cincinnati Children’s Hospital Medical Center | Nemours |
Christiana Care Health System | New York–Presbyterian Hospital |
Dartmouth-Hitchcock Health | Palo Alto Medical Foundation |
Denver Health | Penn State Health |
Duke University Health System | Sutter Health |
Geisinger Health System | Vanderbilt |
Group Health | |
Ethical Challenges
Collectively, respondents described seven ethical challenges associated with the transition toward a model of learning health care (Table 3). Three of these challenges were raised in direct response to issues addressed in our interview questions: (1) ethical oversight of learning activities, including determining which activities required IRB review, and the impact of IRB review requirements on their ability to learn; (2) transparency to patients about learning activities; and (3) the potential tensions between improving quality and reducing costs. Four additional topics were raised independently by our respondents: (4) ethics of data sharing and data management, (5) lag time between discovery and implementation, (6) transparency to patients about quality, and (7) the ethics of randomization for care and QI initiatives. Below, we present a detailed description and illustrative quotations for each theme.
Table 3.
Ethical Challenge | Description of Issue |
---|---|
Ethical Oversight of Learning Activities | Distinguishing which learning activities should go to an IRB. |
Transparency to Patients about Learning Activities | Determining whether and how to disclose information to patients about ongoing learning activities. |
Potential Tensions in Improving Quality and Reducing Costs | Concern that moving toward continuous learning is not always in the financial interest of institutions. |
Ethics of Data Sharing and Data Management | Potential implications of sharing electronic data upon patient privacy. |
Lag Time Between Discovery and Implementation | Recognition of shortcomings of current system in both identifying and implementing evidence-based practices. |
Transparency to Patients about Quality | Determining whether and how to inform patients about underperforming providers or groups. |
Ethics of Randomization for Care and QI Initiatives | Concern that randomizing individuals to the placebo arm might fail to provide them with potential benefits. |
Ethical Oversight of Learning Activities
Issues related to IRB review were the most common challenges described by our respondents. Respondents from 16 institutions referenced challenges in determining which of their learning activities should go to an IRB and which should instead be considered part of health care operations, including QI initiatives. Among these, some respondents described their institutions as having developed consistent internal definitions or processes to distinguish between activities, while others said they were taking steps to achieve such internal consistency, such as developing criteria for use across all IRBs within the system.
Respondents described their approach to distinguishing activities that require IRB review from those that do not as generally involving two interrelated considerations: (1) whether the activity is testing a novel feature, such as a new treatment, versus implementing something that has demonstrated efficacy; and (2) whether the activity aims to produce “generalizable knowledge,” which to them meant it is intended to be disseminated outside the institution, versus focusing on internal processes, such as institutional QI. For respondents, activities that test a novel feature or are anticipated to produce generalizable knowledge require IRB oversight to maintain compliance with rules governing the protection of human subjects, whereas those that implement standard care practices or have demonstrated efficacy may instead call for either expedited review or no IRB oversight. As described by one respondent:
clearly we’re doing research when there’s a plan to do systematic collection of data for the purpose of dissemination of information. When we’re doing systematic collection of data for the purposes of evaluating our internal processes, that’s not research.
These respondents also emphasized intent to publish as a key criterion in determining whether they thought current policies and practices required IRB review, even if an activity’s primary aim was to improve internal quality rather than to test a new approach. One respondent attributed this, at least in part, to a desire to sidestep objections from journal editors who insist upon IRB approval, even for activities that do not constitute human subjects research:
technically an IRB isn’t really necessary because it’s already a limited data set and all the user is going to get is a totally de-identified data set. But I want to make sure that our researchers do not get caught up with journal editors saying: “where was your IRB approval.”
Some respondents characterized issues related to IRB review as delaying learning activities, or even preventing some projects altogether. Eleven participants described IRB review and the federal guidelines governing human subjects research as processes that could, in their words, “hamper” or “chill” learning. One respondent attributed this to a mismatch between IRB regulations (as governed by the Common Rule) and the needs of modern health systems:
[I]n general, the standard Office of Human Research Protections [OHRP] guidelines are really designed…to regulate formal clinical trials. Obviously, a lot of the work that we do here fits into the QI umbrella or is using existing data in retrospective study designs. I think…what IRBs are hampered by is their guidelines haven’t caught up with health systems’ needs in those areas yet. So I think that’s just a general thing all IRBs face.
Another respondent made a similar observation, noting that existing protections for human subjects “may not be aligned with some of the research activities that need to move forward.” The result of this misalignment, according to the respondent, was that some projects were getting “shifted into QI,” which limited the likelihood that the results would be shared with others.
Several respondents described using a variety of strategies to navigate this challenge, including streamlined review mechanisms or alternatives to IRB review for ethical oversight of learning activities. Examples included having an IRB chair review protocols to determine appropriateness for full IRB review, using a separate working group to review nonclinical uses of patient data, and obtaining a waiver of informed consent for minimal risk research. While some respondents described satisfaction with expedited or alternate review mechanisms, others noted that alternatives were not necessarily without challenge. For example, one respondent noted that applying for exempt status from the IRB could be burdensome:
You have to fill out the whole 13-page form to do it. It’s electronic now but it’s still a bit painful. I’ve tracked it—it’s about eleven man-hours to do an exempt submission in terms of getting everything together, loading it, waiting. We’ll get usually a “yeah you’re exempt” in about two-and-a-half weeks which—that’s burdensome!.. [W]ho has a day-and-a-half to put together [an IRB application] that we all know darn well that it’s exempt.
Transparency to Patients about Learning Activities
Four respondents described facing challenges in determining whether and how to disclose information to patients about ongoing learning activities within their health systems. For example, while activities deemed as research typically require patient notification and consent, patients are traditionally not told about many other uses of their data. As explained by one respondent: “[We] use our patients’ data for all sorts of purposes, from comparative effectiveness research to operational interventions—when do patients deserve to know?” Another respondent offered a similar observation:
[S]ome of the biggest challenges that we’re facing…have to do with actually figuring out a way to disclose to patients in better ways than we do now that they are part of a learning health system…what we would like to do is be in a position so that patients and families understand that data are used for learning as part of QI… Right now it’s just embedded in HIPAA forms that nobody reads.
A third respondent reflected on the challenges of communicating learning activities to patients, particularly when those activities are targeted toward implementing evidence-based care practices, which many patients may assume are already occurring: “all we’re doing is assuring that the care that we were supposed to be doing was actually happening, and so I’m not sure what you tell the patient then.”
A fourth described a different challenge related to messaging, noting the following:
people want to feel like they’ve got the support of an integrated system but they don’t want to feel like they’re being treated like a number. They want to feel like they’re being treated like an individual. So that kind of messaging would be tricky and I’ve not really given it a lot of thought.
Respondents from four institutions described existing mechanisms within their systems to inform patients about learning activities. For example, one respondent described building patient awareness of a large, pragmatic clinical trial by placing informational posters in patient rooms. Another respondent described including community representatives on the patient safety committee, and posting informational posters in the hospital cafeteria about ongoing efforts to improve patient safety within the hospital. However, deliberate efforts to inform patients were generally limited, both in prevalence and in scope. Two respondents referenced including language about ongoing learning within general clinical consent forms, but both conveyed skepticism about whether patients currently noticed this information.
Potential Tensions in Improving Quality and Reducing Costs
During the interviews, several respondents noted that the LHCS could provide a “win-win,” improving quality while reducing costs. However, 14 respondents shared insights about the potential for tensions between quality and efficiency, acknowledging that moving toward continuous learning is not always in the financial interest of institutions. Several of these respondents attributed this to misaligned incentive structures within fee-for-service medicine. As one respondent explained:
we’re still on a fee-for-service [model] primarily, so the more we improve our outcomes, the more we’re actually hurting our bottom line. And it’s always been like that…You want to do the right thing, but when you look at the bottom line, it’s like, “Wow, how do you really survive in a value based system?”
Despite these challenges, our respondents emphasized that quality should be the first priority. Five respondents noted the importance of the institutional mission in decision-making about quality and cost. As one respondent described in the context of a program aimed at reducing hospital length of stay:
From a commercial perspective, it’s a big loss. But then you get back to what’s the right thing, and the right thing is to cut length of stay if you can. It’s not good for patients to hang out in the hospital unnecessarily…I guess I’m lucky I work at an organization that’s really about the mission. We used to do DVT [deep vein thrombosis] scans before we put patients on DVT prophylaxis…So we used to scan to make sure you didn’t have a clot. And we said that’s not evidence-based practice. It delays implementation of that prophylaxis and it was a big revenue loss to do that but we did it because [scanning] wasn’t the right thing to do. So I feel very fortunate I work at a place that does the right thing.
Ethics of Data Sharing and Data Management
Eleven respondents described challenges associated with data sharing and data management. Most commonly, respondents framed issues related to data sharing as regulatory, rather than ethical, concerns. For example, several respondents referenced concerns about violating the Health Insurance Portability and Accountability Act (HIPAA) when sharing patient data across institutions. However, four respondents explicitly linked concerns about sharing of electronic data to ethical issues involving patient privacy.
These respondents characterized privacy as potentially in tension with other desired values, including efficiency of research, integration of care delivery, and, ultimately, improving quality of care. For example, one respondent described his institution’s concern for patient privacy as making the institution reluctant to permit researchers to use email to recruit patients, even though patient email addresses were available through its clinical operations. To the respondent, this reluctance reflected the “nature of the disconnect between the clinical enterprise and the research enterprise,” a disconnect that ultimately “stifle[s]” the efforts of investigators.
Another respondent characterized privacy as being in tension with integrated care:
there’s so many barriers to integrated care, based on who can share what information… we’re all for protecting individual privacy, but we’ve made it so arduous that it’s very hard to create integrated models.
A third respondent referenced the influence of public understandings of privacy and data ownership as shaping the debate over privacy within the LHCS:
our country has this sort of overly zealous sort of privacy imperative…In other countries, the data created is a by-product, and to share is considered a common good. Because, after all, the common wealth paid for your care. And given that the government pays for nearly 70 percent of healthcare in America, it would say that there should be a whole lot more opportunity for liberating [the data].
One respondent observed that provider competition could be in tension with data sharing:
[There is] a massive amount of big data that’s out there. We all have it. We’re all trying to tap into it to better care… It’s just so complicated when…all of our folks that we’re learning from, our partners, we’re also competing with. And that dynamic sometimes—the message is “you have to collaborate more” but really we’re a business and we’re competing…
Lag Time Between Discovery and Implementation
Two respondents identified the obligation to implement evidence-based practices as a central ethical issue for the LHCS. As described by one respondent in reference to an evidence-based approach for reducing hospital-acquired infections:
so that [study] is out there, right? It’s been published. But, I guarantee you if you polled any number of hospitals across this country, I am sure a lot of hospitals are unaware of those results and/or haven’t changed practice patterns related to those results. So, that’s an ethical issue, right....How do we educate and then how do we motivate people to say, “This is something we need to do” and not tomorrow, but today.
Another respondent similarly said:
once you’ve uncovered something that potentially has a real significant impact on clinical practice patterns, you say, “We’ve been doing this and outcomes are better if we do that.” The time from when that result is deciphered from whatever study was being done to how soon it actually gets applied to patients, and to the extent that sort of the traditional research infrastructure that’s in place lengthens that time from when we’ve actually discovered something to when it actually impacts patient care patterns, I think that’s an ethical issue.
Transparency to Patients about Quality
An additional ethical issue, identified by two respondents, involved decisions about how to use information about providers or groups that are underperforming, either compared to others within their system or to external competitors. These respondents described difficulty deciding whether institutions should disclose to patients that their care would likely be better with another clinician, either within the system or at a competing institution. As described by one respondent:
if there are five cardiovascular surgeons in your hospital and two of them are outstanding or two of them are okay and one of them is so-so, and I come to your hospital to have my coronary bypass surgery, will you as a hospital tell me to only take the first two? I would bet you almost nobody has built into their system the willingness to tell people, “I wouldn’t let [Doctor X] operate on me.”
According to this respondent, such decisions are “the biggest ethical and moral issue we face” in health care. Another respondent invoked a comparison to the movie Miracle on 34th Street to make a similar observation regarding disclosure of quality and transparency. The respondent referenced a scene in which a department store Santa Claus directs families to competitor stores if the price or quality of a product is better elsewhere. In the movie, this disclosure promoted business by enhancing consumer trust. In health care, however, the respondent noted, “we’re not there yet.”
Ethics of Randomization for Care and Quality Improvement (QI) Initiatives
Finally, one respondent described traditional approaches to evaluation, particularly randomized controlled trials, as presenting potential ethical challenges for learning health activities. Specifically, this respondent expressed concern that randomizing individuals to the placebo arm of a QI trial might fail to provide individuals with potential benefit. As described by this respondent in the context of considering an RCT for care management for an at-risk population: “if you’ve engaged somebody and you really understand them, and you understand that you can help them, at some point, do you hit the button and say, ‘Oops, I’m sorry, you randomized out. We can’t help you. See you later.’”
Discussion
The current United States health system has been described by the NAM as one characterized by “missed opportunities, waste, and harm.”11 From this perspective, transitioning toward a model of learning health care is a moral imperative. Nevertheless, individuals at the forefront of this transition highlight several ethical challenges encountered along the way.
Ethical Oversight of Learning Activities
The experiences of our respondents provide key insights for contemporary efforts to ensure that systems for ethical oversight are aligned with the goals of LHCS.2,3,12,13 In particular, our findings suggest confusion persists about when learning activities need to be submitted to an IRB as systems enhance integration between research and practice. This confusion is consistent with concerns voiced previously in the literature,14,15,16,17 as well as in a recent empirical study by Whicher et al.,18 who found that professionals responsible for QI and CER experience challenges related to ethical oversight of these activities, particularly in determining which aspects of activities constitute research, and which are practice. These issues suggest a mismatch between existing regulatory guidelines of the Office of Human Research Protections (OHRP)—developed in the 1970s in response to significant ethical scandal—and the oversight needs of the LHCS.
A recent proposal by the United States Department of Health and Human Services (HHS) to amend regulatory guidelines for research may offer at least a partial remedy. In September 2015, HHS, along with 15 other federal departments and agencies, released a “notice of proposed rulemaking” (NPRM) to revise the Common Rule. Several proposed changes were designed to reduce confusion and streamline oversight decisions related to learning activities.19 For example, the NPRM proposes additional categories of exempt research that do not require review and for which IRBs need not prospectively confirm, in each instance, that exemption criteria are met, which could help to reduce regulatory burden for the large proportion of learning activities involving very low risk or burden. However, the current NPRM does not provide sufficient guidance for oversight of cluster randomized trials, including whether or what type of oversight is required for projects designed to compare one QI approach to another, or one set of hospital policies or reminders to another.
While the changes proposed in the NPRM may offer future clarity regarding whether and when oversight is needed, the experiences of our respondents highlight that the ongoing uncertainty, at least in some cases, brings real costs. Several respondents observed that the existing paradigm could “chill” activities aimed at improving the quality of care delivered to current and future patients. They referenced activities that were not undertaken, were delayed, or were done in less rigorous ways, due to the challenges of securing IRB approval. This concern is consistent with arguments that the current system overprotects patients from low-risk activities that stand to improve the quality and safety of health care, while ultimately underprotecting patients from the harms of exposure to inappropriate or substandard care that results from insufficient research.11,20,21 Furthermore, as one of us has argued elsewhere, current regulatory approaches may create “dubious incentives” for institutions to design learning activities so as to avoid being classified as research, which may have the unfortunate by-product of reducing the rigor of their evaluation and their likelihood of being disseminated to other institutions.2
Relatedly, requiring IRB review for publication, even for minimal risk QI activities, may inadvertently burden IRBs, taking time away from the review of activities that merit greater ethical oversight. The OHRP has determined that intent to publish is an “insufficient criterion” for determining whether or not an activity constitutes research.22 While this clarification is helpful, the experience of our respondents suggests that this issue remains a source of confusion for investigators, IRBs, and journal editors. Further dissemination of OHRP’s determination may be useful.
Transparency to Patients about Learning Activities
Informing patients about learning activities underway in their health systems has been proposed as an ethical obligation of LHCS.2 However, the experience of our respondents suggests that transparency about learning activities is not the norm, despite several of these leaders believing it is an important commitment for institutions to advance. This suggests the need for further exploration into how patients should be informed about learning activities, including which frames are best understood and which best communicate the ultimate goal of learning activities, namely, to improve the quality of care that all patients receive. More generally, further work is needed on how institutions can begin to incorporate such disclosures into ordinary practice. Relevant to this inquiry will be explorations of the degree to which patients are already aware of learning activities within their health systems, and of patients’ preferences regarding the disclosure of learning activities.
Potential Tensions in Improving Quality and Reducing Costs
An important ethical issue was raised by two of our respondents: that systematic learning may ultimately result in recommendations that fewer procedures or tests be conducted, and yet such recommendations could pose conflicts for institutions operating with a fee-for-service reimbursement mechanism. It has been suggested that certain commitments to quality will be easier to operationalize when economic incentives are aligned with quality and outcomes rather than with volume. It was heartening to hear several respondents say that their institutions stand behind quality regardless of economic implications.
Ethics of Data Sharing and Data Management
Issues related to data sharing and privacy present another area where existing governance models may not align with the goals of learning health care. Our respondents described existing data protections such as those under HIPAA as presenting obstacles to the sharing and use of data, particularly sharing across institutions. Respondents noted concerns both with federal regulations themselves, and with institutional interpretations and applications of these rules. Their observations thus add weight to arguments that current approaches to protecting the privacy of health information may constrain health systems from fully realizing the benefits of CER and related efforts to aggregate health data to improve quality and patient outcomes.23,24
Several alternative policy frameworks have been proposed to govern data sharing within the context of CER and related learning activities, including: streamlining mechanisms for seeking and managing patient consent for data use, standardizing security controls for data sharing and storage across institutions and jurisdictions, and enhancing federal enforcement capabilities in the event of privacy violations.23,25 In the absence of broader policy changes, several contemporary models exist for managing data sharing across institutions. For example, networks such as those used by the Veterans Health Administration (VHA) and the High Value Healthcare Collaborative (HVHC) have created centralized models for aggregating and sharing patient data across systems. Within the HVHC, data sharing policies and procedures are outlined in a Master Collaborative Agreement, which specifies, among other requirements, that data exchange across institutions is limited to de-identified data and that a single IRB governs all collaborative studies.26 The VHA has also taken steps to clarify what constitutes research and what constitutes “nonresearch health care operations activities,” including the collection and sharing of data related to various quality initiatives and public health investigations.27,28 The Food and Drug Administration’s Mini-Sentinel project, a collaborative effort for safety surveillance of drugs and other medical products, has also been proposed as a model for secondary data use and data sharing across institutions.29 The experience of multi-institutional collaborations within the HMO Research Network (HMORN) may also be instructive, including approaches for ceding review to a single IRB and the development of a reciprocal data use agreement (DUA) to govern data sharing across all sites.30 Finally, additional approaches will likely be suggested by the National Patient-Centered Clinical Research Network (PCORnet), operated by the Patient-Centered Outcomes Research Institute (PCORI), which recently launched a task force to develop privacy policies to govern data sharing within the network.31 Further research is needed to evaluate and compare governance models for data sharing across networks, to identify which models work best and in which settings.32
Other factors may influence willingness to share data, beyond concerns for compliance with legal requirements. For example, as described by one of our respondents, competitive pressures may undermine data sharing. This may occur at the level of individual researchers, based upon a desire to receive credit for a particular scientific finding, or at the institutional level, driven by market competition and the push to advance or maintain institutional prestige. This issue may be exacerbated by market competition among vendors of electronic health record and other data systems, some of whom are reportedly obstructing the electronic sharing of health information as part of a business strategy to enhance their market dominance.33 Encouragingly, prior research suggests that data overprotectiveness may be less problematic when participants have a history of working together.26 Further attention should be directed to this issue, including developing strategies to incentivize data sharing across institutions.
Transparency to Patients about Quality
Transparency about quality requires further attention. This issue is not unique to learning contexts; there are longstanding challenges regarding whether or when health care systems have an obligation to share data on underperforming providers. Nevertheless, the rapid expansion of both the amount of data and the analytic capability of health systems to use those data has amplified this issue.
Three large academic medical centers (Johns Hopkins Medicine, Dartmouth–Hitchcock Medical Center, and the University of Michigan Health System) recently proposed quality thresholds for 10 high-risk procedures, including a provision that surgeons at their institutions not perform these surgeries unless they perform a minimum number each year.34 However, implementing the proposal will require approval from the physician leadership within the respective institutions. Alternatively, the Centers for Medicare and Medicaid Services (CMS) or The Joint Commission could set requirements for meeting certain thresholds to perform procedures, akin to current policies requiring CMS approval to perform organ transplants.
Ethics of Randomization for Care and Quality Improvement (QI) Initiatives
One respondent expressed discomfort with randomizing patients to a placebo arm, viewing it as potentially providing worse quality of care. It is, of course, essential to allow such randomization only when evidence about the efficacy of the intervention in question is actually lacking. Remarkably, many interventions presumed to be effective have turned out, upon more rigorous evaluation, not to provide any added benefit, suggesting that in many cases this moral distress may be misplaced.21
However, this respondent’s observation relates to a debate elsewhere in the literature regarding the ethics of randomizing patients to two or more standard of care arms, and whether randomization should always require consent. This issue will only become more salient with the rise in CER. Further guidance will be needed to assist institutions in evaluating the ethics of randomization in these circumstances.35,36
Additional Ethical Issues
Certain potentially relevant ethical issues were not discussed in our interviews. For example, several scholars have set forth normative arguments that patients have an obligation to contribute to learning activities.2,17,37 Such arguments are commonly grounded in claims of reciprocity, holding that it would be unfair for patients to benefit from learning activities without contributing themselves, but not all agree that patient participation should be obligatory, at least for certain learning activities.13,38 While it is possible some of our respondents would support such an obligation, none discussed it during our interviews. Additionally, no respondent described ethical challenges related to implementing evidence-based medicine, such as what to do when one treatment is nearly as effective as the best available but substantially less expensive, or how much leeway clinicians should be given to deviate from evidence-based practices to incorporate patient preferences or other factors in clinical decision-making.39
Limitations
Our study had several limitations. First, we purposively selected participants who were considered thought leaders in the transition toward a model of learning health care. While it was an explicit goal of this project to hear the experiences of those furthest along in the transition to an LHCS, the experiences and views of these individuals may not be representative of other health system leaders, or of the experiences others will have if or when they embark on this road. We also were able to interview only a sample of such thought leaders; other innovators likely would have provided additional insights.
Second, our sample did not include individuals with primary responsibility for ethical oversight, such as bioethicists or IRB staff and members at these institutions. Had we targeted such individuals, who are charged specifically with thinking about ethics, they likely would have offered different or more extensive comments, and additional ethical issues would likely have arisen. Our goal, however, was to determine what those leading the LHCS charge had experienced from their own perspective. We also did not interview any individual with primary responsibility for informatics or privacy; the perspectives of these individuals likely would have suggested additional challenges and strategies related to data management.
Lastly, the frequency of ethical issues described by our respondents reflects, at least in part, the topics included in our interview guide. It is likely more respondents would have described ethical issues related to the lag time between discovery and implementation, for example, if we had included such questions on our interview guide. Consequently, the frequency presented here should not be taken to reflect either the relative moral importance of the identified ethical issues, or how regularly they are encountered.
Conclusion
An estimated $750 billion is spent in the United States each year on care that is unnecessary, unproven, or wrong.1 Efforts by PCORI, the VA, and indeed many of the institutions represented in this study to more closely integrate care and research should create more efficient and more widely available opportunities to learn. Nevertheless, if key ethical concerns remain unaddressed, progress will be slowed.
This is the first project we know of that asks leaders from health care systems committed to ongoing learning about the ethical issues they have faced in this effort. As more institutions transition to such systems, and as policymakers work to support these transitions, we hope that the experiences of these institutions will provide guidance on the ethical issues at stake and, in some cases, what can be done to resolve them.
Acknowledgments
We would like to thank colleagues at the National Academy of Medicine (NAM) for their tremendous support of this project, especially Claudia Grossmann, Michael McGinnis, and Marianne Hamilton-Lopez for their assistance with project design, recruitment, and dissemination; and Andrew Wong, Sophie Yang, and Mina Bahktiar for administrative support. Thank you also to the IOM for additional material and in-kind support, including the costs of transcription. Richard Platt, Ruth Faden, Rob Califf, Eric Larson, and Jeremy Sugarman provided guidance in conceptualizing the project, developing the interview protocol, and selecting the sample. We thank the Berman Institute and colleagues, especially Joe Ali, for feedback during a preliminary presentation of our study findings, and the Hecht-Levi Postdoctoral Fellowship, which supported Dr. Morain’s time on this project. The editorial staff at eGEMs and two anonymous reviewers provided exceptionally detailed and constructive feedback. Finally, we are grateful to our respondents, for their time in sharing their insights and experience with us.
References
1. Institute of Medicine. Best Care at Lower Cost: The Path to Continuously Learning Health Care in America. Washington, DC: National Academies Press; 2013. p. 17.
2. Faden RR, Kass NE, Goodman SN, Pronovost P, Tunis S, Beauchamp TL. An ethics framework for a learning health care system. Hastings Cent Rep. 2013;43(S1):S16–27. doi: 10.1002/hast.134.
3. Faden RR, Beauchamp TL, Kass NE. Informed consent, comparative effectiveness, and learning health care. N Engl J Med. 2014;370(8):766–768. doi: 10.1056/NEJMhle1313674.
4. Solomon MZ. How Institutional Review Boards can support learning health systems while providing meaningful oversight. Health Affairs Blog. 2015 Jun 5. http://healthaffairs.org/blog/2015/06/05/how-institutional-review-boards-can-support-learning-health-systems-while-providing-meaningful-oversight/
5. Fleischman AR, Solomon MZ. Comparative effectiveness research: ethical and regulatory guidance. JAMA Pediatr. 2014;168(12):1089–90. doi: 10.1001/jamapediatrics.2014.1764.
6. Grady C. Enduring and emerging challenges of informed consent. N Engl J Med. 2015;372(9):855–862. doi: 10.1056/NEJMra1411250.
7. Casarett D, Karlawish JH, Sugarman J. Determining when quality improvement initiatives should be considered research: proposed criteria and potential implications. JAMA. 2000;283(17):2275–80. doi: 10.1001/jama.283.17.2275.
8. Mazor KM, Sabin JE, Goff SL, Smith DH, Rolnick S, Roblin D, Raebel MA, Herrinton LJ, Gurwitz JH, Boudreau D, Meterko V, Dodd KS, Platt R. Cluster randomized trials to study the comparative effectiveness of therapeutics: stakeholders’ concerns and recommendations. Pharmacoepidemiol Drug Saf. 2009;18(7):554–561. doi: 10.1002/pds.1754.
9. Owens DK, Qaseem A, Chou R, Shekelle P. High-value, cost-conscious health care: concepts for clinicians to evaluate the benefits, harms, and costs of medical interventions. Ann Intern Med. 2011;154(3):174–80. doi: 10.7326/0003-4819-154-3-201102010-00007.
10. Bradley EH, Curry LA, Devers KJ. Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Serv Res. 2007;42(4):1758–72. doi: 10.1111/j.1475-6773.2006.00684.x.
11. Institute of Medicine. Best Care at Lower Cost: The Path to Continuously Learning Health Care in America. Washington, DC: National Academies Press; 2013. p. 17.
12. Largent EA, Miller FG, Joffe S. A prescription for ethical learning. Hastings Cent Rep. 2013;43(S1):S28–29. doi: 10.1002/hast.135.
13. Grady C, Wendler D. Making the transition to a learning health care system. Hastings Cent Rep. 2013;43(S1):S32–S33. doi: 10.1002/hast.137.
14. Foubister V. In Focus: Pursuing Patient Safety. Quality Matters. The Commonwealth Fund; 2008 [cited 2015 Jul 16].
15. Largent EA, Joffe S, Miller FG. Can research and care be ethically integrated? Hastings Cent Rep. 2011;41(4):37–46. doi: 10.1002/j.1552-146x.2011.tb00123.x.
16. Baily MA. Harming through protection? N Engl J Med. 2008;358(8):768–9. doi: 10.1056/NEJMp0800372.
17. Baily MA, Bottrell M, Lynn J, Jennings B. A Hastings Center special report: the ethics of using QI methods to improve health care. Hastings Cent Rep. 2006;36(4):S1–S40. doi: 10.1353/hcr.2006.0054.
18. Whicher D, Kass N, Saghai Y, Faden R, Tunis S, Pronovost P. The views of quality improvement professionals and comparative effectiveness researchers on ethics, IRBs, and oversight. J Empir Res Hum Res Ethics. 2015;10(2):132–44. doi: 10.1177/1556264615571558.
19. U.S. Department of Health and Human Services. Notice of Proposed Rulemaking for revision to the Common Rule. Federal Register. 2015 Sep 8;80(173):53933–54061.
20. Beauchamp TL. Why our conceptions of research and practice may not serve the best interests of patients and subjects. J Intern Med. 2011;269(4):383–387. doi: 10.1111/j.1365-2796.2011.02350_1.x.
21. Kass NE, Faden RR, Goodman SN, Pronovost P, Tunis S, Beauchamp TL. The research-treatment distinction: a problematic approach for determining which activities should have ethical oversight. Hastings Cent Rep. 2013;43(S1):S4–15. doi: 10.1002/hast.133.
22. United States Department of Health and Human Services. Quality Improvement Activities Frequently Asked Questions. Available at: http://www.hhs.gov/ohrp/policy/faq/quality-improvement-activities/intent-to-publish.html.
23. Peddicord D, Waldo AB, Boutin M, Grande T, Gutierrez L. A proposal to protect privacy of health information while accelerating comparative effectiveness research. Health Aff. 2010;29(11):2082–90. doi: 10.1377/hlthaff.2010.0635.
24. McGraw D. Paving the regulatory road to the “Learning Health Care System”. Stanford Law Review. 2012;64:75–81.
25. McGraw D, Dempsey JX, Harris L, Goldman J. Privacy as an enabler, not an impediment: building trust into health information exchange. Health Aff. 2009;28(2):416–427. doi: 10.1377/hlthaff.28.2.416.
26. McGraw D, Leiter AB. Pathways to success for multi-site clinical data research. eGEMs. 2013;1(1): Article 13. doi: 10.13063/2327-9214.1041.
27. Puglisi T. Reform within the Common Rule? Hastings Cent Rep. 2013;43(S1):S40–S42. doi: 10.1002/hast.140.
28. Department of Veterans Affairs, Veterans Health Administration. Handbook 1058.05, “Operations Activities That May Constitute Research.” 2011 Oct 28.
29. McGraw D, Rosati K, Evans B. A policy framework for public health uses of electronic data. Pharmacoepidemiol Drug Saf. 2012;21(S1):18–22. doi: 10.1002/pds.2319.
30. Lauf SL, Pieper LE, Rowe J, Vargas IM, Goff MA, Daley MF, Tuzzio L, Steiner J. Accelerating regulatory progress in multi-institutional research. eGEMs. 2014;2(1). doi: 10.13063/2327-9214.1076.
31. Greene S. PCORnet: An Update on Our Blueprint for Transforming Health Research. 2014 Jun 6 [cited 2015 Jul 16]. Available from: http://www.pcori.org/blog/pcornet-update-our-blueprint-transforming-health-research.
32. Elliot TE, Holmes JH, Davidson AJ, Nelson AF, Steiner JF. Data warehouse governance programs in healthcare settings: a literature review and a call to action. eGEMs. 2013;1(1). doi: 10.13063/2327-9214.1010.
33. Pear R. Tech rivalries impede digital medical record sharing. The New York Times. 2015 May 26.
34. Pronovost P. Cut the high risk of low-volume hospitals. US News. 2015 Jun 5. http://www.usnews.com/opinion/blogs/policy-dose/2015/06/03/low-volume-hospitals-create-big-risks-for-surgery-patients.
35. Kim SH, Miller FG. Ethical complexities in standard of care randomized trials: a case study of morning versus nighttime dosing of blood pressure drugs. Clin Trials. 2015;12(6):557–563. doi: 10.1177/1740774515607213.
36. McRae AD, Weijer C, Binik A, Grimshaw JM, Boruch R, Brehaut JC, Donner A, Eccles MP, Saginur R, White A, Taljaard M. When is informed consent required in cluster randomized trials in health research? Trials. 2011;12:202. doi: 10.1186/1745-6215-12-202.
37. Lynn J, Baily MA, Bottrell M, Jennings B, Levine RJ, Davidoff F. The ethics of using quality improvement methods in health care. Ann Intern Med. 2007;146(9):666–673. doi: 10.7326/0003-4819-146-9-200705010-00155.
38. Menikoff J. The unbelievable rightness of being in clinical trials. Hastings Cent Rep. 2013;43(S1):S30–S31. doi: 10.1002/hast.136.
39. Gupta M. A critical appraisal of evidence-based medicine: some ethical considerations. J Eval Clin Pract. 2003;9(2):111–21. doi: 10.1046/j.1365-2753.2003.00382.x.