Abstract
Objective
To identify strategies that facilitate readiness for local Institutional Review Board (IRB) review in multicenter studies.
Study Setting
Eleven acute care hospitals, as they applied to participate in a foundation-sponsored quality improvement collaborative.
Study Design
Case series.
Data Collection/Extraction
Participant observation, supplemented with review of written and oral communications.
Principal Findings
Applicant hospitals responded positively to efforts to engage them in early planning for the IRB review process. Particularly effective strategies were the provision of application templates, a modular approach to describing the study, and reliance on conference calls that collectively engaged prospective investigators, local IRB members, and the evaluation and national program office teams. Together, these strategies allowed early identification of problems, clarification of intent, and relatively timely completion of the local IRB review process once hospitals were selected to participate in the learning collaborative.
Conclusions
Engaging potential collaborators in planning for IRB review may help expedite and facilitate review, without compromising the fairness of the grant-making process or the integrity of human subjects protection.
Keywords: Institutional Review Board, ethics committees, research, multicenter studies, research ethics
In early 2004, we were selected to head the National Program Office (B.S., M.R.) and evaluation component (J.B., J.B.) of the Robert Wood Johnson Foundation's “Expecting Success” (ES) program. The program applies quality improvement techniques in hospitals that provide care to significant volumes of African American and Latino patients with heart disease. In taking on our new roles, we knew that we faced a significant task in completing an Institutional Review Board (IRB) review process at the 10 different sites. We had seen growing numbers of reports of prolonged review periods in multicenter studies (Nelson et al. 2002; McWilliams et al. 2003; Dziak et al. 2005; Gold and Dewa 2005). Some of the dynamics underlying lengthy IRB reviews were documented in a recent report in this journal from a 45-site study. Although the protocol was “designed to be qualified under U.S. government regulations for expedited review,” local IRB turnaround time ranged from 52 to 798 days. Moreover,
… Twenty three [of the sites] required inapplicable sections in the consent form and five required HIPAA (Health Insurance Portability and Accountability Act of 1996) consent from physicians although no health information was asked of them. Twelve sites requested, and two insisted upon, provisions that directly increased the risk to participants
(Green et al. 2006, p. 214).
In short, local IRB processes varied widely. More significantly, boards differed in their assessment of subject risk, their beliefs about what best mitigated those risks, and their understanding of their obligations under the regulations.
This was important for us because we were working on a tight timeline driven by the funder's grant-making schedule. A portion of the evaluation drew on a pre–post design, requiring the collection of survey data from patients shortly after the sites were to be notified that they would participate. In discussing how to meet this constraint, we agreed that prolonged IRB reviews are most likely when inexperienced researchers prepare and submit their completed applications in a vacuum, without consulting their local committees. While many IRBs provide information—for example, in the form of posted forms and guidance on their websites—these resources cannot possibly provide advice or policy for every contingency. In any event, some of the applicant hospitals were not major teaching institutions. Fewer than half of their IRBs offered any human subjects guidance online, and in early conversations with the prospective investigators at the sites, it was clear that some had little experience with the review process.
We therefore saw our role as one of encouraging the sites to be proactive in their engagement with their local IRBs. We began forging supportive relationships and fostering intra-institutional communication, with the goal of uncovering potentially troublesome issues in a timely fashion. Our aim was to maximize efficiency without compromising the integrity of the human subjects process. While the ES initiative included a planning phase for the collaborative, the intervention was expected to begin within 6 months of the grant award, and it was therefore important that a planned patient survey be fielded as soon as possible after the grant award notification.
In this paper, we describe our efforts to work with participating hospitals to expedite the process. We begin with a brief discussion of the context in which we were working, the ES program, and the materials that we circulated to the sites. We describe the conference-call process that we used to help the local teams prepare their written applications for their respective committees, and we then present some information on our follow-up process. Finally, we reflect on our approach and ask: was it ethical?
CONTEXT FOR OUR EFFORTS
Before we begin, we would underscore some features of the environment in which we were working. First, as “evaluators,” we were interacting with sites that were being chosen in a foundation-sponsored process; we therefore did not have the kind of history of collaboration building that is the norm in multisite health services research. Second, when we began working with the sites, they had not yet been informed whether they had been chosen to participate in the foundation-funded collaborative, and some would ultimately not receive awards. We return to this issue at the close of the paper. Finally, we were working during the period of early HIPAA implementation, and there was some uncertainty among all parties as to what the regulations required. This increased the number of issues that needed to be addressed as sites contemplated the release of potentially sensitive information.
A final point bears emphasis. All of us are active health services researchers. Two of us have served on our local IRB, and one of us currently chairs the IRB at her institution. We share a commitment to respecting the rights of human subjects, and believe that the research we were proposing was scientifically valid, consistent with the principles outlined in the Belmont Report, concordant with federal regulations, and in conformity with HIPAA requirements.
BACKGROUND ON THE LEARNING COLLABORATIVE: TIMELINE FOR THE IRB PROCESS
The ES program is a foundation-supported quality improvement collaborative targeting racial and ethnic disparities in care. It relies heavily on hospital performance data, which are analyzed by race, ethnicity, and language. The project seeks to understand the patient's experience with care as well as the costs and replicability of improvement strategies. Various sorts of data, from a variety of sources, are required to support and evaluate the project. These include postdischarge surveys of patients; patient clinical data (some of the UB-92 data fields traditionally used in health services research, as well as the fields used to report to the Centers for Medicare and Medicaid Services [CMS] on the care of patients with acute myocardial infarction [AMI] and congestive heart failure [CHF]); focus groups and interviews with physicians; interviews with hospital staff and clinicians; and organizational surveys and questionnaires. These data requirements had been detailed in the initial program announcement, and the IRB implications had been discussed briefly during visits to each of the 16 finalist sites in the spring and summer of 2005.
In terms of common human subjects issues (potential harms, vulnerable populations, assurances of confidentiality), the proposed work was not of the sort that would usually raise the gravest concerns. The protocol required sites to obtain consent for postdischarge contact from hospitalized patients, but debilitated patients were excluded on the grounds that they might find participation in a telephone survey onerous or difficult. While the survey elicited clinical information, reports on access to care, and questions assessing the process of in-hospital care, it did not deal with sensitive issues such as immigration status, illicit activities, or matters that would generally be viewed as highly personal or potentially stigmatizing. Moreover, the firm that would be conducting the patient telephone surveys had significant prior experience with vulnerable populations.
The study involved the release of clinical data, both for patients who had provided consent and signed HIPAA authorizations and for all patients. For the latter group, the sites were to execute Data Use Agreements (DUAs) to provide clinical data in Limited Data Set (LDS) form.
OUR APPROACH AND LESSONS LEARNED
Provide Model Applications for the Sites to Use as Templates
Early on, we provided model electronic IRB applications to the 16 finalist sites. These materials could be tailored to meet local submission requirements. The templates addressed the usual issues covered in an IRB application, including the research questions, study design, sampling approach, recruitment procedures, and consent process. Also provided were model consent forms, early drafts of the instruments, and schedules for the interviews that would be conducted. Procedures were described for handling potentially problematic or sensitive issues, such as obtaining consent from hospitalized patients. For instance, we suggested that patient recruitment might be done:
by designated staff at the grantee hospitals, who will have been trained by researchers at [hospital name here] on recruitment procedures. To minimize coercion, designated staff will not include persons who have been involved in the care of the patient. Patients will be clearly informed that their willingness to participate will not influence the care that they receive at the institution in the future.
By including this level of detail in written materials early on, we signaled that potential coercion was an important issue for sites to consider thoughtfully. We also provided them with a protocol and model language that they could review with their IRB. We expected that the language would provide a basis for the applications that they would submit to their local review board.
In the documents, we tried to anticipate issues that might arise locally. This meant outlining alternative approaches to some potentially problematic issues. For instance, the research protocol required that we be able to link patient survey responses to clinical information. We knew that this might raise concerns about maintaining confidentiality, and so, recognizing the hospitals' ethical obligations as well as mandates under HIPAA, we offered sites the option of performing those linkages in-house and sending us deidentified data. Alternatively, we could receive identified data, with the assurance that those identifiers would be removed as soon as the data were linked. By offering these options up front (rather than leaving the sites to propose a single one, only to receive a categorical rejection from their local IRB), we aimed to discover the procedures that would be acceptable locally. We also intended to convey our willingness to honor local preferences, and minimize the work that sites would need to do. As it turned out, all of the sites ultimately accepted the option of secure transmission of the identified data. But it is our sense that they appreciated having been offered a choice, and benefited from thinking the issue through.
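As an illustration of the second option, the following minimal sketch (in Python, with hypothetical file and column names; it is not the code used in the project) links survey responses to clinical records on a temporary identifier and then strips the identifying fields as soon as the linkage is complete.

import pandas as pd

# Hypothetical illustration: link survey responses to clinical records on a
# temporary medical record number, then drop direct identifiers so that the
# analytic file retains no identifying fields.
surveys = pd.read_csv("patient_surveys.csv")      # includes an 'mrn' column
clinical = pd.read_csv("clinical_extract.csv")    # includes 'mrn' and identifiers

linked = surveys.merge(clinical, on="mrn", how="inner")

# Remove direct identifiers immediately after the linkage.
identifiers = ["mrn", "patient_name", "street_address", "phone_number"]
deidentified = linked.drop(columns=identifiers, errors="ignore")

deidentified.to_csv("analytic_file_deidentified.csv", index=False)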
Assembling this material involved relatively little additional work on our part, as much of it was needed for our home institutional IRB reviews. For the evaluators—who were collecting the potentially sensitive patient survey data—the initial home IRB application was somewhat open-ended, in the sense that some aspects of the procedures were described as “to be negotiated” with the sites. For example, in the case of the data linkages mentioned above, the application to the researchers' home institution referenced the possible modes of linkage and left open the possibility that different site IRBs would have different norms and preferences with respect to linkage. Similarly, both home institutions offered some leeway on the wording of consent forms (more on this later). After the local sites had gone through the approval process, the home IRBs reviewed each hospital's protocol before final sign-off.
Break It Down into Pieces
Another key feature of the written material was modularity. Rather than presenting the research as a single study, with one grand IRB application, we presented it as several substudies. For each, the research questions, study design, sampling approach, recruitment procedures, and consent process were described separately. For instance, the portions of the study that used deidentified patient data were separated from the portions that used Protected Health Information (PHI); the parts of the study that used patients as data sources were separated from those that collected information from staff surveys and interviews. While some of the sites ultimately combined the various portions into one application, the modular approach facilitated the identification of problem areas and let the sites go forward with preparing their applications for those aspects of the study that were less likely to raise human subjects concerns.
Talk with Sites Sooner Rather Than Later
After the sites had the opportunity to review these written materials—but before they prepared their IRB applications—we scheduled a round of conference calls. These were held just after the National Advisory Committee (NAC) had recommended awardee sites, but before the foundation board had made the awards. Eleven calls were made during July and August of 2005, up to 2 months before the anticipated award date. We began with the sites that were most likely to be selected, based upon the NAC's recommendations, and continued until the decisions were made. By the time we had completed the 11 calls, the selection of 10 sites was official, making calls to the remaining finalists unnecessary.
Have the Relevant Parties Represented on the Conference Calls
We requested that each site include on the call the Principal Investigator (PI), the proposed Project Director, a member of their IRB (often the chair), and their legal counsel. As it happened, most sites included additional members of the project team as well. We scheduled 1-hour calls with each of the sites, but few calls lasted more than 30 minutes.
We cannot stress enough the value of including IRB representatives on the calls. As issues arose, the local member was able to request clarification and receive information directly from us, the local PI, or legal counsel. Local IRB representatives were quite willing to offer guidance as to which issues might prove problematic to the committee, allowing us to discuss and work through possible solutions. Of equal importance was the ability of IRB representatives to indicate issues that should be emphasized in the application (e.g., the qualifications of the survey research firm) and to identify issues that would not be problematic for their local committee (interestingly, none of the sites had any comments on or modifications to the procedure for approaching patients and inviting them to participate).
Some unanticipated issues came up during the calls. For example, at one site the IRB chairperson was quite concerned that the proposed interviews with hospital staff were problematic under a specific notice of regulatory guidance. By speaking with this chairperson early, directly, and at some length, we were able to review the guidance in question and clarify our intent. After some discussion, the chairperson agreed that the proposed approach was acceptable, and we were able to continue with the process. It is noteworthy that none of the other sites raised ethical or regulatory concerns about this matter, but it is our sense that had we not had this conversation early, that site's entire application might have been delayed considerably.
In sum, the inclusion of an IRB representative at this early stage allowed potential problems to be anticipated and addressed. Each site had an opportunity to suggest modifications that it felt would bolster human subject protection, or accommodate local administrative and/or regulatory concerns and procedures.
Think of the IRB Process as a Learning Opportunity
Our primary intent was to ensure the protection of human subjects, and to facilitate the timely completion of the IRB process. Nevertheless, we found that a detailed, early IRB process provided a stimulus for the sites to grasp the scope and magnitude of the work they would be expected to perform over a 29-month period.
In competitive processes, applicants will understandably devote considerable energy to being selected and less attention to the demands that will be placed upon them after they have been chosen. Our calls and communications with the prospective sites focused their attention on the future. In many calls we clarified multiple aspects of the project design and timelines, even though much of this had already been detailed in the original program solicitation. Indeed, these calls may have led to the withdrawal of one finalist site. In this case, staff at the site became concerned about whether they could, and should, meet the program's demands for management attention and ES data reporting. From our perspective this was a positive, if unintended, outcome.
Be Prepared for Issues to Emerge over Time
Of course, many important IRB concerns arose after the initial conference call. Most of these were procedural, but some were substantive. In terms of procedure, there was discussion at some sites as to whether HIPAA authorization forms and consent forms should be separate. We expressed our belief that consolidating the information into one form minimizes confusion, is least troubling and intrusive, is most straightforward for patients, and therefore maximizes the protection of human subjects. But we were flexible. Most sites elected to use two forms, with the inevitable duplication of information—for instance, procedures for ensuring data confidentiality were typically described in some detail in both forms. At a typical two-form site, the documents totaled seven single-spaced pages and required two patient signatures, two signatures from the hospital staff member requesting consent, and, in some cases, the signature of an independent witness.
On the substantive side, nearly all of the human subjects concerns centered on the disclosure of clinical data for nonconsenting patients. Relatively early on, one site expressed concern about disclosing dates of admission, discharge, and birth. In conversations with site representatives, we noted that disclosure of this information is consistent with the Privacy Rule under a DUA, and we offered to provide further assurances regarding our capacity to protect confidentiality. Nonetheless, the site's IRB preferred not to release this information, and we agreed that the site would transmit month and year, but not day, for each of these fields. In considering whether we should extend this modification to all sites, we weighed the risks and costs of disclosing potentially identifying information against the benefits for study validity. While we felt that the risk of disclosure was extremely low, study validity was not enhanced by knowing precise birth dates. Accordingly, the request for “date of birth” was modified to “year of birth” at all sites; the requests for admission date and discharge date remained, as these dates added to the knowledge that could be gained from the study.
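To make this field-level handling concrete, the following minimal sketch (in Python, using hypothetical file and column names; it is not the extract specification any site actually used) reduces date of birth to year of birth while retaining admission and discharge dates before the file is transmitted.

import pandas as pd

# Hypothetical illustration: generalize date of birth to year of birth,
# while retaining admission and discharge dates, before transmission.
extract = pd.read_csv(
    "clinical_extract.csv",
    parse_dates=["birth_date", "admission_date", "discharge_date"],
)

extract["birth_year"] = extract["birth_date"].dt.year  # keep the year only
extract = extract.drop(columns=["birth_date"])         # drop the full date

extract.to_csv("clinical_extract_for_transmission.csv", index=False)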
OUTCOMES AND RESIDUAL ETHICAL CONCERNS
In the end, we met our short timeline: all 10 sites received approval within 180 days of notification of funding, and most did so within 90–120 days. Our “pre” patient surveys could be fielded as scheduled. We believe that part of this success is due to the efforts described in this brief communication. But while our approach worked out logistically, there is still a question as to whether it was ethical. For example, did our “jumpstarting” the process somehow compromise the IRB process? Did our preliminary activities unduly burden the applicant sites that were not chosen?
Was There Undue Site Burden?
With respect to the issue of burden, it is important to understand that by applying to participate in the program, sites agreed to undertake considerable work with no guarantee of success. While only 10 sites were selected, 122 sites submitted letters of intent, 23 submitted full applications, and 16 hosted full-day site visits. Throughout the process, sites were kept informed about project data needs and concomitant IRB issues. Our sense is that the preliminary IRB efforts described here required relatively little energy. As we have noted, none of the sites submitted an actual IRB application until they received notification of their award. Rather, the early process and the calls provided them the opportunity to begin thinking proactively about human subjects issues and IRB requirements.
Was There Site Coercion?
The possibility of influence by the selection process is more troubling. To the extent that sites viewed us as holding out a “carrot” of funding, they might have felt pressure to facilitate the IRB process in order to appear more attractive as candidates. They might even have lobbied their review boards to sanction unethical research practices. In assessing the likelihood of this conflict of interest, we would acknowledge that the “carrot” of funding gets substantial institutional attention. However, sites were told that their participation in the preaward IRB activities would not influence their prospects of receiving an award. Additionally, the questions and suggestions we received from sites during the calls and afterward were sufficiently detailed to suggest that the IRBs were operating independently, as they should. Also, in the conference calls with the applicants (which included an IRB representative), it was always abundantly clear who would be the final arbiter on any issue, and the tone was clearly one of trying to understand the ground rules and potential pitfalls that might unnecessarily delay the process. Finally, none of the parties involved in the final selection process (the NAC, Foundation staff, or Foundation trustees) was informed about IRB-related matters.
In sum, we believe that our process worked, and was fair to the subjects, the sites, and the funder. We hope that the approaches outlined here will prove useful to HSR readers who undertake multisite studies in the future.
Acknowledgments
This work was supported by a grant from the Robert Wood Johnson Foundation. Additionally, we are grateful to the editors and the anonymous reviewers for their astute and helpful comments.
Disclosures: None.
Disclaimers: None.
REFERENCES
- Dziak K, Anderson R, Sevick MA, Weisman CS, Levine DW, Scholle SH. Variations among Institutional Review Board Reviews in a Multisite Health Services Research Study. Health Services Research. 2005;40(1):279–90. doi: 10.1111/j.1475-6773.2005.00353.x.
- Gold JL, Dewa CS. Institutional Review Boards and Multisite Studies in Health Services Research: Is There a Better Way? Health Services Research. 2005;40(1):291–307. doi: 10.1111/j.1475-6773.2005.00354.x.
- Green LA, Lowery JC, Kowalski CP, Wyszewianski L. Impact of Institutional Review Board Practice Variation on Observational Health Services Research. Health Services Research. 2006;41(1):214–30. doi: 10.1111/j.1475-6773.2005.00458.x.
- McWilliams R, Hoover-Fong J, Hamosh A, Beck S, Beaty T, Cutting G. Problematic Variation in Local Institutional Review of a Multicenter Genetic Epidemiology Study. Journal of the American Medical Association. 2003;290:360–6. doi: 10.1001/jama.290.3.360.
- Nelson K, Garcia RE, Brown J, Mangione CM, Louis TA, Keeler E, Cretin S. Do Patient Consent Procedures Affect Participation Rates in Health Services Research? Medical Care. 2002;40:283–8. doi: 10.1097/00005650-200204000-00004.