Author manuscript; available in PMC: 2021 Oct 12.
Published in final edited form as: SOCRA Source. 2021 Feb;107:39–42.

Utility of Protocol Development Software for IRB Protocol Development: Experiences from one Institution

Beatrice A Boateng 1, Alison H Oliveto 1, Michael Bailey 1, Mtonya Hunter-Lewis 1, Jonathan Young 1, Laura P James 1
PMCID: PMC8508970  NIHMSID: NIHMS1677202  PMID: 34646090

INTRODUCTION

Deficiencies in research protocols have been identified as a potential barrier to completing research studies in a timely manner (Ghooi, 2014). These include inadequate description of methodological details such as allocation methods, primary outcomes, power calculations, and sponsor and investigator roles in the conduct of the research. These deficiencies are not only linked to potential research biases, but they also make systematic review of the trial results difficult. Thus, promoting rigorous reproducible clinical research while reducing administrative inefficiencies is vital to the translational research enterprise.

One of the most important documents for ensuring high-quality research is the research protocol, which must be clear, sufficiently detailed, and transparent. A research protocol must provide enough detail to 1) allow funding agencies to assess the significance and rigor of the research; 2) allow institutional review boards (IRBs) to determine whether the research is sound, addresses an important problem, and has appropriate and sufficient human subjects protections in place; 3) adequately inform participants about study procedures, risks, and potential benefits of participation; and 4) guide researchers in study conduct and allow systematic reviewers and others to assess potential biases (Tetzlaff, et al., 2012).

In spite of these expectations, compliance with IRB requirements for protocols involving human subjects continues to be inconsistent. This inconsistency has often led to greater administrative burden, including the need for several rounds of reviews and/or revisions in response to reviews, thereby delaying IRB approval. Given these protocol development challenges, the goal of this study was to examine the feasibility of using an online protocol development tool to address inconsistencies in protocol development and thereby minimize delays in obtaining IRB approval.

A Web-based protocol building tool with templates for various study types has been developed by a commercial company with the goal of improving the efficiency of the protocol development, submission, and approval process. Advertised benefits included: 1) enhanced compliance with institutional protocol submission requirements; 2) reduction in errors when developing a protocol, which could accelerate the acceptance of protocols and thus reduce time to approval; and 3) improved adherence to IRB and other regulatory requirements. Specifically, we 1) examined the acceptability of the online protocol development tool for preparing IRB protocols; and 2) determined whether protocols developed with this online tool would lead to shorter times to IRB approval and fewer contingencies relative to historical matched controls.

METHODS

Software Acquisition

The research protocol company contacted the University of Arkansas for Medical Sciences (UAMS) Office of Research Compliance in August 2017, following a conference where the tool was advertised. A webinar was set up with various regulatory stakeholders to demonstrate the capabilities of the software and its potential usefulness to researchers and the IRB process. Three months after the initial webinar, and after various communications to address concerns, the software was purchased to supplement, and possibly replace, existing institutional protocol development templates. The availability of the tool was advertised through email communications, the institute’s website, a research information support network (RESIN) meeting attended by research faculty and staff, campus-wide flyers, and informational sessions in some graduate classes; it was also included on our research services request portal (Boateng, Jenkins, McGuire, Jorden, & James, 2018).

Participants

We set a target of recruiting one hundred participants into the study, particularly trainees (i.e., graduate students, medical students, residents, etc.) and faculty relatively new to research or to the institution. Our Institutional Review Board (IRB) deemed this project “not human subject research.”

Study Design and Procedures

Individuals indicating interest in using the protocol development tool were funneled through the UAMS research services request portal to better track interest in the tool. Potential participants were contacted and informed about participating in this project to assess the tool. Those who opted not to participate in the study were provided access to the online protocol development tool. Those who consented to participate were provided with a unique code and link to a short survey (see below). After completing the survey, participants were given access to the protocol builder tool. Once participants submitted a protocol using the protocol builder tool, they were sent a link to complete a follow-up survey (see below). After completing the follow-up survey, participants’ unique IDs were entered into a drawing for a $25 gift card. Drawings were held for each block of 10 participants.

Instruments

The initial survey queried participants’ area of research focus, prior experience with our institutional review process, previous guidance, if any, via mentorship or use of existing templates, and level of confidence (from “not at all” to “very”) in various protocol development areas, such as background and study design, data and safety monitoring plans, data analysis, and IRB and FDA requirements. The follow-up survey was developed using the software usability model (Daneshmandia, 2013); it included 21 items covering seven usability characteristics of software, level of confidence (from “not at all” to “very”) in protocol development, and demographic information about participants’ college, type of researcher (faculty or student), and rank (for faculty).

Data Analyses

Descriptive statistics were used to summarize the data.
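To illustrate the kind of descriptive summary this entails, Likert-style confidence responses can be tabulated as counts and percentages per protocol development area. The sketch below is hypothetical: the area names, response levels, and data are illustrative placeholders, not the study’s actual dataset.

```python
from collections import Counter

# Hypothetical confidence responses ("not at all" .. "very") for two
# protocol development areas; illustrative only, not study data.
responses = {
    "data_safety_monitoring": ["not at all", "a little", "somewhat", "very", "a little"],
    "study_design": ["somewhat", "very", "very", "somewhat", "very"],
}

def summarize(items):
    """Return each response level with its count and percentage of the total."""
    counts = Counter(items)
    total = len(items)
    return {level: (n, round(100 * n / total, 1)) for level, n in counts.items()}

for area, answers in responses.items():
    print(area, summarize(answers))
```

A summary of this form supports statements such as “about 80% were somewhat or very confident” by collapsing the top two response levels.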

RESULTS

During the study period (February 2018 – April 2019), we received 45 requests through the research services request portal to use the online protocol development tool. After further communication, 11 people determined that the software was not what they needed. Of the 34 wanting access to the program, 29 agreed to participate in the evaluation (64% of the 45 initial requests). Twenty-three participants completed the pre-assessment instrument: 8 faculty (6 assistant professors), 13 trainees, and 2 research staff.

Most participants were engaged in clinical (24%), behavioral (18.5%), or biomedical (16.7%) research, or in chart reviews (16.7%). About half of those (13) had previously submitted IRB protocols using templates available through the institutional IRB or templates shared by colleagues or mentors. Prior to using the tool, about 80% of participants were somewhat or very confident in their ability to develop the background and rationale for a study, research questions, aims and objectives, study design and methods, plans for minimizing risks to subjects, and plans for disseminating findings. About 40–60% reported little or no confidence in developing a data and safety monitoring plan, conducting data analysis, or understanding IRB and FDA requirements for studies.

No follow-up data were collected because none of the participants submitted a final protocol using the online protocol development tool. We were therefore also unable to compare turnaround times and numbers of contingencies, given the lack of product use during the evaluation period. Phone calls and follow-up emails were sent to those who had completed the initial survey to understand their reasons for not completing protocols in the online tool. The feedback fell into three categories: 1) familiarity with the existing institutional protocol development templates; 2) challenges with the online tool; and 3) confusion about which template to use within the online tool. One faculty investigator indicated that the investigator’s students were already developing protocols using the existing templates and did not want to switch to a different tool, although the investigator might use it in the future. Another researcher, working on a time-sensitive protocol, ultimately did not use the protocol development software: after spending about two hours in the tool, the researcher discovered that only half of the work had saved, apparently because the researcher had clicked “NEXT” without first clicking “SAVE.” The researcher tried to determine why some sections had saved and others had not, became frustrated, and gave up, instead using the existing protocol template, which the researcher found very easy to use and very helpful. One investigator realized, only after IRB review, that the investigator had inadvertently used the wrong template, and noted what appeared to be miscommunication between the investigator and the portal facilitator regarding study logistics.

DISCUSSION

We received mixed feedback on the protocol development tool. Some of the templates in the online tool (e.g., the Investigational New Drug (IND) protocol template) appeared to include most of what is required for a study in that category and would have made IND protocol development more complete than the existing template allowed. However, none of the participants were developing a protocol that required an IND, so the IND protocol template went unused.

The protocol development tool appeared to be challenging for most users. This was due, in part, to their prior experience with the institutional protocol development templates. There were also some technical issues, including that some investigators were unaware that they had to press “SAVE” for each section to avoid losing their work. In addition, the online tool required users to complete the protocol in a linear fashion, whereas protocol writing does not typically follow that pattern.

The online tool was also not integrated into institutional processes: a completed protocol had to be saved and downloaded before being uploaded into the institutional IRB submission system. Had there been seamless integration between the protocol development tool and the institutional protocol submission software, uptake might have been better. In its current form, the information requested by the protocol development software and by the institutional system appeared duplicative.

Our experiences were similar to those of another institution that used protocol development software (UCSF IRB, 2019). They, too, found that their investigators were more likely to use their own templates or templates developed by the NIH (National Institutes of Health (NIH), n.d.). Although adoption of the tool was low at our institution and the online tool has since been discontinued, protocol development templates could be useful in ensuring that research protocols are more complete. In addition, integration with existing institutional tools could provide a seamless transition and increase use of such a product.

ACKNOWLEDGEMENT:

The project described was supported by the Translational Research Institute (TRI), grant U54 TR001629 through the National Center for Advancing Translational Sciences of the National Institutes of Health (NIH). The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.

REFERENCES

  1. Boateng B, Jenkins A, McGuire A, Jorden R, & James LP (2018). Leveraging institutional tools to track Clinical and Translational Research Services: Lessons learned from the UAMS Translational Research Institute (TRI). Clinical Researcher, 32(2).
  2. Daneshmandia A (2013). A usability study of Moodle. Proceedings of the Spring 2013 Mid-Atlantic Section Conference of the American Society of Engineering Education.
  3. Ghooi R (2014). Institutional review boards: Challenges and opportunities. Perspectives in Clinical Research, 5(2), 60–65.
  4. National Institutes of Health (NIH). (n.d.). NIH e-Protocol Writing Tool. Retrieved from https://e-protocol.od.nih.gov/#/home
  5. Tetzlaff J, Chan A, Kitchen J, Sampson M, Tricco A, & Moher D (2012). Guidelines for randomized clinical trial protocol content: A systematic review. Systematic Reviews, 1(43).
