Clinical and Translational Science. 2011 Aug 29; 4(4): 266–267. doi: 10.1111/j.1752-8062.2011.00296.x

Streamlining Research by Using Existing Tools

Sarah M Greene 1, Laura‐Mae Baldwin 2, Rowena J Dolor 3, Ella Thompson 1, Anne Victoria Neale 4
PMCID: PMC3170080  NIHMSID: NIHMS309209  PMID: 21884513

Abstract

Over the past two decades, the health research enterprise has matured rapidly, and many recognize an urgent need to translate pertinent research results into practice to help improve the quality, accessibility, and affordability of US healthcare. Streamlining research operations would speed translation, particularly for multisite collaborations. However, the culture of research discourages reusing or adapting existing resources or study materials. Too often, researchers start studies and multisite collaborations from scratch—reinventing the wheel. Our team developed a compendium of resources to address inefficiencies and researchers’ unmet needs and compiled them into a research toolkit website (http://www.ResearchToolkit.org). Through our work, we identified philosophical and operational issues related to disseminating the tool kit to the research community. We explore these issues here, with implications for the nation’s investment in biomedical research. Clin Trans Sci. 2011; 4: 266–267.

Keywords: editorial, translational research, ethics

Introduction

The health research enterprise has matured rapidly over the past two decades, thanks to the confluence of increased federal funding, the use of technology to facilitate multidisciplinary scientific collaboration, and recognition of the urgent need to apply pertinent research results in routine healthcare delivery. However, as multisite collaborations among academic, community, and practice‐based partners have proliferated, health research infrastructure has lagged. The Clinical and Translational Science Awards (CTSA) initiative was established by the National Institutes of Health (NIH) in 2006 as one strategy to address the suboptimal pace of research from idea to implementation by encouraging cross‐disciplinary partnerships, capacity building, and training. Yet accelerating the cycle of research and its translation requires both cultural and operational changes.

Opportunities exist to improve the operational efficiency of research by embracing approaches and resources used in other projects. Yet most researchers start from scratch to develop recruitment materials (e.g., introduction letters, consent forms), data collection tools (e.g., telephone surveys, questionnaires), and dissemination tactics for each study. By reinventing materials de novo, they potentially do a disservice to themselves, their participants, and even funding agencies. Often, the barriers to reusing or adapting extant resources stem from the real and perceived differences between one’s own and prior studies—and from a lack of awareness of available models and resources. Can a study of cognitive behavioral therapy for depression in 40‐ to 64‐year‐olds use as a template the materials from a study of a similar intervention in 50‐ to 65‐year‐olds with depression? Similarly, consider a study of statins based in a clinic in rural Idaho and one conducted in a community health center in Iowa. In either case, the differences in design, approach, and covariates may be manifold—or not. So, while researchers may be ardent about using validated measures, they may be less able or willing to find a “validated” approach to recruitment and consent. A repository of research resources and tools would be a useful starting point for reusing and recycling well‐vetted, successful approaches to study implementation.

Lessons Learned: Building a Tool Kit for Researchers

Through a project called Partnership‐driven Resources to Improve and Enhance Research (PRIMER), reported in a companion paper in this issue of the Journal, 1 we undertook a needs assessment survey to identify barriers and facilitators to more efficient research. The survey included questions relevant to both academic and community‐ and/or practice‐based partners. The survey results informed our development of a Web‐based repository of tools: http://www.ResearchToolkit.org. We built the repository around the phases of a typical research project, with contents systematically reviewed and catalogued by all authors based on alignment with survey data, relevance, adaptability, credibility, and nonduplication. We drew on the authors’ experience developing tools for two other research consortia: the HMO Research Network Collaborative Toolkit (SMG, EET) and the Practice‐based Research Network (PBRN) Research Best Practices Checklist (AVN). The tool kit fills a unique need by covering the full spectrum of the research process, from initiation to study closure, with an emphasis on multisite collaborative research. Now that we have built it, however, we have identified three challenges that warrant consideration by the larger research community: dissemination, sustainability, and heterogeneity.

Dissemination: Response to the tool kit has been positive as we have shared it with the research community at conferences and through news releases. Website hits have increased every month since the tool kit’s 2009 launch. Yet, in this information age, it is difficult to gauge whether and when the website has become a “top of mind” resource for the research community. It may seem just as easy to open a search engine to find a tool or resource in the moment, or to ask trusted colleagues whether they have a resource that could be repurposed. Developing a strategy to drive online traffic to the tool kit website was beyond the scope of our project. Thus, diffusing and embedding the Research Toolkit website into the fabric of the research community remains a challenge.

Sustainability: The content of http://ResearchToolkit.org is in the public domain, and in essence the website functions as an aggregator. Because the tool kit comprises dynamic content, and because relevant new resources are created routinely, the content should be refreshed and updated often. Users have offered additional content for the site’s next iteration, and conference audiences have encouraged us to create content directed at particular groups in the research community, such as community members or pre‐ and postdoctoral trainees. Yet we were originally funded to develop a static product, not to provide a service, and we can no longer maintain and update the site without additional financial support. Unfortunately, this happens often in research: dynamic products are developed under grants that have no long‐range mechanism of support past the end of the funding cycle. One example is the Inventory and Evaluation of Clinical Research Networks project, which developed a directory of networks and associated reports of the networks’ best practices. When the funding ended, the substantial investment in building the site was lost.

Heterogeneity: It is also important to acknowledge that research is a craft, with nuances to every study. Given such heterogeneity, researchers might be appropriately skeptical about adapting an approach or a tool without a thorough explanation or “metadata” for how it was used in prior research (much as new measures are developed, validated, and published). Still, we urge researchers to consider whether they need to start from a blank page when drafting yet another study consent form or creating an authorship policy for a consortium project.

Tools as One Step Toward a More Efficient Research System

Given the challenges associated with disseminating sustainable research resources in a highly variable environment, it is useful to consider ways to engender durable change. Policies are certainly one strategy for changing the research environment, as demonstrated by the introduction of the NIH data sharing policy and by journal policies requiring that clinical trials be reported in accordance with the Consolidated Standards of Reporting Trials (CONSORT). No similar mandates exist for using standard measures or consent form templates. Another strategy is to produce data that identify the potential efficiencies of adopting a standard process or tool. A systematic process evaluation at Vanderbilt University 2 identified several delays and barriers in launching clinical trials at that institution, steps that could potentially be consolidated or simplified. A comparable “autopsy” of the research process elsewhere would likely yield specific insights about the sources of delay, duplication, and other wastes of time and effort, especially in multisite studies. Finally, because the research community relies on validated measures and rigor, any new tools introduced to improve research operations should be systematically evaluated, to give researchers confidence that the tools were developed and disseminated rigorously.

Our PRIMER study 1 revealed many areas where researchers felt they would benefit from more tools to support their studies, suggesting that many opportunities exist to improve the pace and conduct of research. Barriers we specifically sought to address included finding suitable collaborators in communities and practices, grappling with multiple divergent institutional review board (IRB) requirements, recruiting study volunteers rapidly, and training geographically dispersed study staff. Additional impediments to efficient and effective research include searching for valid data collection measures and the time lag from manuscript creation to journal publication. Cumulatively, these individually time‐consuming steps result in a protracted process that impedes scientific progress. Researchers and their institutions could address many of these delays by, for example, adopting electronic IRB systems to reduce queue times, creating or adapting templates for consent forms and other materials, and using emerging research networking tools such as the Harvard University Profiles Research Networking Software or ResearchGate. ResearchMatch is an example of a research participant registry designed to facilitate recruitment. Other processes, such as journal review and publication timelines and scientific peer review of grants, are being addressed through “e‐publication” ahead of a journal’s print edition and through the NIH’s improved electronic processes for grant submission and review, respectively. Notably, the NIH has also sponsored recent efforts to consolidate health research measures and databases into curated repositories, including the Patient‐Reported Outcomes Measurement Information System (http://www.nihpromis.org) and the Grid‐Enabled Measures database (https://www.gem‐beta.org). All of these examples involve rigorously developed and tested tools.

Conclusion

Federal agencies are giving much‐needed attention and resources to the challenges described here. But real change requires that these resources be embedded in a modern research infrastructure, with funding for development, implementation, and sustainability. We need every means at our collective disposal to shorten the cycle of translating research into implementable findings. As often cited, an average of 17 years elapses before research findings result in changes to care. 3 This persistent concern indicates that the entire research enterprise must become more nimble and adaptive—not to the point of sacrificing scientific integrity, but with the understanding that the current model could improve in many ways. The nation’s sizable (and growing) investment in biomedical research signals that we can no longer conduct research as “business as usual.” The more quickly researchers can get their studies up and running, the sooner we can move study findings into real‐world communities and practices, improving the quality, accessibility, and affordability of US healthcare.

Acknowledgments

The authors wish to thank Drs. Sergio Aguilar‐Gaxiola, Lloyd Michener, and Donna Jo McCloskey for their support of this work. We also appreciate Bill Tolbert’s invaluable assistance in creating the Research Toolkit website.

Sources of Funding

National Center for Research Resources Administrative Supplement to the University of Washington Institute of Translational Health Sciences (3UL1RR025014‐02S1).

Ethical Approval

Although this Special Report does not involve human participants, the PRIMER study, on which this article was based, was reviewed by the Group Health Human Subjects Review Committee and determined to be exempt.

References

1. Dolor RJ, Greene SM, Neale AV, Baldwin LM. Partnership‐driven Resources to Improve and Enhance Research (PRIMER): a survey of community‐engaged researchers and creation of an online toolkit. Clin Transl Sci. 2011; 4(5): 259–265.

2. Dilts DM, Sandler AB. Invisible barriers to clinical trials: the impact of structural, infrastructural, and procedural barriers to opening oncology clinical trials. J Clin Oncol. 2006; 24(28): 4545–4552.

3. Westfall JM, Mold J, Fagnan L. Practice‐based research: blue highways on the NIH roadmap. JAMA. 2007; 297(4): 403–406.

