Author manuscript; available in PMC 2015 Mar 1.
Published in final edited form as: Eval Health Prof. 2013 Aug 7;37(1):98–113. doi: 10.1177/0163278713500140

Success Case Studies Contribute to Evaluation of Complex Research Infrastructure

Janice A. Hogle, D. Paul Moberg
PMCID: PMC3873347; NIHMSID: NIHMS505746; PMID: 23925705

Abstract

The success case studies approach examines in depth what works well in a program by describing cases and examining the factors leading to successful outcomes. In this paper, we describe the use of success case studies as part of an evaluation of the transformation of a health sciences research support infrastructure. Using project-specific descriptions and the researchers’ perceptions of the impact of improved research infrastructure, we added depth of understanding to the quantitative data required by funding agencies. Each case study included an interview with the lead researcher, along with a review of documents about the research, the investigator, and their collaborators. Our analyses elucidated themes regarding contributions of the Clinical and Translational Science Awards (CTSA) program of the National Institutes of Health (NIH) to scientific achievements and career advancement of investigators at one academic institution.

Keywords: Success case studies, program evaluation, large research infrastructures, CTSA, qualitative evaluation, in-depth interviews

Introduction

The National Center for Advancing Translational Sciences (NCATS), within the National Institutes of Health (NIH), funds the Clinical and Translational Science Awards (CTSA) program to accelerate and improve the nation’s biomedical research process (CTSA, 2013; NCATS, 2013). Institutions receiving these awards are required to provide a variety of research resources to clinical investigators and to provide and improve training and career development opportunities for translational researchers (Rubio et al., 2010). The CTSA program recognizes the many barriers to successful career advancement for young physicians and scientists and provides mechanisms for overcoming these barriers (Rubio et al., 2011).

Evaluating the large complex research infrastructures receiving NIH funding through the CTSA program poses significant challenges for program evaluators working with each CTSA recipient institution. It is the responsibility of program evaluators affiliated with CTSA recipient institutions to assist administrators in documenting achievement of objectives and in determining changes needed within the program to help achieve objectives. CTSA evaluators track and assess improved research resources and services provided to investigators using a variety of approaches, but rely mainly on quantitative methods (Kane & Alexander, 2012).

In this paper, we describe the use of success case studies as part of the overall evaluation of the University of Wisconsin (Madison) Institute for Clinical and Translational Research (UW ICTR). Each case study was not an evaluation of a specific investigator’s research; rather, it was a descriptive story of the investigator’s research trajectory and use of ICTR research services. The case studies also included the investigators’ reflections on how use of ICTR resources contributed to the advancement of their research careers.

The purpose of the ICTR success case studies project was and is to obtain a deeper understanding of the value of research resources to research teams and to extract patterns and themes for use in administrative decisions and communication about the CTSA program. While we know that satisfaction with services was high among investigators who used research resources during the first two years of the CTSA funding cycle for UW ICTR (4.0 on a 5-point scale; UW ICTR, 2010), we wanted more detailed information regarding investigators who have been deemed successful in their career trajectories.

ICTR provides to its members a large variety of research services, pilot awards, and training in clinical and translational science. Summary statistics about service use indicate that requests for biostatistics services and for community-engaged research services constitute the largest number of consultation requests from investigators at UW ICTR annually (UW ICTR, 2012). Additional frequently-used resources and services include biomedical informatics, imaging, drug and device development support, clinical studies support and management, laboratory services, health disparities research consultations, research network opportunities, education, career development, scientific editing, pilot awards, study monitoring, and letters of support for research applications. All users of UW ICTR services are or become members of UW ICTR.

Despite the volume of quantitative data documenting the improvement of research resources at the UW ICTR, there remains a gap in understanding the complexity of investigators’ reactions to changes in research resources, their perceptions of contributions of ICTR to their own research programs, and their opinions about ICTR support for career development. This gap can be addressed in part by the use of qualitative evaluation approaches, including success case studies. These case studies allow evaluators to recognize successful approaches and identify problematic resource implementation for targeted improvement efforts.

The case study method is a qualitative research approach that explores a single case or set of cases in order to understand the particularity and complexity of each case (Stake, 1995; Yin, 2003). There are many ways to do case studies. Patton (2011) described the use of success case studies to evaluate a Caribbean agricultural extension program by identifying outstanding extension staff and describing their stories in detail. The success case studies approach is similar to Appreciative Inquiry (Cooperrider & Whitney, 2008; Trajkovski et al., 2013; Stowell & West, 1991) and to the Most Significant Change technique (Davies & Dart, 2005). These methods focus on successful examples of phenomena and examine what is working and how, by describing the example and identifying features of the investigators’ experiences that contribute to successful outcomes. Appreciative Inquiry (AI) focuses on identifying and encouraging an organization’s positive accomplishments rather than eliminating its negative characteristics; the method is considered particularly applicable in organizations experiencing rapid change (Stowell & West, 1991). Davies and Dart’s Most Significant Change (MSC) technique is a participatory, qualitative approach to monitoring and evaluation that focuses on identifying and analyzing significant change stories chosen by stakeholders and beneficiaries; MSC stories help to elucidate the relationships of observed changes to program impact. While none of these approaches can definitively establish causality of outcomes, they are useful techniques in program evaluation that seek to better understand program processes, generate accountability data, and identify areas of strength and candidates for improvement. We selected the success case studies method for its value in quickly providing details of cases that add depth to the quantitative breadth of related outcome data.

Success among rising scientists is typically defined by obtaining federal grants, publishing research results, and career advancement (Lee et al., 2012). Ideally, a research program moves along a translational trajectory from more basic or speculative science toward improving clinical practices and human health (Trochim et al., 2011). The career of the lead investigator evolves as he or she, often working in a collaborative team, advances the science along this trajectory, extending from investigation through discovery to translation into practice (Lee et al., 2012; Rubio et al., 2011).

Focusing on successes (Brinkerhoff, 2002, 2005) within our clinical and translational science institute allows us to explore qualitatively the assumption that use of improved research resources accelerates the translation of science into better clinical practices and ultimately into improved human health, while also advancing scientists’ careers.

Background

UW ICTR received its first CTSA funding in 2007, as a member of the second cohort of academic research centers receiving CTSA funding through the NIH. Federal funding accounts for just under half of the annual operating budget, with additional funds leveraged locally. ICTR successfully applied for an additional five years of funding in 2011 with six major partners and 24 collaborating institutions.

All 61 CTSA institutions (CTSA, 2013) are required to include program evaluation as part of their administrative activities, although implementation of the mandate varies widely in structure and style across the CTSA consortium (Frechtling et al., 2012; Kane & Alexander, 2012). At UW-Madison, ICTR evaluation staff use triangulated quantitative and qualitative data to assess the degree of success of the institute in meeting goals and objectives, as well as to assist with identifying needed programmatic changes in the complex, uncertain, and evolving environment of federal funding for health sciences research. Applying a utilization-focused, participatory, and methodologically flexible approach (Patton, 2008), program evaluators working within UW ICTR are guided by the Centers for Disease Control and Prevention’s Framework for Program Evaluation (CDC, 1999) for steps on how to proceed with the evaluation, and by the American Evaluation Association Program Evaluation Standards (Yarbrough et al., 2011) for implementing high-quality evaluation processes.

ICTR program evaluation staff include two full-time internal evaluators, overseen by a director at ten percent time. The core evaluation staff are assisted by a program manager at Marshfield Clinic (a major collaborating institution), supported by a data, network, and translational research librarian, and supplemented by data managers across the multiple components making up ICTR. During the first four years of UW ICTR implementation, evaluators incorporated a developmental evaluation perspective (Patton, 2011) to provide the flexibility needed to respond to the highly uncertain and rapidly changing environment within the University as its Medical School transformed into a School of Medicine and Public Health (SMPH) (Remington & Golden, 2009), as well as to changing federal support for biomedical research (Sargent, 2013). In this context, UW ICTR evaluation serves as a resource and guide to administration in assessing institute progress, and contributes to priority setting, program accountability, and continuous quality improvement.

Methods

Our goal was to 1) explore specific examples of how investigators used ICTR research resources, 2) identify patterns of use, and 3) describe the progression of research. We sought unstructured feedback from investigators to inform quality improvement efforts within the UW ICTR. The success case studies provided a better understanding of investigators’ experiences using ICTR resources, their perspectives on career development, their perceptions regarding the benefits of ICTR resources, and their ideas for quality improvement.

Choosing cases

Using a version of purposeful sampling (described in Patton, 2002) called “extreme group case selection,” or the success case method (elaborated by Brinkerhoff, 2002, 2005), three internal program evaluators selected successful investigators who had used ICTR resources. Investigators were chosen based on recommendations from senior ICTR staff and deans of partner institutions, and via service use records. Investigators who benefit most from ICTR resources include researchers receiving direct funding in the form of training grants or pilot awards. Besides direct monetary support and protected time for research, ICTR investigators may also benefit from the receipt of various services (Table 1). Participants in career training or pilot award programs tend to be younger investigators in earlier stages of their careers; however, some senior scientists were recommended for success case studies due to their exemplary translational research experiences. As Table 2 shows, a variety of investigators were included.
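To make the selection logic concrete, a minimal sketch follows (in Python, with invented names, fields, and thresholds; the actual ICTR selection was a judgment-based process informed by nominations and service records, not an automated algorithm):

```python
# Hypothetical sketch of "extreme group" case selection: combine staff
# nominations with heavy service use from a tracking system. Field names
# and the threshold are illustrative, not the actual ICTR schema.
from dataclasses import dataclass

@dataclass
class Investigator:
    name: str
    services_used: int       # count of distinct ICTR services requested
    pilot_or_training: bool  # received a pilot award or training grant

def select_success_cases(investigators, nominations, min_services=5):
    """Return candidates nominated by senior staff/deans or who are
    heavy users of ICTR resources (the 'extreme group')."""
    nominated = {n.lower() for n in nominations}
    return [
        inv for inv in investigators
        if inv.name.lower() in nominated
        or inv.services_used >= min_services
        or inv.pilot_or_training
    ]

# Example: one heavy service user, one awardee, one nominated scientist
pool = [
    Investigator("A. Smith", services_used=7, pilot_or_training=False),
    Investigator("B. Jones", services_used=2, pilot_or_training=True),
    Investigator("C. Lee", services_used=1, pilot_or_training=False),
]
candidates = select_success_cases(pool, nominations=["C. Lee"])
print([c.name for c in candidates])  # ['A. Smith', 'B. Jones', 'C. Lee']
```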

Table 1.

Summary of resources/services used by ICTR investigators profiled in success case studies

Biostatistical consultation
Clinical Research Unit
Collaborative Center for Health Equity (CCHE)
Community Academic Partnership Core
Community Health Connections (CHC) Program
Data Monitoring Committee
Health Equity Leadership Institute (HELI)
Health Innovations Program (HIP)
Pilot Awards Program
ICTR Administration – Letter of Support
ICTR Client Services Center Research Ambassador
ICTR Community Collaboration Grant
KL2 Scholars Program
Mobile Research Team (MRT)
Office of Clinical Trials (OCT)
Personalized Medicine Research Project (Marshfield Clinic Research Foundation)
Qualitative Research Group
Regional Research Council (CHC)
Scientific Editing
Scientific Review Committee
Wisconsin Institute for Medical Research (WIMR)
Wisconsin Network for Health Research (WiNHR)

Table 2.

Investigators (researchers/scientists) profiled by ICTR success case studies in 2012

1. Licensed Psychologist and Assistant Professor
School of Nursing, UW-Madison
2. Professor and Registered/Certified Dietitian
Department of Nutritional Sciences, College of Agriculture and Life Sciences, UW-Madison
3. Clinical Psychologist and Senior Scientist
Center for Tobacco Research and Intervention, UW-Madison
4. Associate Professor and Director of the Heart Failure Program
Department of Medicine, School of Medicine and Public Health, UW-Madison
5. Assistant Professor
School of Pharmacy (collaborating with College of Engineering), UW-Madison
6. Postdoctoral Fellow
Center for Human Genetics, Marshfield Clinic Research Foundation
7. Assistant Professor
Department of Counseling Psychology, School of Education, UW-Madison

Types of data

Each case study consists of two types of data. The first type includes information about the research and the lead investigator, gathered from the following sources: research descriptions from the UW website; individual progress reports of scholars’, trainees’, and pilot project awardees’ research; recent publications; news reports; presentations to the ICTR External Advisory Committee; Google searches; and resource use data from the ICTR use tracking systems. Publication data for all investigators using ICTR resources were obtained by resource core data captains and appear in the ICTR central tracking database. Thus both quantitative data (e.g., number of resources used, number of publications and grants) and qualitative data (descriptions of findings, progress reports, publication contents) were used.
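As an illustration of how such quantitative indicators might be assembled per investigator, here is a small, self-contained sketch using an in-memory SQLite database; the table and column names are invented for this example and do not reflect the actual ICTR tracking schema:

```python
import sqlite3

# Hypothetical tracking tables: one row per service request and one row
# per publication. All data below are fabricated for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE service_use (investigator TEXT, service TEXT, year INTEGER);
    CREATE TABLE publications (investigator TEXT, pmid TEXT);
    INSERT INTO service_use VALUES
        ('A. Smith', 'Biostatistical consultation', 2011),
        ('A. Smith', 'Scientific Editing', 2012);
    INSERT INTO publications VALUES ('A. Smith', '12345678');
""")

# One summary row per investigator: distinct resources used and
# publication count, the kind of figures cited alongside the case studies
rows = conn.execute("""
    SELECT s.investigator,
           COUNT(DISTINCT s.service) AS resources_used,
           (SELECT COUNT(*) FROM publications p
             WHERE p.investigator = s.investigator) AS publications
    FROM service_use s
    GROUP BY s.investigator
""").fetchall()
print(rows)  # [('A. Smith', 2, 1)]
```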

The second data source for each case study was an individual interview with the focal investigator. Semi-structured interview questions were developed by the ICTR evaluators in consultation with the ICTR Evaluation Working Group (EWG), representing all research resource areas affiliated with ICTR. The audio-recorded interviews were conducted one-on-one for an average of 60 minutes each. The interviewer also took notes during the interview. Afterwards, the interviewer edited the notes while listening to the recording.

The edited interview was returned to the investigator for review and verification of accuracy. The interviewer then wrote each case study using all available information about the researcher and their use of ICTR services, illustrated with quotations from the interview. The written case studies ranged from five to ten pages in length; in each, the first page was a concise summary of the full case study.

The draft case study was also returned to the investigator for review and approval of the accuracy of interpretation; editing sometimes involved multiple exchanges of drafts. Because the case studies are public documents, investigators participated in their development to help assure accuracy and transparency.

Interview guide

The in-depth interviews began with a general question asking for the “story of your research career.” We asked the investigators (the respondents) whether they considered their research to be translational and why, and, if they worked with a team, to describe their team members. Additionally, we asked about their point of entry to ICTR, how they learned about available research resources, the processes used for accessing services, and the referral routes they followed between the various ICTR components. We also asked, “What have you needed to do your research that ICTR has helped to provide?” and “Has your involvement with ICTR helped you think about your research in different ways; if so, how?”

Results

Analysis includes ongoing review of all interviews and accompanying documentation about the investigators’ research. At the individual level, this supports a faithful, in-depth account of each case. As cases accumulate, we have begun to compare investigators’ experiences in using ICTR resources and to look for common emerging themes. Because the success case studies project is ongoing, new data continue to undergo analysis. As we review each new case study narrative, we compare it to previous narratives, identifying areas of confirmation and contrast. We also examine the case study data in light of other quantitative data on ICTR performance.
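The cross-case comparison can be thought of as a tally of which themes recur as narratives accumulate. The sketch below (with invented theme codes) illustrates the idea; the actual analysis was interpretive rather than computational:

```python
from collections import Counter

# Themes coded per case; the codes here are fabricated for illustration
case_themes = {
    "case_1": {"funding_success", "mentoring", "pilot_awards"},
    "case_2": {"funding_success", "training", "translational_meaning"},
    "case_3": {"funding_success", "mentoring", "pilot_awards",
               "translational_meaning"},
}

# Count how many cases mention each theme
theme_counts = Counter(t for themes in case_themes.values() for t in themes)

# Themes present in every case so far suggest confirmation; singletons
# flag contrasts worth revisiting in the next interview
confirmed = [t for t, c in theme_counts.items() if c == len(case_themes)]
contrasts = [t for t, c in theme_counts.items() if c == 1]
print(confirmed, contrasts)  # ['funding_success'] ['training']
```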

Seven case studies were completed in 2012; the majority of these investigators were relative novices in clinical/translational research. One investigator’s research represented a mid-career shift from clinical practice. Two of the seven investigators were men. One is a physician/scientist; the others are PhDs. Table 2 summarizes the investigators interviewed thus far. Some of the themes emerging from comparative analysis of the case studies are presented in Table 3 with sample illustrative quotes.

Table 3.

Themes

Traditional definitions of research success
Investigators defined success in traditional terms of obtaining funding for their research. To acquire grant funding, one must develop a competitive research plan and receive support in the form of training, mentoring, and pilot funding, all of which are resources provided to ICTR members. However, the ICTR infrastructure was also seen as integrating services and resources into a coordinated academic home, thereby enhancing the ability of clinical and translational researchers to access services and develop their research more efficiently.
“It’s an amazing story… I wrote my first grant in 2003… then to go from an idea to testing proof of concept, patenting, commercializing, investment equity – from 2003 to 2011. It’s been very intense. It’s highly multi- and inter-disciplinary… I’m not sure it could have been done anywhere else in the world except here at UW, because of the pieces you need ready to work together.”

Need for training provided through UW ICTR

The three post-graduate scholars uniformly emphasized the value of their training in promoting teamwork skills. Training was viewed as being inseparable from good mentoring – the mentoring process is embedded within their training, with both of these components supporting career advancement.
“… the HELI [Health Equity Leadership Institute] training was quite phenomenal for me because… when I went to that training [and experienced those speakers and the energy there] I finally felt like I knew what I was doing. My self-efficacy was not always present. I had that feeling [after HELI]… I’m finally where I need to be, because of the speakers and topics, and levels of nurturing and support.”
“… critical things happened… that were instrumental in solidifying what I was interested in. One was that we were required to take a qualitative research methods course… I discovered a whole new appreciation for ways to answer the kinds of questions we’re confronting at the [mental health] center. So, I became even more excited about research.”
“The K award [KL2 career training grant] allows for training and is a three-year project, so it allows me to take a broad spectrum approach to characterizing a number of issues taking place in pharmacies… I’m looking at the entire work system… What spoke to me with the K award was that you could get more training.”

Good mentoring for validating translational research tracks and supporting innovative ideas

Although no questions in the interviews specifically addressed mentoring, its supportive role in research development and career advancement was mentioned frequently. In one case, the respondent stated “I had no mentoring” as an aside, to explain difficulties described.
“The level of support of mentoring I’ve received has been outstanding… [one mentor] helped me to get clinical experience in the hospital and was supportive of research with African Americans and the methodology I used. [Another mentor] really helped me to conceptualize my research and develop the theoretical approach.”
“I’ve been better mentored here than anywhere else.”
“I credit the KL2 program for my strong research collaborations with UW Madison investigators… This collaboration has helped me think about my research in innovative ways with regard to study methods and design. For instance, an oncologist offers clinical expertise that allows me to delineate between clinically relevant and the not-so-clinically relevant research questions and findings. A clinical geneticist brings to my research an element of expertise, providing insight on genes that I find interesting and relevant to my work.”

Need for pilot awards to obtain preliminary data

A key function of CTSA funding is to provide an infrastructure for managing pilot awards that help investigators advance to the next level of their research programs. Pilot awards were seen as supporting accelerated movement along the translational research pipeline.
“Of course, the ICTR Pilot grant funding was great… [Plus I had some funding] through the School of Nursing, plus some funds through the graduate school; between those 3 I had enough to do the pilot research. You can’t get an R01 without pilot data. Even getting funding to do pilot work is difficult. My NIH reviewers paid attention to that; they said: she has a good track record of getting funding… When I talk with junior researchers, I tell them don’t downplay the pilot money… The NIH grant is gold standard, but if you can get small pools of money for pilot research, that’s good.”
From ICTR Pilot to R21 [NIH innovation grant]: “For me, that’s a success story and that’s a success story for ICTR because it is supposed to spawn academic careers and external funding and it worked.”
“The pilot review helped in revamping the proposal… I resubmitted it and they funded it. That’s been a success story and we had a publication.”
“My career path is a bit unique… For me it [ICTR pilot funding] was really critical.”

Interpreting the meaning of “translational”

All respondents were asked if they thought their research was translational, eliciting a range of interpretations. “Translational” can mean spanning the spectrum of research types, as well as specifically sharing findings and results in clinical settings and changing practices among providers and patients in the community. Defining the word “translational” remains problematic (Trochim et al., 2011).
“ICTR has confirmed my interest in community and translational research. It’s nice to know that that type of research is valued, acknowledged, and supported by ICTR and the university; and to also know that there is a place to go and talk about those research ideas.”
“My research is clearly translational… and we do that in a number of ways. I went back to the clinical center to think about how to implement the intervention with the Center’s own clients… I do that on the community level as well…” (for example, inviting participants to a luncheon in appreciation for their participation as research subjects).
“My research is very applied, very practical… [there is a need to] translate existing clinical best practice recommendations into community settings that serve populations with smoking disparities (e.g., mentally ill, low income)… [my research] addresses the challenge of how to apply the clinical guidelines for different populations in real settings. The heart of the translational aspect of this work lies in the application of the guidelines.”
One respondent felt her research was “minimally translational” because there were no animal models for the research on which her lab is working. “It’s very hard to define translational. I don’t think translational really applies… I’m looking at basic principles and how they apply to disease.”

The themes identified from the interviews logically reflect the questions posed to the investigators. They talked about their interpretations of research success, largely in the form of obtaining funding for their studies. The ICTR Pilot Awards program was mentioned several times as an important contributor to obtaining preliminary data, which in turn contributed to success on subsequent grant applications. Investigators also mentioned the need for good training and mentoring in clinical and translational research and the critical role that various ICTR programs play in their research training. Investigators defined “translational” research and their “research teams” in a variety of ways reflecting their own interpretations of the terms. They defined “team” broadly to include students, mentors, colleagues in community-based organizations, beneficiaries of research (e.g., people suffering from chronic health conditions), health care providers, investigators in other schools, colleges, or departments, and state-level organizations. There were also many positive comments about the perceived value of the UW ICTR to the investigators’ careers and research programs. One investigator noted that the fact that UW-Madison has a CTSA program is itself important to mention in applications for funding.

Preliminary internal review of interview data has facilitated improvements in service provision. For example, the availability and accessibility of specific laboratory resources needed for an investigator’s research have been considered, issues investigators have had with obtaining IRB approval have been addressed, and difficulties in obtaining appropriate statistical help have also emerged and are being addressed. As expected, investigators described both positive and challenging experiences while using ICTR resources; even among successful investigators, heavy use of resources is not equivalent to complete satisfaction. Quality improvement activities at UW ICTR progress on a daily basis, integrated into regularly scheduled meetings and formally scheduled interviews, with program changes ordinarily driven by multiple data sources, including evaluative feedback and the success case studies.

Conclusion and Limitations

Developing success case studies has provided insight to internal evaluation efforts at UW ICTR. The information provided an in-depth perspective on investigators’ experiences navigating the complexities of obtaining training, funding, and assistance in implementing research programs.

In considering the limitations of this approach, it is first important to appreciate that ICTR is only in its sixth year at this writing (2013). Some investigators reported experiences from past years with services that may since have improved; streamlining of research processes proceeds slowly and unevenly across resource cores. Nevertheless, perceptions of past resource use may persist, affecting future decisions about utilizing ICTR resources.

Second, many of the UW ICTR evaluation products are produced by ICTR staff for reports to funders, and do not include direct feedback from the investigators themselves. Thus, success case studies of investigators’ perceptions are valuable to evaluators because they are direct reports and opinions of the scientists using the resources. These narratives were provided in a semi-structured format and then integrated with descriptions of their research available from other sources, providing a more comprehensive picture of an investigator’s experience using improved research resources.

Third, UW ICTR evaluation staff found it easier to talk to investigators conducting translational research focused on improvements in clinical practice and community health (Trochim et al., 2011) than to those conducting clinical or basic research. The evaluators at UW ICTR are social scientists with public health training and experience, not basic scientists or clinicians; conversations with these seven case study researchers therefore reflect shared domains of knowledge. Future success case studies need to include more investigators conducting basic biomedical and clinical research, as these scientists are also UW ICTR resource users. It may be possible to recruit interviewers with basic science or clinical research backgrounds to conduct the interviews for us or to assist with writing the success case studies.

Fourth, UW ICTR evaluators believe that the utility of success case studies for program evaluation of clinical and translational science award recipients should be developed further. However, producing each case study is labor-intensive, involving research on the investigator and their research program, integrating quotations from individual interviews, and incorporating service use data. Although case studies are not required by funders, success case studies remain in the UW ICTR evaluation plan, and program evaluators continue to appreciate the direct access to investigators and the insight gained from the conversations.

Fifth, much of the data from the individual interviews remain confidential. The transcripts of the in-depth interviews with lead investigators are not public documents. They often contain extensive detail about use of services and resources, including honest and candid commentary about difficulties encountered and perceived value of resources used. These data have been useful internally and informally in the ongoing process of program improvement, for example, in conducting a needs assessment for additional laboratory services, and in developing a short satisfaction survey that can be administered directly after each consult request. Investigators understand that their responses to questions are not anonymous; however, if they request that something they say not be included in a transcript or in the case study, that request is honored.

Sixth, the case studies are beginning to be used and interest in them remains high. They have been used by ICTR staff for story-telling in newsletters, presentations, or grant applications to present vignettes of specific research experiences (example: UW ICTR Today newsletter, November/December 2012). Case studies can be used in a meaningful and legitimate manner to tell the stories implicit behind quantitative data on research productivity (publications and grants) and career development. UW ICTR evaluators have greatly appreciated the success case studies as a way to interact directly with investigators served by the UW ICTR and will continue to add case studies as time and resources allow.

Finally, it is important to emphasize that success case studies are certainly not the only evaluation approach used at the UW ICTR. Our evaluation has included tracking many quantitative outcomes. In the first five years, ICTR provided research resources/services to 1,786 unique investigators, many of whom used multiple resources over several years. The value of the federal grants supporting research that benefited from ICTR resources during those five years exceeded one billion dollars. Nearly 1,000 publications resulted in the first five years from research that utilized the integrated research support provided through the ICTR academic home for clinical and translational research (UW ICTR, 2012). Such statistics are significant in documenting the success of the services provided to investigators. Case studies extend this understanding by examining individual investigators’ experiences in detail. Since the case studies were written, one investigator has received a five-year, $1.8 million R01 award, and another received tenure and a faculty/staff award in spring 2013. All investigators in our case studies have published.

Functioning in highly evolving environments with resource constraints and political challenges, CTSA evaluators have used a variety of strategies to cope with the practical realities of their responsibilities (Bamberger et al., 2012), including the use of mixed methods (quantitative and qualitative) and triangulated assessments of objective achievement. At the UW ICTR, the success case studies method functions as a useful complement to quantitative tracking of processes and outcomes. The value of the method rests in its descriptive detail, providing examples of individual scientists and their research teams as they generate research findings with the potential to impact the health of the population.

Acknowledgments

This project was supported by the CTSA program, through the NIH National Center for Advancing Translational Sciences (NCATS), grant UL1TR000427, to the UW ICTR (https://ictr.wisc.edu). The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH. Funding was also provided by the UW School of Medicine and Public Health from the Wisconsin Partnership Program. Thanks to Christina J. Hower, MS (cjhower@gmail.com), and Bobbi Bradley, MPH (Bradley.bobbi@mcrf.mfldclin.edu), for their participation in this project as case study interviewers and writers. Thanks also to the Qualitative Research Group, led by Nora Jacobson, PhD (najacobson@wisc.edu), and sponsored by the Community-Academic Partnership core of the UW ICTR, for assistance throughout the project. Thanks to Laura Hogan, PhD (lhh@medicine.wisc.edu), for very helpful editing assistance.

Footnotes

Declaration of Conflicting Interests

The authors declared no conflicts of interest with respect to the authorship and/or publication of this article.

Contributor Information

Janice A. Hogle, Email: jhogle@wisc.edu.

D. Paul Moberg, Email: dpmoberg@wisc.edu.

References

  1. Bamberger M, Rugh J, Mabry L. Real world evaluation: Working under budget, time, data, and political constraints. 2nd ed. Thousand Oaks, CA: Sage; 2012. Retrieved from http://www.realworldevaluation.org/
  2. Brinkerhoff RO. The success case study method. San Francisco: Berrett-Koehler; 2002.
  3. Brinkerhoff RO. Success case method. In: Mathison S, editor. Encyclopedia of evaluation. Thousand Oaks, CA: Sage; 2005. pp. 401–402.
  4. Centers for Disease Control and Prevention (CDC). CDC program evaluation framework. 1999. Retrieved from http://www.cdc.gov/eval/framework/index.htm
  5. Clinical and Translational Science Awards. Accelerating discoveries toward better health. 2013. Retrieved from https://ctsacentral.org/
  6. Cooperrider DL, Whitney D. A positive revolution in change: Appreciative inquiry. 2008. Retrieved from http://appreciativeinquiry.case.edu/uploads/whatisai.pdf
  7. Davies R, Dart J. The most significant change (MSC) technique: A guide to its use. 2005. Retrieved from http://www.learningtolearn.sa.edu.au/learningworkroom/
  8. Frechtling J, Raue K, Michie J, Miyaoka A, Spiegelman M. The CTSA national evaluation final report. Rockville, MD: Westat; 2012. Retrieved from http://www.academia.edu/2466959/The_CTSA_National_Evaluation_Final_Report
  9. Kane C, Alexander A. National evaluators survey preliminary results 2011–2012. CTSA Evaluation Key Function Committee; 2012. Retrieved from https://www.ctsacentral.org/committees/documents/369
  10. Lee LS, Pusek SN, McCormack WT, Helitzer DL, Martina CA, Dozier AM, Rubio DM. Clinical and translational scientist career success: Metrics for evaluation. Clinical and Translational Science. 2012;5:400–407. doi: 10.1111/j.1752-8062.2012.00422.x
  11. National Center for Advancing Translational Sciences (NCATS). About the CTSA program. 2013. Retrieved from http://www.ncats.nih.gov/research/cts/ctsa/about/about.html
  12. Patton MQ. Developmental evaluation: Applying complexity concepts to enhance innovation and use. New York: Guilford Press; 2011.
  13. Patton MQ. Utilization-focused evaluation. 4th ed. Thousand Oaks, CA: Sage; 2008.
  14. Patton MQ. Qualitative research and evaluation methods. 3rd ed. Thousand Oaks, CA: Sage; 2002.
  15. Remington P, Golden RN. Transforming medicine from the bottom up. Wisconsin Medical Journal. 2009;108(3):166–167. Retrieved from https://www.wisconsinmedicalsociety.org/_WMS/publications/wmj/pdf/108/3/166.pdf
  16. Rubio DM, Schoenbaum EE, Lee LS, Schteingart DE, Marantz PR, Anderson KE, Esposito K. Defining translational research: Implications for training. Academic Medicine. 2010;85(3):470–475. doi: 10.1097/ACM.0b013e3181ccd618
  17. Rubio DM, Primack BA, Switzer GE, et al. A comprehensive career-success model for physician-scientists. Academic Medicine. 2011;86(12):1571–1576. doi: 10.1097/ACM.0b013e31823592fd
  18. Sargent JF. Federal research and development funding: FY2013. Congressional Research Service 7-5700, R42410. 2013. Retrieved from https://www.fas.org/sgp/crs/misc/R42410.pdf
  19. Stowell FA, West D. The appreciative inquiry method: A systems based method of knowledge elicitation. In: Jackson MC, Mansell GJ, Flood RL, Blackham RB, Probert SVE, editors. Systems thinking in Europe. New York, NY: Plenum; 1991. pp. 493–497.
  20. Stake RE. The art of case study research. Thousand Oaks, CA: Sage; 1995.
  21. Trajkovski S, Schmied V, Vickers M, Jackson D. Implementing the 4D cycle of appreciative inquiry in health care: A methodological review. Journal of Advanced Nursing. 2013;69(6):1224–1234. doi: 10.1111/jan.12086
  22. Trochim W, Kane C, Graham MJ, Pincus HA. Evaluating translational research: A process marker model. Clinical and Translational Science. 2011;4:153–162. doi: 10.1111/j.1752-8062.2011.00291.x
  23. UW ICTR. ICTR member survey report. Institute for Clinical and Translational Research, University of Wisconsin (Madison), School of Medicine and Public Health; 2010.
  24. UW ICTR. Annual progress report submitted to NCATS/NIH. Institute for Clinical and Translational Research, University of Wisconsin (Madison), School of Medicine and Public Health; 2012. Unpublished.
  25. UW ICTR Today Newsletter. Spotlight on: Bruce Christiansen, ICTR pilot grant recipient. 2012 Nov–Dec issue, p. 4. Retrieved from https://ictr.wisc.edu/files/ICTR_Today_v5n6_v6.pdf
  26. Yarbrough DB, Shulha LM, Hopson RK, Caruthers FA. The program evaluation standards: A guide for evaluators and evaluation users. 3rd ed. Thousand Oaks, CA: Sage; 2011. Retrieved from http://www.eval.org/evaluationdocuments/progeval.html
  27. Yin RK. Applications of case study research. 2nd ed. Thousand Oaks, CA: Sage; 2003.
