Journal of Clinical and Translational Science. 2018 Oct 31;2(4):217–222. doi: 10.1017/cts.2018.320

Clinical research coordinators’ instructional preferences for competency content delivery

H Robert Kolb 1,*, Huan Kuang 2, Linda S Behar-Horenstein 3
PMCID: PMC6382289  PMID: 30820358

Abstract

Introduction

A lack of standardized clinical research coordinator (CRC) training programs requires determining appropriate approaches for content delivery. The purpose of this study was to assess CRCs’ preferred training delivery methods for the 8 designated Joint Task Force Clinical Trial Competency domains.

Methods

Repeated measures analysis of variance and split-plot analysis of variance were used to compare group means among 5 training delivery methods across 8 competency content domains and to examine whether demographic variables were associated with different preference patterns for the training delivery methods.

Results

Participants reported a preference for online video; mentoring/coaching was the least preferred. Significant training delivery method preferences were reported for 3 content domains: participant safety considerations, medicines development and regulation, and clinical trials operations.

Discussion

Observed statistical differences in the training delivery methods by content domain provide guidance for program development. Ensuring that standardized educational training is aligned with the needs of adult learners may help ensure that CRCs are appropriately prepared for the workforce.

Key words: Adult learners, JTF competency, clinical research coordinators, education and training, preferences for training delivery methods

Introduction

The ongoing management of clinical research, from start-up to close-out, is generally delegated to a clinical research coordinator (CRC). The CRC is a highly specialized professional working in a research team whose responsibilities are critical to trial success [1]. It is the CRC who ensures that criteria are met and that complications are recognized and resolved directly. Clinical research translation requires a trained and well-prepared workforce of CRCs who can effectively and efficiently conduct critical testing in clinical trials [2]. Moreover, the most recent version of the Declaration of Helsinki pointed out that, “Medical research must be conducted by individuals with appropriate training and qualifications in clinical research” [3]. CRCs are essential to the success of the clinical research enterprise.

The current state of industry and federally funded clinical trials has been criticized for variable and inconsistent quality in the design, execution, analysis, and reporting of clinical trial activity [4]. This dilemma is further exacerbated as the development of new drugs, devices, and behavioral interventions continues to be one of the most highly regulated endeavors in the United States [5]. At the same time, the intricacy of clinical trial protocols and the guidelines required to manage clinical trial activity have also increased in scope and complexity. One barrier to the completion of effective, efficient, and rigorously conducted clinical trials is varying or missing competency-based training for study staff involved in clinical trials [6].

The Clinical and Translational Science Award (CTSA) Research Coordinator Taskforce recognized the need for improved training of CRCs when they reported that the “provision of adequate training and support…is critical to the overall goal of human subject protection” [7]. Although emphasizing the need for appropriately trained CRCs, the Task Force concluded that current training programs must be improved. The absence of standardized requirements for providing and ensuring appropriate levels of qualification or professional standards compounds this dilemma.

To address this need, a national movement of professionals has been dedicated to ensuring that there is a set of common core competencies from which to build a standardized didactic curriculum. Following the work of the Joint Task Force (JTF) for Clinical Trial Competency, Sonstein, Seltzer, Li, Silva, Jones, and Daemen [8] provided a core competency framework for the clinical research professional. This work resulted in the development of a single, high-level set of 8 standards to be adopted globally and to serve as a framework of defined professional competencies for the clinical research enterprise [8]. The domains include:

  1. Scientific Concepts and Research Design: Knowledge of scientific concepts related to the design and analysis of clinical trials.

  2. Ethical and Participant Safety Considerations: Care of patients, aspects of human subject protection, and safety in the conduct of a clinical trial.

  3. Medicines Development and Regulation: Knowledge of how drugs, devices, and biologicals are developed and regulated.

  4. Clinical Trials Operations, Good Clinical Practices (GCPs): Study management, GCP compliance; safety management (adverse event identification, reporting, post market surveillance, pharmacovigilance), and handling of investigational products.

  5. Study and Site Management: Content required at the site level to run a study (financial and personnel aspects). Includes site and study operations (excluding regulatory and GCPs).

  6. Data Management and Informatics: Data acquisition and management during a clinical trial, including source data, data entry, queries, quality control, correction, and the concept of a locked database.

  7. Leadership and Professionalism: Principles and practice of leadership and professionalism in clinical research.

  8. Communication and Teamwork: Communication practices within the site and between the site and sponsor, Clinical Research Organization, and regulators, and teamwork skills necessary for conducting a clinical trial.

Goldstein [9] advised that the development of any educational programs be “undertaken by individuals skilled in instructional design and curriculum development [and be built] upon the principles of adult learning” [10]. In addition, the CTSA Coordinator Taskforce recommended that “institutions conduct a gap analysis to determine areas of weakness or additional needs in CRC training. This effort should include a focus on CRC core competencies” [7]. Few studies have examined teaching strategies or training methods for delivering the JTF Clinical Trial Competency content [7, 11]. The purpose of this study was to assess CRCs’ preferences for receiving JTF competency content through training methods, grounded in a learning theory lens, to better inform the subsequent design of competency-driven CRC training.

This learning lens refers to Malcolm Knowles’ adult learning theory, andragogy, introduced in the early 1970s. This theory provided advancements in the field of adult education [14]. According to Knowles et al., “andragogy is a core set of adult learning principles. The six principles of andragogy are: the learner’s need to know; self-concept of the learner; prior experience of the learner; readiness to learn; orientation to learning; and motivation to learn… [and] andragogy is preferred in practice when it is adapted to fit the uniqueness of the learners and the learning environment” [14, pp. 4–5].

Methods

This study utilized quantitative methodology. The authors inductively adapted the classification of training delivery methods from Jones et al. [11] and Speicher et al. [7], informed by field experience at our site, and paired it with the competency domains. For our study, we included 5 categories: (1) mentoring or coaching, (2) online text-based training, (3) online video-based training, (4) live lecture, and (5) flipped classroom model. The 8 competency domains reflected the JTF framework referenced above.

The researcher-constructed survey comprised 45 items. Five items measured participant demographic background, including age, gender, highest degree, years as a CRC, and department affiliation. The remaining 40 questions addressed the 8 domains of CRC core competencies crossed with the 5 training delivery methods. For each training method, participants were asked to indicate how likely they would be to enroll in a training course that employed the given method to facilitate their mastery of a competency domain, and they repeated the process for all 8 domains of the JTF core competencies. For example, one question asked: “Please indicate the extent to which you like or unlike the Mentoring or Coaching to deliver knowledge of scientific concepts related to the design and analysis of clinical trials?” These items were scored using a 7-point Likert scale (7=extremely like, 1=extremely unlike) (see Table 1).

Purposeful, non-probability sampling was used. Individuals (n=160) who worked as CRCs at a single research-intensive university in the southeastern United States were invited to participate in this study. Data were collected online via Qualtrics between November and December 2016. The university’s Institutional Review Board approved this study (IRB201601579).

The data were analyzed using SPSS (version 24). Repeated measures analysis of variance (ANOVA) and split-plot ANOVA were used to compare group means among the 5 training delivery methods. The repeated measures ANOVA was used to examine which training delivery methods were preferred by CRCs in each of the aforementioned 8 domains. Because the variances of the differences between all combinations of related groups were unequal, the sphericity assumption was violated; therefore, the lower-bound correction (the lowest possible theoretical epsilon value) was applied to produce a more valid critical F-value and to reduce the potential inflation of type I error rates [12]. The split-plot ANOVA was used to examine whether demographic variables were associated with different preference patterns for the training methods [13]. The statistical null hypothesis was assumed for all research questions. The significance level was set at α=0.05 for all analyses. A pairwise deletion technique was used for handling missing data.
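The analyses above were run in SPSS 24. As a rough open-source analogue only, the sketch below (Python with pandas, scipy, and statsmodels, run on simulated placeholder ratings; the variable names and library choices are assumptions, not the authors’ code) fits a one-way repeated measures ANOVA across the 5 delivery methods for a single domain and then applies the lower-bound sphericity correction, ε = 1/(k − 1), which shrinks the degrees of freedom to (1, n − 1) and so matches the form of the F(1, 85) and F(1, 86) tests reported in the Results.

```python
import numpy as np
import pandas as pd
from scipy.stats import f as f_dist
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(42)
n_crc = 86                      # hypothetical number of complete cases for one domain
methods = ["mentoring", "text", "video", "lecture", "flipped"]

# Simulated placeholder 7-point Likert ratings for ONE competency domain,
# in long format: one row per coordinator x delivery method.
long_df = pd.DataFrame({
    "crc_id": np.repeat(np.arange(n_crc), len(methods)),
    "method": np.tile(methods, n_crc),
    "rating": rng.integers(1, 8, size=n_crc * len(methods)).astype(float),
})

# One-way repeated measures ANOVA across the 5 delivery methods.
res = AnovaRM(data=long_df, depvar="rating", subject="crc_id", within=["method"]).fit()
f_value = res.anova_table["F Value"].iloc[0]   # the F statistic itself is unchanged by the correction

# Lower-bound sphericity correction: epsilon = 1 / (k - 1), the most conservative value.
k = len(methods)
eps = 1.0 / (k - 1)
df1_adj = eps * (k - 1)                 # = 1
df2_adj = eps * (k - 1) * (n_crc - 1)   # = n - 1
p_adj = f_dist.sf(f_value, df1_adj, df2_adj)
print(f"F({df1_adj:.0f}, {df2_adj:.0f}) = {f_value:.3f}, lower-bound corrected p = {p_adj:.3f}")
```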

Table 1.

Questionnaire

Questions Options
Q1: Age range Under 18, 18–24, 25–34, 35–44, 45–54, 55–64, 65–74, 75–84, 85 or older
Q2: Gender Male, female
Q3: What is the highest degree or level of school you have completed? Less than high school, High school graduate, Some college, 2 year degree, 4 year degree, Professional degree, Doctorate
Q4: For how long have you been a Clinical Research Coordinator? <1 year, 1–2 years, 3–5 years, 6–10 years, 11–20 years, 20+ years
Q5: Current Department Type your response
Instruction for Q6–Q45: Please indicate the extent to which you like or unlike each training method to deliver the given competency domain?
Q6–Q10 (Domain 1): Scientific Concepts and Research Design: Knowledge of scientific concepts related to the design and analysis of clinical trials.
Options matrix
7 Extremely like 6 Moderately like 5 Slightly like 4 Neither like nor unlike 3 Slightly unlike 2 Moderately unlike 1 Extremely unlike
Mentoring or coaching
Online text-based training module
Online video-based training module
Live lecture
Flipped classroom
Q11–Q15 (Domain 2): Ethical and Participant Safety Considerations: Care of patients, aspects of human subject protection, and safety in the conduct of a clinical trial (The same options matrix was used)
Q16–Q20 (Domain 3): Medicines Development and Regulation: Knowledge of how drugs, devices, and biologicals are developed and regulated (The same options matrix was used)
Q21–Q25 (Domain 4): Clinical Trials Operations, Good Clinical Practices (GCPs): Study management, GCP compliance; safety management (adverse event identification, reporting, post market surveillance, pharmacovigilance), and handling of investigational products (The same options matrix was used)
Q26–Q30 (Domain 5): Study and Site Management: Content required at the site level to run a study (financial and personnel aspects). Includes site and study operations (excluding regulatory and GCPs) (The same options matrix was used)
Q31–Q35 (Domain 6): Data Management and Informatics: Data acquisition and management during a clinical trial, including source data, data entry, queries, quality control, correction, and the concept of a locked database (The same options matrix was used)
Q36–Q40 (Domain 7): Leadership and Professionalism: Principles and practice of leadership and professionalism in clinical research (The same options matrix was used)
Q41–Q45 (Domain 8): Communication and Teamwork: Communication practices within the site and between the site and sponsor, Clinical Research Organization (CRO), and regulators, and teamwork skills necessary for conducting a clinical trial (The same options matrix was used)

Results

In total, 160 active CRCs were invited to participate in this study. Of those, 87 responded, for a response rate of 54.4%. Demographic information, including gender, highest degree, age, years as a CRC, and department affiliation, is shown in Table 2. On average, coordinators reported that they slightly or moderately liked the selected 5 training delivery methods for conveying requisite information related to each of the 8 competency content domains (see Table 3). Overall, participants reported a preference for online video-based training (mean=45.41, SD=9.55) and least preferred mentoring or coaching (mean=40.89, SD=12.69). For knowledge of scientific concepts related to the design and analysis of clinical trials (mean=5.84, SD=1.34), study and site management (mean=5.71, SD=1.48), and leadership and professionalism (mean=5.68, SD=1.45), the coordinators preferred live lecture. For ethical and participant safety considerations (mean=5.77, SD=1.35), medicines development and regulation (mean=5.83, SD=1.32), clinical trials operations (mean=5.86, SD=1.17), data management and informatics (mean=5.69, SD=1.44), and communication and teamwork (mean=5.52, SD=1.49), the coordinators reported a preference for online video-based training.

Table 2.

Overview of demographics (n=87)

Demographic information n (%)
Gender
Female 75 (86.2%)
Male 12 (13.8%)
Highest degree
Some college or 2 year degree 8 (9.2%)
4 year degree 30 (34.5%)
Professional degree 36 (41.4%)
Doctorate 13 (14.9%)
Age
18–24 4 (4.6%)
25–34 20 (23%)
35–44 25 (28.7%)
45–54 18 (20.7%)
Older than 55 20 (23%)
Years of being CRCs
<1 year 6 (6.9%)
1–2 years 18 (20.7%)
3–5 years 20 (23%)
6–10 years 21 (24.1%)
11–20 years 19 (21.8%)
>20 years 3 (3.4%)
Department
Pediatrics 10 (11.5%)
Medicine or surgery related 49 (56.3%)
Social behavioral research 11 (12.6%)
Other 7 (8%)
Missing value 10 (11.5%)

CRC, clinical research coordinator.

Table 3.

Mean and SD of platform by domain (n=87)

Mentoring or coaching Online text-based training Online video-based training Live lecture Flipped classroom
Overall* [mean (SD)] 40.89 (12.69) 41.59 (12.02) 45.41 (9.55) 44.8 (9.41) 43.25 (11.81)
Scientific concepts and research design 5.16 (1.84) 5.24 (1.62) 5.67 (1.37) 5.84 (1.34) 5.56 (1.54)
Ethical and participant safety considerations 4.76 (2.01) 5.31 (1.69) 5.77 (1.35) 5.65 (1.34) 5.33 (1.64)
Medicines development and regulation 4.64 (1.96) 5.31 (1.69) 5.83 (1.32) 5.51 (1.53) 5.23 (1.68)
Clinical trials operations 4.85 (2.08) 5.32 (1.63) 5.86 (1.17) 5.51 (1.49) 5.43 (1.67)
Study and site management 5.57 (1.79) 5.14 (1.69) 5.67 (1.37) 5.71 (1.48) 5.33 (1.65)
Data management and informatics 5.13 (1.95) 5.21 (1.75) 5.69 (1.44) 5.49 (1.41) 5.44 (1.63)
Leadership and professionalism 5.49 (1.83) 5.10 (1.75) 5.48 (1.53) 5.68 (1.45) 5.56 (1.61)
Communication and teamwork 5.29 (1.78) 5.10 (1.72) 5.52 (1.49) 5.46 (1.52) 5.37 (1.63)
*

The overall score is the sum of the 8 domain ratings for a given method and ranges from 8 to 56. A higher overall score indicates that the coordinators believe the specific platform works better in general.
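As an illustration only, the pandas sketch below shows how descriptives of this shape (per-domain mean and SD by delivery method, plus an overall score summing each coordinator’s 8 domain ratings per method) can be assembled; the simulated data and the column names are assumptions rather than the study’s actual data structure.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
domains = [f"domain_{i}" for i in range(1, 9)]
methods = ["mentoring", "text", "video", "lecture", "flipped"]

# Placeholder long-format ratings: one row per coordinator x domain x method.
long_all = pd.DataFrame(
    [{"crc_id": c, "domain": d, "method": m, "rating": int(rng.integers(1, 8))}
     for c in range(87) for d in domains for m in methods]
)

# Per-domain mean (SD) by delivery method, as in the body of Table 3.
per_domain = long_all.groupby(["domain", "method"])["rating"].agg(["mean", "std"]).round(2)

# Overall score: each coordinator's 8 domain ratings summed per method (range 8-56), then averaged.
overall = (
    long_all.groupby(["crc_id", "method"])["rating"].sum()
    .groupby(level="method").agg(["mean", "std"]).round(2)
)
print(per_domain.head(10), overall, sep="\n\n")
```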

Results of 1-way repeated measures ANOVA showed statistically significant differences (see Table 4) in CRCs’ preferences with respect to ethical and participant safety considerations (partial η²=0.065, F(1, 85)=5.917, p=0.017), medicines development and regulation (partial η²=0.091, F(1, 85)=8.509, p=0.005), as well as clinical trials operations (partial η²=0.059, F(1, 86)=5.375, p=0.023). Regarding the remaining competency domains, scientific concepts and research design, study and site management, data management and informatics, leadership and professionalism, and communication and teamwork, there were no statistically significant preference differences among the 5 training delivery methods.
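The reported effect sizes follow from the F statistics and their corrected degrees of freedom via partial η² = (F·df1)/(F·df1 + df2); the short check below (plain Python, not taken from the paper) reproduces the three values above.

```python
# Recovering the reported effect sizes from F and its degrees of freedom:
# partial eta squared = (F * df1) / (F * df1 + df2)
def partial_eta_squared(f_value: float, df1: int, df2: int) -> float:
    return (f_value * df1) / (f_value * df1 + df2)

print(round(partial_eta_squared(5.917, 1, 85), 3))  # ~0.065, ethical and participant safety
print(round(partial_eta_squared(8.509, 1, 85), 3))  # ~0.091, medicines development and regulation
print(round(partial_eta_squared(5.375, 1, 86), 3))  # ~0.059, clinical trials operations
```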

Table 4.

Selected repeated measures analysis of variance and posthoc result of training method comparison on each competency domain (n=87)

Selected pair comparison (posthoc)
df Mean square F p Partial η² Observed power Pair* Mean difference SE p value
Ethical and participant safety considerations 1, 85 53.991 5.917 0.017 0.065 0.672 1 vs. 3 −1.023 0.260 0.002
1 vs. 4 −0.895 0.249 0.005
1 vs. 5 −0.581 0.197 0.040
Medicines development and regulation 1, 85 66.572 8.509 0.005 0.091 0.822 1 vs. 3 −1.198 0.227 <0.001
1 vs. 4 −0.872 0.217 0.001
1 vs. 5 −0.593 0.202 0.043
2 vs. 3 −0.523 0.171 0.029
3 vs. 5 0.605 0.192 0.029
Clinical trials operations 1, 86 46.377 5.375 0.023 0.059 0.630 1 vs. 3 −1.011 0.246 0.001
2 vs. 3 −0.540 0.154 0.007
*

1, Mentoring or coaching; 2, online text-based training; 3, online video-based training; 4, live lecture; 5, flipped classroom.

The posthoc analysis of ethical and participant safety considerations (Table 4) indicated statistically significant differences in the CRCs’ preferences between mentoring or coaching and online video-based training (mean difference [MD]=1.023, SE=0.260, p=0.002), between mentoring or coaching and live lecture (MD=0.895, SE=0.249, p=0.005), and between mentoring or coaching and flipped classroom (MD=0.581, SE=0.197, p=0.040). There were no statistically significant differences among the other pairs. In other words, for learning ethical and participant safety considerations, the coordinators preferred online video-based training, live lecture, and the flipped classroom over mentoring or coaching. Except for the mentoring or coaching method, the remaining 4 training delivery methods were equally preferable for the ethical and participant safety considerations domain.

The posthoc analysis of medicines development and regulation (Table 4) indicated statistically significant differences in the coordinators’ preferences between mentoring or coaching and online video-based training (MD=1.198, SE=0.227, p<0.001), between mentoring or coaching and live lecture (MD=0.872, SE=0.217, p=0.001), between mentoring or coaching and flipped classroom (MD=0.593, SE=0.202, p=0.043), between online text-based training and online video-based training (MD=0.523, SE=0.171, p=0.029), and between online video-based training and flipped classroom (MD=0.605, SE=0.192, p=0.029). There were no statistically significant differences among the other pairs. In other words, the coordinators preferred online video-based training, live lecture, and the flipped classroom over mentoring or coaching, and they preferred online video-based training over online text-based training and the flipped classroom for the medicines development and regulation domain.

The posthoc analysis of clinical trials operations (Table 4) indicated statistically significant differences in the coordinators’ preferences between mentoring or coaching and online video-based training (MD=1.011, SE=0.246, p=0.001) and between online text-based training and online video-based training (MD=0.540, SE=0.154, p=0.007). There were no statistically significant differences among the other pairs. In particular, the coordinators had less preference for the mentoring or coaching and online text-based training delivery methods compared with online video-based training when studying clinical trials operations.

The result of the split-plot ANOVA suggested that there were no differences in preference patterns for the training delivery methods by the demographic variables, including gender, highest degree, age, years as a CRC, and department affiliation.
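The paper does not name the adjustment applied to these pairwise comparisons, so the sketch below is only an analogous open-source approach: paired t-tests over all 10 method pairs with a Bonferroni family-wise correction, run on simulated placeholder ratings (scipy and statsmodels; the data and the Bonferroni choice are assumptions, not the authors’ procedure). A closing comment points to pingouin.mixed_anova as one way to approximate the split-plot analysis with a between-subjects demographic factor.

```python
from itertools import combinations

import numpy as np
from scipy.stats import ttest_rel
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
methods = ["mentoring", "text", "video", "lecture", "flipped"]
# Placeholder 7-point Likert ratings for one domain; real values came from the Qualtrics survey.
ratings = {m: rng.integers(1, 8, size=87).astype(float) for m in methods}

pairs = list(combinations(methods, 2))  # all 10 method pairs
raw_p = [ttest_rel(ratings[a], ratings[b]).pvalue for a, b in pairs]

# Bonferroni shown as one common family-wise correction (the paper does not state its choice).
reject, adj_p, _, _ = multipletests(raw_p, alpha=0.05, method="bonferroni")
for (a, b), p, sig in zip(pairs, adj_p, reject):
    print(f"{a} vs {b}: adjusted p = {p:.3f}{' *' if sig else ''}")

# For the split-plot (mixed) ANOVA with a demographic between-subjects factor, an open-source
# analogue is pingouin.mixed_anova(data=long_df, dv="rating", within="method",
# between="gender", subject="crc_id") on long-format data.
```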

Discussion

The researchers explored CRC participants’ preferences for training delivery methods across 8 standard content domains and examined the degree to which demographic variables were related to those preferences. Overall, participants reported a preference for online video, whereas mentoring or coaching was least preferred. There were statistical differences in the delivery methods for selected content domains, including Ethical and Participant Safety Considerations, Medicines Development and Regulation, and Clinical Trials Operations. No significant differences across delivery methods and content domains by demographic variables were observed. This is the first study that we are aware of that has quantitatively assessed participant preferences for content delivery methods across the 8 JTF competency domains.

Limitations of this study include the use of a convenience sample and the inherent potential for social desirability bias in self-report surveys. The generalizability of the findings is limited to the participants in this study.

Other researchers have studied CRC preferences for training delivery methods. Speicher et al. [7] asked participants to indicate which types of training they wanted to provide for newly hired CRCs and found that mentorship, online training modules, orientation courses, conferences, and book trainings predominated. Jones et al. [11] asked CRCs to rate their preferences for teaching strategies, including distance education (i.e., online, email); experiential learning opportunities; portfolio development; virtual clinical trial practicum; simulation, mock patients, and case studies in clinical research; opportunities to interact with international coordinators (via email); and the traditional classroom setting. Unlike the Jones and Speicher studies, we quantified participant preferences. In another study [13], when asked to indicate a preference for online or classroom learning, both novice and experienced CRCs preferred classroom learning; notably, the options for teaching strategies in that study were more limited. Findings from the present study provide insight into participant preferences for training delivery methods related to clinical research competency domains.

Participant preferences may be viewed through a learning theory lens to shed light on the preferred design of competency-driven CRC training. The data obtained in this study can be used to inform future instructional design of CRC competency training and professional development programs. For example, asking participants to rate the training delivery methods is compatible with Malcolm Knowles’ adult learning theory, andragogy [14]. The CRCs’ ratings of the possible training delivery methods in this study can be viewed as a form of practicing andragogy.

It was perhaps somewhat surprising that the mentoring or coaching training method was less preferred, given the myriad benefits attributed to mentoring [15]. However, the CRC field lacks a history and culture of formal mentoring. Perhaps developing a peer-to-peer support network, like the Mentor Academy programs [16] that exist in some CTSAs, would be advisable. At this institution, programmatic efforts have been undertaken to develop a peer-to-peer support network [17] and to use hybridized content delivery. Drawing upon these experiences, the researchers have found that an approach that combines classroom and online learning embedded within a community network may hold the most promise for standardizing CRC training [14, 17, 18, 19].

The Enhancing Clinical Research Professionals’ Training and Qualifications (ECRPTQ) National Center for Advancing Translational Sciences supplement project [20] identified at least 334 training courses delivered in various formats across emerging training platforms. However, it is critical to understand what it means to provide essential training and to ensure that, regardless of the platform, model, or delivery method used, trainings are firmly linked to a meaningful integration of established core competencies. Other CTSA institutions should look to the JTF for Clinical Trial Competency conceptual framework [8] to import competency language into local educational and training initiatives and then establish a common competency framework across the consortium. Notably, this framework has been used to define professional competency across the clinical research enterprise. Subsequently, CTSA investigators in ECRPTQ established and vetted a set of standards. Although the consortium of CTSA sites impacts diverse audiences, they are linked together by a common focus: excellence in clinical research. Findings from our study offer guidance to those charged with developing training for CRCs.

Conclusion

The results of our survey reveal participants’ tacit desire for online video offerings for some competency content. This observation has been influential in promoting the expansion of our training delivery portfolio. Currently, we are developing more online video content and peer-to-peer support networks, and we are collaborating with instructional design experts in our training and development office to provide hybrid certification classes. Using the findings from this study as a guide, we plan to develop a suite of online training videos that will eventually span the 8 JTF domains while locally contextualizing their application. The goal is to convey that the role of a research coordinator is grounded in the need for a facile grasp of what it means to conduct clinical research in a safe, competent, and compliant manner, situated in the framework of the JTF Clinical Research Competencies. These online videos will enrich our hybrid classroom experiences across our peer-to-peer mentoring and support networks as we continue to combine classroom and online learning while remaining cognizant of the cultural and specific workplace needs embedded within our community network.

Acknowledgments

The authors would like to acknowledge the support of the University of Florida Clinical and Translational Science Institute (CTSI).

Disclosures

The authors have no conflicts of interest to declare.

Footnotes

Financial Support

Research reported in this publication was supported by the University of Florida Clinical and Translational Science Institute, which is supported in part by the NIH National Center for Advancing Translational Sciences under award number UL1TR001427. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

References

  • 1. Woodin K. The CRC’s Guide to Coordinating Clinical Research, 3rd edition. Boston, MA: Thomson CenterWatch, 2016.
  • 2. Zerhouni EA. Translational and clinical science—time for a new vision. New England Journal of Medicine 2005; 353: 1621–1623.
  • 3. World Medical Association. World Medical Association Declaration of Helsinki: ethical principles for medical research involving human subjects. JAMA 2013; 310: 2191–2194.
  • 4. Rosenberg RN. Translating biomedical research to the bedside: a national crisis and a call to action. JAMA 2003; 289: 1305–1306.
  • 5. Sparrow MK. The Regulatory Craft: Controlling Risks, Solving Problems, and Managing Compliance. Washington, DC: Brookings Institution Press, 2011.
  • 6. Sung NS, et al. Central challenges facing the national clinical research enterprise. JAMA 2003; 289: 1278–1287.
  • 7. Speicher LA, et al. The critical need for academic health centers to assess the training, support, and career development requirements of clinical research coordinators: recommendations from the Clinical and Translational Science Award Research Coordinator Taskforce. Clinical and Translational Science 2012; 5: 470–475.
  • 8. Sonstein SA, et al. Moving from compliance to competency: a harmonized core competency framework for the clinical research professional. Clinical Research 2014; 28: 17–23.
  • 9. Goldstein IL. Training in Organizations: Needs Assessment, Development, and Evaluation. Pacific Grove, CA: Thomson Brooks/Cole Publishing Co, 1993.
  • 10. Calvin-Naylor NA, et al. Education and training of clinical and translational study investigators and research coordinators: a competency-based approach. Journal of Clinical and Translational Science 2017; 1: 1–10.
  • 11. Jones CT, et al. Education and training preferences of clinical research managers. Research Practitioner 2008; 9: 202–214.
  • 12. Lomax RG, Hahs-Vaughn DL. An Introduction to Statistical Concepts, 3rd edition. New York, NY: Taylor & Francis Group, 2012, p. 503.
  • 13. Behar-Horenstein LS, Potter J, Prikhidko A, Swords S, Sonstein S, Kolb HR. Training impact on novice and experienced research coordinators. The Qualitative Report 2017; 22: 3118–3138.
  • 14. Knowles MS, Holton EF, Swanson RA. The Adult Learner: The Definitive Classic in Adult Education and Human Resource Development. New York, NY: Routledge, 2014.
  • 15. Behar-Horenstein LS, Prikhidko A. Exploring mentoring in the context of team science. Mentoring & Tutoring 2017; 25: 430–454.
  • 16. Behar-Horenstein LS, Feng X, Prikhidko A, Su Y, Kuang H, Fillingim RB. Assessing mentor academy program effectiveness using mixed methods. Under review.
  • 17. Solberg L, Kolb HR, Prikhidko A, Behar-Horenstein LS. Ensuring representativeness in competencies for research coordinators. Clinical Researcher 2018; 32.
  • 18. Behar-Horenstein LS, Prikhidko A, Kolb HR. Advancing the practice of CRCs: why professional development matters. Therapeutic Innovation and Regulatory Science 2018: 1–10.
  • 19. Behar-Horenstein LS, Baiwa W, Kolb HR, Prikhidko A. A mixed method approach to assessing online dominate GCP training platforms. The Clinical Researcher 2017; 31: 38–42.
  • 20. Shanely T, Masour G, Baron R. Enhancing Clinical Research Professionals’ Training and Qualifications (ECRPTQ) competency assessments [Internet], 2015 [cited Apr 7, 2016]. (http://www.ctsa-gcp.org/uploads/3/9/2/5/39256889/ecrptq_assessments_103015_a2.pdf)
