PLOS One. 2025 Sep 15;20(9):e0332587. doi: 10.1371/journal.pone.0332587

The comprehensive researcher development framework (CRDF): Core learning outcomes for research training

Janet L Branchaw 1,2,*, Amanda R Butz 2, Joseph C Ayoob 3
Editor: Amy Prunuske
PMCID: PMC12435680  PMID: 40953066

Abstract

Becoming a researcher involves the iterative development of deep disciplinary knowledge, specific technical skills, and psychosocial attitudes, behaviors, and beliefs. Consequently, training researchers is resource- and time-intensive. In addition, expectations can be opaque because the traditional apprenticeship model used in research training is idiosyncratic, defined by norms and traditions that vary across disciplines. To align and make research training expectations more transparent, we developed the Comprehensive Researcher Development Framework (CRDF) by extracting and analyzing learning outcomes from 56 previously published evidence-based frameworks from across disciplines. The individual frameworks each addressed a limited range of training stages (e.g., undergraduate only), focused on a subset of learning outcomes (e.g., technical skills), and/or included a single or narrow subset of disciplines (e.g., biomedical sciences). The CRDF derived from these frameworks includes 79 core learning outcomes nested under 8 areas of researcher development that are supported by evidence of content validity collected from experts in the research community. The CRDF builds consensus across disciplines and addresses undergraduate through postdoctoral career stages to define a coherent continuum of research learning outcomes that can be used to monitor and study researcher development. The CRDF does not replace existing discipline-based or training stage specific frameworks but rather can link and coordinate their use. The CRDF can be used by research training program directors to design new or refine existing research training programs, track individual research mentee development over time, and demystify the research training process for mentors and mentees. The CRDF can also be used by scholars studying researcher development to link data on core learning outcomes across research training programs, stages, and disciplines.

Introduction

Attracting motivated students with high potential from diverse backgrounds to research careers and providing them with rigorous, yet supportive research training experiences [1,2] is key to building a strong and innovative research workforce. However, training individuals to become researchers is complex and takes time. It involves the development of deep disciplinary knowledge, specific technical skills, and psychosocial attitudes, behaviors, and beliefs that promote integration and belonging in disciplinary research communities [3–6]. The diverse ways of knowing [7], research methodologies, and types of research projects across disciplines, coupled with the apprenticeship model of training used in many disciplines, produce research training journeys that are unique to each student. Consequently, designing and assessing the effectiveness of varied research training pathways can be challenging.

Though approaches and methods for conducting research are always evolving, and learning to do research is a lifelong process, formal research training in most disciplines begins during undergraduate education, is the primary focus of graduate education, and may be extended with postdoctoral training depending on the discipline. Ideally, formal research training across training stages forms a continuum that builds the increasingly sophisticated disciplinary knowledge, perspectives, expertise, professional responsibilities, and relationships that are needed to successfully design and conduct rigorous research. However, consensus across disciplines and training stages about research learning outcomes is limited. Without defined common core learning outcomes, it is difficult to coordinate research training across programs and training stages and between mentors, which can lead to contradictory or ill-defined expectations for mentees.

Inconsistencies across training programs and unclear expectations pose challenges for mentees. These challenges sometimes result in talented mentees abandoning a seemingly uncertain research career path for more well-defined, familiar, or lucrative opportunities outside of research [8–14]. This can be particularly important for mentees with limited research backgrounds, for whom the research culture is unfamiliar and persistence along a research career path uncertain. Systematizing research training and clarifying expectations are key to retaining the talented, high-potential students we need to build the research workforce [3,4].

To clarify, align, and study research training, scholars (including two of the authors, [15]) have conducted studies to identify and understand how researchers develop and have published researcher development frameworks and/or assessments based on their findings (S1 Appendix). Conceptual frameworks are structures that describe “the factors and/or variables involved in (a) study and their relationships to one another” [16]. They can be used to guide training programs, mentors, and mentees in selecting and evaluating the impact of training activities, as well as to assess and monitor mentee development as a researcher over time. The individual frameworks in published studies, however, typically span a limited range of training stages (e.g., undergraduate only), focus on a subset of learning outcomes (e.g., technical skills), and/or include a single or narrow subset of disciplines (e.g., biomedical sciences). Consequently, building a coherent continuum of research training or studying researcher development across training stages and/or disciplines is challenging.

We leveraged the prior work done on the discipline and training stage specific frameworks to demonstrate consensus across disciplines and develop the Comprehensive Researcher Development Framework (CRDF). We analyzed 56 individual frameworks to identify the common knowledge, skills, and psychosocial attitudes, behaviors, and beliefs that researchers develop from the undergraduate through postdoctoral training stages. Through multiple phases of input from the research community, we defined and confirmed the importance of 79 core learning outcomes and organized them into 8 areas of researcher development (Table 1).

Table 1. Comprehensive Researcher Development Framework (CRDF).

1. Foundational Disciplinary Knowledge Understand historical and emerging disciplinary content, concepts, frameworks, and theories and how they relate to other disciplines.
Researchers:
1.01 know the fundamental content in their discipline (e.g., frameworks, theories, and models).
1.02 know the history of knowledge generation in their discipline.
1.03 know the processes by which new knowledge is generated and evaluated.
1.04 ground hypotheses and research questions in established disciplinary knowledge, theories, frameworks, or observations.
1.05 know inferences and implications of research findings.
1.06 know the ways that content knowledge from other disciplines is related to content knowledge in their discipline.
2. Practical and Cognitive Research Skills Know and apply disciplinary knowledge, technical, and reasoning skills to conduct research that advances knowledge in the discipline.
Researchers:
2.01 use tools and databases to search the disciplinary literature.
2.02 use literature search strategies that identify relevant prior research.
2.03 use logical and critical thinking in evaluating research.
2.04 connect diverse research ideas and approaches in novel and creative ways.
2.05 consider alternative approaches and interpretations of research.
2.06 make connections between content knowledge in their discipline and content knowledge in other disciplines.
2.07 identify gaps in existing knowledge or research results to investigate.
2.08 set research goals.
2.09 use disciplinary theories, frameworks and models in designing research studies.
2.10 provide a logical rationale for their study designs.
2.11 know assumptions and limitations in study designs (e.g., reporting uncertainty/error).
2.12 formulate hypotheses and research questions that can be systematically tested or investigated.
2.13 select appropriate methods to investigate research questions.
2.14 follow standard protocols to collect and store research data.
2.15 have up-to-date technical skills to conduct research in the discipline.
2.16 develop new data collection or analytical methods when needed.
2.17 use troubleshooting skills to address theoretical or technical problems in research.
2.18 apply the appropriate analytic and statistical methods to analyze data.
2.19 use disciplinary theories, frameworks and models in analyzing data.
2.20 interpret the results of data analyses (e.g., coding, mathematical, and statistical calculations).
2.21 interpret or synthesize research findings.
2.22 propose new inferences and implications of research findings (their own and/or others’).
2.23 draw conclusions from research findings.
2.24 refine existing and/or contribute new disciplinary theories, frameworks, and models based on research findings.
3. Ethical and Responsible Research Practices Follow guidelines for responsible conduct of research and recognize and respond to ethical issues that impact and emerge from conducting research. 
Researchers:
3.01 follow research safety regulations.
3.02 follow disciplinary data ownership and stewardship practices.
3.03 follow ethical guidelines for working with research data.
3.04 follow guidelines for ethical treatment of research subjects (e.g., individuals, communities, animals, etc.).
3.05 follow guidelines for conducting rigorous and reproducible research in their discipline.
3.06 follow disciplinary norms and policies regarding credit for contributions to research (e.g., citing previous research, authorship order, acknowledging work).
3.07 recognize and minimize legal issues, ethical issues, and potential conflicts of interest in research.
3.08 recognize instances of research misconduct and take steps to address them.
3.09 consider the role of social and cultural factors in research.
3.10 consider the implications of research to individuals and society.
3.11 consider how system structures provide differential access to participation in research.
3.12 act to increase access to research for all.
4. Research Communication Skills Translate and communicate research ideas and findings in multiple formats to multiple audiences.
Researchers:
4.01 construct appropriate ways to present and visualize data.
4.02 use disciplinary conventions to communicate research (e.g., ideas, results, implications) orally (e.g., conference presentations, invited talks, research team meetings).
4.03 use disciplinary conventions to communicate research (e.g., ideas, results, implications) in writing (e.g., research articles, grant proposals, policy briefs).
4.04 translate research findings into policies and practices.
4.05 translate research (e.g., ideas, results, implications) and engage with audiences outside of their research discipline (e.g., scholars in other disciplines, non-research, or general audiences).
4.06 promote and advocate for research through communications to various audiences (e.g., institutions, disciplines, public stakeholders).
5. Interpersonal Research Skills Build relationships and skills to productively interact and collaborate with people from diverse backgrounds and perspectives in the research environment.
Researchers:
5.01 understand and conduct themselves in accordance with the cultural and social norms of professionals in the discipline.
5.02 express respect for others' differences.
5.03 use appropriate and effective interpersonal communication practices with research colleagues.
5.04 manage difficult conversations and conflicts with research colleagues.
5.05 work effectively with others on collaborative and/or interdisciplinary teams.
5.06 consider and include multiple perspectives in decision making.
5.07 make meaningful contributions to collaborative research projects.
5.08 provide critical and constructive feedback on research to colleagues.
5.09 accept, interpret, and modify their research based on constructive criticism and feedback from colleagues.
5.10 network with other research professionals.
6. Researcher Self-Beliefs and Attitudes Develop personal qualities (i.e., curiosity, confidence, identity, self-regulation, self-assessment and perseverance) that are critical for long-term success in research.
Researchers:
6.01 proactively set their research goals and secure the guidance and resources needed to achieve those goals.
6.02 persevere when problems or challenges arise in research (e.g., unexpected, ambiguous, or uncertain results, failed projects).
6.03 recognize and manage their feelings and behaviors in the research environment.
6.04 accurately self-assess their research strengths and weaknesses.
6.05 express curiosity in exploring and conducting research.
6.06 work at an appropriate level of independence.
6.07 manage time to meet individual research project milestones.
6.08 develop confidence in their capability to successfully conduct research.
6.09 identify themselves as a researcher or expert in their discipline.
6.10 engage in practices that support work-life balance (e.g., time management, pursuing interests beyond research).
7. Knowledge and Skills to Pursue a Research or Research-Related Career Apply and translate knowledge and skills as a researcher to identify and pursue a research or research-related career.
Researchers:
7.01 are aware of career pathways related to their research training.
7.02 identify and clarify a long-term strategic vision for their research career.
7.03 translate and apply their research skills and knowledge across career pathways.
7.04 are prepared to pursue research career pathways.
8. Knowledge and Skills to Administer and Manage Research Projects and Teams Develop administrative skills to lead research personnel and projects.
Researchers:
8.01 identify and clarify a long-term strategic vision for a program of research.
8.02 identify opportunities and make decisions about the research to be done.
8.03 mentor other developing researchers using best practices in mentoring.
8.04 know how to identify and secure funding (e.g., investments, grants) to support research in their discipline.
8.05 estimate and secure the funds needed to conduct research.
8.06 track research expenditures.
8.07 have the administrative skills to manage research projects and/or heterogeneous research teams.

Importantly, the CRDF is not meant to replace existing disciplinary or training stage specific frameworks but rather provide a comprehensive benchmark against which these frameworks can be compared and through which they can be linked. The CRDF can be used to study researcher development and research training programs across disciplines and training stages, to guide training program development when specific frameworks do not exist, or as a template from which to build new frameworks for specific disciplines or training stages. The CRDF can also be shared with mentees to make the expectations of research training explicit and empower them to take responsibility for their research training experience. Likewise, it can be shared with mentors to help them articulate expectations and assess their mentees’ progress. Tools based on the CRDF to support these uses are provided in the S2a and S2b Appendix.

Methods

An overview of the process used to develop the CRDF is presented in Fig 1. All human subjects research presented in this article was approved by the University of Wisconsin-Madison’s Institutional Review Board, protocol #2024-0876.

Fig 1. Comprehensive Researcher Development Framework Development Process.


1. Synthesize research

We followed the five stage process of research synthesis outlined by Cooper [17]: Problem Formulation; Literature Search; Data Evaluation; Analysis & Interpretation; and Presentation of Results. The first four stages are described in the Methods section and the final stage in the Results section.

Problem formulation

We sought to answer the question: What common sets of knowledge, skills, and psychosocial attitudes, behaviors, and beliefs do researchers-in-training develop across different disciplines and training stages (i.e., undergraduate through postdoctoral)?

Literature search

We conducted a comprehensive review of the literature published in the last quarter century (2000–2024) to identify research development frameworks across disciplines from undergraduate to postdoctoral career stages that were supported with evidence of validity. Prior to conducting the literature search, key terms to inform the search were defined:

  • Research knowledge: specific disciplinary knowledge; knowledge about the process of investigating or exploring the unknown needed to conduct disciplinary research and/or to advance the discipline; knowledge of the structures and resources that support research in the discipline (e.g., peer review practices, funding mechanisms).

  • Research skills: sets of skills that trainees develop while engaging in research that advance their development as researchers.

  • Research psychosocial attitudes, behaviors, and beliefs: disciplinary cultural norms, ways of networking, and engaging in interpersonal interactions; development of identity as a researcher.

  • Researcher development frameworks: evidence-based organization of concepts or ideas around researcher development; research skill assessment.

The scope of the literature search included identifying undergraduate, graduate, and postdoctoral frameworks in the physical, life, or social sciences, the arts/humanities, or multiple disciplines. Searches were conducted in Web of Science and Ebsco Academic Search for articles published from 2000 to 2024 using keywords. In the following list of keywords, brackets indicate alternate iterations of the same search term, quotation marks were used to search for exact text, and an asterisk served as a wildcard to capture similar terms: [undergraduate, graduate, postdoctoral] research “development framework” or framework; [undergraduate, graduate, postdoctoral] “research* competencies”; “research competency” assessment; “research learning” assessment; “research framework” assessment.

In addition, we reviewed professional organization websites for the Council on Undergraduate Research, Council of Graduate Schools, and the National Postdoctoral Association for guidance on key skills that mentees may be expected to learn at each training stage. Finally, frameworks known to us or referenced in articles found in the initial search were incorporated.

Data evaluation

The articles identified in the literature search were evaluated to determine if they met 3 inclusion criteria: 1) They must define specific areas of knowledge, skill, or psychosocial development; 2) They must address undergraduate, graduate, and/or postdoctoral research training; and 3) They must be published with at least 1 source of validity evidence as defined by the Standards for Educational and Psychological Testing [18], i.e., evidence based on test content, response processes, internal structure, or relations to other variables or criteria. Only articles meeting all three criteria were included in the study.

Analysis & interpretation

The articles that met the inclusion criteria were reviewed, and individual knowledge, skill, and psychosocial elements of researcher development were identified and extracted. Three researchers, a discipline-based educational researcher trained in neurophysiology research methods (JB), an educational psychologist and academic motivation researcher trained in quantitative and qualitative social science research methods (AB), and a director of training and mentoring programs in computational and systems biology and discipline-based educational researcher trained in cellular and developmental neuroscience and molecular genetics (JA), independently analyzed the extracted elements and proposed themes to code the elements through an open coding process [19]. Each researcher organized the elements into similar groups and assigned themes to their groups. The researchers met to compare and discuss the individually proposed group themes and agreed on an initial set of themes and definitions to begin coding. The researchers met to compare, discuss, and revise the themes throughout the coding process. Elements representing general headings or non-research skills (e.g., teaching skills) in the published frameworks were removed from the list of elements.

2. Draft and Iteratively Revise Core Learning Outcomes

One researcher (JB) compared, contrasted, and grouped the elements coded under each of the final themes to draft an initial set of core learning outcomes that represented specific knowledge, skills, and psychosocial attitudes, behaviors, and beliefs. The other two researchers (AB and JA) reviewed the initial drafts, and the team met to discuss their feedback and make revisions. To ensure that all elements extracted from the original source frameworks were represented, JB and AB mapped the original elements to the resulting draft core learning outcomes. Iterative revision of the learning outcomes based on feedback from the research community continued throughout the development process.

3. National Survey to Collect Feedback and Evidence of Content Validity

The draft core learning outcomes were organized into 5 broad categories for use in an online survey to gather feedback from professional researchers (postdoctoral scholars and faculty/staff) across the nation. The broad categories on the survey were: Thinking and Communicating about Research, Researcher Self-Beliefs and Attitudes, Research Career Readiness, Relationships in the Research Environment, and Conducting Research. The S4 Appendix tracks all the learning outcome revisions from the initial to the final versions. While the survey was initially designed to capture data on all 79 learning outcomes, 4 learning outcomes related to career pathways were not rated for importance due to a survey error; importance results were therefore available for only 75 learning outcomes.

The survey was sent to individuals in the researchers’ networks, as well as several local, regional, and national groups (see Table in the S5 Appendix for a full listing of groups contacted). Survey respondents were asked to report the level of research mentee(s) with whom they work, their disciplinary area of expertise, their current role, and their institution. To avoid survey fatigue, each respondent was randomly asked one of two questions: 1) How important is achieving each learning outcome to becoming a mature, independent scholar in your discipline? (scale: not relevant to research professionals in my discipline, not important, slightly important, moderately important, extremely important); or 2) At what stage of training in your discipline do research scholars make the greatest gains toward each learning outcome? (scale: not emphasized in training, undergraduate/post-baccalaureate, graduate/professional, postdoctoral). Once a respondent answered the first question for each draft learning outcome, they were given the option to answer the other question for each draft learning outcome. After answering one (or both) questions, respondents were invited to give feedback on the draft learning outcomes and to submit any learning outcomes important to their discipline that were missing from the list.

4. Define Framework Categories to Organize Learning Outcomes

Research community members were invited to participate in 2 rounds of card sorting exercises to organize the 78 core learning outcomes into categories for the framework. Card sorting is used to understand how people group and categorize information [20]. Card sorting exercise participants were recruited through email listservs for graduate training program leaders and postdoctoral scholars at a single research university in the Midwest. The message invited them to sign up and asked them to share the invitation with others in their networks who might be interested in participating.

Open card sorting was conducted in groups that included faculty, research staff, and postdoctoral scholars from multiple disciplines. Participants worked collaboratively to open-sort 78 cards, each with a core learning outcome, into categories defined by the group. First, participants discussed the meaning of each learning outcome, then iteratively organized them into groups, and then named their final groups. Important points of discussion and feedback on the learning outcomes were documented by researchers in real time and used for subsequent revision of the learning outcomes. Data from across groups were combined, and based on analysis of the combined data set, an initial set of categories of researcher development were derived. The categories defined by each group and the cards associated with each group were entered into a spreadsheet that was developed to analyze card sort data [21]. Similar categories offered by groups were merged into standardized categories prior to analysis and the percent agreement among all groups was examined to determine whether the standard categories accurately represented each group of learning outcomes.
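The percent-agreement check described above can be sketched in a few lines; this is a minimal illustration assuming a hypothetical data layout (one dict per group, mapping learning-outcome IDs to that group's standardized category), not the study's actual analysis spreadsheet.

```python
from collections import Counter

def percent_agreement(sorts, outcome_ids):
    """For each learning outcome, the share of groups that placed it
    in the modal (most common) standardized category."""
    agreement = {}
    for oid in outcome_ids:
        assigned = [s[oid] for s in sorts if oid in s]
        modal_count = Counter(assigned).most_common(1)[0][1]
        agreement[oid] = modal_count / len(assigned)
    return agreement

# Three hypothetical group sorts of two learning outcomes
sorts = [
    {"2.01": "Conducting Research", "6.05": "Self-Beliefs"},
    {"2.01": "Conducting Research", "6.05": "Self-Beliefs"},
    {"2.01": "Communication",       "6.05": "Self-Beliefs"},
]
result = percent_agreement(sorts, ["2.01", "6.05"])
# result["2.01"] is 2/3; result["6.05"] is 1.0
```

High agreement across groups would indicate that a standardized category accurately represents its learning outcomes, as the text describes.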

Using the categories from the open card sorting exercise, faculty, research staff, postdoctoral scholars, graduate students, and undergraduate students from across the country participated in a second, closed card sorting exercise (i.e., all categories were predetermined). Participants either took part in in-person group sessions on the same research university campus as the open card sort exercise or sorted individually online, using an online card sorting tool [22], at multiple campuses across the country. All were asked to sort the 78 learning outcome cards into the categories generated by the open card sorting exercise. Participants were required to select one primary category for each learning outcome, though many outcomes could reasonably be assigned to more than one category; participants were able to note other categories to which they considered assigning a particular learning outcome. Data from the groups were weighted by the number of participants and combined with the individual card sort data. The combined data were analyzed for patterns to determine under which primary category each learning outcome should nest.
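The weighting-and-combining step can be illustrated with a small sketch; the data layout and category names below are hypothetical assumptions for illustration, not taken from the study's materials.

```python
from collections import defaultdict

def combine_sorts(group_sorts, individual_sorts):
    """Tally weighted category votes per learning outcome: a group's
    assignment counts once per participant; each individual counts once."""
    votes = defaultdict(lambda: defaultdict(float))
    for n_participants, assignments in group_sorts:
        for oid, category in assignments.items():
            votes[oid][category] += n_participants
    for assignments in individual_sorts:
        for oid, category in assignments.items():
            votes[oid][category] += 1
    # Primary category = the one with the most weighted votes
    return {oid: max(cats, key=cats.get) for oid, cats in votes.items()}

# Two hypothetical group sorts (participant count, assignments) and
# three individual sorts, all for a single learning outcome
groups = [(5, {"4.01": "Communication"}),
          (3, {"4.01": "Conducting Research"})]
individuals = [{"4.01": "Conducting Research"}] * 3
primary = combine_sorts(groups, individuals)
# Communication gets 5 weighted votes; Conducting Research gets 3 + 3 = 6
```

Weighting by group size keeps a five-person session from counting the same as a single online sorter, which is the rationale the text gives for combining the two data sources.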

5. Back Map Source Framework Elements to Learning Outcomes and Categories to Confirm Coverage

To ensure that all the elements extracted from the original source frameworks were represented in the final core learning outcomes, the researchers back mapped each extracted element to one or more of the final learning outcomes. The elements were divided into three groups and each researcher mapped one group. Then, a second researcher reviewed the mapping and either confirmed it or flagged it for further discussion. Flagged elements were reviewed and discussed by all researchers to reach agreement on the learning outcome(s) to which each should be mapped.

Results

1. Synthesize research

The literature search yielded over 13,000 results. Articles were flagged for further review if they addressed undergraduate, graduate, or postdoctoral training stages and related to research trainee development. All but 123 articles were excluded from further review because they described empirical research rather than researcher learning outcomes or researcher development frameworks.

Of the 123 articles reviewed, 56 met the criteria for inclusion: 34 were identified in either Ebsco or Web of Science, 14 were referenced in other articles, 1 was a professional society framework, and 7 were known to the authors. Thirty-four percent of the articles were published by researchers outside of the United States. The sources, validity evidence, and the discipline(s) and career stage(s) of these 56 frameworks [5,15,2376] were documented (S1 Appendix). Figs 2 and 3 show summaries of the career stage(s) and disciplines addressed in the articles.

Fig 2. Career stages addressed in included framework articles (N = 56).


Note that some articles addressed more than one career stage.

Fig 3. Research disciplines addressed in included framework articles (N = 56).


Note that some articles addressed more than one discipline.

The three researchers reviewed, identified, and extracted 1,434 elements from the 56 frameworks. Removal of general headings and non-research skill elements left 1,343 elements to be coded. Each researcher individually reviewed the elements and proposed themes to use in coding. Through discussion, they agreed on an initial shared set of 44 themes to begin coding. Through three rounds of coding, the themes and discrepancies in code assignments were discussed, and revisions to the themes and code assignments were made. A final set of 48 themes was used, and consensus [19] was reached on the theme codes for all 1,343 elements (45% agreement in the first round, 73% in the second, and 100% in the third). A full list of the elements and code themes is available in the S3 Appendix.
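One simple way to compute round-by-round agreement figures like those reported above is the share of elements on which all three coders assigned the same theme; the sketch below uses invented theme labels purely for illustration and is not the study's analysis code.

```python
def full_agreement(codes_a, codes_b, codes_c):
    """Share of elements for which all three coders assigned the same theme."""
    matches = sum(1 for a, b, c in zip(codes_a, codes_b, codes_c)
                  if a == b == c)
    return matches / len(codes_a)

# Hypothetical theme assignments by three coders for four elements
coder_a = ["rigor", "ethics", "identity", "funding"]
coder_b = ["rigor", "ethics", "identity", "mentoring"]
coder_c = ["rigor", "ethics", "communication", "mentoring"]
rate = full_agreement(coder_a, coder_b, coder_c)
# Coders agree on 2 of 4 elements, so rate is 0.5
```

Elements below full agreement are exactly the ones the researchers would discuss and re-code in the next round, which is how agreement can climb from 45% to 100% across rounds.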

2. Draft and iteratively revise core learning outcomes

One researcher (JB) reviewed the elements coded in each theme and drafted 78 core learning outcomes. These were reviewed by the other two researchers (AB and JA) and the three researchers met to discuss and revise the draft core learning outcomes to generate a revised initial set of 79 core learning outcomes (S4 Appendix).

3. National survey to collect feedback and evidence of content validity

To collect feedback on the draft learning outcomes and evidence of content validity (i.e., validity evidence based on the relationship between the content and the construct it is intended to measure, as determined by experts [18]), a national online survey of researchers was conducted, asking about the importance of each learning outcome and the career stage at which it is emphasized. Overall, 169 individuals responded to the survey. A summary of the characteristics of the survey respondents is presented in Table 2.

Table 2. Characteristics of National Survey Respondents (N = 169).

Career Stage: Number of respondents (%)
Research Trainees: 10 (6%)
Faculty/Staff: 102 (60%)
Research Training Program Leaders: 25 (15%)
Not reported: 32 (19%)

Discipline: Number of respondents (%)
Biological Sciences: 75 (44%)
Physical Sciences: 40 (24%)
Social Sciences: 32 (19%)
Arts & Humanities: 9 (5%)
Cross-Disciplinary: 13 (8%)

Note: Individuals classified as Cross-Disciplinary included those who ran interdisciplinary training programs as well as training programs that served multiple disciplines.

Learning outcome importance

Of our 169 survey respondents, 123 rated how important each learning outcome was to researcher development in their discipline (Fig 4). The overwhelming majority of respondents categorized most of the learning outcomes as moderately or extremely important for the development of a researcher in their field. This evidence of content validity from experts supports that the core learning outcomes are relevant to research training across disciplines.

Fig 4. Percentage of 75 learning outcomes for which 50% or more of respondents selected each level of importance in response to the question “How important is achieving each learning outcome in your discipline?” (N = 123).


Though there is some variability in level of importance across disciplines, several learning outcomes were consistently rated as extremely important. Table 3 shows the learning outcomes that 80% or more of respondents agreed were extremely important, with perseverance most frequently rated as extremely important.

Table 3. Learning Outcomes Rated as Extremely Important by 80% or More of Respondents.

Learning Outcome, % of Overall Sample (N = 123)
persevere when problems or challenges arise in research. 91%
draw conclusions from research results. 90%
interpret or synthesize research results. 86%
can interpret the results of analyses. 86%
use logical and critical thinking in evaluating information and knowledge and in conducting research. 83%
know and follow responsible research practices. 83%

Note: The learning outcomes were iteratively revised throughout development based on feedback from the research community. Therefore, some learning outcomes are slightly different than those in the final CRDF (Table 1).

Overall, 50 of the 75 learning outcomes (66.67%) were rated as extremely important by over 50% of the survey respondents. For the remaining 25 (33.33%), there was no majority rating, nor were there discernible rating patterns based on learning outcome content. We observed some variation in the level of importance by discipline, with individuals from different disciplines agreeing on the specific level of importance for 29 (38.67%) of the 75 learning outcomes. When the moderately important and extremely important ratings are combined, the level of agreement increases to 65 (86.67%).

Learning outcome training stage

We also received input from 112 survey respondents, who rated the career stage at which researchers in training make the greatest gains toward each of the 79 learning outcomes in their discipline (Fig 5). Overall, 51 (64.56%) of the learning outcomes were reported as emphasized during graduate education by over 50% of the survey respondents, 7 (8.86%) during undergraduate education, and 6 (7.59%) during postdoctoral training. There was no majority rating for the remaining 15 (18.99%) learning outcomes, which were reported as addressed during various career stages. When examined within each discipline, trends in the sciences reflected the overall trend, but the career stage at which learning outcomes were addressed in the Arts & Humanities was shifted primarily to the earlier undergraduate stage. However, given the small number of respondents in the Arts & Humanities, it is not possible to draw conclusions about these differences.

Fig 5. Percentage of 79 learning outcomes for which 50% or more of respondents selected the career stage at which greatest gains were made. (N = 112).


Notably, the data show that 7 learning outcomes (3.09, 3.11, 3.12, 4.04, 4.06, 6.10, and 8.06) were rated as not emphasized in training (Fig 5) but were rated as moderately or extremely important (Fig 4) by most respondents. This suggests that research training program directors should review whether their programs address these important learning outcomes and, if not, integrate new learning activities or experiences to do so.

In addition to rating the learning outcomes for importance and career stage, survey respondents were asked to provide feedback on the learning outcomes. They were asked to comment on whether the language used was appropriate for their discipline or if any learning outcomes important for their discipline were missing. Based on this feedback and ongoing review by the research team, 21 learning outcomes were revised, 9 were merged, 5 were added, and 1 was deleted. The total number of learning outcomes at the end of this stage of development was 78. See the S4 Appendix for details about the learning outcome revisions made from the initial drafts to the final versions.

  • 4. Define Framework Categories to Organize Learning Outcomes

We solicited input from those in the research community engaged in research training (practitioners and trainees), as well as those who study research training (educational researchers), to organize the learning outcomes into categories (areas of researcher development) that would be useful to the community. Two separate phases of card sorting activities were implemented. In the first phase, expert research practitioners with deep knowledge of research training (faculty, research staff, and postdoctoral scholars) from across disciplines were recruited to work in groups to open sort the learning outcomes into categories representing areas of researcher development. The preliminary researcher development categories generated across these groups were analyzed, and a consensus list was developed. In the second phase, professional research practitioners as well as undergraduate and graduate research students used the categories generated from the open sorting exercise to assign the learning outcomes to the categories in a closed sorting exercise. The characteristics of the card sorting exercise participants are summarized in Table 4.

Table 4. First and Second Round Card Sort Participant Characteristics.

Open Sort – Generate Categories: In-person (N = 29)
Closed Sort – Assign Learning Outcomes to Categories: In-person (N = 16); Online (N = 46)
Values are N (%), in the column order: Open Sort In-person; Closed Sort In-person; Closed Sort Online.
Career Stage
 Undergraduate Students 0 (0%) 5 (31%) 0 (0%)
 Graduate Students 0 (0%) 6 (38%) 4 (9%)
 Postdoctoral Scholars 13 (45%) 0 (0%) 12 (26%)
 Faculty or Research Staff 16 (55%) 5 (31%) 18 (39%)
 Not reported 0 (0%) 0 (0%) 12 (26%)
Discipline
 Biological Sciences 19 (66%) 7 (44%) 23 (68%)
 Physical Sciences 8 (28%) 8 (50%) 3 (9%)
 Social Sciences 2 (7%) 7 (44%) 11 (32%)
 Arts & Humanities 1 (3%) 1 (6%) 1 (3%)
 Not reported 12 (26%)
Gender
 Woman 6 (21%) 7 (44%) 21 (46%)
 Man 7 (24%) 6 (38%) 11 (24%)
 Another gender identity 0 (0%) 1 (6%) 2 (4%)
 Not reported 16 (55%) 1 (6%) 12 (26%)
Race/Ethnicity
 White 11 (38%) 11 (69%) 27 (59%)
 Non-White 2 (7%) 5 (31%) 7 (15%)
 Not reported 16 (55%) 0 (0%) 12 (26%)

Note: Some participants reported more than one discipline, gender, or race/ethnicity. Gender and race/ethnicity information was collected as part of a follow-up survey for open card sort participants, so higher numbers of not-reported responses appear in those cells. Non-White included individuals who identified as American Indian or Alaskan Native, Asian, Hispanic or Latinx, or with two or more races. Another gender identity includes individuals who reported their gender as genderqueer, non-binary, or transgender. The numbers in these categories are reported together to protect participants’ confidentiality.

Open card sort to define researcher development categories

The recruited research professionals worked in nine separate groups of two to four members each to sort the 78 learning outcomes into categories. Each group discussed and named its categories independently. A total of 93 category titles were generated across groups (Table 5), and these titles were analyzed to identify commonalities. There was overlap, but groups defined their categories at different levels of detail; for example, a single category generated by one group could align with multiple categories generated by another group. Comparisons of the learning outcomes each group assigned to its categories were used to understand how categories across different groups were related and to identify consensus categories across groups. Based on these analyses, the research team defined nine integrated preliminary categories (Table 5). Results from the individual groups are available in S6 Appendix.

Table 5. Evolution of Area of Researcher Development Categories.

Columns: Open Card Sort Categories Generated by Different Groups; Integrated Preliminary Categories; Final Categories After Closed Sort
• Research Content Knowledge
• Foundational knowledge
• Expertise – Early Stage
• Discipline/ Disciplinary Knowledge
• Knowledge/ breadth
Foundational Disciplinary Knowledge
• Research Interpretation
• Synthesis
• Interpretation
• Expertise – Late Stage
• Experimental Design & Analysis
• Research Skills Related to Overall Discipline/ Research Vision
• Critical thinking
• Research independence/ novel research questions
• Basic Research Skills
• Research process – conducting research & synthesizing new knowledge
Research Thinking and Reasoning Skills (final category: Practical and Cognitive Research Skills)
• Research Tech Skills
• Planning/ hypothesis
• Technical Competency
• Expertise – Mid Stage
• Foundational Research Skills
• Technical Research Skills
• How to Design and Execute an Experiment or Project
• Technical
• Analysis/technical skills
Practical Research Skills (final category: Practical and Cognitive Research Skills)
• Ethics/ Ethical Practices/Research ethics
• Ethical/ Professional Behavior
• Guidelines/ Professional Ethics
• Safety
• Responsible Conduct of research/ Research Integrity
• Broader Impacts
Ethical and Responsible Research Practices
• Translational/ Public Science
• Professional Communication
• Communication & Sharing to Society
• Public/ Broad Communication
• Research Communication
• Scientific Communication
• Scientific Communication to non-scientific community
Research Communication Skills
• Social aspects/ lab culture
• Communication w/ others
• Access/Inclusion
• Communication/ Feedback
• Teams/ Group Networking
• Mentoring
• People (Interpersonal)
• Collaborative Research
• Promoting Equality in Research
• Professional Collaboration
• Mentorship
• Professional Skills
• Inclusion
• Collaboration
• Management
• Communication and collaboration
• Mentorship soft skills management
• Advocacy
• Collaborative Teams
• Leadership
• Interpersonal & research Management
Interpersonal Skills as a Researcher (final category: Interpersonal Research Skills)
• Self-efficacy
• Affective
• Agency, Grit, Ownership, Resilience
• Grit/Self-Regulation
• Attitude/Personal Development
• Self-Management
• Self-efficacy & Research identity
Personal Attributes as a Researcher (final category: Researcher Self-Beliefs and Attitudes)
• Career exploration
• Career
• Research/Career Independence
• Career Pathway
• Personal/Career Knowledge Development
• Career Development
Knowledge and Skills to Pursue a Research or Research-Related Career
• Administration
• Money/ Resources – Research Admin
• Finance Resources
• Research Management & Implementation
• Management/Administrative
• Funding
• Money
• Lab Management
• Funding
Knowledge and Skills to Manage Research Projects and Teams (final category: Knowledge and Skills to Administer and Manage Research Projects and Teams)

Closed sort to assign learning outcomes to categories

Faculty, research staff, postdoctoral scholars, graduate students, and undergraduate students participated in the closed card sorting exercise to provide feedback and assign the 78 learning outcomes to the 9 integrated preliminary categories generated by the open card sorting exercise (Table 5). Some participants engaged as part of in-person groups while others engaged individually with an online card sorting tool. Nine in-person groups with two to four participants each were held, and 46 individuals from across the country participated online. Data from the in-person groups were weighted by the number of individuals in the group and combined with data from the online individuals. For example, if four individuals participated in a focus group, that group’s card sort was given four times the weight of a card sort completed by a single online participant.
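As an illustrative sketch only (not the authors’ analysis code), the headcount weighting described above, together with the agreement banding used in Table 6 (strong at 75% or more, moderate at 50–75%, weak below 50%), can be expressed as follows. The function name and data structures are hypothetical:

```python
from collections import Counter

def tally_closed_sort(group_sorts, individual_sorts):
    """Tally category assignments for one learning outcome.

    group_sorts: list of (category, group_size) tuples -- each in-person
        group's consensus choice, weighted by its number of members.
    individual_sorts: list of categories chosen by online participants,
        each counted once.
    Hypothetical structures; illustrates the weighting scheme described
    in the text, not the authors' actual analysis code.
    """
    counts = Counter()
    for category, size in group_sorts:
        counts[category] += size  # weight group choice by headcount
    for category in individual_sorts:
        counts[category] += 1     # each online participant counts once
    total = sum(counts.values())
    category, n = counts.most_common(1)[0]  # modal category and its weight
    agreement = 100 * n / total
    # banding as in Table 6: strong >= 75%, moderate 50-75%, else weak
    band = "strong" if agreement >= 75 else "moderate" if agreement >= 50 else "weak"
    return category, agreement, band
```

For instance, one four-person group and one three-person group both choosing category A, plus two online participants choosing B, would give A an agreement of 7/9 (about 78%, a strong consensus).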

A map of the closed card sorting results is presented in Table 6. Overall, closed card sorting participants reported that it was sometimes challenging to assign a learning outcome to just one area. Consensus across groups for each learning outcome shown in Table 6 was considered strong if there was 75–100% agreement (dark blue), moderate if there was 50–75% agreement (light blue), and weak if there was less than 50% agreement (no shading). Assignment of learning outcomes between the “Research Thinking and Reasoning Skills” and “Practical Research Skills” categories showed substantial overlap. Therefore, these two categories were combined into one, “Practical and Cognitive Research Skills,” and the data from the two original categories were combined for analysis.

Table 6. Phase 2 Card Sorting Results.

Areas of Researcher Development and Learning Outcomes % Agreement
Foundational Disciplinary Knowledge
know the fundamental content in their discipline (e.g., frameworks, theories). 83.9%
can relate content knowledge from other disciplines to content knowledge in their discipline1. 53.2%
ground hypotheses and research questions in established disciplinary knowledge, theories, or frameworks. 46.8%
know the history of knowledge generation in their discipline. 85.5%
know the processes by which new knowledge is generated and evaluated in their discipline. 72.6%
Practical and Cognitive Research Skills
know assumptions and limitations in study designs (e.g., reporting uncertainty/error). 66.1%
select appropriate methods to investigate research questions in the discipline. 71.0%
draw conclusions from research results. 88.7%
use disciplinary theories, frameworks and models in analyzing the results of research studies. 62.9%
use logical and critical thinking in evaluating information in research (e.g., designing, conducting, defending, and reviewing research) 71.0%
consider alternative approaches and interpretations of research. 75.8%
set research goals. 54.8%
refine existing and/or contribute new disciplinary theories, frameworks, and models. 50.0%
connect diverse research ideas and approaches in novel ways. 69.4%
interpret or synthesize research results. 79.0%
can interpret the results of analyses of data (e.g., coding, mathematical and statistical calculations) 83.9%
can provide a logical rationale for their study designs. 67.7%
recognize the inferences and implications of research findings on and beyond the discipline1. 38.7%
identify gaps in existing knowledge or research results to investigate. 56.5%
use disciplinary theories, frameworks and models in designing research studies. 53.2%
formulate hypotheses and research questions that can be systematically tested or investigated. 74.2%
have up-to-date technical skills to conduct research in the discipline. 79.0%
use troubleshooting skills to address theoretical or technical problems in research. 88.7%
can use tools and databases to search the disciplinary literature. 61.3%
select the appropriate analytic and statistical methods used in the discipline. 83.9%
use literature search strategies that identify relevant prior research. 53.2%
develop new data collection or analytical methods when needed to address novel research questions. 82.3%
Ethical and Responsible Research Practices
recognize and minimize potential conflicts of interest in research 87.1%
know and follow disciplinary norms and policies regarding credit for contributions to research (e.g., citing previous research, authorship order, acknowledging work). 72.6%
understand how system structures provide differential access to participation in research2. 30.6%
follow standard protocols to document and securely store research data1. 46.8%
consider the implications of research to individuals and society. 54.8%
recognize instances of research misconduct and take steps to address them 82.3%
consider the role of social and cultural factors in research. 61.3%
know and follow guidelines for ethical treatment of research subjects (e.g., individuals, communities, animals, etc.) 83.9%
know and follow guidelines for research rigor and reproducibility in your discipline 62.9%
know and follow disciplinary data ownership/stewardship practices 69.4%
follow research safety regulations. 59.7%
act to increase access to research for all. 27.4%
Research Communication Skills
use disciplinary conventions to communicate research effectively (e.g., ideas, results, implications) orally (e.g., conference presentations, invited talks, research team meetings). 85.5%
construct appropriate ways to present and visualize data. 61.3%
promote and advocate for research within the institution, the discipline, and through interactions with public stakeholders2. 46.8%
are able to translate research findings into policies, practices, and daily life. 54.8%
can translate research (e.g., ideas, results, implications) and engage with audiences outside of their research discipline (e.g., to scholars in other disciplines, non-research, or general audiences). 77.4%
use disciplinary conventions to communicate research effectively (e.g., ideas, results, implications) in writing (e.g., research articles, grant proposals, policy briefs). 67.7%
Interpersonal Skills as a Researcher
accept, interpret, and modify their research based on constructive criticism and feedback from colleagues. 35.5%
make meaningful contributions to collaborative research projects. 58.1%
consider and include multiple perspectives in decision making. 54.8%
use appropriate and effective interpersonal communication practices with research colleagues. 77.4%
understand and conduct themselves in accordance with the cultural and social norms of professionals in the discipline. 45.2%
work effectively with others on collaborative and/or interdisciplinary teams. 80.6%
express respect for others’ differences. 69.4%
are able to network with other research professionals. 64.5%
provide critical and constructive feedback on research to colleagues. 62.9%
are able to manage difficult conversations and conflicts with research colleagues. 62.9%
Personal Attributes as a Researcher
develop confidence in their capability to successfully conduct research. 77.4%
engage in practices that support work-life balance (e.g., time management, pursuing interests beyond research) 69.4%
persevere when problems or challenges arise in research (e.g., unexpected, ambiguous or uncertain results, failed projects) 80.6%
are able to accurately self-assess their strengths and weaknesses. 83.9%
develop attitudes about research that support success in research. 64.5%
are able to recognize and manage their feelings and behaviors in the research environment. 75.8%
identify themselves as a researcher or expert in their discipline. 79.0%
work at an appropriate level of independence. 61.3%
express curiosity in exploring and conducting research. 67.7%
self-advocate and take responsibility when working with mentors to set research goals and secure the guidance and resources needed to achieve those goals. 66.1%
are able to manage time and meet research project milestones in a timely manner2. 41.9%
Knowledge and Skills to Pursue a Research or Research-Related Career
are aware of career pathways related to their research training. 88.7%
are prepared to pursue research career pathways. 83.9%
can translate and apply research skills and knowledge across career pathways. 72.6%
Knowledge and Skills to Manage Research Projects and Teams
identify and clarify a long-term strategic vision for research1. 35.5%
mentor other developing researchers using best practices in mentoring. 51.6%
can estimate the funds needed to conduct research. 67.7%
are able to manage heterogenous research teams. 82.3%
know how research is funded in the discipline2. 29.0%
identify opportunities and make decisions about the research to be done2. 27.4%
can track research expenditures. 74.2%
have the administrative skills to manage research projects, personnel, and support staff. 77.4%
can secure funding to conduct research3. 45.2%

Note: 1 = learning outcome was split into multiple learning outcomes; 2 = learning outcome was revised; 3 = learning outcome was removed.

After the combined “Practical and Cognitive Research Skills” category was created, analysis of the raw data showed strong agreement for 8 (36%) of its 22 learning outcomes and moderate agreement for 13 (59%). These were assigned to the category, and no further modifications were made to them. Learning outcomes across all categories that showed weak agreement were reviewed by the research team. Those that appeared to have been misinterpreted, based on observations during in-person card sorting sessions or comments shared by online participants, were revised to clarify their meaning. Those that generated equally valid but different interpretations by different groups or individuals were divided into two separate learning outcomes. Development and refinement of the core learning outcomes are detailed in the S4 Appendix.

In addition to combining the two areas of researcher development, the names of three other areas of researcher development were modified based on the closed card sort participant feedback. “Interpersonal Skills as a Researcher” was modified to “Interpersonal Research Skills” to clarify that this category includes learning outcomes focused on a researcher’s ability to interact with other researchers. “Personal Attributes as a Researcher” was modified to “Researcher Self-Beliefs and Attitudes” based on feedback received that the word “attribute” suggested that these were unchangeable traits, rather than psychosocial skills that individuals can develop. Finally, “Knowledge and Skills to Manage Research Projects and Teams” was modified to “Knowledge and Skills to Administer and Manage Research Projects and Teams” to clarify that this category included learning outcomes needed for administration of research. The last column in Table 5 shows the final categories. The final CRDF with the categories and their nested core learning outcomes is in Table 1.

  • 5. Back Map Source Framework Elements to Learning Outcomes and Categories to Confirm Coverage

Researchers back mapped the elements derived from the original source frameworks to the final CRDF core learning outcomes and areas of researcher development (categories) to confirm that all were covered in the final CRDF. Table 7 reports the percentage of CRDF core learning outcomes in each area of researcher development addressed by the original source frameworks. These percentages were calculated by mapping each source element to one or more core learning outcomes. The detailed map of each element to core learning outcome(s) is in the S7 Appendix.
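In principle, the Table 7 percentages follow directly from such a mapping. The sketch below is a hypothetical illustration of that calculation (the function name and data structures are ours, not the authors’): for one source framework and one area of researcher development, it computes the fraction of the area’s core learning outcomes touched by at least one mapped element.

```python
def area_coverage(element_map, area_outcomes):
    """Percentage of an area's core learning outcomes addressed by one source framework.

    element_map: dict mapping a source-framework element id to the set of
        CRDF outcome ids it was mapped to (one or more per element).
    area_outcomes: set of outcome ids belonging to one area of researcher
        development.
    Hypothetical structures; illustrates the Table 7 calculation described
    in the text, not the authors' actual analysis code.
    """
    covered = set()
    for outcomes in element_map.values():
        covered |= outcomes & area_outcomes  # keep only this area's outcomes
    return 100 * len(covered) / len(area_outcomes)
```

For example, if a framework's elements map to two of an area's four outcomes, the coverage reported would be 50.0%.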

Table 7. Original Framework Elements Mapped to Areas of Researcher Development.

Framework Number Framework Article Foundational Disciplinary Knowledge Practical and Cognitive Research Skills Ethical and Responsible Research Practices Research Communication Skills Interpersonal Research Skills Researcher Self-Beliefs and Attitudes Knowledge/Skills to Pursue a Research or Research-Related Career Knowledge/Skills to Administer and Manage Research Projects/Teams
2 Competency-based assessment for the training of PhD students and early-career scientists. [23] 50.0% 29.2% 33.3% 50.0% 30.0% 20.0% 50.0% 42.9%
3 Researcher Skill Development Framework (US English Edition) [24] 0.0% 25.0% 8.3% 33.3% 20.0% 0.0% 0.0% 14.3%
4 National Postdoctoral Association Core Competencies [5] 33.3% 33.3% 58.3% 66.7% 60.0% 20.0% 25.0% 71.4%
5 The Basic Competencies of Biological Experimentation: Concept-Skill Statements. [25] 50.0% 70.8% 33.3% 66.7% 0.0% 0.0% 0.0% 14.3%
6 Development and National Validation of a Tool for Interpreting the Vision and Change Core Competencies [26] 50.0% 79.2% 91.7% 100.0% 50.0% 20.0% 0.0% 0.0%
7 Qualitative Investigation to Identify the Knowledge and Skills That U.S.-Trained Doctoral Chemists Require in Typical Chemistry Positions [27] 16.7% 29.2% 33.3% 50.0% 30.0% 20.0% 0.0% 28.6%
8 Assessment in Undergraduate Research: The EvaluateUR Method [28] 66.7% 33.3% 25.0% 33.3% 60.0% 50.0% 75.0% 0.0%
9 Entering Research Learning Assessment (ERLA): Validity Evidence for an Instrument to Measure Undergraduate and Graduate Research Trainee Development. [15] 16.7% 20.8% 16.7% 100.0% 30.0% 30.0% 75.0% 28.6%
10 Towards a framework for research career development: An evaluation of the UK’s Vitae Researcher Development Framework. [29] 16.7% 8.3% 0.0% 33.3% 20.0% 10.0% 50.0% 42.9%
11 A competency framework for Ph.D. programs in health information management. [30] 0.0% 25.0% 0.0% 33.3% 0.0% 0.0% 0.0% 0.0%
12 Research competencies for undergraduate rehabilitation students: a scoping review [31] 50.0% 45.8% 25.0% 50.0% 10.0% 30.0% 0.0% 14.3%
13 Competency-based postdoctoral research training for clinical psychologists: An example and implications. [32] 50.0% 12.5% 16.7% 50.0% 10.0% 10.0% 0.0% 42.9%
14 Commonly known, commonly not known, totally unknown: a framework for students becoming researchers [33] 0.0% 20.8% 0.0% 33.3% 0.0% 0.0% 0.0% 0.0%
15 Development of the Scientific Research Competency Scale for nurses [34] 33.3% 37.5% 50.0% 33.3% 20.0% 50.0% 25.0% 57.1%
16 Social-Scientific Research Competency. [35] 0.0% 29.2% 0.0% 0.0% 0.0% 20.0% 25.0% 28.6%
17 Evaluating the development of chemistry undergraduate researchers’ scientific thinking skills using performance-data: first findings from the performance assessment of undergraduate research (PURE) instrument. [36] 16.7% 37.5% 0.0% 16.7% 0.0% 0.0% 0.0% 0.0%
18 Assessment of Undergraduate Research Learning Outcomes: Poster Presentations as Artifacts. [37] 0.0% 12.5% 16.7% 33.3% 10.0% 10.0% 0.0% 0.0%
19 Objectivity of the subjective quality: Convergence on competencies expected of doctoral graduates [38] 0.0% 0.0% 0.0% 16.7% 0.0% 10.0% 0.0% 0.0%
20 Experiences of using the researching professional development framework [39] 0.0% 4.2% 0.0% 50.0% 10.0% 20.0% 0.0% 14.3%
21 Developing a Competency Framework for Population Health Graduate Students Through Student and Faculty Collaboration. [40] 16.7% 0.0% 0.0% 83.3% 0.0% 0.0% 0.0% 14.3%
22 Professional learning and development framework for postdoctoral scholars [41] 0.0% 4.2% 8.3% 100.0% 20.0% 20.0% 50.0% 28.6%
23 Development and psychometric testing of the Research Competency Scale for Nursing Students: An instrument design study [42] 16.7% 29.2% 8.3% 33.3% 0.0% 0.0% 0.0% 0.0%
24 Bioinformatics core competencies for undergraduate life sciences education. [43] 33.3% 20.8% 25.0% 0.0% 0.0% 0.0% 0.0% 0.0%
25 A systematic review of doctoral graduate attributes: Domains and definitions [44] 33.3% 33.3% 25.0% 100.0% 20.0% 70.0% 0.0% 14.3%
26 A structured professional development curriculum for postdoctoral fellows leads to recognized knowledge growth [45] 0.0% 0.0% 8.3% 0.0% 10.0% 20.0% 0.0% 28.6%
27 Guidelines for competency development and measurement in rehabilitation psychology postdoctoral training [46] 16.7% 25.0% 16.7% 33.3% 0.0% 0.0% 0.0% 0.0%
28 Are You Doing It Backward? Improving Information Literacy Instruction Using the AALL Principles and Standards for Legal Research Competency, Taxonomies, and Backward Design. [47] 66.7% 58.3% 33.3% 50.0% 10.0% 10.0% 0.0% 28.6%
29 Evaluating research-oriented teaching: a new instrument to assess university students’ research competences. [48] 33.3% 29.2% 8.3% 33.3% 0.0% 0.0% 0.0% 0.0%
30 Research skills for university students’ thesis in E-learning: Scale development and validation in Peru [49] 16.7% 20.8% 0.0% 16.7% 0.0% 0.0% 0.0% 0.0%
31 Evaluating Undergraduate Research Experiences—Development of a Self-Report Tool. [50] 66.7% 41.7% 0.0% 50.0% 20.0% 10.0% 0.0% 0.0%
32 Evaluating a Summer Undergraduate Research Program: Measuring Student Outcomes and Program Impact. [51] 50.0% 41.7% 25.0% 33.3% 40.0% 50.0% 0.0% 0.0%
33 Threshold concepts in research education and evidence of threshold crossing [52] 33.3% 20.8% 0.0% 0.0% 0.0% 0.0% 0.0% 0.0%
34 Faculty Mentors’, Graduate Students’, and Performance-Based Assessments of Students’ Research Skill Development [53] 16.7% 25.0% 0.0% 0.0% 0.0% 0.0% 0.0% 14.3%
35 Postdocs’ lab engagement predicts trajectories of PhD students’ skill development. [54] 0.0% 29.2% 8.3% 33.3% 0.0% 0.0% 0.0% 0.0%
36 Development of the Research Competencies Scale [55] 50.0% 50.0% 50.0% 33.3% 0.0% 0.0% 0.0% 0.0%
37 Competency-Based Postdoctoral Education [56] 16.7% 33.3% 8.3% 66.7% 10.0% 0.0% 0.0% 14.3%
38 An Exploratory Investigation of the Research Self-Efficacy, Interest in Research, and Research Knowledge of Ph.D. in Education Students [57] 16.7% 12.5% 8.3% 33.3% 0.0% 0.0% 0.0% 0.0%
39 Inquiry Experiences to the NACE Career Readiness Competencies [58] 0.0% 29.2% 25.0% 66.7% 50.0% 40.0% 25.0% 14.3%
40 Development and implementation of a competency-based module for teaching research methodology to medical undergraduates [59] 0.0% 33.3% 8.3% 16.7% 20.0% 0.0% 0.0% 0.0%
41 Development of a structured undergraduate research experience: Framework and implications. [60] 0.0% 8.3% 0.0% 33.3% 20.0% 10.0% 0.0% 28.6%
42 Criteria for academic bachelor’s and master’s curricula. [61] 100.0% 62.5% 16.7% 50.0% 30.0% 70.0% 25.0% 0.0%
43 Doctoral conceptual thresholds in cellular and molecular biology. [62] 0.0% 8.3% 0.0% 0.0% 0.0% 0.0% 0.0% 0.0%
44 Toward a conceptual framework for measuring the effectiveness of course-based undergraduate research experiences in undergraduate biology. [63] 16.7% 4.2% 0.0% 100.0% 10.0% 0.0% 0.0% 0.0%
45 Building sustainability research competencies through scaffolded pathways for undergraduate research experience. [64] 16.7% 4.2% 0.0% 0.0% 20.0% 10.0% 0.0% 14.3%
46 Developing Research Competence to Support Evidence-Based Practice. [65] 0.0% 33.3% 0.0% 66.7% 0.0% 0.0% 0.0% 14.3%
47 Evaluator Competencies: What’s Taught Versus What’s Sought. [66] 16.7% 25.0% 0.0% 66.7% 0.0% 0.0% 0.0% 14.3%
48 Evaluating Mastery of Biostatistics for Medical Researchers: Need for a New Assessment Tool [67] 0.0% 29.2% 8.3% 33.3% 0.0% 0.0% 0.0% 0.0%
49 Application of the competency model to clinical health psychology [68] 33.3% 12.5% 8.3% 33.3% 20.0% 0.0% 0.0% 0.0%
50 Competency-Based Veterinary Education and Assessment of the Professional Competencies [69] 16.7% 4.2% 0.0% 0.0% 0.0% 20.0% 0.0% 0.0%
51 Climbing the stairway to competency: Trainee perspectives on competency development. [70] 16.7% 29.2% 0.0% 33.3% 0.0% 0.0% 0.0% 0.0%
52 AMIA Board white paper: definition of biomedical informatics and specification of core competencies for graduate education in the discipline [71] 66.7% 41.7% 8.3% 16.7% 10.0% 0.0% 0.0% 42.9%
53 Building Interdisciplinary Research Models: A Didactic Course to Prepare Interdisciplinary Scholars and Faculty [72] 16.7% 12.5% 0.0% 66.7% 60.0% 0.0% 0.0% 0.0%
54 Applying the Cube Model to Pediatric Psychology: Development of Research Competency Skills at the Doctoral Level [73] 66.7% 20.8% 16.7% 33.3% 30.0% 0.0% 0.0% 0.0%
55 Information Literacy Competency Standards for Higher Education. [74] 16.7% 50.0% 41.7% 50.0% 10.0% 0.0% 0.0% 14.3%
56 Developing a Scoring Rubric for Resident Research Presentations: A Pilot Study [75] 0.0% 16.7% 0.0% 16.7% 10.0% 0.0% 0.0% 0.0%
57 Core Competencies for Research Training in the Clinical Pharmaceutical Sciences [76] 33.3% 25.0% 16.7% 16.7% 0.0% 0.0% 0.0% 14.3%

Discussion

To our knowledge, the CRDF presented here is the first published framework that synthesizes common outcomes across multiple disciplines and addresses undergraduate through postdoctoral career stages. The CRDF will play an important role in standardizing programmatic, assessment, and evaluation efforts, as well as in demystifying the researcher development process for mentees and mentors. Though feedback and evidence of validity were gathered from the research community in the United States, 34% of the source frameworks were published by scholars outside the United States, suggesting that the CRDF will be applicable globally. Below we detail the needs that this new framework will address and how it will benefit the research community at large.

The CRDF will identify focus areas and potential gaps in existing frameworks

Currently available frameworks tend to address only a subset of the areas of researcher development and to be discipline- and/or career-stage-specific. Focus areas and gaps in the 56 frameworks used to develop the CRDF are revealed in Table 7, where the elements of these source frameworks are mapped to the eight areas of researcher development, and in the S7 Appendix, where they are mapped in greater detail to the 79 individual learning outcomes. For example, the map in Table 7 reveals that several frameworks focus on Research Communication Skills, while far fewer address developing Knowledge and Skills to Pursue a Research or Research-Related Career. Research training program directors can compare any framework they are using to design and/or evaluate their training programs against the CRDF to identify focus areas and gaps they may need to address.

The CRDF will align the training and performance expectations of multiple stakeholders engaged in research training

The learning outcomes in the CRDF are meant to represent a relative consensus across disciplines, promoting a shared understanding among the multiple stakeholders involved in developing researchers and the research workforce (e.g., mentees, mentors, training programs, disciplinary communities, funders) of the knowledge, skills, and psychosocial attitudes, behaviors, and beliefs that should be developed through research training. The core learning outcomes provide the means to track mentee development across training career stages and disciplines, as well as the structure needed to coordinate training across institutions. Training programs that use the CRDF to design their programs and assess the development of their mentees will be better positioned to collaborate, to support mentees in transition from one program to the next, and thereby to contribute to national efforts to develop the research workforce. Those in STEM will also be better positioned to address the recommendations outlined in two recent National Academies of Sciences, Engineering, and Medicine (NASEM) reports on undergraduate [3] and graduate research [4] training, which were referenced in developing the CRDF.

The CRDF will level the playing field for mentees by making the expectations in research training transparent

The lack of structure in research learning experiences, especially apprentice-style research learning experiences, has created an irregular and hidden curriculum that disadvantages students with limited research backgrounds (e.g., first-generation college students). Novice research mentees often lack the knowledge and social capital they need to successfully navigate the research environment and are consequently more likely to struggle to meet expectations and to navigate unanticipated challenges throughout their training journey [77–80]. The core learning outcomes in the CRDF clarify what mentees should be learning during formal research training and can therefore empower them to meet expectations and successfully navigate the research training environment. The core outcomes also build mentees’ agency to take responsibility for designing their training journey by allowing them to self-assess their progress and to self-advocate for learning experiences that will support achievement of the learning outcomes. The CRDF can be shared with research mentees by including it in a student handbook or by using the Researcher Development Plan tool provided in the S2a and S2b Appendix, which can be implemented in conjunction with Individual Development Plans and Mentor-Mentee Compacts.

The CRDF will support the development of common metrics to measure and understand how researchers develop across training programs

Core learning outcomes synthesized in the CRDF provide the basis for developing measurement tools such as learning assessments, rubrics, and interview/focus group protocols that can be used in program evaluation and in basic research on research training and researcher development. Evaluators can use the data generated by common metrics to provide targeted feedback to research training program directors and mentors about the efficacy of the research learning experiences they are providing, guiding the continuous improvement of specific learning experiences for mentees and research training programs [81]. Researchers can use the tools in their investigations of researcher development and research learning experiences across multiple training sites, thus potentially discovering causal relationships and generalizable knowledge about researcher development that can be applied broadly. Common metrics used across multiple sites allow investigation of the mechanisms by which research training environments, specific learning experiences, and individual mentee characteristics contribute to or inhibit researcher development. Even when common metrics or frameworks are not used across programs, the CRDF may allow researchers and practitioners to align outcomes from different studies to investigate the impact of research training programs [3,82–84].

Limitations

We were able to gather only limited input on the CRDF from the arts and humanities research community. Therefore, it is difficult to draw conclusions about the use of the framework in this community or about the disciplinary differences we observed between this community and the sciences. Nonetheless, since arts and humanities frameworks were used to construct the CRDF and we are confident in the feedback we did receive, we believe the CRDF can be used in these disciplines. Further testing is needed to verify the framework’s applicability in the arts and humanities or to highlight areas for revision.

The majority, though not all, of the content validity data reported here came from one large research university. Likewise, there was a lack of racial/ethnic diversity in the national survey respondent pool and among the card sorting activity participants for whom we have this information. Therefore, as the framework is implemented across institutions, gaps resulting from the lack of diversity in the pool of researchers who provided input may emerge, and modifications may be needed to make it more universally applicable.

Conclusion

We were able to define and document consensus across disciplines on core learning outcomes that articulate the knowledge, skills, and psychosocial attitudes, behaviors, and beliefs needed to become a researcher. The resulting comprehensive framework, the CRDF, transcends disciplines and formal training career stages. Adoption and adaptation of the CRDF will not only support individual research mentees and improvement of individual research training programs but also facilitate coordination of research workforce development across programs and training stages. The S2a and S2b Appendix includes two tools to facilitate CRDF use: 1) a tool for research training program directors to map their current program activities and assessments to identify potential gaps; and 2) a tool to complement individual development plans for mentees to use with their mentors and thesis committees in planning and mapping their development as a researcher over time.

Future directions

Our research team is committed to developing evaluation and assessment rubrics and instruments based on the CRDF. We welcome collaborators on this work and invite those interested to contact us.

Supporting information

S1 Appendix. Research Development Frameworks and Assessments included in Literature Review.

(PDF)

S2a Appendix. Research Training Program Development and Mapping Tool.

(XLSX)

S2b Appendix. Researcher Development Plan (RDP) Tool.

(XLSX)

S3 Appendix. Code Themes, Definitions and Assignments.

(PDF)

S4 Appendix. Evolution of Learning Outcomes for CRDF.

(PDF)

S5 Appendix. Dissemination Contacts for National Survey.

(PDF)

S6 Appendix. Open Card Sorting Results.

(PDF)

S7 Appendix. Framework Mapping to CRDF.

(XLSX)


Acknowledgments

We thank undergraduate student Shefali Bhatt for her help with the literature search and our colleagues Drs. Amber Smith, Melissa McDaniels, Christine Pfund, and Fatima Sancheznieto for their feedback on a preliminary draft of this manuscript.

Data Availability

All relevant data are within the paper and its Supporting Information files.

Funding Statement

The author(s) received no specific funding for this work.

References

  • 1.Sanford N. Where Colleges Fail: A study of the student as a person. San Francisco: Jossey-Bass; 1967. [Google Scholar]
  • 2.Sanford N. Self and Society. Routledge. 2017. doi: 10.4324/9781315129112 [DOI] [Google Scholar]
  • 3.Gentile J, Brenner K, Stephens A, editors. Undergraduate Research Experiences for STEM Students: Successes, Challenges, and Opportunities. Washington, D.C.: National Academies Press; 2017. doi: 10.17226/24622 [DOI] [Google Scholar]
  • 4.Leshner A, Scherer L, editors. Graduate STEM Education for the 21st Century. Washington, D.C.: National Academies Press; 2018. doi: 10.17226/25038 [DOI] [Google Scholar]
  • 5.NPA Core Competencies. [cited 25 Apr 2025]. Available: https://www.nationalpostdoc.org/page/CoreCompetencies
  • 6.Research Training Framework for Doctoral Students. 30 Oct 2014 [cited 25 Jun 2025]. Available: https://www.ukri.org/publications/research-training-framework-for-doctoral-students/
  • 7.Shulman LS. Ways of seeing, ways of knowing: ways of teaching, ways of learning about teaching. Journal of Curriculum Studies. 1991;23(5):393–5. doi: 10.1080/0022027910230501 [DOI] [Google Scholar]
  • 8.Nicole F, Deboer J. A Systematized Literature Review of the Factors that Predict the Retention of Racially Minoritized Students in STEM Graduate Degree Programs. 2020. Available: https://peer.asee.org/a-systematized-literature-review-of-the-factors-that-predict-the-retention-of-racially-minoritized-students-in-stem-graduate-degree-programs
  • 9.Young SN, Vanwye WR, Schafer MA, Robertson TA, Poore AV. Factors Affecting PhD Student Success. Int J Exerc Sci. 2019;12(1):34–45. doi: 10.70252/CEJT2520 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Sverdlik A, Hall NC, McAlpine L, Hubbard K. The PhD Experience: A Review of the Factors Influencing Doctoral Students’ Completion, Achievement, and Well-Being. IJDS. 2018;13:361–88. doi: 10.28945/4113 [DOI] [Google Scholar]
  • 11.Mbonyiryivuze A, Dorimana A, Nyirahabimana P, Nsabayezu E. Challenges Affecting Women PhD Candidates for Completion of Doctoral Educations: A Synthesis of the Literature. African Journal of Educational Studies in Mathematics and Sciences. 2023;19:123–34. [Google Scholar]
  • 12.Muchaku S, Mwale M, Magaiza G, Tjale MM. No doctoral studies without hurdles: A review on pathways to prevent dropouts. IJER. 2024;6:1–12. doi: 10.38140/ijer-2024.vol6.14 [DOI] [Google Scholar]
  • 13.Rigler KL, Bowlin LK, Sweat K, Watts S, Throne R. Agency, Socialization, and Support: A Critical Review of Doctoral Student Attrition. Online Submission. 2017. Available: https://eric.ed.gov/?id=ED580853 [Google Scholar]
  • 14.Layton RL, Brandt PD, Freeman AM, Harrell JR, Hall JD, Sinche M. Diversity Exiting the Academy: Influential Factors for the Career Choice of Well-Represented and Underrepresented Minority Scientists. CBE Life Sci Educ. 2016;15(3):ar41. doi: 10.1187/cbe.16-01-0066 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Butz AR, Branchaw JL. Entering Research Learning Assessment (ERLA): Validity Evidence for an Instrument to Measure Undergraduate and Graduate Research Trainee Development. CBE Life Sci Educ. 2020;19(2):ar18. doi: 10.1187/cbe.19-07-0146 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Luft JA, Jeong S, Idsardi R, Gardner G. Literature Reviews, Theoretical Frameworks, and Conceptual Frameworks: An Introduction for New Biology Education Researchers. CBE Life Sci Educ. 2022;21(3):rm33. doi: 10.1187/cbe.21-05-0134 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Cooper HM. Synthesizing research: A guide for literature reviews. Sage; 1998. [Google Scholar]
  • 18.American Educational Research Association, American Psychological Association, National Council on Measurement in Education, editors. Standards for Educational and Psychological Testing. Washington, D.C: American Educational Research Association; 2014. [Google Scholar]
  • 19.Creswell JW. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. 3rd ed. Thousand Oaks, CA: Sage; 2009. [Google Scholar]
  • 20.Spencer D. Card Sorting: Designing Usable Categories. Rosenfeld; 2009. Available: https://rosenfeldmedia.com/books/card-sorting/
  • 21.Spencer D. Card sorting analysis spreadsheet | Maadmob. 2007. Available: https://maadmob.com.au/resources/card_sort_analysis_spreadsheet
  • 22.UX Testing & Research Tools | Proven By Users. [cited 22 Apr 2025]. Available: https://provenbyusers.com/
  • 23.Verderame MF, Freedman VH, Kozlowski LM, McCormack WT. Competency-based assessment for the training of PhD students and early-career scientists. Elife. 2018;7:e34801. doi: 10.7554/eLife.34801 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Willison J, O’Regan K, Kuhn SK. Researcher Skill Development Framework (US English Edition). 2018. [Google Scholar]
  • 25.Pelaez N, Anderson T, Gardner S, Yin Y, Abraham J, Bartlett E, et al. The Basic Competencies of Biological Experimentation: Concept-Skill Statements. PIBERG Instructional Innovation Materials. 2016. Available: https://docs.lib.purdue.edu/pibergiim/4
  • 26.Clemmons AW, Timbrook J, Herron JC, Crowe AJ. BioSkills Guide: Development and National Validation of a Tool for Interpreting the Vision and Change Core Competencies. CBE Life Sci Educ. 2020;19(4):ar53. doi: 10.1187/cbe.19-11-0259 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Cui Q, Harshman J. Qualitative Investigation to Identify the Knowledge and Skills That U.S.-Trained Doctoral Chemists Require in Typical Chemistry Positions. J Chem Educ. 2020;97(5):1247–55. doi: 10.1021/acs.jchemed.9b01027 [DOI] [Google Scholar]
  • 28.Singer J, Weiler D, Zimmerman B, Fox S, Ambos E. Assessment in Undergraduate Research. Earth Sciences Faculty Publications. 2022. doi: 10.1017/9781108869508 [DOI] [Google Scholar]
  • 29.Bray R, Boon S. Towards a framework for research career development. International Journal for Researcher Development. 2011;2(2):99–116. doi: 10.1108/17597511111212709 [DOI] [Google Scholar]
  • 30.Ahmadi M, Sheikhtaheri A, Tahmasbi F, Eslami Jahromi M, Rangraz Jeddi F. A competency framework for Ph.D. programs in health information management. Int J Med Inform. 2022;168:104906. doi: 10.1016/j.ijmedinf.2022.104906 [DOI] [PubMed] [Google Scholar]
  • 31.Charumbira MY, Berner K, Louw QA. Research competencies for undergraduate rehabilitation students: A scoping review. AJHPE. 2021;13(1):52. doi: 10.7196/ajhpe.2021.v13i1.1229 [DOI] [Google Scholar]
  • 32.Drotar D, Cortina S, Crosby LE, Hommel KA, Modi AC, Pai ALH. Competency-based postdoctoral research training for clinical psychologists: An example and implications. Training and Education in Professional Psychology. 2015;9(2):92–8. doi: 10.1037/tep0000032 [DOI] [Google Scholar]
  • 33.Willison J, O’Regan K. Commonly known, commonly not known, totally unknown: a framework for students becoming researchers. Higher Education Research & Development. 2007;26(4):393–409. doi: 10.1080/07294360701658609 [DOI] [Google Scholar]
  • 34.Duru P, Örsal Ö. Development of the Scientific Research Competency Scale for nurses. J Res Nurs. 2021;26(7):684–700. doi: 10.1177/17449871211020061 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Gess C, Geiger C, Ziegler M. Social-Scientific Research Competency. European Journal of Psychological Assessment. 2019;35(5):737–50. doi: 10.1027/1015-5759/a000451 [DOI] [Google Scholar]
  • 36.Harsh J, Esteb JJ, Maltese AV. Evaluating the development of chemistry undergraduate researchers’ scientific thinking skills using performance-data: first findings from the performance assessment of undergraduate research (PURE) instrument. Chem Educ Res Pract. 2017;18(3):472–85. doi: 10.1039/c6rp00222f [DOI] [Google Scholar]
  • 37.Hayes-Harb R, St. Andre M, Shannahan M. Assessment of Undergraduate Research Learning Outcomes: Poster Presentations as Artifacts. SPUR. 2020;3(4):55–61. doi: 10.18833/spur/3/4/10 [DOI] [Google Scholar]
  • 38.Kariyana I, Sonn RA, Marongwe N. Objectivity of the subjective quality: Convergence on competencies expected of doctoral graduates. Cogent Education. 2017;4(1):1390827. doi: 10.1080/2331186x.2017.1390827 [DOI] [Google Scholar]
  • 39.Lindsay H, Floyd A. Experiences of using the researching professional development framework. SGPE. 2019;10(1):54–68. doi: 10.1108/sgpe-02-2019-049 [DOI] [Google Scholar]
  • 40.Miller L, Brushett S, Ayn C, Furlotte K, Jackson L, MacQuarrie M, et al. Developing a Competency Framework for Population Health Graduate Students Through Student and Faculty Collaboration. Pedagogy in Health Promotion. 2019;7(3):280–8. doi: 10.1177/2373379919859607 [DOI] [Google Scholar]
  • 41.Nowell L, Dhingra S, Kenny N, Jacobsen M, Pexman P. Professional learning and development framework for postdoctoral scholars. SGPE. 2021;12(3):353–70. doi: 10.1108/sgpe-10-2020-0067 [DOI] [Google Scholar]
  • 42.Qiu C, Feng X, Reinhardt JD, Li J. Development and psychometric testing of the Research Competency Scale for Nursing Students: An instrument design study. Nurse Educ Today. 2019;79:198–203. doi: 10.1016/j.nedt.2019.05.039 [DOI] [PubMed] [Google Scholar]
  • 43.Wilson Sayres MA, Hauser C, Sierk M, Robic S, Rosenwald AG, Smith TM, et al. Bioinformatics core competencies for undergraduate life sciences education. PLoS One. 2018;13(6):e0196878. doi: 10.1371/journal.pone.0196878 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44.Senekal JS, Munnik E, Frantz JM. A systematic review of doctoral graduate attributes: Domains and definitions. Front Educ. 2022;7. doi: 10.3389/feduc.2022.1009106 [DOI] [Google Scholar]
  • 45.Steen K, Vornhagen J, Weinberg ZY, Boulanger-Bertolus J, Rao A, Gardner ME, et al. A structured professional development curriculum for postdoctoral fellows leads to recognized knowledge growth. PLoS One. 2021;16(11):e0260212. doi: 10.1371/journal.pone.0260212 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46.Stiers W, Barisa M, Stucky K, Pawlowski C, Van Tubbergen M, Turner AP, et al. Guidelines for competency development and measurement in rehabilitation psychology postdoctoral training. Rehabil Psychol. 2015;60(2):111–22. doi: 10.1037/a0038353 [DOI] [PubMed] [Google Scholar]
  • 47.Talley NB. Are you doing it backward? Improving information literacy instruction using the AALL principles and standards for legal research competency, taxonomies, and backward design. Law Libr J. 2014;106:47–68. [Google Scholar]
  • 48.Böttcher F, Thiel F. Evaluating research-oriented teaching: a new instrument to assess university students’ research competences. High Educ. 2017;75(1):91–110. doi: 10.1007/s10734-017-0128-y [DOI] [Google Scholar]
  • 49.Ipanaqué-Zapata M, Figueroa-Quiñones J, Bazalar-Palacios J, Arhuis-Inca W, Quiñones-Negrete M, Villarreal-Zegarra D. Research skills for university students’ thesis in E-learning: Scale development and validation in Peru. Heliyon. 2023;9(3):e13770. doi: 10.1016/j.heliyon.2023.e13770 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 50.Maltese A, Harsh J, Jung E. Evaluating Undergraduate Research Experiences—Development of a Self-Report Tool. Education Sciences. 2017;7(4):87. doi: 10.3390/educsci7040087 [DOI] [Google Scholar]
  • 51.Singer J, Zimmerman B. Evaluating a summer undergraduate research program: measuring student outcomes and program impact. Counc Undergrad Res Q. 2012;32:40–7. [Google Scholar]
  • 52.Kiley M, Wisker G. Threshold concepts in research education and evidence of threshold crossing. Higher Education Research & Development. 2009;28(4):431–41. doi: 10.1080/07294360903067930 [DOI] [Google Scholar]
  • 53.Feldon DF, Maher MA, Hurst M, Timmerman B. Faculty Mentors’, Graduate Students’, and Performance-Based Assessments of Students’ Research Skill Development. American Educational Research Journal. 2015;52(2):334–70. doi: 10.3102/0002831214549449 [DOI] [Google Scholar]
  • 54.Feldon DF, Litson K, Jeong S, Blaney JM, Kang J, Miller C, et al. Postdocs’ lab engagement predicts trajectories of PhD students’ skill development. Proc Natl Acad Sci U S A. 2019;116(42):20910–6. doi: 10.1073/pnas.1912488116 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 55.Swank JM, Lambie GW. Development of the Research Competencies Scale. Measurement and Evaluation in Counseling and Development. 2016;49(2):91–108. doi: 10.1177/0748175615625749 [DOI] [Google Scholar]
  • 56.Carnethon MR, Neubauer LC, Greenland P. Competency-Based Postdoctoral Education. Circulation. 2019;139(3):310–2. doi: 10.1161/CIRCULATIONAHA.118.037494 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.Lambie GW, Hayes BG, Griffith C, Limberg D, Mullen PR. An Exploratory Investigation of the Research Self-Efficacy, Interest in Research, and Research Knowledge of Ph.D. in Education Students. Innov High Educ. 2013;39(2):139–53. doi: 10.1007/s10755-013-9264-1 [DOI] [Google Scholar]
  • 58.Mekolichick J. Mapping the Impacts of Undergraduate Research, Scholarship, and Creative Inquiry Experiences to the NACE Career Readiness Competencies. NACE Journal. 2021;82: 34–40. Available: https://ebiztest.naceweb.org/career-readiness/competencies/mapping-the-impacts-of-undergraduate-research-scholarship-and-creative-inquiry-experiences-to-the-nace-career-readiness-competencies/ [Google Scholar]
  • 59.Patra S, Khan AM. Development and implementation of a competency-based module for teaching research methodology to medical undergraduates. J Educ Health Promot. 2019;8:164. doi: 10.4103/jehp.jehp_133_19 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60.Brown AM, Lewis SN, Bevan DR. Development of a structured undergraduate research experience: Framework and implications. Biochem Mol Biol Educ. 2016;44(5):463–74. doi: 10.1002/bmb.20975 [DOI] [PubMed] [Google Scholar]
  • 61.Meijers AWM, Borghuis VAJ, Mutsaers EJPJ, van Overveld CWAM, Perrenet JC. 2nd rev. ed. Eindhoven: Technische Universiteit Eindhoven; 2005. [Google Scholar]
  • 62.Feldon DF, Rates C, Sun C. Doctoral conceptual thresholds in cellular and molecular biology. International Journal of Science Education. 2017;39(18):2574–93. doi: 10.1080/09500693.2017.1395493 [DOI] [Google Scholar]
  • 63.Brownell SE, Kloser MJ. Toward a conceptual framework for measuring the effectiveness of course-based undergraduate research experiences in undergraduate biology. Studies in Higher Education. 2015;40(3):525–44. doi: 10.1080/03075079.2015.1004234 [DOI] [Google Scholar]
  • 64.Elder S, Wittman H, Giang A. Building sustainability research competencies through scaffolded pathways for undergraduate research experience. Elem Sci Anth. 2023;11(1). doi: 10.1525/elementa.2022.00091 [DOI] [Google Scholar]
  • 65.Burke LE, Schlenk EA, Sereika SM, Cohen SM, Happ MB, Dorman JS. Developing research competence to support evidence-based practice. J Prof Nurs. 2005;21(6):358–63. doi: 10.1016/j.profnurs.2005.10.011 [DOI] [PubMed] [Google Scholar]
  • 66.Dewey JD, Montrosse BE, Schröter DC, Sullins CD, Mattox JR II. Evaluator Competencies. American Journal of Evaluation. 2008;29(3):268–87. doi: 10.1177/1098214008321152 [DOI] [Google Scholar]
  • 67.Enders F. Evaluating mastery of biostatistics for medical researchers: need for a new assessment tool. Clin Transl Sci. 2011;4(6):448–54. doi: 10.1111/j.1752-8062.2011.00323.x [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68.France CR, Masters KS, Belar CD, Kerns RD, Klonoff EA, Larkin KT, et al. Application of the competency model to clinical health psychology. Professional Psychology: Research and Practice. 2008;39(6):573–80. doi: 10.1037/0735-7028.39.6.573 [DOI] [Google Scholar]
  • 69.Hodgson JL, Pelzer JM, Inzana KD. Beyond NAVMEC: competency-based veterinary education and assessment of the professional competencies. J Vet Med Educ. 2013;40(2):102–18. doi: 10.3138/jvme.1012-092R [DOI] [PubMed] [Google Scholar]
  • 70.Kamen C, Veilleux JC, Bangen KJ, VanderVeen JW, Klonoff EA. Climbing the stairway to competency: Trainee perspectives on competency development. Training and Education in Professional Psychology. 2010;4(4):227–34. doi: 10.1037/a0021092 [DOI] [Google Scholar]
  • 71.Kulikowski CA, Shortliffe EH, Currie LM, Elkin PL, Hunter LE, Johnson TR, et al. AMIA Board white paper: definition of biomedical informatics and specification of core competencies for graduate education in the discipline. J Am Med Inform Assoc. 2012;19(6):931–8. doi: 10.1136/amiajnl-2012-001053 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 72.Larson EL, Landers TF, Begg MD. Building interdisciplinary research models: a didactic course to prepare interdisciplinary scholars and faculty. Clin Transl Sci. 2011;4(1):38–41. doi: 10.1111/j.1752-8062.2010.00258.x [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 73.Madan-Swain A, Hankins SL, Gilliam MB, Ross K, Reynolds N, Milby J, et al. Applying the cube model to pediatric psychology: development of research competency skills at the doctoral level. J Pediatr Psychol. 2012;37(2):136–48. doi: 10.1093/jpepsy/jsr096 [DOI] [PubMed] [Google Scholar]
  • 74.American Library Association. Information Literacy Competency Standards for Higher Education. Jan 2000. Available: https://alair.ala.org/items/294803b6-2521-4a96-a044-96976239e3fb
  • 75.Musial JL, Rubinfeld IS, Parker AO, Reickert CA, Adams SA, Rao S, et al. Developing a scoring rubric for resident research presentations: a pilot study. J Surg Res. 2007;142(2):304–7. doi: 10.1016/j.jss.2007.03.060 [DOI] [PubMed] [Google Scholar]
  • 76.Poloyac SM, Empey KM, Rohan LC, Skledar SJ, Empey PE, Nolin TD, et al. Core competencies for research training in the clinical pharmaceutical sciences. Am J Pharm Educ. 2011;75(2):27. doi: 10.5688/ajpe75227 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 77.Bauer KW, Bennett JS. Alumni Perceptions Used to Assess Undergraduate Research Experience. The Journal of Higher Education. 2003;74(2):210–30. doi: 10.1080/00221546.2003.11777197 [DOI] [Google Scholar]
  • 78.Hurtado S, Eagan MK, Cabrera NL, Lin MH, Park J, Lopez M. Training Future Scientists: Predicting First-year Minority Student Participation in Health Science Research. Res High Educ. 2008;49(2):126–52. doi: 10.1007/s11162-007-9068-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79.Mau WCJ. Characteristics of US students that pursued a STEM major and factors that predicted their persistence in degree completion. Universal Journal of Educational Research. 2016;4:1495–500. [Google Scholar]
  • 80.Carver S, Sickle JV, Holcomb JP, Quinn C, Jackson DK, Resnick AH, et al. Operation STEM: increasing success and improving retention among first-generation and underrepresented minority students in STEM. Journal of STEM Education: Innovations and Research. 2017;18. [Google Scholar]
  • 81.Denecke D, Kent J, McCarthy MT. Articulating Learning Outcomes in Doctoral Education. Washington, D.C: Council of Graduate Schools; 2017. Available: https://cgsnet.org/wp-content/uploads/2022/01/ArticulatingLearningOutcomesinDoctoralEducationWeb-2.pdf [Google Scholar]
  • 82.Crowe M, Brakke D. Assessing the impact of undergraduate-research experiences on students: An overview of current literature. Council on Undergraduate Research Quarterly. 2008;28. Available: https://cdn.serc.carleton.edu/files/NAGTWorkshops/undergraduate_research/cur_publication_summer_2008.pdf
  • 83.Linn MC, Palmer E, Baranger A, Gerard E, Stone E. Education. Undergraduate research experiences: impacts and opportunities. Science. 2015;347(6222):1261757. doi: 10.1126/science.1261757 [DOI] [PubMed] [Google Scholar]
  • 84.Haeger H, Banks JE, Smith C, Armstrong-Land M. What We Know and What We Need to Know about Undergraduate Research. SPUR. 2020;3(4):62–9. doi: 10.18833/spur/3/4/4 [DOI] [Google Scholar]

Decision Letter 0

Amy Prunuske

7 Aug 2025

Dear Dr. Branchaw,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

==============================

Overall, this work generates a useful framework that can be used to support professional development of researchers. The reviewers suggested the authors clarify the results and some of the figures prior to publication.

==============================

Please submit your revised manuscript by Sep 21 2025 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols . Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols .

We look forward to receiving your revised manuscript.

Kind regards,

Amy Prunuske

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

3. If the reviewer comments include a recommendation to cite specific previously published works, please review and evaluate these publications to determine whether they are relevant and should be cited. There is no requirement to cite these works unless the editor has indicated otherwise. 

4. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments:

The reviewers were enthusiastic about the creation of the Comprehensive Research Development Framework, and noted several areas for improvement. I encourage the authors to address the reviewers' comments regarding the results section and Figure 4. The reviewers suggest that it might be helpful to include a flow chart and that the authors should consider being more explicit about generating a tool that demonstrated consensus across disciplines.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

Reviewer #1: Partly

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: N/A

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

Reviewer #1: No

Reviewer #2: Yes

**********

Reviewer #1: The manuscript by Branchaw, Butz, & Ayoob aims to develop a comprehensive framework to guide research training, primarily at the graduate and postdoctoral level. This is accomplished first by a literature review to establish a preliminary list of research competencies and then refined through an online survey and two rounds of card-sorting exercises by experienced researchers to organize the competencies into overall categories. The result is eight categories of research skills that contain 57 specific skills/competencies. This is an interesting study. The methodology is well thought out and the resulting framework they’ve developed has genuine applications for graduate student training across disciplines (and likely for post-docs as well). Unfortunately, the manuscript is seriously hobbled by a lack of clarity in the writing of the Results section. I’ve attempted to list specific problems below, but it might be worth the authors getting together to re-outline this section and then use that outline as a basis for a more readable Results section.

Lines 69-70: This is an important statement. Is there a citation?

It might be worthwhile for the authors to have a flow chart-like figure that explains their process of collecting competencies from the literature > surveying researchers > creating a refined list > open card sorting > creating categories > closed card sorting > refining the framework.

Regarding the card-sorting exercises: Is this an evidence-based approach? If so, what are the citations supporting the use of the open and closed sorting approaches?

It feels like the Results are missing a section about their literature review.

Lines 409-412 - Information about the 79 outcomes prepared for the survey and the 75 that actually appeared in the survey due to a survey error belongs in the Methods section. For the results section, the authors should present the data as though there were a list of 75 outcomes since that's what there is actually data on.

Figure 4 - It is not clear what figure 4 is actually supposed to show us or if it is even that important. Is this meant to communicate the % of the 75 learning outcomes judged important by the survey (segregated by discipline)? Is this really something that is important to show? What outcomes were deemed important is certainly interesting and is presented in Table 4. Figure 4's message seems to be "there were things that researchers thought were important" and that's it. Consider omitting (or explaining it better).

Table 4 - Perhaps the title should be "Learning Outcomes ... by 80% or More of Overall Respondents". Also, what is the significance of the non-bolded numbers? Just that they are below 80%?

Lines 430-432: Are the details about the categories of the relatively low-scored (variable importance) outcomes really needed here? They distract from the subsequent report about the higher-scoring 50 outcomes.

Lines 434-440: I'm completely confused by this section. "Some variation" is reported by discipline but not what those differences are or whether they were statistically different. Then ratings were combined to minimize those differences. If there really are differences between disciplines, then it seems like something worth talking about and maybe analyzing with distribution stats. If not, then just talk about the combined data. Also, the last sentence presents a series of numbers in parentheses. Are these the learning outcomes in Table 1? If so, the format does not correspond (there are no 3.01 or 4.04 in Table 1).

Lines 454-455: Again, why this discussion on what wasn't highly scored.

Lines 501-502: Shouldn't the authors refer to Table 5? Also, I'm not sure that Tables 5 & 6 are in the right order. Or Table 5 is confusing. The process of going to phase 1 to phase 2 to the final 8 set of categories is shown in table 5, but the process of getting to the last set of 8 categories is based on data in Table 6. Either this needs to be explained better in the text or the order of the tables should be changed.

Lines 523-526: I'm not sure I understand this approach to dealing with the learning outcomes that showed weak consensus. It sounds arbitrary, but I suspect it was not. One potential reason for the lack of consensus is that there were discipline-specific differences in where researchers thought these outcomes should be placed. Another is that an outcome could be placed in multiple categories because it was interpreted differently by different groups. If the latter, then perhaps the right approach would be to "split" the outcome into two outcomes, each reflecting the category that researchers matched it to. An interesting follow-up experiment would be to repeat the forced-choice sorting and see if the split/renamed categories got sorted as the authors predicted/assumed.

Lines 549-557: How were the Table 7 data generated? I didn't see anything in the Methods that might explain it. Also, what exactly is it saying? Is this mapping how often a given outcome was discussed in the literature reviewed? Finally, doesn't this belong in the Results section?

Line 563 paragraph: It might be worth stating that the CDRF is meant to represent a relative consensus across disciplines.

Line 576 paragraph: The use of the CRDF to increase transparency is an important point. Could the authors discuss how to operationalize that (similar to how they discussed using the CRDF as an assessment tool in the subsequent section). Placed in a student handbook? Part of a program/mentor/student compact? Something else?

This is somewhat discussed in the limitations section, but regarding the data from the Arts & Humanities. How confident are the authors regarding this data given the small sample size? Perhaps these 5 are in a post-grad research career because they had this research training in their undergrad years.

Reviewer #2: “The Comprehensive Research Development Framework (CRDF): Core Learning Outcomes for Research Training” manuscript presents an important framework that the authors developed. The process for developing the framework was excellent – supplementing the authors’ deep expertise with input from multiple stakeholders in multiple ways. The framework will be useful to a range of stakeholders (e.g., program administrators, faculty members, students, postdoctoral scholars) across a wide range of fields – including this reviewer. The manuscript is well written and easy to follow.

Below are items that could be improved in the manuscript.

-Consistency or simplification of the headings would be helpful. For example, the methods section has STEPS 1-4, with STEP 1 having four lettered subsections; the results section has two main subsections, one of which lists the two phases of card sorting; and limitations, conclusions, and future directions are all short main sections.

-Somewhere in the manuscript it would be helpful to note the geographic comprehensiveness of the framework. For example, were most of the frameworks used in the literature review from the United States (or from the US and Europe) – or was there global representation from where the frameworks were developed? Is researcher development often done similarly globally, or is this framework likely most applicable to US researcher development?

-There are some items that could be clarified on the figures and tables.

*More detail in the captions would be helpful so that the tables and figures could stand alone.

*There are some superscripts in Table 6, and descriptions of what those superscripts represent are missing (i.e., there are no footnotes below Table 6).

*Table 7 has a column titled “framework names,” but the items listed seem to be the article titles.

*Table 1 lists the learning outcomes as 1.1, 1.2, 1.3…, and the text of the manuscript adds a “0” before the learning outcomes below 10 (e.g., 3.01 on line 439).

**********

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. If published, this will include your full peer review and any attached files.

If you choose "no", your identity will remain anonymous but your review may still be made public.

Reviewer #1: No

Reviewer #2: Yes: Meghann Jarchow

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org

PLoS One. 2025 Sep 15;20(9):e0332587. doi: 10.1371/journal.pone.0332587.r002

Author response to Decision Letter 1


26 Aug 2025

Editor- address the reviewers' comments regarding the results section and Figure 4

Response: See responses to individual reviewer comments below.

Editor - helpful to include a flow chart

Response: Figure 1 was revised and is now a more detailed flow chart.

Editor - consider being more explicit about generating a tool that demonstrated consensus across disciplines

Response: We added text indicating the CRDF transcends and demonstrates consensus across disciplines in multiple places in the article. See lines 32, 84, 590-1, and 654.

Reviewer #1 - the manuscript is seriously hobbled by a lack of clarity in the writing of the Results section

Response: The Methods and Results sections have been reorganized to be parallel, both following the detailed flow chart presented in figure 1. We believe this addresses the lack of clarity in the writing. The change necessitated flipping the numbering of Tables 3 & 4.

Reviewer #1 - This is an important statement. Is there a citation?

Response: Citations have been added in line 70.

Reviewer #1 - It might be worthwhile for the authors to have a flow chart-like figure that explains their process of collecting competencies from the literature > surveying researchers > creating a refined list > open card sorting > creating categories > closed card sorting > refining the framework.

Response: Figure 1 has been revised with more details as a flow chart. In addition, the Methods and Results sections have been reorganized to more explicitly reflect the process.

Reviewer #1 - Regarding the card-sorting exercises: Is this an evidence-based approach? If so, what are the citations supporting the use of the open and closed sorting approaches?

Response: The citation about use of card sorting activities has been added in line 327.

Reviewer #1 - It feels like the Results are missing a section about their literature review.

Response: A section about the literature search and review has been added to the Results in the new organization.

Reviewer #1 -Information about the 79 outcomes prepared for the survey and the 75 that actually appeared in the survey due to a survey error belongs in the Methods section. For the results section, the authors should present the data as though there were a list of 75 outcomes since that's what there is actually data on.

Response: The information about the 75 of the 79 outcomes that were included in the survey (4 were missing due to an error) has been moved from the Results to the Methods section.

The data presented in the Results now explicitly lists 75 learning outcomes in the legend of Figure 4, where data collected about the importance of the learning outcomes are presented. Note that in Figure 5, data from all 79 outcomes are included because all were asked about in the career stage survey questions.

Reviewer #1 - It is not clear what figure 4 is actually supposed to show us or if it is even that important. Is this meant to communicate the % of the 75 learning outcomes judged important by the survey (segregated by discipline)? Is this really something that is important to show? What outcomes were deemed important is certainly interesting and is presented in Table 4. Figure 4's message seems to be "there were things that researchers thought were important" and that's it. Consider omitting (or explaining it better).

Response: Figure 4 provides evidence of content validity from experts for the core learning outcomes. A statement explicitly explaining this has been added in lines 417 – 418.

Reviewer #1 - Perhaps the title should be "Learning Outcomes ... by 80% or More of Overall Respondents". Also, what is the significance of the non-bolded numbers? Just that they are below 80%?

Response: Table 4 was simplified to report only the overall importance ratings across disciplines and the bolded numbers were removed.

Reviewer #1 - Are the details about the categories of the relatively low-scored (variable importance) outcomes really needed here? They distract from the subsequent report about the higher-scoring 50 outcomes. I'm completely confused by this section. "Some variation" is reported by discipline but not what those differences are or whether they were statistically different. Then ratings were combined to minimize those differences. If there really are differences between disciplines, then it seems like something worth talking about and maybe analyzing with distribution stats. If not, then just talk about the combined data. Also, the last sentence presents a series of numbers in parentheses. Are these the learning outcomes in Table 1? If so, the format does not correspond (there are no 3.01 or 4.04 in Table 1). Again, why this discussion on what wasn't highly scored?

Response: We agree with the reviewer's comments and have followed their recommendation to remove the details about possible small disciplinary differences and to focus on the main findings.

Reviewer #1 - Shouldn't the authors refer to Table 5? Also, I'm not sure that Tables 5 & 6 are in the right order. Or Table 5 is confusing. The process of going to phase 1 to phase 2 to the final 8 set of categories is shown in table 5, but the process of getting to the last set of 8 categories is based on data in Table 6. Either this needs to be explained better in the text or the order of the tables should be changed.

Response: Table 5 precedes Table 6 because Table 5 shows the results of the first (open) card sort and Table 6 the results of the second (closed) card sort. Table 5 also shows the final categories, which were finalized based on the second (closed) card sort results. To clarify this, the Table 5 column headers were modified, and the text has been modified to describe more explicitly what is in each table using "open" and "closed" card sorting (rather than first and second phase).

Reviewer #1 - I'm not sure I understand this approach to dealing with the learning outcomes that showed weak consensus. It sounds arbitrary, but I suspect it was not. One potential reason for the lack of consensus is that there were discipline-specific differences in where researchers thought these outcomes should be placed. Another is that an outcome could be placed in multiple categories because it was interpreted differently by different groups. If the latter, then perhaps the right approach would be to "split" the outcome into two outcomes, each reflecting the category that researchers matched it to. An interesting follow-up experiment would be to repeat the forced-choice sorting and see if the split/renamed categories got sorted as the authors predicted/assumed.

Response: The explanation for how we dealt with the learning outcomes for which there was low consensus in the closed card sorting exercise has been clarified in lines 534 – 539 by adding more detail about the decision-making process.

Reviewer #1 - How were the Table 7 data generated. I didn't see anything in the Methods that might explain it. Also, what exactly is it saying? Is this mapping how often a given outcome was discussed in the literature reviewed? Finally, doesn't this belong in the Results section?

Response: A section was added to the Methods “Back Map Source Framework Elements to Learning Outcomes and Categories to Confirm Coverage” to describe this process (lines 360 – 365), and Table 7 was moved from the Discussion to the Results section.

Reviewer #1 - It might be worth stating that the CDRF is meant to represent a relative consensus across disciplines.

Response: Great suggestion! We added text indicating the CRDF transcends and demonstrates consensus across disciplines in multiple places in the article. See lines 32, 84, 590-1, and 654.

Reviewer #1 - The use of the CRDF to increase transparency is an important point. Could the authors discuss how to operationalize that (similar to how they discussed using the CRDF as an assessment tool in the subsequent section). Placed in a student handbook? Part of a program/mentor/student compact? Something else?

Response: Specific examples of how the CRDF can be used for and with research mentees have been added in lines 617 – 620.

Reviewer #1 - This is somewhat discussed in the limitations section, but regarding the data from the Arts & Humanities. How confident are the authors regarding this data given the small sample size? Perhaps these 5 are in a post-grad research career because they had this research training in their undergrad years.

Response: We are confident in the Arts & Humanities data and have therefore included it, but because of the small sample size we are reluctant to draw sweeping conclusions from it. We believe that acknowledging and discussing it in the Limitations section is the best way to convey this and have elaborated on what is written there to be more explicit. See lines 644 – 645.

Reviewer #2 - Consistency or simplification of the headings would be helpful. For example, the methods section has STEPS 1-4, with STEP 1 having four lettered subsections; the results section has two main subsections, one of which lists the two phases of card sorting; and limitations, conclusions, and future directions are all short main sections.

Response: Figure 1 has been revised with more details as a flow chart outlining the process we used, and headings (rather than numbers and letters) have been used in the revised Methods and Results sections.

Reviewer #2 - Somewhere in the manuscript it would be helpful to note the geographic comprehensiveness of the framework. For example, were most of the frameworks used in the literature review from the United States (or from the US and Europe) – or was there global representation from where the frameworks were developed? Is researcher development often done similarly globally, or is this framework likely most applicable to US researcher development?

Response: 34% (19) of the source frameworks were developed outside the United States, including: Iran, Germany (2), United Kingdom (3), South Africa (2), Turkey, Canada (2), Peru, Australia (3), Netherlands, India, China, and Hong Kong. A statement about this was added to the Results in lines 375 - 376 and in the discussion in lines 570 – 573.

Reviewer #2 - More detail in the captions would be helpful so that the tables and figures could stand alone.

Response: We reviewed the captions and feel confident that they convey all the information needed to interpret the tables and figures. It is unclear what the reviewer means by “stand alone.” We do not think it necessary to replicate the text from the Methods and Results sections in the captions.

Reviewer #2 - There are some superscripts in Table 6, and descriptions of what those superscripts represent are missing (i.e., there are no footnotes below Table 6).

Response: The superscript notations are now referenced at the bottom of the table.

Reviewer #2 - Table 7 has a column titled “framework names,” but the items listed seem to be the article titles.

Response: The column title has been changed to “Framework Article.”

Reviewer #2 - Table 1 lists the learning outcomes as 1.1, 1.2, 1.3…, and the text of the manuscript adds a “0” before the learning outcomes below 10 (e.g., 3.01 on line 439).

Response: The learning outcome numbers have been updated to include the 0 in those below 10 (e.g., 1.1 is now 1.01).

Attachment

Submitted filename: Response to Reviewers_CRDF_BranchawButzAyoob.pdf

pone.0332587.s010.pdf (540.9KB, pdf)

Decision Letter 1

Amy Prunuske

2 Sep 2025

The Comprehensive Researcher Development Framework (CRDF): Core Learning Outcomes for Research Training

PONE-D-25-37622R1

Dear Dr. Branchaw,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice will be generated when your article is formally accepted. Please note, if your institution has a publishing partnership with PLOS and your article meets the relevant criteria, all or part of your publication costs will be covered. Please make sure your user information is up-to-date by logging into Editorial Manager® and clicking the 'Update My Information' link at the top of the page. For questions related to billing, please contact billing support.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Amy Prunuske

Academic Editor

PLOS ONE

Acceptance letter

Amy Prunuske

PONE-D-25-37622R1

PLOS ONE

Dear Dr. Branchaw,

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now being handed over to our production team.

At this stage, our production department will prepare your paper for publication. This includes ensuring the following:

* All references, tables, and figures are properly cited

* All relevant supporting information is included in the manuscript submission

* There are no issues that prevent the paper from being properly typeset

You will receive further instructions from the production team, including instructions on how to review your proof when it is ready. Please keep in mind that we are working through a large volume of accepted articles, so please give us a few days to review your paper and let you know the next and final steps.

Lastly, if your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

You will receive an invoice from PLOS for your publication fee after your manuscript has reached the completed accept phase. If you receive an email requesting payment before acceptance or for any other service, this may be a phishing scheme. Learn how to identify phishing emails and protect your accounts at https://explore.plos.org/phishing.

If we can help with anything else, please email us at customercare@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Amy Prunuske

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Appendix. Research Development Frameworks and Assessments included in Literature Review.

    (PDF)

    pone.0332587.s001.pdf (131KB, pdf)
    S2a Appendix. Research Training Program Development and Mapping Tool.

    (XLSX)

    pone.0332587.s002.xlsx (49.9KB, xlsx)
    S2b Appendix. Researcher Development Plan (RDP) Tool.

    (XLSX)

    pone.0332587.s003.xlsx (49.8KB, xlsx)
    S3 Appendix. Code Themes, Definitions and Assignments.

    (PDF)

    pone.0332587.s004.pdf (541.7KB, pdf)
    S4 Appendix. Evolution of Learning Outcomes for CRDF.

    (PDF)

    pone.0332587.s005.pdf (90.4KB, pdf)
    S5 Appendix. Dissemination Contacts for National Survey.

    (PDF)

    pone.0332587.s006.pdf (42.8KB, pdf)
    S6 Appendix. Open Card Sorting Results.

    (PDF)

    pone.0332587.s007.pdf (95.9KB, pdf)
    S7 Appendix. Framework Mapping to CRDF.

    (XLSX)

    pone.0332587.s008.xlsx (834.9KB, xlsx)

    Data Availability Statement

    All relevant data are within the paper and its Supporting Information files.

