Table 2.
| Topic | In-class activities |
| --- | --- |
| Reproducibility crisis | Two small-group (3–5 students) discussions, 15 min each, with a 5-min summary of each group's discussion. Discussion 1: In the five years since the Collins and Tabak article, which of the proposed ideas for NIH to address the reproducibility crisis have been implemented? Conduct some internet "sleuthing." Your team may provide a general scan or a "deep dive" into one aspect. Why do you think some ideas succeeded and others failed? What are your thoughts about the potential impact of the implemented changes on the reproducibility issues? During your internet searching, did you come across any new ideas that have been implemented by NIH (or others) to address the reproducibility crisis? Discussion 2: Each team selects one of the stakeholder roles (i.e., student, journal editor, academic institution (e.g., promotion committee), funder, researcher). From your stakeholder's perspective, discuss strategies to address the reproducibility crisis. |
| Evaluating the rigor of previous research; scientific premise | One small-group (3–5 students) discussion, 30 min, with a 5-min summary from each group. Each group discusses the high-level overview of an F30 proposal assigned to the group. Based on readings regarding the importance of scientific premise in NIH review of proposals, what specific prior research studies would your group like to see referenced in support of the scientific premise of this NIH proposal? Has the research your group believes is necessary been done? What is the quality of the previous research that forms the foundation for the current proposal? Discuss how to determine the rigor of the studies you would like to see before you would score the application highly. Project presentation 1: Students present their reproducibility or replication project, why it was selected, team members, and an outline of what the team believes will be reasonable to accomplish (e.g., download data, recreate the sample, recode variables, run preliminary analyses). |
| Rigorous experimental design and bias | Class watches NIH video together (Module 2: Blinding and randomization 30), followed by small-group discussion with questions provided by NIH (e.g., can you think of a particular instance in which blinding and randomization could have a dramatic impact on the results?). Cochrane bias assessment tool exercise. Hands-on exercise with the Experimental Design Assistant tool 31. |
| Biological variables, authentication, and quality control | Discussion 1: Class watches NIH video together (Module 4: Sample size, outliers, sex as a biological variable 30), followed by small-group discussion with questions provided by NIH (e.g., have you or someone you know only used male mice in an experiment as a way of avoiding the "sex issue"? Do you think this is appropriate? Does it depend on the type of experiment being done?). Discussion 2: Before the class on quality control, trainees complete an assignment to (1) obtain a standard operating procedure, manual, or protocol from their research laboratory (or from a classmate if they have not yet been assigned to a laboratory), (2) review it, and (3) observe the practices in the laboratory. During in-class small-group discussions, trainees share what they learned from this exercise, discuss any deviations from protocols, and consider how the laboratory could improve its processes. If no deviations were observed, trainees are challenged to reflect on why. Project presentation 2: Recap of topic, overview of methods. |
| Reporting expectations | One small-group (3–5 students) discussion, 20 min, with a 5-min summary from each group. Each small group is assigned an article. Who is to blame? Summarize the evidence implicating each party (researchers, sponsors, editors) based on the article assigned to your group. What can be done about it? Brainstorm ideas to address publication bias given your thoughts and the evidence regarding who is "to blame." Project presentation 3: Recap of topic, tasks accomplished, challenges experienced, and preliminary results. |
| Implementing transparency | Two small-group (3–5 students) discussions, 15 min each, with a 5-min summary of each group's discussion. Class watches NIH video together (Module 1: Lack of transparency 30), followed by small-group discussion with questions provided by NIH (e.g., do you think the corresponding author should have handled the situation differently?). Moving forward: Individually, critically reflect on practices in your laboratory and consider possible steps toward increased transparency. What are the most pressing needs for improving practices in your laboratory? How would you address them moving forward? |
| Open science | Standard debate format (see text). Debate 1: Should scientists at our institution be required to use an open science framework for their research? Debate 2: Should federal funders of research in the United States (e.g., NIH, NSF) participate in Plan S? |
| Reproducibility/replication projects | Project presentation 4: Team, topic, methods, open science/transparency methods used, challenges, preliminary results, unexpected aspects of the project, findings, and thoughts on open science, transparency, rigor, etc. |
Abbreviations: NIH, National Institutes of Health; NSF, National Science Foundation.