Clin Transl Sci. 2021 Jan 25;14(3):1026–1036. doi: 10.1111/cts.12966

Table 5.

Self‐assessments a of competencies before and after the rigor and reproducibility class

Competency: Before, all students (n = 18) b; Before, post‐assessment completers (n = 11) b; After (n = 11) b. Values are mean (SD).

1. The origins of the reproducibility crisis. 1.72 (1.32) 2.36 (1.29) 5.45 (1.29)
2. Strategies for addressing the reproducibility crisis. 2.67 (0.97) 2.73 (0.90) 5.82 (1.08)
3. The NIH response to the reproducibility crisis. 2.44 (1.25) 2.45 (1.13) 5.45 (1.51)
4. The role and importance of rigor and reproducibility in NIH proposal writing and scientific review. 3.78 (1.26) 3.45 (1.04) 6.09 (0.94)
5. The importance of scientific premise in NIH proposal preparation. 3.50 (1.15) 3.54 (1.04) 6.00 (0.94)
6. Critically assess sample scientific premise statements. 3.39 (1.50) 3.45 (1.75) 5.36 (1.63)
7. The importance of rigorous experimental design and documentation for transparency. 4.78 (1.17) 4.73 (0.65) 6.45 (5.20)
8. The importance of including sex as a biological variable in research. 4.50 (1.29) 4.27 (1.10) 6.50 (0.71)
9. Bias and the sources of bias in the conduct of science. 4.56 (1.04) 4.72 (1.19) 6.00 (1.00)
10. Assessing bias using the Cochrane Collaboration’s tool for assessing risk of bias in randomized trials. 1.28 (0.67) 1.27 (0.65) 4.64 (1.63)
11. Developing a prospective experimental design that comports with appropriate guidelines. 3.39 (1.09) 3.55 (1.04) 5.45 (1.29)
12. Key elements to include in an authentication plan for an NIH grant application. 1.72 (0.96) 1.91 (1.04) 4.73 (1.68)
13. Quality practices important to basic biomedical research. 4.06 (1.21) 4.00 (1.00) 6.18 (0.98)
14. Implementation of quality practices for basic biological research. 3.83 (1.42) 3.73 (1.27) 5.82 (0.98)
15. Evaluation of image data to determine whether unacceptable manipulation has occurred. 2.44 (1.62) 2.82 (1.78) 5.64 (1.12)
16. Software tools used to inspect images for manipulation. 1.89 (1.13) 2.18 (1.25) 4.82 (1.25)
17. Evaluating adherence to transparent reporting publication guidelines. 2.39 (1.04) 2.45 (1.13) 5.40 (1.17)
18. The role of laboratory notebooks in promoting rigor and reproducibility and transparency. 4.56 (1.58) 4.82 (1.60) 6.45 (0.93)
19. The roles of the data management plan, metadata, and data dictionary in promoting reproducibility and transparency. 3.56 (1.95) 3.72 (2.10) 5.73 (1.42)
20. Challenges and benefits of increased scientific transparency. 3.94 (1.43) 4.09 (1.45) 6.18 (0.75)
21. Critically assessing practices in your laboratory and considering possible steps toward increased transparency. 3.56 (1.82) 3.45 (1.69) 5.82 (0.75)
22. “Open Science” and its overall goals. 2.94 (1.55) 2.91 (1.64) 6.18 (0.98)
23. The challenges to the implementation of Open Science. 2.39 (1.20) 2.45 (1.21) 6.09 (1.04)
24. Identifying changes to current practices that promote Open Science. 2.67 (1.41) 2.73 (1.49) 5.73 (0.79)
25. Institutional changes that promote rigor and reproducibility. 3.17 (1.20) 3.55 (0.93) 5.64 (0.81)

Abbreviation: NIH, National Institutes of Health.

a Students rated each item on a scale where 1 = know nothing, 2 = very basic understanding, 3 = low/moderate understanding, 4 = moderate understanding, 5 = high/moderate understanding, 6 = strong understanding, and 7 = highly competent.

b All students (n = 18) completed the assessment before class; 11 students completed the post‐assessment. Paired t‐test p values were <0.05 for all items for all students combined and for MD/PhD students (n = 7); for Clinical and Translational Research Pathway students (n = 4), pre‐post scores were not statistically different for items 5, 6, 9, 10, 11, 12, 16, 17, 19, 22, and 25.
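For illustration only, a minimal sketch of the kind of pre/post comparison described in footnote b: a paired t‐test on one competency item's self‐ratings. The ratings below are hypothetical placeholder values (the per‐student study data are not reproduced here), and the use of scipy.stats.ttest_rel is an assumption about tooling, not the authors' analysis code.

```python
# Illustrative only: paired t-test on hypothetical pre/post self-ratings
# for a single competency item (not the study's actual per-student data).
from scipy import stats

# Hypothetical 1-7 self-ratings from the same 11 students, before and after the class.
pre = [2, 3, 1, 2, 2, 3, 2, 1, 3, 2, 3]
post = [5, 6, 5, 6, 5, 7, 6, 5, 6, 5, 6]

# Paired (dependent-samples) t-test; a result is "significant" at p < 0.05,
# mirroring the threshold reported in footnote b.
t_stat, p_value = stats.ttest_rel(pre, post)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```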