Journal of Medical Education and Curricular Development. 2020 Jul 22;7:2382120520940662. doi: 10.1177/2382120520940662

An Educational Evaluation of a Journal Club Approach to Teaching Undergraduate Health Care Research

Michaela Friesth 1, Kristina Dzara 1,2,3,4
PMCID: PMC7376377  PMID: 32743078

Abstract

Background:

Health care research is a common undergraduate health sciences requirement. There is limited literature regarding course structure, content, or learning outcomes; most courses have traditionally been taught through didactic lecture. This is misaligned with Generation Y learner values, as they desire guided learning, real-world examples, active engagement, learning through doing, and psychological safety.

Methods:

A “journal club” approach to teaching health care research was implemented at Northeastern University in Fall 2018. Each session involved (1) a moment of reflection; (2) an introduction to the topic; (3) 1 student methods report presentation; (4) 2 student “journal club” self-directed structured article summary presentations; (5) large-group discussion; (6) plus/delta feedback to the instructor. Each student completed 2 “journal club” presentations, 1 methods presentation, 6 peer reviews, CITI research training, a quality improvement survey, and a final course reflection. We utilized a convergent mixed-methods educational evaluation, integrating data from 3 distinct sources—a quality improvement survey, final student course reflections, and Plus/Delta feedback—with the course reflections analyzed via thematic analysis. The Northeastern University Institutional Review Board exempted the study.

Results:

Students appreciated the course structure and reported confidence in their critical appraisal abilities. Four qualitative themes emerged: (1) enabled a high degree of growth as students and scholars; (2) designed in thoughtful and unique format; (3) initially intimidated students and was academically challenging; and (4) prioritized and enabled psychological safety.

Conclusions:

Although the course was initially intimidating and admittedly challenging, undergraduate health sciences students applauded its curricular design and its enabling of psychological safety, which aligned with Generation Y learner values and ultimately led to growth in perceived and realized confidence and ability to critically review research articles.

Keywords: Health professions education, health sciences, research methods, journal club

Introduction

Health care research is a common graduation requirement for undergraduate health sciences or nursing students who will enter graduate allied health programs.1 Nearly 70% of senior medical students have received research training, which has traditionally been taught through didactic lecture and, more recently, with PowerPoint slides.2-4 The didactic teaching style focuses on transferring content from teacher to learner and supports only surface knowledge, minimal retention of learning, and, in some cases, rote memorization.1

Prior health care research course designs have utilized workbooks or templates to scaffold learning and promote critical reasoning skills.5-7 Small group work, which allows for peer feedback and active participation in discussion, is also commonly used.2,8 Some have implemented structured approaches to group work, such as the Jigsaw Technique, a collaborative approach in which each student’s grade depends on the others’ performance, or peer mentoring and tutoring, to facilitate active participation and discussion.8,9 More innovative forms of teaching have included experiential learning, which encourages students to draw from and reflect on their own experiences.10 These prior studies have used quantitative quiz scores, in-class feedback, and end-of-course learner satisfaction surveys or reflections to gather feedback from students about course structure, format, and learning outcomes.5-7,9-11

Importantly, journal clubs are an established method for increasing exposure to research methods and supporting critical appraisal skills in medical school, residency, and fellowship.12-16 However, undergraduate students are rarely introduced to this format prior to advanced medical training, suggesting they may lack the baseline research knowledge and critical appraisal skills necessary to participate in increasingly complex research conversations.3 Moreover, the journal club approach has not been adequately applied or studied as part of undergraduate research training, and it is unclear whether the approach effectively prepares students to understand research fundamentals so that they can competently enter graduate allied health programs.7,17

Furthermore, Generation Y students—born after 1980—prefer more active learning, often including instructor guidance, active engagement with others, learning through doing, and course material related to real-world experiences.2,18,19 They also desire a strong classroom structure with consistent psychological safety.2,18,20

We developed a convergent mixed-methods educational evaluation of a journal club approach to teaching undergraduate health care research to Generation Y students, complemented by multiple key educational design strategies to support deep learning. We add to the literature by illuminating students’ perspectives on this approach and detailing the perceived educational impact on their understanding of and confidence with research methods, using a quality improvement survey, a final course reflection, and Plus/Delta feedback.

Methods

Setting and participants

All participating students were upper-level students in the Health Science major (550 total students) at the Bouvé College of Health Sciences (2000 total students) at Northeastern University in Boston, MA. In Fall 2018, the senior author—a medical and health professions educator—taught 2 sections of Health Care Research at Northeastern University. The morning course had 16 students and the afternoon course had 24 students (total n = 40). All classrooms contained a computer with a ceiling-mounted projector and screen that enabled PowerPoint presentations, as well as a chalkboard.

Curriculum design

Kern’s 6-step approach guided curriculum development.21 Initially, the first author reviewed prior literature regarding how health care research is taught and spoke with other Northeastern University faculty members about their perceptions of what the students needed to be successful.

The instructor embedded multiple core cognitive science and adult learning principles in the course design, as outlined below.21-23 First, students were consistently informed and reminded that research methods would be essential for their future careers as health professionals, and each course session included a “who is this?” component to introduce students to practicing health care researchers. Second, although journal club assignments were scaffolded by the instructor, who provided a clear template for students, significant self-directed learning and autonomy were required of students.15 Third, the instructor encouraged students to draw on their prior learning experiences in research and statistics. Fourth, readiness to learn was supported by the use of articles of increasing complexity throughout the course, so that students had to continually revisit complex course concepts. Fifth, students’ motivation to learn was encouraged through the development and maintenance of a “safe learning space,” in which students consistently asked questions, provided and received feedback from the instructor, and engaged in peer review. The intention was to cultivate a desire among all students to complete assignments at the highest level of their ability. Sixth, psychological safety was additionally encouraged via the use of a minute at the beginning of class to hit pause and “situate ourselves in our learning space,” reminding students to actively choose to be engaged in their own learning for the session.23 Finally, the journal club design enabled active learning, largely through large-group discussion with student-led questions, and supported spaced repetition through the interleaving of main concepts.

Course structure

The courses spanned 14 weeks, with 2 sessions per week of 140 minutes each, amounting to 23 course sessions. Students were assigned a research article and utilized the “self-directed structured summary” template to develop a 1-page summary.15 This summary included the introduction, methods, results, and conclusion, highlighting challenges or concerns students noted (Appendix 1). The students presented study findings to the instructor and their peers using PowerPoint presentations. Students were encouraged to make presentations fun, interesting, and engaging. They were also responsible for leading discussion about the journal article after their presentation, with input, guidance, and questions from the instructor. Articles varied in scope and methodology, and topics were sequentially organized by complexity to align with the articles for the day’s session. The Journal Club Implementation Flowchart details the process by which the instructor and students navigated each course session (Figure 1).

Figure 1. Journal Club Implementation Flowchart.

The sessions were organized as follows: situate self in learning space, the day’s outline, plus/delta review from the previous course session, student methods report presentation, introduction to the day’s topic, 3-minute break, student journal club structured summary and presentation #1, “Who is this?,” student journal club structured summary and presentation #2, and completion of the plus/delta for the course session. In 4 sessions, guest speakers with relevant expertise provided the introduction to the day’s topic and served as moderators and discussants for the student journal club structured summaries and presentations.

All students completed 6 peer reviews of methods report or journal club structured summary and presentation assignments throughout the semester. They offered feedback in 3 areas: “What the presenter did well,” “What the presenter could have done better,” and “What additional comments do you have for the presenter?” To protect student confidentiality, there were 2 peer reviews for every presentation. The peer reviews were reviewed by the teaching assistant for appropriateness and then aggregated, blinded, and sent back to students as formative feedback. In subsequent presentations, students were instructed to indicate to the class 1 way they had incorporated this feedback.

Most assignments were completed independently, although a small number of students in the afternoon class completed 1 methods report presentation or journal club structured summary and presentation in teams of 2 due to a slight excess of students compared with course topics.

Course grading

Course grading was on a 100-point scale, with 1 methods report presentation (20 points), 2 journal club structured summaries and presentations (40 points), 6 peer reviews (15 points), course engagement (15 points), CITI training (5 points), and a final course reflection (5 points). After each journal club structured summary and methods report presentation, students were emailed a rubric with their grade and written formative feedback from both the instructor and their peer reviewers. The rubrics detailed whether learners followed citation and formatting instructions, clearly explained topics, and related topics to prior course content or readings. Journal club rubrics specifically indicated whether students were accurate in their interpretation of the research question, design, methods, data analysis, results, and conclusions. At the course midpoint, students were given a grade update with their raw score as well as a scaled percentage.
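
To make the weighting concrete, the following is a minimal sketch of how the 100-point scheme and the midpoint scaled percentage described above could be tallied; the component names, example scores, and helper function are illustrative assumptions rather than course materials.

```python
# Sketch of the 100-point grading scheme described above. Component weights
# follow the course description; the example scores are hypothetical.
POINTS_POSSIBLE = {
    "methods_report_presentation": 20,
    "journal_club_summaries_and_presentations": 40,
    "peer_reviews": 15,
    "course_engagement": 15,
    "citi_training": 5,
    "final_course_reflection": 5,
}
assert sum(POINTS_POSSIBLE.values()) == 100  # components sum to the full scale

def scaled_percentage(earned: dict, graded_components: list) -> float:
    """Raw points earned, scaled against points possible for the components graded so far."""
    possible = sum(POINTS_POSSIBLE[c] for c in graded_components)
    raw = sum(earned.get(c, 0) for c in graded_components)
    return 100 * raw / possible

# Hypothetical midpoint update: only 2 components have been graded so far.
midpoint_earned = {"methods_report_presentation": 18, "peer_reviews": 11}
print(scaled_percentage(midpoint_earned, ["methods_report_presentation", "peer_reviews"]))
```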

Data collection

As part of routine educational evaluation and quality improvement, the instructor collected multiple sources of course evaluation data. These data sources included the following:

  1. Anonymous, voluntary, final quantitative course quality improvement survey developed by the instructor and collected as part of routine end-of-semester educational evaluation.

  2. Deidentified final course reflection assignments in which students reflected on their learning throughout the semester by drafting a 1- to 2-page reflection paper connecting course content and learning.

  3. Anonymous Plus/Delta data completed by all students after every class session to inform educational quality improvement throughout the semester. Students were asked to take 1 minute at the end of each session to answer 2 simple questions: (1) what went well, and (2) what could have gone better. The instructor reviewed and summarized this feedback and presented it to all students at the beginning of the next session. When possible, the instructor made small iterative changes to improve student experience. When changes could not be made, the instructor was transparent as to why.

This project was reviewed by Northeastern University Human Subjects Research Protection and exempted from further review.

Data analysis

This mixed-methods convergent educational evaluation utilized the quantitative quality improvement survey, qualitative final course reflections, and Plus/Delta feedback as data sources. Our purpose was to understand how the course design was experienced by students, as well as to understand the perceived educational impact on their understanding of and confidence with research methods.

Descriptive statistics were obtained for the quality improvement survey. Plus/Delta feedback was collated and reviewed to determine key recommendations from learners. Thematic analysis following the 5-stage qualitative framework approach was utilized for the qualitative final course reflections.24 Dedoose (SocioCultural Research Consultants, LLC) was used to facilitate data management while coding. First, 2 primary coders (K.D. and M.F.) familiarized themselves with the data, each independently reviewing the first 5 reflections to create a preliminary list of codes. They compared codes and refined definitions as a team, with discrepancies reconciled in person by both coders, until the codebook was finalized and entered into Dedoose. Interrater reliability between the coders was established using 5 randomly selected reflections, with a kappa of 0.75 indicating good to excellent interrater reliability. Subsequently, codes were systematically applied to all reflections. After independently coding all reflections, the 2 coders discussed recurring patterns in the data, and codes were combined into categories and then themes—a cluster of codes that, when combined, provided a meaningful statement about students’ experience in the course. The coders independently reread all coded data within each theme to ensure consistency and identify illustrative quotations. The authors then considered how the 3 data sources aligned and integrated the findings in the results to meaningfully report the impact of the course on students.
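
For reference on the interrater reliability step, the sketch below shows one way Cohen's kappa could be computed for 2 coders' categorical code assignments; the coder labels and code names are hypothetical, and the actual calculation in this study (performed within or alongside Dedoose) may have differed.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between 2 coders on the same excerpts."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n          # raw agreement
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n)                      # agreement expected by chance
                   for c in set(freq_a) | set(freq_b))
    return (observed - expected) / (1 - expected)

# Hypothetical code assignments for excerpts drawn from 5 randomly selected reflections.
coder_1 = ["growth", "design", "safety", "growth", "challenge", "design", "growth", "safety"]
coder_2 = ["growth", "design", "safety", "design", "challenge", "design", "growth", "growth"]
print(round(cohens_kappa(coder_1, coder_2), 2))
```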

To confirm findings, a student who took the class served as a member checker by critically reviewing the manuscript prior to journal submission and providing feedback which was incorporated into the final version. The Northeastern University Institutional Review Board exempted the study.

Results

Thirty-nine students completed the quality improvement survey (97.5%) and 40 completed the final course reflection (100%). All were expected to complete the plus/delta data anonymously after each session. Table 1 presents the quality improvement survey results. These results, as well as the Plus/Delta feedback, were integrated within the 4 themes constructed from the qualitative analysis of the final course reflection (Table 2) for results reporting.

Table 1.

Quality improvement survey results.

Statement | Low score | High score | Mean score | SD
The Journal Article Structured Summary format is an effective way to teach critical appraisal of research studies | 1 | 5 | 4.72 | 0.759
The Journal Club Presentation format is an effective way to teach critical appraisal of research studies | 1 | 5 | 4.56 | 0.788
The Methods Report format is an effective way to teach critical appraisal of research studies | 2 | 5 | 4.32 | 0.775
The Methods Report Presentation format is an effective way to teach critical appraisal of research studies | 2 | 5 | 4.18 | 0.865
I understand quantitative research methods | 4 | 5 | 4.51 | 0.506
I understand qualitative research methods | 3 | 5 | 4.54 | 0.555
I can interpret statistical tests | 1 | 5 | 4.00 | 0.918
I can interpret research results | 3 | 5 | 4.62 | 0.544
I can critically review research papers | 3 | 5 | 4.62 | 0.544
I learned from engaging in peer review | 3 | 5 | 4.21 | 0.656
I see the value in peer review | 4 | 5 | 4.64 | 0.468
I enjoyed participating in peer review | 3 | 5 | 4.26 | 0.751
The class overall was a positive learning experience | 4 | 5 | 4.95 | 0.226
The class overall enhanced my critical reading skills | 4 | 5 | 4.89 | 0.311
The class overall was worth my time and effort | 2 | 5 | 4.87 | 0.529

Statement | No | Yes
I would recommend this course to a friend in the health sciences major | 0 (0%) | 39 (100%)

Note: Thirty-nine students completed the quality improvement survey. Statements were rated on a 5-point Likert-type scale ranging from strongly disagree (1) to strongly agree (5).
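
As a small illustration of how descriptive statistics like those in Table 1 are typically derived, the sketch below computes the mean and sample standard deviation for a single Likert-type item; the response vector is hypothetical and does not reproduce the study data.

```python
from statistics import mean, stdev

# Hypothetical responses (1 = strongly disagree ... 5 = strongly agree) for one item;
# the actual survey data are not reproduced here.
responses = [5, 5, 4, 5, 3, 5, 4, 5, 5, 4]

item_mean = mean(responses)   # arithmetic mean of the Likert ratings
item_sd = stdev(responses)    # sample standard deviation (n - 1 denominator)
print(f"mean = {item_mean:.2f}, SD = {item_sd:.3f}, low = {min(responses)}, high = {max(responses)}")
```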

Table 2.

Selected Illustrative Quotes by Theme.

Theme 1: Enabled a High Degree of Growth as Students and Scholars
Appreciation for Research and Desire to Be Critical Readers
“I also appreciated research much more as a field of study and as a professional concentration.” (Reflection 2)
“I now feel comfortable with reading these papers, and thus being critical about the data, methodologies, and analysis that the authors use.” (Reflection 18)
Perceived Growth as Learners and Scholars
“This class challenged me not only academically but also personally. Throughout this semester I was able to learn, grow as a scholar and as a person.” (Reflection 1)
“One of my greatest lessons from the class was learning how to improve my writing and oral presentation skills through reading papers and presenting.” (Reflection 1)
Journal Club and Methods Reports Approach Enabled Understanding of Research Methods
“Journal Club was most effective in helping me become a better critical reader of research articles and identify the research question, research methods, and study design.” (Reflection 7)
“The methods reports also helped me learn about health care research in a unique way.” (Reflection 15)
Application of Knowledge in Current and Future Coursework
“I didn’t realize how much I was learning in this class, until one day, when I was reading through research articles for a project in my genetics class. As I was going through different articles, I thought, ‘Wait, I am understanding what these papers are saying. This feels much easier than any research I have done before.’ I think this was a direct result of the knowledge I was gaining through this class.” (Reflection 5)
Theme 2: Designed in Thoughtful and Unique Format
Learners Appreciated the Unique Design which Enabled Learning
“I firmly believe that this class was one of the most insightful and relevant classes that I’ll attend at Northeastern.” (Reflection 11)
“Having the plus/deltas at the end of every class made me feel like a more active participant in my learning and someone who’s opinions were valued. I could see the class improving each time I came, instead of holding on to issues or concerns until the end of the semester.” (Reflection 3)
A Consistent Course Format with Some Variety Engaged Learners
“Having a varied format was a great way to keep each class interesting. Having a similar structure each time made each class predictable enough to be comfortable, but different enough to keep my attention. It became familiar to have a methods report, a methods lecture, a journal club presentation, a “who is this,” another journal club, then our plus-deltas.” (Reflection 2)
Peer Teaching, Review, and Feedback Were Appreciated, Valued, and Implemented
“Having to present in front of my classmates and receive critiques on my presentation of the journals and methodological topic definitely enhanced my public speaking skills and allowing me to speak with confidence when I am discussing research findings in studies.” (Reflection 30)
“Providing feedback was not only helpful to the person receiving the feedback, it also allowed me to able to learn from other presenters to improve my own communication skills.” (Reflection 7)
Theme 3: Initially Intimidated Students and was Academically Challenging
The Course Was Academically Challenging and Initially Intimidating
“To be completely honest, after the first day of this class, I was dreading the rest of the semester. First, I thought I already knew that research was not something I was interested in pursuing. Second, after hearing that most of our final grade would be based on these ‘journal club’ and ‘methods report’ presentations, I was terrified.” (Reflection 5)
“I found this course to be challenging both in the content and the assignments” (Reflection 36)
Theme 4: Prioritized and Enabled Psychological Safety
The Learning Environment was Comfortable and Supportive
“As the semester went on, I began to realize that this class fostered a supportive learning environment.” (Reflection 35)
“I could tell everyone wanted to be there each class, and I felt comfortable approaching classmates with questions outside of class or for further discussion of topics we addressed during class time.” (Reflection 3)

Enabled a high degree of growth as students and scholars

Students noted a high degree of perceived growth as students and scholars. They reported gaining an appreciation for research through the course and indicated that they now knew how important it was to be a critical reader of the literature. They indicated feeling that they grew as scholars and gained academic confidence, in part through improvements in both oral and written communication. The journal club self-directed structured summary provided an instructor-scaffolded approach that allowed students to “chunk” the article into smaller sections, thus enabling them to understand the article as a whole. This activity required significant self-directed learning and was positively received by students. Overall, students felt enabled to read research articles and understand research methods, which they attributed in part to the methods report presentations by students during each session. This was supported by responses to multiple survey questions, which indicated that self-directed structured summaries, methods reports, and presentations were an efficacious way to teach critical appraisal. Importantly, students reported utilizing skills learned in the course in another course that term, or anticipated that what they learned would be helpful in future classes or training.

Designed in thoughtful and unique format

Throughout their reflections, students offered praise for the thoughtful and unique course design, which they experienced as novel and greatly appreciated. Students found the mindfulness exercise at the beginning of each class refreshing and appreciated the opportunity to reset from a busy day and refocus energy on learning. Students were highly supportive of the multiple opportunities for class discussion and felt comfortable engaging with their instructor and peers. Guest speakers were welcomed, and students found them knowledgeable, inspiring, and helpful in expanding their breadth of learning. The plus/delta activity at the end of each class session to obtain student feedback was perceived as respectful, especially when changes were implemented during the next session. The consistent session structure was comfortable to students, who also noted finding the varied course activities and variety of topics stimulating. Finally—as was also indicated in the survey—the utilization of peer teaching and learning, including peer review, was valued. In many cases, students incorporated this feedback into future assignments.

Initially intimidated students and was academically challenging

Students admitted to entering the course with a high degree of apprehension and anxiety about the topic and did not feel prepared by prior courses. They expected to be bored and uninterested. They noted feeling overwhelmed by the syllabus and apprehensive about the multiple required presentations. Once the course was underway, they felt academically challenged by the course and assignments, and felt the level of work required was appropriate. Some disliked the methods report presentations and felt they were of lower educational value than other course activities.

Prioritized and enabled psychological safety

Students reported not being “stressed out” by the course, attributing this to the safe, welcoming, and comfortable learning environment created and sustained throughout the semester. They felt they could learn and grow and that the instructor and, subsequently, their peers were invested in their academic growth. They were comfortable engaging in discussion and noted doing so more than in other courses. Plus/Delta feedback was consistently received and was transparent—for example, noting instances when articles were too complex to facilitate learning, or times when the instructor could have been more effective in explaining key concepts. When possible, this immediate feedback informed iterative course changes, such as implementation of a 3-minute course break for restroom use and a reduction in the number of course readings. Overall, feedback was overwhelmingly positive, with students indicating that the instructor was high-energy and invested in their growth, and that their time and input as students were respected. These findings were supported by survey results, in which nearly all students reported the course being a positive learning experience worth their time and effort. Importantly—and as further evidence of the psychological safety enabled through the course—students noted that future improvements could include a wider variety of in-class activities, inclusion of a group project, more integration between the methods report and journal club structured summaries, and additional engagement through discussion and the limiting of student laptop use.

When considered in the aggregate, the 3 data sources converge and highlight the perceived positive impact of the course on learners, who found it challenging yet noted that the class was worth their time and effort. Moreover, they ultimately developed great respect for the instructor and the course design and would recommend the course to a friend in the same major. Perhaps most importantly, they both reported and took pride in their resulting growth as scholars and researchers.

Discussion

We describe the development, implementation, and evaluation of an undergraduate health care research course utilizing a journal club format, which Generation Y students indicated allowed them to better understand research articles and improve as scholars. Psychological safety was reinforced consistently, and students expressed comfort in and out of the classroom that led to their overall success in the course. Students also felt appreciated throughout the course, which in turn encouraged them to be more involved and enabled them to reach a higher level of learning. Students greatly appreciated that the instructor maintained a consistent course structure while still offering varied activities and covering diverse topics in each session. Importantly, students noted that improvement was possible and gave multiple recommendations for how it could be achieved.

Importantly, in the United States, the Accreditation Council for Graduate Medical Education mandates that programs support advancement in resident knowledge of basic scientific inquiry principles, including how research is designed, conducted, assessed, explained to patients, and applied to patient care.25 Those trainees who have a strong undergraduate research experience may be more likely to readily engage in research and quality improvement projects as trainees, and may be more equipped to encourage critical thinking and evidence-based medicine among their peers.

Our study has multiple limitations. First, this course was implemented in just 2 sections at 1 university and thus may not translate to other settings. Moreover, data from the 2 course sections were combined for more robust data analysis, and differences in student composition may have influenced class culture. The survey was created by the instructor and does not have existing validity evidence. We also did not design our study as research but instead report the evaluative results of an educational intervention. Finally, students were not asked how their learning experience compared with other courses at their institution, leaving no counterfactual or benchmark against which to compare the course evaluation.

Conclusion

Teaching styles that focus heavily on didactic teaching are misaligned with the teaching and learning preferences of Generation Y students. The student-led “journal club” course structure, scaffolded with a self-directed structured summary template, allowed students to gain mastery over the challenging task of critically appraising health care research articles in a way that was meaningful and impactful to them. This educational evaluation provides evidence that the course structure was well received by students and resulted in perceived growth as students and scholars. We attribute our success in part to the course’s high degree of psychological safety, which was noted, appreciated, and valued by students.

Acknowledgments

We acknowledge the students who participated in and evaluated the course.

Appendix 1

Journal club article self-directed structured summary template

Citation: Journal citation including authors, journal name, article title, volume, issue, and page numbers.

Summary: In no more than 4 lines, summarize the paper.

Background:

  • – Why do the authors feel their work is important and necessary? Briefly, what do we need to know about their research focus?

Purpose and Research Question:

  • – What is the purpose of the study? What is/are their research question(s)?

Aims or Hypotheses:

  • – What are the authors’ stated aims, objectives, or hypotheses?

Setting and Data:

  • – Was the study approved by an Institutional Review Board?

  • – What data did they use in their study? Where did they obtain or collect the data, and how? (eg, existing database, chart review, survey, interviews, focus groups)

  • – What was the population of individuals from whom they drew their data?

  • – How many respondents were in their population or sample? What was their response rate?

Research Design:

  • – What was their study design (eg, trial, prospective, retrospective, cross-sectional, cohort, case-control, secondary analysis of existing database, systematic review)?

Research Methods:

  • – Was the research quantitative, qualitative, or mixed methods?
    • If quantitative, what were their Dependent variable(s)? Independent variables? Control variables?
    • If qualitative, how did they analyze their data? (Thematic analysis, Grounded theory)
  • – Did they conduct statistical analyses?
    • If yes, what tests did they employ? (eg, t test, chi-square, analysis of variance, logistic regression, Cox proportional hazards models).

Results:

  • – What were the authors’ major findings?

Conclusions:

  • – What conclusions did they draw regarding their findings?

Compliments/Critiques/Comments:

  • – Was anything about the study unclear? Did you note any methodological challenges or concerns?

  • – Were the authors’ conclusions supported by their findings? Do you agree with the findings?

  • – Are the findings relevant to you as a future health care professional?

  • – Would you have changed anything about their research question or methods and if yes, what and why?

Note: Include page numbers where you found the information. If direct quotes from the original article are used, indicate and cite them properly.

Example: “Journal Club is a great way to review and discuss relevant and interesting journal articles” (p. 456).

Footnotes

Declaration of Conflicting Interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.

Authors’ Note: Institution at which the research was conducted: Northeastern University.

Author Contributions: KD developed the curriculum, designed the study, and collected the data. MF and KD contributed equally to the data analysis and results, and drafting of the final manuscript.

ORCID iD: Kristina Dzara https://orcid.org/0000-0001-9425-2679

References

1. Peachey AA, Baller SL. Ideas and approaches for teaching undergraduate research methods in the health sciences. Int J Teach Learn Higher Educ. 2015;27:434-442.
2. Hills CM, Levett-Jones T, Lapkin S, Warren-Forward H. Generation Y health professional students’ preferred teaching and learning approaches: a systematic review. Open J Occup Ther. 2017;5:12.
3. Chean C, Drake T, Bath M, et al. Medical research and audit skills training for undergraduates: an international analysis and student-focused needs assessment. Postgrad Med J. 2017;94:1-6.
4. Wickramasinghe S, Wickramasinghe K, Atukorale K, et al. Evaluation of research skills and attitudes about research skills training among medical students. Educ Prim Care. 2017;28:189.
5. Alguire PC, Anderson WA, Henry RC. Teaching research skills: development and evaluation of a new research program for residents. Teach Learn Med. 1993;5:37-43.
6. Sullivan SG, Hoiriis KT, Paolucci L. Description of a change in teaching methods and comparison of quizzes versus midterms scores in a research methods course. J Chiropr Educ. 2018;32:84-89.
7. Supino PG, Borer JS. Teaching clinical research methodology to the academic medical community: a fifteen-year retrospective of a comprehensive curriculum. Med Teach. 2007;29:346-352.
8. Secomb J. A systematic review of peer teaching and learning in clinical education. J Clin Nurs. 2008;17:703-716.
9. Leyva-Moral JM, Riu Camps M. Teaching research methods in nursing using Aronson’s Jigsaw Technique. A cross-sectional survey of student satisfaction. Nurse Educ Today. 2016;40:78-83.
10. Tetley J, Glover J. Use of experiential methods to teach research in a pre-registration nursing curriculum. Nurse Educ Today. 1999;19:633-638.
11. Selby ML, Tuttle DM. Teaching nursing research by guided design: a pilot study. J Nurs Educ. 1985;24:250-252.
12. Topf JM, Sparks MA, Phelan PJ, et al. The evolution of the journal club: from Osler to Twitter. Am J Kidney Dis. 2017;69:827-836.
13. Moharari R, Rahimi E, Najafi A, Khashayar P, Khajavi M, Meysamie A. Teaching critical appraisal and statistics in anesthesia journal club. QJM. 2009;102:139-141.
14. McLeod P, Steinert Y, Boudreau D, Snell L, Wiseman J. Twelve tips for conducting a medical education journal club. Med Teach. 2010;32:368-370.
15. Dzara K, Jain G, Soltys SM. The self-directed, structured summary as a teaching tool in a psychiatry journal club. Acad Psychiatry. 2012;36:490-492.
16. Dzara K, Frey-Vogel AS. Medical education journal club for the millennial resident: an interactive, no-prep approach. Acad Pediatr. 2019;19:603-607.
17. Fazzone PA. An experiential method for teaching research to graduate nursing students. J Nurs Educ. 2001;40:174-179.
18. Walker JT, Martin T, White J, et al. Generational (age) differences in nursing students’ preferences for teaching methods. J Nurs Educ. 2006;45:371-374.
19. Roberts DH, Newman LR, Schwartzstein RM. Twelve tips for facilitating Millennials’ learning. Med Teach. 2012;34:274-278.
20. Edmondson A. Psychological safety and learning behavior in work teams. Admin Sci Q. 1999;44:350-383.
21. Thomas PA, Kern DE, Hughes MT, Chen BY. Curriculum Development for Medical Education: A Six-Step Approach. Baltimore, MD: JHU Press; 2016.
22. Knowles M. The Adult Learner: A Neglected Species. 4th ed. Houston, TX: Gulf Publishing Company; 1999.
23. Rice GT. Hitting Pause: 65 Lecture Breaks to Refresh and Reinforce Learning. Sterling, VA: Stylus Publishing, LLC; 2017.
24. Ritchie J, Spencer L. Qualitative data analysis for applied policy research. In: Bryman A, Burgess B, eds. Analyzing Qualitative Data. London: Routledge; 1994:173-194.
25. Accreditation Council for Graduate Medical Education. Common program requirements. https://www.acgme.org/What-We-Do/Accreditation/Common-Program-Requirements. Published 2020. Accessed March 3, 2020.
