Abstract
Background
There are a limited number of emergency medicine (EM) physicians with expertise in education research. The Harvard Macy "step-back" method is an emerging model utilized to gather group feedback. Despite its use in multiple educational settings, few published data demonstrate its effectiveness.
Objectives
Our objective was to create and evaluate a national faculty development session providing consultation in education research utilizing the step‐back method.
Methods
This was a pilot study. EM experts in education research from across the country served as facilitators for a faculty development session held at the 2018 Council of Emergency Medicine Residency Directors Academic Assembly. Small groups consisting of two or three facilitators and one or two participants were formed, and each participant underwent a step-back consultation for their education research study. Participants wrote their study question before and after the session. After the session, facilitators and participants completed an evaluative survey consisting of multiple-choice, Likert-type, and free-response items. Descriptive statistics were reported. Qualitative analysis using a thematic approach was performed on free-response data. Participant study questions were assessed using the PICO (population, intervention, comparison, outcome) and FINER (feasible, interesting, novel, ethical, relevant) criteria. Both scales were evaluated using a two-way random-consistency intraclass correlation, and pre- and postsession scores were compared with a paired t-test.
Results
Twenty-four facilitators and 13 participants completed the step-back session. Evaluations from 20 facilitators and nine participants were submitted and analyzed. Sixteen of 20 facilitators felt that the step-back method "greatly facilitated" their ability to share their education research expertise. All facilitators and participants recommended that the session be provided at a future academic assembly. Regarding suggestions for improvement, qualitative analysis revealed five major themes: praise for the session, desire for additional time, a room setup more conducive to small group work, a greater number of participants, and more advance preparation by participants. Seven of nine responding participants felt that the session was "very valuable" for improving the strength of their study methods. Qualitative analysis regarding changes in studies as a result of the step-back session yielded four major themes: refinement of the study question, more specific outcomes and measurements, improvement in study design, and greater understanding of study limitations. Both PICO and FINER scale comparisons showed improvement from pre- to postintervention (PICO, 60% relative increase; FINER, 16% relative increase). Neither achieved statistical significance (PICO, t(5) = –1.835, p = 0.126; FINER, t(5) = –1.305, p = 0.249).
Conclusion
A national‐level education research consultation utilizing the step‐back method was feasible to implement and highly valued by facilitators and participants. Potential positive outcomes include refinement of study question, more specific outcomes and measurements, improvement in study design, and greater understanding of limitations. These results may inform others who want to utilize this method.
The field of education research in emergency medicine (EM) is burgeoning. There has been a call for increased methodologic rigor in education research.1, 2, 3, 4, 5, 6 An estimated 43% of academic EM faculty are primarily involved in education, but a relatively small number possess formal training or expertise in performing and disseminating education research.7 This problem is not unique to EM. Consensus groups of medical educators have concluded that lack of expertise and mentorship are among the most significant barriers to the production of high‐quality education research.8 Methods and venues that provide further training in education research techniques and spark cross‐institutional mentorship are needed to address these gaps.
The "step-back" method is a technique for developing and critiquing project proposals in a collaborative and objective fashion. It was proposed and described by Dr. Robert Kegan of the Harvard Macy Institute for Physician Educators in 2002 and has been used in the Institute's courses.9 During a step-back session, the project presenter provides a summary and then "steps back," allowing the other members of the group to take up and develop the project as though it were their own, without input from the presenter. At the end, the presenter returns to the conversation. This technique allows new ideas and perspectives to be fully entertained without being cut short by an immediate response from the presenter. The format encourages active engagement and collaboration, which have been shown to improve learning outcomes.10, 11, 12 Despite these potential benefits, little has been published about the efficacy of the step-back technique or its applicability to the EM education research community.
The Council of Emergency Medicine Residency Directors (CORD) Academy for Scholarship sought to provide faculty development in education research utilizing the step‐back method to EM educators at the national level. The objective of this study was to evaluate the efficacy of this approach at a national meeting to allow for the sharing of education research expertise and the development of a community of practice in EM education research. Secondarily, we sought to understand in what ways this technique may have aided the development of education research proposals.
Methods
Study Design
This was a pilot survey study of EM educators from across the United States. This study was approved by the institutional review board at Rush Medical Center.
Study Setting and Participants
This study was performed at the CORD Academic Assembly in April 2018. Attendees of the conference could sign up as participants for the special education research consult session utilizing the step-back method with their conference registration. The CORD Academy for Scholarship identified and recruited session facilitators with education research expertise from across the country. All facilitators were education faculty with a successful track record of publishing education research. Nominations for faculty facilitators were solicited from Academy leadership based on personal knowledge and agreed upon by consensus; nominees were then recruited by e-mail by the session directors. Participation in the session, whether as a participant or as a facilitator, and completion of the evaluative survey were voluntary.
Instrument Development
Two separate evaluation instruments, one for program participants and one for facilitators, were developed by members of the study group, expert EM education researchers, after a literature review to optimize content validity. Survey development followed established guidelines for survey research.13 The facilitator instrument consisted of five items including multiple-choice, Likert-type, and free-response items. The participant instrument consisted of six items including multiple-choice, completion, and Likert-type items. Items were discussed among the study group to ensure response process validity and piloted with a small group of representative subjects. Revisions for clarity and readability were made. Final versions of the evaluations are available in Data Supplement S1 (available as supporting information in the online version of this paper, which is available at http://onlinelibrary.wiley.com/doi/10.1002/aet2.10349/full).
Study Protocol
Enrolled participants were instructed to prepare a brief education research project idea or proposal in advance of the session. They were also provided with an overview of the session, including its goals and objectives and the step-back process. The objectives of the session were as follows:
To provide an opportunity for education researchers to have their projects and proposals reviewed by experts for methodologic issues, with a particular focus on research questions and study design.
To promote high‐quality education research by refining the next wave of projects.
To connect more junior researchers with senior experts to create the opportunity for mentoring in the future.
Facilitators were also oriented to the goals and objectives of the session and to how to perform the step-back consultation. During the session, participants were divided into small groups consisting of two or three facilitators and one or two participants. Each participant was asked to write down their research study question and subsequently underwent a step-back consultation for their education research study lasting approximately 20 minutes, with the goal of receiving targeted feedback on strengths, weaknesses, potential barriers, solutions, and next steps for their study. The components of the step-back consultation are shown in Figure 1. At the end of the session, participants rewrote their study question, and both facilitators and participants completed evaluative surveys.
Figure 1. Step-back consultation outline.
Data Analysis
Descriptive statistics were reported for items with discrete answer choices. Qualitative analysis using a thematic approach was performed by two analysts (JJ and MG) on data from free-response items. Data were reviewed line by line and assigned codes using the constant comparative technique.14 After independent review, the two analysts met to agree on a final coding scheme. This coding scheme was then applied to all data by each of the analysts. Interrater agreement was 87.5%. Discrepancies were resolved by in-depth discussion and negotiated consensus. Participant study questions were evaluated by two reviewers (KS and MG), blinded to the time at which each question was written, utilizing the PICO (population, intervention, comparison, outcome) and FINER (feasible, interesting, novel, ethical, relevant) criteria. Each question was evaluated for the "presence" or "absence" of each item in the PICO scale. Each question was also rated as "yes" or "no" for each component of the FINER criteria. Both scales were evaluated using a two-way random-consistency intraclass correlation. Pre- and postsession scores were compared with a paired t-test. All calculations were performed using SPSS v25 (IBM Corp.).
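For readers without SPSS access, the quantitative analysis can be reproduced with open-source tools. The sketch below is a minimal illustration in Python, not the analysis code used in this study (all study calculations were performed in SPSS v25); the scores are hypothetical, and the pingouin and SciPy routines are assumed substitutes for the SPSS procedures.

```python
# Illustrative re-creation of the ICC and paired t-test analysis.
# Requires: pip install pandas scipy pingouin
import pandas as pd
from scipy import stats
import pingouin as pg

# Hypothetical PICO scores (0-4 = number of PICO elements judged present)
# from two blinded reviewers rating the same six study questions.
ratings = pd.DataFrame({
    "question": list(range(1, 7)) * 2,
    "reviewer": ["A"] * 6 + ["B"] * 6,
    "score":    [2, 1, 3, 4, 2, 1,   # reviewer A
                 2, 2, 3, 4, 1, 1],  # reviewer B
})

# Intraclass correlation; the consistency form (SPSS "two-way random,
# consistency") corresponds numerically to the "ICC3" row in pingouin's output.
icc = pg.intraclass_corr(data=ratings, targets="question",
                         raters="reviewer", ratings="score")
print(icc[["Type", "ICC", "pval"]])

# Paired t-test on pre- vs. postsession question scores (hypothetical values);
# t is negative when postsession scores exceed presession scores, as reported.
pre  = [2, 1, 3, 4, 2, 1]
post = [4, 3, 4, 4, 3, 3]
t, p = stats.ttest_rel(pre, post)
print(f"t({len(pre) - 1}) = {t:.3f}, p = {p:.3f}")
```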
Results
General Results
Twenty‐four facilitators and 13 participants completed the step‐back session. Evaluations from 20 facilitators and nine participants were submitted and analyzed.
Participants
Seven of nine responding participants felt that the step-back discussion was "very valuable" for improving the strength of their study methods, one of nine felt that it was "moderately valuable," and one of nine felt that it was "a little valuable." No participant felt that it was "not valuable at all." Five of nine participants had never published an education research manuscript. For the remaining four participants, who had published education research previously, the mean (±SD) number of publications was 4.5 ± 1.73. All responding participants recommended that the session be provided again at a future CORD academic assembly, and the majority (8/9) would be willing to serve as a facilitator in the future. Seven participants responded to a question regarding their plans to contact members of their group who were not previously known to them to discuss their project in the future; three responded "yes," three responded "no," and one responded "maybe." When asked how their study changed as a result of the step-back session, qualitative analysis of responses yielded four major themes: refinement of the study question, more specific outcomes and measurements, improvement in study design, and greater understanding of study limitations.
Facilitators
Facilitators viewed the session positively, with 16 of 20 reporting that the step-back method "greatly facilitated" their ability to share their education research expertise and the remaining four reporting that it "somewhat facilitated" this ability. The majority of facilitators (19/20) considered the step-back session to be participation in a community of practice. All facilitators recommended that the session be provided at a future academic assembly and would participate as facilitators again. Qualitative analysis of a question regarding suggestions for improvement revealed five major themes: praise for the session, desire for additional time, a room setup more conducive to small group work, a greater number of participants, and more advance preparation by participants. Exemplar quotes include:
"Awesome! I'm going to do this at my med-ed research group at home."
"Need more time."
"This was well designed, suggestions for improvement include round tables and more participants."
"I think participants should submit their research questions in advance."
Participant Study Questions
Both the PICO and FINER scales showed good interrater reliability (PICO ICC = 0.88, p = 0.001; FINER ICC = 0.713, p = 0.025). Both scale comparisons showed improvement from pre- to postintervention (PICO mean [±SD] = 2.25 [1.37] preintervention vs. 3.59 [0.49] postintervention, a 60% relative increase; FINER = 4.25 [1.17] preintervention vs. 4.92 [0.204] postintervention, a 16% relative increase). Neither achieved statistical significance (PICO, t(5) = –1.835, p = 0.126; FINER, t(5) = –1.305, p = 0.249).
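As a check on the figures above, the reported relative increases follow directly from the pre- and postintervention means, assuming relative increase is defined as (post − pre)/pre:

$$\text{PICO: } \frac{3.59 - 2.25}{2.25} \approx 0.60\ (60\%); \qquad \text{FINER: } \frac{4.92 - 4.25}{4.25} \approx 0.16\ (16\%).$$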
Discussion
To the best of our knowledge, this is the first study to evaluate the feasibility and effectiveness of the step-back method for education research consultation. Given that clinician educators often face multiple competing demands on their time and may have limited local education research expertise available, this type of faculty development holds great potential to advance the field by providing instruction in education research methodology, dedicated feedback specific to an individual's project and needs, and an opportunity for collaboration and mentorship that might not otherwise have been accessible.15, 16, 17
Our pilot study found that this technique was feasible to implement and highly valued by facilitators and participants alike. Qualitative assessment identified multiple methodologic areas where improvements in research projects were made, from study question to research design to potential limitations, demonstrating the wide range of content that can be addressed through this modality. Another benefit of this session was the ability to create new educational networks for future collaboration, which has been recommended by education researchers as a strategy for success.18 The majority of facilitators perceived this activity to be participation in a community of practice. While a true community of practice requires multidimensional experiences over time between a group of like-minded educators, this session may be the first step in creating a community of practice among the participants.19 In fact, half of responding participants stated that they had plans to follow up with their group and continue the collaboration after the session.
Based on the PICO criteria, the quality of the initial questions designed by participants ranged widely. Not surprisingly, novice researchers often choose study topics based on interest and passion but do not always apply the PICO criteria, resulting in less rigorous study questions, e.g., "How can we improve medical student performance on a simulation scenario?" We expected that expert consultation would improve study question and design; however, while this study did show improvement in both scales, the improvement was not statistically significant. Based on the analysis by two reviewers, there was a 60% relative increase in conformity to the PICO criteria after the study intervention. Interestingly, the assessment of the FINER criteria did not demonstrate much improvement after the intervention, which may simply reflect that the questions were already worthwhile endeavors, just not rigorously developed. This is also supported by the qualitative analysis, which identified refinement of the study question, more specific outcomes and measurements, improvement in study design, and greater understanding of limitations as the improvements in participants' projects, rather than a change of topic or question content. A lack of power may also be contributory. It is important to note that these assessments evaluate only the study question; while the question is an extremely important part of a research study, there are many other components, as highlighted through the qualitative analysis, that are essential for a methodologically sound study and that the step-back method could impact. It remains unclear whether a 20-minute step-back consultation can achieve significant improvements; however, given that education research experts, particularly in EM, are not ubiquitous, the authors believe a forum at a national conference adds value to developing education research projects.
The main lesson learned was that, for a step-back exercise to be successful, conditions should be conducive to small group discussion. This pilot study occurred in a large hall with all of the small group discussions taking place in one room; this was identified as a concern by the participants, facilitators, and program organizers. Future large-scale step-back exercises would benefit from separating into small meeting rooms with round tables after initial group instruction and from adhering to optimal conditions for small group education.20, 21 Participants and facilitators both suggested allowing more time for the discussions; this requires further evaluation. In the studied session, time was allotted by the CORD Academic Assembly Program Committee and influenced by program leaders' desire to accommodate the greatest number of participants given the available facilitators. The ideal amount of time remains unclear; additional time would probably be more satisfying for all involved, but whether participants' research studies would improve significantly requires further investigation.
This pilot study suggests that the step-back method can be utilized to provide much-needed faculty development in education research content and methodology, as well as to create opportunities for collaboration and mentorship, thereby addressing identified needs to improve the quality and quantity of education research.1, 2, 3, 4, 5, 6, 7, 8, 18 Future studies should assess the most effective time frame and number of group members for this technique. Additionally, studies evaluating objective outcomes in the short, intermediate, and long term, such as methodologic changes, future collaborations, research presentations, and successful publications, will shed light on the true value of this program.
Limitations
It is important to consider several limitations of the current study. First, it was performed at a national conference of residency educators, and it is unclear whether similar benefits would be identified in a different educator population. Similarly, because no demographic data were collected, it is unclear in which populations this program may be most beneficial, though we suspect that it would have value for anyone seeking to improve their current level of knowledge in education research. Additionally, the sample size was relatively small; however, we believe this was acceptable given that this was a pilot study of a new research evaluation strategy and given the constraints of a conference setting. Finally, the current study limited each consultation to 20 minutes. Based on the feedback, it appears that more time would have been beneficial, and it is unclear how additional time would influence the technique's effectiveness.
Conclusions
A national-level education research consultation utilizing the step-back method was feasible to implement and was highly valued by both facilitators and participants. Potential positive outcomes include refinement of the study question, more specific outcomes and measurements, improvement in study design, and greater understanding of limitations. These results may inform others who wish to utilize this method.
Supporting information
Data Supplement S1. Evaluative surveys.
AEM Education and Training 2019;3:347–352
The authors have no relevant financial information or potential conflicts to disclose.
References
1. Sullivan GM, Simpson D, Cook D, et al. Redefining quality in medical education research: a consumer's view. J Grad Med Educ 2014;6:424–9.
2. Cook DA, Beckman TJ, Bordage G. Quality of reporting of experimental studies in medical education: a systematic review. Med Educ 2007;41:737–45.
3. Cook DA, Levinson AJ, Garside S. Method and reporting quality in health professions education research: a systematic review. Med Educ 2011;45:227–38.
4. Reed DA, Cook DA, Beckman TJ, Levine RB, Kern DE, Wright SM. Association between funding and the quality of published medical education research. JAMA 2007;298:1002–9.
5. Chen FM, Bauchner H, Burstin H. A call for outcomes research in medical education. Acad Med 2004;79:955–60.
6. Lurie SJ. Raising the passing grade for studies of medical education. JAMA 2003;290:1210–2.
7. Jordan J, Coates C, Clarke S, et al. Exploring scholarship and the emergency medicine educator: a workforce study. West J Emerg Med 2016;18:163–8.
8. Yarris LM, Juve AM, Artino AR, et al. Expertise, time, money, mentoring and reward: systemic barriers that limit education research productivity from the AAMC GEA workshop. J Grad Med Educ 2014;6:430–6.
9. Harvard Macy Institute. Twenty Years of Innovation: Educating to Innovate in Healthcare. Available at: http://www.harvardmacy.org/images/documents/hmi_20_years_final.pdf. Accessed November 7, 2018.
10. DeClute J, Ladyshewsky R. Enhancing clinical competence using a collaborative clinical education model. Phys Ther 1993;73:683–9.
11. Hake RR. Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am J Phys 1998;66:64–74.
12. Michael J. Where's the evidence that active learning works? Adv Physiol Educ 2006;30:159–67.
13. Rickards G, Magee C, Artino AR Jr. You can't fix by analysis what you've spoiled by design: developing survey instruments and collecting validity evidence. J Grad Med Educ 2012;4:407–10.
14. Bradley EH, Curry LA, Devers KJ. Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Serv Res 2007;42:1758–72.
15. Farley H, Casaletto J, Ankel F, et al. An assessment of the faculty development needs of junior clinical faculty in emergency medicine. Acad Emerg Med 2008;15:664–8.
16. Welch J, Sawtelle S, Cheng D, et al. Faculty mentoring practices in academic emergency medicine. Acad Emerg Med 2017;24:362–70.
17. Gottlieb M, Dehon E, Jordan J, et al. Getting published in medical education: overcoming barriers to scholarly production. West J Emerg Med 2018;19:1–6.
18. Jordan J, Coates WC, Clarke S, et al. The uphill battle of performing education scholarship: barriers educators and education researchers face. West J Emerg Med 2018;19:619–29.
19. Lave J, Wenger E. Situated Learning: Legitimate Peripheral Participation. New York, NY: Cambridge University Press, 1991.
20. Tomcho TJ, Foels R. Meta-analysis of group learning activities: empirically based teaching recommendations. Teach Psychol 2012;39:159–69.
21. Walton H. Small group methods in medical teaching. Med Educ 1997;31:459–64.