BMJ Simulation & Technology Enhanced Learning. 2017 Oct 3;3(4):169–171. doi: 10.1136/bmjstel-2016-000186

Classifying simulation-based studies using the description, justification and clarification framework: a review of simulation conference abstracts

Alastair Campbell Graham 1, Helen Rachael Church 1, Deborah G Murdoch-Eaton 1
PMCID: PMC8936684  PMID: 35517832

Abstract

Introduction

Simulation-based medical education (SBME) is an accepted learning methodology with an ever-expanding evidence base. Concerns have been expressed that research output in SBME lacks explicit links to educational theory. Using the ‘Description, Justification and Clarification’ framework, we investigated the extent to which SBME conference abstracts declare the educational theory underpinning their studies.

Methods

Abstracts from four major international SBME conferences (for 2014 and 2015) were reviewed. Abstracts were classified using the framework offered by Cook et al, who used it to classify studies published in major educational journals. Clarification studies are those which specifically declare and test their underpinning educational approach.

Results

We reviewed 1398 conference abstracts which we classified as Description 54.4%, Justification 36.3% and Clarification 9.3%. The two most frequently declared educational theories were Cognitive Theories and Experiential Learning.

Conclusion

The low proportion of Clarification studies found in the SBME conference abstracts reflects previous findings highlighting the lack of medical education studies that establish how and why SBME works. Researchers should be encouraged to declare their underpinning educational theories when presenting their work. Conference organisers play an important role in facilitating this through allowing sufficient word count in their submission criteria.

Keywords: Simulation, Research, Healthcare, Education, Conference Abstracts

Introduction

In June 2010, an Utstein Style Meeting, held in Copenhagen, Denmark, brought together 20 experts from the global simulation community.1 This aimed to establish a research agenda for simulation-based healthcare education and emphasised the need for such research to be grounded in theoretical or conceptual educational frameworks. The meeting highlighted the integral role of educational frameworks in linking individual studies in a meaningful way and reinforced the value of simulation as a suitable environment in which to apply established theories in new contexts.

Cook et al2 proposed a framework to classify the purpose of medical education research into three categories: Description, Justification and Clarification. These categories are based on the underpinning scientific methods within a cycle of enquiry consisting of observation, formulation of a hypothesis to explain the results, testing of the hypothesis and obtaining results to feed into the next cycle of enquiry (figure 1).

Figure 1. The cycle of enquiry with classification of studies (adapted from Cook et al2).

Their framework was applied to a sample of articles from four leading medical education research journals and two specialty journals (one surgical and one medical) that frequently publish medical education research.2 Of these, 72% were Justification studies, 16% Description studies and 12% Clarification studies. Having demonstrated that clarification is uncommon in experimental studies in medical education, the authors published their framework and findings to stimulate education scholars to reflect on the purpose of their interventions and ask more clarification-style research questions.

Bordage3 states that ‘scholars are responsible for making explicit in their publications the assumptions and principles contained in the conceptual framework(s) they use,’ thus allowing scholars to build on each other’s work.

Description studies answer the question ‘What was done?’. They concentrate on observation, describing an intervention without comparison, and may report subjective and/or objective outcome data. An example would be a report of a novel simulation course that instructs a single cohort of physiotherapists and presents only course evaluation data.

Justification studies aim to answer the question ‘Did it work?’ and focus on the last part of the cycle of enquiry. They compare an intervention to an alternative or a control, including single-group preintervention and postintervention evaluation studies. However, Justification studies do not confirm or refute an educational theory or framework. Such a study may compare debriefing with and without the use of video playback, but does not test the underpinning educational theory.

Clarification studies encompass all steps of the cycle of enquiry and answer the question ‘How or why did it work?’. Such studies articulate and test the educational approaches or theories underpinning the intervention. For example, they may demonstrate an improvement in students’ clinical skill performance based on deliberate practice.

Sevdalis,4 in the inaugural editorial for BMJ Simulation & Technology Enhanced Learning (BMJ STEL), articulated a need to move away from studies presenting self-report data from small numbers of attendees towards a deeper theoretical and practical understanding of effective simulation-based training within health and social care. Without this theoretical understanding, practice in medical education will remain anecdotal, perpetuating traditional and historical pedagogies at the expense of approaches more likely to lead to learning. Studies that clarify the success or failure of a particular educational approach are critical to advance simulation-based medical education.

Abstracts represent the broadest and most up-to-date description of simulation-based studies. The mean/median time from presentation of an abstract to full publication has been reported as between 16.5 and 22 months, with 34.7%–51.2% of abstracts converted into peer-reviewed publications.5–8 We believe that conference proceedings can therefore provide a richer and wider source of data. The purpose of this study was to apply the framework to abstracts presented at the four major global simulation conferences to identify Description, Justification and Clarification studies and compare the results with those obtained by Cook et al.2

Method

The local ethics committee deemed that formal ethical approval was not required for this review. We reviewed all abstracts for 2014 and 2015 from the four largest simulation-focused conferences: the Association for Simulated Practice in Healthcare (ASPiH), the International Meeting on Simulation in Healthcare (IMSH), the Society in Europe for Simulation Applied to Medicine (SESAM) and SimHealth (Australasia). Full conference proceedings were obtained either in print or online for all conferences,9–16 and their respective submission guidelines were compared. A total of 1398 abstracts were reviewed. ACG and HRC independently classified the abstracts using the Description, Justification and Clarification framework according to the definitions given in the Introduction.

Following initial independent review, any differences in opinion were resolved by discussion and mutual agreement on the final classification. Where an abstract was classified as a Clarification study the educational approach was recorded. Inter-rater reliability was evaluated using Cohen’s kappa coefficient.
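As an illustration of the agreement statistic used here (a minimal sketch, not the authors’ analysis code), Cohen’s kappa corrects the observed proportion of agreement, p_o, for the agreement expected by chance, p_e, derived from each rater’s marginal label frequencies: kappa = (p_o − p_e)/(1 − p_e). The function name and rater labels below are hypothetical, chosen only to mirror the two-rater, three-category design of this study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical labels:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e is the agreement expected by chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items given the same label.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over categories of the product of each
    # rater's marginal proportion for that category.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical classifications of six abstracts by two raters
# (D/J/C = Description, Justification, Clarification); these are
# illustrative values, not the study's actual data.
labels_rater1 = ["D", "D", "J", "C", "J", "D"]
labels_rater2 = ["D", "J", "J", "C", "J", "D"]
print(f"kappa = {cohens_kappa(labels_rater1, labels_rater2):.2f}")
```

On this toy input the two raters agree on five of six abstracts (p_o ≈ 0.83) but some of that agreement is expected by chance (p_e ≈ 0.36), giving kappa ≈ 0.74; the same correction underlies the 0.81 reported below.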

Results

Conference abstract submission guidelines differed in both word count and content. Word limits were 300 words (ASPiH), 3500 characters (approximately 500 words) (SESAM), 600 words (IMSH) and 600–800 words (SimHealth), depending on the session to which the abstract was submitted. All conferences required a structured abstract, but none required a statement of the underpinning educational or theoretical framework.

Cohen’s kappa coefficient was 0.81, indicating strong inter-rater agreement across all conference abstracts.

Results from each set of conference abstracts are presented in table 1.

Table 1. Results of the classification of the purpose of simulation-based studies presented as conference abstracts

Conference       Description n (%)   Justification n (%)   Clarification n (%)   Total abstracts
ASPiH 2014       118 (57.6)           79 (38.5)              8 (3.9)              205
ASPiH 2015        83 (50.6)           66 (40.2)             15 (9.2)              164
IMSH 2014        207 (53.8)          142 (36.9)             36 (9.3)              385
IMSH 2015         68 (43.9)           68 (43.9)             19 (12.2)             155
SESAM 2014       102 (60.0)           54 (31.8)             14 (8.2)              170
SESAM 2015        76 (53.9)           54 (38.3)             11 (7.8)              141
SimHealth 2014    52 (58.4)           31 (34.8)              6 (6.8)               89
SimHealth 2015    57 (64.0)           21 (23.6)             11 (12.4)              89
Total            760 (54.4)          508 (36.3)            130 (9.3)             1398
Cook et al2       17 (16)             75 (72)               13 (12)               105

ASPiH, Association for Simulated Practice in Healthcare; IMSH, International Meeting on Simulation in Healthcare; SESAM, Society in Europe for Simulation Applied to Medicine.

There were 54 different educational theories identified from the conference abstracts. The 10 most commonly declared educational or conceptual frameworks (frequency) were:

  • Cognitive Theories (19)
  • Experiential Learning (13)
  • Gaming Theories (7)
  • Learning Styles (6)
  • Deliberate Practice (5)
  • Interprofessional Learning (4)
  • Mastery Learning (4)
  • Realism (4)
  • Self-regulated Learning (4)
  • Flipped Classroom (3)

Discussion

Our results support Sevdalis’4 assertion that simulation studies tend to present self-report data showing satisfaction with the simulation-based training session (Description, 54.4%) or simple comparative studies (Justification, 36.3%). The high percentage of Description studies may reflect the continued expansion of simulation within healthcare, whereby new centres wish to disseminate the details of their establishment, their range of simulation-based training programmes and their current research interests. Only 9.3% of abstracts articulated and tested how or why an educational approach worked, illustrating scope for those presenting their work, no matter how early in development, to declare the underlying educational framework. Grounding simulation-based research in an educational framework is important because it allows individual studies to be linked together in a more meaningful way.1

Being a teacher and researcher in medical education requires more than expertise in the content area; it also requires familiarity with, and use of, differing educational approaches.17 The Academy of Medical Educators acknowledges this in its ‘expected standards’ for medical educators for both teaching and educational research.18 These standards require medical educators to match educational methods and technologies to their intended learning outcomes, and those undertaking educational research are expected to demonstrate an awareness, understanding and application of educational theories and principles. Our review demonstrates that those conducting Clarification studies have applied a rich variety of educational approaches, with 54 different theories identified. The two most commonly identified were Cognitive Theories and Experiential Learning, which is not surprising given that these are two of the major educational theories relevant to SBME.19 However, it is encouraging that authors are exploring a wide variety of possible educational theories to enhance the delivery of SBME, for example, Gaming Theories and the Flipped Classroom. By highlighting the variety of educational approaches declared, we aim to encourage those using SBME to think creatively when applying educational approaches to their research.

Although reviewing abstracts highlights the most up-to-date research data in SBME, there are some associated limitations. The original classification framework proposed by Cook et al2 was developed for full journal articles. Abstract word count regulations inherently limit the detail of a study, and authors may choose to defer the details of the theoretical underpinning of their work to the subsequent oral/poster presentation or journal article. Due to the retrospective method used, confirmation of study categorisation at presentation was not possible. Reviewing only abstracts may therefore have decreased the sensitivity to identify Clarification studies. Having demonstrated the utility of the Description, Justification and Clarification framework, the next stage would be to apply it to published research articles.

Some of the variation in Clarification study identification among the conferences reviewed could be attributed to differences in submission criteria. For example, the 2015 conferences with the lowest word limits (ASPiH and SESAM) had the lowest rates of identifiable Clarification studies. Moreover, none of the conference submission guidelines required authors to declare their educational approaches. Therefore, increasing the word limit to that of IMSH and SimHealth, and requiring authors to declare the underlying educational approaches of their studies, could promote (and help identify) Clarification studies.

To advance SBME, we must build a more comprehensive and rich evidence base where researchers are encouraged to be creative in their educational approaches, publishing and sharing their findings whether successful or not. Within a collaborative community, the sharing of theory-rich studies can inform future innovative research to advance simulation-based education so that we achieve the goal of ‘moving the field forward.’4

Footnotes

Contributors: ACG initially devised the concept for the research project. He then acquired the abstracts from the relevant conference proceedings and classified them. He drafted and critically revised subsequent versions of the manuscript and approved the final version for submission, agreeing to be accountable for all aspects of the work.

HRC reviewed the abstracts and classified them. She carried out the statistical analysis on the raw data. She drafted and critically revised subsequent versions of the manuscript and approved the final version for submission, agreeing to be accountable for all aspects of the work.

DGME made substantial contributions to the interpretation of data. She critically revised subsequent versions of the manuscript and approved the final version for submission, agreeing to be accountable for all aspects of the work.

Competing interests: None declared.

Provenance and peer review: Not commissioned; externally peer reviewed.

Data sharing statement: There is no additional unpublished data in relation to this study.

References

  1. Issenberg SB, Ringsted C, Østergaard D, et al. Setting a research agenda for simulation-based healthcare education. Simul Healthcare 2011;6:155–67. doi:10.1097/SIH.0b013e3182207c24
  2. Cook DA, Bordage G, Schmidt HG. Description, justification and clarification: a framework for classifying the purposes of research in medical education. Med Educ 2008;42:128–33. doi:10.1111/j.1365-2923.2007.02974.x
  3. Bordage G. Conceptual frameworks to illuminate and magnify. Med Educ 2009;43:312–9. doi:10.1111/j.1365-2923.2009.03295.x
  4. Sevdalis N. Simulation and learning in healthcare: moving the field forward. BMJ STEL 2014:1–2. doi:10.1136/bmjstel-2014-000003
  5. Walsh CM, Fung M, Ginsburg S. Publication of results of abstracts presented at medical education conferences. JAMA 2013;310:2307–9. doi:10.1001/jama.2013.281671
  6. Bydder SA, Joseph DJ, Spry NA. Publication rates of abstracts presented at annual scientific meetings: how does the Royal Australian and New Zealand College of Radiologists compare? Australas Radiol 2004;48:25–8. doi:10.1111/j.1440-1673.2004.01243.x
  7. Gregory TN, Liu T, Machuk A, et al. What is the ultimate fate of presented abstracts? The conversion rates of presentations to publications over a five-year period from three North American plastic surgery meetings. Can J Plast Surg 2012;20:33–6. doi:10.1177/229255031202000118
  8. Meissner A, Delouya G, Marcovitch D, et al. Publication rates of abstracts presented at the 2007 and 2010 Canadian Association of Radiation Oncology meetings. Curr Oncol 2014;21:e250–4. doi:10.3747/co.21.1764
  9. Association for Simulated Practice in Healthcare. Association for Simulated Practice in Healthcare annual conference: abstracts. BMJ STEL 2014;1:A1–88.
  10. Association for Simulated Practice in Healthcare. Conference proceedings of the Association for Simulated Practice in Healthcare (ASPiH) annual conference. BMJ STEL 2015;1:A1–A70.
  11. Society for Simulation in Healthcare. Program innovations to be presented at the 14th annual International Meeting on Simulation in Healthcare. Simul Healthcare 2013;8:400–638.
  12. Society for Simulation in Healthcare. Research abstracts to be presented at the 15th annual International Meeting on Simulation in Healthcare. Simul Healthcare 2014;9:394–492.
  13. Society in Europe for Simulation Applied to Medicine. 20th anniversary meeting of the Society in Europe for Simulation Applied to Medicine: Poznań, Poland. http://pure.au.dk/portal/files/78913813/sesam2014_abstracts.pdf (accessed 16 Aug 2016).
  14. Society in Europe for Simulation Applied to Medicine. 21st anniversary meeting of the Society in Europe for Simulation Applied to Medicine: Belfast, Northern Ireland. http://pure.au.dk/portal/files/96352647/SESAM_2015_Full_Abstracts.pdf (accessed 16 Aug 2016).
  15. Simulation Australasia. SimTecT and SimHealth 2014 conference proceedings. http://www.simulationcongress.com/wp-content/uploads/2014_Proceeding_Papers.pdf (accessed 13 Feb 2017).
  16. Simulation Australasia. SimTecT and SimHealth 2015 conference proceedings. http://www.simulationaustralasia.com/files/upload/pdf/SimTecTandSimHealth2015ConferenceProceedings.pdf (accessed 16 Mar 2016).
  17. van der Vleuten CPM, Dolmans D, Scherpbier A. The need for evidence in education. Med Teach 2000;22:246–50.
  18. Academy of Medical Educators. Professional standards for medical, dental and veterinary educators. http://www.medicaleducators.org/Professional-Standards (accessed 13 Dec 2016).
  19. Ker J, Bradley P. Simulation in medical education. In: Swanwick T, ed. Understanding medical education. 2nd edn. Oxford: Wiley Blackwell, 2014:175–92.
