Physiotherapy Canada. 2018 Summer;70(3):272–273. doi: 10.3138/ptc.2017-11-cc

Clinician's Commentary on Melling et al.1

Christine Léger 1
PMCID: PMC6158561 PMID: 30311916

Take a minute. Think back to the most memorable moment you had as a learner in your physiotherapy programme—whether it was feeling overwhelmed by taking a history from a difficult patient, feeling excited when hearing the benefits of a cardiorespiratory treatment on a chest assessment, or feeling concerned when palpating an abnormality while examining a patient. Physiotherapy is a profession based on clinical reasoning, hands-on skills, and patient–family interactions. For each of these moments, simulation-based training that provides experiential learning could be an ideal modality as it can integrate all these skills.

I have had the benefit of reading Melling and colleagues'1 article through two different lenses: first, that of a critical care physiotherapist with 10 years of experience and, second, that of an educator who joined the simulation world a year ago. Research in simulation has typically occurred in the nursing and medical fields, and Melling and colleagues sought to shed light on the currently limited reporting on the value of simulation to physiotherapy programmes in Canada. Their findings are similar to those of other studies on simulation, which have found variation in how the term fidelity is defined, in how simulation is implemented, and in the barriers to and benefits of using it. For me, the article raises two questions: What is this work a starting point for? Where do we go from here?

Do We Need to Define the Term Fidelity?

From my work as a simulation educator, Melling and colleagues'1 findings that fidelity is not clearly defined and that it varies widely across programmes were not a surprise because this term has been a source of debate in the simulation world for years. Interestingly, Hamstra and colleagues2 suggested that we abandon the term fidelity altogether. Those authors took issue with the idea of fidelity because it was “defined as the degree to which a simulator looks, feels, and acts like a human patient … emphasiz[ing] technological advances and physical resemblance over principles of educational effectiveness.”2(p387) In other words, simulation educators' focus should be less on the nice, shiny, wireless manikin and more on meeting the objectives of the students' education and optimizing learning transfer to clinical practice.

Instead of fidelity, Hamstra and colleagues2 recommended using the term functional task alignment—that is, aligning the simulation task with the clinical task to enhance transfer to clinical application. Let us use tracheal suctioning as an example. If the objective were for the student to perform the appropriate technique and identify the carina or an obstruction, one would choose a simulator that could best mimic what that would feel like in real life—whether it were a manikin, a pig's trachea, or a polystyrene foam cup. Regardless of the simulator, the educational effectiveness can be the same, provided the learners are engaged, able to suspend their disbelief, and willing to buy into the simulation.

When designing simulation scenarios for educational purposes, we should focus on ensuring that a scenario meets specific objectives (technical, non-technical, or both) and, instead of fidelity, think in terms of realism relevant to the learning objectives. We want a learner to buy into the realism of the scenario, whether it is physical, conceptual, or emotional. Because the field of simulation has already grappled with fidelity and produced other terms—functional task alignment and suspension of disbelief—perhaps the effort put into defining fidelity could be better spent considering how learning objectives align with the training modality and evaluating those modalities to ensure that the learning objectives have been met.

How Are People Designing and Being Trained to Deliver Simulation-Based Training?

Melling and colleagues1 conducted high-level interviews that explored the use of differing simulation technologies in physiotherapy, although they did not ask how simulation helped to improve physiotherapy training and assessment. To focus more on the latter, it would be useful to understand how physiotherapy educators are using simulation to create an educational experience—specifically, how they design, implement, and evaluate their sessions. For example, are sessions designed as peer-to-peer training? “See one, do one, teach one”? Case-based versus deliberate-practice or mastery-based skill training? How many facilitators are there? How are facilitators trained?

Although one element of design, debriefing, was deemed to be important across all programmes, in my experience as a simulation educator, debriefing can be a difficult skill to develop. I would be curious to know whether the clinicians and educators taking part in simulation have been taught to debrief effectively. PEARLS (Promoting Excellence and Reflective Learning in Simulation) is one widely used debriefing framework; it offers a blended approach that accounts for the experience of the facilitator, the insights of the learner, the time available to debrief, and the session objectives.3 The skill of debriefing differs from that of providing feedback, and, as a clinician, I believe developing it would be valuable not only for those involved in simulation but also for those providing clinical placements for students.

Where Do We Go from Here?

The article provides an excellent overview of how simulation is used in physiotherapy programmes. As a simulation educator, I recommend that those interested in using simulation as a training modality, or in researching its effectiveness, do the following:

  1. Study how simulation enhances learning transfer and how that transfer can affect other domains, such as patient care and safety;

  2. Examine how well educators who develop simulation scenarios or programmes understand best practices in simulation, thereby ensuring that their learners and their patients derive the most value from the experience;

  3. Learn what resources are available across local and affiliated centres to optimize how we support teaching for learners as well as for new graduates and colleagues engaged in continuing education and lifelong learning; and

  4. Integrate ourselves into an existing simulation community of practice so that we leverage existing evidence and learn from and with like-minded individuals.

References

  1. Melling M, Duranai M, Pellow B, et al. Simulation experiences in Canadian physiotherapy programmes: a description of current practices. Physiother Can. 2018;70(3):262–71. doi:10.3138/ptc.2017-11.e
  2. Hamstra SJ, Brydges R, Hatala R, et al. Reconsidering fidelity in simulation-based training. Acad Med. 2014;89(3):387–92. doi:10.1097/ACM.0000000000000130. Medline:24448038
  3. Eppich W, Cheng A. Promoting Excellence and Reflective Learning in Simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simul Healthc. 2015;10(2):106–15. doi:10.1097/SIH.0000000000000072. Medline:25710312
