Abstract
Objective
To better understand the potential of a needs assessment approach using qualitative data from manikin-based and virtual patient simulation debriefing sessions compared with traditional data collection methods (ie, focus groups and interviews).
Design
Original data from simulation debrief sessions were compared and contrasted with data from an earlier assessment of critical care needs in a community setting (collected using focus groups and interviews); the study thus constituted a secondary analysis of existing data. Time and cost data were also examined. Debrief sessions were coded using deductive and inductive techniques. Matrices were used to explore the commonalities, differences and emergent findings across the methods.
Setting
Critical care unit in a community hospital setting.
Results
Interviews and focus groups yielded 684 and 647 min of audio-recordings, respectively. The manikin-based debrief recordings averaged 22 min (total=130 min) and virtual patient debrief recordings averaged 31 min (total=186 min). The approximate cost for the interviews and focus groups was $13 560, for manikin-based simulation debriefs was $4030 and for the virtual patient debriefs was $3475. Fifteen of 20 total themes were common across the simulation debriefs and interview/focus group data. Simulation-specific themes were identified, including fidelity (environment, equipment and psychological) and the multiple roles of the simulation instructor (educative, promoting reflection and assessing needs).
Conclusions
Given current fiscal realities, the dual benefit of being educative and identifying needs is appealing. While simulation is an innovative method to conduct needs assessments, it is important to recognise that there are trade-offs with the selection of methods.
Keywords: qualitative research, quality in health care
Strengths and limitations of this study
Simulation is an innovative methodology to undertake needs assessments.
Using simulation permits the development of an environment that enables the learner to perform naturally and gain insight into the complexity of the actual workplace.
The study helps address the relative dearth of qualitative work in simulation and medical education.
The study sample is relatively small, and the study was performed at a single centre.
The cross-sectional nature of the study does not permit generalisation.
Introduction
Calls for innovative strategies in conducting needs assessments (NAs) have been made in the medical literature over an extended period of time.1–5 A NA is a systematic process to collect and analyse information on a target group’s needs (ie, to examine gaps between current and desired situations).6 Simulation holds potential as a NA method to promote a better understanding of these gaps, given that it aims ‘to develop an environment that enables the learner to perform naturally to gain insight into the complexity of the actual workplace’7 (p. 59). Prior research has demonstrated that simulation permits trainees to live through a realistic experience, make mistakes in a safe environment and practise before they perform on real patients.8 9 Similarly, medical educators find simulation experiences to be stimulating and realistic and to provide opportunities for integrating basic clinical teaching with advanced problem solving, especially given the opportunity to reflect on the case after the simulation scenario.8 Through a process of experiential learning and deliberate practice, the use of simulation in health professionals’ education has been shown to consistently improve the acquisition of knowledge, skills and behaviours.10 11 However, there is a paucity of literature on the role of simulation in performing NAs, including its use to determine system-level and/or institutional-level gaps for change management. In addition, there is a general lack of qualitative simulation studies in medical education that compare simulation with more traditional qualitative methods.12–14
Recognition and care of critically ill patients in community settings are complex, requiring skilled staff and optimal use of resources at the site, as well as a coordinated system for interaction with, and transfer to, the referral centre when needed. In 2006, the Critical Care Strategy was announced by the Ministry of Health and Long-Term Care of Ontario, Canada. The purpose of this ongoing initiative is to improve access, quality and system integration to ensure all citizens of Ontario have equal access to high-quality critical care. In keeping with this mandate, a comprehensive NA was completed by members of the current research team, which identified gaps in caring for critically ill patients at a single community hospital.15 These results provided insights into the needs of a community to optimise care of its critically ill patients, as well as suggestions for how a referral hospital may best support its community site. However, the cost and time required to complete this study were substantial, and the process requires streamlining to be feasible to implement across numerous sites.
This earlier study included interviews, focus groups, manikin-based simulation (MBS) and virtual patient simulation (VPS), questionnaires and a family survey. Each MBS and VPS scenario was followed by a 20 min debrief session, which was video-recorded. These debrief sessions were not included in the comprehensive NA but were conducted as normal pedagogical practice, providing feedback to simulation participants and facilitating the development of their reflective skills.16 However, on reviewing the recordings, it was notable that many of the themes discussed in the larger NA were also identified by participants in these debriefs. This serendipitous finding suggested that simulation debriefs could be of value as NA data, either alongside or instead of traditional approaches. The overarching guiding research questions included: (1) how do the needs identified through simulation compare with those identified using traditional methods of NA data collection?, (2) can similar data be captured more efficiently in the simulation debrief session compared with lengthier traditional methods? and (3) what are the strengths and limitations of using simulation in NA?
Specifically, this study aims to better understand the potential of a NA approach using qualitative data from MBS and VPS debriefing sessions to explore the system-level, team-level and individual-level needs in caring for critically ill patients in a community context, compared with traditional methods (ie, focus groups and interviews). We also aim to compare feasibility in terms of time and cost.
Methods
Secondary analysis has been recognised as an important yet underused research approach.16 It has been defined as the reanalysis of an existing data set, which may be used to investigate new research questions or verify previous research findings.17 18 Using an exploratory qualitative design, the current research compared and contrasted original data from the simulation debriefs with data from the earlier assessment of critical care needs in a community setting, enabling exploration of the current research questions from our existing data.
Patient and public involvement
In keeping with a patient-centred approach to research, the research questions and outcome measures were developed with patient outcomes in mind. That is, by identifying feasible and perhaps more timely approaches to conducting NAs, earlier interventions can be implemented to facilitate patient care. It should be noted that neither patients nor patient advisors were involved in the recruitment or conduct of this study. Presentations at hospital medical rounds and continuing professional development sessions are the primary mechanisms to disseminate results to study participants.
Design and analysis
Original study data collection and analysis
The original mixed-methods study was conducted between June 2011 and February 2012. A conceptual framework, centred on the critically ill patient, guided the design and selection of the data collection instruments. The perspectives sampled included regional leaders, healthcare professionals at the community hospital and its referral hospital, as well as family members of patients who had received care at the community intensive care unit. Interviews and focus groups followed a semistructured, broad, predetermined line of inquiry that remained flexible, permitting exploration of emergent themes. Data from each interview and focus group were transcribed and entered into NVivo software, and inductive coding techniques were applied as informed by Creswell’s thematic analysis approach.19 The constant comparative method was used as data were analysed.18 Full information regarding the original study can be found in Sarti et al.15
Simulation
Simulations were conducted at the community hospital to obtain data on human and social capital there, including interdisciplinary team functioning, crisis resource management and critical care knowledge and skills.10 20 21 The simulation component of the NA consisted of two forms of simulation: MBS (eg, SimMan) and VPS (eg, interactive video with patient actors), each followed by a debriefing session in which an expert facilitator engaged participants in reflective, focused discussion of the scenario while simultaneously providing teaching.10 16 21–23 To maximise participants’ exposure to the various cases, each team completed two MBS and two VPS sessions. Canadian experts in critical care designed the scenarios to represent prototypical clinical encounters; the scenarios were originally developed for Canadian residents as part of the Acute Critical Events Simulation course. The scenarios, which included cases of impending respiratory failure, shock, sepsis and arrhythmias, were reviewed by an interdisciplinary panel, modified to reflect the realities of practice in the community hospital and video-recorded. To assess performance during simulation, custom task checklists and two validated global rating scales were completed.22 23 Only quantitative data from the simulations were included in the original NA,15 given that debriefs have not been described as NA tools.
Each MBS and VPS scenario was followed by a 20 min debrief session, which was video-recorded. The debriefs were designed to establish an engaging and supportive learning environment, promote facilitated reflection and discussion, explore performance gaps and provide feedback to the participants with respect to the scenarios.24 Facilitators used a blended approach, including focused facilitation to encourage critical reflection and deeper understanding of events, as well as directed performance feedback and teaching.25 In addition to the standard learner-centred debriefing, participants were encouraged to discuss their practice context and reality.
Time and cost analysis
Time for each of the data collection methods (interviews, focus groups and debriefs) was captured from audio files. Data on financial costs were captured in budgets and expenditure tracking documents, including equipment, travel expenses and hourly salary rates. MBS-specific costs included manikin rental and a rental van for transporting equipment. Both MBS and VPS required computer programs, a simulation instructor and a technologist. Travel was required for both forms of simulation and for the focus groups; the interviews from the earlier study were held via telephone. The debriefs, interviews and focus groups all required a facilitator, an audio recorder, a transcriptionist, and a researcher and research assistant to perform coding and thematic analysis. Investment costs for initial implementation of a simulation programme, annual operational maintenance and replacement expenses were not considered. Time and cost to prepare the interview/focus group guides and simulation cases were not included in the analysis, as there were not enough data available to estimate them accurately.
Secondary data analysis
Data analysis comprised secondary thematic analysis and comparative analysis.17 18 Comparative analysis enabled the data from the earlier study to be compared and contrasted with the MBS and VPS debriefs.
Thematic analysis of the debriefs was performed.26–28 Transcripts were entered into NVivo software, and codes identified in previous work/inquiry were applied to the data.19 To enhance study rigour, multiple coders coded the transcripts, including two researchers who were involved with coding in the original NA (AJS and SS) and one researcher who was not (RA). Researchers actively searched for disconfirming data and additional codes, using both inductive and deductive approaches. Themes and their definitions were decided through researcher discussion and negotiation. Qualitative data from the simulation debriefs were contrasted with the qualitative data obtained in the earlier NA (focus groups and interviews). The final analytic component included reading through all the transcripts in each data collection modality (traditional, VPS and MBS) to selectively identify areas of convergence and divergence in both the content and structure of the transcripts for each data collection method.29
Study rigour
Multiple strategies were employed to minimise threats to the validity/credibility of the study. Efforts were made to search for disconfirming evidence through the use of purposive sampling, with the selection of participants to provide a balanced representation of the collective group, including potential differences of opinion. Two forms of triangulation were employed to achieve a balanced perspective and enhance the reliability of the conclusions: (1) data source triangulation (using multiple data sources and informants) and (2) investigator triangulation (using more than one person to collect, analyse and interpret data).
Results
Participants
There were 31 participants in the focus groups (13 from the community hospital, 11 from the referral hospital and 7 in an interhospital focus group); these included 12 physicians, 14 nurses and 5 respiratory therapists (RTs). There were 22 participants in the interviews (2 regional leaders, 7 community hospital leaders and 13 referral hospital leaders). In the simulations, there were 13 participants from the community hospital (six physicians, six nurses and one RT) who formed six teams (see table 1).
Table 1.
Earlier comprehensive NA | |
Interviews | Total=22 |
Regional leaders | 2 |
Community hospital leaders | 7 |
Referral hospital leaders | 13 |
Focus groups | Total=31 |
Community hospital | 6 MDs, 6 RNs and 1 RT. |
Referral hospital | 4 MDs, 5 RNs and 2 RTs. |
Interhospital | 2 MDs, 3 RNs and 2 RTs. |
Simulation debriefs | |
Manikin-based simulations (MBS) | Total=13 (6 MDs, 6 RNs and 1 RT). |
Community hospital | 6 teams (1 MD, 1 RN±RT per team); each team performed two MBS cases. |
Virtual patient simulations (VPS) | Total=13 (6 MDs, 6 RNs and 1 RT). |
Community hospital | 6 teams (1 MD, 1 RN±RT per team*); each team performed two VPS cases.** |
*One VPS was completed by a physician alone (no other team member).
**One team completed only one of the two VPS cases.
MD, physician; RN, nurse; RT, respiratory therapist.
Time and cost analysis
The 22 interviews (average 31 min; range 15–48 min) and 6 focus groups (average 108 min; range 57–154 min) yielded 684 min and 647 min of audio recordings, respectively, for a total of 1331 min. The MBS debriefs averaged 22 min (range 17–30 min; total=130 min), and the VPS debriefs averaged 31 min (range 25–48 min; total=186 min). The results of the cost analysis are displayed in table 2. The total cost for interviews and focus groups was approximately $13 560, for MBS was $4030 and for VPS was $3475.
Table 2.
Items | Interviews/focus groups (FGs) | Virtual patient simulations (VPS) | Manikin-based simulations (MBS) |
Costs of running the simulations* | | | |
Rental van – bringing equipment to site | N/A | N/A | $550 |
Facility rates† | N/A | No charge | No charge |
Manikin daily rental fee | N/A | N/A | $500 |
Computer software program | N/A | $0 (newly developed software program; no licencing fee) | $0 (software program owned) |
Needles/gauze/syringes and so on for MBS | N/A | N/A | No additional charge; reusable materials. |
Simulation instructor‡ | N/A | $1002 ($1250 − $248; total daily cost minus debrief) | $1074 ($1250 − $176; total daily cost minus debrief) |
Technologist§ | N/A | $400 | $400 |
Subtotal | N/A | $1402 | $2524 |
Costs specifically required for the NA/debrief* | | | |
Facilitator | $1332 (22.2 hours × $60/hour) | $248 (3.1 hours × $80/hour) | $176 (2.2 hours × $80/hour) |
Travel to the site¶ | $360 ($120 × 3 visits to the site for focus groups) | $120 | $120 |
Audio recorder | No additional expense (about $250 if one must be purchased) | No additional expense | No additional expense |
Transcription** | $1434 (interviews: 11.4 data hours × 2.5 transcription hours per data hour × $20/hour = $570; FGs: 10.8 data hours × 4 transcription hours per data hour × $20/hour = $864) | $248 (3.1 hours × 4 × $20) | $176 (2.2 hours × 4 × $20) |
NVivo data entry†† | $1554 (22.2 × $35 × 2) | $217 (3.1 × $35 × 2) | $154 (2.2 × $35 × 2) |
Data analysis – coding and thematic analysis‡‡ | $8880 (22.2 data hours × 2 researchers at $80/hour × 2.5 hours per data hour) | $1240 (3.1 data hours × 2 researchers at $80/hour × 2.5 hours per data hour) | $880 (2.2 data hours × 2 researchers at $80/hour × 2.5 hours per data hour) |
Subtotal | $13 560 | $2073 | $1506 |
Total | $13 560 | $3475 | $4030 |
*Note all funds are reported in Canadian dollars.
†Facility rates at this site were not charged. Note that typical rental costs are between $200 and $300 per hour.
‡Cost assumes access to a trained instructor. Instructor training would be an additional cost. The daily cost for a simulation instructor is $1250. The cost of the debrief sessions has been separated in this table.
§Cost assumes access to a trained technologist. Training would be an additional cost.
¶Land travel at $0.54/km. Travel required for simulations and FGs (interviews were via telephone).
**Transcription costs: one-to-one interviews assume 2.5 hours of transcription per 1 hour of recording; focus groups and simulation debriefs assume 4 hours per 1 hour of recording. The transcriptionist rate is $20 per hour.
††NVivo data entry: research assistant salary $35 per hour – assumes 2 hours required per hour of data.
‡‡Data analysis includes researcher salary of $80 per hour. Considers two researchers for coding with approximately 2.5 hours for each researcher per hour of data collected.
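As a reader’s cross-check of the table arithmetic (our verification, not part of the original analysis), each column’s subtotals and total can be reproduced from the itemised figures; all amounts are in Canadian dollars:

```latex
% Cross-check of table 2 column arithmetic (CAD)
\begin{align*}
\text{Interviews/FGs:}\quad & 1332 + 360 + 1434 + 1554 + 8880 = 13\,560\\
\text{VPS:}\quad & (1002 + 400) + (248 + 120 + 248 + 217 + 1240) = 1402 + 2073 = 3475\\
\text{MBS:}\quad & (550 + 500 + 1074 + 400) + (176 + 120 + 176 + 154 + 880) = 2524 + 1506 = 4030
\end{align*}
```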
Comparative analysis
Data from the VPS and MBS debriefs contributed to 15 of the 20 total themes identified in the earlier study (see online supplement A). When comparing the top five themes in terms of highest frequency, two themes consistently appear across all three data collection modalities: knowledge, skills and abilities (KSAs) (NA interviews and focus groups: n=104, MBS: n=53, VPS: n=127) and solutions (NA interviews and focus groups: n=193, MBS: n=28, VPS: n=57). Similarly, when comparing the five themes with the lowest frequency counts, two themes appear across all data collection modalities: leadership (NA interviews and focus groups: n=23, MBS: n=6, VPS: n=10) and night/weekend (NA interviews and focus groups: n=48, MBS: n=5, VPS: n=27). Themes not identified with either form of simulation debrief included palliative/end-of-life care, patients postreferral hospital, lack of understanding, vision and family and patient thoughts. A descriptive matrix with the themes and representative quotes from the various data collection methods is presented in online supplement B. In general, for the themes common to both interviews/focus groups and simulation debriefs, similar high-level needs were identified, and similar overarching conclusions could be drawn from the simulation debriefs compared with the earlier NA. However, the earlier NA yielded more descriptive data, whereas data from the simulation debriefs were more direct and to the point.
As an exemplar, KSA was identified across all methods. A key gap identified within this theme was the management of respiratory failure and ventilation; it was identified in the interviews, focus groups and simulation debriefs. Key issues identified within this topic, in both the earlier study and the simulation debriefs, included basic and difficult ventilation strategies, troubleshooting and managing status asthmaticus. Weaning and lung protective strategies specifically were identified only in the interviews and focus groups. Both the earlier study and the simulation debriefs identified system-level gaps that contributed to this need, including the need for 24-hour RT coverage. While this need was identified in the simulation debriefs, greater depth of data surrounding the nature and impact of the lack of 24-hour coverage emerged during the focus groups. In the following focus group, participants discussed challenges of weaning patients:
We’ve been wanting to put patients on APRV at night and it makes it difficult because as they improve their volumes are going to get larger and it’s something that you really have to watch on the vent, and the nurses don’t. They’ll watch but they don’t really understand as much as what we do, the doctors have no idea, it’s just really us. We’re leery sometimes to put somebody on bi-level APRV, whatever you want to call it, because we’re not here 24 hours to watch the whole process happen.
The main themes identified from the simulation (not found in the interview/focus group data) were related to the fidelity of the simulation (environmental, equipment and psychological) and the role of the simulation instructor in teaching and promotion of reflection (see online supplement C). In addition, the theme of interruption was identified only in the MBS debriefs, which occurred when the facilitator interrupted a participant to provide teaching/impart knowledge.
In some instances, lower fidelity led to the discovery of gaps in practice. In the following example, the creation of an ‘unreal’ environment led to the discovery of a system-level gap. In this situation, the participant highlighted that blood work was returned quickly in the MBS, which does not match their reality, where delays may impact patient care:
The blood work is too long in [the community hospital]. It’s horrible. Like you can do a code for an hour and you won’t even know your potassium, your calcium, or your CBC; it’s just a disaster.
The role of the facilitator also produced several themes that emerged only within the MBS and VPS datasets. Unlike the traditional NA facilitator, the simulation facilitators carried out multiple roles. Two codes (promoting reflection and teaching) were evident in the educative roles the facilitator played. That is, the facilitator served to further engage the learners in the simulated scenario by promoting reflection through reflective cues. We defined reflection as the ‘process of learning through and from experience towards gaining new insights of self and/or practice’30 (p. 1). The following is an example of the facilitator providing reflective cues linking learning to the experience:
Facilitator: So that was an issue that was brought up by a couple of other nurses, not having an RT and not having ventilation. Having regular ventilation control, do you agree with that or do you have a different opinion?
Participant: I think there should be an RT 24/24 in this hospital.
The teaching code was also evident throughout both MBS and VPS. These educative remarks/exchanges were designed by the facilitator to impart knowledge to the participants rather than to cue reflection on their experience.
Facilitator: The only thing I point out to you is that sometimes we like to choose the gentler sedatives, but they’re going to need sedation then they just may need more adequate haemodynamic support as well.
Finally, a code that appeared only in the MBS data was ‘interruption’. This code highlighted the conflicting roles of ‘educator’ and ‘researcher’. During the simulation debriefs, the facilitator would at times interrupt the participants to provide education. In the following example, the participant starts to discuss a potential need for an oscillator (ie, a specialised ventilator). The instructor interrupts the flow of the simulation debrief with directed questioning to convey that this would not be required in their setting:
Participant: And we don’t have an oscillator if we truly needed one and we don’t…
Facilitator: Do you think you need an oscillator?
Participant: No, absolutely not.
In contrast, in the following quote, a focus group participant describes wanting the resources and skills to place Swan-Ganz catheters (a procedure not widely used in tertiary critical care). In this instance, the moderator, as is typical in interviews/focus groups, does not provide education but rather summarises and continues to probe to ensure understanding of the needs. The participants therefore leave with the same perspective, namely that this is perceived as a priority.
Participant: We are not utilizing for example using Swan-Ganz… I tried to put Swan-Ganz for some of my patients that I thought they need it but then most of the nurses said, well last time we had it was 10 years ago, lost experience with that and we don’t have the modalities… Maybe that will give the nurses more confidence when they do it more frequent.
Facilitator: So is that ongoing education of the nursing staff…
Participant: Absolutely, because that’s what the ICU needs.
A comparison of the three data collection methods (traditional, VPS and MBS) is displayed in table 3. The areas of convergence, where all three data collection modalities revealed the same element (to varying degrees), included variation in reflection and uncovering system-level barriers. Areas of divergence included time, structure, facilitator skill level and education (the degree to which education was ‘built in’ to the method). The two elements present only in the simulation data collection methods were the ability to conduct multiple cases in one session and the simultaneous multiple roles played by the facilitator.
Table 3.
Observation/notation | Traditional interviews/focus groups (FGs) | VPS | MBS |
Skill level of facilitator | Moderate | High | Extremely high |
Time (average duration) | Interviews: 31 min; FGs: 108 min. | 31 min | 22 min |
Structure | Inquiry involves continuous questioning and answers. | Multiple cases involving a structure of playing part of a case, stopping to debrief/discuss, playing more of the case, stopping to debrief, discuss and so on. | Two cases in 15 min with a 5/10 min structure, that is, 5 min devoted to what the participants thought about the scenario, did they like it, was it realistic and so on, then 10 min to reflect on the case regarding their own practice realities. |
Variation in reflection | Reflect on past experience. | Serves as a prompt to reflection on reality (not focused on VPS case). | Immediacy of reflection tied tightly/coupled to simulation scenario, thus creating a platform for: (1) reflection in/on simulation and (2) reflect on reality. |
Educative purpose | Low | High | High |
Roles of the facilitator | Single: researcher/needs assessor. | Triple role: (1) teaching (education), (2) reflection and (3) researcher/needs assessor. | Triple role: (1) teaching (education), (2) reflection and (3) researcher/needs assessor. |
Trade-offs with various roles of the moderator/facilitator | Not applicable. | Triple role=more potential for impact. | Triple role=more potential for impact, that is, if teaching and interrupt may lead to less data collected for the research purpose (ie, identifying needs). |
Uncovering system level barriers | Requires considerable time and perhaps multiple lines of questioning and/or interviews. | Moderate ability to probe system-level barriers (participants tend to waver and chat around many issues; not as streamlined and direct as simulation scenarios). | Streamlined to uncover system-level barriers. |
Technical difficulties | No occurrence in this dataset; limited possibility (eg, audio recorder failure). | One occurrence: a ‘technical glitch’ in one team’s session (eg, blood gas results did not come up), forcing the team to move on. | No occurrence in this dataset, but could happen; more technical components, hence likely greater risk than with traditional methods. |
Multiple cases at once | Not applicable. | Multiple cases. | One case per scenario. |
MBS, manikin-based simulation; VPS, virtual patient simulation.
Discussion
This study explored the potential use of MBS and VPS debriefs as NA tools and revealed that, under certain circumstances, debriefs may capture similar needs more efficiently, in terms of time and cost, than traditional methods of data collection (interviews/focus groups). Our investigation also highlighted various trade-offs that exist in selecting simulation as a NA method.
Time and cost
With respect to time, the simulation debriefs yielded a considerably shorter total length of audio recording (76% less time than interviews/focus groups). As such, the costs specifically required for the NA were substantially lower for the simulations than for the interviews and focus groups (73% less cost incurred). Even when the total costs of running the simulation cases before the debriefs are considered alongside the debriefs themselves, the overall cost remained lower, because transcription, NVivo data entry and data analysis costs scale with the larger volume of data collected by the traditional methods. It is notable that, for the cost of the simulations, multiple goals may be achieved: the observed simulation performance allows quantitative measurement of performance gaps, may serve as preintervention baseline performance data and may reveal additional unperceived performance gaps not otherwise captured in interviews, focus groups or debriefs, as demonstrated in our earlier study.15 It is important to note that the cost analysis did not include the initial investment costs or maintenance of a simulation programme; hence, if a programme were not already in place, the cost of simulation would be higher.31 The cost of a manikin-based simulator is substantially higher than that of a virtual patient simulator,32 an important consideration for those contemplating the use of simulation debriefs in NA.
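The reported percentages follow directly from the figures in the results; as a transparency check (our arithmetic, not additional data), the time reduction works out to 76.3% and the NA-specific cost reduction to 73.6%, consistent with the rounded figures cited above:

```latex
% Derivation of the reported time and cost reductions
\[
1 - \frac{130 + 186}{1331} = 1 - \frac{316}{1331} \approx 0.76
\qquad \text{(76\% less recording time)}
\]
\[
1 - \frac{2073 + 1506}{13\,560} = 1 - \frac{3579}{13\,560} \approx 0.74
\qquad \text{(NA-specific costs; 73.6\%, reported as 73\%)}
\]
```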
Comparative analysis
Even with substantially less time spent in the simulation debriefs, the majority of themes identified in the interviews and focus groups were also identified in the debriefs. Perhaps capturing needs is better accomplished when participants have an experiential and emotional encounter (possibly feeling more vulnerable), with the discussion occurring close to the event and promoting active participation. Theory underpinning the debriefs includes facilitating the transformation of experience into learning through reflection, where ‘the ultimate goal of debriefing is for learners to reflect on and make sense of their simulation experience and generate meaningful learning that translates to clinical practice’.25 Links between emotion and cognition have been suggested; hence, actively experiencing an event accompanied by intense emotions may result in long-lasting learning.30 33 Broadening the concept of participation, the literature increasingly recognises the importance of materiality (ie, objects and technologies) and relations (with social and material ‘forces’) through a sociomaterial approach to practice and learning.34 Fenwick argues that materials, often missing from accounts of learning, cannot be ignored, as they fundamentally shape human activity (medical practice and knowledge), further stating that ‘any medical practice is a collective sociomaterial enactment, not a question solely of an individual’s skill’34 (p. 48). With this approach, simulation provides a model setting to better understand complex medical practice, allowing the opportunity to identify needs at various levels (system/team/individual) and across various complex intertwined elements (material/social/cultural) within unique systems. As the learners work to make sense of the simulation experience in reference to their own world, there is the opportunity both to identify needs and to provide education. By identifying and interrupting matters that had previously felt settled, the so-called ‘black boxes that masquerade as matters of fact’ may be opened34 (p. 50).
Although the majority of themes (15 of 20) were identified in the simulation debriefs, a greater depth of data was captured through the more traditional methods. With NAs, initial data collection may inform subsequent data collection decisions.35 In addition, priorities must be set, which includes identifying the needs of greatest importance and those most amenable to change.35 Depending on the purpose and scope of a given NA, simulation debriefs may stand alone or may be used to decide whether more extensive data are required. Performing simulation debriefs may also help identify the highest priority needs and determine the initial set of needs to be targeted, in that the needs most readily uncovered may be of higher priority than those that require more probing and questioning.
The findings highlight that not all themes identified in the interviews and focus groups were captured in the debriefs. More specifically, palliative and end-of-life care was not identified in the debriefs, nor were the vision of participants or two themes relating to the interhospital interaction (patients postreferral hospital and lack of understanding). In addition, although the theme of patient transfers was identified across all methods, the relative frequency and depth of data were much lower in the debriefs than in the interviews and focus groups. This is an important yet not unexpected finding: the simulation cases were not specifically designed to explore end-of-life care or the interaction between the community and referral hospitals, in contrast to the traditional NA, which undertook a broad line of inquiry with probing into various aspects of critical care, including both end-of-life care and interhospital interactions. The debriefs also did not ask participants about their vision, and these data would be unlikely to emerge independent of directed inquiry. This finding highlights the risk of missing needs with simulation debriefs and demonstrates the importance of scenario selection and development.
Trade-offs
In this investigation, multiple interrelated roles of the simulation facilitator during the debriefs were identified, including promoting reflection, teaching participants and exploring gaps in practice. Despite using different cases, online supplement C reveals that the two simulation methods produced similar patterns in thematic frequency scores; the highest rated simulation-specific themes were reflection and teaching. Perhaps this finding is indicative of the method, whereby education is infused upfront in simulation. A strength of simulation debriefs may thus be that they can act simultaneously as an education tool and a data collection modality.
Simulation debriefs focus on transformative learning through self-reflection and may include individual and/or social engagement.30 The simulation debriefs capitalised on the social spectrum of reflection: through critical discourse between the facilitator and participants, needs/gaps were uncovered beyond individual and team performance, extending to the system level. Thus, a strength of using simulation debriefs may also be that they provide a tool for assessing needs across individual, team and system levels. Furthermore, this finding highlights the importance of structuring the debriefs to promote deeper reflection,36 hence potentially surfacing unknown unknowns that, combined with the quantitative data (normative needs) from the simulation, offer more depth than eliciting only felt needs (known unknowns).
It is important to note that having the simulation facilitator act in multiple roles inevitably presents challenges and trade-offs among these roles, which is a potential limitation of using debrief sessions in NAs. For example, in the traditional interviews and focus groups, the facilitator remains ‘neutral’ and does not provide education while pursuing questioning to better understand the needs.37 In contrast, in the simulation debriefs, the facilitator does not remain neutral, at times interrupting the participants to redirect and provide education, as evidenced by the emergence of the interruption code within the MBS data. Interruption was coded as instances in which the facilitator intentionally stopped the conversation to correct participants who were clearly discussing inaccurate content. When priority is given to the educative role, the facilitator risks not allowing the participants to explore and express details surrounding their needs. However, the educative element also promotes engagement through a collaborative approach, and participants may leave with a better understanding, having learnt something. Making transparent, thoughtful decisions about which methods to select, and recognising that each has advantages and disadvantages, is fundamental to performing NAs.37–40 If debriefs are to be more widely used in NAs, we need to better understand the trade-offs and their impact on the NA.
In this study, very experienced master instructors facilitated the debriefs. The quality of the debriefs may be linked to this, in that someone with less experience may not have been able to uncover these gaps while providing skilled education, which potentially limits the general use of debriefs in NA. How educators facilitate debriefings has been shown to be highly variable.41 Debrief facilitation also appears to be influenced by the professional background and style of the facilitators. In their exploratory investigation, van Soeren et al12 described how some facilitators assumed the role of an interprofessional guide, whereas others assumed the role of teacher, tending to impart their knowledge. This variability in facilitation is an important consideration for assessing needs, in that a facilitator with a style strongly oriented towards teaching may leave needs undiscovered. As simulation instructors interact with participants in collecting data for the NA, their role must be considered, as meaning is actively coconstructed.42 In addition, the skill level required of MBS and VPS facilitators may be different (ie, higher/more experienced) from that of a facilitator collecting data in a more traditional qualitative manner.
Strengths of our study include highlighting MBS and VPS as timely and potentially cost-efficient alternatives to traditional methods (interviews and focus groups), albeit under certain assumptions (ie, the research team had access to a simulation centre with predeveloped simulation scenarios for both the MBS and VPS sessions). This finding is interconnected with the issue of breadth and depth in data coverage. That is, the results of this study demonstrate similarities in the breadth of themes between traditional methods and simulation debriefs, with a notable difference in depth. Undeniably, the qualitative interviews and focus groups provided more depth and richness in the data than the simulation techniques, which were considerably shorter in terms of transcript coverage. However, simulation offers the added benefit of providing quantitative performance data that can serve as a baseline and can be triangulated with the debrief data.
This was an exploratory study, which included secondary analysis of an existing dataset. While secondary analysis has been recognised as an important, underused research approach, the method has limitations: the quality of the secondary analysis rests on the quality of the existing dataset.17 It is important to highlight that, as described, our earlier study was performed with a rigorous methodology, with numerous measures in place to ensure the quality and credibility of our findings. One concern noted in the literature is the potential ‘problem of data fit’.18 In the current study, the data were not originally collected for the current research objective; however, the available data were well positioned to answer the current research questions in an exploratory manner. In addition, ‘the problem of not having been there’ has been cited as a concern, in that challenges exist when the secondary researcher was not involved in the original data collection.18 Limitations of this study include the relatively small sample size and the focus on a single centre. Furthermore, while the results are comparable in terms of frequency of mention, they cannot be taken as strictly equivalent, given the qualitative approach employed in this study. Further research is required to better understand the utility of simulation as a NA tool, the design features required for NA and the types of needs best identified using this approach. Moreover, it will be imperative that various stakeholder groups participate in each type of data collection method to permit more definitive conclusions.
In conclusion, this investigation provides support for the use of simulation debriefs as a NA method to explore needs at the system, team and individual levels. Qualitative data collected during debriefs may be a suitable substitute for the typical interviews and/or focus groups. Simulation debriefs promote a participatory, collaborative approach with the educative function built in. Given current fiscal realities, the dual benefit of educating while identifying needs is appealing, although only under certain conditions. While simulation is an innovative and effective method of conducting NAs, it is important to recognise that there are trade-offs in the selection of methods, requiring careful scenario design and debriefing.
Acknowledgments
The authors are grateful to all the participants who gave their time to assist us with this study.
Footnotes
Contributors: AJS contributed to the study planning and conceptualisation and led data collection, data interpretation/analysis, manuscript development and review. RA contributed to the study planning and conceptualisation, interpretation/data analysis, manuscript development and review. SS contributed to the study conceptualisation, interpretation/data analysis, manuscript development and review. AL contributed to data collection, manuscript development and review. JK contributed to the study planning and conceptualisation, data collection, manuscript preparation and review. PC contributed to the study planning and conceptualisation, data collection, manuscript development and review.
Funding: This study was funded by a grant from The Ottawa Hospital Academic Medical Organization (TOHAMO).
Competing interests: None declared.
Patient consent: Not required.
Ethics approval: Ottawa Hospital Research Ethics Board.
Provenance and peer review: Not commissioned; externally peer reviewed.
Data sharing statement: No additional data are available.
References
1. Laxdal OE. Needs assessment in continuing medical education: a practical guide. J Med Educ 1982;57:827–34.
2. Norman GR, Shannon SI, Marrin ML. The need for needs assessment in continuing medical education. BMJ 2004;328:999–1001. 10.1136/bmj.328.7446.999
3. Mazmanian PE. Resources and studies are required to build knowledge on assessment, service, and health care. J Contin Educ Health Prof 2010;30:75–6. 10.1002/chp.20061
4. Palinkas LA, Horwitz SM, Chamberlain P, et al. Mixed-methods designs in mental health services research: a review. Psychiatr Serv 2011;62:255–63. 10.1176/ps.62.3.pss6203_0255
5. Gonsalves CL, Ajjawi R, Rodger M, et al. A novel approach to needs assessment in curriculum development: going beyond consensus methods. Med Teach 2014;36:422–9. 10.3109/0142159X.2013.877126
6. Watkins R, Meiers MW, Visser Y. A Guide to Assessing Needs. Washington: World Bank Publications, 2012.
7. Flanagan B, Nestel D, Joseph M. Making patient safety the focus: crisis resource management in the undergraduate curriculum. Med Educ 2004;38:56–66. 10.1111/j.1365-2923.2004.01701.x
8. Gordon JA, Wilkerson WM, Shaffer DW, et al. "Practicing" medicine without risk: students' and educators' responses to high-fidelity patient simulation. Acad Med 2001;76:469–72.
9. Larue C, Pepin J, Allard É. Simulation in preparation or substitution for clinical placement: a systematic review of the literature. J Nurs Educ Pract 2015;5:132–40.
10. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 2011;306:978–88. 10.1001/jama.2011.1234
11. Maran NJ, Glavin RJ. Low- to high-fidelity simulation – a continuum of medical education? Med Educ 2003;37(Suppl 1):22–8. 10.1046/j.1365-2923.37.s1.9.x
12. van Soeren M, Devlin-Cop S, Macmillan K, et al. Simulated interprofessional education: an analysis of teaching and learning processes. J Interprof Care 2011;25:434–40. 10.3109/13561820.2011.592229
13. Barry Issenberg S, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27:10–28. 10.1080/01421590500046924
14. Sanford PG. Simulation in nursing education: a review of the research. The Qualitative Report 2010;15:1006–11.
15. Sarti AJ, Sutherland S, Landriault A, et al. Comprehensive assessment of critical care needs in a community hospital. Crit Care Med 2014;42:831–40. 10.1097/CCM.0000000000000036
16. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA 1999;282:861–6. 10.1001/jama.282.9.861
17. Sales E, Lichtenwalter S, Fevola A. Secondary analysis in social work research education: past, present, and future promise. J Soc Work Educ 2006;42:543–60. 10.5175/JSWE.2006.200404136
18. Heaton J. Secondary analysis of qualitative data: an overview. Historical Social Research 2008;33:33–45.
19. Creswell JW. Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research. Boston, MA: Pearson, 2012.
20. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27:10–28. 10.1080/01421590500046924
21. Cook DA, Triola MM. Virtual patients: a critical literature review and proposed next steps. Med Educ 2009;43:303–11. 10.1111/j.1365-2923.2008.03286.x
22. Kim J, Neilipovitz D, Cardinal P, et al. A pilot study using high-fidelity simulation to formally evaluate performance in the resuscitation of critically ill patients: The University of Ottawa Critical Care Medicine, High-Fidelity Simulation, and Crisis Resource Management I Study. Crit Care Med 2006;34:2167–74. 10.1097/01.CCM.0000229877.45125.CC
23. Cooper S, Cant R, Porter J, et al. Rating medical emergency teamwork performance: development of the Team Emergency Assessment Measure (TEAM). Resuscitation 2010;81:446–52. 10.1016/j.resuscitation.2009.11.027
24. Cheng A, Donoghue A, Gilfoyle E, et al. Simulation-based crisis resource management training for pediatric critical care medicine: a review for instructors. Pediatr Crit Care Med 2012;13:197–203. 10.1097/PCC.0b013e3182192832
25. Eppich W, Cheng A. Promoting Excellence and Reflective Learning in Simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simul Healthc 2015;10:106–15. 10.1097/SIH.0000000000000072
26. Pope C, Ziebland S, Mays N. Qualitative research in health care: analysing qualitative data. BMJ 2000;320:114–6.
27. Ritchie J, Spencer L. Qualitative data analysis for applied policy research. In: Bryman A, Burgess RG, eds. Analyzing Qualitative Data. London: Routledge, 1994.
28. Huberman AM, Miles MB. The Qualitative Researcher's Companion. Thousand Oaks, CA: SAGE, 2002.
29. Miles MB, Huberman AM. Qualitative Data Analysis. Thousand Oaks, CA: SAGE, 1994.
30. Finlay L. Reflecting on 'reflective practice'. Practice-Based Professional Learning Centre, 2008. http://www.open.ac.uk/opencetl/files/opencetl/file/ecms/web-content/Finlay-(2008)-Reflecting-on-reflective-practice-PBPL-paper-52.pdf
31. Danzer E, Dumon K, Kolb G, et al. What is the cost associated with the implementation and maintenance of an ACS/APDS-based surgical skills curriculum? J Surg Educ 2011;68:519–25. 10.1016/j.jsurg.2011.06.004
32. Petscavage JM, Wang CL, Schopp JG, et al. Cost analysis and feasibility of high-fidelity simulation based radiology contrast reaction curriculum. Acad Radiol 2011;18:107–12. 10.1016/j.acra.2010.08.014
33. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc 2007;2:115–25. 10.1097/SIH.0b013e3180315539
34. Fenwick T. Sociomateriality in medical practice and learning: attuning to what matters. Med Educ 2014;48:44–52. 10.1111/medu.12295
35. Altschuld JW, Watkins R. A primer on needs assessment: more than 40 years of research and practice. New Dir Eval 2014;2014:5–18. 10.1002/ev.20099
36. Husebø SE, Dieckmann P, Rystedt H, et al. The relationship between facilitators' questions and the level of reflection in postsimulation debriefing. Simul Healthc 2013;8:135–42. 10.1097/SIH.0b013e31827cbb5c
37. Tipping J. Focus groups: a method of needs assessment. J Contin Educ Health Prof 1998;18:150–4. 10.1002/chp.1340180304
38. Ratnapalan S, Hilliard RI. Needs assessment in postgraduate medical education: a review. Med Educ Online 2002;7:4542–7. 10.3402/meo.v7i.4542
39. Crandall SJS. Using interviews as a needs assessment tool. J Contin Educ Health Prof 1998;18:155–62. 10.1002/chp.1340180305
40. Mann KV. Not another survey! Using questionnaires effectively in needs assessment. J Contin Educ Health Prof 1998;18:142–9. 10.1002/chp.1340180303
41. Tannenbaum SI, Cerasoli CP. Do team and individual debriefs enhance performance? A meta-analysis. Hum Factors 2013;55:231–45. 10.1177/0018720812448394
42. Ng S, Lingard L, Kennedy T. Qualitative research in medical education: methodologies and methods. In: Swanwick T, ed. Understanding Medical Education. Oxford, UK: John Wiley & Sons, 2013:371–84.
Supplementary Materials
bmjopen-2017-020570supp001.pdf (23.2KB, pdf)
bmjopen-2017-020570supp002.pdf (33.2KB, pdf)
bmjopen-2017-020570supp003.pdf (33.4KB, pdf)