Author manuscript; available in PMC: 2026 Feb 1.
Published in final edited form as: Simul Healthc. 2024 Mar 21;20(1):42–47. doi: 10.1097/SIH.0000000000000793

Using Mobile and Remote Simulation as a Research Methodology for Healthcare Delivery Research

Vicky J-H Yeh 1, Aysun Tekin 2, Ellen Green 3, Elizabeth Reifsnider 4, Alicia Lewis 5, Morgan Carver 6, Yue Dong 7
PMCID: PMC11415535  NIHMSID: NIHMS1963738  PMID: 38506500

SUMMARY STATEMENT

Mobile and remote simulation can be used as a research methodology to collect data in simulated environments to answer research questions pertaining to healthcare delivery. This methodology can substantially expand the pool of reachable study participants and support generalizable conclusions. Using a large-scale national study in the United States (U.S.) as an exemplar, this article outlines the technology and equipment required to conduct mobile and remote simulations for research purposes. The costs associated with using mobile and remote simulations, as well as the advantages and challenges of this research methodology, are also discussed.

INTRODUCTION

Healthcare simulation has traditionally been used to provide training to learners in the health professions. To expand the reach of simulation education, the concepts of mobile simulation (bringing simulation to learners, e.g., in situ simulation) and remote simulation (delivering simulation from a remote location) have been increasingly utilized to provide clinical training in areas that lack simulation facilities and/or educators.1 To date, research using mobile and remote simulations has largely been conducted either to test learning outcomes or to identify safety threats in clinical areas.2,3 Less has been reported about using mobile and remote simulations as a research methodology to collect data in a simulated environment to answer research questions pertaining to healthcare delivery, e.g., how treatment may vary between clinicians in different parts of the country. Just as mobile and remote simulation can increase the reach of education programs, it can also improve the generalizability of study results by increasing the geographical representation of study participants.4 In this article, we outline the technology and equipment required to conduct mobile and remote simulations for research purposes, using our experience conducting a large-scale national study in the United States (U.S.) as an exemplar. We discuss the costs associated with using mobile and remote simulations as well as the advantages and challenges of this research methodology.

CONCEPTUAL ANALYSIS - MOBILE AND REMOTE SIMULATIONS

In this paper, we define mobile and remote simulations based on the Healthcare Simulation Dictionary.5 Minor modifications were made to the definitions to fit the use of mobile and remote simulations in our study. Mobile simulation is defined as “a simulat[ion] that can be transported relatively easily and often refers to digital simulations with minimal equipment, without manikins”.5,6 Remote simulation is defined as “simulation performed with the [researcher] in an offsite location separate from other members [delivering the simulation] to complete [simulated] activities”.5,7,8 These approaches differ from more traditional modalities (conducting simulation activities at a local simulation center) in that their delivery is not limited by the availability of a local simulation facility. This is particularly useful in a research context because mobile and remote simulations can expand the pool of reachable study participants when the intent of the study is to collect a diverse sample of participants (clinicians) and provide generalizable conclusions.

EXEMPLAR STUDY

In the following paragraphs, we describe our experiences using mobile and remote simulations to successfully implement 14 simulation experiments across 11 states in the U.S. between October 2021 and December 2023.

In our study, we employed mobile simulation by bringing equipment on the road and implementing simulation outside of simulation centers (e.g., convention centers) in different cities in the U.S. We employed remote simulation by collaborating and training a simulation team at a remote simulation center (e.g., a simulation center in a different city) to deliver the simulation without our physical presence. For both mobile and remote simulations, we recruited standardized patients (SP) residing close to where the simulation would take place and trained them remotely (e.g., using video conference). All simulations were in-person, i.e., participants and SPs interacted face-to-face.

This study was funded by the National Institute on Aging. The research protocol received approval from the Institutional Review Boards (IRB) at Arizona State University (ASU) and at Mayo Clinic, ensuring that it adhered to ethical standards and guidelines for human subjects research.

Objective of Using Mobile and Remote Simulation

The premise of the exemplar study was to investigate decisions made by primary care providers (PCP) during patient encounters. Examples of the decisions of interest were treatment prescribed, referrals made, and patient education provided. The PCP in this study included licensed physicians (both medical doctors [MD] and doctor of osteopathic medicine [DO]), physician assistants/associates (PA), and nurse practitioners (NP).

The objective of using mobile and remote simulations as the research methodology for our study was twofold. First, we aimed to maximize our ability to enroll PCPs of different professions (e.g., MD/DO, PA, and NP) practicing in a wide range of primary care settings (e.g., different states, urban and rural areas, private practice, public clinics, academic medical centers, etc.). Second, we intended to create controlled and repeatable simulated clinical environments and patient encounters for data collection.

In the following paragraphs, we describe the key considerations of implementing mobile and remote simulations in three major phases: 1) planning phase, 2) data collection phase, and 3) data analysis phase. We aim to provide insights from our experience and suggestions for future researchers who wish to adopt this research methodology for healthcare delivery research.

Planning Phase

Simulation Scenario Creation

When creating the simulation scenarios, we found it necessary to consider two things: first, the validity of the scenarios, and second, the feasibility of delivering and recreating them over time in different locations. To provide content validity evidence for the scenarios, we worked with a panel of experts in primary care (2 MDs, 1 NP, and 1 PA) to create multiple standardized patient-based scenarios commonly seen in the primary care setting. Agreement within the expert panel provided content validity evidence for the scenarios created.

In terms of the feasibility of delivering and recreating the scenarios, we found there are two key considerations. First, the availability of standardized patients (SP) of a specific demographic should be considered. We found it challenging to recruit age-appropriate SPs to portray young adults. Individuals in this age range generally have less flexibility in working an all-day simulation due to conflicts with class or work schedules. Additionally, it is important to consider the differing practices among PCPs. For example, some procedures such as a Pap smear are a daily practice for some PCPs and a referral for others. These details should be considered carefully so that the scenarios can be replicated at different locations and are relevant to all potential study participants.

Experiment Site Selection

As our target population was PCPs, we selected study sites that our potential study participants could reach conveniently. These included conference centers where annual meetings for PCPs were held and hospital/university-based simulation centers. By recruiting from primary care conferences (e.g., the annual convention of the American College of Osteopathic Internists), we were able to enroll participants practicing across the nation. Using hospital/university-based simulation centers allowed us to recruit local PCPs of different professions and significantly reduced setup time and costs. Depending on the simulated scenario and the space required to simulate the desired healthcare setting, researchers could also consider utilizing office spaces, regular conference rooms, or a large open space such as a warehouse or lecture hall. The ability to reconfigure the rooms and rearrange furniture, adequate power outlets, stable Wi-Fi, and convenient parking for participants are all important considerations in selecting the site for the simulation experiment. Due to inherent differences between study sites, differences in the setup cannot be avoided (e.g., size of the room, tables, location of the cameras, etc.). Nevertheless, the equipment and supplies made available (e.g., stethoscopes, otoscopes, etc.) and the documentation software used were consistent across sites and between mobile and remote simulations.

We typically contacted the hosting professional organization 9–12 months in advance to discuss opportunities to recruit participants during the conference. The costs involved varied: we have paid the organization to utilize reserved but unused space, and we have also paid the conference host hotel directly to reserve space. For simulation centers, we contacted the center managers directly to inquire about utilizing their space to conduct research studies. The associated cost ranged from free of charge (at the researchers’ home institutions, ASU and Mayo Clinic) to paying for the simulation center personnel’s time. Some organizations allowed us to recruit via attendee listservs and/or social gatherings, while others did not. These are costs and benefits that we had to weigh when selecting the experiment sites.

Recruitment and Training of Standardized Patients

We hired SPs who lived close to the selected experiment sites and trained them remotely. We had good success recruiting from local simulation center SP programs (typically at local medical and nursing schools or hospitals). To ensure consistent portrayal of the case, we developed and conducted a standardized three-part SP training program: (1) online self-paced study of case materials (reviewing patient information documents, memorizing scripts, reviewing an exemplar video recorded from a previous experiment, and completing an online quiz on critical items in the case); (2) a 1-hour video conference training that included an introduction to the study, time to clarify questions, and a dry run of the case, conducted by one of our simulation team members, with a second training scheduled as needed; and (3) an on-site refresher on the first day of the study to reinforce key case information and do a quick run-through of the script. All SPs were trained by authors AT, VY, and YD. During the experiment, the study team also provided feedback to SPs when any performance variation from the script was noted. Team debriefings within the research team after each day and each experiment were also conducted to discuss and implement any SP performance reinforcement needs identified. The SPs were compensated for completing the three-part training and were paid hourly at a competitive rate matching or above the local SP rate.

Because SPs were an integral part of our study, we strove to recruit one primary SP and one backup SP for each scenario. When it was difficult to recruit SPs, we hired one backup SP who was trained to portray two patient scenarios; this was possible because the patients in two of our cases were of the same gender and age range. When their participation as an SP was not needed, we hired them as assistants to help facilitate the simulation. During our study, we were able to establish good relationships with the SPs with whom we worked. As national conferences are usually held repeatedly in certain U.S. cities (e.g., Orlando), we were able to recruit the same SPs for multiple experiments, which could lower the variability of SP performance and reduce training efforts. In total, between October 2021 and December 2023, we trained 58 individuals (including primary and backup SPs), of whom 38 participated in our study as SPs.

Technology and Equipment Required

The technology and equipment required depend on the data the researchers need to capture. For our study, we needed to consistently capture the following data: (1) clinician-patient interaction (audio and video), (2) clinicians’ electronic health record (EHR) documentation, and (3) clinicians’ demographic information and experience. We used SIMULATIONiQ9 products (both hardware and software; EMS, LLC, Exton, PA) to accomplish these goals.

To capture clinician-patient interactions, we used the SIMULATIONiQ mobile capturing system.10 We purchased one mobile capturing system for each simulation room needed. Each mobile system included three mobile cameras and a controlling laptop that supported both secure live viewing and cloud storage. An additional recording system (the VideoCAPTURE application11 installed on a tablet) was used as a backup in each room to hedge against technological issues. A stable Wi-Fi connection was required for immediate cloud upload. When we ran the simulation study at a remote simulation center, we used Zoom (Zoom Video Communications, Inc., San Jose, CA),12 a video conferencing platform, with webcams as the video recording system. Zoom was selected because the remote simulation center staff were already familiar with it. It captured the quality video and audio needed for our data analysis and allowed us to avoid shipping the expensive SIMULATIONiQ mobile capturing system, risking potential damage or loss in transit.

For EHR documentation, we used SIMULATIONiQ Enterprise, a medical simulation software platform,13 to house the EHRs completed by clinicians during our study. The EHR was created in the form of a survey. Similarly, we created a post-experiment survey, also accessible from SIMULATIONiQ Enterprise, to collect participants’ demographics and experiences. Basic laptops (e.g., Google Chromebooks) were purchased for participants to access the EHR and the post-experiment survey. When running the study at a remote simulation center, participants could also access the EHR and post-experiment surveys from locally available laptops via SIMULATIONiQ Enterprise.

Study Participant Recruitment

We used multiple strategies to maximize the reach of our recruitment. These included emails (listservs of hospital employees, conference attendees, and professional organizations), distribution of flyers at conference sites, in-person recruitment (talking to conference attendees, attending social events, etc.), social media posts (e.g., X, formerly known as Twitter), and word-of-mouth. Interested individuals were prompted to go to Microsoft Bookings14 (an online scheduling tool) to register for one of the available time slots. Selecting a booking system that can generate automatic reminders to study participants and allows registrants to change or cancel their bookings was crucial for saving the study team time. We found that at professional conferences, personal contact by one of our team members who is a PCP yielded the best results. Therefore, researchers could consider using a peer recruiter if they encounter low enrollment.

Mobilizing Equipment and Supply

Whether traveling by air or by car, we shipped most of the supplies (printed materials and items for the simulated exam rooms, e.g., gloves, otoscopes, stethoscopes, etc.) via a shipping company (e.g., FedEx) directly to the conference centers. We found packing the supplies in suitcases most reliable, as cardboard boxes can easily be damaged during shipping and handling and cannot be reused. We recommend hand-carrying electronic equipment (the SIMULATIONiQ mobile capturing system and laptops) because this equipment not only is expensive but also cannot be readily replaced if lost or delayed during shipping.

Data Collection Phase

Creating Standardized Simulated Environment

As our study aimed to investigate provider decision-making by analyzing the PCPs’ interactions with SPs, our goal was to create multiple simulated outpatient clinic exam rooms at the selected study sites (i.e., conference facilities or local simulation centers). Simulation centers usually had patient exam rooms that could be readily used (Fig. 1). At conference centers, we found that conference rooms that could be separated into multiple rooms by folding partition walls worked best for soundproofing. We also used pipe and drape to create 10-by-10-foot clinic rooms in a large exhibit hall; however, to avoid noise pollution, the rooms needed to be placed at least 10 feet apart (Fig. 2). Converting hotel conference rooms into clinic rooms was also a viable option (Fig. 3). Basic furniture needed for each room included an exam table (which could be replaced with a massage table, a rollaway bed, or a sturdy table), one table to lay out exam supplies, and two chairs for the provider and the SP (Fig. 3).

Figure 1. Example patient exam room setup at a simulation center

Figure 2. Creating patient exam rooms using pipe and drape at a conference site

Figure 3. Example patient exam room setup at a conference site

Data Management

Data management is a critical consideration for researchers. The ability to download stored data in an Excel format can eliminate the need to enter data manually. We recommend discussing study-specific data management needs with vendors prior to purchasing the software. While we were able to download our study data (EHR documentation and post-experiment survey) directly from SIMULATIONiQ Enterprise into Excel files, significant data cleaning and file merging were required using RStudio15 (Posit PBC, Boston, MA) before we could perform statistical analysis.
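Although our cleaning and merging were done in RStudio, the same workflow can be sketched in any analysis language. The following Python/pandas sketch uses hypothetical column names and inline data (in practice, `pd.read_excel` calls on the platform exports would replace the inline tables) to illustrate normalizing a free-text field and merging the EHR-documentation export with the post-experiment survey export on a shared participant identifier:

```python
import pandas as pd

# Hypothetical exports: one row per participant from the EHR-documentation
# download and one from the post-experiment survey. In practice these would
# be loaded with pd.read_excel(...) from the files exported by the platform.
ehr = pd.DataFrame({
    "participant_id": ["P001", "P002", "P003"],
    "referral_made": ["Yes ", "no", "YES"],   # inconsistent free-text entries
})
survey = pd.DataFrame({
    "participant_id": ["P001", "P002", "P003"],
    "profession": ["MD", "NP", "PA"],
})

# Normalize free-text fields before merging so categories line up.
ehr["referral_made"] = ehr["referral_made"].str.strip().str.lower()

# Merge the two exports on the participant identifier. validate="one_to_one"
# raises an error if an ID is duplicated in either file, catching data-entry
# problems before they silently inflate the merged table.
merged = ehr.merge(survey, on="participant_id", how="inner",
                   validate="one_to_one")
```

The `merged` table is then ready for statistical analysis; an equivalent pipeline in R would use `dplyr::left_join` after the same normalization step.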

Pilot Testing and Dry Run

Conducting a full pilot test of the entire process (including recruiting and training SPs, recruiting volunteer PCPs, setting up the simulated environment, reviewing data, etc.) was instrumental in the success of our implementation of mobile simulation. Similarly, with remote simulation, scheduling time with the hosting simulation center staff to conduct a full dry run of the experiment is critical. We were able to identify and address issues that would have impacted the quality of the data, such as sound pollution, mobile unit connectivity, and clarity of the instructions, only because a pilot and/or a full dry run was conducted.

DISCUSSION

Simulation research includes studying the use of simulation as an education methodology and using simulation as a research methodology.4 The latter is largely unreported in the current literature. Below we discuss the strengths and challenges of using mobile and remote simulations as a research methodology to provide insight for researchers considering this approach.

Strengths

Participant Reach

Between October 2021 and December 2023, we conducted 14 experiments across 11 states, recruiting 186 study participants licensed in 48 U.S. states, Guam, and the U.S. Virgin Islands. Twelve of the experiments were conducted using mobile simulation; two were conducted using remote simulation. Our participants represented each of the clinical degrees allowed to practice independently in primary care (DO, MD, PA, and NP) and came from different environments (rural, suburban, and urban). This indicates that mobile and remote simulations are effective for recruiting participants from diverse geographical locations, professional backgrounds, and practice environments. Aside from the use of mobile and remote simulations, recruiting from national conference attendees also contributed to the success of recruiting a diverse group of participants. Therefore, researchers can strategize their recruitment locations and venues based on their target study participants to maximize reach.

Standardization of Research Across Study Sites

The use of mobile and remote simulations makes it possible to standardize the data collection process across study sites. In particular, with the use of SPs, we were able to control variation in the patients’ (SPs’) behavior to a certain extent. This is a variable that we would not have been able to control if the research were conducted in a real-world environment with actual patients.

Adaptability

Mobile simulation can be set up in nearly any environment that has adequate space, furniture, electricity, internet connection, and noise control. With remote simulation, researchers can partner with different simulation centers to conduct the simulation in a standardized manner in different locations in the U.S. or even around the world.

Challenges and Lessons Learned

Study Participant Recruitment

While using mobile and remote simulations can extend the reach of study participants, recruiting study participants without being affiliated with a local hospital or university can be challenging. Taking our study as an example, only about 20–30% of the available time slots were filled on average before a planned simulation experiment, despite efforts to recruit participants via email listservs and social media posts. In our experience, onsite recruitment is critical for filling the available slots, which requires buy-in from local organizations (in our case, buy-in from professional organizations to help promote recruitment for the study at conferences).

Cost

Costs for running a research study using mobile and remote simulations are higher than for a traditional modality (i.e., recruiting at local simulation centers). This is due to the costs of travel, renting local space, and training local staff, as well as the need to purchase mobile equipment. Table 1 lists the major expense items from our exemplar study. The per-subject costs fluctuated depending on the venue and the number of participants recruited per day. To provide a general idea, in our study, with full recruitment (filling 10 available slots per day), the per-subject costs ranged from approximately $443.13 to $733.64. Of note, this reported cost includes only the expenses associated with conducting the experiments at a remote site (e.g., subject pay, travel, rentals, shipping of materials, and advertisement); we did not include expenses for investigator effort, indirect costs, simulation equipment, etc. From these figures, it is clear that running mobile and remote simulations at a national scale is costly. However, conducting the sessions at conferences proved more cost-effective due to the lower opportunity cost for clinicians: recruiting was easier than at simulation centers, where individuals were sacrificing their weekends rather than time at a conference.
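The per-subject figure can be understood as a fixed daily site cost (rental, staff, travel) spread across however many subjects enroll, plus a per-subject payment. The sketch below uses entirely hypothetical dollar amounts, not the study's actual budget, to show why under-enrollment inflates the per-subject cost:

```python
def per_subject_cost(daily_fixed_cost: float, subject_pay: float,
                     n_enrolled: int) -> float:
    """Fixed daily costs are divided among enrolled subjects;
    subject remuneration scales one-to-one with enrollment."""
    return daily_fixed_cost / n_enrolled + subject_pay

# Hypothetical figures: $4,000/day fixed site costs, $100 per-subject pay.
full_day = per_subject_cost(4000.0, 100.0, n_enrolled=10)  # all 10 slots filled
half_day = per_subject_cost(4000.0, 100.0, n_enrolled=5)   # half the slots filled

# Filling only half the slots nearly doubles the per-subject cost
# ($500 vs. $900 under these assumptions), which is why onsite
# recruitment to fill available slots matters so much.
```

This simple model also suggests why conference venues were more cost-effective: higher enrollment per day dilutes the fixed costs even when the venue itself is more expensive.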

Table 1.

Major Expense Items Conducting Mobile and Remote Simulations Nationally

Hardware and Software
 SIMULATIONiQ
 Warranty
 Tech Support
Simulation Supplies
 Physical Examination Supplies
Research Team Travel Expenses (Mobile Simulation Only)
 Airfare/Rental Cars
 Checked Bags
 Ground Transportation
 Hotels
 Meals
Shipment Cost (Mobile Simulation Only)
Study Site Expenses
 Mobile Simulation
  Room Rental
  Pipe and Drapes
  Advertisement
  Conference Registration
  Electricity
  Internet
 Remote Simulation
  Simulation Center Space Rental
  Local Staff Time
Standardized Patient Salaries
Study Participant Remuneration

Technical Challenges

As we expanded the use of mobile simulation from an education activity to a research methodology, we also identified unique needs that the mobile simulation hardware and software available on the market had not considered. On multiple occasions, the software developers for SIMULATIONiQ pushed out updates based on our suggestions that improved the simulation experience. However, these improvements came at the expense of learning from failures, i.e., data loss and difficulties with the user experience. Having a backup plan for video and audio recording and for EHRs or surveys that rely on an internet connection, as well as a thorough dry run testing the mobile simulation system on-site before each experiment, was and remains essential.

Lastly, we provide a summary table of key take-home points from our exemplar study for researchers considering adopting this process in the future (Table 2).

Table 2.

Key Take-Home Points for Using Mobile and Remote Simulations as a Research Methodology for Healthcare Delivery Research

Planning Phase
 Simulation Scenario Creation
   - Work with a panel of experts to provide content validity evidence
   - Consider feasibility and reproducibility of scenarios

 Experiment Site Selection
   - Select sites easily accessible for study participants
   - Consider the ability to reconfigure rooms, adequate power outlets, and stable Wi-Fi
   - Consider associated costs (room use, pipe and drape, etc.)

 Recruitment and Training of Standardized Patients
   - Follow a standardized training procedure
   - Conduct continuous quality control
   - Offer a competitive hourly rate matching the local rate

 Technology and Equipment Required
   - Have a backup audio and video recording system
   - For remote simulation, utilize software that the host simulation center staff are familiar with

 Study Participant Recruitment
   - Recruit via multiple channels (email, social media, etc.)
   - Recruit on-site and in person
   - Use peer recruiters

 Mobilizing Equipment and Supply
   - Ship supplies in suitcases
   - Hand-carry electronics when traveling
Data Collection Phase
 Creating Standardized Simulated Environment
   - Converting conference rooms into clinic rooms is viable
   - Keep makeshift clinic rooms at least 10 feet apart to avoid noise pollution

 Data Management
   - Discuss data management needs with vendors prior to purchasing

 Pilot Testing and Dry Run
   - Conducting full pilot testing and dry runs is essential
Advantages and Challenges
 Advantages
   - Able to recruit geographically diverse participants
   - Able to standardize the data collection process across sites

 Challenges
   - Recruitment requires significant effort and buy-in from professional organizations
   - Running mobile and remote simulations at a national scale is costly
   - Susceptible to data loss when relying heavily on technology (e.g., recording system, Wi-Fi, etc.)

CONCLUSION

Using mobile and remote simulation as a research methodology to understand healthcare delivery can increase the reach of study participants and generate results with greater generalizability. As shown in our exemplar study, we were able to conduct experiments in 11 states and recruit 186 PCPs licensed in 48 U.S. states, Guam, and the U.S. Virgin Islands in a controlled, simulated environment over a 26-month period. However, doing so at a national scale can be resource-intensive, and success relies heavily on technology capabilities, local support, and participant enrollment. Nonetheless, utilizing mobile and remote simulations as a research methodology holds the potential to provide impactful healthcare delivery data pivotal to important decisions such as policy reform and to knowledge about differences in primary care practice between disciplines. It could also provide valuable data about current clinical practice that could inform health professions education and/or help create targeted continuing education to address practice gaps. Researchers should carefully weigh the benefits and the costs when designing studies using mobile and remote simulation as a research methodology.

Contributor Information

Vicky J.-H. Yeh, Department of Surgery, Mayo Clinic Rochester; Simulation Center Education Specialist, Interdisciplinary Simulation and Education Center, Hennepin Healthcare.

Aysun Tekin, Division of Nephrology and Hypertension, Department of Internal Medicine, Mayo Clinic Rochester.

Ellen Green, College of Health Solutions, Arizona State University.

Elizabeth Reifsnider, College of Nursing and Health Innovation, Arizona State University.

Alicia Lewis, Economics, University of Michigan; Research Analyst Assistant, College of Health Solutions, Arizona State University.

Morgan Carver, Research PMO, Knowledge Enterprise, Arizona State University.

Yue Dong, Department of Anesthesiology and Perioperative Medicine, Mayo Clinic Rochester.

References

  • 1.Lockhart TJ, Paulman A. Mobile simulation training and teaching overview, Comprehensive Healthcare Simulation: Mobile Medical Simulation. Edited by Carstens PK, Paulman P, Paulman A., Stanton MJ, Monaghan BM, Dekker D. Cham, Springer, 2020, pp 141–4. [Google Scholar]
  • 2.Jafri FN, Yang CJ, Kumar A, Torres RE, Ahmed ST, Seneviratne N, Zarowin D, Bajaj K, Edwards RA: In situ simulation as a tool to longitudinally identify and track latent safety threats in a structured quality improvement initiative for SARS-CoV-2 airway management: A single-center study. Simul Healthc 2023; 18(1): 16–23 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Patterson MD, Geis GL, Falcone RA, LeMaster T, Wears RL: In situ simulation: detection of safety threats and teamwork training in a high risk emergency department. BMJ Qual Saf 2013; 22(6): 468–77 [DOI] [PubMed] [Google Scholar]
  • 4.Green EP, Dong Y, Shah ND: Innovative research methods: using simulation to evaluate health care policy. Simul Healthc 2023. DOI: 10.1097/SIH.0000000000000751 [DOI] [PMC free article] [PubMed]
  • 5.Lioce L. (Ed.), Lopreiato J. (Founding Ed.), Downing D, Chang TP, Robertson JM, Anderson M, Diaz DA, and Spain AE (Assoc. Eds.) and the Terminology and Concepts Working Group (2020), Healthcare Simulation Dictionary –Second Edition. Rockville, MD: Agency for Healthcare Research and Quality; September 2020. AHRQ Publication No. 20–0019. DOI: 10.23970/simulationv2. [DOI] [Google Scholar]
  • 6.Mladenovic R, Pereira LAP, Mladenovic K, et al. Effectiveness of augmented reality mobile simulator in teaching local anesthesia of inferior alveolar nerve block. J Dent Edu 2019; 83(4):423–428. [DOI] [PubMed] [Google Scholar]
  • 7.Laurent DA, Niazi AU, Cunningham MS, et al. A valid and reliable assessment tool for remote simulation-based ultrasound- guided regional anesthesia. Regional Anesthesia & Pain Medicine 2014;39(6), 496–501. [DOI] [PubMed] [Google Scholar]
  • 8.Shao M, Kashyap R, Niven A, et al. Feasibility of an international remote simulation training program in critical care delivery: a pilot study. Mayo Clinic Proceedings: Innovations, Quality & Outcomes 2018;2(3), 229–233. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.EMS. SIMULATIONiQ. https://www.simulationiq.com/. Accessed March 29, 2023.
  • 10.EMS. On-the-Go Field Simulation Training in Any Environment. https://www.simulationiq.com/mobile-field-simulation-training/. Accessed March 29, 2023.
  • 11.EMS. VideoCAPTURE. https://www.simulationiq.com/sites/default/files/VideoCAPTURE_Counseling.pdf. Accessed March 29, 2023.
  • 12.Zoom. One platform to create. https://zoom.us/. Accessed March 29, 2023.
  • 13.EMS. Enterprise Healthcare Simulation Software. https://www.simulationiq.com/enterprise-healthcare-simulation-software/. Accessed March 29, 2023.
  • 14.Microsoft. Microsoft Bookings. https://www.microsoft.com/en-us/microsoft-365/business/scheduling-and-booking-app. Accessed March 29, 2023.
  • 15.RStudio Team. RStudio. http://www.rstudio.com/. Accessed April 9, 2023.
