African Journal of Emergency Medicine. 2019 Jul 8;9(3):140–144. doi: 10.1016/j.afjem.2019.05.004

Rapid, remote education for point-of-care ultrasound among non-physician emergency care providers in a resource limited setting

Benjamin Terry a, David L Polan a, Rashidah Nambaziira c, Julius Mugisha b, Mark Bisanzo a, Romolo Gaspari a
PMCID: PMC6742845  PMID: 31528532

Abstract

Introduction

Access to high-quality emergency care in low- and middle-income countries (LMIC) is lacking. Many countries utilise a strategy known as “task-shifting” where skills and responsibilities are distributed in novel ways among healthcare personnel. Point-of-care ultrasound (POCUS) has the potential to significantly improve emergency care in LMICs.

Methods

POCUS was incorporated into a training program for a ten-person cohort of non-physician Emergency Care Providers (ECPs) in rural Uganda. We performed a prospective observational evaluation of the impact of rapid, remote review of POCUS studies, with a primary objective of ECP ultrasound quality and a secondary objective of ultrasound utilisation. The study was divided into four phases over 11 months: an initial in-person training month, two middle blocks during which ECPs performed ultrasounds independently without remote electronic feedback, and the final months during which ECPs performed ultrasounds independently with remote electronic feedback. Quality was assessed on a previously published eight-point ordinal scale by a U.S.-based expert sonographer, and rapid standardised feedback was given to ECPs by local staff. Sensitivity and specificity of ultrasound exam findings for the Focused Assessment with Sonography for Trauma (FAST) were calculated.

Results

Over the study duration, 1153 ultrasound studies were reviewed. Average imaging frequency per ECP dropped 61% after the initial in-person training month (p = 0.01) when ECPs performed ultrasound independently, but rebounded once electronic feedback was initiated (p = 0.001), with an improvement in quality from 3.82 (95% CI, 3.32–4.32) to 4.68 (95% CI, 4.35–5.01) on an eight-point scale. The sensitivity and specificity of FAST exam during the initial training period was 77.8 (95% CI, 59.2–83.0) and 98.5 (95% CI, 93.3–99.9), respectively. Sensitivity improved 88% compared to independent, non-feedback months whereas specificity was unchanged.

Conclusions

Remotely delivered quality assurance feedback is an effective educational tool to enhance provider skill and foster continued and sustainable use of ultrasound in LMICs.

Keywords: ultrasound, non-physician, quality assurance, remote feedback

African relevance

  • Remote feedback is effective to enhance in-person ultrasound training.

  • Non-physicians can independently perform high-level emergency ultrasound.

  • Ultrasound utilisation and skill are both enhanced with remote feedback.

Introduction

Access to high-quality emergency care in low- and middle-income countries (LMICs) is limited, despite the most recent call to action in 2007 by the WHO [1]. In addition, these countries face an overwhelming proportion of the global burden of disease; child mortality rates, for instance, are often 10 to 20 times higher in LMICs than in high-income countries [2].

Many factors contribute to this lack of access to care, including a lack of skilled providers. Sub-Saharan Africa faces 25% of the global burden of disease with only 3% of the healthcare workforce [3]. To combat this shortage, many countries have utilised a strategy known as “task-shifting” in which skills and responsibilities are distributed in novel ways among existing provider cadres and new cadres are formed where needed [4].

The shortage of skilled providers in these resource-limited settings is often compounded by a paucity of technologic resources, including diagnostic imaging technology. Portable, hand-carried ultrasound is inexpensive, easily deployable and clinically effective in settings where more advanced diagnostic modalities are not available [5]. Training a cadre of non-physician clinicians in point-of-care ultrasound (POCUS) in a rigorous and sustainable manner thus has the potential to significantly impact the delivery of care in LMICs.

Early research has shown that non-physician clinicians can be trained to function independently in skills essential to emergency care [6,7,8]. The use of POCUS by physicians in LMICs has already demonstrated an impact on patient management, such as electing surgical treatment or changing the medical plan of care [9]. There is limited research examining the ability of non-physician clinicians providing emergency care in LMICs to learn POCUS as an adjunct to standard care. Robertson et al. described the remote, real-time use of FaceTime to instruct and monitor POCUS by non-physicians in Haiti, and Levine et al. demonstrated that FaceTime images in tele-review are non-inferior to those captured on the ultrasound machine [10,11]. To date, there are no published data describing the use of tele-review to sustain POCUS usage and skill by non-physicians in LMICs.

Traditionally, ultrasound education for providers ranges from brief one- to two-day intensive training sessions to one-year modular courses [12,13]. Other groups have found that without continued support, brief training sessions do not yield sustained skills retention [14]. However, prolonged one-to-one direct-observation training at the bedside can be prohibitively resource-intensive in LMICs, especially if oversight is provided by non-local experts travelling to LMICs specifically to provide education. Here we describe a novel educational tool providing rapid "tele-review" quality assurance and feedback to a group of non-physician clinicians in rural Uganda, and its impact on continuing education and skills retention for broad-based POCUS.

Since 2009, non-physician clinicians have been trained in emergency care at a district hospital in rural Uganda, with program graduates referred to as Emergency Care Providers (ECPs). The hospital setting and training program are described in detail elsewhere [2]. POCUS was incorporated into the curriculum given limited access to radiography services [15]. We performed a prospective observational evaluation of the impact of rapid, remote review of POCUS studies on ultrasound utilisation and skills in a ten-person cohort of ECPs.

Methods

All patient encounters were logged prospectively into an electronic research database. Data collected included chief complaint, demographic information, testing ordered or performed (including ECP POCUS), results and disposition. ECPs acquired ultrasound images with a Sonosite Micromaxx (Bothell, WA) using a 2–5 MHz curvilinear transducer, a 6–13 MHz linear transducer, or a 1–5 MHz phased-array transducer. As part of the research study, information on the ultrasound performed, the sonographer and the initial interpretation was recorded by ECPs and then uploaded by staff into a separate web-based database program designed by one of the authors (**) for remote quality assurance [16]. Image review was performed remotely by U.S.-based emergency physicians with fellowship training in POCUS. Detailed feedback was emailed to local research staff, who printed and distributed the feedback to the performing ECPs. Our primary objective was change in educational ratings over time (interpretation and image acquisition). Our secondary objective was ultrasound utilisation. Ultrasounds performed independently by visiting physicians were excluded. This work was approved by the Institutional Review Boards of [deidentified] and [deidentified].

The initial on-site educational training was modelled on the curriculum for U.S. emergency medicine residents, but was modified for local pathology and resources [17]. In March 2012, ten ECPs, including six first-year students, two second-year students and two graduates, underwent training on the basic use and mechanics of portable ultrasound, including recording video clips of each view to digital media and documenting results. Over a one-month period, subsequent education involved seven one-hour didactic sessions and several hours of hands-on instruction given by on-site U.S. emergency physicians, including live-model simulation. Topics covered included: Introduction to Ultrasound, FAST, Obstetrics, Echocardiogram, Biliary, Lung and Ultrasound-guided procedures. These sessions were taught in conjunction with standard bedside clinical instruction in the hospital Emergency Centre.

After the initial education, ECPs performed ultrasound imaging independently and submitted images and interpretations through the online quality assurance program. Ultrasound images were saved as two-second digital clips or still images onto compact flash cards. Images were uploaded to cloud-based storage accessible via the internet (dropbox.com). ECP image interpretations were recorded as positive or negative for the primary findings of each individual ultrasound protocol. Additional interpretations, as well as any questions from the learners in Uganda, were recorded on a paper log attached to the ultrasound machine and later transcribed into the web application for the reviewer. Immediately after the images were reviewed, feedback and answers to any questions were emailed to local research staff, printed and distributed back to the ECPs. The goal duration for the review and feedback process was less than 48 hours.
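
To make this submission workflow concrete, the sketch below models the kind of per-study record the text describes being transcribed from the paper log into the web-based quality assurance application. The field names and values are hypothetical illustrations, not the actual schema of the authors' database.

```python
# Illustrative sketch only: a minimal record of the information logged with each
# submitted study (field names are hypothetical, not the authors' actual schema).
from dataclasses import dataclass, field
from typing import List

@dataclass
class UltrasoundQARecord:
    study_id: str                  # links the record to the uploaded image clips
    exam_type: str                 # e.g. "FAST", "Obstetric", "Biliary"
    ecp_interpretation: str        # "positive" or "negative" for the primary finding
    learner_questions: List[str] = field(default_factory=list)  # from the paper log
    image_files: List[str] = field(default_factory=list)        # cloud-stored clips

record = UltrasoundQARecord(
    study_id="2012-04-0173",
    exam_type="FAST",
    ecp_interpretation="negative",
    learner_questions=["Is the gain too high on the RUQ view?"],
    image_files=["fast_ruq.mp4", "fast_luq.mp4", "fast_pelvis.mp4", "fast_subxiphoid.mp4"],
)
```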

Each ultrasound image was reviewed for overall image quality, interpretation and image acquisition metrics for each ultrasound type, with reviewers blinded to the sonographer's identity. Image review was based on an eight-point rating scale (Fig. 1). Image interpretation was recorded as either agreement or disagreement with expert review, based on the sonographer's interpretation. Details of this rating scale and feedback system have been published previously [16]. Feedback relating to interpretation, overall image quality and specific technical errors in image acquisition was provided to the ECP who obtained the ultrasound images.

Fig. 1. Image review rating scale.

Image acquisition metrics were divided into three categories: image elements, machine settings, and probe mechanics. Image element errors included missing images or image components, for example no subxiphoid view or failure to visualise the diaphragm on the right upper quadrant view of the FAST. Machine setting errors included configuration errors by the sonographer, such as inappropriate gain or depth. Probe mechanics errors included errors in transducer orientation and failure to minimise imaging artifacts. To standardise the feedback process and ensure that each learner received the feedback emails, research staff in Uganda supplied printed emails to ECPs with comments from the reviewers on these metrics, as well as direct feedback and teaching points based on users' specific questions regarding the scan.
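
A hedged sketch of how a reviewer's response could be structured to mirror the eight-point quality rating (Fig. 1) and the three acquisition-error categories described above is shown below. The field names are illustrative assumptions; the published rating scale [16] defines the score, not this record layout.

```python
# Illustrative structure for a single reviewer's feedback entry (hypothetical fields).
from dataclasses import dataclass, field
from typing import List

@dataclass
class ReviewerFeedback:
    study_id: str
    quality_score: int                          # 1-8 ordinal rating (Fig. 1)
    agrees_with_ecp_interpretation: bool
    image_element_errors: List[str] = field(default_factory=list)    # e.g. missing subxiphoid view
    machine_setting_errors: List[str] = field(default_factory=list)  # e.g. excessive gain or depth
    probe_mechanics_errors: List[str] = field(default_factory=list)  # e.g. transducer orientation
    teaching_points: str = ""                   # free-text answers to the learner's questions

feedback = ReviewerFeedback(
    study_id="2012-04-0173",
    quality_score=5,
    agrees_with_ecp_interpretation=True,
    machine_setting_errors=["Depth set too shallow on pelvic view"],
    teaching_points="Gain on the RUQ view is acceptable; increase depth to include the caudal liver tip.",
)
```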

The four time periods of the study included the initial training month, two middle blocks where ECPs performed ultrasounds independently without feedback, and the final months when ECPs performed ultrasounds independently with feedback (Fig. 2). For the training month, instruction was provided by in-person trainers during hands-on sessions in the emergency centre (as detailed above) and all images were acquired under the supervision of the trainer. Following this initial period, ultrasound imaging was performed at the discretion of the ECP while working clinically in the emergency centre. During these later study blocks, which were separated by a pause to repair the ultrasound machine, visiting U.S. emergency physicians were periodically available for bedside supervision and refresher training, but no other formal ultrasound training occurred. This time period represents the standard education model for ultrasound training in LMICs. For the purposes of our study, when visiting physicians were present, ECPs recorded images and interpretations independently prior to consulting with a visiting physician. In the middle blocks, ultrasounds were recorded and reviewed for data collection but feedback was not provided to the ECPs. During the final study period, rapid remote review was performed with feedback provided to the ECPs.

Fig. 2. Study timeline.

Descriptive statistics for learners were calculated for each time period of the study. Normality of data was assessed using the Lilliefors test. Comparisons between groups were performed using the Wilcoxon signed-rank test for continuous data. Accuracy rates and their 95% confidence intervals were calculated using web-based statistical software (www.statpages.org), and comparisons of accuracy rates were performed using 95% confidence intervals.
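
The following sketch reproduces the described analysis steps on hypothetical numbers (the per-ECP scan counts and FAST tallies are illustrative, not the study data), and substitutes standard Python routines for the web-based calculator: the Lilliefors normality test, the paired Wilcoxon signed-rank test, and an exact binomial confidence interval for sensitivity.

```python
# Minimal analysis sketch under assumed data; not the authors' actual code or data.
import numpy as np
from scipy.stats import wilcoxon
from statsmodels.stats.diagnostic import lilliefors
from statsmodels.stats.proportion import proportion_confint

# Hypothetical monthly scan counts for the same ten ECPs in two study blocks.
scans_training    = np.array([22, 18, 25, 30, 17, 21, 28, 19, 24, 26])
scans_no_feedback = np.array([ 9,  7, 11, 12,  6,  8, 13,  7, 10,  9])

# Normality check on the paired differences, then a non-parametric paired comparison.
ks_stat, ks_p = lilliefors(scans_training - scans_no_feedback)
w_stat, w_p = wilcoxon(scans_training, scans_no_feedback)
print(f"Lilliefors p = {ks_p:.3f}; Wilcoxon signed-rank p = {w_p:.3f}")

# Sensitivity of ECP FAST interpretation against expert review, with an exact 95% CI.
true_positives, positives_on_expert_review = 21, 27   # illustrative counts only
sens = true_positives / positives_on_expert_review
lo, hi = proportion_confint(true_positives, positives_on_expert_review,
                            alpha=0.05, method="beta")  # Clopper-Pearson interval
print(f"Sensitivity {sens:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```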

Results

Over the 11-month study duration, 972 ultrasounds were performed by the ten-person cohort of non-physician ECPs, of which 654 were recorded and reviewed by U.S.-based providers. The total duration of the review and feedback process ranged from less than 48 hours to several days. The average number of monthly ultrasounds performed per block during the study is shown in Fig. 3. Average monthly utilisation per ECP dropped 61% after the initial in-person training month (p = 0.01) but rebounded 240% once feedback was provided (p = 0.001). Reviewed ultrasound volume during the feedback months did not differ from that during the initial in-person training month.

Fig. 3. Ultrasound volume.

Image quality during the initial training month averaged 3.82 (95% CI, 3.32–4.32) out of 8 on our ordinal eight-point scale. Image quality improved during the final feedback period to 4.68 (95% CI, 4.35–5.01) when compared to the initial training month (p = 0.01) (Fig. 4). There was no statistical improvement in image quality during the non-feedback periods. Sub-group analysis based on ECP experience (graduate vs non-graduate) found no statistically significant difference.

Fig. 4. Overall image quality.

Post-hoc analysis of the accuracy of interpretation of FAST examinations demonstrated statistically significant improvement over the length of the study (FAST exams were the only exam type performed frequently enough to calculate meaningful sensitivity and specificity rates). FAST exams with interpretations recorded by ECPs (n = 535) were analysed for sensitivity and specificity against findings on expert review. Computed tomography (CT) was not available for use as a gold standard modality. The overall sensitivity was 77.8 (95% CI, 59.2–83.0) during the initial training period and trended down thereafter. Sensitivity improved by a statistically significant 88% during the feedback period compared with the non-feedback months, returning to approximately the initial sensitivity rate (p = 0.001) (Fig. 5). Specificity during the initial training period was 98.5 (95% CI, 93.3–99.9) and did not change significantly during any of the study time periods (98.5 vs 90.1 vs 94.6 for the in-person, middle and feedback months, respectively).

Fig. 5. Ultrasound image interpretation.

Discussion

This study describes a new educational model for providing ultrasound education in LMICs that is sustainable and builds local capacity. Most ultrasound machines donated to resource-poor areas of the globe arrive with minimal training and little attention to sustainability [18]. A study by Greenwold et al. published in March 2014 explored an educational program for obstetric ultrasound in Mozambique consisting of on-site training for two months followed by unsupervised imaging by learners [19]. A similar study from 2008 described a general ultrasound training program for physicians in Rwanda consisting of nine weeks of on-site training followed by unsupervised imaging [20]. The middle two blocks of this study represent this prevailing educational model of sending educators from other countries and then having them leave. Comparing this model (middle blocks) with our feedback system (final block) demonstrates significantly higher skills in both image interpretation and image acquisition. This type of feedback also stimulates increased ultrasound utilisation, comparable to when educators were continuously present (initial educational month).

Given the high burden of disease in LMICs, the paucity of healthcare providers and the limited availability of local specialists, training methods that can inexpensively boost local capacity and expertise are critical. Introducing tools that increase the accuracy of diagnoses is also important for making the most effective use of available resources. Educational efforts that rely on continually present experts from outside the country risk decreasing capacity, as local providers learn to rely on these more experienced experts to provide care. This educational model is also not sustainable due to its high cost. The sustainability of educational programs is largely limited by two directly related factors: funding and the availability of experienced healthcare providers to deliver training. Any educational system that expands the pool of trainers while decreasing costs should dramatically increase the sustainability of an educational venture. In our study, remote feedback on ultrasound in the context of a training program in emergency care both enhanced provider utilisation of ultrasound and fostered continued skill acquisition.

Other researchers have described educational efforts using various feedback mechanisms, but the findings have been limited by research design and small sample size. Ferraioli et al. described the use of email for trained, local supervisors to send reports to the organisers of the training program [21]. Shah et al. reported a retrospective review of 97 ultrasound images from an 11-week post-training period, with hard copies of recorded images transported for review [20]. Henwood et al. described a pilot program in Rwanda that remotely tracked physician usage of POCUS following a training program [22]. Our study represents the largest and longest-duration published report of the impact of rapid, remote feedback on the ultrasound education of non-physicians in a resource-limited setting.

Much of the research on POCUS to date has focused on ultrasound training for physicians. As noted, physicians are in short supply in many LMICs, making delivery of emergency care especially challenging. Therefore, in addition to the task-shifting from physicians to mid-level providers described previously, with this study we sought to assess a sustainable POCUS training model in a novel cadre of mid-level ECPs [23]. Given that these providers have less comprehensive training than physicians, it is important to demonstrate that training programs enable them to perform newly acquired skills safely and effectively. Our study demonstrates that the introduction of remote educational feedback significantly improves the frequency of ultrasound utilisation, image quality and interpretation accuracy, with non-physician sensitivity and specificity in LMICs comparable to those of U.S.-trained emergency medicine physicians [15].

The study was limited by a small sample size, the use of a single study site and the lack of gold standard diagnostic techniques. Due to the constraints of working in a resource-limited country, occasional data loss occurred (missing ultrasound images or interpretations), and the total duration of the feedback process often exceeded the 48-hour goal. This led to loss of data during the analysis phase, reflected in the decrease from ECP scans performed (n = 972) to those successfully recorded and reviewed (n = 654). A more rapid review process would likely provide enhanced clinical relevance in addition to improved educational outcomes. It is also possible that ultrasound interpretation, image quality or utilisation was affected by visiting physicians during the study. ECPs were instructed to image independently and provide interpretations without input from visiting physicians, but it is possible that the visiting physicians influenced the outcomes. Even if such influence occurred, it is unlikely that it would change the conclusions of our study. Visiting physicians were present sporadically during the study, but there was no difference in the amount of time visiting physicians were on-site during the second, third or feedback study blocks.

Remotely delivered quality assurance feedback is an effective educational tool to enhance provider skill and foster continued use of ultrasound in LMICs. Our data suggest that non-physician clinicians trained in emergency care can accurately perform POCUS as part of routine clinical care without ongoing direct physician oversight. Further evaluation of this model in larger groups of trainees can be used to define the optimal duration of feedback required to maximise provider accuracy in a broad range of POCUS applications.

Dissemination of results

The effectiveness of remote feedback was informally communicated by Global Emergency Care staff to the Emergency Care Providers who participated in the study. Published manuscripts involving the ECPs, either as authors or participants, are posted in the local Ugandan Emergency Department. ECPs regularly present research findings at regional conferences such as AFCEM and EMSSA.

Authors' contributions

Authors contributed as follows to the conception or design of the work; the acquisition, analysis, or interpretation of data for the work; and drafting the work or revising it critically for important intellectual content: BT and DP contributed 30% each; MB and RG contributed 15% each; RN contributed 7.5% and JM contributed 2.5%. All authors approved the version to be published and agreed to be accountable for all aspects of the work.

Disclosures

Grant funding for investigator initiated research: BMT, DLP, MB report grant money to UMass Memorial Emergency Dept to conduct research conceived and written by BMT and MB from the UMass Medical School Office of Global Health.

Declaration of Competing Interest

Grant funding for investigator initiated research: BMT, DLP, MB report grant money to UMass Memorial Emergency Department to conduct research conceived and written by BMT and MB from the UMass Medical School Office of Global Health. The authors declare no conflicts of interest.

References

  • 1.World Health Assembly (WHA). Health systems: Emergency care systems - report by the secretariat [web-based WHA report 60.22]: World Health Organization (Western Pacific Region) web site http://www.wpro.who.int/mnh/A60_R22-en.pdf Accessed April 17, 2014.
  • 2.Hammerstedt H., Maling S., Kasyab R. Addressing WHO resolution 60.22: a pilot project to create access to acute care services in Uganda. Ann Emerg Med. Nov 2014;64(5):461–468. doi: 10.1016/j.annemergmed.2014.01.035. [DOI] [PubMed] [Google Scholar]
  • 3.World Health Organization. Working together for health. The world health report 2006. ISBN 978 92 4 156317 8.
  • 4.Mullan F., Frehywot S. Non-physician clinicians in 47 sub-Saharan African countries. Lancet. 2007;370(9605):2158–2163. doi: 10.1016/S0140-6736(07)60785-5. [DOI] [PubMed] [Google Scholar]
  • 5.Terry B., Blehar D., Gaspari R., Maydel A., Bezuidenhout F., Andronikou S. FAST as a predictor of clinical outcome in blunt abdominal trauma. SA J Radiol. Dec 2011;15(4):108–115. [Google Scholar]
  • 6.Bisanzo M., Nichols K., Hammerstedt H. Nurse-administered ketamine sedation in an emergency department in rural Uganda. Ann Emerg Med. 2011;59(4) doi: 10.1016/j.annemergmed.2011.11.004. [DOI] [PubMed] [Google Scholar]
  • 7.Chamberlain S., Stolz U., Dreifuss B., Nelson S.W., Hammerstedt H., Andinda J. Mortality related to acute illness and injury in rural Uganda: task shifting to improve outcomes. PLoS One. 2015;10(4) doi: 10.1371/journal.pone.0122559. Price MA, ed. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Rice B., Periyanayagam U., Chamberlain S., Dreifuss B., Hammerstedt H., Nelson S. Mortality in children under five receiving nonphysician clinician care in Uganda. Pediatrics. March 2016;137(3) doi: 10.1542/peds.2015-3201. [DOI] [PubMed] [Google Scholar]
  • 9.Shah S., Epino H., Bukhman G. Impact of ultrasound services in a limited resource setting: rural Rwanda 2008. BMC Int Health Hum Rights. 2009;9:4. doi: 10.1186/1472-698X-9-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Robertson T.E., Levine A.R., Verceles A.C., Buchner J.A., Lantry J.H., 3rd, Papali A. Haiti resource limited intensive care (Haiti-RELIC) study group. Remote tele-monitored ultrasound for non-physician learners using FaceTime: a feasibility study in a low-income country. J Crit Care. 2017 Aug;40:145–148. doi: 10.1016/j.jcrc.2017.03.028. [DOI] [PubMed] [Google Scholar]
  • 11.Levine A.R., Buchner J.A., Verceles A.C., Zubrow M.T., Mallemat H.A., Papali A. Ultrasound images transmitted via FaceTime are non-inferior to images on the ultrasound machine. J Crit Care. 2016;33:51–55. doi: 10.1016/j.jcrc.2016.02.019. Jun. [DOI] [PubMed] [Google Scholar]
  • 12.Crouch A., Dawson M., Long D. Perceived confidence in the FAST exam before and after an educational intervention in a developing country. Int J Emerg Med. 2010;3:49–52. doi: 10.1007/s12245-009-0144-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Kawooya M.G., Goldberg B., DeGroot W. Evaluation of US training for the past 6 years at ECUREI, the World Federation for Ultrasound in Medicine and Biology (WFUMB) Centre of Excellence, Kampala, Uganda. Acad Radiol. March 2010;17(3):392–398. doi: 10.1016/j.acra.2009.10.009. [DOI] [PubMed] [Google Scholar]
  • 14.Henwood P., Mackenzie D.C., Rempell J., Murray A.F., Leo M.M., Dean A.J. A practical guide to self-sustaining point-of-care ultrasound education programs in resource-limited settings. Ann Emerg Med. 2014;64(3):277–285.e2. doi: 10.1016/j.annemergmed.2014.04.013. [DOI] [PubMed] [Google Scholar]
  • 15.Stolz L.A., Muruganandan K.M., Bisanzo M.C., Sebikali M.J., Dreifuss B.A., Hammerstedt H.S. Point-of-care ultrasound education for non-physician clinicians in a resource-limited emergency department. Trop Med Int Health. March 2015;20(8):1067–1072. doi: 10.1111/tmi.12511. [epub] [DOI] [PubMed] [Google Scholar]
  • 16.Blehar D.J., Barton B., Gaspari R.J. Learning curves in emergency ultrasound education. Acad Emerg Med. 2015 May;22(5):574–582. doi: 10.1111/acem.12653. [DOI] [PubMed] [Google Scholar]
  • 17.Akhtar S., Theodoro D., Gaspari R., Tayal V., Sierzenski P., Lamantia J. Resident training in emergency ultrasound: consensus recommendations from the 2008 Council of Emergency Medicine Residency Directors Conference. Acad Emerg Med. Dec 2009;16(2):S32–S36. doi: 10.1111/j.1553-2712.2009.00589.x. [DOI] [PubMed] [Google Scholar]
  • 18.Harris R.D., Cho J.Y., Deneen D.R. Compact ultrasound donations to medical facilities in low-resource countries: a survey-based assessment of the current status and trends. J Ultrasound Med. 2012;31(8):1255–1259. doi: 10.7863/jum.2012.31.8.1255. [DOI] [PubMed] [Google Scholar]
  • 19.Greenwold N., Wallace S., Prost A., Jauniaux E. Implementing an obstetric training program in rural Africa. Int J Gynaecol Obstet. 2014 Mar;124(3):274–277. doi: 10.1016/j.ijgo.2013.09.018. [DOI] [PubMed] [Google Scholar]
  • 20.Shah S.P., Epino H., Bukhman G., Umulisa I., Dushimiyimana J.M., Reichman A. Impact of the introduction of ultrasound services in a limited resource setting: rural Rwanda 2008. BMC Int Health Hum Rights. 2009 Mar;9:4. doi: 10.1186/1472-698X-9-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Ferraioli G., Meloni M.F. Sonographic training program at a district hospital in a developing country: work in progress. AJR Am J Roentgenol. 2007;189:W119–W122. doi: 10.2214/AJR.07.2202. [DOI] [PubMed] [Google Scholar]
  • 22.Henwood P.C., Rempell J.S., Liteplo A.S., Leo M.M., Murray A.F., Mackenzie D. Point-of-care ultrasound use over six-month training period in Rwandan District hospitals. Ann Emerg Med. 2013;62(4):S77–S78. [Google Scholar]
  • 23.Terry B., Bisanzo M., McNamara M., Dreifuss B., Chamberlain S., Nelson S.W. Task shifting: meeting the human resources needs for acute and emergency care in Africa. Afr J Emerg Med. 2012;2(4):182–187. Dec. [Google Scholar]
