Public Health Action. 2014 Dec 21;4(4):265–270. doi: 10.5588/pha.14.0077

Implementation of an in-patient pediatric mortality reduction intervention, Gondar University Hospital, Ethiopia

D M Gordon 1,, A Shehibo 2, A Tazebew 2, M R Huddart 3, A Kadir 1, N Allen 4, H Draper 1, M Kokeb 2
PMCID: PMC4533513  PMID: 26400707

Abstract

Setting: Gondar University Hospital (GUH) is a resource-limited tertiary care hospital in northern Ethiopia.

Objective: To evaluate the aggregate effect of care standardization, institutional guidelines, and simulation-based training on pediatric mortality at a resource-limited hospital.

Design: Uncontrolled pre-post study. GUH in-patients aged 30 days–14 years were included in the program evaluation (baseline 11 September–18 November 2010; intervention 19 September–9 December 2011). Interns attached to the GUH pediatrics department from 6 September to 9 December 2011 were included in the training evaluation. Institution-specific management guidelines were prepared for choking, respiratory distress, dehydration, sepsis, congestive heart failure, coma, and seizure. Approval for the protocols was obtained from each pediatric faculty member. Interns received 3.5 h of simulation-based training in triage, procedural skills, and protocol usage. The primary outcome was overall deaths (%); secondary outcomes were deaths within 24 h of admission (%) and median pre-/post-training emergency management test scores (%).

Results: No difference was observed in overall mortality (OR 0.72, 95%CI 0.40–1.29, P = 0.265) or first 24 h mortality (crude OR 0.97, 95%CI 0.37–2.55). Trainee examination scores improved from 33% to 74% (P < 0.001).

Conclusion: Combining care standardization, management protocols, and simulation-based training did not reduce mortality among pediatric in-patients. Focused, simulation-based training improved short-term test scores among interns.

Keywords: emergency care, continuing medical education, Africa, mortality, children


Of the 11.3 million deaths occurring in Africa each year, roughly 45% occur in children aged <14 years.1 Available reports suggest that a third to half of these deaths occur in health care facilities.2,3 To improve pediatric outcomes in resource-limited hospitals, the World Health Organization (WHO) has promoted the Emergency Triage Assessment and Treatment (ETAT) program. ETAT provides a standard curriculum in triage and patient stabilization, endorses facility renovation and materials acquisition, and emphasizes operational strategies that reduce mortality in the first 24 h of admission among children aged ⩽5 years.4 Although the WHO recommends adapting ETAT to meet local needs, relatively few reports have described adapted ETAT programs.5–8

Gondar University Hospital (GUH) is the only tertiary-level hospital in the Amhara Region of northern Ethiopia. It receives referrals from an area that encompasses 17 million people, of whom 43% are aged <14 years and 88% live in rural areas.9 Of all non-neonatal pediatric admissions to GUH, 20% are diagnosed with severe acute malnutrition (SAM), 9% with malaria, 8% with tuberculosis (TB), and 8% with human immunodeficiency virus (HIV) infection.10 Due to the diversity of climates in its catchment area, GUH sees considerable pathologic variety.

A review of GUH medical records in 201010 suggested that ETAT implementation alone would insufficiently address mortality among pediatric in-patients. Most deaths were observed among children aged >5 years, and most occurred beyond the first 24 h of admission. Considerable mortality was observed among patients with congestive heart failure (CHF), which is not featured in the ETAT curriculum. To optimize resource utilization and address concerns unique to our facility, we developed an operational intervention that integrated protocol development, standardization of care, and simulation-based training. Elements of ETAT were incorporated into our protocols to ensure compliance with international standards.

In this uncontrolled pre-post study, we hypothesized that our intervention would reduce deaths among non-neonatal pediatric in-patients. A second uncontrolled pre-post study was conducted to evaluate the effect of our simulation-based training program on emergency management skills.

METHODS

Setting

Pediatric patients presenting to GUH are evaluated by interns at an out-patient clinic, and the sickest children are admitted to a critical care ward. At the time of their 3-month pediatrics attachment, interns have completed 5 years of medical school. At least one pediatrician is always available for consultation. Oxygen, intravenous/intraosseous fluids, and basic emergency medications are available intermittently. Sub-specialty services are limited. To our knowledge, no substantial changes were made to the hospital infrastructure between study periods.

Before our intervention, full histories and physical examinations were performed at the first patient encounter. Although national guidelines were available for the management of common pediatric diseases, no institutional guidelines had been developed. Little consensus existed between supervising pediatricians regarding the management of pediatric emergencies. Medical learners received little practical training in the management of pediatric emergencies, instead learning emergency skills on the job.

Intervention

A cross-sectional record review was conducted for patients aged 30 days–14 years admitted between 19 November 2009 and 18 November 2010. Seven diagnoses were significantly associated with death: SAM, coma, meningitis, CHF, severe dehydration, aspiration pneumonia, and sepsis. Another six were encountered in ⩾10 deaths: community-acquired pneumonia, gastroenteritis, anemia, disseminated TB, HIV, and malaria. These 13 diagnoses became targets for our mortality reduction initiative. Our methodology and results are reported elsewhere.10

The department agreed that five protocols would address the management of all priority diagnoses: respiratory distress, hypovolemic shock, septic shock, CHF, and coma. Nuances related to the management of SAM were integrated into each protocol. To ensure coverage of ETAT-recommended targets, we added protocols for the management of airway emergencies (choking and stridor) and seizure, and for appropriate cardiopulmonary resuscitation (CPR) technique. National guidelines were also integrated into each protocol. Lastly, we added four references to the protocol packet: normal vital signs by age, the Glasgow Coma Scale, an emergency medication formulary, and a weight estimation guide (Table 1).11 Substantive ETAT adaptations are listed in Table 2.

TABLE 1.

Contents of institution-specific bedside management protocol packet for pediatric emergencies, GUH


TABLE 2.

ETAT adaptations designed to address institutional failures and optimize resource utilization, GUH


Nine 2-page protocols were produced. The first page described patient stabilization, and the second described post-stabilization management. All medication recommendations were reviewed with our pharmacy to confirm availability of drugs. Areas of controversy were subjected to literature review, and evidence was graded according to the Centre for Evidence-Based Medicine guidelines.12 Faculty consensus was obtained on all revised protocols through three review meetings. The finalized product was laminated and bound.* Two copies were introduced to the pediatrics ward at the start of our intervention, to be carried to the patient's bedside during emergencies.

Each intern rotating through our department from 6 September to 9 December 2011 received simulation-based training in emergency pediatrics. Participants were divided into three groups, each of which was trained within the first 2 weeks of the rotation (10, 11, and 18 September 2011). A 30 min didactic presentation was given on pattern recognition and triage. Each trainee rotated through three 30 min mannequin-based procedure skills stations: airway/bag mask ventilation, CPR, and intraosseous needle placement. The trainees then rotated through three additional 30 min stations, each offering two simulated scenarios for rehearsing emergency skills. Emphasis was placed on adhering to department protocols, using the protocols at the bedside, and communicating clearly. Debriefing was provided after each simulation.

Our evaluation used non-neonatal pediatric deaths among GUH in-patients (per cent of admissions) as a primary outcome measure. As secondary outcome measures, it used deaths in the first 24 h of admission (F24H mortality) and median pre/post-training emergency management test scores.

Patient outcome evaluation

Patient data were retrospectively abstracted from admission registers, and missing data were recovered from patient charts and archived death certificates. Because death certification is legally mandated, certified deaths closely reflect actual mortality. Two senior pediatricians transcribed these records into an electronic database for the baseline (11 September–18 November 2010) and intervention (19 September–9 December 2011) periods. The inclusion and exclusion criteria, definition of variables, and data collection process are described in detail elsewhere.10

GUH introduced a pediatrics residency program in late 2010. During the intervention period, residents operated functionally as general practitioners (GPs), but spent more time each day supervising interns. We reviewed the work calendars for each GP and resident and calculated the average number of resident/GP hours per day during each patient's admission. Because four GPs were available at all times during most of the baseline period, this variable was dichotomized with a cut-off of ⩾4.

Baseline and intervention patient groups were compared for demographic variables. Age was dichotomized as ⩽ or >5 years, as age <5 years is an internationally recognized target for mortality reduction. Categorical variables were assessed using the χ2 test. Continuous variables were assessed using the Wilcoxon rank-sum test. Admission outcomes were summarized as percentage of admissions. Univariable logistic regression was used to compare study groups for admission outcome and to analyze associations between demographic variables and death.
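For a binary exposure such as study group, the univariable logistic regression described above reduces to the crude odds ratio from a 2 × 2 table, with a Woolf (log-scale) confidence interval. The sketch below illustrates that calculation only; the counts are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI from a 2x2 table:
    a = deaths, b = survivors in the exposed (intervention) group;
    c = deaths, d = survivors in the unexposed (baseline) group."""
    or_ = (a * d) / (b * c)
    # Standard error of ln(OR) by Woolf's method
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical illustrative counts
print(odds_ratio_ci(20, 412, 27, 447))
```

A full logistic regression (as run in Stata here) gives the same estimate for a single binary covariate; the 2 × 2 form simply makes the arithmetic explicit.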

Training evaluation

All interns rotating through the GUH pediatrics department from 6 September to 9 December 2011 were eligible to participate in the training. Informed consent was obtained from each participant. Identical examinations were administered to each intern before and immediately after the training session. The examinations were reviewed for quality by four senior pediatricians. Emergency problem solving was tested with an 18-question multiple-choice examination (best answer of 5 options). Procedural skills were tested by direct observation. The trainees were asked to perform a variety of procedures on a pediatric mannequin, while course instructors applied a standardized, checklist-based evaluation instrument to evaluate their performance. Median trainee examination scores were calculated as percentages answered correctly, and pre- and post-training results were compared by Wilcoxon matched-pairs signed-ranks test.
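The Wilcoxon matched-pairs comparison used here ranks the absolute pre/post score differences (dropping zero differences, averaging tied ranks) and sums the ranks of positive and negative changes. A minimal sketch, with hypothetical scores rather than the study's data:

```python
def signed_rank_W(pre, post):
    """Wilcoxon matched-pairs signed-ranks statistic W:
    rank |post - pre| (zeros dropped, ties get average ranks),
    then return the smaller of the positive- and negative-rank sums."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    abs_sorted = sorted(abs(d) for d in diffs)

    def avg_rank(x):
        # Average rank of value x among the sorted absolute differences
        idx = [i + 1 for i, v in enumerate(abs_sorted) if v == x]
        return sum(idx) / len(idx)

    w_plus = sum(avg_rank(abs(d)) for d in diffs if d > 0)
    w_minus = sum(avg_rank(abs(d)) for d in diffs if d < 0)
    return min(w_plus, w_minus)

# Hypothetical pre/post test scores (%)
print(signed_rank_W([10, 20, 30, 40], [15, 18, 35, 50]))
```

In practice the statistic is referred to an exact or normal-approximation null distribution to obtain the P value; statistical packages (Stata in this study) handle that step.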

Statistics were generated using Stata statistical software, release 12 (StataCorp, College Station, TX, USA, 2011).

Ethics approval for this study was obtained from internal review boards at both GUH in Gondar, Ethiopia (RCS/05/83/2011), and Baylor College of Medicine in Houston, TX, USA (H-28911).

RESULTS

A total of 474 patients were admitted during the baseline period and 432 during the intervention period. During the intervention period, more males were admitted (66.4% vs. 56.7%, P = 0.003) and less supervision was available (34.2% of admission days with ⩾4 GPs or residents available vs. 91.2%, P < 0.001) (Table 3). Male sex was associated with death (odds ratio [OR] 2.01, 95% confidence interval [CI] 1.03–3.91, P = 0.040; Table 4). No change was observed in overall mortality (OR 0.72, 95%CI 0.40–1.29, P = 0.265) or F24H mortality (OR 0.97, 95%CI 0.37–2.55, P = 0.959). A significant increase in disappearance was observed (OR 1.66, 95%CI 1.05–2.62, P = 0.031; Table 5).

TABLE 3.

Characteristics of non-neonatal in-patients at Gondar University Hospital during baseline and intervention periods (n = 906)


TABLE 4.

Estimated ORs relating patient characteristics to death among non-neonatal pediatric in-patients at Gondar University Hospital during the study period; univariable logistic regression (n = 906)


TABLE 5.

Outcomes among non-neonatal pediatric in-patients at Gondar University Hospital during baseline and intervention periods


All 21 interns rotating through the GUH pediatrics department during the intervention period participated in the training program. Total examination scores improved from median 32.6% to 73.5% (P < 0.001), with improvements observed for all tested skills. Median scores improved from 11.1% to 77.8% on airway skills, from 11.1% to 88.9% on CPR, from 40.0% to 100% on intraosseous needle placement, and from 21.7% to 87.0% in aggregate for all procedural skills (P < 0.001 for all parameters) (Figure).

FIGURE.


Examination results from a simulation-based training for pediatric interns at Gondar University Hospital, 10–18 September 2011. * For all tested parameters, the difference between pre- and post-test results was significant at P < 0.001, as calculated using the Wilcoxon matched-pairs signed-ranks test. Didactic total = triage score + knowledge score; procedure total = airway score + CPR score + IO score. BMV = bag mask ventilation; CPR = cardiopulmonary resuscitation, including chest compressions; IO = intraosseous needle placement.

DISCUSSION

This study describes an intervention designed to reduce non-neonatal pediatric mortality at an Ethiopian teaching hospital. We standardized emergency care through faculty consensus, created institutional guidelines, and provided simulation-based training to interns. The training program generated significant improvements in intern test performance, but no change in patient mortality was observed. Further investigation with a larger sample, longer observation periods, and more balanced study groups is required to build on the findings of this pilot study.

Several features distinguished our intervention. First, it was designed around the findings of a baseline audit. We developed one protocol not described in ETAT (CHF) and adapted the remaining protocols to optimize resource utilization. Second, all intervention components were built on consensus between faculty members, allowing consultants to deliver consistent recommendations to trainees. Finally, the training in our intervention provided an opportunity to build procedural skills and rehearse emergency situations safely.

Our guidelines promoted rapid fluid resuscitation among children with septic shock, in accordance with international standards.4,11 After our guidelines were prepared, the Fluid Expansion as Supportive Therapy (FEAST) study demonstrated increased mortality risk with this approach.13 Our study neither supports nor challenges the international standard. Our study was not powered to detect mortality change among patients with septic shock and, unlike the FEAST trial, it included signs of decompensation in its case definition. Careful revision of our septic shock protocol is nevertheless warranted.

To our knowledge, only three studies exploring ETAT adaptation have reported mortality outcomes. One intervention combining operational changes with ETAT triage training significantly reduced mortality among children aged <5 years.6 Evaluations of another modified ETAT program7 and the ‘ETAT plus admission’ program (ETAT+, a supplemented ETAT curriculum)8 describe mortality reductions but do not formally compare pre-post groups. In our study, no mortality reduction was observed. This may reflect limitations in our analysis or under-publication of similar studies with negative findings.

Our intervention's impact may have been dampened by secular differences between study groups. First, more males were admitted in the intervention group, and male sex was associated with death. To our knowledge, there is no systematic reason for the increased admission of males. Second, less supervision was provided, and supervision may conceivably reduce mortality. The paradoxical decrease in supervision is explained by the exceptionally high number of GPs and residents who took vacation or served at clinical sites outside GUH during the intervention period.

Third, the qualitative effects of our residency program may have influenced patient mortality. A recent systematic review, including mostly surgical studies from Western countries, demonstrated no added risk or benefit to care delivered by residents as a whole, but a trend toward preventable complications was noted among residents in their first year of training.14 It is conceivable that the residency program adversely affected patient outcomes, as all residents were in their first year during our intervention. As little has been published on the clinical impact of African residency programs, any comparison between Western and African residencies should be drawn with caution. Future research exploring the effect of our intervention will need to establish more balanced study groups.

Several reports have described the effect of management guidelines on pediatric outcomes, including survival15 and length of stay (LOS).16 No reduction in death was observed in our study, and we did not demonstrate any change in LOS. Few studies have described the efficacy of simulation-based training programs in African hospitals. Significant improvement was observed in our study for all tested parameters, with particular improvement in procedural skills. This finding is in agreement with a report from Taiwan demonstrating improved procedural technique after simulation-based training.17 In Western settings, simulation-based training has also been found to improve teamwork18–20 and to reduce medical errors,21 but our evaluation instrument was not designed to capture these effects.

Training in ETAT+ runs for 5 days,6 standard ETAT for 3.5 days,4 and ‘Essential ETAT’ (a focused ETAT program) for 1 day.22 At 3.5 h, our training is among the briefest reported, and test scores among trainees improved significantly. By allowing for focused, practical instruction, simulation-based training may increase learning efficiency. Because different evaluation methods were applied, however, direct comparisons between these training programs should be made cautiously.

More patients ‘disappeared’ during our intervention period. Although a causal link may conceivably exist between exposure and disappearance, this finding is more likely to be spurious given our small sample size. Further investigation with prospectively collected outcome data is required to clarify this finding. Our statistical approach (hypothesis testing) is crude compared to time-series analysis.23,24 We could not apply the latter because the interns targeted by our training were attached to our department for only 3 months. We attempted to address the influence of seasonality on pediatric death by drawing data from the same season in both study groups.

Due to resource limitations, we could not directly evaluate the effect of our intervention on provider behavior. Evidence from Kenya suggests that didactic training in guideline adherence generates improvement in clinical practice only when delivered jointly with job aids and longitudinal feedback.25 Our approach used simulation training and job aids, but failed to provide longitudinal feedback. Further research is needed to determine if our intervention produced comparable changes in provider practice.

Parts of our training evaluation may have been biased. First, we could not apply a previously validated examination because our training program contained content tailored to our hospital. We attempted to maximize the validity of our examination by subjecting it to peer review. Second, because we administered the same examination pre- and post-training, some of the observed test score improvement may have resulted from priming. Third, because qualified third-party proctors were unavailable at the time of our training, course instructors administered all examinations directly. Their involvement may have biased the procedural examination scores.

Our locally adapted mortality reduction intervention did not lower pediatric mortality. Simulation-based training was effective and efficient in improving short-term knowledge and procedural skills among Ethiopian medical trainees. Further investigation is required to understand the link between operational change, clinical practice, and patient outcome in resource-limited hospitals.

Acknowledgments

The authors thank the faculty, residents, and interns of Gondar University Hospital, Gondar, Ethiopia, for their essential role in shaping our intervention. We thank the Baylor International Pediatric AIDS Initiative, Houston, TX, USA, and the Jewish Joint Distribution Committee, New York, NY, USA, for their collaborative input into this work. We thank S Gordon for technical assistance in the development of variables. This work was supported by Gondar University Hospital.

Footnotes

Conflicts of interest: none declared.

* The protocol can be obtained on request from the corresponding author.

References

1. Mathers C D, Boerma T, Ma Fat D. Global and regional causes of death. Brit Med Bull. 2009;92:7–32. doi: 10.1093/bmb/ldp028.
2. Kallander K, Hildenwall H, Waiswa P, et al. Delayed care seeking for fatal pneumonia in children aged under five years in Uganda: a case-series study. Bull World Health Organ. 2008;86:332–338. doi: 10.2471/BLT.07.049353.
3. Armstrong Schellenberg J R, Nathan R, Abdulla S, et al. Risk factors for child mortality in rural Tanzania. Trop Med Int Health. 2002;7:506–511. doi: 10.1046/j.1365-3156.2002.00888.x.
4. World Health Organization. Emergency triage assessment and treatment manual for participants. Geneva, Switzerland: WHO; 2005.
5. Tamburlini G, Di Mario S, Maggi R S, et al. Evaluation of guidelines for emergency triage assessment and treatment in developing countries. Arch Dis Child. 1999;81:478–482. doi: 10.1136/adc.81.6.478.
6. Robison J A, Ahmad Z P, Nosek C A, et al. Decreased pediatric hospital mortality after an intervention to improve emergency care in Lilongwe, Malawi. Pediatrics. 2012;130:e676–e682. doi: 10.1542/peds.2012-0026.
7. Molyneux E, Ahmad S, Robertson A. Improved triage and emergency care for children reduces inpatient mortality in a resource-constrained setting. Bull World Health Organ. 2006;84:314–319. doi: 10.2471/blt.04.019505.
8. Irimu G W, Gathara D, Zurovac D, et al. Performance of health workers in the management of seriously sick children at a Kenyan tertiary hospital: before and after a training intervention. PLoS ONE. 2012;7:e39964. doi: 10.1371/journal.pone.0039964.
9. Population Census Commission. Statistical reports of the census for Amhara region. Addis Ababa, Ethiopia: Central Statistical Agency of Ethiopia; 2007.
10. Gordon D, Frenning S, Draper H, et al. Prevalence and burden of diseases presenting to a general pediatrics ward in Gondar, Ethiopia. J Trop Pediatr. 2013;59:350–357. doi: 10.1093/tropej/fmt031.
11. World Health Organization. Pocketbook of hospital care for children. Geneva, Switzerland: WHO; 2007.
12. Centre for Evidence-Based Medicine. Oxford Centre for Evidence-Based Medicine levels of evidence. Oxford, UK: CEBM; 2014. http://www.cebm.net/oxford-centre-evidence-based-medicine-levels-evidence-march-2009/. Accessed November 2014.
13. Maitland K, Kiguli S, Opoka R O, et al. Mortality after fluid bolus in African children with severe infection. N Engl J Med. 2011;364:2483–2495. doi: 10.1056/NEJMoa1101549.
14. Van der Leeuw R M, Lombarts K M, Arah O A, et al. A systematic review of the effects of residency training on patient outcomes. BMC Med. 2012;10:65. doi: 10.1186/1741-7015-10-65.
15. Nankervis C A, Martin E M, Crane M L, et al. Evaluation of multidisciplinary guideline-driven approach to the care of extremely premature infants improved hospital outcomes. Acta Paediatr. 2010;99:188–193. doi: 10.1111/j.1651-2227.2009.01563.x.
16. Evans R, LeBailly S, Gordon K K, et al. Restructuring asthma care in a hospital setting to improve outcomes. Chest. 1999;116(Suppl):210S–216S. doi: 10.1378/chest.116.suppl_2.210s.
17. Chen P T, Huang Y C, Cheng H W, et al. New simulation-based airway management training program for junior physicians: advanced airway life support. Med Teach. 2009;31:338–344. doi: 10.1080/01421590802641471.
18. Wallin C J, Meurling L, Hedman L, et al. Target-focused medical emergency team training using a human patient simulator: effects on behavior and attitude. Med Educ. 2007;42:175–180. doi: 10.1111/j.1365-2929.2006.02670.x.
19. Shapiro M J, Morey J C, Small S D, et al. Simulation based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum? Qual Saf Health Care. 2004;13:417–421. doi: 10.1136/qshc.2003.005447.
20. Cheng A, Goldman R D, Aish M A, et al. A simulation-based acute care curriculum for pediatric emergency medicine fellowship training programs. Pediatr Emerg Care. 2010;26:475–480. doi: 10.1097/PEC.0b013e3181e5841b.
21. Morey J C, Simon R, Jay G D, et al. Error reduction and performance improvement in the emergency department through formal teamwork training: evaluation results of the MedTeams project. Health Serv Res. 2002;37:1553–1581. doi: 10.1111/1475-6773.01104.
22. Dyer B, Pollock L, Morar N, et al. Essential ETAT: one-day pediatric resuscitation training in a resource-limited setting. Emerg Med J. 2013;30:880.
23. Lagarde M. How to do (or not to do)…assessing the impact of a policy change using routine longitudinal data. Health Policy Plan. 2012;27:76–83. doi: 10.1093/heapol/czr004.
24. Benneyan J C, Lloyd R C, Plsek P E. Statistical process control as a tool for research and healthcare improvement. Qual Saf Health Care. 2003;12:458–464. doi: 10.1136/qhc.12.6.458.
25. Ayieko P, Ntoburi S, Wagai J, et al. A multifaceted intervention to implement guidelines and improve admission pediatric care in Kenyan district hospitals: a cluster randomized trial. PLoS Med. 2011;8:e1001018. doi: 10.1371/journal.pmed.1001018.
