American Journal of Clinical Pathology. 2016 Jul 27;146(2):199–206. doi: 10.1093/ajcp/aqw083

Challenges of Maintaining Good Clinical Laboratory Practices in Low-Resource Settings

 A Health Program Evaluation Framework Case Study From East Africa

Helen L Zhang 1, Michael W Omondi 2, Augustine M Musyoka 3,4, Isaac A Afwamba 3, Remigi P Swai 3, Francis P Karia 3,4, Charles Muiruri 2, Elizabeth A Reddy 5, John A Crump 1,4,6, Matthew P Rubach 1,
PMCID: PMC6410885  PMID: 27473737

Abstract

Objectives: Using a clinical research laboratory as a case study, we sought to characterize barriers to maintaining Good Clinical Laboratory Practice (GCLP) services in a developing world setting.

Methods: Using a US Centers for Disease Control and Prevention framework for program evaluation in public health, we performed an evaluation of the Kilimanjaro Christian Medical Centre–Duke University Health Collaboration clinical research laboratory sections of the Kilimanjaro Clinical Research Institute in Moshi, Tanzania. Laboratory records from November 2012 through October 2014 were reviewed for this analysis.

Results: During the 2-year period of study, seven instrument malfunctions suspended testing required for open clinical trials. A median (range) of 9 (1-55) days elapsed between instrument malfunction and biomedical engineer service. Sixteen (76.2%) of 21 suppliers of reagents, controls, and consumables were based outside Tanzania. Laboratory sections used a median (range) of 0.6% (0.2%-2.7%) of instrument testing capacity. Five (55.6%) of nine laboratory technologists left their posts over 2 years.

Conclusions: These findings demonstrate that GCLP laboratory service provision in this setting is hampered by delays in biomedical engineer support, delays and extra costs in commodity procurement, low testing throughput, and high personnel turnover.

Keywords: Africa, Laboratory medicine, Good Clinical Laboratory Practice, Quality assurance


As developing countries continue to confront an enormous burden of communicable and noncommunicable diseases,1,2 it is increasingly apparent that laboratory systems strengthening is a vital yet underaddressed prerequisite to health care delivery advances in low-resource settings.3 Over the past decade, attempts to scale up developing world laboratory services as part of vertical health care delivery models have demonstrated notable success. The President’s Emergency Plan for AIDS Relief (PEPFAR), President’s Malaria Initiative (PMI), and Global Fund to Fight AIDS, Tuberculosis, and Malaria have shown that with adequate investment and willpower, even health systems with the most limited resources can scale up integrated health care services.4,5 These programs have led to substantial improvements in laboratory capacity, including increased availability of CD4+ T-lymphocyte counts for human immunodeficiency virus (HIV) disease monitoring and of rapid diagnostic tests for malaria, in settings where they are most urgently needed.6‐8 However, these disease-specific investments have not been coupled with a necessary cross-sector strengthening of laboratory infrastructure in sub-Saharan Africa.9 As of 2009, only 28 (8.2%) of 340 internationally accredited laboratories in Africa were located in sub-Saharan Africa.10

This lack of cross-sector laboratory capacity remains a major factor limiting health care delivery improvements in resource-limited settings. In sub-Saharan Africa, laboratory infrastructure suffers from a scarcity of skilled laboratory personnel, essential instruments, and equipment; underdevelopment of standard quality control programs; and a weak supply chain for laboratory reagents and consumables.9,11‐14 Despite these limitations, several laboratories in sub-Saharan Africa have demonstrated that Good Clinical Laboratory Practice (GCLP), a set of minimum requirements intended to promote the reliability and integrity of laboratory data in clinical care and clinical research, can be achieved in the context of externally funded clinical research.15,16 However, maintenance of such standards requires considerable long-term investment not yet available to most laboratories in this setting. In this case study, we aim to characterize the challenges of maintaining GCLP standards in the Kilimanjaro Christian Medical Centre (KCMC)–Duke University Health Collaboration sections of the Kilimanjaro Clinical Research Institute (KCRI) Biotechnology Laboratory in Moshi, Tanzania. Using the US Centers for Disease Control and Prevention (CDC) framework for program evaluation in public health,17 we highlight key areas requiring further investment if a stepwise improvement in laboratory capacity is to occur in sub-Saharan Africa and elsewhere in the developing world.

Materials and Methods

Site Description

KCMC is a 458-bed hospital located in Moshi, Tanzania. It serves as the consultant referral hospital for the northern zone of Tanzania and has been a clinical research partner with multiple international institutions. In 2004, Duke University partnered with KCMC to establish a laboratory enterprise operating within the KCRI, which coordinates medical research activities at KCMC. These KCMC–Duke University Health Collaboration affiliated sections of the KCRI Biotechnology Laboratory are hereafter abbreviated as the “Biotechnology Laboratory.” In addition to offering space for fundamental science research, the Biotechnology Laboratory has developed clinical laboratory services operating to GCLP standards. It was first approved in 2008 as a GCLP-compliant laboratory of the AIDS Clinical Trials Group and International Maternal, Pediatric, Adolescent AIDS Clinical Trials networks by the Division of AIDS (DAIDS) laboratory evaluation contractor, Patient Safety Monitoring in International Laboratories. Since approval, the Biotechnology Laboratory has successfully supported laboratory evaluations for research studies funded by the US National Institutes of Health (NIH) (including the DAIDS network studies) and non-NIH externally funded studies.18,19

The Biotechnology Laboratory’s services comprise five laboratory sections: chemistry, hematology, immunology, microbiology, and molecular microbiology. Instruments include two COBAS INTEGRA 400 plus chemistry analyzers (Roche, Basel, Switzerland), CELL-DYN 3700 and CELL-DYN 3500 hematology analyzers (Abbott, Abbott Park, IL), FACSCalibur and FACSCount flow cytometers (Becton Dickinson, Franklin Lakes, NJ), a BacT/ALERT 3D microbial detection system (bioMérieux, Marcy l’Etoile, France), a BACTEC MGIT 960 mycobacterial detection system (Becton Dickinson), an m2000 RealTime System (Abbott), and other smaller instruments. Safeguarding measures against power fluctuations include two facility-level diesel-fueled power generators, instrument-based uninterruptible power supply units, and voltage regulators. In the event that assays cannot be performed on site, the Biotechnology Laboratory has partnered with other GCLP laboratories in the region, including the National Institute of Medical Research–Mbeya Medical Research Centre in Mbeya, Tanzania, and the Bio Analytical Research Corporation South Africa in Johannesburg, South Africa.

Organization Logic Model Analysis

We used the CDC’s health program evaluation framework, also known as an organization logic model, which orients the analysis around the organization’s primary mission and the concrete goals for meeting that mission.17 The selected outcomes are indicators of the organization’s success or failure in achieving the goals required to fulfill the organization’s mission. Within this logical framework, we posited the mission of the Biotechnology Laboratory as the following: to provide quality laboratory-based diagnostic services requisite for externally sponsored research conducted by the KCMC–Duke University Health Collaboration. To fulfill this mission, the Biotechnology Laboratory aims to achieve the following: to operate at GCLP standards, to do so without lapses in accreditation or in the diagnostic services required for open study protocols, and to do so in a sustainable manner. We measured GCLP compliance by assessing the following domains: external quality assurance (EQA), equipment, physical facilities, and reagents and controls.20 Cost-effectiveness and personnel turnover comprised other key aspects of the sustainability assessment.
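For readers who prefer a schematic view, the sketch below (not part of the original evaluation) organizes the mission, goals, and assessment domains described above as a simple Python data structure; the wording of the entries paraphrases the text.

```python
# A compact outline of the organization logic model described above:
# mission -> goals -> assessment domains. Entries paraphrase the text.
logic_model = {
    "mission": (
        "Provide quality laboratory-based diagnostic services requisite for "
        "externally sponsored research conducted by the KCMC-Duke University "
        "Health Collaboration"
    ),
    "goals": [
        "Operate at GCLP standards",
        "Avoid lapses in accreditation or in diagnostic services required for open study protocols",
        "Operate in a sustainable manner",
    ],
    "assessment_domains": [
        "External quality assurance (EQA)",
        "Equipment",
        "Physical facilities",
        "Reagents and controls",
        "Cost-effectiveness",
        "Personnel turnover",
    ],
}

for goal in logic_model["goals"]:
    print("Goal:", goal)
```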

Data Extraction and Analysis

All printed laboratory records dated between November 1, 2012, and October 31, 2014, were reviewed. Email records and key stakeholder interviews were used to supplement incomplete documentation. Analyses in each domain were performed as follows.

EQA Performance

EQA describes the ongoing practice of objectively evaluating a laboratory’s analytical performance by using an external agency for proficiency testing material and peer institutions for performance comparison. The Biotechnology Laboratory’s EQA records from the following were reviewed: College of American Pathologists (Northfield, IL); DAIDS Virology Quality Assurance, Rush Presbyterian–St Luke’s Medical Center (Chicago, IL); Immunology Quality Assurance PBMC Cryopreservation Proficiency Testing Program (Durham, NC); Oneworld Accuracy AccuTest Proficiency Testing Services (Boston, MA); and United Kingdom National External Quality Assessment Schemes Leucocyte Immunophenotyping Programme (Sheffield, United Kingdom). Each participating analyte was assigned to one of the following categories: analytes with consistently satisfactory EQA performance ratings, those with isolated EQA failures still meeting acceptability criteria for NIH-funded clinical trials, and those with persistent EQA failures not meeting acceptability criteria for NIH-funded clinical trials. Definitions are summarized in Table 1.

Table 1.

External Quality Assurance Performance Definitions Used for Evaluation of KCMC–Duke Health Collaboration Sections of the KCRI Biotechnology Laboratory

EQA Program(s) | "Consistently Satisfactory" | "Isolated Failure" | "Persistent Failures"
CAP, Accutest | Score ≥80% on all submitted panels | Score <80% on at least one panel, never during consecutive panels | Score <80% on at least two consecutive panels
UKNEQAS | Running score of "satisfactory" | 1-2 running scores of "unsatisfactory" within 12 months | ≥3 running scores of "unsatisfactory" within 12 months
IQA | Combined status of "approved" | Combined status of "provisionally approved" | Combined status of "on probation" or "on hold"
VQA | Panel score of "certified" | Panel score of "provisionally certified" | Panel score of "probation" or cumulative score >10

Accutest, Oneworld Accuracy AccuTest Proficiency Testing; CAP, College of American Pathologists; EQA, external quality assurance; IQA, Immunology Quality Assurance PBMC Cryopreservation Proficiency Testing Program; KCMC, Kilimanjaro Christian Medical Centre; KCRI, Kilimanjaro Clinical Research Institute; UKNEQAS, United Kingdom National External Quality Assessment Schemes; VQA, Division of AIDS Virology Quality Assurance.
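As an illustration of how the CAP/Accutest rules in Table 1 translate into the three analyte categories, a minimal sketch follows; it is not the program code used for the evaluation, and the function name and example scores are hypothetical.

```python
def categorize_cap_accutest(panel_scores):
    """Assign an EQA category from chronological CAP/Accutest panel scores (%).

    Per Table 1: >=80% on every panel is consistently satisfactory; <80% on at
    least two consecutive panels is a persistent failure; any other <80% result
    is an isolated failure.
    """
    failures = [score < 80 for score in panel_scores]
    if not any(failures):
        return "consistently satisfactory"
    if any(first and second for first, second in zip(failures, failures[1:])):
        return "persistent failures"
    return "isolated failure"


# Example: one failing panel that is not repeated on the next cycle.
print(categorize_cap_accutest([95, 76, 88, 100]))  # isolated failure
```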

Instruments and Physical Facilities

To assess instrument malfunction frequency, we reviewed service records for the instruments mentioned previously, with the exception of the BD BACTEC MGIT 960, whose validation was pending. Suspension of on-site testing was determined based on the nature of the malfunction, open study protocol analytes, and availability of a backup instrument. Time to engineer service and time to resolution were calculated using engineer service reports, laboratory technologist-maintained logs, and email records; discrepancies between sources were resolved case by case by consensus of two authors (M.P.R. and H.L.Z.). Purchase prices and service contract prices were noted for each instrument. The Abbott m2000 RealTime System was excluded from this calculation because it was acquired as a donation. Intervals between semiannual preventative maintenance services were calculated for all analytical instruments mentioned previously, with the exception of the BD FACSCount, which follows an annual preventative maintenance schedule. Preventative maintenance services not performed due to instrument out-of-service status were also excluded.
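A minimal sketch of the interval arithmetic is shown below; the record structure and dates are hypothetical, whereas the actual analysis drew on engineer service reports, technologist-maintained logs, and email records.

```python
from datetime import date
from statistics import median

# Hypothetical malfunction records; each holds the dates used to compute
# time to engineer service and time to resolution.
malfunctions = [
    {"reported": date(2013, 3, 4), "engineer_on_site": date(2013, 3, 13), "resolved": date(2013, 3, 20)},
    {"reported": date(2014, 1, 7), "engineer_on_site": date(2014, 1, 8), "resolved": date(2014, 1, 8)},
]

def median_range(days):
    # Report as median (range), matching the style used in the Results.
    return f"{median(days):g} ({min(days)}-{max(days)})"

to_engineer = [(m["engineer_on_site"] - m["reported"]).days for m in malfunctions]
to_resolution = [(m["resolved"] - m["reported"]).days for m in malfunctions]
print("Time to engineer on site, d:", median_range(to_engineer))
print("Time to resolution, d:", median_range(to_resolution))
```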

Reagents, Controls, and Consumables

To determine shipment durations, we compared dates of order payment with dates of shipment receipt at the laboratory. Shipment retention durations in Tanzanian customs were extracted from customs clearance receipts. To evaluate the impact of reagent stock-outs on calibration, quality control, and parallel testing, we reviewed internal incident reports. A reagent stock-out was attributed to a delay in delivery if shipment duration exceeded 28 days or if a standing order was not shipped on schedule.
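The attribution rule can be stated compactly; the sketch below is illustrative only, and the function name and dates are hypothetical.

```python
from datetime import date

def stockout_attributable_to_delay(payment_date, receipt_date, standing_order_on_schedule=True):
    """A stock-out is attributed to a delivery delay if the shipment took more
    than 28 days from order payment to receipt, or if a standing order was not
    shipped on schedule."""
    shipment_days = (receipt_date - payment_date).days
    return shipment_days > 28 or not standing_order_on_schedule

# Example: a 39-day shipment exceeds the 28-day threshold.
print(stockout_attributable_to_delay(date(2013, 6, 1), date(2013, 7, 10)))  # True
```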

Cost and Instrument Capacity Utilization

Testing capacity was calculated using maximum manufacturer-established instrument throughputs, assuming an operating schedule of five 8-hour shifts per week. Actual testing throughput was tabulated using laboratory billing documents. Chemistry sample throughput was estimated by dividing test throughput by the mean tests per sample from three recent clinical trial protocols, yielding a mean of six tests per sample.
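The capacity utilization arithmetic for the chemistry section can be illustrated as follows; the maximum monthly throughput and tests-per-sample figures are taken from the Methods and Table 4, while the monthly test count is a hypothetical value chosen for illustration.

```python
# Chemistry section figures from Table 4 and the Methods.
MAX_SAMPLES_PER_MONTH = 10_000  # manufacturer maximum for two COBAS INTEGRA 400 plus analyzers
TESTS_PER_SAMPLE = 6            # mean tests per sample from three recent clinical trial protocols

monthly_test_throughput = 372   # hypothetical number of billed tests in one month
samples_per_month = monthly_test_throughput / TESTS_PER_SAMPLE
percent_capacity_used = 100 * samples_per_month / MAX_SAMPLES_PER_MONTH
print(f"{samples_per_month:.0f} samples/month -> {percent_capacity_used:.1f}% of capacity")
```

Under these illustrative inputs, the result matches the chemistry row of Table 4 (62 samples per month, 0.6% of capacity).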

The Biotechnology Laboratory generates no profit from its services. To recover operating costs, the laboratory charges clinical studies on a per-test basis. Laboratory costs not recovered through this mechanism are subsidized through external funding sources. For cost-per-test analysis, we selected the following representative tests: CD4+ T-lymphocyte count, complete blood count, HIV-1 RNA quantitation, negative blood culture, and serum sodium measurement. Costs per test were calculated using EP Evaluator (Data Innovations, South Burlington, VT) based on annual testing volume and materials, labor, and instrument maintenance costs. These figures were then compared with costs billed by the Biotechnology Laboratory in 2013 (ie, costs recovered from service clientele) and with costs per test in a high-volume North American laboratory. Freight costs were retrieved from invoices, and customs clearance charges were tabulated from accounting records.
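The actual cost-per-test figures were generated with EP Evaluator; the sketch below only illustrates the underlying arithmetic (annual costs divided by annual test volume, compared with the billed charge). The annual cost and volume inputs are hypothetical; the billed charge is the 2013 complete blood count charge from Table 5.

```python
# Hypothetical annual figures for a single assay; the published analysis used
# EP Evaluator, and these inputs are for illustration only.
annual_costs_usd = {
    "materials": 14_000.0,   # reagents, controls, and consumables
    "labor": 6_500.0,        # technologist time attributed to the assay
    "maintenance": 3_200.0,  # share of instrument service contract
}
annual_test_volume = 750
billed_charge_usd = 24.31    # 2013 charge per CBC billed by the laboratory (Table 5)

cost_per_test = sum(annual_costs_usd.values()) / annual_test_volume
recovery_pct = 100 * billed_charge_usd / cost_per_test
print(f"Estimated cost per test: US${cost_per_test:.2f}; "
      f"billed charge recovers {recovery_pct:.0f}% of cost")
```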

Personnel

Data on staffing were extracted from human resources records. Measures of interest comprised personnel turnover rate, reasons for personnel departure, and training resources. These were compiled from electronic records. Personnel turnover over the lifetime of the Biotechnology Laboratory was also determined for comparison.

Results

EQA Performance

Table 2 summarizes the laboratory’s performance in EQA programs throughout the period of study. In total, 95 analytes were submitted for at least two EQA testing cycles. Sixty-one (64.2%) passed all of the EQA testing cycles for which results were submitted. Ten (10.5%) were suspended from testing for EQA failures. Four (40.0%) of these analytes were on active protocols at the time of failure, requiring cessation of on-site testing and interim shipment of samples to a backup laboratory until corrective actions were taken. Of note, one of these suspensions occurred as a consequence of grading problems relating to the laboratory’s use of outdated hematology analyzer models, for which outputs for certain analytes were incompatible with those from newer models.

Table 2.

EQA Performance by Laboratory Section, KCMC–Duke Health Collaboration Sections of KCRI Biotechnology Laboratory, November 2012 to October 2014

Section | EQA Program(s) | Total No. of Analytes | Consistently Satisfactory EQA Performance, No. (%)^a | Isolated EQA Failures, No. (%)^b | Persistent EQA Failures, No. (%)^c
Chemistry | Accutest; CAP | 40 | 31 (77.5) | 5 (12.5) | 4 (10.0)
Hematology | Accutest; CAP | 22 | 7 (31.8) | 13 (59.1) | 2 (9.1)
Immunology | UKNEQAS; IQA | 11 | 8 (72.7) | 2 (18.2) | 1 (9.1)
Microbiology | Accutest; CAP | 20 | 13 (65.0) | 4 (20.0) | 3 (15.0)
Molecular | VQA | 2 | 2 (100.0) | 0 (0.0) | 0 (0.0)
Total | | 95 | 61 (64.2) | 24 (25.3) | 10 (10.5)

Accutest, Oneworld Accuracy AccuTest Proficiency Testing; CAP, College of American Pathologists; EQA, external quality assurance; IQA, Immunology Quality Assurance PBMC Cryopreservation Proficiency Testing Program; KCMC, Kilimanjaro Christian Medical Centre; KCRI, Kilimanjaro Clinical Research Institute; UKNEQAS, United Kingdom National External Quality Assessment Schemes; VQA, Division of AIDS Virology Quality Assurance.

^a Analytes with consistently satisfactory EQA performance ratings over a 2-year evaluation period.

^b Analytes with isolated EQA failures still meeting acceptability criteria for National Institutes of Health (NIH)–funded clinical trials.

^c Analytes with persistent EQA failures not meeting acceptability criteria for NIH-funded clinical trials.

Instruments and Physical Facilities

During the period of study, the laboratory's eight validated instruments experienced a cumulative 23 malfunctions requiring engineer technical support (Table 3). Seven (30.4%) malfunctions affected the laboratory's ability to test protocol analytes, requiring samples to undergo either freezer archiving or shipment to a backup testing facility. The median (range) time elapsed until a contracted service engineer arrived on site for instrument repair was 9 (1-55) days, and the median (range) time elapsed until malfunctions were resolved was 9 (1-158) days.

Table 3.

Laboratory Testing Instrument Malfunctions and Their Impact on Study Protocol Testing, KCMC-Duke Health Collaboration Sections of KCRI Biotechnology Laboratory, November 2012 to October 2014

Section | No. of Validated Instruments | Total No. of Malfunctions | Malfunction-Related Test Suspensions, No. (%) | Time to Engineer On-Site, Median (Range), d | Time to Resolution, Median (Range), d
Chemistry | 2 | 10 | 2 (20.0) | 6 (1-29) | 6 (1-136)
Hematology | 2 | 8 | 2 (25.0) | 16 (5-55) | 25 (11-158)
Immunology | 2 | 2 | 0 (0.0) | 9 (8-9) | 9 (8-9)
Microbiology | 1 | 0 | NA | NA | NA
Molecular | 1 | 3 | 3 (100.0) | 16 (6-30) | 16 (6-30)
Total | 8 | 23 | 7 (30.4) | 9 (1-55) | 9 (1-158)

KCMC, Kilimanjaro Christian Medical Centre; KCRI, Kilimanjaro Clinical Research Institute; NA, not applicable.

Annual service contract prices across laboratory sections were a median (range) of 12.1% (3.5%-16.2%) of instrument purchase prices. The median (range) interval between semiannual preventative maintenance services was 184 (151-388) days. One preventative maintenance visit was rescheduled and delayed. Another preventative maintenance visit was never performed, in violation of the instrument's service contract.

Five (21.7%) of the 23 instrument malfunctions were attributed to electricity supply instability. These malfunctions consisted of hardware thermal damage to both COBAS INTEGRA 400 plus instruments, laser diode failure in the Abbott CELL-DYN 3700, and two consecutive malfunctions of the Abbott m2000 RealTime System control center. Other causes of instrument malfunction comprised hardware malfunction unrelated to electricity supply instability (n = 12), software malfunction (n = 3), and quality control failure (n = 3).

Reagents, Controls, and Consumables

During the period of study, the laboratory sourced reagents, controls, and consumables from 21 vendors, of which five (23.8%) shipped materials from within Tanzania, nine (42.9%) from other African countries, and the remaining seven (33.3%) from Europe or North America. Of 98 incoming international shipments for which Tanzanian customs records were retained, the median (range) duration that a shipment was held in customs was 9 (1-51) days.

The laboratory's strategies to minimize interruptions to testing included advance preparation of documents facilitating customs clearance, placement of standing orders for commonly used reagents and controls, and borrowing of commodities from the hospital's clinical diagnostic laboratory during stock-outs. Despite these precautions, one protocol assay was suspended due to a regionwide shortage of carbon dioxide supply, four instances of calibration or parallel testing were deferred due to reagent stock-outs during delayed shipments, and four EQA surveys were received at an inappropriate temperature due to a compromised cold supply chain.

Cost and Instrument Capacity Utilization

Table 4 shows a comparison of manufacturer-established maximum instrument throughputs vs the laboratory's actual throughputs in each laboratory section. Table 5 shows calculated costs per test for representative laboratory assays. Customs clearance charges accounted for US$22,980.11 (7.1%) of total laboratory supply expenditures (US$321,801.82) during the period of study. Of the 54 available order invoices from nonlocal vendors, the median (range) percentage of cost allocated to freight was 6.6% (0.0%-60.0%).

Table 4.

Maximum and Actual Throughput of Laboratory Testing Instruments, KCMC–Duke Health Collaboration Sections of KCRI Biotechnology Laboratory, November 2012 to October 2014

Section | Instruments | Estimated Maximum Throughput, Samples per Month^a | Actual Laboratory Throughput, Samples per Month, Mean (SD) | Percentage of Capacity Used
Chemistry | Roche COBAS INTEGRA 400 plus (2) | 10,000 | 62 (47) | 0.6
Hematology | Abbott CELL-DYN 3500, Abbott CELL-DYN 3700 | 28,800 | 65 (33) | 0.2
Immunology | BD FACSCalibur, BD FACSCount | 8,000 | 30 (7) | 0.4
Microbiology | bioMérieux BacT/ALERT 3D 480 | 2,920 | 40 (19) | 1.4
Molecular | Abbott m2000 system | 1,860 | 51 (19) | 2.7

KCMC, Kilimanjaro Christian Medical Centre; KCRI, Kilimanjaro Clinical Research Institute.

^a Assumes one 8-hour shift per day and 5 work days per week, not including startup, shutdown, and calibration.

Table 5.

Comparison of Estimated Costs per Test in the KCMC–Duke Health Collaboration Sections of KCRI Biotechnology Laboratory, Prices Charged, and Costs per Test in a North American Laboratory

Assay | Cost per Test in Biotechnology Laboratory, US$ | Charge per Test Billed by Biotechnology Laboratory, 2013, US$ | Cost per Test in North American Laboratories, US$^a
CD4+ T-lymphocyte count | 108.13 | 48.89 | 39.98
CBC | 32.81 | 24.31 | 5.95
HIV-1 RNA quantitation | 83.64 | 66.97 | 150.12
Negative blood culture | 29.87 | 18.50 | 13.25
Serum sodium measurement | 63.40 | 22.72 | 3.56

CBC, complete blood count; HIV-1, human immunodeficiency virus 1; KCMC, Kilimanjaro Christian Medical Centre; KCRI, Kilimanjaro Clinical Research Institute.

^a Unpublished data (M.P.R. and E.A.R.): costs per test from two academic medical center laboratories. The amount shown is the mean of the two centers.

Personnel

At the start of the period of study, the laboratory employed five laboratory technologists. Four additional technologists were hired during the period of study to maintain staffing requirements of four to seven technologists. Of these nine laboratory technologists, five (55.6%) left their posts during the period of study. The median (range) duration of employment among those who left was 604 (210-1,067) days. Cited reasons for departure were other employment for four (80.0%) and graduate studies for one (20.0%). Resources invested in these former technologists included sponsorships to attend training workshops in South Africa (n = 2), Uganda (n = 1), and the United States (n = 1), as well as on-site training for all technologists. To place personnel turnover during the study period into a larger context: since the laboratory's establishment, 13 (76.5%) of 17 laboratory technologists have left their positions, with a median (range) employment of 883 (210-2,354) days.

Discussion

Our evaluation characterized several barriers challenging the Biotechnology Laboratory’s ability to achieve uninterrupted and sustainable provision of high-quality diagnostic laboratory services in northern Tanzania. First, underperformance of instrument maintenance services, weaknesses in physical infrastructure, and limited backup testing options contribute to the frequently prolonged and debilitating nature of instrument breakdowns. Second, sourcing of quality reagents and consumables for laboratory testing is often performed internationally due to lack of reliable local or national vendors and is burdened by a slow and costly customs clearance process. Third, considering the fixed costs of sustaining the laboratory, including instrument maintenance service contracts and personnel wages, the low testing demand of the laboratory precludes self-sustainability. Finally, the laboratory’s public-sector salary scales and funding insecurities inherent to a research-funded environment compete poorly with more lucrative and stable private-sector positions, resulting in substantial human resource investment losses. Many of these issues are consistent with reported challenges of other laboratory capacity-building efforts in low-resource settings.6,21

With the exception of labor costs, the Biotechnology Laboratory's current state of operations reflects a “pay more, get less” situation compared with laboratories in high-income countries. Despite instrument service contract costs comparable to those charged to North American GCLP laboratories, the turnaround time for emergency engineering services observed in this laboratory is several-fold slower than the turnaround times expected in North American clinical laboratories. This may be explained in part by the low number of trained biomedical service engineers in East Africa.11 The international or intercontinental sourcing of reagents not only increases laboratory costs through freight and customs charges but also poses a risk to shipment quality due to an unreliable cold supply chain. Both the limited supply of biomedical engineers and the tenuous and costly reagent supply chains are consequences of underdeveloped markets for biomedical engineering services, reagents, and consumables in East Africa. In this context of insufficient investment, limited human resources, and an unpredictable supply chain, the increased testing volumes needed to achieve economies of scale are difficult to attain for most laboratories despite the immense need for laboratory services. The laboratory sector is thus trapped in a low-volume, high-cost scenario that precludes access to quality-assured laboratory services for most patients, particularly the large potential market of nonresearch users.

Quality-assured laboratory services that provide timely and accurate results are fundamental to quality patient care, improved clinical outcomes, and effective disease control efforts.22‐24 Yet the clinician-laboratory interaction remains fragmented in settings such as ours. A recent study in northern Tanzania demonstrated that, despite the 2010 release of updated World Health Organization guidelines recommending parasitologic confirmation of malaria prior to antimalarial treatment, the proportion of smear-negative patients receiving a clinical diagnosis of malaria and treatment with antimalarial drugs remained high.25 These results may be driven in part by low clinician confidence in negative test results due to historically substandard malaria testing services, as well as a lack of diagnostic testing options for nonmalarial diseases.26,27 Improvements in clinical practice are therefore contingent on comprehensive expansion and clinical integration of high-quality laboratory services at all levels of health care provision.

We emphasize that cross-sector development of laboratory services in low-resource settings is unlikely to occur spontaneously. Such a process requires a concerted effort among international and local stakeholders, as has already been successfully applied to disease-specific initiatives such as PEPFAR, PMI, the Global Fund, and the Clinton Health Access Initiative. The past decade has seen several calls for laboratory systems strengthening in low-income countries, including the Maputo Declaration on Strengthening of Laboratory Systems in 2008.3 This recent momentum offers promise for increasing investment among international agencies, donors, biomedical industries, and local stakeholders to collectively address the current barriers to the scale-up of laboratory services in low-resource settings. Areas of need identified in our analysis include addressing the prohibitive cost structure in which capital investments in laboratory instruments are difficult to recoup owing to low test volumes and high overheads for instrument maintenance and quality assurance; upgrading laboratory physical infrastructure, including electricity conditioning; incentivizing laboratory workforce retention; establishing robust in-country biomedical industry support, including expansion of the biomedical engineer workforce and local sourcing of laboratory commodities; and expanding national and regional quality assurance schemes.

Our evaluation comes with several limitations. Due to the retrospective nature of this evaluation, our analysis was confined to existing records that did not include complete documentation for all measures of interest. Furthermore, this evaluation details the experiences of a single GCLP laboratory supporting clinical research studies and may not be generalizable to all developing world laboratories. However, we expect many of the challenges characterized here to be common throughout sub-Saharan Africa due to their direct relation to poor laboratory infrastructure across the region.

Our health program evaluation demonstrates the practical challenges of maintaining high-quality laboratory capacity in the regions of the world where it is most desperately needed. To overcome these challenges, a coordinated and goal-directed partnership between national and regional stakeholders, global donor agencies, international governing bodies, and biomedical industries is needed. Only then can progress toward large-scale improvements in laboratory capacity in sub-Saharan Africa and other low-resource settings be achieved.

Acknowledgments

Acknowledgments: We thank Frank Michael, Cynthia Asiyo, Blandina T. Mmbaga, and Vera Wright for their assistance with data collection. We are grateful to the leadership and clinicians of the Kilimanjaro Christian Medical Centre and the Kilimanjaro Clinical Research Institute for their contributions to this research. Financial support: Authors received support from National Institutes of Health awards ISAAC (J.A.C. and C.M.), AIDS International Training and Research Program D43 PA-03-018 (J.A.C. and C.M.), the Duke Clinical Trials Unit and Clinical Research Sites U01 AI069484 (M.W.O., A.M.M., I.A.A., and J.A.C.), the Center for HIV/AIDS Vaccine Immunology U01 AI067854 (M.W.O., A.M.M., I.A.A., and J.A.C.), and the Duke Center for AIDS Research P30 AI064518 (J.A.C.). J.A.C. and C.M. are supported by the joint US National Institutes of Health–National Science Foundation Ecology and Evolution of Infectious Disease program (R01 TW009237), and J.A.C. is supported by the UK Biotechnology and Biological Sciences Research Council (BBSRC) (BB/J010367/1) and by UK BBSRC Zoonoses in Emerging Livestock Systems awards BB/L017679, BB/L018926, and BB/L018845. H.L.Z. is supported by the Doris Duke Charitable Foundation through a grant supporting the Doris Duke International Clinical Research Fellows Program at Duke University. We acknowledge the Hubert-Yeargan Center for Global Health at Duke University for critical infrastructure support for the Kilimanjaro Christian Medical Centre–Duke University Health Collaboration. These funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

1. GBD 2013 Mortality and Causes of Death Collaborators. Global, regional, and national age-sex specific all-cause and cause-specific mortality for 240 causes of death, 1990-2013: a systematic analysis for the Global Burden of Disease Study 2013. Lancet. 2015;385:117-171.
2. Boutayeb A, Boutayeb S. The burden of non communicable diseases in developing countries. Int J Equity Health. 2005;4:2.
3. WHO Regional Office for Africa. The Maputo declaration on strengthening of laboratory systems. 2008. http://www.who.int/diagnostics_laboratory/Maputo-Declaration_2008.pdf. Accessed February 27, 2015.
4. Bendavid E, Holmes CB, Bhattacharya J, et al. HIV development assistance and adult mortality in Africa. JAMA. 2012;307:2060-2067.
5. Lima VD, Granich R, Phillips P, et al. Potential impact of the US President's emergency plan for AIDS relief on the tuberculosis/HIV coepidemic in selected sub-Saharan African countries. J Infect Dis. 2013;208:2075-2084.
6. Abimiku AG; Institute of Human Virology, University of Maryland School of Medicine PEPFAR Program. Building laboratory infrastructure to support scale-up of HIV/AIDS treatment, care, and prevention: in-country experience. Am J Clin Pathol. 2009;131:875-886.
7. Zhao J, Lama M, Korenromp E, et al. Adoption of rapid diagnostic tests for the diagnosis of malaria, a preliminary analysis of the Global Fund program data, 2005 to 2010. PLoS One. 2012;7:e43549.
8. Yukich JO, Bennett A, Albertini A, et al. Reductions in artemisinin-based combination therapy consumption after the nationwide scale up of routine malaria rapid diagnostic testing in Zambia. Am J Trop Med Hyg. 2012;87:437-446.
9. Olmsted SS, Moore M, Meili RC, et al. Strengthening laboratory systems in resource-limited settings. Am J Clin Pathol. 2010;134:374-380.
10. Gershy-Damet GM, Rotz P, Cross D, et al. The World Health Organization African region laboratory accreditation process: improving the quality of laboratory systems in the African region. Am J Clin Pathol. 2010;134:393-400.
11. Fonjungo PN, Kebede Y, Messele T, et al. Laboratory equipment maintenance: a critical bottleneck for strengthening health systems in sub-Saharan Africa? J Public Health Policy. 2012;33:34-45.
12. Birx D, de Souza M, Nkengasong JN. Laboratory challenges in the scaling up of HIV, TB, and malaria programs: the interaction of health and laboratory systems, clinical research, and service delivery. Am J Clin Pathol. 2009;131:849-851.
13. Archibald LK, Reller LB. Clinical microbiology in developing countries. Emerg Infect Dis. 2001;7:302-305.
14. Petti CA, Polage CR, Quinn TC, et al. Laboratory medicine in Africa: a barrier to effective health care. Clin Infect Dis. 2006;42:377-382.
15. Amukele TK, Michael K, Hanes M, et al. External quality assurance performance of clinical research laboratories in sub-Saharan Africa. Am J Clin Pathol. 2012;138:720-723.
16. Guindo MA, Shott JP, Saye R, et al. Promoting good clinical laboratory practices and laboratory accreditation to support clinical trials in sub-Saharan Africa. Am J Trop Med Hyg. 2012;86:573-579.
17. Centers for Disease Control and Prevention. Framework for program evaluation in public health. MMWR Recomm Rep. 1999;48:1-40.
18. Bartlett JA, Ribaudo HJ, Wallis CL, et al. Lopinavir/ritonavir monotherapy after virologic failure of first-line antiretroviral therapy in resource-limited settings. AIDS. 2012;26:1345-1354.
19. Violari A, Lindsey JC, Hughes MD, et al. Nevirapine versus ritonavir-boosted lopinavir for HIV-infected children. N Engl J Med. 2012;366:2380-2389.
20. DAIDS. Guidelines for Good Clinical Laboratory Practices (GCLP) standards. http://www.niaid.nih.gov/LabsAndResources/resources/DAIDSClinRsrch/Documents/gclp.pdf. Accessed March 2, 2016.
21. Fitzgibbon JE, Wallis CL. Laboratory challenges conducting international clinical research in resource-limited settings. J Acquir Immune Defic Syndr. 2014;65(suppl 1):S36-S39.
22. National Kidney Foundation. K/DOQI clinical practice guidelines for chronic kidney disease: evaluation, classification, and stratification. Am J Kidney Dis. 2002;39(2)(suppl 1):S1-S266.
23. Oxlade O, Piatek A, Vincent C, et al. Modeling the impact of tuberculosis interventions on epidemiologic outcomes and health system costs. BMC Public Health. 2015;15:141.
24. Menzies NA, Cohen T, Lin HH, et al. Population health impact and cost-effectiveness of tuberculosis diagnosis with Xpert MTB/RIF: a dynamic simulation and economic evaluation. PLoS Med. 2012;9:e1001347.
25. Moon AM, Biggs HM, Rubach MP, et al. Evaluation of in-hospital management for febrile illness in northern Tanzania before and after 2010 World Health Organization guidelines for the treatment of malaria. PLoS One. 2014;9:e89814.
26. Chandler CI, Mwangi R, Mbakilwa H, et al. Malaria overdiagnosis: is patient pressure the problem? Health Policy Plan. 2008;23:170-178.
27. Chandler CI, Jones C, Boniface G, et al. Guidelines and mindlines: why do clinical staff over-diagnose malaria in Tanzania? A qualitative study. Malar J. 2008;7:53.
