Abstract.
Rapid diagnostic tests (RDTs) are one of the primary tools used for parasitological confirmation of suspected cases of malaria. To ensure accurate results, health-care workers (HCWs) must perform the RDT correctly. Between 2015 and 2017, trained supervisors visited 3,603 facilities in eight African countries to assess RDT testing performance and conduct outreach training and supportive supervision activities, using a 12-point checklist to determine whether key steps were being performed. The proportion of HCWs performing each step correctly improved by between 1.1 and 21.0 percentage points from the first to the third visit. Health-care worker scores were averaged to calculate facility scores, which were high: the average score across all facilities was 85% during the first visit and increased to 91% during the third visit. A regression analysis of these facility scores estimated that, holding key facility factors equal, facility performance improved by 5.3 percentage points from the first to the second visit (P < 0.001) but by only 0.6 percentage points between the second and third visits (P = 0.10). Factors strongly associated with higher scores included the presence of a laboratory worker at the facility and the presence of at least one staff member with previous formal training in malaria RDTs. These findings confirm that a comprehensive quality assurance system of training and supportive supervision consistently, and often significantly, improves RDT performance.
INTRODUCTION
In 2010, the World Health Organization (WHO) revised its Guidelines for the Treatment of Malaria to recommend that all suspected cases of malaria be confirmed with either microscopy or a rapid diagnostic test (RDT) before treatment.1 On World Malaria Day in 2012, WHO launched the T3: Test. Treat. Track. initiative, which urged malaria-endemic countries, donors, and the global malaria community to scale up diagnostic testing, treatment, and surveillance.2 Rapid diagnostic tests are one of the primary tools used by health-care workers (HCWs) to confirm a suspected diagnosis of malaria. Rapid diagnostic tests have been used across sub-Saharan Africa along the entire spectrum of the health system, from community-level to reference-level facilities. In 2015, 240 million RDTs were procured in sub-Saharan Africa, including both falciparum-only and combination tests.3 Compared with microscopy, RDTs are relatively quick and easy to use, are cost-effective compared with presumptive treatment and microscopy in many settings, and require fewer supplies, which makes them less subject to supply chain problems.4,5 When used correctly, RDTs have been found to be more sensitive than routine microscopy but less sensitive than expert microscopy.6–8 As with any diagnostic test, however, it is imperative to ensure correct results through the use of quality-assured RDTs and through proper use and interpretation of the test by HCWs.
Mechanisms exist to check the quality of the RDTs themselves, including the WHO–Foundation for Innovative New Diagnostics (FIND) Malaria RDT Evaluation Program.9 Manufacturing practices and regional- and country-level lot testing provide further quality control of the RDT itself. Furthermore, positive control wells are undergoing field testing at national- and subnational-level reference facilities to confirm that RDTs are functioning correctly.10,11
In addition to ensuring the quality of the device itself, it is important that HCWs perform the test properly because errors in key performance steps can decrease the specificity and sensitivity of RDTs. Materials to guide RDT use have been developed by WHO, FIND, other international programs, and countries themselves. The World Health Organization recommends that performance monitoring be a component of diagnostic quality assurance systems. Although several field trials have examined the ability of community health workers to use RDTs to diagnose malaria, few have examined the performance of facility-level HCWs in adhering to the steps necessary to conduct the test in the context of large-scale programs.12–18
The U.S. President’s Malaria Initiative–funded MalariaCare project (2012–2017) implemented a malaria case management quality assurance system, which included outreach training and supportive supervision (OTSS) visits. During OTSS, trained supervisors used a standardized checklist to observe HCWs conducting malaria RDTs, fever case management consultations, and malaria microscopy at facilities where microscopy was performed, and collected facility and provider information relevant to the provision of case management. They provided individualized, real-time feedback on steps performed correctly and incorrectly, and developed action plans to address broader issues at the health facility level.19
We present here an analysis of programmatic data representing more than 9,000 observations from eight countries focusing on performance of both clinical and laboratory staff using RDTs at health facilities across administrative levels of health-care systems. The results provide an indication of the impact of supportive supervision on RDT performance, within the context of providing comprehensive case management in busy health facilities in resource-challenged countries. Results presented here can help decision-makers when designing and implementing an RDT quality assurance system.
MATERIALS AND METHODS
Program setting and population.
Eight countries in sub-Saharan Africa supported by MalariaCare to improve the quality of case management were targeted for OTSS implementation: Democratic Republic of the Congo, Ghana, Kenya, Malawi, Mali, Mozambique, Tanzania, and Zambia. Within each country, the ministries of health selected both public and private facilities (or stand-alone laboratories—Democratic Republic of the Congo only) within regions or provinces agreed on by the ministry and the USAID mission. Within facilities, any staff who performed RDTs were eligible for observation and feedback. Staff were classified as laboratory workers or non-laboratory workers and were asked whether they had received any formal training in RDTs before the observation.
Program description.
Between 2015 and 2017, trained ministry of health clinical and laboratory supervisors observed eligible staff performing RDTs during routine OTSS visits (some facilities received visits before 2015, but with a different checklist). Before the visits, supervisors received a minimum of 3 days of training in supervision skills and use of the checklist. Observed staff were evaluated using a 12-point checklist covering manufacturer-recommended steps for conducting RDTs; the checklist was adapted from WHO materials and modified by MalariaCare’s technical and field teams (Table 1). For each item on the checklist, observers were prompted to check “yes” or “no” as to whether the health worker being observed correctly performed the step. Six of the items were considered “minimum standard” steps (i.e., the steps most essential for proper performance of the RDT); they are highlighted in bold font in Table 1. The score for each observation was weighted so that the six minimum standard steps accounted for two-thirds of the score, whereas the remaining six steps accounted for one-third. Facility scores were calculated as the average of one to three HCW observation scores per facility. The checklist also captured certain aspects of the health facility outside of the observation, such as the presence of RDT standard operating procedures (SOPs) and bench aids. For each facility, the amount of time between visits depended on a number of factors, including ministry of health schedules and project budgets, but it was typically 3–6 months.
Table 1. RDT observation checklist used during OTSS visits
Procedure* | Score |
---|---|
Put on a new pair of gloves | ☐Yes ☐No |
Check expiry date on the package | ☐Yes ☐No |
Write client ID on cassette | ☐Yes ☐No |
Clean patient finger with antiseptic/alcohol and allow finger to dry | ☐Yes ☐No |
Collect the right amount of blood with the inverted cup | ☐Yes ☐No |
Put collected blood into the correct well | ☐Yes ☐No |
Apply right amount of buffer to correct well | ☐Yes ☐No |
Wait for the correct time before reading results | ☐Yes ☐No |
Read the test results correctly | ☐Yes ☐No |
Record the test results in a laboratory register | ☐Yes ☐No |
Dispose of used tests, transfer devices, and other blood-contaminated material | ☐Yes ☐No |
Dispose of used lancet in sharps container | ☐Yes ☐No |
OTSS = outreach training and supportive supervision; RDT = rapid diagnostic test.
* Checklist steps in bold (“minimum standard” steps) are considered more important and are collectively weighted twice as much as the other steps when calculating scores.
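The exact score computation is not spelled out beyond the weighting described above, but one formulation consistent with it is the following, where $m$ is the number of the six minimum standard steps performed correctly and $o$ is the number of the other six steps performed correctly:

$$\text{RDT score (\%)} = 100\left(\frac{2}{3}\cdot\frac{m}{6} + \frac{1}{3}\cdot\frac{o}{6}\right)$$

Under this formulation, for example, a health worker who performed all six minimum standard steps but only three of the remaining steps would score $100\,(2/3 + 1/6) \approx 83.3\%$.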
Analysis of implementation data.
Results gathered during observation of HCWs performing RDTs were captured on a paper-based checklist and subsequently entered into a Microsoft Access database or were entered directly into MalariaCare’s Electronic Data System (a system using District Health Information System version 2 [DHIS2; Oslo, Norway] software to store and analyze data).20 Data from both databases were imported into Stata 14 (StataCorp, 2015. Stata Statistical Software: Release 14.1. College Station, TX: StataCorp LP) for data cleaning and analysis.
We analyzed results at two levels: 1) at the observation level, for HCW performance on individual checklist steps, and 2) at the facility level, for overall facility improvement in RDT competencies over time. We reported performance on each of the 12 steps in the checklist as the proportion of observed HCWs who performed the step correctly during the first, second, and third visits. Only observations with no missing steps, at facilities with at least one complete observation at each time point, were included in the analysis. If an HCW was observed more than once during a particular visit, we included only the results of that HCW’s first observation.
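As an illustration only, these inclusion rules could be implemented in Stata along the following lines; step1–step12, facility_id, visit_number, hcw_id, and obs_order are hypothetical variable names, not the project’s actual data fields.

```stata
* Keep only observations with no missing checklist steps (hypothetical names).
egen miss_steps = rowmiss(step1-step12)
keep if miss_steps == 0

* Keep only each health worker's first observation within a given visit.
bysort facility_id visit_number hcw_id (obs_order): keep if _n == 1

* Keep facilities with at least one complete observation at visits 1, 2, and 3.
bysort facility_id: egen has_v1 = max(visit_number == 1)
bysort facility_id: egen has_v2 = max(visit_number == 2)
bysort facility_id: egen has_v3 = max(visit_number == 3)
keep if has_v1 & has_v2 & has_v3
```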
To estimate the impact of OTSS on facility RDT performance by visit and over time, we calculated scores for the first, second, and third visits by averaging the HCW observation scores included in the individual step analysis for each visit. We report descriptive statistics for average facility performance by country and visit number for the subset of facilities with scores for all three visits, to show trends among a consistent set of facilities over time. In addition, we conducted a multilevel, mixed-effects linear regression to estimate the independent effects of other health facility characteristics that could potentially affect scores. Any complete observation occurring at a facility with at least two visits was included in that analysis; the facility visit was the unit of analysis, and visits were clustered at the health facility level. Health facility characteristics collected as part of the broader OTSS checklist and included as covariates in the regression were as follows: 1) whether the facility received an OTSS visit before the first visit used in the analysis; 2) whether the health facility was a hospital; 3) whether RDT SOPs and bench aids were available in the facility; 4) whether at least one of the observations was of a laboratory worker; 5) whether at least one of the observations was of someone who had received formal RDT training within the 2 years before the visit; and 6) country.
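A minimal Stata sketch of this model is shown below; rdt_score, obs_score, visit_number, facility_id, and the covariate indicators are hypothetical variable names rather than the project’s actual fields, and the random intercept for facility accounts for clustering of visits within facilities.

```stata
* Average the HCW observation scores into one RDT score per facility visit.
collapse (mean) rdt_score = obs_score, by(facility_id visit_number)

* After merging visit-level covariates back in, fit a multilevel linear model
* with a random intercept for each health facility.
mixed rdt_score i.visit_number prior_otss has_rdt_sop has_bench_aid ///
      obs_trained obs_lab_worker is_hospital i.country || facility_id:
```

In a specification of this kind, the coefficients on the visit indicators correspond to the visit effects reported in Table 6, and the country indicators play the role of the country control noted in that table’s footnote.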
RESULTS
A total of 3,648 health facilities received at least two OTSS visits: 1,625 health facilities received two visits, 1,769 received three visits, and 254 received four or more visits. Of the 3,648 health facilities visited, 3,603 had at least one complete observation. Earlier versions of the checklist did not ask the supervisor why observations were incomplete; among checklists that did ask, 13% reported an RDT stockout and 7% reported that no RDTs were conducted at the time of the visit. The remainder gave no reason, either because the question was skipped or because the supervisor mistakenly considered the checklist complete.
Observation performance on individual steps.
Among the 2,023 health facilities with at least three visits, 1,348 (67%) had at least one complete observation at each of the first three visits. In total, 6,350 observations were analyzed for their performance on steps in the RDT checklist (Table 2).
Table 2. Observations and facilities eligible for and included in the analysis of individual checklist steps
Observations | Facilities | |
---|---|---|
Number eligible* | 9,633 | 2,023 |
Number included in analysis† | 6,350 | 1,348 |
Percentage of total | 66% | 67% |
* Observations occurred at facilities that received at least three outreach training and supportive supervision visits.
† Observations were complete, and facilities had at least one complete observation at each of the first three visits.
Table 3 compares the proportion of HCWs who performed each checklist step correctly, by visit number, with percentage point differences between the first and second visits, the second and third visits, and the first and third visits. Among these observations, the proportion of HCWs who performed each step correctly improved between the first and third visits. Most of the improvement occurred between the first and second visits (1.1–17.3 percentage points). Among the six most important steps, all were performed correctly in at least 85% of observations during the first visit, with the exception of waiting for the correct amount of time before reading the results (77.3% of observations performed this step correctly). Other steps performed relatively inconsistently at baseline included putting on a new pair of gloves (54.1%) and checking the RDT expiry date (56.9%). Performance on these steps improved to 75.1% and 75.8%, respectively, by the third visit.
Table 3. Proportion of observed HCWs performing each RDT checklist step correctly, by visit number, and percentage point changes between visits
RDT observation checklist step* | Visit number | Percentage point change in score | ||||
---|---|---|---|---|---|---|
First | Second | Third | First to second | Second to third | First to third | |
Number of observations | 2,066 | 2,079 | 2,205 | – | – | – |
Put on a new pair of gloves | 54.1% | 68.7% | 75.1% | +14.6 | +6.4 | +21.0 |
Check expiry date on the package | 56.9% | 74.2% | 75.8% | +17.3 | +1.6 | +18.9 |
Write client ID on cassette | 85.4% | 89.1% | 86.8% | +3.7 | −2.3 | +1.4 |
Clean patient finger with antiseptic/alcohol and allow to dry | 84.1% | 91.1% | 92.2% | +7.0 | +1.1 | +8.1 |
Collect the right amount of blood with the inverted cup† | 89.2% | 93.7% | 94.6% | +4.5 | +0.9 | +5.4 |
Put collected blood into the correct well | 98.4% | 99.5% | 99.5% | +1.1 | 0.0 | +1.1 |
Apply right amount of buffer to correct well | 85.3% | 90.6% | 91.0% | +5.3 | +0.4 | +5.7 |
Wait for the correct time before reading results | 77.3% | 86.9% | 89.0% | +9.6 | +2.1 | +11.7 |
Read the test results correctly | 96.7% | 98.7% | 98.9% | +2.0 | +0.2 | +2.2 |
Record the test results in a laboratory register | 87.5% | 93.8% | 95.0% | +6.3 | +1.2 | +7.5 |
Dispose of used tests, transfer devices, and other blood-contaminated material | 85.7% | 90.0% | 92.1% | +4.3 | +2.1 | +6.4 |
Dispose of used lancet in sharps container | 88.9% | 91.4% | 92.9% | +2.5 | +1.5 | +4.0 |
* RDT = rapid diagnostic test.
† Checklist steps in bold (“minimum standard” steps) are considered more important and are collectively weighted twice as much as the other steps when calculating the overall RDT performance score.
Facility performance.
The 6,350 observations from the 1,348 health facilities that had a complete score for each of the first three visits were averaged to produce a facility score for each visit. Figure 1 reports the average facility score at each visit, by country. Among these facilities, average RDT scores improved from 85% during the first visit to 91% during the third. Scores improved in every country between the first and third visits. In four of seven countries, average scores improved both between the first and second visits and between the second and third visits.
Regression analysis.
Among the 9,350 OTSS visits conducted at facilities with at least two visits, 8,148 (87.1%) visits, covering 3,603 health facilities, had at least one complete observation and were included in the unadjusted regression model. A total of 7,277 (77.8%) visits, covering 3,518 health facilities, had complete observation scores and data on all covariates and were included in the adjusted regression model (Table 4).
Table 4. Visits and facilities eligible for and included in the regression analyses
Visits | Facilities | |
---|---|---|
Number eligible* | 9,350 | 3,648 |
Number included in unadjusted regression (% of eligible)† | 8,148 (87%) | 3,603 (99%) |
Number included in adjusted regression (% of eligible)‡ | 7,277 (78%) | 3,518 (96%) |
* Visits that occurred at facilities that received at least two outreach training and supportive supervision visits.
† Visits had at least one complete observation.
‡ Visits had at least one complete observation and had data for all covariates included in the adjusted regression.
Table 5 presents the characteristics of the facility visits included in the adjusted regression model and the mean RDT facility score for each covariate. Scores at the 25th and 75th percentiles are also given, indicating the score at or below which the bottom 25% of health facilities scored and the score at or above which the top 25% scored, respectively. Of the facility visits, 35.9% were a first visit, 40.2% were a second visit, and 23.9% were a third visit. Mean scores ranged from 85.6% during the first visit to 91.4% during the third visit. Most visits occurred at facilities that had received a previous OTSS visit (59.9%), had RDT SOPs (56.1%) and/or bench aids (59.0%) available, and had at least one observed staff member who had received formal training in RDTs before the visit (68.5%). A minority of visits were conducted at hospitals (10.7%) or at facilities where at least one observed staff member was a laboratory worker (23.7%). One MalariaCare-supported country (Country 1) accounted for 42.1% of the visits.
Table 5. Characteristics of facility visits included in the adjusted regression model and corresponding RDT facility scores
Characteristic | Visits | RDT score | |||
---|---|---|---|---|---|
% | N | Mean | 25th percentile | 75th percentile | |
Visit number | |||||
1 | 35.9 | 2,612 | 85.6 | 77.8 | 94.4 |
2 | 40.2 | 2,925 | 91.4 | 88.9 | 100.0 |
3 | 23.9 | 1,739 | 91.4 | 88.9 | 100.0 |
Across all visits | |||||
Health facility received prior OTSS visits | |||||
No | 40.1 | 2,918 | 90.0 | 86.1 | 100.0 |
Yes | 59.9 | 4,359 | 88.8 | 83.3 | 100.0 |
Facility has RDT SOPs | |||||
No | 43.9 | 3,195 | 88.1 | 83.3 | 100.0 |
Yes | 56.1 | 4,082 | 90.2 | 83.3 | 100.0 |
Facility has RDT bench aids | |||||
No | 41.0 | 2,984 | 86.8 | 80.6 | 97.2 |
Yes | 59.0 | 4,293 | 91.0 | 86.1 | 100.0 |
At least one obs. has formal training in RDTs | |||||
No | 31.5 | 2,292 | 85.7 | 77.8 | 96.3 |
Yes | 68.5 | 4,985 | 90.9 | 87.0 | 100.0 |
At least one obs. is a laboratory worker | |||||
No | 76.3 | 5,552 | 88.3 | 83.3 | 100.0 |
Yes | 23.7 | 1,725 | 92.5 | 88.9 | 100.0 |
Facility is a hospital | |||||
No | 89.3 | 6,498 | 89.2 | 83.3 | 100.0 |
Yes | 10.7 | 779 | 90.2 | 83.3 | 100.0 |
Country | |||||
Country 1 | 42.1 | 3,064 | 89.3 | 83.3 | 100.0 |
Country 2 | 1.3 | 95 | 88.4 | 83.3 | 100.0 |
Country 3 | 23.1 | 1,681 | 91.0 | 88.9 | 100.0 |
Country 4 | 12.4 | 902 | 85.8 | 77.8 | 94.4 |
Country 5 | 5.1 | 371 | 87.7 | 80.6 | 97.2 |
Country 6 | 1.8 | 131 | 87.6 | 83.3 | 100.0 |
Country 7 | 10.1 | 735 | 91.3 | 88.9 | 100.0 |
Country 8 | 4.0 | 291 | 88.2 | 83.3 | 98.1 |
obs. = Observed staff; OTSS = outreach training and supportive supervision; RDT = rapid diagnostic test; SOP = standard operating procedure.
In the unadjusted regression, a facility’s RDT score was estimated as 5.7 percentage points higher during the second OTSS visit (P < 0.001) and 6.7 percentage points higher during the third OTSS visit (P < 0.001) when compared with the first. After adjusting for facility characteristics, a facility’s RDT score was an estimated 5.3 percentage points higher during the second OTSS visit (P < 0.001) and 5.9 percentage points higher during the third OTSS visit (P < 0.001) when compared with the first (Table 6). Thus, the estimated increase in score between the second and third visits was only 0.6 percentage points and not statistically significant at the 5% level (P = 0.10).
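The 0.6-percentage-point contrast between the second and third visits (5.9 minus 5.3) can be tested directly from a fitted model of this kind; for example, continuing the hypothetical Stata specification sketched in the Methods section:

```stata
* Difference between the third- and second-visit coefficients and its P-value.
lincom 3.visit_number - 2.visit_number
```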
Table 6. Adjusted multilevel regression of facility RDT scores on visit number and facility characteristics
Characteristic | Coefficient | 95% CI |
---|---|---|
Visit number (ref: first visit) | |
2 | 5.3 | [4.7, 5.9] |
3 | 5.9 | [5.2, 6.6] |
Facility received prior OTSS visits | 1.1 | [−0.4, 2.5] |
Facility has RDT SOPs | 0.8 | [0.1, 1.5] |
Facility has RDT bench aids | 2.4 | [1.7, 3.0] |
At least one obs. has formal training in RDTs | 4.4 | [3.7, 5.0] |
At least one obs. is a laboratory worker | 2.9 | [2.1, 3.6] |
Facility is a hospital | 0.1 | [−1.0, 1.2] |
Constant | 79.3 | [77.6, 80.9] |
Observations | 7,277 | – |
obs. = observed staff; OTSS = outreach training and supportive supervision; ref. = reference category; RDT = rapid diagnostic test; SOP = standard operating procedure. Note: Regressions included a control variable for the country; results are not reported here.
Aside from the OTSS visits themselves, the factor most strongly associated with higher RDT performance was having a staff member who had received formal training in RDTs before being observed (4.4 percentage points, P < 0.001). With the exception of whether the facility was a hospital and whether the facility had received a prior OTSS visit (conducted with a different checklist), the other factors included in the model also had small but statistically significant associations with higher RDT scores.
DISCUSSION
The use of RDTs to confirm a suspected diagnosis of malaria is crucial in efforts to control and eliminate malaria. The validity, cost-effectiveness, and relative ease of use of RDTs contribute to their ability to decrease the use of clinical diagnosis, and thus the overuse of artemisinin combination therapy, and to expand access to diagnostic testing to remote populations. Conducting manufacturer-recommended sequential steps properly when performing RDTs is essential to ensuring an accurate test result. MalariaCare has demonstrated that through the implementation of a case management quality assurance system that includes OTSS, high levels of performance in conducting RDTs can be achieved, improved, and maintained among facility-based HCWs at both hospital and non-hospital facilities. Health-care workers in all eight countries were able to perform RDTs at a high level in the context of case management consultations during supportive supervision visits and improve and maintain that performance over consecutive visits.
Initial performance of RDTs was relatively high in all countries. Although many of the facilities in five of the eight countries had received previous OTSS visits by MalariaCare (and some facilities in three countries had also received OTSS visits under MalariaCare’s predecessor, the Improving Malaria Diagnostics project) during which a less detailed checklist was used, those included in the analysis that received no prior OTSS visits by MalariaCare also demonstrated high overall performance. The high performance, even at baseline across all countries and across multiple levels of the health-care system, demonstrates that high-quality RDT use can occur at scale. This information could help program managers decide how best to balance microscopy and RDT use to extend access to testing. For example, program managers may reserve microscopy for management of severe malaria patients at inpatient level facilities while focusing on RDTs in all outpatient departments, particularly in high-volume facilities. This would allow staff to see a higher number of patients in a more time-efficient manner. In addition, these data can help determine how to best allocate time dedicated to various tasks associated with malaria case management during OTSS. If RDT performance is consistently high, more time can be spent mentoring in areas that need improvement, potentially including clinical management, patient counseling, microscopy, record keeping, or organizational issues within the facility.
Across the eight countries in the analysis, several of the steps that might most affect the result (such as collecting the correct amount of blood, dispensing it into the correct well, using the correct amount of buffer, and reading the test result correctly) were performed at a high level even at baseline. Considering the caseloads frequently seen at facilities in Africa, it is perhaps not surprising that waiting the correct amount of time before declaring a test negative was initially performed at a lower level than many other key steps; however, it is important to note the improvement in performance of this step over successive OTSS visits. Properly disposing of lancets, wearing gloves, and checking the expiry date on the RDT package received lower scores. Although improvements were seen in subsequent visits, these steps need further attention: expired tests can lead to erroneous results, and both patient and provider safety are important aspects of the delivery of health services. Aggregate data such as those from this study can help program managers identify which RDT performance steps need improvement, and supervisors can then focus on those steps during OTSS visits.
National Malaria Control Programs and subnational health management teams need to know whether a facility is ready to deliver quality case management to a population, which, in addition to the availability of drugs, supplies, and equipment, is highly influenced by staff knowledge and performance of case management tasks, including correct use of RDTs. Overall facility scores are an average of individual HCW observations during OTSS, so as the proportion of observed steps performed correctly improved between the first and last OTSS visits, overall facility performance improved. Improvement was greatest between the first and second OTSS visits and smaller between the second and third, demonstrating that even a single visit can improve performance, particularly in facilities with staff trained in the previous 2 years. Because less improvement was seen between the second and third rounds of OTSS, program managers may consider either addressing RDTs less frequently during subsequent OTSS visits or homing in on steps performed poorly across facilities.
Analysis of covariates affecting performance can also help managers direct resources more effectively. Aside from the OTSS visits themselves, the factors most strongly associated with higher performance were having at least one HCW formally trained in RDTs in the previous 2 years and having at least one of the observed staff be a laboratory worker. Rapid diagnostic test training is less intensive than microscopy training in terms of time and material resources; typically, it can be carried out in a day or less. It can be designed as cascade training to rapidly expand the numbers trained or can be incorporated into other case management courses. Laboratory personnel performed better than non-laboratory staff; the reasons for this were not investigated further. Outpatient departments, where clinical observations occurred, are often high volume, with consultations lasting several minutes or less to accommodate the number of patients presenting with fever or other complaints. The resulting, often excessive, workload may explain why clinician performance was lower than that of laboratory staff. Moreover, some of the steps assessed, such as blood collection, are a routine part of laboratory work, giving those staff an advantage. This question is worth further investigation. Standard operating procedures and bench aids were also shown to have a marginally positive influence on performance when other covariates were controlled for, although it is difficult to conclude whether that small effect was truly due to their presence or to other factors. Although it is unlikely that SOPs or bench aids in the absence of training and supervision would lead to high or improved performance, they are likely helpful in combination with other components of a quality assurance system.
As countries continue to conduct OTSS visits at these facilities, performance data may guide program managers on the number of visits needed to achieve and maintain a given level of performance. External quality assurance mechanisms are costly and time-consuming, and improving efficiency by targeting low performance can help reduce the time spent during visits, the frequency of visits, and cost. Reviewing results after each visit could enable ministries of health to concentrate future external quality assurance activities on low-performing facilities. Although not evaluated here, promoting internal quality assurance mechanisms within facilities has the potential to further decrease the need for more costly external quality assurance activities. If OTSS can build the capacity of laboratory and clinical facility-based managers to periodically monitor RDT performance themselves, the need for national and subnational staff to visit the facility for this purpose could be diminished, reducing associated costs such as transport and per diem. Because the results presented here come from an implementation program and not a research study, comparisons were limited to the same facilities over successive visits, and assessments over time were not conducted at facilities that did not receive OTSS.
CONCLUSION
These programmatic data across eight countries in sub-Saharan Africa confirm, through more than 9,000 observations of staff at multiple administrative levels of facilities, that HCWs can conduct RDTs at a high level and that training and supportive supervision, as components of a diagnostics quality assurance system, can contribute to the improvement and maintenance of RDT skills. National programs should consider such factors when determining how to structure diagnostics testing programs and quality assurance systems for RDTs.
Acknowledgments:
We acknowledge the contributions of government and project staff in all eight countries who were the principal actors implementing the project.
REFERENCES
1. World Health Organization, 2010. Guidelines for the Treatment of Malaria, 3rd edition. Geneva, Switzerland: WHO. Available at: http://www.who.int/malaria/publications/atoz/9789241549127/en/. Accessed January 9, 2018.
2. World Health Organization, 2012. T3: Test. Treat. Track. Scaling up Diagnostic Testing, Treatment and Surveillance for Malaria. Geneva, Switzerland: WHO. Available at: http://www.who.int/malaria/publications/atoz/test_treat_track_brochure.pdf. Accessed January 9, 2018.
3. World Health Organization, 2017. World Malaria Report 2017. Geneva, Switzerland: WHO. Available at: http://www.who.int/malaria/publications/world-malaria-report-2017/en/. Accessed January 9, 2018.
4. Shillcutt S, Morel C, Goodman C, Coleman P, Bell D, Whitty CJM, Mills A, 2008. Cost-effectiveness of malaria diagnostic methods in sub-Saharan Africa in an era of combination therapy. Bull World Health Organ 86: 101–110.
5. Ansah EK, Epokor M, Whitty CJM, Yeung S, Hansen KS, 2013. Cost-effectiveness analysis of introducing RDTs for malaria diagnosis as compared to microscopy and presumptive diagnosis in central and peripheral public health facilities in Ghana. Am J Trop Med Hyg 89: 724–736.
6. Batwala V, Magnussen P, Nuwaha F, 2010. Are rapid diagnostic tests more accurate in diagnosis of Plasmodium falciparum malaria compared to microscopy at rural health centres? Malar J 9: 1–8.
7. De Oliveira AM, et al., 2009. Performance of malaria rapid diagnostic tests as part of routine malaria case management in Kenya. Am J Trop Med Hyg 80: 470–474.
8. Azikiwe CCA, Ifezulike CC, Siminialayi IM, Amazu LU, Enye JC, Nwakwunite OE, 2012. A comparative laboratory diagnosis of malaria: microscopy versus rapid diagnostic test kits. Asian Pac J Trop Biomed 2: 307–310.
9. World Health Organization, 2017. WHO-FIND Malaria RDT Evaluation Programme. Geneva, Switzerland: WHO. Available at: http://www.who.int/malaria/areas/diagnosis/rapid-diagnostic-tests/rdt-evaluation-programme/en/. Accessed November 20, 2017.
10. World Health Organization, 2017. Lot Testing: Pre and Post-Purchase. Geneva, Switzerland: WHO. Available at: http://www.who.int/malaria/areas/diagnosis/rapid-diagnostic-tests/evaluation-lot-testing/en/. Accessed November 20, 2017.
11. World Health Organization, 2016. Positive Control Wells. Geneva, Switzerland: WHO. Available at: http://www.who.int/malaria/areas/diagnosis/rapid-diagnostic-tests/positive-control-wells/en/. Accessed November 20, 2017.
12. Manyando C, Njunju EM, Chileshe J, Siziya S, Shiff C, 2014. Rapid diagnostic tests for malaria and health workers’ adherence to test results at health facilities in Zambia. Malar J 13: 1–8.
13. Chinkhumba J, Skarbinski J, Chilima B, Campbell C, Ewing V, San Joaquin M, Sande J, Ali D, Mathanga D, 2010. Comparative field performance and adherence to test results of four malaria rapid diagnostic tests among febrile patients more than five years of age in Blantyre, Malawi. Malar J 9: 1–9.
14. Akagbosu C, 2013. Review of Adherence to Malaria Rapid Diagnostic Testing in Different Health Care Settings. Master’s Thesis. Boston, MA: Boston University. Available at: https://hdl.handle.net/2144/12040. Accessed December 13, 2018.
15. Kabaghe AN, Visser BJ, Spijker R, Phiri KS, Grobusch MP, van Vugt M, 2016. Health workers’ compliance to rapid diagnostic tests (RDTs) to guide malaria treatment: a systematic review and meta-analysis. Malar J 15: 1–11.
16. Hawkes M, Katsuva J, Masumbuko CK, 2009. Use and limitations of malaria rapid diagnostic testing by community health workers in war-torn Democratic Republic of Congo. Malar J 8: 1–8.
17. Harvey SA, Jennings L, Chinyama M, Masaninga F, Mulholland K, Bell DR, 2008. Improving community health worker use of malaria rapid diagnostic tests in Zambia: package instructions, job aid and job aid-plus-training. Malar J 7: 1–12.
18. Ruizendaal E, Dierickx S, Peeters Grietens K, Schallig HDFH, Pagnoni F, Mens PF, 2014. Success or failure of critical steps in community case management of malaria with rapid diagnostic tests: a systematic review. Malar J 13: 1–17.
19. Eliades MJ, Alombah F, Wun J, Burnett S, Martin T, Kutumbakana S, Dena R, Saye R, Lim P, Hamilton P, 2019. Operational considerations and costs of malaria case management supportive supervision. Am J Trop Med Hyg 100: 861–867.
20. Burnett S, Wun J, Evance I, Davis K, Smith G, Lussiana C, Tesha G, Quao A, Robertson M, Hamilton P, 2019. Introduction of an electronic tool for improved data quality and data use during malaria case management supportive supervision. Am J Trop Med Hyg 100: 889–898.