Journal of Clinical Microbiology. 2012 Dec;50(12):4083–4086. doi: 10.1128/JCM.01355-12

Use of a Prequalification Panel for Rapid Scale-Up of High-Throughput HIV Viral Load Testing

Lesley E Scott a, Sergio Carmona a,b, Natasha Gous a, Pamela Horsfield a, Melanie MacKay c, Wendy Stevens a,b
PMCID: PMC3503004  PMID: 22993182

Abstract

Increased access to antiretroviral drugs expands needs for viral load (VL) testing. South Africa's National Health Laboratory Service responded to demands by implementing two testing platforms in 17 laboratories within 8 months. An industry partner's collaboration, training programs, and method verification with a VL prequalification panel ensured testing quality and rapid implementation.

TEXT

More than 1.4 million patients receive antiretroviral (ARV) treatment in South Africa through the public health services (1, 8). In response to such demand, 17 HIV viral load (VL) testing laboratories were identified for the national HIV program between May 2010 and January 2011. These were distributed throughout the 9 provinces at both rural and urban laboratory centers. Six sites were newly implemented (n = 6), and the remaining sites required minimal renovation to accommodate the new VL testing platform footprints. A total of 75 technical staff members were trained. Two viral load testing platforms were selected for implementation using a stringent tender-based procurement process. The first platform was the COBAS TaqMan HIV-1 (CAP/CTM) version 2.0 docked system (Roche Molecular Systems, Branchburg, NJ) (n = 20), which combines the extraction of total nucleic acids on the COBAS AmpliPrep (CAP) with real-time PCR on the COBAS TaqMan analyzer (CTM). Each site with this platform also installed a preanalytical sample-handling P630 device (Roche), ensuring further automation. The second platform was the Abbott m2000sp (n = 23) for nucleic acid extraction and the m2000rt for amplification and detection to perform the Abbott RealTime HIV-1 assay (Abbott Molecular Inc., Des Plaines, IL). Both automated systems are based on real-time PCR and were connected to the National Health Laboratory Service's (NHLS) laboratory information system (LIS). Both assays have been validated (4), including in-country validation (5, 7), and therefore did not require further validation but, rather, verification after platform placement to ensure adequate site performance. The verification was required within 1 week of installation to ensure each instrument was "fit for purpose" for clinical sample testing and result reporting.

The material used to prepare the verification panels was a combination of known HIV-positive and -negative plasma packs (∼200 ml) obtained from the South African National Blood Services (SANBS). SANBS tests all plasma using the Procleix Ultrio blood donor screening test (Gen-Probe and Novartis Diagnostics, Emeryville, CA) to confirm positive or negative HIV status. Each pack was quantitated on both VL testing platforms at the NHLS Charlotte Maxeke Academic Hospital PCR reference laboratory in Johannesburg and kept frozen (−70°C) until the results were obtained. The bulk plasma packs were then thawed in a 37°C water bath and diluted (using negative plasma) or pooled and aliquoted into 6 bulk lots calculated to produce a range of VLs. Once manufactured, these bulk lots were mixed thoroughly at room temperature on an orbital shaker (Labotec, SA) and then retested on the Abbott RealTime HIV-1 assay to confirm the correct dilution/pooling of VL. The 6 bulk lots, each a maximum volume of ∼120 ml, consisted of one negative lot and five quantifiable lots at the following target viral loads: 500 copies/ml (2.7 log copies/ml), 1,000 copies/ml (3.0 log copies/ml), 5,000 copies/ml (3.7 log copies/ml), 50,000 copies/ml (4.7 log copies/ml), and 100,000 copies/ml (5.0 log copies/ml). These were then assembled into a 42-member verification panel (Table 1) to be processed in the order stated. Each level was repeated five times and interspersed with 17 negative samples. Forty-two tubes were chosen to ensure coverage of two racks in the CAP/CTM v2 assay.
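The dilution step described above follows directly from conservation of copies (C1V1 = C2V2). The sketch below illustrates the arithmetic only; the function name and the stock concentration used in the example are illustrative and not from the study, and it assumes the HIV-negative diluent plasma contributes zero copies.

```python
def dilution_volumes(stock_copies_per_ml, target_copies_per_ml, final_volume_ml):
    """Return (stock_ml, diluent_ml) needed to reach a target VL.

    Uses C1*V1 = C2*V2, assuming the HIV-negative plasma diluent
    contributes no copies. Illustrative helper, not from the study.
    """
    if target_copies_per_ml > stock_copies_per_ml:
        raise ValueError("cannot dilute up to a higher concentration")
    stock_ml = final_volume_ml * target_copies_per_ml / stock_copies_per_ml
    return stock_ml, final_volume_ml - stock_ml

# e.g., a hypothetical 1,000,000 copies/ml stock diluted to the
# 5,000 copies/ml bulk lot at the panel's ~120 ml maximum volume
stock, diluent = dilution_volumes(1_000_000, 5_000, 120.0)
print(round(stock, 2), round(diluent, 2))  # 0.6 ml stock, 119.4 ml diluent
```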

TABLE 1.

Panel constitution

Tube no. Panel member VL
1 Negative
2 500 copies/ml
3 1,000 copies/ml
4 5,000 copies/ml
5 Negative
6 50,000 copies/ml
7 100,000 copies/ml
8 Negative
9 Negative
10 500 copies/ml
11 1,000 copies/ml
12 5,000 copies/ml
13 Negative
14 50,000 copies/ml
15 100,000 copies/ml
16 Negative
17 Negative
18 500 copies/ml
19 1,000 copies/ml
20 5,000 copies/ml
21 Negative
22 50,000 copies/ml
23 100,000 copies/ml
24 Negative
25 Negative
26 500 copies/ml
27 1,000 copies/ml
28 5,000 copies/ml
29 Negative
30 50,000 copies/ml
31 100,000 copies/ml
32 Negative
33 Negative
34 500 copies/ml
35 1,000 copies/ml
36 5,000 copies/ml
37 Negative
38 50,000 copies/ml
39 100,000 copies/ml
40 Negative
41 Negative
42 Negative

The verification panel was shipped to each site by courier in dry ice packaging. Testing was performed directly from the dry-ice-transported panel, or panels were stored at −70°C until testing. Testing at each site was performed over 1 day with the same lot numbers of reagents and controls per instrument. Once the results were obtained, they were entered by the site personnel into a template MS Excel spreadsheet and emailed to the Department of Molecular Medicine and Hematology, Research Diagnostic Laboratory, NHLS, in Johannesburg. Statistical parameters measured were accuracy, precision, carryover, and limit of blank. The mean (average), standard deviation (SD), and coefficient of variation (CV) were calculated in each category for both the untransformed values (copies/ml) and the log-transformed values (log copies/ml). Levels of acceptable variability (within-run precision) were determined as previously reported (2) and according to the international Viral Quality Assurance (VQA) program (Rush Presbyterian-St. Luke's Medical Center, Chicago, IL): ≤35% CV on the untransformed copies/ml values and ≤0.19 SD on the log-transformed copies/ml values. The log difference (reference − new site), or bias, was calculated using the log-transformed values; an acceptable bias was considered ≤0.3 log copies/ml across all categories. In addition, the percentage similarity was calculated (6) across all log-transformed quantified values irrespective of category, and the percentage similarity SD and percentage similarity CV were calculated. All Abbott RealTime HIV-1 results were compared to one Abbott RealTime HIV-1 panel, and all CAP/CTM v2 results were compared to one CAP/CTM v2 panel tested on both platforms at the central reference laboratory in Johannesburg and considered the reference standard for statistical analysis. Outcomes were reported back to the sites via email in a standard report document. Carryover was flagged if any negative sample directly following a high-VL sample returned a positive result, and the limit of the blank was flagged if any negative sample, irrespective of position, returned a positive result. The reports distributed to the sites recorded the maximum SD and CV.
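The per-level acceptance arithmetic described above can be sketched as follows. This is a minimal illustration, not the study's actual scoring code: the function name is hypothetical, the sample (n − 1) form of the SD and the grouping of five replicates per level are assumptions, and only the thresholds (≤35% CV, ≤0.19 log SD, ≤0.3 log bias) are taken from the text.

```python
import math

# Acceptance thresholds stated in the text (VQA-derived)
MAX_CV_PCT = 35.0    # CV on untransformed copies/ml, within a level
MAX_LOG_SD = 0.19    # SD on log10 copies/ml, within a level
MAX_LOG_BIAS = 0.30  # |mean(reference) - mean(new site)| in log copies/ml

def mean(xs):
    return sum(xs) / len(xs)

def sd(xs):
    """Sample standard deviation (n - 1 denominator, an assumption)."""
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

def verify_level(site_copies, ref_copies):
    """Score one panel level (five replicates per level in this design)
    against the acceptance thresholds above."""
    cv_pct = 100.0 * sd(site_copies) / mean(site_copies)
    site_log = [math.log10(x) for x in site_copies]
    ref_log = [math.log10(x) for x in ref_copies]
    log_sd = sd(site_log)
    bias = mean(ref_log) - mean(site_log)  # reference minus new site
    return {
        "cv_pct": cv_pct,
        "log_sd": log_sd,
        "bias": bias,
        "pass": (cv_pct <= MAX_CV_PCT and log_sd <= MAX_LOG_SD
                 and abs(bias) <= MAX_LOG_BIAS),
    }
```

For example, five site replicates of the 5,000 copies/ml level scattering a few percent around the reference mean would pass all three criteria, while a level whose mean sat more than 0.3 log from the reference would be flagged on bias alone.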

Forty-five instruments across the 17 laboratories were initially enrolled in this verification program: 2 did not pass verification and were removed, and 43 passed and were released for clinical testing. One panel was tested on each instrument, and where problems were identified, additional panels were prepared for testing. Four instrument verifications flagged results outside the acceptable statistical criteria (2 failed the limit of the blank, 1 failed the bias criterion, and in 1 the target was not detected in a positive sample). Four instruments were moved after initial verification due to laboratory renovations and were verified again before clinical sample testing. Twelve instruments did not generate verification panel results because of run losses, for the reasons listed in Table 2. A total of 65 panel units were tested, 59 of them from the same bulk manufacturing lot (panel 1). A second bulk plasma batch (panel 2) was manufactured to ship six additional units to the sites; it was prepared from different plasma packs but followed the same manufacturing protocol as the first bulk.

TABLE 2.

Problems identified through the program between May 2010 and January 2011 from 45 instruments placed in the field and enrolled in the prequalification program

Problem reported No. of instruments
Results flagged outside the acceptable statistical criteria 4
    Failed the limit of the blank 2
    Failed the bias 1
    Target not detected in a positive sample 1
Instruments moved after initial verification due to laboratory renovations and were verified again before clinical sample testing 4
Instruments did not generate verification panel results due to run losses 12
    Power outage 2
    Instrument error 1
    Transcription error 2
    Incorrect carriers 1
    Instrument alignment 1
    Incorrect sample storage 1
    Failed controls 1
    Faulty thermocycler 1
    Instrument replaced 2

Table 3 lists the summary statistics of the final verification values for the 43 instruments, excluding the two instruments that were replaced. Panel 1 and panel 2 results are also shown separately, as the reference comparators were different. In addition to the within-platform comparison, a section is also included for the across-platform comparison, in which the Roche (1 instrument) and Abbott (1 instrument) assays are compared to each other using the reference panel results for panel 1 and panel 2. This comparison shows the maximum percentage similarity CV obtained between the two assays, which was therefore used as the maximum limit for within-platform precision acceptability. Any percentage similarity CV value above this level (maximum, 2.9%) was flagged for further investigation.

TABLE 3.

Summary statistics for two manufactured panels showing the performance across and within platforms for their final verification values

Comparison and laboratory no.a Instrument no. Maximum CV Maximum SD Maximum bias SD bias % similarity CV
Reference panel comparison across platformsb
    Reference panel 1 22.5 0.1 0.32 0.1 2.9
    Reference panel 2 34.5 0.2 0.25 0.2 2.5
Panel 1 comparison within Roche instruments
    1 1 15.4 0.07 −0.21 0.11 1.1
2 23.2 0.1 −0.26 0.13 1.6
    2 3 26.1 0.11 0.09 0.08 1.1
4 30.2 0.14 −0.08 0.12 1.5
5 13.2 0.06 −0.14 0.07 0.9
6 30.1 0.15 −0.32 0.14 2.3
7 18 0.08 −0.12 0.13 1.5
8 17.8 0.08 −0.16 0.08 1.4
    3 9 36.3 0.16 −0.19 0.18 2.3
10 17.7 0.08 −0.16 0.09 1
    4 11 20.1 0.09 −0.14 0.1 1.7
12 25.9 0.1 −0.26 0.16 2
    5 13 25.9 0.1 −0.18 0.18 2
14 25.5 0.12 −0.11 0.12 1.4
    6 15 22.6 0.09 −0.17 0.12 1.4
16 22.2 0.1 0.12 0.1 1.6
    7 17 23.3 0.1 −0.15 0.1 1
    8 18 28.4 0.15 −0.13 0.13 0.9
19 19.4 0.09 −0.11 0.1 1.1
    Avg 23.2 0.1 −0.1 0.1 1.5
Panel 1 comparison within Abbott instruments
    2 20 18.1 0.08 −0.22 0.12 1.3
    9 21 13.3 0.06 0.15 0.08 1.2
22 24.1 0.1 −0.08 0.14 1.5
23 18.1 0.08 −0.1 0.07 1.1
24 36.4 0.17 0.09 0.15 1.6
    10 25 11.6 0.05 0.16 0.09 1.6
26 16.5 0.07 0.06 0.08 1
    11 27 23.8 0.12 0.12 0.12 1.6
28 19.1 0.08 0.1 0.09 1.2
    12 29 12.8 0.06 0.17 0.11 1.6
30 19 0.09 0.15 0.13 1.4
    13 31 26.6 0.1 −0.05 0.12 1.3
    14 32 20.5 0.09 0.06 0.11 1.2
    15 33 26.8 0.14 0.03 0.11 1.1
34 29.1 0.12 −0.06 0.17 1.5
    16 35 27.4 0.12 0.1 0.16 1.7
36 16.9 0.08 −0.11 0.09 1.1
    17 37 15.1 0.06 −0.1 0.08 0.9
38 33.2 0.13 0.11 0.15 1.7
    Avg 21.5 0.1 0.0 0.1 1.3
Panel 2 comparison within Roche instruments
    2 39 18.2 0.1 −0.25 0.19 2.4
    7 40 17 0.08 −0.19 0.13 1.8
41 15.7 0.1 −0.23 0.12 2
    Avg 17.0 0.1 −0.2 0.1 2.1
Panel 2 comparison within Abbott instruments
    17 42 23.4 0.11 0.12 0.14 1.7
43 16.1 0.07 0.11 0.14 1.6
    Avg 19.8 0.1 0.1 0.1 1.7
a Numbers in the stub are laboratory numbers unless otherwise indicated (i.e., reference panel numbers).
b Roche (n = 1) versus Abbott (n = 1).

The average maximum CV and bias for both panels on both platforms were similar, showing that both platforms are suitable for HIV VL testing on clinical samples from the region. Two instruments (numbers 9 and 24) had maximum CVs on the untransformed values of >35% but were considered borderline acceptable, as their biases were within acceptable limits. Apart from the instrument errors identified through this program and listed in Table 2, a further 11 individual sample tubes (0.4%; 11/2,730) generated errors (1 failed internal control and 10 invalid results due to detected clots).

Statistical analysis for verification may be daunting, especially when implementing different platforms, different samples, and different scoring parameters; however, the design of this 42-sample panel was well suited to the run sizes of both platforms, and the selected panel members represented the assays' dynamic range and the clinically relevant treatment switch range (500 to 1,000 copies/ml) well. The five replicates at each level also appeared suitable for identifying precision problems within these clinically important ranges, and the 17 negative samples appeared adequate for investigating carryover and the limit of the blank. The percentage similarity CV was useful as an overall measure of variability to highlight instrument problems. If the within-assay percentage similarity CV exceeds the between-assay percentage similarity CV (>2.9%), further investigation is needed within each category using the bias, SD, and CV.
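The percentage similarity screen described above can be sketched as follows. This is an illustrative reading of the approach in reference 6, assuming each reference/test pair is expressed as the mean of the two log10 values as a percentage of the reference value; the function names and the cited 2.9% cutoff applied here are taken from the text, but the exact pairing used in the study is an assumption.

```python
import math

def percentage_similarity(ref_log, test_log):
    """Mean of a reference/test pair of log10 VL values, expressed as a
    percentage of the reference value (our reading of reference 6)."""
    return 100.0 * ((ref_log + test_log) / 2.0) / ref_log

def ps_cv(ref_logs, test_logs):
    """Percentage similarity CV across all quantified pairs: the SD of
    the percentage similarity values over their mean, as a percentage."""
    ps = [percentage_similarity(r, t) for r, t in zip(ref_logs, test_logs)]
    m = sum(ps) / len(ps)
    s = math.sqrt(sum((p - m) ** 2 for p in ps) / (len(ps) - 1))
    return 100.0 * s / m

def needs_investigation(ref_logs, test_logs, limit_pct=2.9):
    """Flag an instrument whose overall %similarity CV exceeds the
    between-assay limit (2.9% in Table 3)."""
    return ps_cv(ref_logs, test_logs) > limit_pct
```

A perfectly agreeing instrument yields a percentage similarity of 100% for every pair and a percentage similarity CV of 0%, so only instruments whose pairwise scatter exceeds the between-assay limit are sent for the per-category bias/SD/CV workup.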

This prequalification program design, its central location, and its rapid deployment (an unscheduled, on-demand scheme) with local resources proved suitable for both VL testing platforms. The process identified errors related to both the instrument and the laboratory operator and proved useful in training and managing new sites (installation, on-site training, and verification within 1 week). It identified the need to manufacture larger bulk batch sizes but also showed that standard laboratory equipment is suitable for such bulk manufacture. Plasma packs were selected as the testing material for instrument verification because they were relatively easy to source (local blood bank material), truly represented clinical testing material (predominantly subtype C [3]), and produced few clot-detection sample errors. However, other testing materials, such as viral cultures in a synthetic matrix, spiked negative plasma, and plasmid preparations, may be investigated. The potential future use of dried blood spots (DBS) for HIV VL testing will also require instrument verification and hence a specialized DBS verification panel, which is being investigated. An ongoing VL assessment program is now being developed to sustain quality VL testing.

Footnotes

Published ahead of print 19 September 2012

REFERENCES

  • 1. Avert. 2012. HIV and AIDS in South Africa. Avert, Horsham, West Sussex, United Kingdom. http://www.avert.org/aidssouthafrica.htm
  • 2. Brambilla D, Granger S, Bremer J. 2000. Variation in HIV RNA assays at low RNA concentration, abstr 774. Abstr. 7th Conf. Retrovir. Oppor. Infect., San Francisco, CA, 30 January to 2 February 2000.
  • 3. Papathanasopoulos M, Hunt G, Tiemessen CT. 2003. Evolution and diversity of HIV-1 in Africa—a review. Virus Genes 26:151–163.
  • 4. Schumacher W, et al. 2007. Fully automated quantification of human immunodeficiency virus (HIV) type 1 RNA in human plasma by the COBAS AmpliPrep/COBAS TaqMan system. J. Clin. Virol. 38:304–312.
  • 5. Scott L, Carmona S, Stevens W. 2009. Performance of the new Roche Cobas AmpliPrep-Cobas TaqMan version 2.0 human immunodeficiency virus type 1 assay. J. Clin. Microbiol. 47:3400–3402.
  • 6. Scott LE, Galpin JS, Glencross DK. 2003. Multiple method comparison: statistical model using percentage similarity. Cytometry B Clin. Cytom. 54:46–53.
  • 7. Scott LE, et al. 2009. Evaluation of the Abbott m2000 RealTime human immunodeficiency virus type 1 (HIV-1) assay for HIV load monitoring in South Africa compared to the Roche Cobas AmpliPrep-Cobas Amplicor, Roche Cobas AmpliPrep-Cobas TaqMan HIV-1, and BioMerieux NucliSENS EasyQ HIV-1 assays. J. Clin. Microbiol. 47:2209–2217.
  • 8. South African National AIDS Council. 2010. The national HIV counselling and testing campaign strategy. South African National AIDS Council, Pretoria, South Africa. http://www.westerncape.gov.za/other/2010/6/hct_campaign_strategy_2_3_10_final.pdf
