PLoS One. 2024 Jan 24;19(1):e0287272. doi: 10.1371/journal.pone.0287272

Re-testing as a method of implementing external quality assessment program for COVID-19 real time PCR testing in Uganda

Erick Jacob Okek 1,2,3,*, Fredrick Joshua Masembe 2, Jocelyn Kiconco 1, John Kayiwa 1, Esther Amwine 1,3, Daniel Obote 2, Stephen Alele 2, Charles Nahabwe 5, Jackson Were 4, Bernard Bagaya 2, Stephen Balinandi 1,3, Julius Lutwama 1,2,3, Pontiano Kaleebu 1,2,3
Editor: Chika Kingsley Onwuamah
PMCID: PMC10807774  PMID: 38265993

Abstract

Background

Significant milestones have been achieved in the development of COVID-19 diagnostic technologies. The Government of the Republic of Uganda, through the Ministry of Health, mandated the Uganda Virus Research Institute (UVRI) to ensure the quality of COVID-19 diagnostics. Re-testing was one of the methods initiated by the UVRI to implement external quality assessment of COVID-19 molecular diagnostics.

Method

Participating laboratories were required by UVRI to submit their already tested and archived nasopharyngeal samples together with the corresponding metadata. These samples were re-tested at UVRI using the WHO Berlin protocol, and the UVRI results were compared with those of the primary testing laboratories to ascertain performance agreement for the qualitative and quantitative results obtained. Microsoft Excel and GraphPad Prism (version 8) were used in the analysis. Bar graphs, pie charts and line graphs were used to compare performance agreement between the reference laboratory and the primary testing laboratories.

Results

Eleven (11) Ministry of Health/Uganda Virus Research Institute accredited COVID-19 laboratories participated in the re-testing of quality control samples. Five of the eleven (45%) primary testing laboratories had 100% performance agreement with the National Reference Laboratory for the final test result. Even where the final test outcome (negative or positive) was concordant between UVRI and the primary testing laboratories, there were still differences in cycle threshold (CT) values. These differences were not statistically significant except between Tenna & Pharma Laboratory and the UVRI (p = 0.0296). The differences in CT values were not skewed towards either the National Reference Laboratory (UVRI) or the primary testing laboratories but varied from one laboratory to another. In the remaining 6/11 (55%) laboratories with discrepancies in the aggregate test results, only samples initially tested and reported as positive by the primary laboratories were found to be false positives on re-testing by the UVRI COVID-19 National Reference Laboratory.

Conclusion

False positives were detected from public, private not-for-profit and private testing laboratories in almost equal proportion. There is a need for standardization of molecular testing platforms in Uganda. There is also an urgent need to improve the laboratory quality management systems of molecular testing laboratories in order to minimize such discrepancies.

Introduction

Laboratory External Quality Assessment (EQA) programs are intended to ensure that quality, timely and accurate results are released from testing laboratories, with the ultimate goal of improving the quality of patient care [1]. COVID-19 diagnostic technology has evolved relatively fast since the disease outbreak in 2019 [2]. This technological advancement is also associated with a number of errors that require stringent regulation and monitoring of the accuracy, reliability, and precision of new testing technologies [3]. One such approach is to institute a reliable, efficient, and consistent external quality assessment program, with proper and timely root cause analysis conducted and corrective actions taken for non-conforming laboratories [4].

In Uganda, antigen-based rapid diagnostic test kits (Ag-RDTs) and polymerase chain reaction (mainly real time PCR) testing platforms have been validated and approved for use [5]. Currently, there are 67 Ministry of Health approved laboratories across Uganda conducting COVID-19 PCR testing. The Uganda Virus Research Institute COVID-19 National Reference Laboratory and the National Quality Assurance Committee have approved and periodically conduct three types of EQA: proficiency testing (PT) panels, re-testing/re-checking of quality control samples, and on-site supervision. In the PT panel program, UVRI scientists calculate, prepare, concentrate, pool, aliquot, package and distribute the panels to MoH approved laboratories. Laboratories conduct the tests and relay results to UVRI, where sorting, analysis, report writing and dissemination of findings to relevant stakeholders are done.

In re-testing/re-checking, technical experts are sent to laboratories to randomly identify positive and negative samples from the archives and biobanks of primary testing laboratories, package them and return with them. At the UVRI, re-testing is done and the test outcomes are compared with those of the primary laboratories; reports are written and shared with relevant stakeholders. In both EQA methods, root cause analysis for under-performing laboratories is done and corrective actions are taken. On-site supervision, the third EQA method, is usually implemented alongside proficiency testing panels and re-testing.

When a COVID-19 PCR result became a travel requirement in Uganda, demand for the test increased to a point where public facilities could not manage the workload. The Government subsequently apportioned part of this task to the private sector but maintained its oversight role. The majority of these laboratories were set up, assessed, and approved for operation in a rush because of national and international demand; moreover, molecular diagnostics is still a relatively new testing method in Uganda and many African countries. To ensure the reliability of test results and continued improvement of laboratory quality management systems, there was an urgent need to strengthen external quality assurance programs. This study evaluated the effectiveness of re-testing as a method of external quality assessment for COVID-19 testing.

Materials and methods

Selection of the assessment and activation team

A team of experts was selected from the UVRI Arbovirology Laboratory and dispatched to various laboratories across the country to retrieve samples for re-testing as part of the quality control process. The National COVID-19 Quality Assurance Committee selected, vetted, and appointed a COVID-19 National Laboratory Assessment and Activation Team composed of experts in molecular biology, policy and guideline development, Laboratory Quality Management Systems (LQMS), and a representative from the Allied Health Professionals Council (the diagnostics regulatory arm of the Ministry of Health). These experts were required to conduct quarterly visits (at intervals of three months) to the approved testing sites. Key activities included mentorship, supervision, vertical audits, and regulatory inspection.

Participating health facilities and laboratories

A formal communication from the Ugandan Ministry of Health and the Uganda Virus Research Institute was sent to all 67 Ministry of Health assessed and approved COVID-19 molecular testing facilities in Uganda prior to the visit by the experts. Thirty-three (33) of the laboratories had not been archiving (storing) samples and were excluded from participation. Twenty-one (21) of the remaining thirty-four (34) did not have sufficient accompanying metadata (clients’ demographics and clinical information) and were also excluded; clients’/patients’ details are crucial for statistically meaningful interpretation of the test results. Of the remaining thirteen (13) sites, two submitted insufficient sample volumes and were rejected according to the UVRI Arbovirology rejection criteria [6]. Part of each aliquot from the positive samples retrieved was also used for genomic sequencing. A total of eleven (11) laboratories qualified for inclusion in the program: Mulago National Referral Hospital (public), Examina Medical Laboratory (private), MAIA Medicals (private), Test & Fly Laboratory (private), Kabale Regional Referral Hospital (public), Same Day Laboratory (private), Bwindi Community Hospital (private not-for-profit), Medsafe Hospital (private), Gulu University Multifunctional Laboratory (teaching and research), Safari Laboratory (private) and Tenna & Pharma (private).

Sample and metadata retrieval from the primary testing laboratory

The experts were given cool boxes stocked with ice packs, thermometers for temperature monitoring, absorbent materials, and cotton wool. Public and university laboratories such as Gulu University, Kabale Regional Referral Hospital and Mulago National Referral Hospital have freezers provided by Government and were able to store their samples at -20°C. The rest of the private facilities had small freezers but still managed to archive samples at -20°C. However, we noted that due to limited storage space and the high test positivity rates at the time, some samples could have been stored at inappropriate temperatures or under other undesirable conditions, though we have no proof of this.

On reaching each facility, UVRI scientists interacted with the laboratory technicians (quality officers), who provided a list of all positive and negative nasopharyngeal samples in their biobanks and archives for the past three (3) months. About 1 ml of the liquid aliquots was pipetted into cryovial tubes at each participating laboratory; these were the quality control samples, and part of each aliquot was kept separately for genomic sequencing. Labels on the samples were cross-checked against the duplicate sample identification in either the electronic database or the record books. For laboratories with not more than ten (10) PCR-positive COVID-19 samples, all of them were picked for re-testing. For those with more than 10 positive samples, probability (systematic) sampling was done: the total number of samples was counted and divided by a number that gave a convenient selection interval. The same was done for negative samples. For positives, a total of 10 samples was selected (though there were instances where more were selected), while for negatives, a total of 20 samples was selected. Only samples stored for not more than three (3) months were considered for re-testing.
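
The interval-based selection described above is systematic sampling of the archive list. The sketch below illustrates that selection logic only; the function name and the archive sizes in the example are hypothetical and not part of the study's tooling.

```python
import math

def systematic_sample(sample_ids, target_count):
    """Select about `target_count` samples from an ordered archive list by
    stepping through it at a fixed interval (systematic sampling)."""
    if len(sample_ids) <= target_count:
        return list(sample_ids)  # small archives: take everything, as was done for <=10 positives
    interval = math.ceil(len(sample_ids) / target_count)
    return sample_ids[::interval][:target_count]

# Illustrative use: select up to 10 positives and 20 negatives from archive listings.
positives = [f"POS-{i:03d}" for i in range(1, 48)]   # 47 archived positive samples (hypothetical)
negatives = [f"NEG-{i:03d}" for i in range(1, 161)]  # 160 archived negative samples (hypothetical)
print(len(systematic_sample(positives, 10)), len(systematic_sample(negatives, 20)))
```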

Packaging, documentation & transportation

Selected samples were triple-packaged according to the procedure described by Karthik K et al., 2020 [7]. Briefly, selected nasopharyngeal samples were placed in the primary nasopharyngeal container, then inside a zip-lock bag (secondary container) and finally in a cool box (tertiary container) with contents as previously described. A desiccant was placed inside the zip-lock bag to absorb any moisture, and cotton wool was placed between the ice packs and the zip-lock bags inside the cool box to protect the samples from moisture. For the metadata, the following information was captured: sample ID, testing facility, final test result, sample collection date, sample type, and cycle threshold (CT) value at the different gene targets. Clinical details such as presenting signs and symptoms and disease severity, among others, were documented on the laboratory investigation form. Packaged samples with accompanying metadata were securely placed inside the vehicles and transported to the Uganda Virus Research Institute, where they were stored temporarily at appropriate temperatures prior to testing.

Re-testing of nasopharyngeal samples at the UVRI

TaqMan real time PCR was used.

Principle of detection and amplification. TaqMan real time PCR uses a set of forward and reverse primers along with a probe that binds the DNA/RNA between the primer binding sites [8]. The probe carries a fluorescent reporter molecule on its 5’ end and a non-fluorescent quencher molecule on its 3’ end. While the probe is intact, no fluorescence is emitted. During amplification, the polymerase cleaves the reporter from the probe, producing fluorescence. The increase in fluorescence occurs only when the target sequence is complementary to the probe and is amplified during the PCR reaction; non-specific amplification is not detected because cleavage of the probe is required.
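
As background context rather than part of the protocol described here, the cycle threshold (CT) values compared throughout this report can be related to an idealized amplification model in which the target amount grows with a constant efficiency E each cycle; the notation below is generic qPCR notation, not values from this study.

```latex
% Idealized real-time PCR kinetics (assumption: constant per-cycle efficiency E, 0 < E <= 1)
% N_0: starting target amount, N_c: amount after c cycles, N_t: fluorescence detection threshold
N_c = N_0 \,(1 + E)^{c}
\qquad\Longrightarrow\qquad
C_T = \frac{\log\left(N_t / N_0\right)}{\log\left(1 + E\right)}
```

Under this model a lower starting amount of template yields a higher CT, and at perfect efficiency (E = 1) a one-cycle difference corresponds to roughly a two-fold difference in starting material, which is why CT values are treated as a semi-quantitative proxy for viral load.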

Equipment preparation. All work surfaces, pipettes, centrifuges, and other equipment were cleaned and decontaminated prior to use. Decontamination agents included 5% bleach, 70% ethanol, DNAzap™ and RNase AWAY™, to minimize the risk of nucleic acid contamination.

Nucleic acid extraction. The QIAamp Viral RNA Mini Kit was used for RNA extraction, following the method of Liu Y et al., 2020 [9]. Briefly, 140 μL of each nasopharyngeal specimen referred from the primary testing sites for quality control purposes was extracted and eluted in 60 μL of buffer AVE. The SARS-CoV-2 negative control in the kit was extracted with the same protocol as the specimens. The internal control in the kit was added to the extraction mixture at 1 μL per test to monitor the whole process. The manufacturer’s recommended procedures (except as noted above) were followed for sample extraction.

Assay setup. The method of Shen M et al., 2020 [10] was used to set up the reaction master mix. Briefly, a negative control and a positive control were included in each run. In the clean hood of the reagent setup room, the super mix and RT-PCR enzyme were handled on ice or on a cold block to keep them cold during preparation and use. The super mix was thawed prior to use and then mixed with the enzyme mix by inverting the tube five times or until the technician judged mixing to be adequate. The mixture was centrifuged for 5 seconds to collect the contents at the bottom of the tube, and the tube was placed in a cold rack. The number of reactions (N) to be set up per assay was then determined; excess reaction mix was prepared for the negative and positive controls to allow for possible pipetting error. After addition of the reagents, the reaction mixtures were agitated on a vortex mixer, centrifuged for 5 seconds to collect the contents at the bottom of the tube, and returned to the cold rack. Reaction plates were set up in a 96-well cooler rack, and 20 μL of master mix was then dispensed into each PCR tube. The reaction plate was covered and moved to the specimen nucleic acid handling area.
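
A minimal sketch of the master mix arithmetic described above (number of reactions plus an excess to cover pipetting error) is shown below; the per-reaction volumes and the 10% excess factor are illustrative assumptions rather than values from the kit protocol used in the study.

```python
def master_mix_volumes(n_samples, per_rxn_supermix_ul=15.0, per_rxn_enzyme_ul=5.0,
                       n_controls=2, excess_fraction=0.10):
    """Total super mix and enzyme volumes for one run.

    n_samples       : number of specimen reactions on the plate
    n_controls      : negative + positive control reactions (2 assumed)
    excess_fraction : extra mix prepared to cover pipetting losses (10% assumed)
    The per-reaction volumes are placeholders; real values come from the kit insert.
    """
    n_reactions = n_samples + n_controls
    factor = n_reactions * (1.0 + excess_fraction)
    return {
        "reactions": n_reactions,
        "supermix_ul": round(per_rxn_supermix_ul * factor, 1),
        "enzyme_ul": round(per_rxn_enzyme_ul * factor, 1),
        "total_master_mix_ul": round((per_rxn_supermix_ul + per_rxn_enzyme_ul) * factor, 1),
    }

# Example: a plate with 30 quality control samples plus the two controls.
print(master_mix_volumes(30))
```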

Template addition. Nucleic acid samples, including the positive and extracted negative control tubes, were gently vortexed for approximately 5 seconds and centrifuged for 5 seconds to collect the contents at the bottom of the tube, then placed in a cold rack. Carefully, 5.0 μL of each sample, including the positive and negative controls, was pipetted into its well; the other sample wells were kept covered during addition, and tips were changed after each addition. The column to which a sample had been added was securely capped to prevent cross-contamination and to ensure sample tracking. Gloves were changed often, and whenever necessary, to avoid contamination.

Creation and running of the PCR on the Applied Biosystems 7500 real time PCR instrument

The Applied Biosystems 7500 real time PCR instrument software was launched, a new window was created, and a new experiment was chosen. Experimental properties were selected, after which targets and samples were selected. Whilst UVRI has several testing platforms, we opted to use the Applied Biosystems instrument in order to ensure uniformity and consistency in result analysis and interpretation.

Testing platforms for COVID19 detection used by different laboratories across Uganda

A total of eight different molecular detection platforms were in use for COVID-19 detection by the participating in-country laboratories. Two are closed systems (GeneXpert and U-STAR) while the others are open (Table 1). A closed system requires the use of reagents or cartridges recommended by the manufacturer, whereas an open PCR system allows the use of any other reagents compatible with the equipment. Being a national reference laboratory, UVRI has four testing platforms (Applied Biosystems, Quant Studio, Biorad and GeneXpert) (Table 1). All these testing platforms were validated by the UVRI and the Ministry of Health using clearly defined standard operating procedures. During re-testing, UVRI used only the Applied Biosystems platform with the Berlin protocol; this was meant to ensure uniformity and reduce the margin of error that could arise from inter-platform differences.

Table 1. Real time PCR testing platforms used for SARS-CoV-2 detection by various Ugandan laboratories.

Facility name | RT-PCR testing platform | Comment
Mulago NRH | 16 module GeneXpert | Closed system
Kabale RRH | 16 module GeneXpert | Closed system
Gulu University multifunctional Lab | Bioer Lineage System | Open system
Bwindi Community Hospital | Magnetic Induction Cycler | Open system
MAIA group of Laboratories | Bioer Fluorescent Detection System | Open system
Test & Fly Laboratory | Rotorgene | Open system
EXAMINA diagnostics Laboratory | U-STAR Technologies | Closed system
Tenna & Pharma Laboratory | SLAN 96 P Real time PCR | Open system
Same Day Laboratory | Biorad CFX 96 | Open system
Medsafe Hospital LTD | Quant Studio 5 | Open system
Safari Laboratory | Biorad CFX 96 | Open system
Uganda Virus Research Institute | Applied Biosystems, Quant Studio 7, Biorad CFX and GeneXpert | A mixture of closed and open systems

Genes and proteins of SARS-CoV-2 detected and reported by PCR testing platforms across different laboratories in Uganda

The majority of the PCR testing platforms used by these laboratories report gene targets, while a few report the corresponding viral proteins, using different techniques but similar principles. The genes commonly detected are ORF1, the E-gene, and the N-gene. The Bioer fluorescence detection system used by MAIA Laboratories detects the ORF1ab gene through the FAM channel and amplifies the N-gene target sequence through the ROX channel (Fig 1). FAM (6-carboxyfluorescein) is a fluorescein-based dye widely used in oligonucleotide synthesis and molecular biology. The Bioer platform also detects the internal control (IC) in the VIC channel; VIC is a proprietary green fluorescent dye used to label oligonucleotides at the 5’ end. The rest of the platforms either directly detected the genes or used other techniques beyond the scope of this study. The N-gene and ORF1 were the targets most commonly used for COVID-19 detection by PCR testing platforms in Ugandan laboratories (32% each), followed by the E-gene (18%). Only 14% of the participating laboratories ran and reported an internal control before analyzing actual samples (Fig 1). It is a requirement of the laboratory quality management system that internal controls be run and documented as having passed before analysis of actual patients’ or clients’ samples. Utilization of the ORF1ab gene for COVID-19 detection was at 4%. All testing platforms detected and amplified at least two gene targets before confirming a positive test. On most platforms, a positive test was confirmed upon detection and amplification of the N-gene and ORF1, while some platforms detected and amplified the E-gene and ORF1.

Fig 1. Targets and genes reported by platforms of various testing laboratories in Uganda.


Key: ORF1 = open reading frame one. E-gene = Envelope gene. N-gene = Nucleocapsid gene. IC = Internal Control. ORF1ab = Open Reading Frame 1 ab.

Statistical methods

Cycle threshold values of the quality control samples re-tested by the UVRI were exported into a CSV file, facility by facility. The same identification number assigned to a sample by the primary testing laboratory was also assigned by UVRI during re-testing. At the UVRI, CT values were rounded off to two decimal places. A test was reported as negative if the CT values at two genes or targets exceeded 38.5.

To determine the level of performance agreement, results from the primary laboratories were aligned with those of UVRI sample by sample; matching was done by sample identification number. A result was reported as discordant when the test result from the reference laboratory did not agree with that of the primary testing laboratory, and as concordant when the UVRI result agreed with that of the primary testing laboratory.
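
A minimal sketch of the result-calling and concordance-matching logic described in the two preceding paragraphs is shown below, assuming CT values organized per sample and per gene target; the data layout and helper names are illustrative, while the 38.5 cut-off is the one stated above for the UVRI platform.

```python
CT_CUTOFF = 38.5  # UVRI per-target cut-off stated above

def call_result(ct_values):
    """Return 'Negative' if the CT value at two or more gene targets exceeds the
    cut-off, otherwise 'Positive'. `ct_values` maps gene target -> rounded CT."""
    above = sum(1 for ct in ct_values.values() if ct > CT_CUTOFF)
    return "Negative" if above >= 2 else "Positive"

def compare(primary_calls, uvri_ct_by_sample):
    """Match samples by identification number and flag each as concordant or discordant."""
    out = {}
    for sample_id, primary_call in primary_calls.items():
        uvri_call = call_result(uvri_ct_by_sample[sample_id])
        out[sample_id] = "concordant" if primary_call == uvri_call else "discordant"
    return out

# Illustrative data for a single re-tested sample (two gene targets, hypothetical values).
primary = {"MNRH-001": "Positive"}
uvri = {"MNRH-001": {"N-gene": 39.12, "ORF1": 40.05}}
print(compare(primary, uvri))  # -> {'MNRH-001': 'discordant'}
```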

For deeper analysis, the results were trimmed and exported to GraphPad Prism (version 8). Line graphs of N-gene CT values from UVRI were compared with those of the primary testing laboratory, sample by sample. Analysis of variance and the associated p values in GraphPad Prism (version 8) were used to determine whether there was any significant difference. Pie charts were drawn in Excel to show the utilization of different genes in RNA amplification and detection. To display performance agreement graphically, bar graphs were plotted in an Excel spreadsheet.
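
The statistical comparison itself was done in GraphPad Prism; purely as an illustration, an equivalent analysis could be sketched as below, assuming paired N-gene CT values for the same quality control samples from the primary laboratory and from UVRI (the values shown are hypothetical).

```python
# Illustrative re-analysis sketch (the study used GraphPad Prism, not this code).
from scipy.stats import f_oneway
import matplotlib.pyplot as plt

primary_ct = [22.4, 25.1, 27.8, 30.2, 33.6]  # hypothetical N-gene CT values, primary laboratory
uvri_ct    = [23.0, 26.4, 28.9, 31.0, 34.8]  # hypothetical N-gene CT values, UVRI re-test

# One-way ANOVA across the two laboratories (with two groups this is equivalent to a t-test).
stat, p_value = f_oneway(primary_ct, uvri_ct)
print(f"ANOVA: F = {stat:.3f}, p = {p_value:.4f}")

# Sample-by-sample line graph of CT values, mirroring the Fig 3 panels.
plt.plot(primary_ct, marker="o", label="Primary laboratory")
plt.plot(uvri_ct, marker="s", label="UVRI re-test")
plt.xlabel("Quality control sample")
plt.ylabel("N-gene CT value")
plt.legend()
plt.show()
```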

Results

Performance agreement between UVRI COVID19 national reference laboratory and the primary testing laboratories

A total of eleven (11) Ministry of Health approved COVID-19 PCR testing laboratories were considered for this study. These were the laboratories that submitted full metadata and sufficient sample volumes, had evidence of sample storage at the recommended temperature, and presented well-labelled quality control samples. Overall, 5/11 laboratories (45%) had 100% performance agreement with UVRI, while the other 6/11 (55%) had varied numbers of discrepant results compared with the National Reference Laboratory (Fig 2). Of the participating laboratories, two were public (Mulago National Referral Hospital and Kabale Regional Referral Hospital), seven were private for-profit (Examina, MAIA, Same Day, Test & Fly, Medsafe Hospital, Safari, Tenna & Pharma), one was private not-for-profit (Bwindi Community Hospital) and one was a university laboratory (Gulu University). Eight (8) of the laboratories are located within the Kampala metropolitan area, two in southwestern Uganda (Kabale Regional Referral Hospital and Bwindi Community Hospital) and one in northern Uganda (Gulu University Multifunctional Laboratory). Mulago National Referral Hospital and Gulu University had the highest number of discrepant results; three (3) samples from each of these laboratories, initially tested as positive, re-tested negative at the UVRI. They were followed by Kabale Regional Referral Hospital, Same Day Laboratory, Safari Laboratory, and Bwindi Community Hospital, each with one discrepant result (false positive) upon re-testing by the National Reference Laboratory (Fig 2). All samples initially tested as negative by the primary laboratories also tested negative at the UVRI and were thus excluded from both the plot and the overall analysis. Being the National Reference Laboratory, the UVRI test outcome was treated as the correct result, and the conclusion of a false result (positive or negative) was based on it.

Fig 2. Performance agreement between UVRI and initial testing laboratories.


Key: UVRI: Uganda Virus Research Institute. Examina: Examina Medical Laboratory. MAIA: MAIA Medicals. T & F: Test and Fly Laboratory. KRRH: Kabale Regional Referral Hospital. S.Day: Same Day Laboratory. BCH: Bwindi Community Hospital. M.SAFE: Medsafe Hospital Limited. GUV: Gulu University Multifunctional Laboratory.

CT values of N genes of samples with discrepant results between UVRI and primary testing laboratories

For quality control samples with discrepant results between the National Reference Laboratory and the primary testing laboratories, we further triangulated the aggregate result by conducting an in-depth analysis of the CT values. For Mulago NRH, the CT values of the three discrepant results were still within the acceptable range for a true positive COVID-19 PCR test at the UVRI. It is concerning that samples with CT values of 31.15, 34.9 and 31.4 re-tested negative at the UVRI National Reference Laboratory, whose cut-off CT is 38.5 (Table 2). Similar findings were made for Gulu University Multifunctional Laboratory, where three samples with CT values of 18.79, 18.84 and 28.27 that initially tested positive all turned negative upon re-testing at the UVRI. For Kabale RRH (CT = 38.9), Bwindi Community Hospital (42.1), Same Day (40.1) and Safari Laboratory (35.5), the N-gene CT values of the samples with discordant results were out of range for a true positive COVID-19 PCR test at the UVRI (Table 2).

Table 2. Comparison of CT values of samples with discrepant results between the UVRI COVID19 National Reference Laboratory and the primary testing laboratories.

MNRH | UVRI | KRRH | UVRI | BCH | UVRI | S.Day | UVRI | GUV | UVRI | Safari | UVRI
31.15 | >38.5 | 38.9 | >38.5 | 42.1 | >38.5 | 40.1 | >38.5 | 18.79 | >38.5 | 35.5 | >38.5
34.9 | >38.5 |  |  |  |  |  |  | 18.84 | >38.5 |  | 
31.4 | >38.5 |  |  |  |  |  |  | 28.27 | >38.5 |  | 

Key

Columns 1 & 2: CT values of the same samples tested by Mulago National Referral Hospital and the Uganda Virus Research Institute but with discrepant results.

Columns 3 & 4: CT values of the same sample, with discrepant results, tested by Kabale Regional Referral Hospital and the Uganda Virus Research Institute.

Columns 5 & 6: CT values of the same sample, with discrepant results, tested by Bwindi Community Hospital and the Uganda Virus Research Institute.

Columns 7 & 8: CT values of the same sample, with discrepant results, tested by Same Day Laboratory and the Uganda Virus Research Institute.

Columns 9 & 10: CT values of the same samples, with discrepant results, tested by Gulu University and the Uganda Virus Research Institute.

Columns 11 & 12: CT values of the same sample, with discrepant results, tested by Safari Laboratory and the Uganda Virus Research Institute.

Fluctuation in CT values of N-gene between the primary testing laboratory and the UVRI COVID19 National Reference Laboratory

Cycle threshold (CT) values of the N-gene of positive samples initially tested by the primary laboratories were compared with those of the National Reference Laboratory at the Uganda Virus Research Institute. These were samples that tested positive at both the primary laboratory and the National Reference Laboratory but differed in their CT values. Among all samples tested, none had exactly the same N-gene CT value at both the external site and the reference laboratory, although the differences were not statistically significant across the board. The N-gene CT values were slightly higher (p = 0.2395) for all samples re-tested at the UVRI compared with the initial testing laboratory (Test & Fly Laboratory) (Fig 3A). For samples that tested positive at both Bwindi Community Hospital and the Uganda Virus Research Institute, the N-gene CT values alternated between higher and lower across the two laboratories, although Bwindi Community Hospital had higher CT values for the last two samples (p = 0.999) (Fig 3B). A notable difference in the CT values of re-tested positive samples was observed between MAIA Laboratory and the UVRI National Reference Laboratory, with MAIA reporting higher CT values across almost all samples (p = 0.0849) (Fig 3C). Although not statistically significant (p = 0.2698), the N-gene CT values reported by UVRI were generally higher than those reported by Mulago National Referral Hospital, the initial testing laboratory (Fig 3D). For Gulu University and Safari Laboratories, the N-gene CT values alternated between higher and lower (Fig 3E and 3F). The most striking comparison was between UVRI and Tenna & Pharma Laboratory: the CT values for all ten samples were significantly higher upon re-testing at the UVRI (p = 0.0296), although both laboratories agreed on the final test outcome (Fig 3G).

Fig 3. Line graphs comparing CT values of N genes of quality control samples from different facilities and the UVRI COVID19 National Reference Laboratory.


3A) CT values of QC samples from Test & Fly compared to UVRI; 3B) CT values of QC samples from Bwindi Community Hospital compared to UVRI; 3C) CT values of QC samples from MAIA compared to UVRI; 3D) CT values of QC samples from Mulago NRH compared to UVRI; 3E) CT values of QC samples from Safari Laboratory compared to those of UVRI; 3F) CT values of QC samples from Gulu University Laboratory compared to those of UVRI; 3G) CT values of QC samples from Tenna & Pharma Laboratory compared to those of UVRI.

Discussion

This study was intended to evaluate the effectiveness of re-testing as a method of implementing external quality assessment of Ugandan molecular testing laboratories. It found some level of discrepancy in COVID-19 PCR results between the external sites and the National Reference Laboratory at the UVRI. The discrepancies were spread across public and private facilities in nearly equal proportion. For example, Mulago National Referral Hospital, which posted the highest number of discrepant results, and Gulu University Multifunctional Laboratory are public and university entities respectively. Kabale Regional Referral Hospital is Government-aided, Bwindi Community Hospital is a private not-for-profit facility supervised by the Uganda Protestant Medical Bureau (UPMB), and Same Day Laboratory is a private facility. The discrepancies also cut across different platforms; for example, Mulago NRH uses the Versant kPCR and GeneXpert platforms, Kabale RRH uses the GeneXpert platform, Bwindi Community Hospital uses GeneXpert and a magnetic induction cycler, while Same Day Laboratory uses U-STAR technologies and Biorad CFX.

The findings point mainly to variability in inter-platform detection ranges. There could also be some element of transmission and clerical error arising from the heavy workload and non-standardized reporting tools at the time. This study was conducted before the introduction of laboratory information management systems, so the entire records and documentation process was purely manual and error-prone. According to the Ministry of Health Results Dispatch System (RDS), Mulago National Referral Hospital, for example, was testing over 500 samples per day at the peak of the Delta variant wave of COVID-19. New staff had only recently been added, and yet molecular testing is highly sophisticated and requires advanced training, consistent practice, and an adequate and motivated workforce. The majority of private facilities did not have sufficient capital to invest in molecular diagnostics given the unpredictability of the COVID-19 pandemic, especially from an economic perspective. An unpublished report by the East African Community COVID-19 Assessment Common Path Committee found more than half of private COVID-19 molecular diagnostics laboratories in Uganda to be lacking in at least one of the 12 essential elements of a laboratory quality management system. Because new laboratories were being set up at a rapid pace, trained and competent personnel were being moved from one facility to another, leaving a large gap in competent workforce across most laboratories. The interpretation of CT values, especially for open RT-PCR systems, was very subjective, and there was a high level of inter-technician variability in CT value interpretation from one laboratory to another.

In order to understand the cause of the false positives, this study also compared the actual N-gene CT values of quality control samples tested by the primary laboratories and the UVRI COVID-19 National Reference Laboratory, and found that positive samples from Mulago NRH and Gulu University which tested negative at the UVRI had CT values within the acceptable range for a positive PCR result at the UVRI. These samples were concluded to be false positives according to guidance offered by the National Quality Assurance Committee for COVID-19. We believe these could be errors arising from sample deterioration due to poor storage, inter-platform variability, packaging, or transportation. When stored at the wrong temperatures, RNA is very unstable and can deteriorate quickly. Greenman J et al., 2015 reported that HCV RNA deteriorates quickly when dried blood spots are kept at room temperature for long periods, so that viral copies decrease and CT values increase upon re-testing [11]; however, we are not certain whether this finding is applicable to SARS-CoV-2, given the differences in properties and classification of the two viruses. A study by Hardt M et al., 2022 reported a significant reduction in detectable RNA in 75% of swab solutions stored at 37°C for 96 hours [12]. Most of these samples were stored for months at the primary testing laboratories before retrieval for quality control testing at the UVRI. Commercial reagent contamination and contamination in the laboratory workflow are among the factors cited as causing false positives, as in the case of Mulago NRH [13]. For the other three facilities with false positives, the CT values were out of range for a true positive test result according to the National Reference Laboratory testing platform. This could be attributed to staff incompetence, clerical error, or transmission error. These factors have been reported to occur at the pre-analytical, analytical, and post-analytical phases of testing [14].

Even where there was perfect agreement between the initial testing laboratories and the National Reference Laboratory in the final test outcome, there were differences in CT values, though these were not statistically significant except for Tenna & Pharma Laboratory. This could be due to the principles on which the different testing technologies are built. All facilities whose results were compared with those of UVRI used different testing platforms with inter-technology differences. Rhoads D et al., 2021 reported that CT values can vary within and between methods [15]. The College of American Pathologists surveyed over 700 laboratories using proficiency testing materials produced from the same batch and found CT values from different instruments to vary by as much as 14 cycles; even within a single gene target for a single method, differences of up to 12 cycles were seen across laboratories.

Conclusion

Discrepant COVID-19 PCR results (false positives and false negatives) were caused by a wide range of factors, among which are clerical errors, inadequate storage facilities, differences across testing technologies, gaps in the chain of custody, heavy workload, and transmission errors. Inter-laboratory comparison of results (re-testing) is an effective method of implementing an external quality assurance program for molecular testing. Laboratories should invest more in developing quality management systems and enroll for accreditation. There should be continued investment in COVID-19 and other molecular testing external quality assurance programs. Authorities in Uganda and other countries, especially the line Ministries of Health, can benchmark against these findings to expand external quality assurance to other disease programs. Finally, it is not possible to reproduce the exact CT values for the same sample(s) run across different testing platforms; it is therefore more appropriate to give a range of acceptable values for a positive COVID-19 PCR test.

Limitation of the study

Whilst these samples were stored at the right temperatures at the time of retrieval, we cannot guarantee that the same was consistently done for the three months during which the samples were held at the external sites. The majority of these laboratories, especially the private ones, do not have power back-up, and yet electricity blackouts are a common occurrence in Uganda. Fluctuations in temperature can lead to RNA degradation and protein denaturation, and an inconsistent power supply can cause such temperature fluctuations. This was program-related work, and we were unable to control for these confounders.

Supporting information

S1 Data

(XLSX)

Acknowledgments

We extend our gratitude to the Ugandan Ministry of Health for putting up a spirited fight against the different waves of the COVID-19 pandemic. We also applaud the Incident Management Team (IMT) and the Laboratory Pillar of the Ministry of Health for the concerted efforts against COVID-19 and other emerging and re-emerging infections. Special gratitude goes to the Uganda Virus Research Institute (UVRI) and the National Quality Assurance Committee for the technical roles played. Much appreciation goes to the Centers for Disease Control and Prevention (CDC), under the leadership of Mr. Thomas Nsibambi, for supplying the reagents and other logistics used for running the QC samples. Finally, we acknowledge the COVID-19 PCR testing laboratories for participating in the External Quality Assessment program.

Data Availability

Data will be availed once the manuscript is accepted for publication.

Funding Statement

The Government of the Republic of Uganda sends funds on a quarterly basis to support program activities. All authors of this manuscript are government employees paid wages at periodic intervals. For this project and the genomic sequencing, the CDC provided reagents in kind. There was no direct funding for this work or its publication.

References

  • 1. Todd CA, Sanchez AM, Garcia A, Denny TN, Sarzotti-Kelsoe M. Implementation of Good Clinical Laboratory Practice (GCLP) guidelines within the external quality assurance program oversight laboratory (EQAPOL). Journal of Immunological Methods. 2014;409:91–8. doi: 10.1016/j.jim.2013.09.012
  • 2. World Health Organization. WHO manual for organizing a national external quality assessment programme for health laboratories and other testing sites. 2016.
  • 3. Majumder J, Minko T. Recent developments on therapeutic and diagnostic approaches for COVID-19. The AAPS Journal. 2021;23:1–22. doi: 10.1208/s12248-020-00532-2
  • 4. Mardian Y, Kosasih H, Karyana M, Neal A, Lau C-Y. Review of current COVID-19 diagnostics and opportunities for further development. Frontiers in Medicine. 2021;8:615099. doi: 10.3389/fmed.2021.615099
  • 5. Lutalo T, Nalumansi A, Olara D, Kayiwa J, Ogwang B, Odwilo E, et al. Evaluation of the performance of 25 SARS-CoV-2 serological rapid diagnostic tests using a reference panel of plasma specimens at the Uganda Virus Research Institute. International Journal of Infectious Diseases. 2021;112:281–7. doi: 10.1016/j.ijid.2021.09.020
  • 6. Bwogi J, Lutalo T, Tushabe P, Bukenya H, Eliku JP, Ssewanyana I, et al. Field evaluation of the performance of seven antigen rapid diagnostic tests for the diagnosis of SARS-CoV-2 virus infection in Uganda. PLoS One. 2022;17(5):e0265334. doi: 10.1371/journal.pone.0265334
  • 7. Karthik K, Babu RPA, Dhama K, Chitra MA, Kalaiselvi G, Senthilkumar TMA, et al. Biosafety concerns during the collection, transportation, and processing of COVID-19 samples for diagnosis. Archives of Medical Research. 2020;51(7):623–30. doi: 10.1016/j.arcmed.2020.08.007
  • 8. McGuigan FE, Ralston SH. Single nucleotide polymorphism detection: allelic discrimination using TaqMan. Psychiatric Genetics. 2002;12(3):133–6. doi: 10.1097/00041444-200209000-00003
  • 9. Liu Y, Wang Y, Wang X, Xiao Y, Chen L, Guo L, et al. Development of two TaqMan real-time reverse transcription-PCR assays for the detection of severe acute respiratory syndrome coronavirus-2. Biosafety and Health. 2020;2(04):232–7. doi: 10.1016/j.bsheal.2020.07.009
  • 10. Shen M, Zhou Y, Ye J, Al-Maskri AAA, Kang Y, Zeng S, et al. Recent advances and perspectives of nucleic acid detection for coronavirus. Journal of Pharmaceutical Analysis. 2020;10(2):97–101. doi: 10.1016/j.jpha.2020.02.010
  • 11. Greenman J, Roberts T, Cohn J, Messac L. Dried blood spot in the genotyping, quantification and storage of HCV RNA: a systematic literature review. Journal of Viral Hepatitis. 2015;22(4):353–61. doi: 10.1111/jvh.12345
  • 12. Hardt M, Föderl-Höbenreich E, Freydl S, Kouros A, Loibner M, Zatloukal K. Pre-analytical sample stabilization by different sampling devices for PCR-based COVID-19 diagnostics. New Biotechnology. 2022;70:19–27. doi: 10.1016/j.nbt.2022.04.001
  • 13. Ahmed W, Simpson SL, Bertsch PM, Bibby K, Bivins A, Blackall LL, et al. Minimizing errors in RT-PCR detection and quantification of SARS-CoV-2 RNA for wastewater surveillance. Science of the Total Environment. 2022;805:149877. doi: 10.1016/j.scitotenv.2021.149877
  • 14. Padhye NS. Reconstructed diagnostic sensitivity and specificity of the RT-PCR test for COVID-19. medRxiv. 2020:2020.04.24.20078949.
  • 15. Rhoads D, Peaper DR, She RC, Nolte FS, Wojewoda CM, Anderson NW, et al. College of American Pathologists (CAP) Microbiology Committee perspective: caution must be used in interpreting the cycle threshold (Ct) value. Clinical Infectious Diseases. 2021;72(10):e685–e6. doi: 10.1093/cid/ciaa1199

Decision Letter 0

Chika Kingsley Onwuamah

5 Jul 2023

PONE-D-23-16243: RE-TESTING AS A METHOD OF IMPLEMENTING EXTERNAL QUALITY ASSESSMENT PROGRAMME FOR COVID-19 PCR TESTING IN UGANDA (PLOS ONE)

Dear Dr. Okek,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

All reviewers had concerns, some of which were major. Kindly review their comments and address as appropriate.

Please submit your revised manuscript by Aug 19 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Chika Kingsley Onwuamah, Ph.D.

Academic Editor

PLOS ONE

Journal requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Thank you for stating the following financial disclosure:

 “This work did not have direct funding. It's part of routine program activities of the Uganda Virus Research Institute. The institute was designated as COVID19 National reference Laboratory by the Ugandan Ministry of Health. UVRI was also designated as regional referral laboratory for COVID19, influenza and viral hemorrhagic fevers by CDC and WHO. A lot of samples have been referred from South Sudan, Democratic Republic of Congo, Burundi amongst others. As part of the dissemination plan, staffs are encouraged to publish in peer reviewed journals.”

At this time, please address the following queries:

a) Please clarify the sources of funding (financial or material support) for your study. List the grants or organizations that supported your study, including funding received from your institution.

b) State what role the funders took in the study. If the funders had no role in your study, please state: “The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.”

c) If any authors received a salary from any of your funders, please state which authors and which funders.

d) If you did not receive any funding for this study, please state: “The authors received no specific funding for this work.”

3. We note that you have stated that you will provide repository information for your data at acceptance. Should your manuscript be accepted for publication, we will hold it until you provide the relevant accession numbers or DOIs necessary to access your data. If you wish to make changes to your Data Availability statement, please describe these changes in your cover letter and we will update your Data Availability statement to reflect the information you provide.

4. Please amend the manuscript submission data (via Edit Submission) to include authors Jocelyn Kiconco, John Kayiwa, Esther Amwine, Daniel Obote, Stephen Alele, Charles Nahabwe, Jackson Were, Bagaya Bernard, Balinandi Stephen, Thomas Nsibambi, Julius Lutwama and Pontiano Kaleebu.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: No

Reviewer #3: No

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: No

Reviewer #3: No

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: No

Reviewer #2: Yes

Reviewer #3: No

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: REVIEW: RE-TESTING AS A METHOD OF IMPLEMENTING EXTERNAL QUALITY ASSESSMENT PROGRAMME FOR COVID-19 PCR TESTING IN UGANDA

1. What type of PCR? Is it conventional or real time PCR? This should be stated at required points in the paper.

2. There is need to check the use of the word “quadrille” and use as appropriate.

3. The statement starting from “In September …” needs to be referenced.

4. The paragraph starting from "A total of eight different genes, dyes …" is ambiguous and disjointed. It is important to state clearly the functions of the primers and the dyes. The dyes are not the targets.

5. Abbreviations are not expected at the beginning of a sentence as seen in the statement starting from “CT …”

6. In what form were the samples retrieved? As extracted DNA or in the crude form? What was the storage condition at the different facilities prior to retrieving the samples from them? How many days after initial testing were the samples re-tested?

7. Quotation referred to J. Greenman et. al. needs to be revised for clarity.

Reviewer #2: As a Quality Assurance measure during the rollout of SARS-CoV-2 diagnostic testing, Okek et al. performed confirmatory qRT-PCR testing after initial testing was done at multiple hospitals, private test centers, and university sites within Uganda. The confirmatory testing was performed at the Uganda Virus Research Institute and included re-testing of 10 positive samples and 20 negative samples from each external site. Among the 11 external sites that participated, 5 had fully concordant positive/negative results with UVRI. At 2 external sites, there were three false-positives each, and at 4 external sites, there were one false-positive each. There were no false-negatives. This study nicely illustrates the importance of centralized confirmatory testing as multiple assays were rolled out quickly during the SARS-CoV-2 pandemic. As suggested below, the manuscript would be strengthened by adding more details and reframing some interpretations.

- How were sites chosen to be offered confirmatory testing? How many sites were offered confirmatory testing and declined? Could there have been bias towards better-performing sites in those that agreed to undergo confirmatory testing?

- In the Discussion, the authors allude to RNA degradation as a potential reason for tests going from positive (at the initial external site) to negative (at UVRI). I agree this is a very likely explanation for the “false positives” observed, and the authors should include more details in the Methods and Results sections about the temperature and duration of storage for samples at each site. For the “false positive” samples, it would be important to evaluate whether their storage conditions were different from the others.

- Figure 1 – please label the y-axis. If the y-axis indicates the number of positive tests, then to me it looks like more positive tests were found at UVRI than the external sites, which would indicate false-negatives at the external sites, not false-positives.

- I don’t understand the relevance of the genes, dyes, and targets reported by different laboratories. What question(s) were the authors asking by evaluating this information? Instead, it would be more informative to provide a table listing the assays, kits, and machines used by each external site.

- As the authors note in the Discussion, it’s very difficult to compare Ct values between assays. This point should be acknowledged sooner, and while OK to present data about the Ct values (Table 2 and Figure 3), I think the interpretation needs to be much softer. For example, it is not accurate to expect that sample with a Ct of 31.5 tested at an external site should have had the same (or even a similar) Ct when tested at UVRI. Similarly, it may not be accurate to state that “For the other three facilities with false positives, the CT values were out of the acceptable range for a positive test result according to the National reference Laboratory,” since the cutoff needs to be determined for the specific assay and machine being used. I also don’t think it’s appropriate or necessary to compare Ct values by statistical testing, and the authors do not state what test was used.

- In Table 2, what do the different rows indicate? It would be more informative to list each sample on a separate row and label the rows.

- In Figure 3, why are only seven sites shown?

- I disagree with the authors that they found a high level of discrepancy. For one thing, it was very good that no false-negatives were found. The few false-positives that were found could be explained by storage, RNA degradation, and different assays used. The differences in Ct observed between sites/assays is expected. Although the exact Cts are not expected to be the same, you would generally expect the difference between any two assays to be consistent between samples, and looking at Figure 2 this seems to be the case.

- While human error could be a component of discrepant results, some of the wording is overstated e.g. “With the stated inadequacies, discrepant results were inevitable.” Furthermore, it’s difficult to invoke incompetency as an explanation without testing operator competency. This section should be rewritten to be more constructive.

Reviewer #3: Methodology

• What is the frequency of the visit of experts to the labs? Is it one-off or periodic?

• How long and at what temperature are samples stored before they are selected for testing?

• Was the method used for testing by the labs retrieved also?

• What is the “right” temperature at which retrieved samples are stored at the reference lab?

• Packaging, Documentation & transportation. Selected samples were triple packaged according to (7). This is an incomplete statement

Results

• The methodology is described in the result section as it had to do with Fig 2 (VIC, FAM, ROX, and CV5). This should be moved to the methodology section.

• Differences in CT values are expected even when the same samples are run in duplicates in the same run. There is a need to confirm the assays' intra and inter-assay variabilities to interpret CT values within the context of the manufacturer’s performance characteristics. It is important to know if the same RNA extraction method was used between the reference and primary labs in order for an objective comparison.

Discussion

• There is a need for specification with this statement “Mulago NRH for example was testing over 500 samples”. Is it 500 per day/month/year?

• The methods used for COVID-19 testing at some labs were mentioned in the discussion but this was not listed as part of the metadata obtained from the facilities along with the samples.

• It is unclear how the authors describe QMS to be lacking in some labs when the study did not assess its implementation.

• A mention that the samples were stored for months at the primary testing labs before collection for retesting negates the essence of the quality assurance re-testing program. Since the aim of this program is not to assess the quality of storage but of results, the procedure used was not suitable for the objective.

Conclusion

• Most of the factors listed as affecting the discrepancy in results were not investigated in this study hence, cannot be categorically attributed.

General

• Confounders need to be addressed before making any general statements else they should be listed as limitations.

• Considering the requirement for confidentiality, the names of the labs should not have been mentioned.

• A review by an English editor will provide some more clarity.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Babatunde Akeem Saka

Reviewer #2: No

Reviewer #3: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Attachment

Submitted filename: PLOSONE REVIEW.docx

PLoS One. 2024 Jan 24;19(1):e0287272. doi: 10.1371/journal.pone.0287272.r002

Author response to Decision Letter 0


16 Aug 2023

Response to questions, comments and guidance offered by the scientific reviewers on the manuscript titled “Re-testing as a method of implementing External Quality Assessment Program for COVID-19 real time PCR testing in Uganda.”

Reviewer one

Qn one: What type of PCR is it: conventional or real-time PCR? This should be stated at the required points in the paper.

Response: All testing platforms were real-time PCR. The title of the revised manuscript now includes “real time PCR”, as does the main text.

Qn Two: There is a need to check the use of the word “quadrille” and use it as appropriate.

Response: Thank you for this observation; the word “quadrille” has been removed in the revised manuscript and replaced by “overwhelmingly increased”.

Qn three: The statement starting with “In September…” needs to be referenced.

Response: The entire paragraph has been removed and replaced by a more befitting statement in the revised manuscript.

Qn four: The paragraph starting with “A total of eight different gene dyes” is ambiguous and disjointed. It is important to clearly state the functions of the primers and probes. The dyes are not probes.

Response: I appreciate your keen observation and clarification. That entire section has been re-written, and I believe the ambiguity has been resolved. The revised section clearly states the functions of the different probes and primers.

Qn five: Abbreviations are not expected at the beginning of a sentence as seen in the statement starting from “CT…”

Response: Thank you so much for this guidance. The revised paragraph now starts with Thermocycler Value as opposed to CT in the original version.

Qn Six: In what forms were the samples retrieved? As extracted DNA or in crude form? What were the storage conditions at the different facilities prior to retrieving the samples from them? How many days after initial testing were the samples re-tested?

Response: These were aliquots of nasopharyngeal swabs in crude form, stored at −20°C across all facilities. Samples from the past three months were collected.

Qn seven: The quotation attributed to J. Greenman et al. needs to be revised for clarity.

Response: The citation has been clarified in the revised manuscript

Reviewer 2

Qn one: How were the sites chosen to be offered confirmatory testing? How many sites were offered confirmatory testing and declined? Could there have been bias towards better-performing sites among those that agreed to undergo confirmatory testing?

Response: All 67 sites were eligible, but most were excluded because they did not meet the inclusion criteria: the majority were not archiving samples, some did not have sufficient metadata, and a few submitted insufficient sample volumes. The inclusion criteria are now clearly stated in the revised manuscript.

Qn two: In the discussion, the authors allude to RNA degradation as a potential reason for tests going from positive (at the initial external site) to negative (at UVRI). I agree this is a very likely explanation for the “false positives” observed, and the authors should include more details in the methods and results sections about the temperatures and duration of storage of samples at each site. For the “false positive” samples, it would be important to evaluate whether their storage conditions were different from others.

Response: An entire section on storage conditions, temperature monitoring and storage duration has been created in the revised manuscript. It addresses all your queries.

Qn three: Figure 1 – please label the y-axis. If the y-axis indicates the number of positive tests, then to me it looks like more positive tests were found at UVRI than the external sites, which would indicate false-negatives at the external sites, not false-positives.

Response: The y-axis has been labeled in the revised manuscript. I think it still indicates a false positive by an external site, because the positive result was issued to the client by the external site.

Qn four: I don’t understand the relevance of the genes, dyes, and targets reported by different laboratories. What question(s) were the authors asking by evaluating this information? Instead, it would be more informative to provide a table listing the assays, kits, and machines used by each external site.

Response: Thank you so much for this observation and guidance. A table listing the testing platform used by each facility has been created in the revised manuscript. Information on genes, dyes and targets was put forward so that readers can appreciate the diversity of testing platforms in Uganda.

Qn five: As the authors note in the Discussion, it’s very difficult to compare Ct values between assays. This point should be acknowledged sooner, and while it is OK to present data about the Ct values (Table 2 and Figure 3), I think the interpretation needs to be much softer. For example, it is not accurate to expect that a sample with a Ct of 31.5 tested at an external site should have had the same (or even a similar) Ct when tested at UVRI. Similarly, it may not be accurate to state that “For the other three facilities with false positives, the CT values were out of the acceptable range for a positive test result according to the National reference Laboratory,” since the cutoff needs to be determined for the specific assay and machine being used. I also don’t think it’s appropriate or necessary to compare Ct values by statistical testing, and the authors do not state what test was used.

Response: We take good note of your guidance and observation. As advised, the language has been softened in the revised manuscript. UVRI used the Berlin protocol to test these samples; this protocol was WHO-approved, so we treated it as the gold standard. At the time, Uganda as a country wanted to set a cut-off for an acceptable positive test, and this study was meant to be one of the key informers. With storage conditions similar to those at the external sites, variation in temperature was somewhat controlled. We used ANOVA in GraphPad Prism to compare the Ct values.
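
For readers less familiar with this step, the following is a minimal Python sketch of an analogous one-way ANOVA comparison of Ct values; it is not the authors' GraphPad Prism analysis, and the Ct values shown are hypothetical.

from scipy.stats import f_oneway

primary_ct = [24.1, 27.8, 31.5, 29.0, 33.2]    # hypothetical Ct values from a primary testing laboratory
reference_ct = [25.0, 28.4, 32.1, 29.6, 34.0]  # hypothetical Ct values for the same samples at the reference laboratory

# One-way ANOVA comparing the two groups of Ct values
f_stat, p_value = f_oneway(primary_ct, reference_ct)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p > 0.05 would suggest no significant difference in mean Ct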

Qn six: In Table 2, what do the different rows indicate? It would be more informative to list each sample on a separate row and label the rows.

Response: We tried listing each sample on a separate row with labels and found that it would overcrowd the table. We instead opted to put a key for each column just below the table.

Qn seven : In Figure 3, why are only seven sites shown?

Response: Drawing 11 graphs for 11 sites would be monotonous since the findings were similar. We instead considered regional balance and ownership (public vs. private) when plotting the graphs. The seven facilities were sufficiently representative.

Qn Eight: I disagree with the authors that they found a high level of discrepancy. For one thing, it was very good that no false negatives were found. The few false positives that were found could be explained by storage, RNA degradation, and the different assays used. The differences in Ct observed between sites/assays are expected. Although the exact Cts are not expected to be the same, you would generally expect the difference between any two assays to be consistent between samples, and looking at Figure 2 this seems to be the case.

Response: Thank you for your in-depth analysis. Inability to monitor storage conditions over a longer period has been stated as a major limitation of this study. However, we tried to control for storage as a confounder by excluding sites without freezers. In the discussion, we also explicitly acknowledged RNA degradation as a possible factor and cited literature that supports this.
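
To illustrate the consistency check the reviewer describes, here is a minimal Python sketch using hypothetical Ct values: if two assays differ only by a systematic offset, the per-sample differences should cluster tightly around a constant value.

import statistics

site_ct = [24.1, 27.8, 31.5, 29.0, 33.2]  # hypothetical Ct values reported by a primary site
uvri_ct = [25.0, 28.4, 32.1, 29.6, 34.0]  # hypothetical Ct values for the same samples at the reference lab

# Per-sample Ct differences; a small spread around the mean indicates a consistent offset between assays
diffs = [u - s for u, s in zip(uvri_ct, site_ct)]
print(f"mean offset = {statistics.mean(diffs):.2f} Ct")
print(f"SD of offset = {statistics.stdev(diffs):.2f} Ct")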

Qn Nine: While human error could be a component of discrepant results, some of the wording is overstated e.g. “With the stated inadequacies, discrepant results were inevitable.” Furthermore, it’s difficult to invoke incompetency as an explanation without testing operator competency. This section should be rewritten to be more constructive.

Response: Some of the overstated wording has been changed in the revised manuscript. We also cited an unpublished East African Community COVID-19 common path assessment report that identified personnel as a deficient component in most Ugandan molecular testing laboratories. However, we do acknowledge that personnel competency was not one of the outcomes of our study.

Reviewer 3

Qn one: What is the frequency of the visit of experts to the labs? Is it one-off or periodic?

Response: A section titled “Selection of assessment and activation team” has been included in the revised manuscript. It contains all the details addressing the questions you have asked.

Qn two: How long and at what temperature are samples stored before they are selected for testing?

Response: The section on sample and metadata retrieval has been expanded in the revised manuscript to address most of the questions you have raised.

Qn three: Was the method used for testing by the labs retrieved also?

Response: A table with testing platforms for different Laboratories has been created in the revised manuscript.

Qn Four: What is the “right” temperature at which retrieved samples are stored at the reference lab?

Response: UVRI stores samples at −80°C for the short term (3 months) and at −96°C for long-term storage.

Qn five: Packaging, Documentation & transportation. Selected samples were triple packaged according to (7). This is an incomplete statement.

Response: A full statement has been included in that section of the revised manuscript. It addresses the comment you have raised.

Qn Six: The methodology is described in the result section as it had to do with Fig 2 (VIC, FAM, ROX, and CV5). This should be moved to the methodology section.

Response: Thank you for this keen observation and guidance. The section has been moved from results section to method section in the revised manuscript.

Qn 7: Differences in CT values are expected even when the same samples are run in duplicates in the same run. There is a need to confirm the assays' intra and inter-assay variabilities to interpret CT values within the context of the manufacturer’s performance characteristics. It is important to know if the same RNA extraction method was used between the reference and primary labs in order for an objective comparison.

Response: Unfortunately, platforms and assays have different extraction methods. However, because UVRI uses the Berlin protocol (one of the earliest approved by WHO at the onset of COVID-19), we expect all other testing platforms to agree with it, even though inter-assay differences are inevitable.
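
As an illustration of the intra-assay variability the reviewer refers to, the following minimal Python sketch summarises hypothetical replicate Ct values as a coefficient of variation; this calculation was not performed in the study and is shown only as a sketch.

import statistics

replicate_ct = [28.1, 28.4, 27.9, 28.3]  # hypothetical Ct values for one sample tested in replicate within one run

# Coefficient of variation (CV%) of the replicate Ct values as a simple measure of intra-assay variability
mean_ct = statistics.mean(replicate_ct)
cv_percent = 100 * statistics.stdev(replicate_ct) / mean_ct
print(f"mean Ct = {mean_ct:.2f}, intra-assay CV = {cv_percent:.1f}%")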

Qn 8: There is a need for specification with this statement “Mulago NRH for example was testing over 500 samples”. Is it 500 per day/month/year?

Response: This statement has been clarified in the revised manuscript

Qn 9: The methods used for COVID-19 testing at some labs were mentioned in the discussion, but this was not listed as part of the metadata obtained from the facilities along with the samples.

Response: A separate table listing the testing platforms used across the various laboratories has been included in the revised manuscript.

Qn 10: It is unclear how the authors describe QMS to be lacking in some labs when the study did not assess its implementation.

Response: In this study, we did not have sufficient proof to identify any inadequacy in the QMS. However, we relied on the unpublished East African Community COVID-19 common path assessment report, which found some laboratories to be deficient in the QMS. We have, however, revised the statement in the revised manuscript.

Qn 11: A mention that the samples were stored for months at the primary testing labs before collection for retesting negates the essence of the quality assurance re-testing program. Since the aim of this program is not to assess the quality of storage but of results, the procedure used was not suitable for the objective.

Response: In the revised manuscript, we stated that the samples were stored at −20°C at the external sites for at most three months, although we did not have proof that the storage temperatures were consistent throughout these months.

Qn 12: Most of the factors listed as affecting the discrepancy in results were not investigated in this study, hence, cannot be categorically attributed.

Response: We did not actually attribute the discrepancies to those factors categorically; we only assumed or hypothesized. Subsequent studies will be designed to investigate those factors.

Qn 13: Confounders need to be addressed before making any general statements else they should be listed as limitations.

Response: Thank you for the guidance; a section on limitations has been created. Some of the confounders, such as storage, have been listed among the limitations.

Qn 14: Considering the requirement for confidentiality, the names of the labs should have not been mentioned.

Response: These laboratories are public utility facilities. Besides, this work was more programmatic and regulatory than purely academic research. Nonetheless, we have softened some of the strong language used in the original manuscript, which was somewhat implicative.

Qn 15: A review by an English editor will provide some more clarity.

Response: This comment lacks clarity; I believe the quality of English in the revised manuscript is much better now.

Attachment

Submitted filename: Rebuttal_retesting of QC samples(2).docx

Decision Letter 1

Chika Kingsley Onwuamah

14 Sep 2023

PONE-D-23-16243R1

RE-TESTING AS A METHOD OF IMPLEMENTING EXTERNAL QUALITY ASSESSMENT PROGRAMME FOR COVID-19 REAL TIME PCR TESTING IN UGANDA

PLOS ONE

Dear Dr. Okek,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

The reviewers acknowledge that the manuscript has greatly improved. However, not all the issues raised have been addressed. Kindly address all issues raised by the three reviewers in both rounds of review.

Please submit your revised manuscript by Oct 29 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Chika Kingsley Onwuamah, Ph.D.

Academic Editor

PLOS ONE

Additional Editor Comments:

Please note that the issues raised by all three reviewers have not been fully addressed. Kindly address all of them satisfactorily to allow us to proceed. See the reviewers' comments included below.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: (No Response)

Reviewer #2: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: No

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: No

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: No

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: No

Reviewer #2: No

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: 1. Issues noted in previous corrections, such as the use of abbreviations at the beginning of a sentence, are still present in the manuscript.

2. The word quadrilled was still used wrongly in the manuscript.

3. CT value means cycle threshold and should be referred to as such. Thermocycler value cannot be abbreviated as CT.

4. The use and interpretation of dyes and genes is still confusing in this manuscript. For clarity, dyes are the coloured probes used to detect the genes intended for identification (amplification). Dyes are not to be grouped as items to be identified, as this paper implied, especially as summarised in the pie chart.

Reviewer #2: While the manuscript has improved, there are important comments from myself and other reviewers that have not been adequately addressed. I have some examples below, but I recommend that the authors carefully go through the prior detailed review to make sure they understand and address each comment. The manuscript contains valuable data, but there are still important inaccuracies in the analysis and interpretations. Just a couple of examples:

-A fundamental problem with the paper is the expectation that the specific Ct values produced by the reference lab should match the specific Ct values produced by each other lab. This is not true, because machines and assays have different ways of producing and measuring fluorescence, which are not expected to generate the same Ct value for the same sample. Here is an example of a study that demonstrates variable Ct values across assays and platforms, even though all are valid to use:

https://journals.asm.org/doi/10.1128/jcm.00821-20

Because of this well-known inter-assay variability, specific comparisons are not appropriate, though the relative value of each sample can be compared, e.g. using correlation tests.

-I still do not understand Figure 1. Using the external site MNRH as an example (first set of bars), it looks like 7 samples tested positive at MNRH and 10 samples tested positive at UVRI. This would mean that there were either 3 false-positives at UVRI or 3 false-negatives at MNRH. This is the opposite of what is described in the text.

-As reviewers 1 and 2 noted, the description of dyes and probes in the various real-time PCR assays is confusing and misleading. Dyes are not probes; each assay will use a dye, and some will also use a probe. By placing them in the same pie chart in Figure 2, the authors make it seem as though each assay uses either a dye or a probe.

-Again, these are just a few examples of where I do not think the authors have adequately addressed prior comments.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

**********


PLoS One. 2024 Jan 24;19(1):e0287272. doi: 10.1371/journal.pone.0287272.r004

Author response to Decision Letter 1


29 Oct 2023

Response to Reviewers

Reviewer #1: 1. Previous corrections such as the use of abbreviations at the beginning of a sentence is still in the manuscript.

Response: Thank you for the guidance; locally used abbreviations such as Ministry of Health (MoH) and Uganda Virus Research Institute (UVRI) have been written in full in the abstract. However, there are universally accepted abbreviations, such as COVID-19 and PCR, with which the target audience for this manuscript is conversant.

2. The word quadrilled was still used wrongly in the manuscript.

Response: Thank you for this observation. However, the word “quadrilled” does not appear in the revised manuscript to the best of my knowledge. Deliberate efforts have been made to avoid colloquial words in the revised manuscript.

3. CT value means cyclethreshold and should be referred to as such. Thermocycler value cannot be abbreviated as CT.

Response: Thank you so much for this correction. What was previously written as “Thermocycler value” has been changed to “cycle threshold” throughout the revised manuscript, as guided.

4. The use and interpretation of dyes and genes is still confusing in this manuscript. For clarity, dyes are the colored probes used to detect the genes intended for identification (amplification). Dyes are not to be grouped as items to be identified as this paper implied especially summarized in the pie chart.

Response: We take note of your comment. The entire section on genes and dyes has been re-written in the revised manuscript, and I believe the ambiguity has now been resolved.

Reviewer #2: While the manuscript has improved, there are important comments from myself and other reviewers that have not been adequately addressed. I have some examples below, but I recommend that the authors carefully go through the prior detailed review to make sure they understand and address each comment. The manuscript contains valuable data, but there are still important inaccuracies in the analysis and interpretations. Just a couple of examples:

-A fundamental problem with the paper is the expectation that the specific Ct values produced by the reference lab should match the specific Ct values produced by each other lab. This is not true, because machines and assays have different ways of producing and measuring fluorescence, which are not expected to generate the same Ct value for the same sample. Here is an example of a study that demonstrates variable Ct values across assays and platforms, even though all are valid to use:

https://journals.asm.org/doi/10.1128/jcm.00821-20

Because of this well-known inter-assay variability, specific comparisons are not appropriate, though the relative value of each sample can be compared e.g. using correlation tests.

Response: Thank you for this observation and comment. In the discussion of the revised manuscript, we acknowledge inter-assay differences as a possible cause of the discrepancies. Although the exact values cannot be reproduced across different platforms, there should be an acceptable range of values, usually within ±2 SD. However, results from UVRI are treated as the true results by the National Quality Assurance Committee after weighing many variables.
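
As an illustration of the correlation-based comparison the reviewer suggests, here is a minimal Python sketch with hypothetical paired Ct values; it checks whether samples rank the same way on both platforms rather than expecting identical absolute values.

from scipy.stats import spearmanr

site_ct = [24.1, 27.8, 31.5, 29.0, 33.2]  # hypothetical Ct values from a primary site
uvri_ct = [25.0, 28.4, 32.1, 29.6, 34.0]  # hypothetical Ct values for the same samples at the reference lab

# Rank correlation between the two sets of Ct values
rho, p_value = spearmanr(site_ct, uvri_ct)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")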

-I still do not understand Figure 1. Using the external site MNRH as an example (first set of bars), it looks like 7 samples tested positive at MNRH and 10 samples tested positive at UVRI. This would mean that there were either 3 false-positives at UVRI or 3 false-negatives at MNRH. This is the opposite of what is described in the text.

Response: Thank you for this keen observation. The confusing graph has been removed and a new one plotted, as reflected in the revised manuscript. The values had been assigned a wrong title during data entry.
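
For clarity on how such discordant results are labelled, here is a minimal Python sketch with hypothetical qualitative results, treating the reference laboratory call as the true result as the authors describe.

site_results = ["POS", "POS", "NEG", "POS", "NEG"]  # hypothetical qualitative calls from a primary site
uvri_results = ["POS", "NEG", "NEG", "POS", "NEG"]  # hypothetical calls from the reference lab (treated as true)

# Classify each paired result relative to the reference laboratory call
for i, (site, uvri) in enumerate(zip(site_results, uvri_results), start=1):
    if site == uvri:
        label = "concordant"
    elif site == "POS" and uvri == "NEG":
        label = "false positive at the primary site"
    else:
        label = "false negative at the primary site"
    print(f"sample {i}: {label}")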

-As reviewers 1 and 2 noted, the description of dyes and probes in the various real-time PCR assays is confusing and misleading. Dyes are not probes; each assay will use a dye, and some will also use a probe. By placing them in the same pie chart in Figure 2, the authors make it seem as though each assay uses either a dye or a probe.

Response: This section has been extensively revised; we believe your concern has been addressed in the revised manuscript.

-Again, these are just a few examples of where I do not think the authors have adequately addressed prior comments.

Response: We believe the revised manuscript addresses most of your concerns and questions. I would be delighted if the scientific reviewers could clear this paper for publication before it is overtaken by events.

Attachment

Submitted filename: Response to Reviewers(1).docx

Decision Letter 2

Chika Kingsley Onwuamah

8 Nov 2023

RE-TESTING AS A METHOD OF IMPLEMENTING EXTERNAL QUALITY ASSESSMENT PROGRAMME FOR COVID-19 REAL TIME PCR TESTING IN UGANDA

PONE-D-23-16243R2

Dear Dr. Okek,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Chika Kingsley Onwuamah, Ph.D.

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: I agree to publishing this not because it is novel but because it will provide further information about the quality of testing and preservation of samples collected in another country. The same has been reported in many other countries globally.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

**********

Acceptance letter

Chika Kingsley Onwuamah

3 Jan 2024

PONE-D-23-16243R2

PLOS ONE

Dear Dr. Okek,

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now being handed over to our production team.

At this stage, our production department will prepare your paper for publication. This includes ensuring the following:

* All references, tables, and figures are properly cited

* All relevant supporting information is included in the manuscript submission,

* There are no issues that prevent the paper from being properly typeset

If revisions are needed, the production department will contact you directly to resolve them. If no revisions are needed, you will receive an email when the publication date has been set. At this time, we do not offer pre-publication proofs to authors during production of the accepted work. Please keep in mind that we are working through a large volume of accepted articles, so please give us a few weeks to review your paper and let you know the next and final steps.

Lastly, if your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

If we can help with anything else, please email us at customercare@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Chika Kingsley Onwuamah

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Data

    (XLSX)

    Attachment

    Submitted filename: PLOSONE REVIEW.docx

    Attachment

    Submitted filename: Rebuttal_retesting of QC samples(2).docx

    Attachment

    Submitted filename: Response to Reviewers(1).docx

    Data Availability Statement

    Data will be made available once the manuscript is accepted for publication.

