J Clin Microbiol. 2015 May 14;53(6):1790–1796. doi: 10.1128/JCM.02739-14

Point-Counterpoint: Can Newly Developed, Rapid Immunochromatographic Antigen Detection Tests Be Reliably Used for the Laboratory Diagnosis of Influenza Virus Infections?

James J. Dunn, Christine C. Ginocchio
Editor: P H Gilligan
PMCID: PMC4432049  PMID: 25274999

Abstract

Five years ago, the Point-Counterpoint series was launched. The initial article asked about the role of rapid immunochromatographic antigen testing in the diagnosis of influenza A virus 2009 H1N1 infection (D. F. Welch and C. C. Ginocchio, J Clin Microbiol 48:22–25, 2010, http://dx.doi.org/10.1128/JCM.02268-09). Since that article, not only have major changes been made in immunochromatographic antigen detection (IAD) testing for the influenza viruses, but there has also been rapid development of commercially available nucleic acid amplification tests (NAATs) for influenza virus detection. Further, a novel variant of influenza A, H7N9, has emerged in Asia, and H5N1 is also reemergent. In that initial article, the editor of this series, Peter Gilligan, identified two issues that required further consideration. One was how well IAD tests worked in clinical settings, especially in times of antigen drift and shift. The other was the role of future iterations of influenza NAATs and whether this testing would be available in a community hospital setting. James Dunn, who is Director of Medical Microbiology and Virology at Texas Children's Hospital, has extensive experience using IAD tests for diagnosing influenza. He will discuss the application and value of these tests in influenza diagnosis. Christine Ginocchio, who recently retired as the Senior Medical Director, Division of Infectious Disease Diagnostics, North Shore-LIJ Health System, and now is Vice President for Global Microbiology Affairs at bioMérieux, Durham, NC, wrote the initial counterpoint in this series, where she advocated the use of NAATs for influenza diagnosis. She will update us on the commercially available NAAT systems and explain what their role should be in the diagnosis of influenza infection.


POINT

Influenza virus infections are responsible for significant morbidity and mortality in both pediatric and adult populations worldwide. Unfortunately, diagnosis of influenza infection based solely on clinical symptoms can be challenging in both pediatric and adult patients (1–3). During periods of low influenza activity and outside epidemic periods, patients may present with influenza-like illness (ILI) due to other circulating respiratory viruses. Establishing influenza as the viral etiology of infection requires accurate diagnostic testing, and the more rapidly this result is available, the more likely it is to affect clinical patient management. It has been shown that a timely and accurate diagnosis of influenza infection will more likely result in initiation of antiviral therapy, reduce the number of additional diagnostic studies, preclude the use of unnecessary antibiotics, and allow for prompt institution of proper infection control practices (4).

Rapid influenza diagnostic tests (RIDTs) have been used extensively for many years in a variety of settings, including physicians' offices, urgent care centers, and small laboratories where more-complex viral diagnostic capabilities may not be available (5). Generally speaking, positive results by these rapid methods correlate well with actual influenza virus infection. Unfortunately, the historic performance of these tests has been hampered by poor sensitivity and low negative predictive values (NPVs) compared to those of culture and/or molecular detection methods (6, 7), findings often more pronounced for novel or pandemic influenza virus strains (8–10). In light of these findings, several organizations and professional societies have cautioned clinicians about the utility of RIDTs for certain patient populations and how results should be interpreted (4, 11, 12). Most notably, because a negative RIDT result cannot reliably exclude influenza virus infection, follow-up testing with a more sensitive and specific method, such as reverse transcription (RT)-PCR or viral culture, should be considered to confirm the result, a practice that can delay decisions about patient management. Additionally, the correct interpretation and appropriate use of RIDTs should be considered in the context of the prevalence of circulating influenza strains in the community, since this affects the positive and negative predictive values of the tests. If the prevalence is unknown, RIDT results become difficult to interpret and are of limited use in making patient management decisions. Given the caveats and limitations that are associated with use of traditional lateral-flow immunoassays, many laboratories have forgone their use or restricted testing to only certain patient populations during periods of higher influenza prevalence. Ideally, if an RIDT had diagnostic accuracy approaching or equivalent to those of the more sensitive methods, it could serve as a stand-alone method for the majority of patients presenting with ILI. Hospitalized patients or those with underlying risk factors that might predispose them to more-severe infection would still require frontline or supplemental testing by other methods, as recommended previously (4, 11, 12).
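
Because the predictive values discussed above depend directly on influenza prevalence, the short sketch below works through the arithmetic. It is a minimal illustration only, written in Python; the 70% sensitivity and 98% specificity figures are hypothetical stand-ins for a traditional RIDT, not values drawn from the cited studies.

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Compute PPV and NPV from test characteristics and disease prevalence."""
    tp = sensitivity * prevalence              # true positives (per unit population)
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    fn = (1 - sensitivity) * prevalence        # false negatives
    tn = specificity * (1 - prevalence)        # true negatives
    return tp / (tp + fp), tn / (tn + fn)      # (PPV, NPV)

# Hypothetical RIDT characteristics: 70% sensitivity, 98% specificity
for prev in (0.02, 0.10, 0.30):
    ppv, npv = predictive_values(0.70, 0.98, prev)
    print(f"prevalence {prev:.0%}: PPV {ppv:.1%}, NPV {npv:.1%}")
```

With these assumed characteristics, the PPV falls to roughly 42% at 2% prevalence, while the NPV drops below 90% at 30% prevalence, which is why RIDT results are difficult to interpret when the prevalence of circulating influenza is unknown.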

In an effort to overcome many of the issues associated with RIDTs, two recently FDA-cleared lateral-flow immunoassays have been designed to employ an instrument-based digital scan of the test strip to enhance the sensitivity and specificity of detection of influenza virus antigens. The Quidel Sofia influenza A and B fluorescent immunoassay (FIA) (Sofia) (Quidel Corp., San Diego, CA), FDA cleared in 2011, employs europium-based immunofluorescence technology to qualitatively detect and differentiate influenza A and B virus nucleoproteins. The Sofia analyzer, an automated reader, scans the test strip, measures the fluorescent signal, and processes the results using method-specific algorithms in about 1 min. The BD Veritor System for Flu A+B (Veritor) (BD Diagnostics, Sparks, MD), FDA cleared in 2012, is an immunochromatographic assay for the qualitative detection and differentiation of influenza A and B nucleoproteins. The BD Veritor System reader, a portable digital instrument, uses a reflectance-based measurement to evaluate the line signal intensities on the assay test strip and a proprietary algorithm to identify and compensate for sample-related, nonspecific signal generation, a process requiring approximately 10 s. Both test platforms eliminate the need for an operator to visualize and interpret test results, a task that is often subjective and yields variable results.

In a number of studies published to date, both Sofia and Veritor displayed clinical sensitivities approaching those of culture and/or RT-PCR for the detection of influenza A and B viruses. Compared to R-Mix shell vial culture, Sofia displayed sensitivities of 94% and 90% for influenza A and B, respectively, using nasopharyngeal (NP) aspirate/wash specimens, NP swabs, and nasal swabs in a patient cohort that was predominantly <6 years of age (13). A second study evaluated the performance of Sofia compared to that of R-Mix culture using NP swabs collected from an older group of patients (mean age, 27.7 years) and found sensitivities of 82% and 78% for influenza A and B viruses, respectively (14). In both studies, the specificity of Sofia was >95% for both influenza A and B viruses. Some studies have shown sensitivities of ≥90% for both influenza A and B virus detection using Sofia (15, 16) and Veritor (15, 17) compared to RT-PCR, while others have found sensitivities in the range of 75% to 86% for Sofia (13, 18, 19) and 69% to 88% for Veritor (17, 20). Some of the study-to-study variability in performance might be attributed to several factors, including the age of the patient, type of specimen collected, time of collection relative to the onset of symptoms, and version of the test used. For example, the use of some lots of Sofia kits was shown to result in a significant number of false-positive results, particularly for influenza B virus (15, 16), resulting in a manufacturer recall of specific lots in December 2012 (21). In all studies in which rapid antigen tests other than Sofia and Veritor were also included, both digitally read assays displayed significantly better sensitivities for detecting influenza viruses (14, 15, 17, 18, 22). In terms of workflow, once the sample is added to the test strip, Sofia requires a 15-min incubation and Veritor requires a 10-min incubation. The time to process and test a single specimen or a small batch, therefore, is much shorter overall using Veritor, a time frame similar to or slightly less than that using the BinaxNOW influenza A&B assay (15, 17).

RIDTs have generally not performed well in detecting novel and variant influenza A virus strains. Human infections with these viruses have been a concern from both a public health perspective and the perspective of managing individual patients. Sofia and Veritor have demonstrated improved antigen detection capabilities in this area. An evaluation of FDA-cleared rapid antigen tests to detect 7 different clinical isolates of influenza A variant H3N2 [(H3N2)v] virus showed that only four of seven assays, including both Sofia and Veritor, detected all strains (23). Likewise, for the novel avian origin influenza A (H7N9) virus, the limits of detection (LODs) were lowest for Veritor and Sofia among six rapid antigen tests (24), and in serial respiratory specimens collected from a patient with H7N9 infection, Veritor was positive in more samples than four other rapid antigen assays, including Sofia and the direct fluorescent-antibody assay (DFA) (25). In another study, the LODs for influenza A (H7N9) virus for Veritor and Sofia were more than 1 log dilution lower than those of five other rapid antigen tests as well as two commercially available multiplex molecular assays (26). The performance of both Sofia and Veritor in the detection of 2009/pH1N1 strains has also shown improvement, with sensitivities ranging from 80% to 100% in clinical studies (15, 17, 19, 20).

Clearly, these newer digital immunoassays (DIAs) are an upgrade over previous, commercially available lateral-flow immunoassays. To date, they have demonstrated enhanced sensitivity of detection for seasonal, novel, and variant influenza strains, and as a result, the negative predictive values have typically been >90% compared to those of RT-PCR in clinical studies. Generally speaking, the positive predictive values have been >90%, except in studies using Sofia performed around the time that specific lots were recalled, a technical issue apparently rectified by the manufacturer. The DIAs are competitively priced at or below the list prices of older RIDT platforms (∼$15 to $22 per test), and in an era when molecular testing for influenza viruses can cost upwards of $100 per test, the ability to utilize a rapid and accurate immunochromatographic assay of this kind is particularly attractive.

In light of the recent FDA proposal to reclassify RIDTs from class I to class II devices subject to special controls and performance standards (27), it will be of interest to learn how many different RIDTs remain commercially available in the next few years. If adopted, this new rule would (i) identify the minimum acceptable performance criteria, (ii) identify the appropriate comparator for establishing the performance of new assays, and (iii) call for mandatory annual analytical reactivity testing of contemporary influenza strains, including newly emerging strains that pose a public health emergency threat. For devices to be cleared for marketing and to remain on the market, the performance criteria listed in Table 1 would be required, standards that few immunoassays currently fulfill. Even the DIAs described herein have widely varying performance characteristics depending on the patient population being tested and the type of specimen collected. It may be that FDA clearance of certain RIDTs becomes limited to only certain age groups and sample types. For the most part, the DIAs appear to consistently meet the criteria and may be most useful when nasal wash/aspirate or NP swabs from symptomatic pediatric patients are used for testing (13, 15, 17, 19), although the performance of Sofia in some cases was skewed around the time of the recall (21). Undoubtedly, the landscape of rapid influenza virus testing will be reshaped in the near future.

TABLE 1.

FDA-proposed minimal performance criteria for RIDTs(a)

Comparator         RIDT characteristic   % point estimate (lower 95% CI) for:
                                         Influenza A virus     Influenza B virus
Viral culture      Sensitivity           ≥90 (≥80)             ≥80 (≥70)
                   Specificity           ≥95 (≥90)             ≥95 (≥90)
Molecular assay    Sensitivity           ≥80 (≥70)             ≥80 (≥70)
                   Specificity           ≥95 (≥90)             ≥95 (≥90)

(a) See reference 27. CI, confidence interval.
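
As an illustration of how an observed sensitivity could be checked against these proposed thresholds, the sketch below computes the point estimate and a lower 95% confidence bound using the Wilson score interval; the counts are hypothetical, and the proposed rule may specify a different interval method.

```python
import math

def wilson_lower_bound(successes, n, z=1.96):
    """Lower limit of the Wilson score 95% confidence interval for a proportion."""
    p = successes / n
    center = p + z**2 / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (center - margin) / (1 + z**2 / n)

# Hypothetical evaluation: 92 of 100 culture-positive influenza A specimens detected
sens = 92 / 100
lower = wilson_lower_bound(92, 100)
# Proposed culture-comparator thresholds for influenza A: point estimate >= 90%, lower CI >= 80%
meets_criteria = sens >= 0.90 and lower >= 0.80
print(f"sensitivity {sens:.1%}, lower 95% CI bound {lower:.1%}, meets criteria: {meets_criteria}")
```

In this hypothetical example, the point estimate is 92.0% and the Wilson lower bound is about 85%, so the assay would satisfy the proposed influenza A sensitivity criterion against culture.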

Since their introduction in the 1990s, RIDTs have been valued for their ease of use, quick time to a result, and high positive predictive value during periods of higher prevalence. However, the poor sensitivity of many of these tests has been concerning, since the misdiagnosis of influenza infection can have potentially serious consequences. As laboratory experts, we have an obligation to provide accurate test results, and implementation of RIDTs that meet the new FDA criteria should provide clinicians with reliable diagnostic information, reduce the likelihood of false-negative results, and enable effective infection control and public health responses during influenza outbreaks.

James J. Dunn

REFERENCES

  • 1.Babcock HM, Merz LR, Dubberke ER, Fraser VJ. 2008. Case-control study of clinical features of influenza in hospitalized patients. Infect Control Hosp Epidemiol 29:921–926. doi: 10.1086/590663. [DOI] [PubMed] [Google Scholar]
  • 2.Heinonen S, Peltola V, Silvennoinen H, Vahlberg T, Heikkinen T. 2012. Signs and symptoms predicting influenza in children: a matched case-control analysis of prospectively collected clinical data. Eur J Clin Microbiol Infect Dis 31:1569–1574. doi: 10.1007/s10096-011-1479-4. [DOI] [PubMed] [Google Scholar]
  • 3.Poehling KA, Edwards KM, Weinberg GA, Szilagyi P, Staat MA, Iwane MK, Bridges CB, Grijalva CG, Zhu Y, Bernstein DI, Herrera G, Erdman D, Hall CB, Seither R, Griffin MR. 2006. The underrecognized burden of influenza in young children. N Engl J Med 355:31–40. doi: 10.1056/NEJMoa054869. [DOI] [PubMed] [Google Scholar]
  • 4.Harper SA, Bradley JS, Englund JA, File TM, Gravenstein S, Hayden FG, McGeer AJ, Neuzil KM, Pavia AT, Tapper ML, Uyeki M, Zimmerman RK. 2009. Seasonal influenza in adults and children—diagnosis, treatment, chemoprophylaxis, and institutional outbreak management: clinical practice guidelines of the Infectious Diseases Society of America. Clin Infect Dis 48:1003–1032. doi: 10.1086/598513. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Williams LO, Kupka NJ, Schmaltz SP, Barrett S, Uyeki TM, Jernigan DB. 2014. Rapid influenza diagnostic test use and antiviral prescriptions in outpatient settings pre- and post-2009 H1N1 pandemic. J Clin Virol 60:27–33. doi: 10.1016/j.jcv.2014.01.016. [DOI] [PubMed] [Google Scholar]
  • 6.Chartrand C, Leeflang M, Minion J, Brewer T, Pai M. 2012. Accuracy of rapid influenza diagnostic tests. Ann Intern Med 156:500–511. doi: 10.7326/0003-4819-156-7-201204030-00403. [DOI] [PubMed] [Google Scholar]
  • 7.Petrozzino JJ, Smith C, Atkinson MJ. 2010. Rapid diagnostic testing for seasonal influenza: an evidence-based review and comparison with unaided clinical diagnosis. J Emerg Med 39:476–490. doi: 10.1016/j.jemermed.2009.11.031. [DOI] [PubMed] [Google Scholar]
  • 8.Babin SM, Hsieh YH, Rothman RE, Gaydos CA. 2011. A meta-analysis of point-of-care laboratory tests in the diagnosis of novel 2009 swine-lineage pandemic influenza A (H1N1). Diagn Microbiol Infect Dis 69:410–418. doi: 10.1016/j.diagmicrobio.2010.10.009. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Ginocchio CC, Zhang F, Manji R, Arora S, Bornfreund M, Falk L, Lotlikar M, Kowerska M, Becker G, Korologos D, de Geronimo M, Crawford JM. 2009. Evaluation of multiple test methods for the detection of the novel 2009 influenza A(H1N1) during the New York City outbreak. J Clin Virol 45:191–195. doi: 10.1016/j.jcv.2009.06.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Chu H, Lofgren ET, Halloran ME, Kuan PF, Hudgens M, Cole SR. 2012. Performance of rapid influenza H1N1 diagnostic tests: a meta-analysis. Influenza Other Respir Viruses 6:80–86. doi: 10.1111/j.1750-2659.2011.00284.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.World Health Organization. 2010. Use of influenza rapid diagnostic tests. World Health Organization, Geneva, Switzerland: http://whqlibdoc.who.int/publications/2010/9789241599283_eng.pdf. [Google Scholar]
  • 12.Centers for Disease Control and Prevention. Guidance for clinicians on the use of rapid influenza diagnostic tests. Centers for Disease Control and Prevention, Atlanta, GA: http://www.cdc.gov/flu/professionals/diagnosis/clinician_guidance_ridt.htm. [Google Scholar]
  • 13.Lewandrowski K, Tamerius J, Menegus M, Olivo PD, Lollar R, Lee-Lewandrowski E. 2013. Detection of influenza A and B viruses with the Sofia analyzer. Am J Clin Pathol 139:684–689. doi: 10.1309/AJCP7ZTLJCP3LLMA. [DOI] [PubMed] [Google Scholar]
  • 14.Lee CK, Cho CH, Woo MK, Nyeck AE, Lim CS, Kim WJ. 2012. Evaluation of Sofia fluorescent immunoassay for influenza A/B virus. J Clin Virol 55:239–243. doi: 10.1016/j.jcv.2012.07.008. [DOI] [PubMed] [Google Scholar]
  • 15.Dunn J, Obuekwe J, Baun T, Rogers J, Patel T, Snow L. 2014. Prompt detection of influenza A and B viruses using the BD Veritor™ System Flu A+B, Quidel® Sofia® Influenza A+B FIA, and Alere BinaxNOW® Influenza A&B compared to real-time reverse transcription-polymerase chain reaction (RT-PCR). Diagn Microbiol Infect Dis 79:10–13. doi: 10.1016/j.diagmicrobio.2014.01.018. [DOI] [PubMed] [Google Scholar]
  • 16.Olsen SJ, Kittikraisak W, Fernandez S, Suntarattiwong P, Chotpitayasunondh T. 2014. Challenges with new rapid influenza diagnostic tests. Pediatr Infect Dis J 33:117–118. doi: 10.1097/INF.0000000000000089. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Hassan F, Nguyen A, Formanek A, Bell J, Selvarangan R. 2014. Comparison of the BD Veritor System for Flu A+B with Alere BinaxNOW influenza A&B card for detection of influenza A and B viruses in respiratory specimens from pediatric patients. J Clin Microbiol 52:906–910. doi: 10.1128/JCM.02484-13. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Leonardi GP, Wilson AM, Zuretti AR. 2013. Comparison of conventional lateral-flow assays and a new fluorescent immunoassay to detect influenza viruses. J Virol Methods 189:379–382. doi: 10.1016/j.jviromet.2013.02.008. [DOI] [PubMed] [Google Scholar]
  • 19.Rath B, Tief F, Obermeier P, Tuerk E, Karsch K, Muehlhans S, Adamou E, Duwe S, Schweiger B. 2012. Early detection of influenza A and B infection in infants and children using conventional and fluorescence-based rapid testing. J Clin Virol 55:329–333. doi: 10.1016/j.jcv.2012.08.002. [DOI] [PubMed] [Google Scholar]
  • 20.Nam MH, Jang JW, Lee JH, Cho CH, Lim CS, Kim WJ. 2014. Clinical performance evaluation of the BD Veritor System Flu A+B assay. J Virol Methods 204:86–90. doi: 10.1016/j.jviromet.2014.04.009. [DOI] [PubMed] [Google Scholar]
  • 21.US Food and Drug Administration. 2013. Class 2 device recall Sofia influenza A B FIA, kit 20218. Food and Drug Administration, Silver Spring, MD: http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfRes/res.cfm?ID=115438. [Google Scholar]
  • 22.Peters TR, Blakeney E, Vannoy L, Poehling KA. 2013. Evaluation of the limit of detection of the BD Veritor™ system flu A+B test and two rapid influenza detection tests for influenza virus. Diagn Microbiol Infect Dis 75:200–202. doi: 10.1016/j.diagmicrobio.2012.11.004. [DOI] [PubMed] [Google Scholar]
  • 23.Centers for Disease Control and Prevention. 2012. Evaluation of rapid influenza diagnostic tests for influenza A (H3N2)v virus and updated case count—United States, 2012. MMWR Morb Mortal Wkly Rep 61:619–621. [PubMed] [Google Scholar]
  • 24.Baas C, Barr IG, Fouchier RA, Kelso A, Hurt AC. 2013. A comparison of rapid point-of-care tests for the detection of avian influenza A(H7N9) virus, 2013. Euro Surveill 18:pii=20487 http://www.eurosurveillance.org/ViewArticle.aspx?ArticleId=20487. [PubMed] [Google Scholar]
  • 25.Chan K, To K, Chan J, Li C, Chan K, Chen H, Ho P, Yuen K. 2014. Assessment of antigen and molecular tests with serial specimens from a patient with influenza A (H7N9) infection. J Clin Microbiol 52:2272–2274. doi: 10.1128/JCM.00446-14. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Chan K, To K, Chan J, Li C, Chen H, Yuen K. 2013. Analytical sensitivity of seven point-of-care influenza virus detection tests and two molecular tests for detection of avian origin H7N9 and swine origin H3N2 variant influenza A viruses. J Clin Microbiol 51:3160–3161. doi: 10.1128/JCM.01222-13. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.US Department of Health and Human Services. 2014. Microbiology devices; reclassification of influenza virus antigen detection test systems intended for use directly with clinical specimens. 21 CFR Part 866. https://www.federalregister.gov/articles/2014/05/22/2014-11635/microbiology-devices-reclassification-of-influenza-virus-antigen-detection-test-systems-intended-for. [PubMed]

COUNTERPOINT

The use of rapid influenza A/B diagnostic tests (RIDTs) versus nucleic acid amplification tests (NAATs) was first debated in the inaugural January 2010 Point-Counterpoint (1), in response to reports of poor performance of RIDTs during the 2009 influenza A H1N1 pandemic (pH1N1) (2, 3). Since 2009, the number of U.S. Food and Drug Administration-cleared RIDTs has risen to 16 (4). Advantages of the newer RIDTs include improved yet variable reactivity to circulating influenza strains and detection technologies with the potential to bridge the performance gap between RIDTs and NAATs.

Are the newer RIDTs an improvement over the older RIDTs, and are they more comparable to NAATs? A 2012 evaluation of 11 RIDTs by the U.S. Centers for Disease Control and Prevention (CDC; Atlanta, GA) using dilutions of 23 circulating influenza A and B virus strains demonstrated that the number of positive tests varied by influenza type (A or B) and influenza A subtype (5). Overall, there were no performance differences between the influenza B virus lineages. Conversely, there was high variability in the analytical results for influenza A among the strains tested, especially at lower viral concentrations. Additional studies have shown that RIDT clinical sensitivity is lower for detection of pH1N1 (55.8%) than for detection of H3N2 (71.0%) and the previous seasonal H1N1 strain (69.4%), owing to the lower virus burden in pH1N1 infections rather than a diminished capacity of the tests to detect the strain (6, 7). A combination of lower analytical and clinical sensitivities can have a significant impact on RIDT performance.

Two new instrument-read immunoassays, the Sofia Influenza A+B fluorescence immunoassay (FIA) (Quidel, San Diego, CA) and the Veritor System for Flu A+B (BD Diagnostics, Sparks, MD), were developed to enhance performance through automated reading of the test strip. The assays were designed to detect the currently circulating strains (influenza B virus Victoria and Yamagata lineages, pH1N1, and influenza A virus [H3N2]) and a variety of influenza A subtypes, including variations of H2 to H9, H11 to H14, and H16, depending on the assay. Clinical trial performance data varied by specimen type and age of the patients tested (8, 9). Compared to viral culture, Sofia demonstrated (in groups with ≥10 samples) sensitivities for influenza A virus detection ranging from 78% (nasal swabs [NS]; patient ages, 22 to 59 years) to 99% (nasopharyngeal wash specimens/aspirates [NPW/A]; patient age, <6 years) and for influenza B virus detection ranging from 73% (NS; patient ages, 22 to 59 years) to 94% (NP swabs [NPS]; patient ages, 6 to 21 years) (8). Compared to values for an FDA-cleared RT-PCR assay in the prospective arm of the Veritor clinical trial study, the percentages of positive agreement for influenza A ranged from 81.3% (NPS) to 83% (NPW/A) and for influenza B ranged from 81.3% (NPS) to 85.6% (NPW/A) (9). Both assays demonstrated high specificities for influenza A and B. However, the overall performance data for both assays were heavily weighted toward younger patients, and age is a factor known to affect assay performance, since younger patients shed larger amounts of virus and for longer periods, whereas geriatric patients show the lowest viral titers. In total, 96% of samples tested with Sofia and 77% tested with Veritor were from patients <22 years old, and only 1% of the samples tested by Sofia and 0.1% tested by Veritor were from patients ≥60 years. Therefore, assay performance was not well established for an advanced age group that has the highest rates of influenza-associated morbidity and mortality. Gao et al. evaluated the effect of age on the clinical sensitivity of other RIDTs in comparison to RT-PCR and demonstrated that increasing age was negatively associated with RIDT sensitivity (<2 years, 85.7%; 2 to 39 years, 60.3%; ≥40 years, 33.3%) (10).

Independent studies compared the performances of Sofia and Veritor to those of other RIDTs, using either NAAT or NAAT plus viral culture as the reference methods (11–16). Sofia testing yielded sensitivities for influenza A and B virus detection ranging from 78.9% to 95.8% and 62.5% to 98.1%, respectively (11–14), whereas nonfluorescent RIDTs demonstrated sensitivities for influenza A and B virus detection ranging from 54.8% to 79.2% and 40.7% to 97.3%, respectively. Veritor sensitivities for influenza A and B virus detection ranged from 72.0% to 93.8% and 69.3% to 94.2%, respectively, whereas nonfluorescent RIDTs demonstrated sensitivities for influenza A and B virus detection ranging from 56.0% to 85.7% and 57.3% to 80.8%, respectively (14–16). The sensitivities of Sofia and Veritor were better than those of some RIDTs, were inferior to those of NAATs, and varied depending on the influenza strain, specimen type, and patient population, with pediatric studies yielding the best results.

Another CDC study evaluated seven RIDTs, including Sofia and Veritor, in comparison to the CDC Flu rRT-PCR Dx panel for detection of influenza A (H3N2) variant (H3N2v) viruses (17). Four of seven RIDTs (Directigen EZ Flu A+B [BD], Sofia, Veritor, and Xpect Flu A&B [Remel, Lenexa, KS]) detected all seven H3N2v strains, BinaxNOW influenza A&B (Alere, Waltham, MA) detected five of seven strains, the QuickVue Influenza A+B test (Quidel) detected three of seven strains, and SAS FluAlert A&B (SA Scientific, San Antonio, TX) detected only one strain. This study highlights the variable performances of RIDTs for emerging strains, indicating that several viruses and subtypes should be evaluated with each RIDT on a regular basis. Currently, RIDT manufacturers are not required to reevaluate the performance of their assays once they are FDA cleared. However, the emergence of novel or variant influenza strains requires more stringent oversight of RIDTs to provide reasonable assurance of their safety and effectiveness. Consequently, the FDA has proposed that all RIDTs regulated under §866.3330 of the Code of Federal Regulations (18) be reclassified as class II with special controls. This would include mandatory annual analytical reactivity testing of contemporary influenza strains, including newly emerging strains that pose a danger to public health.

Despite the suboptimal performance of RIDTs, studies have shown that RIDTs improve diagnostic sensitivity for seasonal influenza above that of unaided clinical diagnosis and affect clinical decision-making, thereby reducing diagnostic testing, antibiotic use, and emergency department utilization while increasing antiviral prescription rates (19).

If these benefits are realized for RIDTs, why would we not want to use more sensitive and specific NAATs in lieu of RIDTs to further enhance the above-named benefits? Currently, there are 26 FDA-cleared NAATs for the detection of influenza A and B (4). Please refer to Table 1 in the CDC reference (20) for specifics on each assay and to published reviews (21, 22), as a comprehensive overview is beyond the scope of this commentary. Eleven assays detect influenza A and B without influenza A virus subtyping, eight detect influenza A and B with influenza A virus subtyping, six detect just influenza A, with subtyping, and one detects influenza B virus, with subtyping. Ten assays detect additional respiratory viruses (1 to 15), and one assay detects three additional bacterial pathogens, allowing for a more comprehensive syndromic diagnostic screening. NPS (preferably flocked swabs, which have been demonstrated to collect more cellular material) are approved for all tests, with various additional sample types, including NS, NPW/A, tracheal aspirates, and lower respiratory tract specimens (LRTS), approved depending on the test (20). Virus detection in LRTS is essential in severe cases of pneumonia, since upper respiratory tract samples may test negative. NAAT clinical performance has been consistently better than that of RIDTs, including those developed after 2009, when used in a variety of clinical settings and patient populations (2, 3, 9, 10, 21, 22). NAAT sensitivities range from 90.5% to 97.6%, thereby improving the diagnosis, especially in older patients and when suboptimal samples are collected. The majority of studies demonstrate a specificity close to 100% (21, 22), providing confidence in a positive result, especially outside the influenza season.

Considering the limitations of RIDTs and the better performance of NAATs, I pose several questions relating to the test features that are consistently used to promote the use of RIDTs: ease of use, rapid turnaround time, and low cost compared to those of NAATs. Finally, I question the argument that detection of influenza A or B alone is sufficient because there are no treatments for other respiratory viruses and their identification does not necessarily change patient management.

Are RIDTs easier to perform than all NAATs? NAATs vary in format (although all include steps to extract, amplify, and detect nucleic acid), ease of use (hands-on times, 2 min to several hours), and times to results (<30 min to 8 or more hours). The only FDA CLIA-waived NAAT, the Alere i Influenza A&B assay (Alere, Waltham, MA), takes approximately 15 min to perform. The Liat Influenza A/B assay (IQuum, Marlborough, MA) has a <30-min test time, the Xpert Flu assay (Cepheid, Sunnyvale, CA) and the FilmArray RP assay (BioFire/bioMérieux, Salt Lake City, UT) provide results in approximately 1 h, and the Simplexa Flu A/B & RSV Direct assay (Focus Diagnostics, 3M, Cypress, CA) provides results in <2 h. All four NAATs, like RIDTs, require minimal hands-on time (one step and less than 2 min to perform), and they are listed as being of moderate complexity. The Verigene respiratory virus nucleic acid test and Verigene respiratory virus plus nucleic acid test (Nanosphere, Northbrook, IL) are also listed as being of moderate complexity, with results in 3.5 h and a minimal 2-step process (addition of sample and reading of results).

Is a 15-min RIDT with a 72% to 85% sensitivity better for facilitating rapid patient care than a 98%-sensitive NAAT with a 15- to 60-min time to results? A very minimal gain in the time to results cannot justify an incorrect or missed diagnosis. Diagnostic accuracy has been shown to reduce ancillary testing, assist in decisions to admit or not admit, and facilitate appropriate therapeutic decisions (antiviral, antibiotic, or none). Incorrect treatment is expensive and can result in greater risk of toxicity, adverse effects, and development of drug resistance (21–23). Optimal test performance is essential for infection control to reduce the risk of nosocomial outbreaks and their associated morbidity and mortality and to reduce the significant costs to health care systems, including health care worker (HCW) absenteeism, medication costs, and staff time spent in outbreak control. Yassi et al. investigated an influenza A outbreak that resulted in infection of 17 HCWs (34% of those servicing the ward) and 16 chronic geriatric patients (47% of patients), of whom 3 died (24). Another study relating to an influenza A H3N2 outbreak among geriatric patients and HCWs identified six nosocomial infections and three independent clusters, with an HCW source identified in at least two (25). Nosocomial outbreaks of influenza/parainfluenza have been significantly associated (P < 0.001) with complete closure of medical units (26). Outbreaks are especially dangerous for immunocompromised patients, for whom there is a risk of prolonged shedding and development of oseltamivir resistance. An outbreak in a hematologic/oncologic unit resulted in 23/76 patients (32%) developing nosocomial influenza, three patients being identified with oseltamivir-resistant virus, and three patients dying (27). An outbreak investigation in the Netherlands found nosocomial transmission of oseltamivir-resistant virus, leading to three cases of pneumonia and two deaths (28).
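
To put the sensitivity gap in concrete terms, the sketch below estimates how many infections would be missed per 1,000 patients tested. The 20% prevalence figure is an assumption chosen for illustration, and the sensitivities are the round numbers quoted in the question above.

```python
def missed_per_1000(sensitivity, prevalence, n=1000):
    """Expected number of influenza infections reported as false negatives per n patients tested."""
    return n * prevalence * (1 - sensitivity)

# Assumed 20% influenza prevalence among patients tested during the season
for label, sens in (("RIDT, 72% sensitive", 0.72),
                    ("RIDT, 85% sensitive", 0.85),
                    ("NAAT, 98% sensitive", 0.98)):
    print(f"{label}: ~{missed_per_1000(sens, 0.20):.0f} missed infections per 1,000 tested")
```

Under these assumptions, the RIDT misses roughly 30 to 56 infections per 1,000 patients tested versus about 4 for the NAAT, and each missed case is a potential lapse in treatment and infection control.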

Is the detection of just influenza A and/or B by RIDTs sufficient in a hospitalized patient, especially since antiviral treatment is available only for influenza? Despite the fact that treatment is limited to influenza, identification of other respiratory viruses as the cause of disease will reduce unnecessary use of antivirals and antibiotics, promoting good stewardship practices. A study by Rodgers et al. found that use of a rapid comprehensive respiratory panel that detected 21 viruses and 3 bacterial pathogens resulted in significant improvements by reducing the mean time to results, increasing the percentage of patients in the emergency department (ED) with a result, and shortening the duration of antibiotic use if results were provided within 4 h (23). A positive result decreased inpatient length of stay and time in isolation. Additionally, one must consider the potential for mixed viral infections and the impact of cohorting during a busy respiratory season. One study identified coinfections with other respiratory viruses in 7.2% of patients with influenza (29), and another study found that 3.39% of all specimens and 9.55% of all positive specimens contained more than one virus, with influenza virus present in 52.17% of the mixed infections (30). A secondary viral infection in an already-compromised patient can result in severe outcomes.

Most NAATs can subtype influenza A virus strains. This is important for surveillance and to identify potential new variants. A sample testing positive for the influenza A matrix gene but negative for the current hemagglutinin types would be reported as an "unsubtypeable" strain, as was the case with pH1N1 (31). Subtyping may be relevant for treatment, as different subtypes can have different antiviral drug susceptibilities, as noted with prepandemic seasonal influenza A(H1N1), which was oseltamivir resistant, whereas the cocirculating H3N2 strain was susceptible. Although in the 2013–2014 season 98.2% of pH1N1 strains and 100% of H3N2 and influenza B virus strains were oseltamivir susceptible, oseltamivir-resistant pH1N1 did occur, and if resistant subtypes were to predominate, subtyping would be essential (32).

Is performing an inexpensive RIDT (average cost, $10 to $15) really less expensive than performing a NAAT? The cost of a NAAT can range from approximately $40 to $150 or more. An RIDT would be less expensive in an outreach setting if no further testing was required. However, for inpatients or for persons at risk for severe disease, in accordance with CDC guidelines, negative results for a patient suspected of having influenza should be reflexed to more-sensitive methodologies, such as viral culture and NAAT, with significant additional testing costs and a delay in the time to results. Delays in results lead to additional ancillary testing and prolonged hospital stays. The study by Rodgers et al. determined that a positive viral test result decreased the inpatient length of stay and time in isolation (23). Cost savings were estimated at $231 in hospital costs and $17 in antibiotic usage per patient, plus $178 per patient in testing costs compared with running a panel of individual PCRs in lieu of the single rapid comprehensive respiratory panel. The extra cost of a NAAT can be justified by the prevention of nosocomial outbreaks with significant impact on hospital finances, particularly if an entire ward must be closed, HCWs must be furloughed, and patients must receive antiviral prophylaxis.
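
A back-of-the-envelope sketch of the reflex-testing cost argument follows; the prices, the test characteristics, and the assumption that every negative RIDT result is reflexed to a NAAT are all illustrative, not figures from the cited studies.

```python
def expected_cost_with_reflex(ridt_cost, naat_cost, sensitivity, specificity, prevalence):
    """Expected per-patient testing cost when every negative RIDT result is reflexed to a NAAT."""
    p_ridt_negative = prevalence * (1 - sensitivity) + (1 - prevalence) * specificity
    return ridt_cost + p_ridt_negative * naat_cost

# Illustrative figures: $12 RIDT, $80 NAAT, 70% sensitivity, 98% specificity
for prev in (0.05, 0.20):
    cost = expected_cost_with_reflex(12, 80, 0.70, 0.98, prev)
    print(f"prevalence {prev:.0%}: RIDT plus reflex ~${cost:.2f} per patient vs $80.00 for upfront NAAT")
```

Because most specimens test negative, the reflex strategy approaches or exceeds the cost of simply performing the NAAT first in this scenario, before any savings from faster, more accurate results are counted.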

Have RIDTs, compared to NAATs, improved to an extent that we should support their use alone or as a primary screening test in all clinical settings? Better-performing RIDTs could be used under circumstances of financial constraint, where there are no other options, in appropriate outpatient and ED discharge settings for specific patient populations, such as nonimmunocompromised pediatric patients, and when only influenza is documented in the community. Even in these settings, clinicians must be aware that RIDT performances are not equivalent among all tests for all strains, and manufacturer claims do not necessarily reflect real-world performance, which varies with sample type, time of sample collection, and virus burden (33). Despite the known poor sensitivities of some RIDTs or a lack of knowledge about their performance in community settings, Williams et al. found that clinicians in outpatient settings often relied on RIDTs for deciding antiviral therapy rather than following CDC recommendations for patients at higher risk for complications (34). Many clinicians do not follow the CDC recommendations for reflex testing to confirm positive results when the prevalence of influenza is low or to confirm negative results for a patient highly suspected of having influenza and/or at increased risk for severe disease when disease prevalence is high. For hospitalized patients, only NAATs should be performed, preferably with a comprehensive respiratory panel. Additionally, monitoring of critically ill patients using a quantitative NAAT has been shown to help evaluate responses to antiviral therapy. Performing a one-step NAAT is as simple as performing an RIDT; it can be done on demand 24 h a day, 7 days a week, in laboratories of all sizes, with minimal technical expertise required. A missed diagnosis can have a significant impact on patient clinical outcome and a financial impact on laboratory services and the utilization of health care resources.

Christine C. Ginocchio

ACKNOWLEDGMENT

I am an employee of bioMérieux.

REFERENCES

  • 1.Welch DF, Ginocchio CC. 2010. Role of rapid immunochromatographic antigen testing in diagnosis of influenza A virus 2009 H1N1 infection. J Clin Microbiol 48:22–25. doi: 10.1128/JCM.02268-09. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Centers for Disease Control and Prevention. 2009. Evaluation of rapid influenza diagnostic tests for detection of novel influenza A (H1N1) virus—United States. MMWR Morb Mortal Wkly Rep 58:826–829. [PubMed] [Google Scholar]
  • 3.Ginocchio CC, Zhang F, Manji R, Arora S, Bornfreund M, Falk L, Lotlikar M, Kowerska M, Becker G, Korologos D, de Geronimo M, Crawford JM. 2009. Evaluation of multiple test methods for the detection of the novel 2009 influenza A (H1N1) during the New York City outbreak. J Clin Virol 45:191–195. doi: 10.1016/j.jcv.2009.06.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.US Food and Drug Administration. 24 February 2015. Premarket notification 510(k). Food and Drug Administration, Silver Spring, MD: http://www.fda.gov/MedicalDevices/DeviceRegulationandGuidance/HowtoMarketYourDevice/PremarketSubmissions/PremarketNotification510k/default.htm. [Google Scholar]
  • 5.Beck E, Fan J, Hendrickson K, Kumar S, Shively R, Kramp W, Villanueva J, Jernigan D, Klimov A, Chen LM, Donis R, Williams T, Pirkle J, Barr J. 2012. Evaluation of 11 commercially available rapid influenza diagnostic tests—United States, 2011–2012. MMWR Morb Mortal Wkly Rep 61:873–876. [PubMed] [Google Scholar]
  • 6.Chan KH, Chan KM, Ho YL, Lam YP, Tong HL, Poon LL, Cowling BJ, Peiris JS. 2012. Quantitative analysis of four rapid antigen assays for detection of pandemic H1N1 2009 compared with seasonal H1N1 and H3N2 influenza A viruses on nasopharyngeal aspirates from patients with influenza. J Virol Methods 186:184–188. doi: 10.1016/j.jviromet.2012.09.001. [DOI] [PubMed] [Google Scholar]
  • 7.Yang JR, Lo J, Ho YL, Wu HS, Liu MT. 2011. Pandemic H1N1 and seasonal H3N2 influenza infection in the human population show different distributions of viral loads, which substantially affect the performance of rapid influenza tests. Virus Res 155:163–167. doi: 10.1016/j.virusres.2010.09.015. [DOI] [PubMed] [Google Scholar]
  • 8.Quidel Corp. 2014. Sofia Influenza A+B FIA, package insert 1219103EN01. Quidel Corp, San Diego, CA. [Google Scholar]
  • 9.Becton, Dickinson and Company. 2012. BD Veritor System for rapid detection of flu A+B 8087667(03). Becton, Dickinson and Company, Sparks, MD. [Google Scholar]
  • 10.Gao F, Loring C, Laviolette M, Bolton D, Daly ER, Bean C. 2012. Detection of 2009 pandemic influenza A(H1N1) virus infection in different age groups by using rapid influenza diagnostic tests. Influenza Other Respir Viruses 6:e30–e34. doi: 10.1111/j.1750-2659.2011.00313.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Lee CK, Cho CH, Woo MK, Nyeck AE, Lim CS, Kim WJ. 2012. Evaluation of Sofia fluorescent immunoassay analyzer for influenza A/B virus. J Clin Virol 55:239–243. doi: 10.1016/j.jcv.2012.07.008. [DOI] [PubMed] [Google Scholar]
  • 12.Rath B, Tief F, Obermeier P, Tuerk E, Karsch K, Muehlhans S, Adamou E, Duwe S, Schweiger B. 2012. Early detection of influenza A and B infection in infants and children using conventional and fluorescence-based rapid testing. J Clin Virol 55:329–333. doi: 10.1016/j.jcv.2012.08.002. [DOI] [PubMed] [Google Scholar]
  • 13.Leonardi GP, Wilson AM, Zuretti AR. 2013. Comparison of conventional lateral-flow assays and a new fluorescent immunoassay to detect influenza viruses. J Virol Methods 189:379–382. doi: 10.1016/j.jviromet.2013.02.008. [DOI] [PubMed] [Google Scholar]
  • 14.Dunn J, Obuekwe J, Baun T, Rogers J, Patel T, Snow L. 2014. Prompt detection of influenza A and B viruses using the BD Veritor System Flu A+B, Quidel Sofia Influenza A+B FIA, and Alere BinaxNOW® influenza A&B compared to real-time reverse transcription-polymerase chain reaction (RT-PCR). Diagn Microbiol Infect Dis 79:10–13. doi: 10.1016/j.diagmicrobio.2014.01.018. [DOI] [PubMed] [Google Scholar]
  • 15.Hassan F, Nguyen A, Bell JJ, Selvarangan R. 2014. Comparison of the BD Veritor System for Flu A+B with the Alere BinaxNOW influenza A&B card for detection of influenza A and B viruses in respiratory specimens from pediatric patients. J Clin Microbiol 52:906–910. doi: 10.1128/JCM.02484-13. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Nam MH, Jang JW, Lee JH, Cho CH, Lim CS, Kim WJ. 2014. Clinical performance evaluation of the BD Veritor System Flu A+B assay. J Virol Methods 204:86–90. doi: 10.1016/j.jviromet.2014.04.009. [DOI] [PubMed] [Google Scholar]
  • 17.Balish A, Garten R, Klimov A, Villanueva J. 2013. Analytical detection of influenza A(H3N2)v and other A variant viruses from the USA by rapid influenza diagnostic tests. Influenza Other Respir Viruses 7:491–496. doi: 10.1111/irv.12017. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Code of Federal Regulations. 2001. Title 21. Food and drugs. Chapter I. Food and Drug Administration. Subchapter H. Medical devices. Part 866. Immunology and microbiology devices. Section 866.3330. Influenza virus serological reagents. 21 CFR 866.3330 http://www.accessdata.fda.gov/SCRIPTs/cdrh/cfdocs/cfcfr/cfrsearch.cfm?fr=866.3330.
  • 19.Petrozzino JJ, Smith C, Atkinson MJ. 2010. Rapid diagnostic testing for seasonal influenza: an evidence-based review and comparison with unaided clinical diagnosis. J Emerg Med 39:476–490. doi: 10.1016/j.jemermed.2009.11.031. [DOI] [PubMed] [Google Scholar]
  • 20.Centers for Disease Control and Prevention. 2014. Guidance for clinicians on the use of RT-PCR and other molecular assays for diagnosis of influenza virus infection. Centers for Disease Control and Prevention, Atlanta, GA: http://www.cdc.gov/flu/professionals/diagnosis/molecular-assays.htm. [Google Scholar]
  • 21.Kumar S, Henrickson KJ. 2012. Update on influenza diagnostics: lessons from the novel H1N1 influenza A pandemic. Clin Microbiol Rev 25:344–361. doi: 10.1128/CMR.05016-11. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Cheng VCC, To KKW, Tse H, Hung IFN, Yuen K-Y. 2012. Two years after pandemic influenza A/2009/H1N1: what have we learned? Clin Microbiol Rev 25:223–263. doi: 10.1128/CMR.05012-11. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Rodgers BB, Shankar P, Jerris RC, Kotzbauer D, Anderson EJ, Watson JR, O'Brian LA, Uwindatwa F, McNamara K, Bost JE. 25 August 2014. Impact of a rapid respiratory panel test on patient outcomes. Arch Pathol Lab Med doi: 10.5858/arpa.2014-0257-OA. [DOI] [PubMed] [Google Scholar]
  • 24.Yassi A, McGill M, Holton D, Nicolle L. 1993. Morbidity, cost and role of health care worker transmission in an influenza outbreak in a tertiary care hospital. Can J Infect Dis 4:52–56. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Eibach D, Casalegno JS, Bouscambert M, Benet T, Regis C, Comte B, Kim BA, Vanhems P, Lina B. 2014. Routes of transmission during a nosocomial influenza A(H3N2) outbreak among geriatric patients and healthcare workers. J Hosp Infect 86:188–193. doi: 10.1016/j.jhin.2013.11.009. [DOI] [PubMed] [Google Scholar]
  • 26.Hansen S, Stamm-Balderjan S, Zuschneid I, Behnke M, Ruden H, Vonberg RP, Gastmeier P. 2007. Closure of medical departments during nosocomial outbreaks: data from a systematic analysis of the literature. J Hosp Infect 65:348–353. doi: 10.1016/j.jhin.2006.12.018. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Pollara CP, Piccinelli G, Rossi G, Perandin F, Corbellini S, De Tomasi D, Bonfanti C. 2013. Nosocomial outbreak of the pandemic influenza A (H1N1) 2009 in critical hematologic patients during seasonal influenza 2010–2011: detection of oseltamivir resistant variant viruses. BMC Infect Dis 13:127–133. doi: 10.1186/1471-2334-13-127. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Gooskens J, Jonges M, Claas ECJ, Meijer A, van den Broek PJ, Kroes AC. 2009. Morbidity and mortality associated with nosocomial transmission of oseltamivir-resistant influenza A (H1N1). JAMA 301:1042–1046. doi: 10.1001/jama.2009.297. [DOI] [PubMed] [Google Scholar]
  • 29.Feng L, Li Z, Zhao S, Nair H, Lai S, Xu W, Li M, Wu J, Ren L, Liu W, Yuan Z, Chen YU, Wang X, Zhao Z, Zhang H, Li F, Ye X, Li S, Feikin D, Yu H, Yang W. 2014. Viral etiologies of hospitalized acute lower respiratory infection patients in China, 2009–2013. PLoS One 9(6):e99419. doi: 10.1371/journal.pone.0099419. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Zhang D, He Z, Xu L, Zhu X, Wu J, Wen W, Zheng Y, Deng Y, Chen J, Hu Y, Li M. 2014. Epidemiology characteristics of respiratory viruses found in children and adults with respiratory tract infections in southern China. Int J Infect Dis 25:159–164. doi: 10.1016/j.ijid.2014.02.019. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31.Ginocchio CC, St George K. 2009. Likelihood that an unsubtypeable influenza A result in the Luminex xTAG respiratory virus panel is indicative of novel A/H1N1 (swine-like) influenza. J Clin Microbiol 47:2347–2348. doi: 10.1128/JCM.01027-09. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Centers for Disease Control and Prevention. 8 January 2015. Influenza antiviral drug resistance. Centers for Disease Control and Prevention, Atlanta, GA: http://www.cdc.gov/flu/about/qa/antiviralresistance.htm. [Google Scholar]
  • 33.Cheng PKC, Wong KKY, Mak GC, Wong AH, Ng AYY, Chow SYK, Lam RKH, Lau CS, Ng KC, Lim W. 2010. Performance of laboratory diagnostics for the detection of influenza A (H1N1)v virus as correlated with the time after symptom onset and viral load. J Clin Virol 47:182–185. doi: 10.1016/j.jcv.2009.11.022. [DOI] [PubMed] [Google Scholar]
  • 34.Williams LO, Kupka NJ, Schmaltz SP, Barrett S, Uyeki TM, Jernigan DB. 2014. Rapid influenza diagnostic test use and antiviral prescriptions in outpatient settings pre- and post-2009 H1N1 pandemic. J Clin Virol 60:27–33. doi: 10.1016/j.jcv.2014.01.016. [DOI] [PubMed] [Google Scholar]

SUMMARY

Points of agreement

Many of the points of agreement from the 2010 influenza Point-Counterpoint remain unchanged.

  • Influenza NAATs have sensitivities superior to those of RIDTs for influenza viruses. Negative RIDT results require confirmation by NAATs or culture.

  • RIDTs perform best in the pediatric population, early in the disease course, and during periods of high influenza disease activity.

  • Rapid diagnostic testing by either RIDTs or NAATs reduces ancillary test utilization and inappropriate use of antibacterial agents and promotes appropriate use of influenza-specific antivirals.

New points of agreement

  • The new digital immunoassays for the detection of influenza A and B viruses have performances superior to those of other influenza immunochromatographic assays, with similar turnaround times. In particular, they can detect newly emergent influenza strains, such as influenza A 2009/pH1N1, novel H3N2 variants, and H7N9.

  • Diagnosis of respiratory infections in hospitalized and/or immunocompromised patients is best accomplished using NAAT panels, which can detect multiple viral and bacterial pathogens.

  • Newer NAATs have reduced turnaround times approaching those of the influenza DIAs and greater accuracy, which may obviate the use of DIAs.

Issues to be resolved

  • Studies of the influenza DIAs are needed in the geriatric age groups (>60 years old), the population of patients who suffer the highest morbidity and mortality from influenza.

  • Strict performance criteria for influenza RIDTs are likely to be instituted by the Food and Drug Administration. In particular, failure to meet annual performance standards based on currently circulating viral genotypes may result in some RIDTs being removed from the marketplace. Performance standards may also limit the patient populations or the specimen types that are approved for testing.

Peter H. Gilligan, Editor, Journal of Clinical Microbiology

