Abstract
Context
In 2009, the Centers for Disease Control and Prevention completed migration of all 59 surveillance project areas (PAs) from the case-based HIV/AIDS Reporting System to the document-based Enhanced HIV/AIDS Reporting System.
Objectives
We conducted a PA-level assessment of Enhanced HIV/AIDS Reporting System process and outcome standards for HIV infection cases.
Design
Process standards were reported by PAs and outcome standards were calculated using standardized Centers for Disease Control and Prevention SAS code.
Setting
A total of 59 PAs including 50 US states, the District of Columbia, 6 separately funded cities (Chicago, Houston, Los Angeles County, New York City, Philadelphia, and San Francisco), and 2 territories (Puerto Rico and the Virgin Islands).
Participants
Cases diagnosed or reported to the PA surveillance system between January 1, 2011, and December 31, 2011, using data collected through December 2012.
Main Outcome Measures
Process standards for death ascertainment and intra- and interstate case de-duplication; outcome standards for completeness and timeliness of case reporting, data quality, intrastate duplication rate, risk factor ascertainment, and completeness of initial CD4 and viral load reporting.
Results
Fifty-five of 59 PAs (93%) reported linking cases to state vital records death certificates during 2012, 76% to the Social Security Death Master File, and 59% to the National Death Index. Seventy percent completed monthly intrastate, and 63% completed semiannual interstate de-duplication. Eighty-three percent met the 85% or more case ascertainment standard, and 92% met the 66% or more timeliness standard; 75% met the 97% or more data quality standard; all PAs met the 5% or less intrastate duplication standard; 41% met the 85% or more risk factor ascertainment standard; 90% met the 50% or more standard for initial CD4; and 93% met the same standard for viral load reporting. Overall, 7% of PAs met all 11 process and outcome standards.
Conclusions
Findings support the need for continued improvement in HIV surveillance activities and monitoring of system outcomes.
Keywords: acquired immunodeficiency syndrome, AIDS, evaluation, HIV, surveillance
In 2012, the US Centers for Disease Control and Prevention (CDC) released its vision for 21st-century public health surveillance and assessed the challenges associated with collecting and interpreting high-quality public health information to support prevention practices.1 Among the topics were the impact of emerging technologies on data collection strategies, barriers to data sharing, and the analytic challenges associated with linking multiple unique datasets. The CDC's population-based National Human Immunodeficiency Virus (HIV) Surveillance System (NHSS) is affected by these challenges at different levels across individually funded project areas (PAs) and within multiple components of the system, which include case-level, incidence, perinatal, and molecular elements. The use of NHSS data to guide the allocation of HIV-prevention funding, its increasing use as a mechanism for prevention program development and assessment, and its use as a driver for public health action clearly indicate the need for system evaluation. This article summarizes an evaluation of the key process and outcome measures for HIV infection case-level surveillance at the national, state, local, and territorial levels and serves as a model for ongoing state and national evaluation.
The CDC began surveillance of AIDS in June 1981 and by late 1985 had established the AIDS Reporting System (ARS) data collection tool at the local and state level. In August 1993, the CDC replaced ARS with the HIV/AIDS Reporting System (HARS) so that PAs with integrated name-based reporting could maintain and report cases of HIV infection to the CDC irrespective of stage of disease. To protect confidentiality, personal identifiers such as name and Social Security number are not sent to the CDC. Both ARS and HARS were case-based systems (1 record per person), with limited numbers of variables available (eg, only 20 CD4 and 9 viral load (VL) tests could be entered) and no mechanism for capturing the source from which data originated. These limitations led to the development of the Enhanced HIV/AIDS Reporting System (eHARS), a document-based system that allows an unlimited number of surveillance reports per person to populate demographic, risk factor, clinical care, laboratory result, and mortality variables. All 59 PAs completed migration to eHARS by July 2009, making the 2011 diagnosis year an optimal time for evaluating the data used to inform and equitably distribute federal HIV infection prevention and care funding across the United States.
Methods
Guidance from the CDC on evaluation of surveillance systems was issued in 2001 and recommended the assessment of 9 basic attributes: simplicity, flexibility, data quality, acceptability, sensitivity, predictive value positive, representativeness, timeliness, and stability.2 The earlier Guidelines for National HIV Case Surveillance, published by the CDC in collaboration with the Council of State and Territorial Epidemiologists, included minimum performance standards for 3 of those attributes: sensitivity, timeliness, and data quality.3 Operationalization of these attributes for the specific purpose of evaluating HIV case surveillance resulted in 3 process standards and 8 outcome standards.4 Minimum process standards require (1) annual linkage of HIV infection case reports with state vital statistics death certificates to support death ascertainment; (2) monthly intrastate de-duplication of case records; and (3) semiannual interstate de-duplication of case records, which requires collaboration with other PAs.5–7 Outcome standards include quantitative measures of completeness and timeliness of case reporting, data quality, intrastate case duplication rate, risk factor ascertainment, and reporting of CD4 (count or percent) and VL test results measured within 3 months following initial diagnosis of HIV infection (hereafter referred to as the initial CD4 and VL), as well as publication of an HIV infection surveillance report at the PA level (Table 1). Standardization of these measures across PAs enables the CDC to provide performance feedback and improve the quality of NHSS data.
TABLE 1.
Outcome Standards^a for HIV Case Surveillance in the United States
| Outcome Standard | Definition |
|---|---|
| Completeness and timeliness of case reporting | ≥85% of the expected number of cases for a diagnosis year are reported and ≥66% are reported within 6 mo after diagnosis, assessed at 12 mo after the diagnosis year |
| Data quality | ≥97% of cases for a diagnosis year pass all standard data edits checks, assessed at 12 mo after the diagnosis year |
| Intrastate duplicates | ≤5% duplicate cases, assessed at 12 mo after the report year |
| Risk factor ascertainment | ≥85% of cases for a report year have sufficient risk factor information to be classified into a known HIV transmission category, assessed at 12 mo after the report year |
| CD4 reporting | ≥50% of cases 13 y or older for a diagnosis year have an initial CD4 test result, assessed at 12 mo after the diagnosis year |
| Viral load reporting | ≥50% of cases 13 y or older for a diagnosis year have an initial viral load test result, assessed at 12 mo after the diagnosis year |
| Data dissemination | Annual HIV infection surveillance report published by the project area |
Abbreviation: HIV, human immunodeficiency virus.
^a Standards listed are requirements that, when met, indicate a fully functioning surveillance system with high-quality data.
We evaluated the extent to which the PAs achieved the 3 process and 8 outcome standards for calendar year 2011 (ie, all cases of HIV infection either newly diagnosed or reported, depending on the specific measure, between January 1, 2011, and December 31, 2011). Data reported to the PAs through the end of December 2012 were used for the evaluation to allow at least 12 months for reporting delays, case investigations, and data processing. Evaluation of each PA included only cases among persons who resided in that PA at the time of HIV infection diagnosis. To ensure comparability of results, the CDC provided standardized SAS (version 9.3) programs for PAs to apply; process measures were based on PA self-report. Project areas are required to submit their evaluation results to the CDC as a component of their annual progress reports (APRs).
This study is exempt from institutional review board approval, as it reports only summary-level public health surveillance system evaluation data.
Process standards
The APRs received in March 2013 for the 2011 diagnosis/report year were reviewed to confirm that the 3 process standards were met. Death ascertainment requires evidence of no less than annual linkage of HIV infection case reports with state vital statistics death certificates,5 the Social Security Death Master File,8 and the National Death Index.9 In addition, to ensure the accuracy of case counts at the state and national levels, identification and resolution of intrastate duplicate cases are required monthly and interstate duplicate cases semiannually. Potential intrastate duplicate pairs are identified by matching last name soundex,10 date of birth, sex at birth, and county and state of residence at HIV or AIDS diagnosis within each PA and are manually researched to determine whether they are duplicate records for a single person. Potential interstate duplicate pairs (ie, reported by different PAs) are identified by the CDC by matching on last name soundex, date of birth, and sex at birth. Newly reported diagnoses of HIV infection during the previous 6 months are compared with the national database, and potential duplicates are provided to each PA in a Routine Interstate Duplicate Review report. As with suspected intrastate pairs, all interstate pairs are researched and updated in the surveillance system to indicate whether they represent the same person or different persons.
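As an illustration of the intrastate matching step, the sketch below builds the match key from the variables named above. It is a minimal Python example with hypothetical field names and a basic soundex implementation, not the CDC's SAS program:

```python
from collections import defaultdict

def soundex(name: str) -> str:
    """Basic American Soundex: first letter plus 3 digits."""
    codes = {}
    for letters, digit in [("bfpv", "1"), ("cgjkqsxz", "2"), ("dt", "3"),
                           ("l", "4"), ("mn", "5"), ("r", "6")]:
        codes.update(dict.fromkeys(letters, digit))
    name = "".join(c for c in name.lower() if c.isalpha())
    if not name:
        return ""
    out, prev = [name[0].upper()], codes.get(name[0], "")
    for ch in name[1:]:
        if ch in "hw":            # h/w do not separate duplicate codes
            continue
        code = codes.get(ch, "")  # vowels map to "" and reset prev
        if code and code != prev:
            out.append(code)
        prev = code
    return ("".join(out) + "000")[:4]

def potential_duplicates(records):
    """Group records on the match key described in the text; groups with
    more than 1 record are candidates for manual review."""
    groups = defaultdict(list)
    for r in records:  # r is a dict with hypothetical field names
        key = (soundex(r["last_name"]), r["dob"], r["sex_at_birth"],
               r["county_at_dx"], r["state_at_dx"])
        groups[key].append(r)
    return [g for g in groups.values() if len(g) > 1]

records = [
    {"last_name": "Smith", "dob": "1980-01-02", "sex_at_birth": "M",
     "county_at_dx": "Fulton", "state_at_dx": "GA"},
    {"last_name": "Smyth", "dob": "1980-01-02", "sex_at_birth": "M",
     "county_at_dx": "Fulton", "state_at_dx": "GA"},
]
print(potential_duplicates(records))  # both records share key ("S530", ...)
```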
Completeness and timeliness of case reporting
Thirty-nine PAs used a 3-source log-linear capture-recapture model11 to estimate the completeness of reporting of persons newly diagnosed with HIV infection in 2011 and reported to the PA surveillance system by the end of 2012. The 3 sources, health care provider, laboratory, and other (eg, other public health databases such as sexually transmitted disease or hepatitis surveillance systems), represent the most common sources from which a diagnosis of HIV infection may be reported to the surveillance system. If "1" represents being reported by a source and "0" otherwise, each new diagnosis in 2011 was classified into 1 of the 7 cells 100, 010, 001, 110, 101, 011, and 111. On the basis of the observed frequency count in each of the 7 cells, the log-linear models estimated the number of new diagnoses in 2011 that were not reported by any of the 3 sources by the end of 2012, f̂000. The estimated completeness of reporting of persons newly diagnosed in 2011 by the end of 2012 was the sum of the 7 frequency counts divided by the sum of the 7 frequency counts plus f̂000. A more detailed description of using the 3-source capture-recapture method to evaluate the completeness of reporting HIV infection diagnoses can be found elsewhere.11 For these PAs, the estimated timeliness of reporting was calculated by dividing the number of HIV diagnoses in 2011 reported within 6 months following diagnosis among those who had been reported by the end of 2012 by the sum of the 7 frequency counts plus f̂000.
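The sketch below is a minimal Python illustration of this calculation with hypothetical cell counts, not the CDC's standardized SAS program (which also considers a set of candidate models, as described elsewhere11). It fits the log-linear model with all two-way interactions by Poisson regression; with 7 parameters, that model fits the 7 observed cells exactly, so the prediction for the unobserved "000" cell is f̂000.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical cell counts; "110" means reported by provider and
# laboratory but not by an "other" source.
cells = {"100": 80, "010": 60, "001": 20,
         "110": 150, "101": 50, "011": 40, "111": 400}

# Design matrix: intercept, 3 main effects, 3 pairwise interactions.
X, y = [], []
for label, count in cells.items():
    a, b, c = (int(d) for d in label)
    X.append([1, a, b, c, a * b, a * c, b * c])
    y.append(count)

fit = sm.GLM(np.asarray(y), np.asarray(X),
             family=sm.families.Poisson()).fit()

# The unobserved "000" cell is the prediction at a = b = c = 0.
f000 = fit.predict(np.array([[1, 0, 0, 0, 0, 0, 0]]))[0]   # 128.0 here
reported = sum(cells.values())                              # 800
completeness = reported / (reported + f000)                 # ~0.862
print(f"estimated f000 = {f000:.1f}; completeness = {completeness:.1%}")

# Timeliness: diagnoses reported within 6 months (hypothetical count)
# divided by the estimated total, ie, reported cases plus f000.
reported_within_6mo = 560
timeliness = reported_within_6mo / (reported + f000)        # ~0.603
print(f"timeliness = {timeliness:.1%}")
```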
For the remaining 20 PAs, an alternative approach for measuring the completeness of reporting was used because, by the end of 2012, these areas had reported fewer than 30 diagnoses of HIV infection in 2011 or at least 1 of the 3 sources had reported less than 20% of the total number of HIV diagnoses in 2011.
The analytic method used examines the year of HIV diagnosis among persons newly reported to the PA from 2008 to 2012 and estimates the probability of being reported within 12 months after the diagnosis year using conditional probabilities estimated from historical data. The method assumes that all diagnoses were reported within 4 years after the diagnosis year. For these PAs, the estimated timeliness was derived as the product of the estimated conditional probability of being reported within 6 months following diagnosis, given that the diagnosis was reported within 1 year after the diagnosis year, and the estimated probability of being reported within 1 year after the diagnosis year.
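A minimal numeric sketch of this calculation, with hypothetical delay counts standing in for the historical 2008–2012 data:

```python
# Hypothetical counts of diagnoses by reporting delay (years between
# the diagnosis year and the year of report); the method assumes no
# case is reported more than 4 years after the diagnosis year.
reported_by_delay = {0: 900, 1: 60, 2: 25, 3: 15}
total = sum(reported_by_delay.values())                       # 1000

# Estimated probability of being reported within 12 months after
# the diagnosis year.
p_within_1yr = reported_by_delay[0] / total                   # 0.90

# Among cases reported within 1 year, the (hypothetical) fraction
# reported within 6 months of diagnosis.
reported_within_6mo = 800
p_6mo_given_1yr = reported_within_6mo / reported_by_delay[0]  # ~0.889

# Estimated timeliness is the product described in the text.
timeliness = p_6mo_given_1yr * p_within_1yr                   # 0.80
print(f"completeness at 12 mo = {p_within_1yr:.0%}; timeliness = {timeliness:.0%}")
```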
With the significance level set at 0.10, a PA was considered as meeting the 85% or more completeness standard if the lower bound of the 90% confidence interval (CI) for completeness was at least 85% and as not meeting the standard if the upper bound of the 90% CI fell below 85%. The PAs meeting neither of these lower- or upper-bound criteria were classified as undetermined. Similarly, a PA was considered as meeting the 66% or more timeliness standard if the lower bound of the 90% CI for timeliness was at least 66% and as not meeting the standard if the upper bound of the 90% CI fell below 66%. The PAs meeting neither of these lower- or upper-bound criteria were classified as undetermined.
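This decision rule is simple enough to state directly in code. The sketch below is an illustrative Python version (not the CDC program); the example CI bounds are hypothetical:

```python
def classify_against_standard(ci_lower: float, ci_upper: float,
                              standard: float) -> str:
    """Classify a PA against a standard using its 90% CI."""
    if ci_lower >= standard:
        return "met"            # entire CI at or above the standard
    if ci_upper < standard:
        return "not met"        # entire CI below the standard
    return "undetermined"       # CI straddles the standard

# Completeness standard (>=85%) and timeliness standard (>=66%):
print(classify_against_standard(0.87, 0.94, 0.85))  # met
print(classify_against_standard(0.78, 0.88, 0.85))  # undetermined
print(classify_against_standard(0.55, 0.64, 0.66))  # not met
```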
Data quality
There are approximately 200 built-in data entry edit checks within eHARS to identify data problems and protect data integrity. To report a person to the NHSS, the CDC requires that (1) the person meet the surveillance case definition of HIV infection,12 (2) no required data element be missing a value, and (3) the case record pass all real-time automated data entry checks in eHARS. For this report, data quality was measured by the percentage of cases newly diagnosed in 2011 that met these criteria by the end of 2012. The PA was considered as having met the data quality outcome standard if the percentage was at least 97%.
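As a minimal illustration (with a hypothetical record structure; the real eHARS edit checks are far more extensive), the 3 criteria and the 97% threshold reduce to:

```python
from dataclasses import dataclass

@dataclass
class CaseRecord:
    """Hypothetical record structure for illustration only."""
    meets_case_definition: bool       # criterion 1
    required_fields_complete: bool    # criterion 2
    passed_all_edit_checks: bool      # criterion 3 (~200 checks in eHARS)

def counts_toward_quality(case: CaseRecord) -> bool:
    # All 3 criteria listed in the text must hold.
    return (case.meets_case_definition
            and case.required_fields_complete
            and case.passed_all_edit_checks)

def data_quality_met(cases: list[CaseRecord], standard: float = 0.97) -> bool:
    """True if the share of passing cases meets the 97% standard."""
    return sum(counts_toward_quality(c) for c in cases) / len(cases) >= standard
```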
Intrastate case duplication rate
Unless indicated by the PA as "Different from" (ie, different persons) in the surveillance system, the CDC considers cases as duplicates if (1) they match on last name soundex, date of birth, sex at birth, and county and state of residence at HIV or AIDS diagnosis or (2) they do not match on the abovementioned variables but the PA had indicated that the cases were the "Same as" (ie, the same person). Duplicate status data (ie, "Different from" or "Same as") reported through the end of 2012 were used to estimate the intrastate case duplication rate among HIV cases reported to the PA through the end of 2011. The estimated intrastate case duplication rate was calculated by dividing the number of pairs of cases that satisfied criterion 1 or 2 by the cumulative number of HIV cases reported through December 31, 2011. The PA was considered to have met the duplication outcome standard if the rate was no more than 5%.
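A minimal sketch of the pair classification and rate calculation, using hypothetical field names and counts rather than the CDC's SAS program:

```python
def is_intrastate_duplicate(keys_match: bool, status: str | None) -> bool:
    """Classify a candidate pair using criteria 1 and 2 from the text.

    status is the PA's resolution ("Same as", "Different from", or None
    if unresolved); keys_match indicates agreement on last name soundex,
    date of birth, sex at birth, and county/state of residence at diagnosis.
    """
    if status == "Different from":
        return False          # PA confirmed different persons
    if status == "Same as":
        return True           # criterion 2
    return keys_match         # criterion 1

def duplication_rate(duplicate_pairs: int, cumulative_cases: int) -> float:
    """Duplicate pairs per cumulative HIV cases reported through 2011."""
    return duplicate_pairs / cumulative_cases

print(duplication_rate(110, 24_000) <= 0.05)  # True: meets the 5% standard
```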
Risk factor ascertainment
Risk factor ascertainment for an HIV case is considered complete if, by 12 months after the report year, there is sufficient risk factor information to classify the case into a known transmission category.5 In general, a female HIV case with no history of injection drug use or any other risk factors for HIV infection, but who had sexual contact with a male whose HIV serostatus and risk factor for HIV infection are unknown, is classified as a case with no identified risk factor in the transmission category hierarchy.5 However, for the purpose of this evaluation, such a case was classified as Female Presumed Heterosexual Contact and regarded as having a known transmission category. Completeness of risk factor ascertainment was measured by the percentage of HIV cases reported in 2011 that had a known transmission category by the end of 2012, and the PA was considered to have met the completeness outcome standard for risk factor ascertainment if the percentage was at least 85%.
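A rough sketch of this evaluation rule, with hypothetical field names; the actual transmission category hierarchy5 is considerably more detailed:

```python
def has_known_transmission_category(case: dict) -> bool:
    """Apply the evaluation rule sketched in the text."""
    if case["transmission_category"] != "no identified risk":
        return True
    # Evaluation-specific reclassification: a female with no other risk
    # factors whose only exposure is sexual contact with a male of
    # unknown serostatus/risk counts as Female Presumed Heterosexual
    # Contact, ie, a known category.
    return (case["sex_at_birth"] == "female"
            and case["heterosexual_contact_with_male"]
            and not case["other_risk_factors"])

case = {"transmission_category": "no identified risk",
        "sex_at_birth": "female",
        "heterosexual_contact_with_male": True,
        "other_risk_factors": False}
print(has_known_transmission_category(case))  # True under the evaluation rule
```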
Initial CD4 and VL reporting
Evaluations of the completeness of reporting of initial CD4 (count or percent) and VL test results were restricted to adults and adolescents (diagnosed at the age of ≥13 years). Only VL tests with a result reported as an absolute number of HIV RNA copies per milliliter of blood, or with an interpretation of below limit, within limit, or above limit, were considered. Moreover, CD4 and VL test results without a specimen collection date were excluded from this analysis. Completeness was measured by the percentage of adults and adolescents newly diagnosed with HIV infection in 2011 who had an initial CD4 or VL test result by the end of 2012. The PA was considered to have met the completeness outcome standard for initial CD4 and initial VL if the percentages were at least 50%.
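A minimal sketch of these inclusion rules, with hypothetical field names and an assumed 92-day window as the "3 months" cutoff (the exact cutoff used by the CDC program is not specified here):

```python
from datetime import date, timedelta

def has_initial_result(diagnosis_date: date, age_at_diagnosis: int,
                       tests: list[dict]) -> bool:
    """True if any valid test falls in the ~3-month window after diagnosis.

    Each test is a dict with hypothetical keys: "specimen_date" (a date,
    or None if missing) and "valid_result" (True for an absolute
    copies/mL value or a below/within/above-limit interpretation).
    """
    if age_at_diagnosis < 13:
        return False                      # evaluation restricted to >=13 y
    window_end = diagnosis_date + timedelta(days=92)   # assumed 3-mo cutoff
    return any(t["specimen_date"] is not None          # undated tests excluded
               and diagnosis_date <= t["specimen_date"] <= window_end
               and t["valid_result"]
               for t in tests)

tests = [{"specimen_date": date(2011, 4, 15), "valid_result": True}]
print(has_initial_result(date(2011, 3, 1), 34, tests))  # True
```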
Data dissemination
As a component of their APR, each PA was required to provide an electronic link to, or a copy of, a surveillance report published in 2012. The mechanism for release of HIV infection information varies and is determined by each PA.
Results
Process standards
Fifty-five of the 59 PAs (93%) reported linking case data to their state vital statistics death certificates during 2012 (Colorado, Idaho, Maine, and Vermont did not meet this standard) (Table 2). Of the 55 PAs that completed a linkage, 2 linked case data to deaths that occurred through 2009, 5 linked deaths through 2010, and the remaining 48 linked deaths that occurred through 2011, with 5 of those beginning to link deaths that occurred in 2012. In addition, 45 PAs (76%) linked case data to the Social Security Death Master File and 35 PAs (59%) to the National Death Index, although the data years linked for these 2 sources are not reported.
TABLE 2.
Number and Percentage of Project Areas by Year of Their Last Vital Statistics Linkage to eHARS
| Project Area | Last Year Vital Statistics Matched to eHARS | No. of Project Areas | % of Project Areas |
|---|---|---|---|
| Florida, Indiana, Kentucky, Missouri, Virgin Islands | 2012^a | 5 | 8.5 |
| Alabama, Alaska, Arizona, Arkansas, Chicago, Connecticut, Delaware, District of Columbia, Georgia, Hawaii, Houston, Illinois, Iowa, Kansas, Louisiana, Maryland, Massachusetts, Minnesota, Mississippi, Montana, Nebraska, Nevada, New Hampshire, New Jersey, New Mexico, New York, New York City, North Carolina, North Dakota, Ohio, Oklahoma, Philadelphia, Rhode Island, South Carolina, South Dakota, Tennessee, Texas, Utah, Virginia, Washington, West Virginia, Wisconsin, Wyoming | 2011 | 43 | 72.9 |
| California, Michigan, Pennsylvania, Puerto Rico, San Francisco | 2010 | 5 | 8.5 |
| Los Angeles County, Oregon | 2009 | 2 | 3.4 |
| Colorado, Idaho, Maine, Vermont | No linkage | 4 | 6.8 |
Abbreviation: eHARS, Enhanced HIV/AIDS Reporting System.
^a State began loading early 2012 vital statistics, but data were not complete.
Monthly intrastate de-duplication was completed by 41 PAs (70%); 16 other PAs (27%) completed intrastate de-duplication at least once per year but not monthly as required (data not shown). Semiannual interstate de-duplication was not conducted by 3 of the separately funded cities (Chicago, Houston, and Philadelphia), as it was completed at the state level. Of the remaining 56 PAs, 35 (63%) resolved all interstate duplicate cases identified by the CDC and 10 (18%) resolved some, but not all, duplicate cases reported in 2011.
Outcome standards
Of the 39 PAs using the 3-source log-linear capture-recapture model for estimating completeness of case reporting, 35 (90%) met the 85% or more standard, 2 (5%) did not meet the standard, and 2 (5%) were undetermined on the basis of wide CIs. Thirty-seven of the 39 PAs (95%) met the 66% or more timeliness standard, and the remaining 2 were undetermined (see Supplemental Digital Content Table 4, available at: http://links.lww.com/JPHMP/A68).
Of the 20 PAs that used the reporting delay model, 14 (70%) met the completeness standard, 2 (10%) fell below the standard, and 4 (20%) were undetermined. Seventeen (85%) met the timeliness standard, 1 (5%) fell below, and 2 (10%) were undetermined.
Forty-four of the 59 PAs (75%) met the 97% data quality standard. Fourteen PAs (24%) achieved 100% compliance, and only 2 PAs fell below 90% (New York excluding New York City, 80%; Maryland, 78%).
Fifty-six PAs reported rates for intrastate duplication, all of which met the standard of 5% or fewer duplicates (range, 0%–1.0%) by December 31, 2012. Three separately funded cities/counties (Chicago, Houston, and Philadelphia) did not report PA-specific percentages, as their duplicate review is conducted and reported at the statewide level (see Supplemental Digital Content Table 4, available at: http://links.lww.com/JPHMP/A68).
Only 24 of the 59 PAs (41%) met the risk factor ascertainment standard requiring 85% or more of the cases reported in 2011 to have a known transmission category by December 31, 2012 (range, 37% in Maryland to 100% in multiple states). Seventy-five percent of PAs achieved 75% or more, and only Maryland and Georgia fell below 60% (37% and 57%, respectively) (Table 3).
TABLE 3.
Local/State HIV Surveillance System Performance on Reporting Risk Factor, CD4 Level, and Viral Load Level: 2011
Columns 2 to 5 assess completeness of risk factor ascertainment; columns 6 to 8 assess completeness of initial CD4 and viral load reporting following HIV disease diagnosis.

| Surveillance Program | HIV Disease Cases Reported^a in 2011, No. | Reported Cases With a Known Transmission Category, % | HIV Disease Cases Diagnosed in 2011, No. | Diagnosed Cases With Complete Risk Factor Information, % | Adults/Adolescents Diagnosed With HIV Infection in 2011, No. | With Initial CD4 (Count or %), % | With Initial Viral Load, % |
|---|---|---|---|---|---|---|---|
| Standard | … | ≥85% | … | … | … | ≥50% | ≥50% |
| Alabama | 778 | 71.0 | 703 | 71.4 | 697 | 59.8 | 62.0 |
| Alaska | 25 | 96.0 | 24 | 95.8 | 24 | 87.5 | 70.8 |
| Arizona | 568 | 91.6 | 576 | 90.5 | 575 | 48.9 | 76.2 |
| Arkansas | 232 | 73.7 | 233 | 73.0 | 232 | 62.9 | 44.0 |
| California (overall)^b | NR | NR | NR | NR | NR | NR | NR |
| Los Angeles County | 2218 | 84.1 | 1960 | 83.1 | 1950 | 75.6 | 77.6 |
| San Francisco | 534 | 95.7 | 421 | 95.3 | 421 | 82.2 | 83.6 |
| California (excluding Los Angeles County and San Francisco) | 3219 | 91.9 | 2575 | 90.6 | 2564 | 70.2 | 65.2 |
| Colorado | 399 | 87.7 | 375 | 88.3 | 368 | 86.7 | 87.8 |
| Connecticut | 383 | 80.9 | 347 | 80.7 | 346 | 63.9 | 84.1 |
| Delaware | 127 | 89.8 | 110 | 94.6 | 110 | 75.5 | 78.2 |
| District of Columbia | 844 | 73.0 | 695 | 74.5 | 693 | 76.2 | 74.5 |
| Florida | 4841 | 93.0 | 4814 | 92.5 | 4800 | 52.6 | 52.9 |
| Georgia | 2303 | 57.2 | 1963 | 55.5 | 1953 | 58.5 | 63.4 |
| Hawaii | 135 | 83.0 | 71 | 84.5 | 71 | 80.3 | 84.5 |
| Idaho | 40 | 80.0 | 40 | 80.0 | 40 | 40.0 | 77.5 |
| Illinois (overall)^b | NR | NR | NR | NR | NR | NR | NR |
| Chicago | 1196 | 89.1 | 998 | 89.8 | 997 | 70.0 | 69.8 |
| Illinois (excluding Chicago) | 864 | 79.6 | 706 | 78.8 | 702 | 54.7 | 60.7 |
| Indiana | 519 | 84.4 | 474 | 84.6 | 474 | 76.8 | 75.7 |
| Iowa | 121 | 90.9 | 120 | 90.8 | 119 | 85.7 | 87.4 |
| Kansas | 138 | 77.5 | 138 | 84.8 | 142 | 83.8 | 83.1 |
| Kentucky | 607 | 73.8 | 320 | 71.9 | 315 | 70.2 | 73.3 |
| Louisiana | 1256 | 77.2 | 1245 | 77.8 | 1241 | 74.0 | 69.9 |
| Maine | 47 | 68.1 | 49 | 69.4 | 49 | 44.9 | 44.9 |
| Maryland | 1928 | 37.2 | 1589 | 35.7 | 1583 | 60.8 | 63.5 |
| Massachusetts | 1740 | 82.3 | 658 | 74.0 | 655 | 73.1 | 69.2 |
| Michigan | 792 | 72.9 | 802 | 73.1 | 793 | 69.5 | 78.4 |
| Minnesota | 323 | 82.4 | 301 | 83.1 | 296 | 78.4 | 78.0 |
| Mississippi | 526 | 76.6 | 549 | 75.6 | 549 | 59.0 | 51.2 |
| Missouri | 520 | 86.0 | 524 | 86.3 | 523 | 59.1 | 66.9 |
| Montana | 29 | 86.2 | 21 | 85.7 | 21 | 90.5 | 85.7 |
| Nebraska | 79 | 73.4 | 78 | 73.1 | 77 | 81.8 | 87.0 |
| Nevada | 371 | 97.3 | 378 | 96.6 | 378 | 86.2 | 89.7 |
| New Hampshire | 42 | 81.0 | 42 | 78.6 | 39 | 76.9 | 87.2 |
| New Jersey | 1440 | 61.7 | 1162 | 60.3 | 1153 | 61.6 | 70.9 |
| New Mexico | 140 | 87.1 | 141 | 88.7 | 140 | 81.4 | 74.3 |
| New York (overall)^b | NR | NR | NR | NR | NR | NR | NR |
| New York City | 3861 | 77.8 | 3138 | 80.0 | 3115 | 80.0 | 79.2 |
| New York (excluding New York City) | 3556 | 75.8 | 1355 | 78.2 | 1349 | 83.8 | 83.6 |
| North Carolina | 1696 | 74.5 | 1504 | 75.7 | 1497 | 54.0 | 68.1 |
| North Dakota | 11 | 100 | 11 | 100 | 11 | 100 | 100 |
| Ohio | 1197 | 71.6 | 1133 | 71.2 | 1130 | 48.3 | 68.5 |
| Oklahoma | 341 | 83.3 | 317 | 83.3 | 314 | 45.9 | 31.8 |
| Oregon | 278 | 89.2 | 239 | 90.0 | 239 | 90.0 | 86.2 |
| Pennsylvania (overall)^b | NR | NR | NR | NR | NR | NR | NR |
| Philadelphia | 701 | 98.1 | 681 | 98.7 | 678 | 74.9 | 78.3 |
| Pennsylvania (excluding Philadelphia) | 797 | 89.7 | 733 | 89.2 | 729 | 60.5 | 78.2 |
| Rhode Island | 162 | 85.8 | 90 | 93.3 | 89 | 71.9 | 67.4 |
| South Carolina | 787 | 83.0 | 764 | 83.6 | 761 | 89.6 | 88.6 |
| South Dakota | 20 | 100 | 21 | 100 | 21 | 76.2 | 85.7 |
| Tennessee | 875 | 74.6 | 857 | 74.8 | 852 | 38.1 | 52.3 |
| Texas (overall)^b | NR | NR | NR | NR | NR | NR | NR |
| Houston | 1380 | 78.8 | 1224 | 78.9 | 1236 | 74.4 | 72.2 |
| Texas (excluding Houston) | 3470 | 81.7 | 3133 | 81.8 | 3122 | 69.8 | 71.6 |
| Utah | 99 | 90.9 | 94 | 90.4 | 94 | 81.9 | 81.9 |
| Vermont | 3 | 100 | 11 | 81.8 | 11 | 63.6 | 90.9 |
| Virginia | 987 | 84.7 | 914 | 84.4 | 911 | 60.5 | 58.3 |
| Washington | 588 | 87.3 | 502 | 86.5 | 496 | 83.3 | 80.6 |
| West Virginia | 107 | 98.1 | 97 | 94.9 | 96 | 52.1 | 75.0 |
| Wisconsin | 260 | 80.8 | 248 | 81.9 | 247 | 79.4 | 84.2 |
| Wyoming | 17 | 64.7 | 15 | 80.0 | 15 | 86.7 | 93.3 |
| Puerto Rico | 825 | 91.0 | 705 | 89.9 | 704 | 50.0 | 33.9 |
| Virgin Islands | 35 | 60.0 | 24 | 58.3 | 24 | 66.7 | 58.3 |
Abbreviations: HIV, human immunodeficiency virus; NR, not reported.
^a Includes cases diagnosed in any year but reported in 2011.
^b Includes cases with unknown city/county of residence at diagnosis.
Fifty-three of the 59 PAs (90%) met the requirement that 50% or more of newly diagnosed cases in 2011, aged 13 years or older, have an initial CD4 count or percentage by December 2012 (range, 38% in Tennessee to 100% in North Dakota). Similarly, 55 of the 59 PAs (93%) met the same standard for reporting of initial VL (range, 32% in Oklahoma to 100% in North Dakota). Finally, 56 of the 59 PAs (95%) produced an annual surveillance report during 2012. Montana, Nebraska, and Vermont did not meet this standard in the required time frame.
In summary, of the 11 standard process and outcome measures evaluated, 4 of the 59 PAs (7%) met all 11 standards, 18 (31%) met 10 standards, 17 (29%) met 9 standards, and the remaining 20 (34%) met 8 standards or fewer. The PAs that met all 11 standards were Florida, Iowa, Missouri, and the city of Philadelphia.
Lessons Learned
We evaluated the performance of the NHSS for the 2011 report year. To our knowledge, the current study is the first comprehensive evaluation of performance standards for HIV surveillance systems.3 Data summarized in this report were provided by PAs using CDC-developed standardized outcome evaluation software and analytic programs.
At 12 months after the end of the diagnosis year (ie, December 31, 2012), 83% of the PAs met the case ascertainment criteria. Three of the 10 PAs that did not meet the case-finding standard, or for which the result was undetermined, also failed to meet 5 of the other outcome standards, suggesting systemic challenges with data collection in those PAs. Another 5 PAs failed 1 or no other measures, suggesting that improvements specific to case finding are needed.
Data quality was measured by ensuring that each HIV case had valid values for all required variables and passed approximately 200 real-time data entry checks. Seventy-five percent of PAs met or exceeded the 97% standard. Assessment of the collectability and usefulness of individual variables, at both the PA and national levels, should be an ongoing activity.
Removal of duplicate records from the surveillance system is critical to establishing accurate national, state, and local estimates of HIV infection and AIDS for prevention planning and resource allocation, including the use of diagnosis data by the Health Resources and Services Administration (HRSA) to equitably allocate federal Ryan White Care Act funding.13,14 A prior assessment of duplicate records in the national surveillance system found that 4.5% of cases were duplicates and recommended that the CDC develop reports to facilitate both intra- and interstate de-duplication efforts15 on a routine basis. The CDC implemented the recommendation, and the current study notes a marked improvement in the identification and resolution of duplicate records. All PAs met the intrastate duplicate record goal of 5% or less, with only 1 PA exceeding 0.9%. This suggests that the intrastate duplication standard could reasonably be lowered to less than 1%. Because PAs do not have access to other PAs' data, a report of potential interstate duplicate records is provided by the CDC semiannually. Processing the duplicates requires telephone contact with other PAs to confirm or reject the records as duplicates; a noted barrier to accomplishing de-duplication is establishing that contact. Facilitating a more efficient communication process could improve the resolution of interstate duplicates.
Risk factor ascertainment was the most difficult measure to meet, with only 41% of PAs achieving the 85% standard (Table 3). A previous review of the APRs submitted by PAs not achieving at least 70% suggested a number of potential strategies for increasing risk factor ascertainment. Among those strategies were (1) improving the percentage of reports that are received electronically, freeing up staff to pursue risk factor data; (2) linking to clinics, community service programs, counseling centers, and other surveillance programs to share data; and (3) assigning local rather than state health department staff to develop provider relationships in their areas and provide training. Some of these strategies are already in place in PAs that met the risk factor ascertainment standard. Sharing strategies between PAs and continued development of electronic data-sharing capabilities may help facilitate data collection in less successful PAs.
The presence of an initial CD4 and VL result exceeded the 50% standard in 86% of the PAs. Of the 8 PAs not meeting either the initial CD4 or the VL standard, 2 failed both measures (Maine and Oklahoma). Maine has had laws or regulations mandating the reporting of all CD4 and VL results since before January 1, 2010, suggesting a systemic barrier to capturing laboratory test results, whereas Oklahoma law required reporting only of CD4 results below 500 (count or percentage) and all VL results. Six of the 8 PAs failed only 1 of the 2 laboratory requirements, perhaps suggesting differences in physician testing patterns or laboratory reporting laws. Idaho reported the largest differential, with 40% of cases having an initial CD4 count or percentage reported and 78% having an initial VL result reported. State regulations in Idaho in 2011 made reportable only CD4 counts below 200 or CD4 percentages below 14%, and only detectable VL results.
Limitations
This report is subject to at least 3 limitations. First, not all PAs submitted their evaluation reports at the required 12-month interval. Forty-six of the 59 PAs (78%) reported their evaluation results as of 12 months after 2011, as required by CDC evaluation standards. Nine PAs (Connecticut, Hawaii, Mississippi, Nebraska, New Hampshire, Oregon, Philadelphia, South Carolina, and San Francisco) were delayed until 13 months, 2 (District of Columbia and South Dakota) until 14 months, 1 (Arkansas) until 15 months, and 1 (North Dakota) until 16 months after 2011, for a combined total of 20 months of additional reporting time. Because of the additional time for data collection, PAs that allowed 13 or more months of data collection may have inflated results compared with those that measured at the required 12-month interval.
Second, the capture-recapture method for evaluating case reporting completeness and timeliness is subject to under- and overestimation due, in part, to dependence between data sources, data accuracy, source completeness, and false-positive or -negative matching of multiple records.11,16 Inaccurate case estimates may result in misallocation of federal funds for prevention and care programs and invalid assessment of the effectiveness of public health or prevention programs.
Third, the completeness of initial CD4 and VL test results is dependent on many factors, including individual state laws that both mandate and prohibit reporting of specific types of test results. As of 2012, more than 20 PAs did not require reporting of all CD4 and VL test results. In addition, physician testing practices and laboratory cooperation may vary, and the availability of electronic laboratory reporting and delays in data entry by PAs may affect the completeness of CD4 and VL reporting outcomes.
Conclusion
These analyses demonstrate the importance of standardized evaluation of the national HIV surveillance system in identifying strengths and opportunities among multiple PAs to target process and system improvements. A large number of PAs exceeded the current performance standards, suggesting that a reevaluation of data standards is warranted. In addition, a high percentage of PAs struggled with 1 or 2 aspects of data collection, suggesting that mentoring among PAs may be beneficial. Other PAs struggled with nearly every measure, indicating the need for highly structured technical assistance. Annual assessment of all 11 criteria in each PA will serve as a guide for ensuring the highest-quality data for HIV infection monitoring and allocation of prevention funding that could be used for development, implementation, and evaluation of relevant interventions and programs.
Supplementary Material
Footnotes
The findings and conclusions in this report are those of the authors and do not necessarily represent the views of the Centers for Disease Control and Prevention.
The authors declare no conflicts of interest.
Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal’s Web site (http://www.JPHMP.com).
REFERENCES
- 1. Centers for Disease Control and Prevention. CDC's vision for public health surveillance in the 21st century. MMWR Surveill Summ. 2012;61(suppl):1–40.
- 2. Centers for Disease Control and Prevention. Updated guidelines for evaluating public health surveillance systems: recommendations from the guidelines working group. MMWR Recomm Rep. 2001;50(RR-13):1–35.
- 3. Centers for Disease Control and Prevention. Guidelines for national human immunodeficiency virus case surveillance, including monitoring for human immunodeficiency virus infection and acquired immunodeficiency syndrome. MMWR Recomm Rep. 1999;48(RR-13):1–36.
- 4. Hall HI, Mokotoff ED. Setting standards and an evaluation framework for human immunodeficiency virus/acquired immunodeficiency syndrome surveillance. J Public Health Manag Pract. 2007;13(5):519–523. doi:10.1097/01.PHH.0000285206.79082.cd.
- 5. Centers for Disease Control and Prevention and Council of State and Territorial Epidemiologists. Technical Guidance for HIV/AIDS Surveillance Programs, Volume I: Policies and Procedures. Atlanta, GA: Centers for Disease Control and Prevention; 2005.
- 6. Council of State and Territorial Epidemiologists. AIDS case reporting: reciprocal notification. CSTE Position Statement 1986-17, extended to 2001-id-04. http://www.cste.org/?page=PositionStatements. Accessed October 24, 2013.
- 7. Council of State and Territorial Epidemiologists. HIV case reporting: reciprocal notification. CSTE Position Statement 2001-id-04. http://www.cste.org/ps/2001/2001-id-04.htm. Accessed October 24, 2013.
- 8. National Technical Information Service. Social Security Administration's Death Master File. Alexandria, VA: US Department of Commerce. http://www.ntis.gov/products/ssadmf.aspx. Accessed October 23, 2012.
- 9. Centers for Disease Control and Prevention. National Death Index. Atlanta, GA: US Department of Health and Human Services; 2012. http://www.cdc.gov/nchs/ndi.htm. Accessed October 23, 2012.
- 10. National Archives and Records Administration. The Soundex Indexing System. Washington, DC: National Archives and Records Administration. http://www.archives.gov/publications/general-info-leaflets/55-census.html. Accessed October 25, 2012.
- 11. Hall HI, Song R, Gerstle JE III, Lee LM. Assessing the completeness of reporting of human immunodeficiency virus diagnoses in 2002–2003: capture-recapture methods. Am J Epidemiol. 2006;164(4):391–397. doi:10.1093/aje/kwj216.
- 12. Centers for Disease Control and Prevention. Revised surveillance case definitions for HIV infection among adults, adolescents, and children aged <18 months and for HIV infection and AIDS among children aged 18 months to <13 years—United States, 2008. MMWR Recomm Rep. 2008;57(RR-10):1–12.
- 13. Health Resources and Services Administration. The Ryan White HIV/AIDS Treatment Modernization Act of 2006. Rockville, MD: US Department of Health and Human Services. http://hab.hrsa.gov/abouthab/modernact2006.html. Accessed October 25, 2012.
- 14. Institute of Medicine. Measuring What Matters: Allocation, Planning, and Quality Assessment for the Ryan White Care Act. Washington, DC: National Academies Press; 2004.
- 15. Glynn MK, Ling Q, Phelps R, Li J, Lee LM. Accurate monitoring of the HIV epidemic in the United States—case duplication in the National HIV/AIDS Surveillance System. J Acquir Immune Defic Syndr. 2008;47:391–393. doi:10.1097/QAI.0b013e318160d52a.
- 16. Hook EB, Regal RR. Capture-recapture methods in epidemiology: methods and limitations. Epidemiol Rev. 1995;17(2):243–264. doi:10.1093/oxfordjournals.epirev.a036192.