Author manuscript; available in PMC 2019 Dec 19.
Published in final edited form as: Cancer Causes Control. 2014 Feb 28;25(5):571–581. doi: 10.1007/s10552-014-0362-x

Population-based surveillance for cervical cancer precursors in three central cancer registries, United States 2009

Elaine W Flagg 1, S Deblina Datta 2, Mona Saraiya 3, Elizabeth R Unger 4, Edward Peters 5, Lauren Cole 6, Vivien W Chen 7, Thomas Tucker 8, Mary Jane Byrne 9, Glenn Copeland 10, Won Silva 11, Meg Watson 12, Hillard Weinstock 13
PMCID: PMC6921482  NIHMSID: NIHMS1062802  PMID: 24578200

Abstract

Purpose

The USA has a well-established network of central cancer registries (CCRs) that collect data using standardized definitions and protocols to provide population-based estimates of cancer incidence. Adding cervical cancer precursors to select CCR operations would facilitate future studies measuring the population-level impact of human papillomavirus (HPV) vaccine. To assess the feasibility of collecting data on cervical cancer precursors, we conducted a multi-site surveillance study in three state-wide CCRs to obtain annual case counts and compare rates of precursor lesions with those for invasive cervical cancer.

Methods

We developed standardized methods for case identification, data collection and transmission, training and quality assurance, while allowing for registry-specific strategies to accomplish surveillance objectives. We then conducted population-based surveillance for precancerous cervical lesions in three states using the protocols.

Results

We identified 5,718 cases of cervical cancer precursors during 2009. Age-adjusted incidence of cervical cancer precursors was 77 (Kentucky), 60 (Michigan), and 54 (Louisiana) per 100,000 women. The highest rates were observed in those aged 20–29 years: 274 (Kentucky), 202 (Michigan), and 196 (Louisiana) per 100,000. The variable with the most missing data was race/ethnicity, which was missing for 13 % of cases in Kentucky, 18 % in Michigan, and 1 % in Louisiana. Overall rates of cervical cancer precursors were more than six times as high as invasive cervical cancer rates [rate ratios: 8.6 (Kentucky), 8.3 (Michigan), and 6.2 (Louisiana)].

Conclusions

Incorporating surveillance of cervical cancer precursors into existing CCR infrastructure is feasible and results in collection of population-based incidence data. Standardized collection of these data in high-quality registry systems will be useful in future activities monitoring the impact of HPV vaccination across states. As a result of this study, ongoing surveillance of these lesions has been conducted in four CCRs since 2010.

Keywords: Cervical intraepithelial neoplasia, Epidemiology, Public health, Population characteristics, Sexually transmitted diseases

Introduction

Since 2006, two human papillomavirus (HPV) vaccines have been commercially available for protection against HPV types 16 and 18, which cause approximately 70 % of cervical cancers worldwide [1, 2]. Several factors complicate efforts to monitor the population impact of HPV vaccine, including multiple clinical outcomes and differing, often extended, times to outcome development [3–7]. Cervical cancer, the most important anogenital outcome of HPV infection, may take several decades to develop [8], but cervical cancer precursors often occur 1–3 years after cervical HPV infection [9–12].

Rates of invasive cervical cancer are estimated through population-based central cancer registries (CCRs). Federally funded CCRs exist in all 50 states and are administered through the Centers for Disease Control and Prevention’s (CDC) National Program of Cancer Registries (NPCR) [13] or the National Cancer Institute’s (NCI) Surveillance, Epidemiology and End Results (SEER) Program [14]. Prior to 1996, cases of in situ cervical carcinoma were routinely reported to CCRs, but collection was discontinued due to concerns about comparability of cases over time because of changes in diagnostic terminology, inconsistencies in case definitions used across registries, and increased diagnosis and treatment of cervical lesions in outpatient settings [15]. Only one state, Michigan, continued to collect data on in situ cervical carcinoma as part of its routine cancer surveillance [16].

The primary objectives of this project were to evaluate the feasibility of population-based surveillance of cervical cancer precursors using the existing CCR infrastructure and to estimate the 2009 annual incidence of these lesions in participating states. A secondary objective was to develop standardized data collection methods that would be easily transferable to a larger group of CCRs.

Methods

This project was conducted by CDC, the Kentucky Cancer Registry (KCR), Louisiana Tumor Registry (LTR), and Michigan Cancer Surveillance Program (MCSP). The KCR and LTR are state-wide registries that participate in both the NPCR and the SEER Program. The MCSP participates in NPCR, conducting cancer surveillance for all areas of Michigan except metropolitan Detroit, for which cancer surveillance is conducted by a SEER registry; for this project, the MCSP collaborated with the Detroit registry to collect state-wide Michigan data.

Case definition

Case eligibility was based on pathology report information; cases identified only by cytology report were excluded. The case inclusion criteria are provided in Table 1. If more than one precancerous cervical lesion was identified for a given patient during a 12-month period, only the earliest occurring lesion was reported (a brief sketch of this rule follows Table 1). Lesions were classified histologically as either squamous or adenocarcinoma in situ (AIS).

Table 1.

Case inclusion criteria, surveillance of cervical cancer precursors, three central cancer registries, United States 2009

| Criterion | Specification |
|---|---|
| Site (ICD-O-3) | C53.0 (endocervix), C53.1 (exocervix), C53.8 (overlapping lesions of cervix uteri), and C53.9 (cervix uteri) |
| Behavior | 2 (in situ or noninvasive) |
| Histology | 8010/2 carcinoma in situ, NOS; 8050/2 papillary carcinoma in situ; 8052/2 papillary squamous cell carcinoma, noninvasive; 8070/2 squamous cell carcinoma in situ, NOS; 8071/2 squamous cell carcinoma, keratinizing, NOS, in situ; 8072/2 squamous cell carcinoma, large cell, non-keratinizing, in situ; 8076/2 squamous cell carcinoma in situ with questionable stromal invasion; 8077/2 squamous intraepithelial neoplasia grade III; 8140/2 adenocarcinoma in situ |
| Pathologic classification | CIN III, CIS, AIS, severe dysplasia |
| Case enrollment period | January 1, 2009–December 31, 2009 |
| Incident case time period | If more than 1 lesion is identified for a patient in a 12-month time period, the earliest lesion is reported |
| Catchment area | Residents of Louisiana, Kentucky, or Michigan at time of diagnosis (includes cases diagnosed out-of-state) |
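The earliest-lesion rule above amounts to a small deduplication step. The following is a minimal sketch in Python, assuming a flat list of candidate records with hypothetical field names; it is illustrative only and does not reproduce any registry's software.

```python
from datetime import date

# Keep only the earliest-dated lesion per patient within the enrollment period,
# per the incident case rule in Table 1. Field names are assumptions.
records = [
    {"patient_id": "A001", "diagnosis_date": date(2009, 3, 14), "histology": "8077"},
    {"patient_id": "A001", "diagnosis_date": date(2009, 9, 2),  "histology": "8070"},
    {"patient_id": "B002", "diagnosis_date": date(2009, 6, 21), "histology": "8140"},
]

def earliest_lesion_per_patient(records):
    """Return one record per patient: the earliest-dated eligible lesion."""
    earliest = {}
    for rec in records:
        pid = rec["patient_id"]
        if pid not in earliest or rec["diagnosis_date"] < earliest[pid]["diagnosis_date"]:
            earliest[pid] = rec
    return list(earliest.values())

print(earliest_lesion_per_patient(records))  # two records: one for A001, one for B002
```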

Case-finding

CCRs have exhaustive case-finding protocols for identification of incident cancer cases that were utilized to ascertain precancerous cervical lesions; sources included free-standing, reference, and hospital pathology laboratories. Case-finding was performed by manual review of pathology reports or an automated search of electronic records using key words or phrases, International Classification of Diseases for Oncology, 3rd edition (ICD-O-3) [17] codes, International Classification of Diseases, 9th revision, Clinical Modification (ICD-9-CM) [18] codes, or Systematized Nomenclature of Medicine (SNOMED) codes [19, 20]. The appropriate ICD-9-CM code was 233.1 [cervical intraepithelial neoplasia grade 3 (CIN3)/carcinoma in situ (CIS)/severe dysplasia]. For pathology laboratories using the older SNOMED Reference Terminology [20], the Legacy Code (M-81402, M-80702, or M-80772) was used; those using the newer SNOMED Clinical Terms used both the Concept ID (51642000, 59529006, or 20365006) and the Legacy Code. Artificial Intelligence in Medicine (AIM) E-Path software [21] was modified to include appropriate search terms and screen pathology report text. Hospital registrars and pathology laboratory personnel collaborated with CCR staff to implement new E-Path installations and continued previous data collection and transmission protocols for a specified time period to ensure data collection was complete, transmission was continual, and any installation errors resulting in missed cases were quickly identified and corrected.
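The code-and-keyword screening described above can be illustrated with a brief hypothetical sketch; the terms and codes are drawn from this section, but the actual AIM E-Path configuration and registry search logic are not reproduced here.

```python
import re

# Illustrative screen of pathology report text and coded fields for the case-finding
# terms and codes listed above; the phrase list and field handling are assumptions.
KEY_PHRASES = [
    r"\bCIN\s*(III|3)\b",
    r"\bcarcinoma in situ\b",
    r"\bsevere dysplasia\b",
    r"\badenocarcinoma in situ\b",
]
KEY_CODES = {"233.1", "M-81402", "M-80702", "M-80772", "51642000", "59529006", "20365006"}

def flag_for_review(report_text: str, coded_fields: set) -> bool:
    """Return True if a report matches any key phrase or any ICD-9-CM/SNOMED code of interest."""
    if coded_fields & KEY_CODES:
        return True
    return any(re.search(p, report_text, flags=re.IGNORECASE) for p in KEY_PHRASES)

print(flag_for_review("Cervix, biopsy: CIN III (severe dysplasia).", set()))  # True
```

Reports flagged in this way would still require review against the case definition in Table 1.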

Cases were submitted to CCRs by pathology laboratories and hospital tumor registries according to established protocols within each state. These included:

  • Electronic reporting by hospital tumor registries with supporting documentation for the primary anatomic site and histology codes.

  • Electronic reporting by pathology laboratories (e.g., complete pathology report or electronic list with final diagnosis in text).

  • Case finding at pathology laboratories by CCR staff.

  • Pathology report submission by hospital medical record department or registry staff.

Cases added to the registry database were checked for duplicate records; all duplicates were removed.

Case follow-back to the original reporting source

Case follow-back [22] (contacting a physician, abstractor, or reporting facility to obtain missing information or resolve inconsistencies) was conducted, as necessary, for the following data items: lesion behavior; histology; diagnosis date; city, state, and county of residence on diagnosis date; birth date; race.

Data abstraction

Once case-finding activities were completed, data elements were abstracted. To maintain consistency with registry operations, these elements, a subset of the standard North American Association of Central Cancer Registries (NAACCR) variables, were coded using NAACCR standards [23]. Only the data elements necessary to identify and characterize cases of cervical cancer precursors were collected, in order to minimize burden on tumor registry and CCR staff. These data elements, with the rationale for including each, are listed in Table 2.

Table 2.

Data elements collected and edit checks performed, three central cancer registries, United States 2009

| Variable (NAACCR item #) | Rationale | Edit description | Note |
|---|---|---|---|
| Patient ID (20) | Once the data have been de-identified, this allows the registries to refer back to their identifiable dataset if necessary | Field is numeric, greater than zero, right-justified, and zero-padded on the left. Field cannot be empty | Same as corresponding NAACCR edit |
| CIN sequence number | | Field is numeric, greater than zero, right-justified, and zero-padded on the left. Field cannot be empty. 01 for first reportable diagnosis, 02 for second, etc. | New |
| Registry ID (40) | Registry ID facilitates analyses by each CCR | | |
| Patient first name^a (2240) | Patient name allows the CCRs to find situations in which one diagnosis is reported by multiple facilities or a woman had multiple tests during follow-up | First name may not be blank. Must be alpha, left-justified, and blank-filled. Mixed case is allowed. Embedded spaces are not allowed. Special characters are not allowed | Same as corresponding NAACCR edit |
| Patient last name^a (2230) | | Last name may not be blank. Must be alpha, left-justified, and blank-filled. Mixed case is allowed. Embedded spaces are not allowed. Embedded hyphen is allowed. No other special characters are allowed | Same as corresponding NAACCR edit |
| Street address at diagnosis^a (2330) | | Item may not be blank. Must be alphanumeric, left-justified, and blank-filled. Mixed case is allowed. Embedded spaces are allowed. Special characters are limited to periods, slashes, hyphens, and pound signs | Same as corresponding NAACCR edit |
| City at diagnosis^a (70) | | Item may not be blank. Must be alpha, left-justified, and blank-filled. Mixed case is allowed, but uppercase is preferred. Embedded spaces are allowed, but no more than one consecutive embedded space is allowed. Special characters are not allowed | Same as corresponding NAACCR edit |
| State at diagnosis (80) | This variable is used to exclude out-of-state residents who were treated by an in-state facility | Field must contain a valid US postal code for a state or Canadian province | Same as corresponding NAACCR edit |
| County at diagnosis (90) | If this information is available, county is determined directly, rather than derived from street address, city, and state data | County at diagnosis must be a three-digit number | Same as corresponding NAACCR edit |
| Original reporting facility number | Unique facility ID for the originally abstracted data. Assigned by the CCR and not traceable to the facility | Must be numeric, right-justified, zero-filled | Same as corresponding NPCR edit |
| Follow-up reporting facility number | Unique facility ID for the follow-back data. Assigned by the CCR and not traceable to the facility | Must be numeric, right-justified, zero-filled, or blank if no follow-up facility available | New |
| Type of reporting source (500) | This refers to the original reporting facility and is used to compare data completeness by the source of data | Must be a valid type of reporting source code (1–8) | Same as corresponding SEER edit |
| Sex (220) | This variable is used by the CCRs as a quality control check | Cannot be blank and must equal 2 (female) | Revised |
| Date of birth^a (240) | Date of birth is used by the CCRs along with date of diagnosis to calculate age at diagnosis | Must contain a valid date or 99999999 | Same as corresponding NAACCR edit |
| Date of diagnosis (390) | Date of biopsy. Slightly different from NAACCR standard | Must contain a valid date or 99999999 | Revised |
| Age at diagnosis (230) | Calculated from date of birth and date of diagnosis | Must be a valid value for age at diagnosis (000, …, 120, 999) | Same as corresponding SEER edit |
| Race 1 (160) | The race variable allows analyses stratified by race | Must be a valid Race 1 code (01–14, 20–22, 25–28, 30–32, 96–99) | Same as corresponding SEER edit |
| Spanish origin 1 (190) | Hispanic ethnicity as determined by information present in the medical record | Must be a valid Spanish/Hispanic Origin code (0–9) | Same as corresponding SEER edit |
| Spanish origin 2 (191) | Hispanic ethnicity as determined by computer algorithm using surname | Must be a valid NAACCR Hispanic Identification Algorithm Derived Hispanic Origin code (0–8) or blank (if algorithm has not been run) | Same as corresponding NAACCR edit |
| Site (400) | For this project, the site is the cervix | Must be one of C53.0, C53.1, C53.8, or C53.9 | Revised |
| Behavior | 2 = in situ or noninvasive | Must be 2 | Revised |
| Histology (522) | Codes 8010, 8050, 8052, 8070, 8071, 8072, 8076, 8077, and 8140 are reportable. All others must undergo review | Must be one of 8010, 8050, 8052, 8070, 8071, 8072, 8076, 8077, or 8140. Other histology codes are acceptable only after careful review of pathology report | Revised |
| Histology terminology code | 1 = AIS, 2 = CIN III, 3 = CIS, 4 = severe dysplasia (prioritized, only 1 code, CIN III > CIS > severe dysplasia) | Must be one of 1 (AIS), 2 (CIN III), 3 (CIS), or 4 (severe dysplasia). Prioritized if more than one noted, CIN III > CIS > severe dysplasia | New data element |
| General comments | Free field to enter additional information about the case | | |
^a Fields not included in the analytic dataset sent to the Centers for Disease Control and Prevention

Quality control and feasibility assessments

CCRs were responsible for ensuring all eligible cervical cancer precursors were identified, using existing NPCR and SEER quality control protocols. Case-finding audits (systematic review of a sample of cases reported by all sources during a specified time period, coupled with verification of cases using a different case-finding methodology) were conducted by staff of each registry. Results of these case-finding audits were included in each registry’s annual project report.

Additional quality control was conducted by performing checks for duplicate cases, and variable-specific checks were used to ensure data values were not missing or invalid. These variable-specific checks are briefly described in Table 2. CCRs were encouraged to modify standard registry programs such as GenEdits Plus [24] for this purpose. Feasibility of implementing this activity at CCRs was assessed by examining the numbers and qualifications of personnel needed, start-up and ongoing costs, and other factors associated with project management, data collection, and quality control activities, including implementing changes to existing text-screening software, reviewing pathology reports, and conducting follow-up activities.
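To make the variable-specific checks in Table 2 concrete, the following is a minimal sketch of that style of edit check; the registries used standard tools such as GenEdits Plus rather than custom code, and the record layout and field names below are assumptions.

```python
# Minimal sketch of variable-specific edit checks of the type summarized in Table 2.
# Field names and value formats are assumptions, not the NAACCR record layout.
VALID_SITES = {"C530", "C531", "C538", "C539"}
REPORTABLE_HISTOLOGIES = {"8010", "8050", "8052", "8070", "8071", "8072", "8076", "8077", "8140"}

def edit_check(case: dict) -> list:
    """Return a list of edit failures for a single abstracted case."""
    failures = []
    if case.get("sex") != "2":
        failures.append("Sex must be 2 (female)")
    if case.get("behavior") != "2":
        failures.append("Behavior must be 2 (in situ or noninvasive)")
    if case.get("site") not in VALID_SITES:
        failures.append("Site must be C53.0, C53.1, C53.8, or C53.9")
    if case.get("histology") not in REPORTABLE_HISTOLOGIES:
        failures.append("Histology code requires pathology report review")
    return failures

print(edit_check({"sex": "2", "behavior": "2", "site": "C530", "histology": "8077"}))  # []
```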

Population denominators

State-specific female populations for each age and race/ethnicity category were derived using vintage 2009 bridged race, Hispanic origin, and gender-specific postcensal estimates of the resident population of the USA for July 1, 2009, by year, county, and single year of age (0, 1, 2, …, 85 years and over) obtained from the National Center for Health Statistics, prepared in collaboration with the US Census Bureau [25].
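A hypothetical sketch of how such denominators might be assembled is shown below, collapsing single-year-of-age estimates into the age groups used for rate calculation; the column names are assumptions about the source file layout, not the NCHS file specification.

```python
import pandas as pd

# Collapse single-year-of-age, bridged-race postcensal estimates into the age groups
# used in this analysis (0-14, 5-year groups, 65+). Column names are assumptions.
def age_group(age: int) -> str:
    if age <= 14:
        return "0-14"
    if age >= 65:
        return "65+"
    low = (age // 5) * 5
    return f"{low}-{low + 4}"

def build_denominators(pop: pd.DataFrame) -> pd.DataFrame:
    """Sum female population counts by state, race/ethnicity, and age group."""
    pop = pop.assign(age_group=pop["age"].apply(age_group))
    return (pop.groupby(["state", "race_ethnicity", "age_group"], as_index=False)["population"]
               .sum())
```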

Analysis variables

Cases were classified by histology (squamous or glandular lesions), age (5-year categories, except 0–14 and 65+ years), and race/ethnicity [Hispanic (of all races), non-Hispanic white, non-Hispanic black, other (Asian, Pacific Islander, American Indian, or Alaska Native), unknown]. Hispanic ethnicity was assigned using the NAACCR Hispanic Identification Algorithm version 2 [26]. To estimate incidence rates, we allocated cases with missing race across the four race/ethnicity categories based on the state-specific proportions of cases with known race/ethnicity in each category. Age-adjusted rates were calculated using standard SEER procedures and the year 2000 US standard population in 5-year age groups [27]. Invasive cervical cancer case counts and rates for year 2009 were derived from NPCR data by CDC staff. Invasive squamous lesions were those with ICD-O-3 histology codes 8050–8084 and 8120–8131; invasive adenocarcinomas had histology codes 8015, 8140–8149, 8160–8162, 8190–8221, 8260–8337, 8350–8551, 8560, 8570–8576, and 8940–8941.
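The two calculation steps described above, proportional reallocation of cases with unknown race/ethnicity and direct age adjustment, are sketched below with hypothetical inputs; the year 2000 US standard population weights must be supplied from the SEER tables cited, and this is not the registries' analysis code.

```python
# Sketch of the reallocation and direct age-adjustment steps described above.
# Inputs are hypothetical; standard population weights come from the SEER tables cited.

def reallocate_unknown(counts_by_race: dict, unknown: int) -> dict:
    """Distribute cases with unknown race/ethnicity proportionally to the known categories."""
    total_known = sum(counts_by_race.values())
    return {race: n + unknown * n / total_known for race, n in counts_by_race.items()}

def age_adjusted_rate(cases: dict, population: dict, std_pop: dict) -> float:
    """Direct age adjustment: weight age-specific rates per 100,000 by the standard population."""
    std_total = sum(std_pop.values())
    return sum((cases[age] / population[age]) * 100_000 * std_pop[age] / std_total
               for age in cases)
```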

Results

Case counts and incidence estimates

We identified a total of 5,718 cases: 1,639 (29 %) from Kentucky, 2,834 (49 %) from Michigan, and 1,245 (22 %) from Louisiana (Table 3). Ninety-seven percent of the lesions were of squamous cell origin. Overall, the reported histology of cases was 75 % CIN3, 10 % CIS, 12 % severe dysplasia, and 3 % AIS.

Table 3.

Incidence of cervical cancer precursors per 100,000 female population by age category and race/ethnicity, three central cancer registries, United States 2009

| | Kentucky cases | Kentucky incidence rate^a | Louisiana cases | Louisiana incidence rate^a | Michigan cases | Michigan incidence rate^a |
|---|---|---|---|---|---|---|
| Age (years) | | | | | | |
| 0–14 | 0 | 0.0 | 0 | 0.0 | 0 | 0.0 |
| 15–19 | 84 | 58.8 | 53 | 32.8 | 115 | 31.3 |
| 20–24 | 417 | 284.6 | 305 | 177.1 | 647 | 185.9 |
| 25–29 | 396 | 262.7 | 370 | 215.5 | 688 | 220.7 |
| 30–34 | 286 | 207.0 | 182 | 125.9 | 485 | 166.2 |
| 35–39 | 169 | 116.7 | 126 | 89.1 | 319 | 97.1 |
| 40–44 | 117 | 78.5 | 85 | 57.3 | 224 | 64.2 |
| 45–49 | 71 | 43.0 | 48 | 28.4 | 139 | 35.5 |
| 50–54 | 35 | 21.6 | 29 | 17.5 | 79 | 20.1 |
| 55–59 | 26 | 18.0 | 16 | 10.9 | 53 | 15.4 |
| 60–64 | 18 | 14.6 | 12 | 10.0 | 40 | 14.1 |
| 65+ | 20 | 6.0 | 19 | 5.8 | 45 | 5.8 |
| Age-adjusted^b | | 77.3 | | 53.7 | | 59.6 |
| Race/ethnicity^c | | | | | | |
| Hispanic | 58 | 109.6 | 33 | 42.4 | 132 | 64.0 |
| Hispanic, age-adjusted^b | | 108.6 | | 40.6 | | 63.4 |
| Non-Hispanic White | 1,480 | 76.1 | 779 | 54.7 | 2,149 | 54.1 |
| Non-Hispanic White, age-adjusted^b | | 80.5 | | 57.8 | | 60.8 |
| Non-Hispanic Black | 97 | 53.5 | 414 | 54.0 | 514 | 66.6 |
| Non-Hispanic Black, age-adjusted^b | | 55.0 | | 51.8 | | 67.9 |
| Non-Hispanic Other^d | 5 | 10.6 | 18 | 28.5 | 39 | 19.5 |
| Non-Hispanic Other, age-adjusted^b | | 10.6 | | 26.6 | | 17.9 |
^a Rates are per 100,000 female population using denominators from: National Center for Health Statistics. Postcensal estimates of the resident population of the United States for July 1, 2000–July 1, 2009, by year, county, single year of age (0, 1, 2, …, 85 years and over), bridged race, Hispanic origin, and sex (Vintage 2009). Prepared under a collaborative arrangement with the US Census Bureau. Available from: http://www.cdc.gov/nchs/nvss/bridged_race.htm. Accessed 04/23/2010
^b Rates are age-adjusted to the year 2000 US Standard Population in 5-year age groups (19 age groups—Census P25–1130) standard. Available from: http://seer.cancer.gov/stdpopulations/. Accessed 05/13/2013
^c Cases with unknown/missing race/ethnicity were reallocated to non-missing categories based on the distribution of cases with known race/ethnicity within each state
^d Includes cases of non-Hispanic American Indian, Alaska Native, Asian, and Pacific Islander race/ethnicity

The highest incidence was observed among women aged 20–29 years, with slight variation by state: 274 per 100,000 in Kentucky, 202 in Michigan, and 196 in Louisiana. Age-adjusted incidence of cervical cancer precursors was highest in Kentucky at 77 per 100,000, followed by 60 per 100,000 in Michigan and 54 per 100,000 in Louisiana. For each state, age-adjusted incidence for cervical cancer precursors was 6–8 times higher than the corresponding 2009 invasive cervical cancer rate [28] (Table 4). Rate ratios of precancerous to invasive lesions were 8.6 for Kentucky, 8.3 for Michigan, and 6.2 for Louisiana. Rate ratios for squamous histologies were similar in Kentucky and Michigan (12.5 and 12.4, respectively), but were lower for Louisiana (7.4); rates of precancerous and invasive lesions were similar for AIS histologies.
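As an illustration of how these rate ratios follow from the age-adjusted rates in Table 4, the Kentucky total is obtained as:

rate ratio = 77.3 per 100,000 (precursors) / 9.0 per 100,000 (invasive) ≈ 8.6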

Table 4.

Incidence of cervical cancer precursors and invasive cervical cancer per 100,000 female population by lesion type, three central cancer registries, United States 2009

| State | Lesion type | Precancerous cervical lesion cases | Precancerous incidence rate^b | Invasive cervical cancer cases^a | Invasive incidence rate^b | Rate ratio |
|---|---|---|---|---|---|---|
| Kentucky | Squamous | 1,586^c | 74.7 | 135^e | 6.0 | 12.5 |
| Kentucky | Adenocarcinoma in situ | 53^d | 2.6 | 57^f | 2.6 | 1.0 |
| Kentucky | Total | 1,639 | 77.3 | 202 | 9.0 | 8.6 |
| Louisiana | Squamous | 1,223^c | 52.8 | 160^e | 7.1 | 7.4 |
| Louisiana | Adenocarcinoma in situ | 22^d | 0.9 | 30^f | 1.3 | 0.7 |
| Louisiana | Total | 1,245 | 53.7 | 198 | 8.7 | 6.2 |
| Michigan | Squamous | 2,716^c | 57.1 | 235^e | 4.6 | 12.4 |
| Michigan | Adenocarcinoma in situ | 118^d | 2.6 | 140^f | 2.1 | 1.2 |
| Michigan | Total | 2,834 | 59.6 | 364 | 7.2 | 8.3 |
^a Data are from the National Program of Cancer Registries and meet United States Cancer Statistics publication criteria for 2009
^b Rates are per 100,000 female population and are age-adjusted to the year 2000 US standard population in 5-year age groups (19 age groups—Census P25–1130) standard. Available from: http://seer.cancer.gov/stdpopulations/. Accessed 05/13/2013
^c Defined by ICD-O-3 histology: 8010, 8050, 8052, 8070–8072, 8076–8077
^d Defined by ICD-O-3 histology: 8140
^e Defined by ICD-O-3 histology: 8050–8084, 8120–8131
^f Defined by ICD-O-3 histology: 8015, 8140–8149, 8160–8162, 8190–8221, 8260–8337, 8350–8551, 8560, 8570–8576, 8940–8941

The extent of missing values for race/ethnicity varied by state: 1 % of cases in Louisiana, 13 % in Kentucky, and 18 % in Michigan. Age-adjusted incidence in Kentucky was highest among Hispanics (109 per 100,000) (Table 3). Age-adjusted incidence in Louisiana was higher among non-Hispanic whites and blacks (58 and 52 per 100,000, respectively) than Hispanics (41). In Michigan, age-adjusted incidence was similar for Hispanics (63 per 100,000), non-Hispanic whites (61), and non-Hispanic blacks (68). In all three states, age-adjusted incidence was lowest among those of other non-Hispanic race.

Quality control and feasibility assessments

Audits included examination of pathology reports to determine whether the neoplasia classification system used resulted in missed cases of pre-invasive cervical lesions. With few exceptions, when cytology terminology [29, 30] was used, it was in combination with histology terminology, allowing for determination of case eligibility according to the established protocol. In Kentucky, 135 (8 %) originally missed cases were subsequently identified based on audit activities; in Louisiana, 92 (7 %) missed cases were identified via audit. In both states, the majority of the missed cases were identified through pathology reports that had not been transmitted electronically. In Michigan, no missed cases were identified based on an audit of reports from 10 randomly selected facilities; however, 5 of the 274 cases (2 %) from these facilities were misreported as having a cervical cancer precursor lesion when they actually had invasive cervical cancer.

In Kentucky, 100 % of cases were reported by pathology laboratories (hospital-based or free-standing). In Louisiana, 64 % of cases were reported by hospital registrars, 33 % by pathology laboratories, and 3 % by physicians. In Michigan, 44 % of cases were reported by hospital registrars, 7 % by laboratories, and 48 % by physicians. Electronic reporting was used for the majority of cases in Kentucky (79 %) and Louisiana (67 %); fewer cases in Michigan were reported electronically (24 %). Electronic reporting reduced reporting burden on facility staff; however, adoption of electronic reporting by facilities during the study necessitated facility-specific interactions to identify and resolve implementation problems. Audits of electronic reporting helped identify and address software programming errors during a 6-month project pilot period (July–December 2008). During this period, 135 (8 %) cases in Kentucky were identified as missed by audit; most (81 %) were missed because of logic errors in the case-finding software. In Louisiana, most (82 %) eligible cases from a large electronic reporting facility were not initially identified by the case-finding software because terms were omitted from the software case selection criteria; this was corrected in early 2009. Although electronic reports were delivered in a timely fashion, these data were frequently incomplete, necessitating case follow-back to the original reporting source and increasing operational burden.

In Michigan, where a majority of cases were identified using manual pathology report review by hospital tumor registry or CCR staff, specialized training was implemented to familiarize abstractors with the case definition and reporting procedures. A presentation was given at the Michigan Cancer Registrar’s Association Annual Educational Conference in October 2008, and two web-based presentations were conducted in early 2009. A series of communications describing the project and implementation protocol also were developed and distributed via the MCSP and Detroit registry staff newsletters and e-mails to all licensed Michigan pathology facilities.

To re-establish population-based surveillance for cervical cancer precursors in Kentucky and Louisiana, an initial development period of 6–12 months was necessary to obtain needed approvals (e.g., Institutional Review Board approvals and contracts), conduct staff training and facility outreach activities, update existing reporting and data management systems, and establish data collection and processing procedures. Facilities that diagnose cervical cancer precursors but do not routinely diagnose other reportable lesions, such as non-hospital laboratories, also needed to be identified, and outreach activities conducted to facilitate initiation of routine reporting. Each registry needed 1–1.5 full-time personnel to conduct ongoing project management, data collection, and quality control activities, including implementing changes to existing text-screening software, reviewing pathology reports, and conducting follow-up activities. Estimated start-up costs ranged from $80,000 to $100,000 for each registry.

Discussion

This is the first report of population-based cervical cancer precursor incidence from multiple states since surveillance of in situ cervical carcinoma was discontinued in NPCR/SEER in 1996 [15]. Since then, changes in diagnostic terminology, increased diagnosis and treatment of cervical lesions in outpatient settings, HPV vaccination, and the implementation of electronic pathology reporting warrant an evaluation of the feasibility of re-instituting surveillance for cervical cancer precursors in CCRs. Only one state, Michigan, continued to collect in situ cervical carcinoma cases as part of its routine cancer surveillance [16]. We observed an age-adjusted cervical cancer precursor incidence of 59.6 per 100,000 for Michigan (Table 3), which was similar to the incidence of 59.2 reported by Michigan for CIS and CIN3 in 2003 [16].

Cervical cancer precursors are primarily detected through exfoliative cervical cytology (the Papanicolaou or Pap test), but are diagnosed by histologic evaluation of tissue samples. US nomenclature for cervical cytology and histology has changed over time and continues to evolve. The Bethesda system [29, 30], currently used for cervical cytology, categorizes most squamous lesions as either low- or high-grade squamous intraepithelial lesions (LSIL and HSIL, respectively) and glandular lesions as AIS. Histology terminology for squamous lesions originally used dysplasia gradations (mild, moderate, severe) with a separate category for CIS, and subsequently shifted to CIN grades 1, 2, and 3; CIN3 includes CIS [31]. Glandular lesions are typically diagnosed histologically as AIS. CIN3 lesions and CIS are consistently recognized by pathologists as true cancer precursors because they rarely regress and have a high potential for progression to invasion [32]; ≥31 % of women with CIN3 lesions who received minimal or no treatment developed invasive cervical cancer within 30 years of their CIN3 diagnosis [33]. In contrast, CIN2 diagnoses have poor reproducibility [34, 35] and are significantly more likely to regress than CIN3 [36]. Although it was once thought that CIN grades represented a progressive continuum of morphologic changes, the concept that most CIN1 lesions are self-limited and represent transient HPV infection, while CIN3 lesions represent disease with invasive potential, has gained increasing credence [12, 37]. According to this view, CIN2 is an equivocal diagnosis reflecting uncertainty regarding whether the lesion should appropriately be classified as CIN1 or CIN3.

These considerations led to the Lower Anogenital Squamous Terminology (LAST) [31, 38] for HPV-associated lesions, which was proposed after this study was implemented. LAST is a two-tiered pathology terminology for anogenital squamous lesions, specifically LSIL and HSIL; HSIL may be further classified using appropriate CIN terminology, depending on the use of biologic markers such as p16 [31]. p16 immunohistochemistry is currently recommended to differentiate low-grade lesions from the cervical cancer precursors that are the focus of this project. Surveillance of in situ cervical carcinoma was previously discontinued in most CCRs in part because of changes in diagnostic terminology and concerns regarding use of the two-tiered Bethesda reporting system for histopathology diagnoses [15]; implementation of LAST terminology will need to be closely monitored over time to determine its potential impact on case counts of cervical cancer precursors. Maintaining the ability to efficiently exclude non-precursor lesions under LAST will depend on the availability of sufficient additional information in the pathology report (e.g., dysplasia or CIN terminology) or on consistent use and reporting of appropriate biologic markers.

For this study, the Bethesda terminology, when used, was almost always in combination with histology terminology. However, this project was limited to cases with lesions conclusively diagnosed as CIN3, CIS, AIS, or severe dysplasia. Although diagnoses such as “CIN2 and CIN3” were eligible for inclusion, other combinations of these terms such as “CIN2/3” and “CIN2–3” were considered ambiguous, most likely indicating a lesion that was not clearly CIN2 or CIN3. We were not able to rule out the possibility that combined diagnoses other than “CIN2 and CIN3” might have indicated lesions with invasive potential, resulting in potentially missed cases.

We found a slightly higher ratio of cervical precursor incidence relative to invasive cervical cancer for Kentucky, compared to Louisiana and Michigan. Proportionally more electronic reporting in Kentucky might have contributed, in part, to this observation; cases were reported sooner via electronic reporting, but we did not assess the impact of such reporting on case-finding. Other factors that might contribute to differences in ratios of pre-invasive to invasive cervical lesions across states include differences in population-specific demographic factors, regional or state-specific diagnostic coding by pathologists, and demographic or geographic variations in cervical cancer screening. In general, higher screening rates, followed by appropriate diagnostic procedures, can result in more precursor lesion detection and lower incidence of invasive cancer. Assessing or reviewing cervical cancer screening practices will be important [39] when trends in cervical cancer precursors are used to evaluate the impact of HPV vaccine. Recent guidelines encourage less cervical cancer screening among women under age 21 [40–42], so a decrease in rates of cervical cancer precursors may reflect reductions in screening and detection, rather than actual decreases due to HPV vaccination. Although population-based registries of cervical cancer screening do not exist in the USA, self-reported data collected by the Behavioral Risk Factor Surveillance System (BRFSS) [43] can be used to monitor population trends in screening. BRFSS data show slight declines in percentages of adult women receiving a Pap test from 2004 to 2010 for Kentucky (85.0 % in 2004 to 80.9 % in 2010) and Michigan (86.5 % to 82.4 %). Screening in Louisiana varied during this period: 85.2 % (2004), 76.7 % (2008), and 83.1 % (2010) [44].

Surveillance methodologies developed for this project are readily transferable to other CCRs. Adequate financial resources are needed to conduct surveillance of cervical cancer precursors; however, once this data collection activity becomes routine, annual costs may decrease somewhat due to fewer resources spent on training and outreach to facilities. Estimated start-up expenses may increase a registry’s costs by approximately 10 %, based on median annual registry operational costs of $900,000 [45]. Additional study is needed to estimate the excess cost of implementing surveillance for precancerous cervical lesions in CCRs over time. In 2010, the Los Angeles Cancer Surveillance Program, the CCR for Los Angeles County, California, began surveillance of cervical cancer precursors using the project protocol. Implementation of this protocol in other CCRs in the United States with relatively high burdens of invasive cervical cancers, particularly those with racially and/or ethnically diverse populations, would result in the development of a valuable sentinel surveillance system for monitoring the impact of HPV vaccine on rates of cervical cancer precursors.

A potential limitation for future activities aimed at increasing the number of CCRs conducting surveillance for cervical cancer precursors may be the legislative environment in a given catchment area. State laws and legislative rules should be reviewed prior to surveillance initiation, and appropriate actions can be taken to modify the legislative language if it exempts collection of cervical cancer precursors from routine cancer reporting requirements. In Kentucky, facilities were mandated to report any lesions specified by the KCR, so reporting regulations did not need to be changed; other jurisdictions may have similarly worded reporting requirements, which would be much easier to amend than laws. To re-establish surveillance of cervical cancer precursors in Louisiana, the LTR worked with the state legislature to change the reporting regulations to include collection of pre-cancerous lesions specified by the LTR; these changes became effective in 2010.

Another limitation of this project was incomplete race/ethnicity data. These data are likely to be incomplete in electronically transmitted pathology reports, which typically do not include demographic information that might be available in medical records. Cases with missing race/ethnicity data were reallocated to non-missing categories based on state-specific proportions of cases with known race/ethnicity. Non-Hispanics of other races, who made up less than 5 % of the population in each state, may have been somewhat under-represented by this reallocation. In addition, Hispanic population data, which may differ across states, may underestimate the true population because undocumented workers may be undercounted, artificially inflating incidence estimates for Hispanics. Use of multiple imputation [46] may be an appropriate alternative method to estimate state-specific race/ethnicity distributions for future reporting. Imputation of race/ethnicity data is not typically needed for invasive cancer cases because most CCRs have missing race values for less than 5 % of these cases (a requirement for receiving NAACCR certification). For our study of cervical cancer precursors, only Louisiana used the same procedures to identify race as were used for invasive cancers. These included use of a commercially available online record locator service and contacting facilities to obtain patient race information. This may explain why Louisiana had a very low percentage of cases with missing race.

Population-based surveillance of CIN2, CIN3, and AIS is also being conducted in five geographic city- or county-specific areas throughout the USA by CDC’s HPV-IMPACT project [47]. Although the total female population included in HPV-IMPACT is much smaller (approximately 15 % of the adult female population covered by this project), information on screening history and HPV vaccination is also being collected. Another project, the New Mexico HPV Pap Registry [48], collects state-wide data on Pap test cytology, cervical pathology, and HPV test results for individuals living in New Mexico, which, in conjunction with cervical screening histories, will be used to monitor the effectiveness of HPV vaccine in this population. In-depth investigations similar to these can be readily conducted using the CCR infrastructure developed for this project; special studies addressing topical issues in cancer prevention and control are well supported by SEER and NPCR [49, 50]. A feasibility study has been conducted that linked the Michigan cervical cancer precursor cases to the state’s immunization information system; this methodology will be useful in examining the effect of HPV vaccine on trends in these lesions [51]. Population-based studies examining factors associated with distributions of HPV genotypes among cases of precancerous cervical lesions also can be conducted using this project’s infrastructure [52, 53].

In summary, this project established the feasibility of routine surveillance of cervical cancer precursors using the existing CCR infrastructure for collecting high-quality population-based data. Standardized collection of these data can be used in future projects and analyses to monitor the impact of HPV vaccination across states.

Acknowledgments

Funding for this work was provided by the Centers for Disease Control and Prevention (Contract No. 200-2008-27956).

Footnotes

Conflict of interest The authors declare that they have no conflict of interest.

Contributor Information

Elaine W. Flagg, Division STD Prevention, National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention, Centers for Disease Control and Prevention, 1600 Clifton Road, NE, MS E-02, Atlanta, GA 30333, USA

S. Deblina Datta, Division STD Prevention, National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention, Centers for Disease Control and Prevention, 1600 Clifton Road, NE, MS E-02, Atlanta, GA 30333, USA.

Mona Saraiya, Division of Cancer Prevention and Control, National Center for Chronic Disease Prevention and Health Promotion, Centers for Disease Control and Prevention, Atlanta, GA, USA.

Elizabeth R. Unger, Division of High-Consequence Pathogens and Pathology, National Center for Emerging and Zoonotic Infectious Diseases, Centers for Disease Control and Prevention, Atlanta, GA, USA

Edward Peters, Louisiana Tumor Registry, Epidemiology Program, School of Public Health, Louisiana State University Health Sciences Center, New Orleans, LA, USA.

Lauren Cole, Louisiana Tumor Registry, Epidemiology Program, School of Public Health, Louisiana State University Health Sciences Center, New Orleans, LA, USA.

Vivien W. Chen, Louisiana Tumor Registry, Epidemiology Program, School of Public Health, Louisiana State University Health Sciences Center, New Orleans, LA, USA

Thomas Tucker, Kentucky Cancer Registry, Markey Cancer Control Program, Markey Cancer Center, University of Kentucky, Lexington, KY, USA.

Mary Jane Byrne, Kentucky Cancer Registry, Markey Cancer Control Program, Markey Cancer Center, University of Kentucky, Lexington, KY, USA.

Glenn Copeland, Michigan Cancer Surveillance Program, Michigan Department of Community Health, Lansing, MI, USA.

Won Silva, Michigan Cancer Surveillance Program, Michigan Department of Community Health, Lansing, MI, USA.

Meg Watson, Division of Cancer Prevention and Control, National Center for Chronic Disease Prevention and Health Promotion, Centers for Disease Control and Prevention, Atlanta, GA, USA.

Hillard Weinstock, Division of STD Prevention, National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention, Centers for Disease Control and Prevention, 1600 Clifton Road, NE, MS E-02, Atlanta, GA 30333, USA.

References

  • 1.Markowitz LE, Dunne EF, Saraiya M, Lawson HW, Chesson H, Unger ER (2007) Quadrivalent human papillomavirus vaccine. Recommendations of the Advisory Committee on Immunization Practices (ACIP). Morb Mortal Wkly Rep 56(RR-02):1–24 [PubMed] [Google Scholar]
  • 2.Centers for Disease Control and Prevention (2010) FDA licensure of bivalent human papillomavirus vaccine (HPV2, Cervarix) for use in females and updated HPV vaccination recommendations from the Advisory Committee on Immunization Practices (ACIP). Morb Mortal Wkly Rep 59:626–629 [PubMed] [Google Scholar]
  • 3.Brotherton JML, Kaldor JM, Garland SM (2010) Monitoring the control of human papillomavirus (HPV) infection and related diseases in Australia: towards a national HPV surveillance strategy. Sex Health 7:310–319 [DOI] [PubMed] [Google Scholar]
  • 4.Fairley CK, Donovan B (2010) What can surveillance of genital warts tell us? Sex Health 7:325–327 [DOI] [PubMed] [Google Scholar]
  • 5.Markowitz LE, Hariri S, Unger ER, Saraiya M, Datta SD, Dunne EF (2010) Post-licensure monitoring of HPV vaccine in the United States. Vaccine 28:4731–4737 [DOI] [PubMed] [Google Scholar]
  • 6.Schuchat A, Bell BP (2008) Monitoring the impact of vaccines postlicensure: new challenges, new opportunities. Expert Rev Vaccines 7:437–456 [DOI] [PubMed] [Google Scholar]
  • 7.Wong CA, Saraiya M, Hariri S et al. (2011) Approaches to monitoring biological outcomes for HPV vaccination: challenges of early adopter countries. Vaccine 29:878–885 [DOI] [PubMed] [Google Scholar]
  • 8.Egelkrout EM, Galloway DA (2008) The biology of genital human papillomaviruses In: Holmes KK, Sparling PF, Stamm WE et al. (eds) Sexually transmitted diseases, 4th edn McGraw Hill Medical, New York City, pp 463–487 [Google Scholar]
  • 9.Winer RL, Kiviat NB, Hughes JP et al. (2005) Development and duration of human papillomavirus lesions, after initial infection. J Infect Dis 191:731–738 [DOI] [PubMed] [Google Scholar]
  • 10.Moscicki AB, Hills N, Shiboski S et al. (2001) Risks for incident human papillomavirus infection and low-grade squamous intraepithelial lesion development in young females. JAMA 285:2995–3002 [DOI] [PubMed] [Google Scholar]
  • 11.Woodman CBJ, Collins S, Winter H et al. (2001) Natural history of cervical human papillomavirus infection in young women: a longitudinal cohort study. Lancet 357:1831–1836 [DOI] [PubMed] [Google Scholar]
  • 12.Baseman JG, Koutsky LA (2005) The epidemiology of human papillomavirus infections. J Clin Virol 32S:S16–S24 [DOI] [PubMed] [Google Scholar]
  • 13.National Program of Cancer Registries (NPCR), Centers for Disease Control and Prevention. http://www.cdc.gov/cancer/npcr/about.htm. Accessed 02/04/2013
  • 14.National Cancer Institute. Surveillance, Epidemiology and End Results (SEER) Program. http://www.seer.cancer.gov/. Accessed 02/04/2013
  • 15.[North] American Association of Central Cancer Registries (1993) Working group on pre-invasive cervical neoplasia and population-based cancer registries: final subcommittee report. [N]AACCR conference held 5–6 Apr 1993, Rockville, MD. Subsequently adopted by the [N]AACCR Executive Board May 1993 and amended November 1993
  • 16.Copeland G, Datta SD, Spivak G, Garvin AG, Cote ML (2008) Total burden and incidence of in situ and invasive cervical carcinoma in Michigan, 1985–2003. Cancer 113(10 suppl):2946–2954 [DOI] [PubMed] [Google Scholar]
  • 17.National Cancer Institute. Surveillance, Epidemiology and End Results Program. International classification of diseases for oncology, 3rd edn, Coding materials. http://seer.cancer.gov/icd-o-3/. Accessed 02/11/2013
  • 18.National Center for Health Statistics, Centers for Disease Control and Prevention. International classification of disease, 9th revision, Clinical modification. http://www.cdc.gov/nchs/icd/icd9cm.htm. Accessed 05/10/2012
  • 19.International Health Terminology Standards Development Organisation. Systematized nomenclature of medicine clinical terms. http://www.ihtsdo.org/snomed-ct/. Accessed 02/11/2013, 2:43 PM EST
  • 20.National Institutes of Health. US National Library of Medicine. Systematized nomenclature of medicine reference terminology. http://www.nlm.nih.gov/research/umls/Snomed/snomed_faq.html. Accessed 02/04/2013, 2:20 PM EST
  • 21.Artificial Intelligence in Medicine (AIM) E-Path software. http://www.aim.ca/cancer-surveillance/. Accessed 02/25/2014
  • 22.National Cancer Institute. Surveillance, Epidemiology and End Results Program data management system. SEER*DMS user manual and tutorials. Chapter 22: Managing follow-back. http://seer.cancer.gov/seerdms/manual/chap22.followback.pdf. Accessed 02/14/2013
  • 23.North American Association of Central Cancer Registries. Standards for cancer registries, vol II, 13th edn http://www.naaccr.org/LinkClick.aspx?fileticket=xSH32ZzIzGI%3d&tabid=268&mid=746. Accessed 02/01/2013 [Google Scholar]
  • 24.Centers for Disease Control and Prevention. National Program of Cancer Registries. GenEdits Plus version 1.2.5. http://www.cdc.gov/cancer/npcr/tools/edits/versionanddownload.htm. Accessed 02/01/2013
  • 25.Centers for Disease Control and Prevention. National Center for Health Statistics. Bridged-race intercensal estimates of the resident population of the United States for July 1, 2000–July 1, 2009, by year, county, single-year of age (0, 1, 2, …, 85 years and over), bridged race, Hispanic origin, and sex. Prepared under a collaborative arrangement with the US Census Bureau. http://www.cdc.gov/nchs/nvss/bridged_race.htm as of December 20, 2010, following release by the US Census Bureau of the revised unbridged intercensal estimates by 5-year age group on July 23, 2010
  • 26.North American Association of Central Cancer Registries. NAACCR Guideline for enhancing Hispanic-Latino identification: revised NAACCR Hispanic/Latino identification algorithm [NHIA v2.2.1]. Revised 12 September 2011. http://www.naaccr.org/LinkClick.aspx?fileticket=6E20OT41TcA%3d&tabid=118&mid=458. Accessed 02/11/2013
  • 27.National Cancer Institute. Surveillance, Epidemiology and End Results (SEER) Program. Standard populations (millions) for age-adjustment. http://www.seer.cancer.gov/stdpopulations/. Accessed 01/15/2014
  • 28.Centers for Disease Control and Prevention. National Program of Cancer Registries. United States Cancer statistics data: all data combined 1999–2009. http://www.cdc.gov/cancer/npcr/uscs/download_data.htm. Accessed 03/18/2013
  • 29.Koss LG (1990) The new Bethesda system for reporting results of smears of the uterine cervix. J Natl Cancer Inst 82:988–991 [DOI] [PubMed] [Google Scholar]
  • 30.Solomon D, Davey D, Kurman R et al. (2002) The 2001 Bethesda system. Terminology for reporting results of cervical cytology. JAMA 287:2114–2119 [DOI] [PubMed] [Google Scholar]
  • 31.Darragh TM, Colgan TJ, Cox J et al. (2012) The lower anogenital squamous terminology standardization project for HPV-associated lesions: background and consensus recommendations from the College of American Pathologists and the American Society for Colposcopy and Cervical Pathology. J Low Genit Tract Dis 16:205–242 [DOI] [PubMed] [Google Scholar]
  • 32.Harmon ML, Cooper K (2009) Cervical neoplasia In: Nucci MR, Oliva E (eds) Gynecologic pathology. Churchill Livingstone Elsevier, Philadelphia [Google Scholar]
  • 33.McCredie MRE, Sharples KJ, Paul C, Baranyai J, Medley G, Jones RW, Skegg DCG (2008) Natural history of cervical neoplasia and risk of invasive cancer in women with cervical intraepithelial neoplasia 3: a retrospective cohort study. Lancet Oncol 9:425–434 [DOI] [PubMed] [Google Scholar]
  • 34.Stoler MH, Schiffman M (2001) Interobserver reproducibility of cervical cytologic and histologic interpretations: realistic estimates from the ASCUS-LSIL triage study. JAMA 285:1500–1510 [DOI] [PubMed] [Google Scholar]
  • 35.Dalla Palma P, Giorgi Rossi P, Collina G et al. (2009) The reproducibility of CIN diagnoses among different pathologists. Data from histology reviews from a multicenter randomized study. Am J Clin Pathol 132:125–132 [DOI] [PubMed] [Google Scholar]
  • 36.Solomon D, Castle PE (2005) Findings from ALTS: impact on cervical cytology screening, triage, and patient management. Pathol Case Rev 10:128–137 [Google Scholar]
  • 37.Snijders PJF, Steenbergen RDM, Heideman DAM, Meijer CJLM (2006) HPV-mediated cervical carcinogenesis: concepts and clinical implications. J Pathol 208:152–164 [DOI] [PubMed] [Google Scholar]
  • 38.CAP/ASCCP lower anogenital squamous terminology for HPV-associated lesions. Summary of consensus recommendations. http://www.cap.org/aps/docs/membership/transformation/new/asccp-sum_last_recom.pdf. Accessed 03/15/2013
  • 39.Saraiya M, Goodman MT, Datta SD, Chen VW, Wingo PA (2008) Cancer registries and monitoring the impact of prophylactic human papillomavirus vaccines: the potential role. Cancer 113(10 suppl):3047–3057 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40.Saslow D, Solomon D, Lawson HW et al. (2012) American Cancer Society, American Society for Colposcopy and Cervical Pathology, and American Society for Clinical Pathology screening guidelines for the prevention and early detection of cervical cancer. CA Cancer J Clin 62:147–172 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41.American Congress of Obstetricians and Gynecologists. New cervical cancer screening recommendations from the US Preventive Services Task Force and the American Cancer Society/American Society for Colposcopy and Cervical Pathology/American Society for Clinical Pathology. Released 14 March 2012. http://www.acog.org/About_ACOG/Announcements/New_Cervcal_Cancer_Screening_Recommendations. Accessed 03/18/2013
  • 42.US Preventive Services Task Force. Screening for cervical cancer. Current recommendation. Released March 2012. http://www.uspreventiveservicestaskforce.org/uspstf/uspscerv.htm. Accessed 03/18/2013
  • 43.Centers for Disease Control and Prevention. Office of Surveillance, Epidemiology, and Laboratory Services. Behavioral risk factor surveillance system, 2008 and 2010. http://www.cdc.gov/brfss/technical_infodata/surveydata.htm. Accessed 02/01/2013
  • 44.Centers for Disease Control and Prevention. Office of Surveillance, Epidemiology, and Laboratory Services. Behavioral risk factor surveillance system. Prevalence and trends data women’s health—2010. Women aged 18+ who have had a pap test within the past three years. http://apps.nccd.cdc.gov/BRFSS/list.asp?cat=WH&yr=2010&qkey=4426&state=All. Accessed 03/20/2013
  • 45.Tangka F, Subramanian S, Beebe MC, Trebina D, Michaud F (2010) Economic assessment of central cancer registry operations, part III: results from 5 programs. Journal of Registry Management 37:152–155 [PubMed] [Google Scholar]
  • 46.Yuan YC (2000) Multiple imputation for missing data: concepts and new development. Paper P267–25. Presented at 25th SAS Users Group International. 9–12 Apr 2000. http://www2.sas.com/proceedigns/sugi25/25/st/25p267.pdf. Accessed 03/18/2013
  • 47.Hariri S, Unger ER, Powell SE et al. (2012) The HPV vaccine impact monitoring project (HPV-IMPACT): assessing early evidence of vaccination impact on HPV-associated cervical cancer precursor lesions. Cancer Causes Control 23:281–288 [DOI] [PubMed] [Google Scholar]
  • 48.Wheeler CM, Hunt WC, Cuzick J, Langsfeld E, Pearse A, Montoya GD et al. (2013) A population-based study of human papillomavirus genotype prevalence in the United States: baseline measures prior to mass human papillomavirus vaccination. Int J Cancer 132:198–207 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49.About the SEER Program. Research activities. http://seer.cancer.gov/about/. Accessed 02/15/2013
  • 50.National Cancer Institute. Surveillance, Epidemiology and End Results Program data management system. SEER*DMS user manual and tutorials. Chapter 28: Special studies. http://seer.cancer.gov/seerdms/manual/chap28.special.studies.pdf. Accessed 02/15/2013
  • 51.Potter R, Copeland G, Datta SD, Saraiya M, Flagg EW (2012) Monitoring the impact of human papillomavirus (HPV) vaccines on precancerous cervical lesions by linking immunization information system and cancer registry data in Michigan. Abstract 30354. Presented at the 1st national immunization conference online 26–28 March 2012 https://cdc.confex.com/cdc/nic2012/webprogram/Paper30354.html. Accessed 03/18/2013 [Google Scholar]
  • 52.Gargano JW, Wilkinson EJ, Unger ER, Steinau M, Watson M, Huang Y et al. (2012) Prevalence of human papillomavirus types in invasive vulvar cancers and vulvar intraepithelial neoplasia 3 in the United States before vaccine introduction. J Low Genit Tract Dis 16(4):471–479 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53.Steinau M, Unger ER, Hernandez BY, Goodman MT, Copeland G, Hopenhayn C et al. (2013) Human papillomavirus prevalence in invasive anal cancers in the United States before vaccine introduction. J Low Genit Tract Dis 17(4). doi: 10.1097/LGT.0b013e31827ed372 [DOI] [PMC free article] [PubMed] [Google Scholar]
