Abstract
The Centers for Disease Control and Prevention (CDC) has emphasized the need for data modernization initiatives (DMIs) to improve the quality and timeliness of cancer surveillance data. To guide such DMI efforts, we need data on the resources required to generate high-quality data. We collected activity-based cost data from 21 central cancer registries (CCRs) for the period July 2020 to June 2021. We explored the potential relationship between (1) resources for electronic reporting and automation and (2) the quality of CCR data. We then compared activity-based costs of registries that always (n = 8), sometimes (n = 6), or seldom/never (n = 7) met data quality standards for completeness, timeliness, duplicate rate, and missing values. We found that the registries that consistently met data quality standards used more resources on setting up processes to acquire data, training staff, and processing data. Registries that seldom/never met quality standards spent the most on case finding/data abstraction of non-hospital records. This study provides key findings on resource use that can guide advancements when implementing electronic reporting and automation to improve CCR operations.
Keywords: cancer registry, electronic reporting, cost
INTRODUCTION
The National Program of Cancer Registries (NPCR)—comprising central cancer registries (CCRs) in 46 states, the District of Columbia, Puerto Rico, the U.S. Virgin Islands, and the U.S. Affiliated Pacific Islands—is supported by the Centers for Disease Control and Prevention (CDC) to collect cancer incidence data. CDC created NPCR data quality standards to evaluate registries on their ability to report cancer data. The standards rely on various metrics that monitor data quality, completeness, and timeliness at 12 and 24 months following the date of diagnosis (CDC NPCR Standards, 2023). CDC has emphasized the importance of data modernization initiatives (DMIs), including electronic reporting and automation innovation (CDC Data Modernization Initiative, 2023). For example, CDC is developing an initiative to improve the efficiency, timeliness, and accuracy of collecting and processing cancer data from laboratories and medical facilities by using a cancer surveillance cloud-based computing platform (CS-CBCP). This initiative aims to address four main focus areas: “(1) provide a uniform platform that addresses the currently fragmented processes created by disparate software applications; (2) automate initial record creation by establishing a record at the first point of detection, which is the laboratory for approximately 90%−95% of invasive cancer diagnoses; (3) automate record completion by requesting information from the responsible health care provider or pulling information from the electronic health record (EHR); and (4) standardize the interface for CCR interaction and reporting to NPCR” (Jones et al., 2021). Cancer registries are in various phases of adopting electronic reporting and automation. Prior research using NPCR 2017 Program Evaluation Instrument (PEI) data provided preliminary confirmation that registries reporting higher-quality data also had a higher level of electronic reporting. 
Furthermore, these registries had a higher proportion of staffing positions filled, a higher proportion of certified tumor registrars, and more quality assurance (QA) and information technology (IT) staff (Edwards et al., 2022). However, no study to date has explored the resources devoted to generating high-quality data, which are essential for planning future DMIs.
We assessed the relationship between (1) the resources devoted to electronic reporting and automation and (2) the quality of CCR data, defined as consistently meeting data quality standards for completeness, timeliness, duplicate rate, and missing values. We collected detailed, activity-based costs from registries (with a broad range of characteristics) to explore resources devoted to acquiring, processing, and reporting data—with a focus on electronic reporting and automation. By clarifying resource use patterns among registries that consistently meet high data quality standards, our findings may help guide personnel to support registries as they expand electronic reporting.
METHODS
Overview and Registry Selection
To conduct a comprehensive quantitative evaluation of CCR resource use—related to activities involved in data acquisition, processing, and reporting—we identified a varied set of NPCR central cancer registries. We purposively sampled registries to account for the geographic and operational diversity inherent among the registry programs. Relying on the criteria used by NPCR to assess data completeness, timeliness, duplicate rate, and missing values, we first stratified the registries into three groups: always, sometimes, and seldom/never meeting quality standards (see Appendix A).
Always: Met the 12-month standard for all years 2014–2017.
Sometimes: Met the 12-month standard at least once during 2014–2017 and met the 24-month standards for all years 2014–2017.
Seldom/Never: Did not meet the 24-month standards for all years 2014–2017.
We used the (12-month) Advanced National Data Quality Standard to stratify into groups—because most NPCR registries meet the (24-month) National Data Quality Standard, and because increasing the timeliness of registry reporting is a CDC priority. Moreover, registries that always meet the data quality standards are those that are most likely to have adopted operational processes to incorporate electronic reporting and automation. After stratifying the registries into the three categories by quality standards, we identified 6 to 8 registries with varied characteristics in each of the 3 groups, for a total of 21 registries. Characteristics for registry selection included case volume, size of the area served, geographic location, rurality, and funding sources. These characteristics and registry classifications were identified using the NPCR compliance reports and the PEI.
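The stratification rule above can be expressed as a short sketch. This is a minimal illustration, not the study's actual code: it assumes per-registry indicators of whether the 12-month and 24-month standards were met in each diagnosis year 2014–2017, and any registry not matching the first two rules falls into the seldom/never group by default.

```python
# Illustrative sketch (not the study's actual code) of the stratification
# rule: met_12 and met_24 map each diagnosis year to whether the registry
# met the 12-month (Advanced) or 24-month (National) data quality standard.
def classify_registry(met_12: dict[int, bool], met_24: dict[int, bool]) -> str:
    """Assign a registry to the always / sometimes / seldom-never group."""
    years = range(2014, 2018)
    if all(met_12[y] for y in years):
        return "always"        # met the 12-month standard every year
    if any(met_12[y] for y in years) and all(met_24[y] for y in years):
        return "sometimes"     # met 12-month at least once, 24-month every year
    return "seldom/never"      # did not meet the 24-month standard every year
```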
Data Collection
The program period was July 1, 2020, to June 30, 2021. Activity-based cost data were collected from each participating registry to assess resource use differences among cancer registries that always, sometimes, and seldom/never met the national data standards, with specific attention to investments in electronic reporting and automation. We collected cost data for labor and non-labor resources from 21 registries, using a previously validated MS Excel-based tool (Beebe et al., 2018; Subramanian et al., 2016). The tool collected retrospective cost data in seven modules: Funding Sources; Registry Personnel; Retrospective Staffing; Consultants; Computers, Travel, Training, and Other Materials; Software; and Overhead. The Funding Sources module collected information on total funds received that were expended during the program period. The Registry Personnel module collected information on the personnel expenditures—including job titles, full-time equivalents (FTEs), and salaries—to calculate the total personnel costs. The Retrospective Staffing module collected information on the estimated percentage of time spent on registry activities during the past year by each of the registry staff identified in the Registry Personnel module. The Consultants module collected information on the job titles, annual payments, and activities performed by consultants (defined as persons with whom the registry had a formal agreement to perform registry activities), other personnel not employed by the registry, and any contractual expenditure not reported elsewhere. The Computers, Travel, Training, and Other Materials module collected information on the costs associated with hardware, IT support, travel, training, and other materials. The Software module collected information on the costs associated with software used by registries during the program period. 
The Overhead module collected information from the registries on indirect costs, the types of costs (fixed or variable), and cost amounts paid during the program period.
Registries were instructed to allocate all module costs to a list of registry activities. When more than one activity was applicable, registries estimated what percentage of the cost should be allocated to the various activities. The standardized list of registry activities (Table 1) was then tailored to each registry to capture more nuanced activities within the broader standardized activities. We used the standardized list for data analysis, and we grouped tailored registry activities into the appropriate standardized activities.
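The percentage-based allocation step can be illustrated with a brief sketch. The activity names and dollar amounts below are invented for illustration; the helper simply splits a single cost entry across activities using registry-estimated shares, as the tool instructed registries to do.

```python
# Hypothetical illustration of allocating one cost entry across standardized
# registry activities when more than one activity applies. The activity
# names, percentages, and salary figure are invented for illustration.
def allocate_cost(amount: float, shares: dict[str, float]) -> dict[str, float]:
    """Split a cost across activities using estimated percentage shares."""
    total = sum(shares.values())
    if abs(total - 100.0) > 1e-6:
        raise ValueError(f"allocation percentages sum to {total}, not 100")
    return {activity: amount * pct / 100.0 for activity, pct in shares.items()}

# Example: a $50,000 salary split 60/40 across two processing activities.
allocation = allocate_cost(
    50_000.00,
    {"Data validation/quality assurance": 60.0,
     "Perform data edits and case consolidation": 40.0},
)
```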
TABLE 1.
STANDARDIZED LIST OF REGISTRY ACTIVITIES
| Category | Registry Activity |
|---|---|
| Data Acquisition | Setting up Data Acquisition (DA) |
| | DA: Electronic transfer from data source |
| | DA: Active data collection |
| | DA: Retrieve electronically via hospital EMR, exchange, or repository |
| | DA: Other methods^a |
| | Case finding/data abstraction of non-hospital records |
| | Monitor registry data and conduct follow-back activities for missing information |
| | Other DA–related activities^b |
| Data Processing | Perform data edits and case consolidation |
| | Data validation/quality assurance |
| | Perform data linkages |
| | Death clearance |
| | Screen for reportability |
| | Other data processing–related activities^c |
| Data Reporting | Develop analytic files, analyze data, generate reports, and report data |
| | Support data requests for research or special studies |
| | Other data reporting–related activities^d |
| General | IT support |
| | Management/administrative/document policies, procedures |
| | Partnerships and collaborations |
| | Training of registry staff |
| | Research studies, advanced analysis, publications, websites |
| | COVID-19^e |
| | Other general^f |
^a Examples of other data acquisition methods include acquiring disease index, fax, paper reports, and critical data change forms.
^b Examples of other data acquisition–related activities include testing new data elements and working with delinquent reporters.
^c Examples of other data processing–related activities include fixing file transfer processes.
^d Examples of other data reporting–related activities include creating data exchange or data linkage files.
^e Examples of COVID-19 activities include staff supporting specific state-level COVID-19 activities or being fully deployed for several months.
^f Examples of other general activities include daily emails and phone calls.
Note: In general, other data acquisition, processing, and reporting activities also included meetings or IT support relevant to the respective broad categories.
EMR, electronic medical records; IT, information technology.
We collected additional data on IT staff from the 2019 PEI, a web-based CDC survey completed by appropriate registry staff from each CCR, incorporating data through December 2019. The PEI data include the reported number of FTEs in computer/IT/geographic information system (GIS) specialist positions (filled or vacant) in either contractor or non-contractor roles. FTEs are reported to the hundredth decimal point and may include time spent by non-registry staff.
Data Analysis
Using each registry's retrospective costing tool, we identified the total resources it expended. We then allocated costs to the standardized list of activities to determine the cost of each activity. On average, we were able to allocate 99% of a registry's total expended resources to specific activities. Once activity-based costs were determined, we used them to calculate the total resources allocated to the overall categories of data acquisition, data processing, data reporting, and general activities. Finally, we calculated the average resources and the average percentage of resources spent, by activity category, for registries that always, sometimes, or seldom/never met the quality standards.
Using the total IT staff FTEs (combined contractor and non-contractor totals) from the PEI data, we calculated the average number of IT staff FTEs by registries always, sometimes, or seldom/never meeting the quality standards.
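The group-level summaries described above can be sketched as follows. The registry records and dollar amounts are invented for illustration; the code simply averages activity-category costs within each achievement group, which is the computation behind the tables and figures that follow.

```python
# Hypothetical sketch of the group-level summary: average cost per activity
# category for registries grouped by achievement of standards. All records
# and dollar amounts below are invented for illustration.
from collections import defaultdict
from statistics import mean

registries = [
    {"group": "always", "costs": {"Data Acquisition": 300_000, "Data Processing": 640_000}},
    {"group": "always", "costs": {"Data Acquisition": 280_000, "Data Processing": 600_000}},
    {"group": "seldom/never", "costs": {"Data Acquisition": 260_000, "Data Processing": 470_000}},
]

# Collect each registry's category costs under its achievement group.
by_group: dict[str, dict[str, list[float]]] = defaultdict(lambda: defaultdict(list))
for registry in registries:
    for category, cost in registry["costs"].items():
        by_group[registry["group"]][category].append(cost)

# Average within each group and category.
group_averages = {
    group: {category: mean(values) for category, values in categories.items()}
    for group, categories in by_group.items()
}
```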
RESULTS
The general characteristics of the registries selected for this study are presented in Table 2, stratified by those that always, sometimes, and seldom/never met the quality standards. Of the 21 registries, 8 always met, 6 sometimes met, and 7 seldom/never met the quality standards. Registries were classified as high volume (29%), medium volume (43%), and low volume (29%). Registries were distributed across the United States, with 29% in the Midwest, 19% in the Northeast, 24% in the South, and 29% in the West. Registries served large (38%), medium-sized (38%), and small (24%) areas based on the size of the geographic location. Regarding rurality, 43% of registries served low-density rural areas, 24% served medium-density rural areas, and 33% served high-density rural areas. High, medium, and low volume; area covered by the registry; and density of rural areas are defined in Table 2. Large areas had a high proportion of registries that seldom/never met the quality standards (71%), compared to registries that always met the quality standards (25%). High-density rural areas had a high proportion of registries that always met the quality standards (50%), compared to registries that seldom/never did (14%). Most registries (86%) received state funding in addition to NPCR funds, and a small percentage (14%) received other federal funding from the National Cancer Institute Surveillance, Epidemiology, and End Results (NCI SEER) program in addition to state funding. No registry received funding from sources other than those noted in Table 2. Across all characteristics, registries varied in their consistency in meeting NPCR 12- and 24-month reporting standards.
TABLE 2.
OVERVIEW OF NPCR REGISTRIES BY ACHIEVEMENT OF NPCR DATA QUALITY AND REPORTING STANDARDS DURING 2014–2017
| Achievement of NPCR Standards | All | Always | Sometimes | Seldom/ Never |
|---|---|---|---|---|
| Attributes of Registries | n (%) | n (%) | n (%) | n (%) |
| Total | 21 (100) | 8 (38) | 6 (29) | 7 (33) |
| Volume of Cases | ||||
| High (> 26,558 cases) | 6 (29) | 3 (38) | 1 (17) | 2 (29) |
| Medium (10,455–26,558 cases) | 9 (43) | 2 (25) | 3 (50) | 4 (57) |
| Low (< 10,455 cases) | 6 (29) | 3 (38) | 2 (33) | 1 (14) |
| Geographic Area | ||||
| Midwest | 6 (29) | 1 (12) | 2 (33) | 3 (43) |
| Northeast | 4 (19) | 3 (38) | 1 (17) | 0 |
| South | 5 (24) | 2 (25) | 2 (33) | 1 (14) |
| West | 6 (29) | 2 (25) | 1 (17) | 3 (43) |
| Size of Area Covered by Registry | ||||
| Large (> 70,684 square miles [sq mi]) | 8 (38) | 2 (25) | 1 (17) | 5 (71) |
| Medium-sized (44,453–70,684 sq mi) | 8 (38) | 3 (38) | 3 (50) | 2 (29) |
| Small (< 44,453 sq mi) | 5 (24) | 3 (38) | 2 (33) | 0 (0) |
| Density of Rural Areas | ||||
| Low (> 147 residents per sq mi) | 9 (43) | 3 (38) | 2 (33) | 4 (57) |
| Medium (58–147 residents per sq mi) | 5 (24) | 1 (12) | 2 (33) | 2 (29) |
| High (< 58 residents per sq mi) | 7 (33) | 4 (50) | 2 (33) | 1 (14) |
| Sources of Additional Funding | ||||
| State and Other Federal (NCI SEER) | 3 (14) | 2 (25) | 1 (17) | 0 (0) |
| State | 18 (86) | 6 (75) | 5 (83) | 7 (100) |
Source: RTI and CDC analysis of 2019 Program Evaluation Instrument (PEI) data.
Note: Percentages may not add to 100 because of rounding.
Note: Pearson’s chi-square tests were conducted with an alpha level of 5%. Statistical testing showed no significant differences in the results.
NCI SEER, National Cancer Institute Surveillance, Epidemiology, and End Results; NPCR, National Program of Cancer Registries.
Figure 1 presents the total resources expended by registries that always, sometimes, and seldom/never met quality standards. On average, registries that always met the quality standards spent $1,735,846 during the program year; registries that sometimes met the quality standards spent $981,573; and those that seldom/never met the quality standards spent $1,221,486. Although the averages vary across the three groups, the range of funding expended (as indicated by the minimum and maximum) is similar across groups.
FIGURE 1. TOTAL RESOURCES EXPENDED IN FY2020 BY ACHIEVEMENT OF NPCR DATA QUALITY AND REPORTING STANDARDS DURING 2014–2017.

Source: RTI and CDC analysis of the NPCR retrospective costing tool collected for the program period July 1, 2020–June 30, 2021.
Note: One outlier was removed from this figure.
Note: F-tests for one-way analysis of variance were conducted with an alpha level of 5%. Statistical testing showed no significant differences in the results.
NPCR, National Program of Cancer Registries.
We conducted a more detailed review of the total resources expended on data acquisition, data processing, data reporting, and general activities. Table 3 shows the average percentage of funds spent on each category, by achievement of quality standards. Registries that always met the quality standards had the lowest proportion of their resources allocated to data acquisition but the highest proportions allocated to data reporting and general activities. However, there was no statistically significant difference in this percentage allocation of resources by activity category.
TABLE 3.
AVERAGE PERCENTAGE OF RESOURCES ALLOCATED TO ACTIVITY CATEGORY BY ACHIEVEMENT OF NPCR DATA QUALITY AND REPORTING STANDARDS DURING 2014–2017
| Activity Category | Always | Sometimes | Seldom/Never |
|---|---|---|---|
| Data Acquisition | 17% | 22% | 21% |
| Data Processing | 37% | 40% | 38% |
| Data Reporting | 11% | 10% | 8% |
| General | 35% | 28% | 33% |
Source: RTI and CDC analysis of the NPCR COPE retrospective costing tool collected for the program period July 1, 2020–June 30, 2021.
Note: Two-sample proportion Z-tests were conducted with an alpha level of 5%. Statistical testing showed no significant differences in the results.
NPCR, National Program of Cancer Registries.
We assessed the specific activity-based costs within the data acquisition, data processing, data reporting, and general categories. Figure 2 shows the average amount expended on data acquisition by whether registries met quality standards. Registries that always met the data quality standards spent the most money on setting up data acquisition, active data collection, registry data monitoring, and conducting follow-back activities. Registries that seldom/never met the quality standards spent the most on case finding or data abstraction of non-hospital records.
FIGURE 2. AVERAGE RESOURCES EXPENDED ON DATA ACQUISITION ACTIVITIES BY ACHIEVEMENT OF NPCR DATA QUALITY AND REPORTING STANDARDS DURING 2014–2017.

Source: RTI and CDC analysis of the NPCR COPE retrospective costing tool collected for the program period July 1, 2020–June 30, 2021.
Note: F-tests for one-way analysis of variance were conducted with an alpha level of 5%. Statistical testing showed no significant differences in the results.
DA, data acquisition; EMR, electronic medical records; NPCR, National Program of Cancer Registries.
Figure 3 presents the average resources expended on data processing activities by registries that met quality standards. Registries that always met the data quality standards spent the most money on performing data edits and case consolidation, data validation/QA, and other processing activities.
FIGURE 3. AVERAGE RESOURCES EXPENDED ON DATA PROCESSING ACTIVITIES BY ACHIEVEMENT OF NPCR DATA QUALITY AND REPORTING STANDARDS DURING 2014–2017.

Source: RTI and CDC analysis of the NPCR COPE retrospective costing tool collected for the program period July 1, 2020–June 30, 2021.
*Note: F-tests for one-way analysis of variance were conducted with an alpha level of 5%. Statistical testing showed a significant difference between registries that sometimes and seldom/never met the quality standards, p < .05.
NPCR, National Program of Cancer Registries.
Figure 4 shows the average resources expended on data reporting activities by registries that met quality standards. Registries that always met the data quality standards spent the most resources on all data reporting activities.
FIGURE 4. AVERAGE RESOURCES EXPENDED ON DATA REPORTING ACTIVITIES BY ACHIEVEMENT OF NPCR DATA QUALITY AND REPORTING STANDARDS DURING 2014–2017.

Source: RTI and CDC analysis of the NPCR COPE retrospective costing tool collected for the program period July 1, 2020–June 30, 2021.
Note: F-tests for one-way analysis of variance were conducted with an alpha level of 5%. Statistical testing showed no significant differences in the results.
NPCR, National Program of Cancer Registries.
Figure 5 shows the average resources expended on general activities by registries that met quality standards. Registries that always met the quality standards spent the most money on IT support, management/administrative activities, and training of registry staff.
FIGURE 5. AVERAGE RESOURCES EXPENDED ON GENERAL ACTIVITIES BY ACHIEVEMENT OF NPCR DATA QUALITY AND REPORTING STANDARDS DURING 2014–2017.

Source: RTI and CDC analysis of the NPCR COPE retrospective costing tool collected for the program period July 1, 2020–June 30, 2021.
Note: F-tests for one-way analysis of variance were conducted with an alpha level of 5%. Statistical testing showed no significant differences in the results.
IT, information technology; NPCR, National Program of Cancer Registries.
Table 4 shows the average IT staff support FTEs by achievement of quality standards. The IT staff support in the table includes both registry-devoted IT staff and contracted IT staff. Registries that always met the quality standards had, on average, more IT staff support FTEs than registries that sometimes or seldom/never met the quality standards.
TABLE 4.
AVERAGE IT STAFF SUPPORT FTES BY ACHIEVEMENT OF NPCR DATA QUALITY AND REPORTING STANDARDS DURING 2014–2017
| Achievement of NPCR Standards | All | Always | Sometimes | Seldom/ Never |
|---|---|---|---|---|
| Attributes of Registries | n (%) | n (%) | n (%) | n (%) |
| Total | 21 (100) | 8 (38) | 6 (29) | 7 (33) |
| Average IT Staff Support (Total FTEs) | 1.27 | 2.04 | 1.06 | 0.57 |
Source: RTI and CDC analysis of 2019 Program Evaluation Instrument (PEI) data.
Note: F-tests for one-way analysis of variance were conducted with an alpha level of 5%. Statistical testing showed no significant differences in the results.
FTEs, full-time equivalents; IT, information technology; NPCR, National Program of Cancer Registries.
DISCUSSION
This study explores the resources expended by CCRs that always, sometimes, and seldom/never met the NPCR data quality standards. On average, registries that always met the quality standards expended more total resources on registry activities, but there was wide variation across all three registry categories. Registries with high and low levels of funding were present across the three study groups. This finding is consistent with previous studies that reported variation in registry operations because of factors such as legislation and regulation, interstate data exchanges, quality and capacity of reporting sources, and other external factors that can affect expended resources (Tangka et al., 2016).
At a more granular level, registries that always met the quality standards spent the most on setting up data acquisition. This activity included identifying, onboarding, and training staff from reporting sources, along with creating and testing systems to receive data. During interviews with cancer registries' staff, many noted that high resource use is required to set up electronic reporting, especially when software changes are required (Edwards et al., 2022). Additionally, employing IT staff to set up electronic reporting is costly; registries that always met the quality standards spent the most on IT support and, on average, had more IT staff than registries that sometimes or seldom/never met the standards. This finding highlights the importance of having IT staff to support operational procedures when implementing electronic reporting and automation.
Although registries that always met the quality standards spent the most on setting up data acquisition, active data collection, monitoring of registry data, and conducting follow-back activities, the average percentage of total resources dedicated to the data acquisition category was the lowest among these registries (17%). This could be because electronic reporting and automation require fewer resources once they are set up, which is the aim of DMIs. One of the possible and desired outcomes of the CS-CBCP project is to minimize the need for manual processes by acquiring complete, high-quality cancer data faster (Jones et al., 2021). This points to the importance of receiving high-quality electronic data from reporting sources and implementing automated procedures to process the data quickly.
Registries that always met the quality standards spent more on data edits and case consolidation as well as data validation and QA. In a previous study with cancer registries examining facilitators and barriers to electronic reporting, registries noted the need for more QA staff because the number of records received increased substantially with electronic reporting. Registry staff also mentioned that several manual tasks were needed to process the electronic data, particularly regarding case consolidation (Tangka et al., 2021). These additional manual reviews could be reduced through further automation to optimize the benefits from electronic reporting.
Registries that always met the quality standards also spent the most on training registry staff. Registries identified staff training as a facilitator of electronic reporting (Tangka et al., 2021). Furthermore, registries that always met the quality standards on average spent the most on all data reporting categories and on average devoted the largest percentage of funds to data reporting. Future studies might explore the correlation between registries that always met the quality standards and resources devoted to data reporting.
The analysis presented here, based on a small subset of CCRs selected to reflect a variety of registry characteristics, does not consider potential interactions among characteristics. The small sample size is also more sensitive to outliers, which can shift calculated averages. Although we collected detailed, activity-based cost data, provided definitions for all the categories included, and offered technical assistance throughout the data collection process, there could still be differences in how registries interpreted or reported activities. For example, in some instances, IT staff activities could be embedded within ongoing registry activities and may not have been reported as standalone IT support. In addition, we were not able to collect details about all activities specifically related to electronic reporting and automation. Retrospective cost data were reported for a prior annual period and can be subject to recall bias, but our team's comparative analysis, using prospectively collected activity-based labor hours, indicated that our retrospective data were quite accurate (Beebe et al., 2023). Finally, we designed and planned this study before the COVID-19 public health emergency, which was in effect during the data collection period. Five of the participating registries had to shift staffing resources, amounting to no more than 5% of total cost, to support the COVID-19 response. This could have disrupted normal registry operations; therefore, the study results may not be illustrative of general operations.
CONCLUSIONS
The exploratory analysis presented in this study identified several features of registries that consistently met data quality standards that could support the expansion of electronic reporting and automation. First, these registries expended more resources on setting up processes to acquire data, reflecting potentially higher investment in facilitating electronic reporting. This up-front investment could explain why they devoted the lowest percentage of their resources to data acquisition activities overall. Second, these registries also spent more resources on staff training. These findings suggest that adopting electronic reporting and automation requires investing in both the partners who submit data and the registry staff who process those data. An additional finding is that electronic reporting could lead to high resource needs related to data processing, highlighting the need for more automation in CCR data processing to fully reap the efficiency benefits of receiving more timely electronic data submissions.
ACKNOWLEDGEMENTS
We thank the cancer registry staff who participated in this study.
APPENDIX
Appendix A
| NPCR Data Quality Standards Criteria | 12-Month Standards | 24-Month Standards |
|---|---|---|
| Completeness (based on observed-to-expected cases) | 90% | 95% |
| Unresolved Duplicate Rate | 2 per 1,000 or fewer | 1 per 1,000 or fewer |
| Maximum Percentage Missing Critical Data Elements | ||
| Age | 3% | 2% |
| Sex | 3% | 2% |
| Race | 5% | 3% |
| County | 3% | 2% |
| Percentage Passing CDC-Prescribed Set of Standard Edits | 97% | 99% |
Footnotes
DISCLAIMER
The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.
Contributor Information
Florence K. L. Tangka, Centers for Disease Control and Prevention
Jenny Beizer, RTI International.
Maggie Cole-Beebe, RTI International.
Amarilys Bernacet, RTI International.
Stephen Brown, RTI International.
Paran Pordell, Centers for Disease Control and Prevention.
Reda Wilson, Centers for Disease Control and Prevention.
Sandra F. Jones, Centers for Disease Control and Prevention
Sujha Subramanian, Implenomics.
REFERENCES
- Beebe MC, Subramanian S, Tangka FK, Weir HK, Babcock F, & Trebino D (2018). An analysis of cancer registry cost data: Methodology and results. Journal of Registry Management, 45(2), 58–64. https://www.ncbi.nlm.nih.gov/pubmed/3153312
- Beebe MC, Tangka FK, Beizer J, Bernacet A, Brown S, Pordell P, Wilson R, Jones SF, & Subramanian S (2023). Exploring central cancer registry activity-based costs as electronic reporting evolves. [Manuscript in preparation].
- Centers for Disease Control and Prevention. (2023). Data modernization initiative. Retrieved July 26, 2023, from https://www.cdc.gov/surveillance/data-modernization/index.html
- Centers for Disease Control and Prevention. (2023). NPCR Standards. Retrieved July 26, 2023, from https://www.cdc.gov/cancer/npcr/standards.htm
- Edwards P, Bernacet A, Tangka FKL, Pordell P, Beizer J, Wilson R, Blumenthal W, Jones SF, Cole-Beebe M, & Subramanian S (2022). Operational characteristics of central cancer registries that support the generation of high-quality surveillance data. Journal of Registry Management, 49(1), 10–16. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10036081/
- Jones DE, Alimi TO, Pordell P, Tangka FK, Blumenthal W, Jones SF, Rogers JD, Benard VB, & Richardson LC (2021). Pursuing data modernization in cancer surveillance by developing a cloud-based computing platform: Real-time cancer case collection. JCO Clinical Cancer Informatics, 5, 24–29. https://doi.org/10.1200/CCI.20.00082
- Subramanian S, Tangka F, Edwards P, Hoover S, & Cole-Beebe M (2016). Developing and testing a cost data collection instrument for noncommunicable disease registry planning. Cancer Epidemiology, 45(Suppl 1), S4–S12. https://doi.org/10.1016/j.canep.2016.10.003
- Tangka FKL, Edwards P, Pordell P, Wilson R, Blumenthal W, Jones SF, Jones M, Beizer J, Bernacet A, Cole-Beebe M, & Subramanian S (2021). Factors affecting the adoption of electronic data reporting and outcomes among selected central cancer registries of the National Program of Cancer Registries. JCO Clinical Cancer Informatics, 5, 921–932. https://doi.org/10.1200/CCI.21.00083
- Tangka FK, Subramanian S, Beebe MC, Weir HK, Trebino D, Babcock F, & Ewing J (2016). Cost of operating central cancer registries and factors that affect cost: Findings from an economic evaluation of Centers for Disease Control and Prevention National Program of Cancer Registries. Journal of Public Health Management and Practice, 22(5), 452–460. https://doi.org/10.1097/PHH.0000000000000349
