Author manuscript; published in final edited form as: J Perinatol. 2020 Aug 13;41(7):1611–1620. doi: 10.1038/s41372-020-00783-z

Inadequacies of Hospital-Level Critical Congenital Heart Disease Screening Data Reports: Implications for Research and Quality Efforts

Heather Siefkes 1, Laura R Kair 1, Annamarie Saarinen 2, Satyan Lakshminrusimha 1

Abstract

Objective:

Assess the quality of critical congenital heart disease (CCHD) screening data reports in California, where CCHD screening is not mandatory but reporting is.

Study Design:

Retrospective review of California hospital-level CCHD screening data to evaluate data reliability and adherence to state screening and reporting recommendations. Data were evaluated for internal consistency and compared to two databases.

Results:

Over one-third of hospitals did not submit data. Only 70.7% of the Vital Records live births were reported in CCHD screening data. Only 46% of reporting hospitals submitted data with matching numbers of completed screens and results, and 22% matched their respective live births in a second database.

Conclusion:

CCHD data reporting in California is incomplete; non-reporting alone may leave an estimated 359 CCHD cases per year unaccounted for, and data inconsistencies may obscure additional cases. Mandatory screening, mandatory reporting, and improvements in data reliability are urgently needed to inform screening modifications and enhance timely detection and disease surveillance.

Introduction

Newborn screening (NBS) is a robust, complex system aimed at early identification and intervention to improve newborn and child health.1 The United States (US) Secretary of Health and Human Services Advisory Committee on Heritable Disorders in Newborns and Children (SACHDNC) recommends all newborns be screened for disorders listed under the Recommended Uniform Screening Panel (RUSP). The RUSP currently includes 35 core conditions and 26 secondary conditions.2 All of the conditions, with the exception of hearing loss and critical congenital heart disease (CCHD), can be screened for via bloodspot testing; hearing loss and CCHD screening require point-of-care testing. Relative to the newer conditions on the panel, the more well-established bloodspot component of NBS has benefited from a greater ability to centralize data, which supports research and quality improvement initiatives involving multiple states.3 One notable difference between hearing loss and CCHD is the potential lethality if a CCHD is not detected in a timely manner.4,5 Thus, unlike hearing loss, where tracking and centralization of data help ensure patients receive timely follow-up and interventions over a period of months to years,6 centralized CCHD screening data would 1) aid systems-level progress by facilitating research and quality improvement to improve the screen, 2) provide clinical insights around hypoxemia and asymptomatic medical conditions in newborns, and 3) support policy changes related to NBS.

After the US SACHDNC added CCHD to the RUSP in 2011, states began adopting CCHD screening policies.7 As of 2018, all US states and the District of Columbia have passed legislation or have formal public health statutes in place requiring CCHD screening.8 California is the only state with a policy to “offer” screening; others mandate that screening be performed.8 States with mandatory screening policies have seen early infant deaths from cardiac and other conditions decrease by 33% and 21%, respectively.9 States with non-mandatory policies did not see these decreases.9 In the years prior to universal CCHD screening, a California study estimated that 30 newborns died in the state annually due to missed or delayed CCHD diagnoses.4,9 These estimates do not include infants who survive but suffer morbidity due to a delayed CCHD diagnosis. This justifies a need to critically evaluate the status of CCHD screening within the state of California. California has approximately a half million live births annually, accounting for 13% of the nation’s live births, the largest proportion of any one state.10 Thus, the absence of a screening mandate, or poor adherence to recommended screening guidelines, standardized reporting, or screening performance in the state, could impact US infant deaths.

Since July 2013, any general acute care hospital with licensed perinatal services or any intermediate care nursery in California has been required to “offer” oxygen saturation screening for CCHD prior to discharge.11,12 Additionally, the policy requires annual reporting of aggregate (hospital-level) CCHD screening data from birth hospitals, and of patient-level data for newborns with CCHD from regional cardiac centers, to the Department of Health Care Services (DHCS). The legislation did not “mandate” screening and did not provide financial support or health informatics infrastructure to help hospitals meet these requirements. Additionally, the legislation specified no direct consequences for non-reporting hospitals. Therefore, our team suspected many hospitals were not submitting CCHD screening reports or, worse yet, possibly were not screening at all. Thus, we undertook this retrospective review of CCHD screening data reported to California DHCS to describe reporting compliance and assess data consistency.

Methods

This study was approved by the University of California, Davis Institutional Review Board and the State of California Health and Human Services Agency Committee for the Protection of Human Subjects under project number 17-04-2931.

CCHD screening reports submitted by individual hospitals and birth centers to California DHCS for 2015 and 2016 were reviewed. One of the authors (HS) was granted access to all original reports submitted to DHCS. The reports included aggregate data for the number of live births, the number of inborns admitted to a Neonatal Intensive Care Unit (NICU) above Level II within the hospital, the number of inborns screened, the numbers of failed and passed screens, the number of failed screens resulting in diagnoses of CCHD, and the number not screened for specified reasons (i.e., transferred or died prior to screening, echocardiogram obtained prior to screening, or parents declined screening).12 Hospitals were provided a sample paper form for data submission but were able to submit the same variables in other formats; for example, some hospitals submitted data in PDF, Microsoft Excel, and/or Word files. Hospitals submitted these data annually and submitted only once unless asked to provide a revised report. DHCS then entered the data into a common database. NICUs with a level of care higher than Level II were not mandated to screen or to report screening data but were encouraged to develop policies to screen newborns whose clinical course or care was unlikely to detect CCHD. DHCS provided hospitals with the American Academy of Pediatrics 2011 recommended CCHD screening algorithm.13 The submitted data forms did not ask about the timing of screening (i.e., before or after 24 hours of age). Additionally, the form did not define non-viable newborn births, so whether such newborns were counted as births, deaths, and/or parental declines of screening is unknown.

Analysis:

This was a descriptive study. Summary statistics were computed as frequencies. To estimate reporting compliance, the denominator of hospitals required to report was estimated from California’s Office of Statewide Health Planning and Development (OSHPD), based on a list of hospitals with perinatal and NICU beds. To estimate the number of newborns with CCHD potentially missed due to non-reporting, the aggregate live births were compared to Vital Records of registered births. The number of revisions submitted was noted to document potential inaccuracies associated with original reports. Data were assessed for consistency both within the reports and against a secondary database. The measure of consistency within a report was whether the number of reported screens performed matched the number of reported screen results. The number of live births each hospital reported was also compared to the live births it reported to the secondary database, OSHPD, to assess consistency between databases. Fisher’s exact or chi-squared tests, as appropriate, were used to compare categorical data. The data were analyzed using Stata Statistical Software, release 15.1 (StataCorp, College Station, TX).
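For illustration, the hospital-level consistency measures and categorical comparisons described above could be computed as in the following minimal sketch. It is shown in Python with pandas/SciPy rather than the Stata actually used, and the column names and toy values are hypothetical, not the actual DHCS or OSHPD fields.

```python
# Minimal sketch of the hospital-level consistency checks described in the
# Methods; column names and values are hypothetical, not DHCS/OSHPD fields.
import pandas as pd
from scipy.stats import fisher_exact, chi2_contingency

reports = pd.DataFrame({
    "hospital_id":       [1, 2, 3, 4],
    "year":              [2015, 2015, 2016, 2016],
    "screens_completed": [1200, 2500, 1100, 2600],
    "screen_results":    [1200, 2450, 1100, 2700],   # pass + fail reported
    "live_births_dhcs":  [1250, 2550, 1150, 2650],   # reported to DHCS
    "live_births_oshpd": [1250, 2700, 1150, 2600],   # reported to OSHPD
})

# Internal consistency: do reported screens match reported results?
reports["screens_match_results"] = (
    reports["screens_completed"] == reports["screen_results"]
)

# Cross-database consistency: relative difference in live births (DHCS vs OSHPD).
rel_diff = (
    (reports["live_births_dhcs"] - reports["live_births_oshpd"]).abs()
    / reports["live_births_oshpd"]
)
reports["births_match"] = rel_diff == 0
reports["births_differ_5pct"] = rel_diff >= 0.05
reports["births_differ_10pct"] = rel_diff >= 0.10

# Compare a categorical consistency marker between the two years, mirroring
# the Fisher exact / chi-squared comparisons described in the Methods.
table = pd.crosstab(reports["year"], reports["screens_match_results"])
if (table.values < 5).any():          # small expected counts -> Fisher exact
    _, p = fisher_exact(table.values)
else:
    _, p, _, _ = chi2_contingency(table.values)

print(reports[["hospital_id", "screens_match_results", "births_match"]])
print(f"p-value for year comparison: {p:.2f}")
```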

Results

Reported CCHD Screening Data

In 2015 and 2016, 82% (N=179 of 217) and 66% (N=147 of 223) of California hospitals subject to the CCHD screening policy reported CCHD screening data, respectively (average 74% over both years; Figure 1). One hundred twenty-two hospitals submitted data in both years. In 2015, the submitted data represented 375,283 live births, or 76% of the 491,882 registered live births in Vital Records that year. In 2016, the submitted data represented 318,424 live births, or 65% of the 488,925 registered live births in Vital Records that year (average 346,854/490,404 = 71%; Table 1). Assuming a CCHD incidence of 25/10,000 live births in California,14 an estimated 359 newborns with CCHD per year were missed or unaccounted for in 2015 and 2016 due to non-reporting (Figure 1).
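As a back-of-the-envelope check, this estimate follows directly from the average reporting gap in Table 1 and the assumed incidence:

```latex
\underbrace{(490{,}404 - 346{,}854)}_{\text{unreported live births per year}} \times \frac{25}{10{,}000}
\approx 143{,}550 \times 0.0025 \approx 359 \text{ expected CCHD cases per year}
```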

Figure 1. Illustration of Critical Congenital Heart Disease Screen Reporting in California.


All values are averages based on the two years of data. “CCHD screening results” show the frequencies of reported live births for each category. “Missed screen” is defined as the difference between the reported number of newborns meeting screening criteria and the reported number of screens completed. The frequency of “missed screen” may be larger, as some hospitals reported more screens completed than results and vice versa, which also leaves screening status completely unknown for some newborns. Consistent screening data are defined as hospital submissions with matching numbers of screens completed and screen results. Expected cases of CCHD are calculated based on an incidence of 25/10,000 live births14 for both total California births (Vital Statistics) and live births reported to the Department of Health Care Services.

CCHD = Critical Congenital Heart Disease, DHCS = Department of Health Care Services, the department to which all hospitals are to report screening data. False +ve = false positive

Table 1.

Reported Critical Congenital Heart Disease Screening in California 2015–2016

| | Year 2015 | Year 2016 | Annual Average^a |
| --- | --- | --- | --- |
| Live births reported to Vital Records | 491,882 | 488,925 | 490,404 |
| Department of Health Care Services data | | | |
| Live births reported to Department of Health Care Services, N | 375,283 | 318,424 | 346,854 |
| Outborn NICU admissions, N | 9,248 | 9,567 | 9,408 |
| Total newborns | 384,531 | 327,991 | 356,261 |
| Not screened due to exclusion criteria, N (% total newborns) | 10,528 (2.7%) | 11,775 (3.6%) | 11,152 (3.1%) |
|  Echocardiogram obtained before screening completed, N (% total newborns) | 4,683 (1.2%) | 4,561 (1.4%) | 4,622 (1.3%) |
|  Parents declined screening, N (% total newborns) | 117 (0.03%) | 110 (0.03%) | 114 (0.03%) |
|  Transferred before screening completed, N (% total newborns) | 4,922 (1.3%) | 6,468 (2%) | 5,695 (1.6%) |
|  Died before screening completed, N (% total newborns) | 806 (0.2%) | 636 (0.2%) | 721 (0.2%) |
| Total meeting screening criteria (total newborns minus not screened due to exclusion criteria), N (% total newborns) | 374,003 (97.3%) | 316,216 (96.4%) | 345,110 (96.9%) |
| Total newborns screened, N (% of newborns meeting screening criteria) | 362,150 (96.8%) | 314,539 (99.5%) | 338,345 (98%) |
|  Pass screen, N (% of screened)^b | 358,676 (99%) | 310,509 (98.7%) | 334,593 (98.9%) |
|  Failed screen, N (% of screened)^b | 804 (0.2%) | 435 (0.1%) | 620 (0.2%) |
|   Failed screens resulting in diagnoses of CCHD, N (% of failed screens) | 171 (21.2%) | 106 (24.4%) | 139 (22.4%) |
| Data presented as N/10,000 live births^c | | | |
| Reported total newborns screened | 9,650 | 9,878 | 9,764 |
|  Pass screen | 9,557 | 9,751 | 9,654 |
|  Failed screen | 21 | 14 | 18 |
|   Failed screens resulting in diagnoses of CCHD | 4.6 | 3.3 | 4 |
| Reported total newborns not screened | 281 | 370 | 326 |
|  Echocardiogram obtained before screening completed | 125 | 143 | 134 |
|  Parents declined screening | 3 | 3 | 3 |
|  Transferred before screening completed | 131 | 203 | 167 |
|  Died before screening completed | 21 | 20 | 21 |
^a Annual average calculated based on 2015 and 2016 data.

^b Pass and fail screen results do not sum to the number of completed screens, and the percentages of pass and fail screens do not sum to 100%, because some hospitals reported more or fewer screen results than screens completed.

^c Live births reported to the Department of Health Care Services were used for these calculations.

CCHD = critical congenital heart disease

When totaling the reported live births and the outborn NICU admissions, 94.2% and 95.9% of these newborns were reportedly screened in 2015 and 2016, respectively (Table 1). When removing the newborns that met screening exclusion criteria, 96.8% and 99.5% of newborns still meeting screening criteria were reportedly screened in 2015 and 2016, respectively (Table 1). The pass rate was similar over the two years, with 99% and 98.7% of screened newborns passing the screen in 2015 and 2016, respectively. In 2015 and 2016, 0.2% and 0.1% of screened newborns failed the screen, which was 21/10,000 and 14/10,000 live births reported to DHCS, respectively. Note that the frequencies of passed and failed screens do not sum to 100% due to inconsistencies in the submitted data (i.e., the reported number of screens completed does not equal the number of screen results reported). Among those that failed the CCHD screen, 21.2% and 24.4% resulted in a CCHD diagnosis in 2015 and 2016, respectively. The number of failed screens resulting in a CCHD diagnosis was 4.6/10,000 live births reported to DHCS in 2015 and 3.3/10,000 in 2016. Based on a CCHD incidence of 25/10,000 live births in California,14 1,734 newborns with CCHD would be expected among the 693,707 births reported to DHCS during the two years (approximately 867/year; Figure 1). However, only 277 newborns (139/year) with CCHD were reported to DHCS by screening hospitals (Table 1, Figure 1).
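Under the same incidence assumption, the expected case count among the live births that were reported to DHCS can be set against the 277 diagnoses actually reported:

```latex
693{,}707 \times \frac{25}{10{,}000} \approx 1{,}734 \text{ expected cases over 2015--2016}
\;(\approx 867/\text{year}) \quad \text{vs.} \quad 277 \text{ reported} \;(\approx 139/\text{year})
```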

Hospitals reported that 281/10,000 and 370/10,000 live births were not screened in 2015 and 2016, respectively. The most common reasons for not completing the screen were transfer of the newborn to another facility or completion of an echocardiogram before the screen. Parents declined the screen for 3.1/10,000 and 3.5/10,000 live births in 2015 and 2016, respectively (227 newborns combined over the two years). Additional summary data from the reported CCHD screening data can be seen in Table 1.

Consistency/Inconsistency of Reported CCHD Screening Data

As reports were submitted, DHCS officials directly contacted hospitals to request revisions if there were concerns about inaccuracies (i.e., numbers not “adding up”). The frequency and timing of the requested revisions were not standardized. In 2015, 22 (12% of 179) reporting hospitals submitted at least one revised report; this increased to 67 (46% of 147) in 2016 (Table 2). The mean number of revisions per hospital in both years was 1. The range of revisions was 1–2 in 2015 and 1–3 in 2016. In 2016, two hospitals submitted 3 revisions, for a total of 4 reports each.

Table 2.

Consistency/Inconsistency of Reported Critical Congenital Heart Disease Screening in California 2015–2016

| All Reporting Hospitals | Year 2015 (N=179) | Year 2016 (N=147) | Annual Average^a |
| --- | --- | --- | --- |
| Hospitals submitting at least one revision, N (%) | 22 (12%) | 67 (46%) | 29% |
| Hospitals reported live births that matched in two separate databases^b, N (%) | 39 (22%) | 32 (22%) | 22% |
| Hospitals reported live births differed by 5% or more in two separate databases^b, N (%) | 21 (12%) | 19 (13%) | 13% |
| Hospitals reported live births differed by 10% or more in two separate databases^b, N (%) | 9 (5%) | 12 (8%) | 7% |
| Hospitals reported number of screens matched number of reported results, N (%) | 82 (46%) | 66 (45%) | 46% |
| Hospitals reported number of completed screens differed from reported screen results by 5% or more, N (%) | 25 (14%) | 13 (9%) | 12% |
| Hospitals reported number of completed screens differed from reported screen results by 10% or more, N (%) | 20 (11%) | 8 (5%) | 8% |

| Hospitals that submitted revised reports | Year 2015 (N=22) | Year 2016 (N=67) | Annual Average^a |
| --- | --- | --- | --- |
| Hospitals reported live births that matched in two separate databases^b, N (%) | | | |
|  Initial report | 4 (18%) | 8 (12%) | 15% |
|  Final report | 4 (18%) | 9 (13%) | 16% |
| Hospitals reported live births differed by 5% or more in two separate databases^b, N (%) | | | |
|  Initial report | 2 (9%) | 8 (12%) | 11% |
|  Final report | 1 (5%) | 5 (7%) | 6% |
| Hospitals reported number of screens matched number of reported results, N (%) | | | |
|  Initial report | 6 (27%) | 8 (12%) | 20% |
|  Final report | 13 (59%) | 19 (28%) | 44% |
| Hospitals reported screens differed from reported screen results by 5% or more, N (%) | | | |
|  Initial report | 5 (23%) | 15 (22%) | 23% |
|  Final report | 1 (5%) | 2 (3%) | 4% |
^a Annual average calculated based on 2015 and 2016 data.

^b Two databases compared were the Department of Health Care Services (DHCS) and the Office of Statewide Health Planning and Development (OSHPD).

OSHPD = Office of Statewide Health Planning and Development

Data were compared to the secondary database (OSHPD) to assess the consistency of live births reported by hospitals to the two different databases. The reported live births matched in the two databases for 39 (22% of 179) and 32 (22% of 147) reporting hospitals in 2015 and 2016, respectively (Table 2). The live births reported to the two databases differed by 5% or more for 21 (12% of 179) and 19 (13% of 147) reporting hospitals in 2015 and 2016, respectively. A minority of hospitals reported the same number of live births in the two separate databases, even when submitting revised CCHD reports (Table 2).

Another consistency measure addressed whether the number of screen results provided matched the number of reported screens. Less than half of reporting hospitals in both years provided a number of screen results that matched the number of reported screens (82/179 or 46% in 2015 and 66/147 or 45% in 2016). In 2015, the hospital-level differences between the number of screen results and the number of screens completed ranged from −3,218 to 3,107; in 2016, this range was −115 to 1,353. Some hospitals reported more results than screens completed, and some did not provide results for all completed screens. Among hospitals submitting revisions, the number of hospitals with matching reported screens completed and screen results improved from the initial reports to the final reports (27% to 59% in 2015 and 12% to 28% in 2016) (Table 2). Among the 122 hospitals that submitted CCHD screening data in both years, there was no significant difference between the two years in markers of consistency such as live births matching in the two databases and screens completed matching screen results provided (Table 3).

Table 3.

Consistency/Inconsistency of Report Among Hospitals Submitting Reports Both Years

| | Year 2015 (N=122) | Year 2016 (N=122) | p value |
| --- | --- | --- | --- |
| Hospitals reported live births that matched in two separate databases^a, N (%) | 25 (20%) | 24 (20%) | 0.9 |
| Hospitals reported live births differed by 5% or more in two separate databases^a, N (%) | 17 (14%) | 12 (10%) | 0.3 |
| Hospitals reported live births differed by 10% or more in two separate databases^a, N (%) | 8 (7%) | 7 (6%) | 0.8 |
| Hospitals reported number of screens matched number of reported results, N (%) | 53 (43%) | 55 (45%) | 0.8 |
| Hospitals reported number of completed screens differed from reported screen results by 5% or more, N (%) | 14 (11%) | 8 (7%) | 0.2 |
| Hospitals reported number of completed screens differed from reported screen results by 10% or more, N (%) | 12 (10%) | 6 (5%) | 0.1 |
^a Two databases compared were the Department of Health Care Services (DHCS) and the Office of Statewide Health Planning and Development (OSHPD).

To compare characteristics of hospitals with consistent and inconsistent reports, we compared live births and NICU admissions between hospitals whose number of screens completed matched their number of screen results and those whose numbers did not match. Hospitals with matching numbers of screens completed and screen results had lower mean live births (1,515 vs 2,638, p < 0.001) and fewer NICU admissions (180 vs 288, p = 0.001) than hospitals without matching screen results. Total NICU admissions, inborn admissions, and outborn admissions were significantly higher among hospitals whose screen results did not match their reported number of screens completed.
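The Methods specify Fisher's exact and chi-squared tests only for categorical comparisons, so the test behind these group means is not stated. Purely as an illustration, a two-sample comparison of hospital volume between consistent and inconsistent reporters might look like the sketch below; the data are simulated toy values, and the choice of Welch's t-test is an assumption, not the authors' documented method.

```python
# Hypothetical comparison of hospital volume between hospitals whose reported
# screens matched their reported results and those whose did not.
# Welch's t-test is an assumed choice; the paper does not state the test used.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
live_births_consistent = rng.normal(1515, 400, size=60)     # toy values
live_births_inconsistent = rng.normal(2638, 700, size=60)   # toy values

stat, p = ttest_ind(live_births_consistent, live_births_inconsistent,
                    equal_var=False)  # Welch's t-test (unequal variances)
print(f"mean consistent = {live_births_consistent.mean():.0f}, "
      f"mean inconsistent = {live_births_inconsistent.mean():.0f}, p = {p:.3g}")
```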

Discussion

The purpose of this review was to assess the quality of CCHD screening data in California. As a result of our findings, we call for a change in California’s CCHD screening policy. Our review reveals uncertainty about the status of CCHD screening and the utility of the collected CCHD screening data in the state. Up to a third of California birth hospitals do not comply with the requirement to submit CCHD screening data, raising questions about whether screening is being performed and, when it is, about the accuracy of the screening program. When hospitals do submit reports, data inconsistencies are common, which makes CCHD surveillance challenging and limits research and quality improvement efforts to improve the screen and to gain a clearer understanding of both its clinical and public health impact. Policy change is needed to save and improve the lives of newborns with CCHD and should include: 1) mandatory CCHD screening, 2) requirements for high-quality centralized patient-level data, and 3) funding to support newborn CCHD screening, data reporting, and follow-up (Figure 2).

Figure 2. Illustration of Our Call to Action for Mandatory Critical Congenital Heart Disease Screening with High-Quality Patient-Level Data and Funding.


Dashed lines represent that the intervention is expected to have an inhibitory effect on the component at the thickened bar end of the dashed line. The solid lines represent an expected positive effect of the intervention on the component at the end of the arrow. These effects may be direct or indirect.

CCHD = critical congenital heart disease, CoA = coarctation of the aorta

Mandatory vs non-mandatory screening policies

California is currently the only state with a policy of “offering” CCHD screening, whereas others mandate that screening be done.8 Mandated screening policies have been associated with decreased early infant mortality, while states with non-mandatory policies have not shown a similar reduction.9 In fact, states with non-mandatory policies had an early infant death rate from CCHD more than two times higher.9 While parents may decline screening in states with mandatory screening policies, the less stringent non-mandatory policy may translate to more parents declining the screen. At first glance, the non-mandatory aspect may seem insignificant in our data, as only 277 parents declined CCHD screening during the two years. However, that computes to an average rate of 3/10,000 live births not screened due to parental refusal, which could mean missing a CCHD diagnosis as often as every 3 years in the state.14 Note that this estimate assumes the unscreened population has the same risk of CCHD as the entire birth population, an assumption we do not have data to support or refute. Additionally, this policy may have a greater impact on patients who already face health disparities in CCHD diagnosis or who have lower health literacy.15,16 Therefore, deaths and missed CCHD diagnoses will occur due to the parental refusal of screening permitted under a non-mandatory policy.
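As a rough check of the parental-refusal estimate, applying the same incidence assumption to the 227 declined screens over the two years gives:

```latex
\frac{227}{2\ \text{years}} \times \frac{25}{10{,}000} \approx 114 \times 0.0025
\approx 0.3 \text{ expected CCHD cases per year}
```

i.e., roughly one potentially missed case every three to four years, consistent with the estimate above.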

Deaths and missed CCHD diagnoses may also be occurring in California because hospitals are not performing the screen. Based on registered live birth Vital Records data, the screening data reported to DHCS omitted, on average, 29% of the state’s live births each year. Thus, assuming a CCHD incidence of 25/10,000 in California,14 approximately 359 newborns with CCHD could be missed or unaccounted for each year due to non-reporting. We cannot say that the screen was omitted for all of the unreported live births. However, the omission of data, in combination with a non-mandatory policy, raises concerns about whether all babies are being screened for CCHD.

Need for centralized patient-level data for research and quality improvement

In addition to unreported screening data, the reported data in our review often contained inconsistencies. Hospital-level aggregate data, as opposed to patient-level data, have limited applications and are further limited when the data contain inconsistencies. CCHD screening is unique compared to other conditions on the newborn RUSP: a positive CCHD screen requires immediate action, often within hours rather than days, weeks, or months. Thus, centralized patient-level data collection, standardized and frequent data review, and intervention on the process can help ensure appropriate follow-up and timely intervention for newborns with CCHD, while also supporting surveillance, research, and quality improvement efforts to improve the screen.

Two non-physiologic areas in which the screen can improve are algorithm misinterpretations and healthcare disparities. States with centralized patient-level data have been able to identify issues with poor training, equipment concerns, and misinterpretations of the screening algorithm, including misinterpretations despite the addition of automation cues.17,18 Misinterpretations of the CCHD screen carry a risk of misclassifying a newborn with CCHD as normal and can only be identified and fixed with review of high-quality data. Additionally, racial disparities have been noted in CCHD diagnoses, with minority children being diagnosed at later ages.15,16 High-quality surveillance data could help inform interventions to mitigate these disparities, and standardization of care decreases healthcare inequities.19 Examples include quality improvement efforts that standardize data collection, rapidly review the data and implement modifications, and then repeat the cycle to follow up on the results of those modifications.

Centralized and accessible patient-level data can also facilitate research related to acyanotic CCHD defects commonly missed by oxygen saturation screening and provide insights into other conditions detected because of asymptomatic hypoxemia. For example, the types of CCHD most commonly missed by current screening methods are acyanotic defects such as coarctation of the aorta.20 Studies suggest detection of some CCHD types may improve with the addition of non-invasive measurements of perfusion.21–25 Additional research in this area will require large cohorts and thus will benefit from centralized and accessible data. Access to statewide linked data will enhance understanding of long-term outcomes among screened vs. not screened infants with CCHD. In addition, it will provide tools to devise enhanced CCHD screening techniques to detect perfusion defects such as coarctation of the aorta that are commonly missed under the current algorithm (Figure 2). As the screening algorithm changes to improve CCHD detection, monitoring of other conditions detected by the current screen will be necessary as well. For example, many “positive” screens are from other serious conditions such as non-critical congenital heart disease, sepsis, persistent pulmonary hypertension, and other “secondary conditions.”26,27 As screening policies and algorithms adjust, the impact on these secondary, and very important, conditions will have to be considered and tracked with high-quality data.

A lack of patient-level CCHD screening data is not unique to California; our review is simply the first to highlight how little utility aggregate data can serve. As of 2018, only 80% of US jurisdictions were receiving any CCHD screening data.8 Nineteen (37% of 51) jurisdictions were receiving all screening data as individual-level data.8 Individual-level screening data alone are not sufficient; data to identify false negatives are necessary as well. California’s policy required cardiac centers to submit patient-specific data on newborns with CCHD. However, we were unable to evaluate the data reported by regional cardiac centers because those data were even more underreported and inconsistent, and our request for them was not fulfilled. Use of birth defect registries to estimate or identify false negative screens is performed by over a third of states and has been effective, but it is time-consuming (up to 6 months to retrospectively collect data, with analysis up to 2.5 years later).8,28,29 In addition, birth defect registries may be incomplete, and the majority of data entry is performed manually.8,28,29 Integration of screening data and birth defect data benefits from collaboration. For example, in Minnesota, staff for the CCHD screening and birth defect surveillance programs work together regularly, allowing rapid reporting of infants detected by CCHD screening to the birth defect surveillance program and identification of cases undetected by screening.8

Potential ways to achieve high-quality patient-level CCHD screening data

The mode of data collection and the types of data collected have been challenging for CCHD screening. The approach has not been standardized across states despite a notable need for standardization.8,30 Inconsistency and low reporting compliance in California may be related to a single mode of data collection, a state-specific paper form. A review of six states’/regions’ CCHD screening programs noted that all six used three or more mechanisms for data collection: newborn bloodspot card, state-specific paper form, state-specific electronic CCHD module, Health Level-7 (HL7) messaging with automatic file transfer, aggregate data collection, electronic birth certificate, birth defect registries, hospital electronic medical record, and Vital Records.31 Carefully conceived automatic or electronic reporting would plausibly improve reporting compliance and consistency in California. However, programs utilizing these automatic electronic processes still encountered challenges such as difficulty with data use agreements, data quality problems from file transfers, algorithm interpretation errors, missed screens (NICU, home births, birthing centers), and data reported in different formats from hospitals.31 Our data suggest that larger institutions with higher delivery and NICU admission rates have more data errors; thus, reviewing reasons for data errors among larger hospitals may provide some insight into possible interventions.
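As one illustration of how an electronic reporting pathway could catch the kinds of inconsistencies described in the Results at the point of entry, rather than through post hoc revision requests, the sketch below encodes a few simple validation rules. The field names, rules, and thresholds are assumptions for illustration only; they are not the DHCS form fields or any existing California reporting system.

```python
# Hypothetical pre-submission checks for an aggregate CCHD screening report.
# Field names and rules are illustrative, not the DHCS form fields.
def validate_report(report: dict) -> list[str]:
    errors = []

    # Screen results should account for every completed screen.
    results = report["passed"] + report["failed"]
    if results != report["screens_completed"]:
        errors.append(
            f"screen results ({results}) do not equal screens completed "
            f"({report['screens_completed']})"
        )

    # Completed screens plus documented exclusions should not exceed the
    # number of newborns eligible for screening.
    eligible = report["live_births"] + report["outborn_nicu_admissions"]
    accounted = report["screens_completed"] + report["excluded"]
    if accounted > eligible:
        errors.append(
            f"screens plus exclusions ({accounted}) exceed total newborns ({eligible})"
        )

    # CCHD diagnoses can only arise from failed screens.
    if report["cchd_diagnoses"] > report["failed"]:
        errors.append("more CCHD diagnoses than failed screens")

    return errors


example = {
    "live_births": 2500, "outborn_nicu_admissions": 60,
    "screens_completed": 2480, "passed": 2470, "failed": 8,
    "excluded": 70, "cchd_diagnoses": 2,
}
for problem in validate_report(example):
    print("Reject submission:", problem)
```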

Attempts to standardize data collection, along with technical assistance programs, exist for CCHD screening. In fact, the California Maternal Quality Care Collaborative (CMQCC) launched a CCHD reporting tool in Fall 2018 that matches the state reporting form within its Maternal Data Center. CMQCC is a multi-stakeholder organization focused on reducing morbidity, mortality, and racial disparities in California maternity care through research, quality improvement toolkits, and outreach. Unfortunately, the CCHD reporting tool has been used minimally: fifteen hospitals used it for 2018 CCHD data and, as of April 2020, 33 hospitals had used it to report 2019 CCHD data.32 While this tool may help improve the compliance and consistency of submitted data in California, the lack of standardization across states and the lack of individual-level data remain. Efforts to standardize data collection, create a data repository, and offer technical support across states exist as well. The Newborn Screening Technical Assistance and Evaluation Program (NewSTEPs) is an example of a national attempt to standardize and centralize data and to provide CCHD screening lessons learned and technical assistance to states.33 To improve CCHD screening, states could benefit from working together and learning from one another to replicate exemplary programs.

This study has mainly addressed the quality of CCHD screening data reported to the state. While quality data form the basis of evaluation and intervention, appropriate and coordinated testing and therapeutic strategies are equally important. Quality improvement through the statewide California Perinatal Quality Care Collaborative (CPQCC) or CMQCC that addresses all steps of the CCHD screening process (timely conduct of the test, interpretation of results, data submission, and appropriate referral for definitive treatment) is important.

Need for funding

Centralized, high-quality, patient-level data will require funding, and a likely barrier to reporting compliance and consistency is the lack of it. Financial support was not provided to hospitals when California passed its CCHD screening policy in 2013, a time that overlapped with federal funding of six states/regions from the Health Resources and Services Administration (HRSA) to develop, disseminate, and validate CCHD screening protocols and infrastructure. These states/regions were funded $300,000 annually from 2012 to 2015 to meet those aims.31 While funding was provided to these programs to develop CCHD screening programs, some ceased or significantly reduced data collection when the funding ended.31 Therefore, upfront funding and planning to develop a data collection process that is self-sustaining, at least in part, would be beneficial in the long run. For example, Minnesota was granted a legislative fee increase for its NBS program, which was specifically used to implement a statewide health informatics system to automate CCHD screening reporting from hospitals to the state department of health.

Study limitations

There are several limitations to our study. We were only able to review two years of data, as those were the only data available at the time of our request, and we were advised that obtaining additional data would take several months to over a year. Of note, however, there were no changes to the reporting requirement or mode of submission until Fall 2018, when a new reporting mechanism was introduced, as discussed above. We were unable to evaluate data reported by regional cardiac centers because those data were even more underreported and inconsistent, and our request for them was not fulfilled; we are therefore unable to estimate the false negative rate for CCHD screening in California. Additionally, the reported birth hospital data were submitted in aggregate rather than at the patient level, representing another verification challenge. Lastly, there may be reporting bias from self-reported data, and the data submitted to DHCS do not capture CCHD screening practice at non-reporting hospitals. All of these limitations support our call to action to improve these data by changing the CCHD screening policy to a mandated policy with high-quality centralized patient-level data, which will require funding.

Conclusion

A third of California hospitals are not reporting CCHD screening results despite a mandate to do so, and approximately half of those that do report have data inconsistencies. California’s policy mandates “offering” CCHD screening but not that it be performed; other states mandate that the screen be performed and have shown a positive impact of this requirement. Given the life-or-death nature of early CCHD detection, it is prudent that steps be taken in California to optimize screening, evaluation, accurate data collection, reporting, confirmation of diagnosis, referral, and therapy, so that healthcare and public health authorities can advance the CCHD screen and help infants with CCHD live full lives.

Acknowledgements

Thank you to Pooria Assadi and Khae Saechao for their invaluable assistance with data management. Data were obtained from DHCS under data use agreement number 17-07-03. The findings and conclusions in this article are those of the authors and do not represent the views or opinions of the State of California.

Funding Information

This project was funded by the University of California, Davis Department of Pediatrics. Additionally, Dr. Siefkes’s effort was supported by the National Center for Advancing Translational Sciences (NCATS), National Institutes of Health (NIH) (through grant UL1 TR001860 and linked award KL2 TR001859) and by the Eunice Kennedy Shriver National Institute of Child Health & Human Development (NICHD) (1R21 1HD099239-01). Dr. Kair’s effort was supported by NCATS, NIH (through grant UL1 TR001860) and the National Institutes of Health Building Interdisciplinary Research Careers in Women’s Health Program (K12HD051958). Dr. Lakshminrusimha was supported by NICHD, NIH (5R01 HD072929-09). The contents of this publication are solely the responsibility of the authors and do not represent the official views of the NIH. Funded by the National Institutes of Health (NIH).


Financial Disclosure: Annamarie Saarinen is a consultant for Masimo doing economic modeling and health outcomes research. This consulting work began after this study was designed, and Masimo did not have input on the design of this study or the interpretation of the results. The authors have no other financial relationships relevant to this article to disclose.

Abbreviations:

NBS: newborn screening

US: United States

SACHDNC: Secretary of Health and Human Services Advisory Committee on Heritable Disorders in Newborns and Children

RUSP: Recommended Uniform Screening Panel

CCHD: critical congenital heart disease

DHCS: Department of Health Care Services

OSHPD: Office of Statewide Health Planning and Development

NICU: Neonatal Intensive Care Unit

HRSA: Health Resources and Services Administration

Footnotes

Conflicts of Interests

Ms. Saarinen is a consultant for Masimo doing economic modeling and health outcome research. This consulting work began after this study was designed and Masimo did not have input on this study design or interpretation of the results. The authors do not have other conflicts of interest to disclose.

References

  • 1.Watson MS, Mann MY, Lloyd-Puryear MA, Rinaldo P & Howell RR Newborn Screening: Toward a Uniform Screening Panel and System - Executive Summary. Pediatrics 117, S296–S307 (2006). [DOI] [PubMed] [Google Scholar]
  • 2.Advisory Committee on Heritable Disorders in Newborns and Children Recommended Uniform Screening Panel. Available at: https://www.hrsa.gov/advisory-committees/heritable-disorders/rusp/index.html. (Accessed: 15th May 2020)
  • 3.Sontag MK et al. Newborn Screening Timeliness Quality Improvement Initiative: Impact of National Recommendations and Data Repository. PLoS One 15, 1–17 (2020). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Chang R, Gurvitz M & Rodriguez S Missed Diagnosis of Critical Congenital Heart Disease. Arch. Pediatr. Adolesc. Med 162, 969 (2008). [DOI] [PubMed] [Google Scholar]
  • 5.Wren C, Reinhardt Z & Khawaja K Twenty-year Trends in Diagnosis of Life-threatening Neonatal Cardiovascular Malformations. Arch. Dis. Child. Fetal Neonatal Ed 93, F33–F35 (2008). [DOI] [PubMed] [Google Scholar]
  • 6.White KR, Forsman I, Eichwald J & Munoz K The Evolution of Early Hearing Detection and Intervention Programs in the United States. Semin. Perinatol 34, 170–179 (2010). [DOI] [PubMed] [Google Scholar]
  • 7.Sebelius K Secretary of Health & Human Services letter to the Secretary’s Advisory Committee on Heritable Disorders in Newborns and Children (SACHDNC).
  • 8.Glidewell J et al. Actions in Support of Newborn Screening for Critical Congenital Heart Disease - United States, 2011–2018. Morb. Mortal. Wkly. Rep 68, 107–111 (2019). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Abouk R, Grosse S, Ailes E & Oster M Association of US State Implementation of Newborn Screening Policies for Critical Congenital Heart Disease With Early Infant Cardiac Deaths. JAMA 318, 2111–2118 (2017). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Hamilton BE, Martin JA, Osterman MHS, Curtin SC & Matthews TJ Births: Final Data for 2014. Natl. Vital Stat. Rep 64, 1–104 (2015). [PubMed] [Google Scholar]
  • 11.Guidelines for Critical Congenital Heart Disease Screening Services; State of California-Health and Human Services Agency Department of Health Care Services; March 11, 2014.
  • 12.A.B. 1731 Newborn Screening Program: Critical Congenital Heart Disease (Gen Assemb, 2012). [Google Scholar]
  • 13.Kemper AR et al. Strategies for Implementing Screening for Critical Congenital Heart Disease. Pediatrics 128, e1259–67 (2011). [DOI] [PubMed] [Google Scholar]
  • 14.Purkey NJ et al. Birth Location of Infants with Critical Congenital Heart Disease in California. Pediatr. Cardiol 310–318 (2018). doi: 10.1007/s00246-018-2019-0 [DOI] [PubMed] [Google Scholar]
  • 15.Peterson JK, Catton KG & Setty SP Healthcare Disparities in Outcomes of a Metropolitan Congenital Heart Surgery Center: The Effect of Clinical and Socioeconomic Factors. J. Racial Ethn. Heal. Disparities 5, 410–421 (2018). [DOI] [PubMed] [Google Scholar]
  • 16.Oster ME et al. A Population-Based Study of the Association of Prenatal Diagnosis with Survival Rate for Infants with Congenital Heart Defects. Am J Cardiol 113, 1036–1040 (2014). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Engle M, Phimister A, Gaviglio A, Spector L & Lohr J Abstract 17294: The Effect of Statewide Electronic Reporting on Pulse Oximetry Screening in Newborns. Circulation 138, A17294 (2018). [Google Scholar]
  • 18.Kochilas LK et al. Implementation of Critical Congenital Heart Disease Screening in Minnesota. Pediatrics 132, e587–94 (2013). [DOI] [PubMed] [Google Scholar]
  • 19.Wyatt R, Laderman M, Botwinick L, Mate K & Whittington J Achieving Health Equity: A Guide for Health Care Organizations. IHI White Paper (Institute for Healthcare Improvement, 2016). [Google Scholar]
  • 20.Ailes EC, Gilboa SM, Honein MA & Oster ME Estimated Number of Infants Detected and Missed by Critical Congenital Heart Defect Screening. Pediatrics 135, 1000–1008 (2015). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Siefkes H et al. Oxygen Saturation and Perfusion Index-Based Enhanced Critical Congenital Heart Disease Screening. Am. J. Perinatol 37, 158–165 (2020). [DOI] [PubMed] [Google Scholar]
  • 22.de-Wahl Granelli A & Ostman-Smith I Noninvasive Peripheral Perfusion Index as a Possible Tool for Screening for Critical Left Heart Obstruction. Acta Paediatr. 96, 1455–1459 (2007). [DOI] [PubMed] [Google Scholar]
  • 23.Schena F et al. Perfusion Index and Pulse Oximetry Screening for Congenital Heart Defects. J. Pediatr 183, 74–79 (2017). [DOI] [PubMed] [Google Scholar]
  • 24.Palmeri L et al. Photoplethysmographic Waveform Characteristics of Newborns with Coarctation of the Aorta. J. Perinatol 37, 77–80 (2017). [DOI] [PubMed] [Google Scholar]
  • 25.Sorenson M, Sadiq I, Clifford G, Maher K & Oster M Using Pulse Oximetry Waveforms to Detect Coarctation of the Aorta. BioMed Eng OnLine 19, (2020). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Singh A, Rasiah SV & Ewer AK The Impact of Routine Predischarge Pulse Oximetry Screening in a Regional Neonatal Unit. Arch. Dis. Child. Fetal Neonatal Ed 99, 297–302 (2014). [DOI] [PubMed] [Google Scholar]
  • 27.Oster ME et al. Lessons Learned From Newborn Screening for Critical Congenital Heart Defects. Pediatrics 137, e20154573–e20154573 (2016). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Case AP, Miller SD & Mcclain MR Using State Birth Defects Registries to Evaluate Regional Critical Congenital Heart Disease Newborn Screening. Birth Defects Res. 109, 1414–1422 (2017). [DOI] [PubMed] [Google Scholar]
  • 29.Anderka M et al. Development and Implementation of the First National Data Quality Standards for Population-Based Birth Defects Surveillance Programs in the United States. BMC Public Health 15, 1–7 (2015). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Olney RS & Botto LD Newborn Screening for Critical Congenital Heart Disease: Essential Public Health Roles for Birth Defects Monitoring Programs. Birth Defects Res. A. Clin. Mol. Teratol 94, 965–9 (2012). [DOI] [PubMed] [Google Scholar]
  • 31.Mcclain MR et al. Critical Congenital Heart Disease Newborn Screening Implementation : Lessons Learned. Matern Child Heal. J 21, 1240–1249 (2017). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Castles A Direct Communication with Anne Castles, Associate Director, Maternal Data Center, California Maternal Quality Care Collaborative. (2020). [Google Scholar]
  • 33.Ojodu J et al. NewSTEPs: The Establishment of a National Newborn Screening Technical Assistance Resource Center. Int. J. Neonatal Screen 4, 1–10 (2017). [DOI] [PMC free article] [PubMed] [Google Scholar]
