Introduction
Past and current economic evaluations of school-based sealant programs (SSPs) have leveraged the existing clinical trial data on sealant effectiveness to support SSP implementation (Gooch et al. 2009; Griffin et al. 2016; Griffin et al. 2017). These economic evaluations have been important to demonstrate the potential value of SSPs and to attract scarce funding dollars for implementation (Griffin et al. 2014; Griffin et al. 2018). However, given the broad recommendations to implement SSPs (Gooch et al. 2009; Griffin et al. 2016; Griffin et al. 2017), there is now a need to evaluate how variation in SSP design and implementation affects oral health outcomes and costs. Such an analysis may help address the stall in SSP implementation efforts as of 2015 (Pew Charitable Trusts 2015). This requires direct data on SSP effectiveness rather than sealant efficacy.
The consent process in SSPs may be a major component of SSP effectiveness that has yet to be examined. Passive and active consent procedures both notify children’s parents or guardians of an ongoing study; a parent’s or guardian’s signature is then required to decline (passive consent) or accept (active consent) program participation. Active consent is typically needed for any program providing care; hence, SSPs typically require active consent procedures.
Students with active consent to participate in research may systematically differ from those without consent. Additionally, active consent procedures for research may exclude individuals who would otherwise participate in clinical care. The study sample then is not representative of the entire school population or even all those who would participate in clinical care within the school population. The resulting bias in the study sample is called self-selection bias, and it threatens internal and external validity. Assessing the impact of SSPs on school populations when the school population is not well represented in the sample is difficult.
An additional complication is the variation across institutional review boards (IRBs) in consent requirements. A recent study examined the consent strategies required by 24 local IRBs (Higgerson et al. 2014). Although the submitted IRB applications were identical across the 24 IRBs, the researchers found that IRBs interpreted federal regulations regarding passive consent, and parental acceptance of passive consent strategies, inconsistently. Standards for active consent procedures may likewise vary across IRBs, depending on the local setting (Silverman et al. 2001). The resulting variation in consent procedures may lead to differences in the magnitude of self-selection bias across SSPs collecting data for research.
Hence, it becomes important that economic evaluations report clearly on potential self-selection bias in the underlying data and provide information about the consent procedures used to procure the data. Otherwise, decision making could be ill-informed. At best, economic evaluations with unreported sources of bias are uninformative and not useful for decision making. At worst, biased economic evaluations may lead to suboptimal allocation of scarce resources. In the following discussion, we highlight challenges associated with informed consent in economic evaluations of SSPs and whether they have been addressed in recent economic evaluations, and we provide recommendations for future work.
Assess Program Impact Based on All Students, Not Just Those with Informed Consent
A key tenet of economic evaluations is to conduct intention-to-treat analysis. Because schools typically do not allow for differential targeting of students within their student populations, school-based programs are necessarily conducted schoolwide, and treatment is offered to every student. Hence, the unit for treatment assignment is the school. As a result, based on the intention-to-treat approach for economic evaluations suggested by the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) (Husereau et al. 2013), the incremental impact on cost and health outcomes for all students within schools implementing school-based programs should be considered.
Not using intention-to-treat analysis can significantly bias estimates of the program impact. Suppose that only 20% of children participate in any SSP. Although children participating in the program may experience an 80% reduction in caries relative to what would have occurred without an SSP, the overall reduction in caries within the school is far below 80%: if nonparticipants receive no benefit, the schoolwide reduction is at most 0.20 × 80% = 16%. Furthermore, the 80% reduction in caries due to the SSP may not be generalizable to nonparticipants or to the overall school population. Economic evaluations not accounting for nonparticipation or assuming full participation may therefore be significantly biased in their assessment of SSPs.
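The dilution of program impact under partial participation can be sketched with a simple participation-weighted calculation. The numbers below are the hypothetical values from the example above, not data from any actual SSP.

```python
# Illustrative sketch: how partial participation dilutes the schoolwide
# (intention-to-treat) impact of an SSP. Hypothetical values only.

participation_rate = 0.20        # share of students participating in the SSP
reduction_participants = 0.80    # caries reduction among participants

# Assuming nonparticipants receive no benefit, the schoolwide reduction
# is the participation-weighted average of the two groups' reductions.
schoolwide_reduction = (participation_rate * reduction_participants
                        + (1 - participation_rate) * 0.0)

print(f"Schoolwide caries reduction: {schoolwide_reduction:.0%}")  # 16%
```

The key assumption, flagged in the comments, is zero benefit among nonparticipants; any spillover effects would change the weighting.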
Active Informed Consent in SSPs Prevents Data Collection on Nonparticipating Students
Active consent is required to both administer treatment and collect data on students. Thus, researchers have data only on those students with informed consent, not on those who did not respond.
Active Informed Consent Introduces Ambiguity into Clinical and Economic Evaluations
The self-selection bias from signing versus not signing informed consent affects estimates of the oral health and cost impact of SSPs. The direction and magnitude of this bias are ambiguous.
Variation in Consent Procedures Introduces Additional Ambiguity
When information and data on informed consent protocols are absent, it becomes difficult to quantify and assess what constitutes a “high” or “low” informed consent rate. In other words, are informed consent rates a reflection of the consent process design, community, school, classroom, student, or parent? When clinical and economic evaluations use data from multiple SSPs, this ambiguity creates additional difficulty in interpreting and comparing outcomes across SSPs.
Using data from SSPs with active informed consent may implicitly exclude research-averse children who might otherwise seek care. This exclusion is especially problematic for economic evaluations. First, data generated from SSPs conducting research will likely underestimate the school-level oral health impact that would have occurred had research not been conducted. Second, the direction of the bias for cost estimates is ambiguous: including more children within an SSP improves economies of scale, but children excluded by informed consent procedures for research may be costlier to treat (higher oral health need). Without information to assess the quality of informed consent within and across SSPs, it is difficult to assess whether economic evaluations relying on data from SSPs are prone to bias due to the informed consent procedures. Such bias directly affects the external validity of economic evaluations of SSPs.
Reporting on Informed Consent Procedures and Participation Rates
Because economic evaluations are intended to inform decision making, researchers conducting them should report and account for the reliability of the underlying studies and data. This will provide needed information to decision makers about how to interpret economic evaluations. However, researchers conducting economic evaluations may be hard-pressed to do so. A systematic review of the overall school-based prevention and intervention literature, including ~500 studies, found widely ranging rates of participation, between 11% and 100%, with a mean rate of 65.5% among studies reporting consent procedures and participation rates (Blom-Hoffman et al. 2009). Moreover, only 11.5% of studies reported both consent procedures and participation rates. Hence, economic evaluations are unlikely to report participation rates among all referenced studies and account for self-selection bias.
Assessment of SSP Evaluations
We assessed whether economic evaluations in a recent systematic literature review reported and accounted for self-selection bias stemming from informed consent procedures if direct SSP data were used. We found that recent economic evaluations leveraging any data collected directly from SSPs did not account for self-selection bias. Among the 14 articles included in the recent systematic literature review (Griffin et al. 2017), 5 contained direct data on SSPs. Of these 5 articles, only 2 reported complete information on participant loss at all stages of consent (i.e., active consent, active assent, and follow-up) and over every year with data collection. Moreover, no papers 1) collected sufficient data on nonparticipants to conduct intention-to-treat analyses and 2) directly accounted analytically for selection bias in their estimates.
Conclusion and Recommendations
Economic evaluations of SSP effectiveness, as currently published and designed, should be used and interpreted with caution. Self-selection bias may severely compromise both internal and external validity.
The ISPOR recommendation of intention-to-treat analysis implies that an incremental impact on health and cost outcomes for all students within schools implementing school-based programs should be considered (Husereau et al. 2013). Statistical methods to accomplish this are available (Hahn et al. 2005).
Additionally, the use of passive consent procedures to collect oral health data, especially through oral health screenings, is not unprecedented. Several states (Wisconsin, Colorado, Oklahoma) have used passive consent to deliver health surveys or screenings (Calanan et al. 2012; Wisconsin Department of Health Services 2013; Oklahoma State Department of Health Dental Health Service 2016). When queried about whether active consent is required prior to a public school employee conducting dental screenings of students, the US Department of Education’s Family Policy Compliance Office responded that as long as dental instruments are not used, dental screenings would not be considered invasive and would not require active consent (Iowa Department of Education 2012).
Hence, to obtain data on nonparticipants, individuals seeking to collect data within or across SSPs can consider using a 2-pronged approach. One prong would address treatment and data collection among participants via active informed consent. Another prong would separately use passive consent procedures to collect oral health data without intervening clinically on a representative sample of children within schools. Oral health data in both prongs can be collected through survey self-reports (i.e., oral health–related quality of life) or through noninvasive oral health screening. Comparison data among schools without SSPs can also be collected with passive consent procedures for screening and survey self-reports.
To address concerns about self-selection bias in future work, economic evaluations of SSPs should
Evaluate and report on both consent procedures and participation rates
Identify and evaluate potential factors influencing participation within SSPs
Clinically assess nonparticipants via passive consent to determine background health
Use the foregoing information to generate an intention-to-treat analysis.
Author Contributions
S.S. Huang, contributed to conception, design, data analysis, and interpretation, drafted the manuscript; R. Niederman, contributed to data analysis and interpretation, critically revised the manuscript. Both authors gave final approval and agree to be accountable for all aspects of the work.
Acknowledgments
The feedback of participants at the 2018 Methodological Issues in Oral Health Research conference and NYU Epidemiology and Health Promotion research seminar is gratefully acknowledged.
Footnotes
Research reported in this work was partially funded by the National Institute on Minority Health and Health Disparities of the National Institutes of Health under Award Numbers R01MD011526 and U24MD006964 and a Patient-Centered Outcomes Research Institute (PCORI) award (PCS-1609-36824).
The views presented in this publication are solely the responsibility of the authors and do not necessarily represent the official views of the National Institutes of Health or the Patient-Centered Outcomes Research Institute (PCORI), its Board of Governors or Methodology Committee.
The authors declare no potential conflicts of interest with respect to the authorship and/or publication of this article.
References
- Blom-Hoffman J, Leff SS, Franko DL, Weinstein E, Beakley K, Power TJ. 2009. Consent procedures and participation rates in school-based intervention and prevention research: using a multi-component, partnership-based approach to recruit participants. School Ment Health. 1(1):3–15.
- Calanan R, John A, Mauritson K. 2012. The basic screening survey—children’s oral health screening—Colorado, 2011–2012. Reno (NV): Association of State & Territorial Dental Directors; [accessed 2018 Aug 8]. https://www.astdd.org/docs/co-third-grade-bss-screening-2011-2012.pdf.
- Gooch BF, Griffin SO, Gray SK, Kohn WG, Rozier RG, Siegal M, Fontana M, Brunson D, Carter N, Curtis DK, et al. 2009. Preventing dental caries through school-based sealant programs: updated recommendations and reviews of evidence. J Am Dent Assoc. 140(11):1356–1365.
- Griffin S, Naavaal S, Scherrer C, Griffin PM, Harris K, Chattopadhyay S. 2016. School-based dental sealant programs prevent cavities and are cost-effective. Health Aff (Millwood). 35(12):2233–2240.
- Griffin SO, Jones K, Crespin M. 2014. Calculating averted caries attributable to school-based sealant programs with a minimal data set. J Public Health Dent. 74(3):202–209.
- Griffin SO, Jones K, Naavaal S, O’Connell JM, Demopoulos C, Arlotta D. 2018. Estimating the cost of school sealant programs with minimal data. J Public Health Dent. 78(1):17–24.
- Griffin SO, Naavaal S, Scherrer C, Patel M, Chattopadhyay S. 2017. Evaluation of school-based dental sealant programs: an updated community guide systematic economic review. Am J Prev Med. 52(3):407–415.
- Hahn S, Puffer S, Torgerson DJ, Watson J. 2005. Methodological bias in cluster randomised trials. BMC Med Res Methodol. 5:10.
- Higgerson RA, Olsho LE, Christie LM, Rehder K, Doksum T, Gedeit R, Giuliano JS, Brennan B, Wendlandt R, Randolph AG; PALISI PICFlu Study Investigators. 2014. Variability in IRBs regarding parental acceptance of passive consent. Pediatrics. 134(2):e496–e503.
- Husereau D, Drummond M, Petrou S, Carswell C, Moher D, Greenberg D, Augustovski F, Briggs AH, Mauskopf J, Loder E; ISPOR Health Economic Evaluation Publication Guidelines–CHEERS Good Reporting Practices Task Force. 2013. Consolidated health economic evaluation reporting standards (CHEERS)—explanation and elaboration: a report of the ISPOR Health Economic Evaluation Publication Guidelines Good Reporting Practices Task Force. Value Health. 16(2):231–250.
- Iowa Department of Education. 2012. School screenings. Des Moines (IA): Iowa Department of Education; [accessed 2018 Aug 8]. https://www.educateiowa.gov/sites/files/ed/documents/1112_sn_schoolScreenings_v3.pdf.
- Oklahoma State Department of Health Dental Health Service. 2016. Oklahoma oral health needs assessment 2016—third grade children. Oklahoma City (OK): State of Oklahoma; [accessed 2018 Aug 8]. https://www.ok.gov/health2/documents/DentalSurveyFinalReport2016.pdf.
- Pew Charitable Trusts. 2015. States stalled on dental sealant programs. Philadelphia (PA): Pew Charitable Trusts; [accessed 2018 Aug 8]. https://www.pewtrusts.org/en/research-and-analysis/reports/2015/04/states-stalled-on-dental-sealant-programs.
- Silverman H, Hull SC, Sugarman J. 2001. Variability among institutional review boards’ decisions within the context of a multicenter trial. Crit Care Med. 29(2):235–241.
- Wisconsin Department of Health Services. 2013. 2013—healthy smiles/healthy growth—Wisconsin’s third grade children. Madison (WI): Wisconsin Department of Health Services; [accessed 2018 Aug 8]. https://www.dhs.wisconsin.gov/publications/p0/p00589.pdf.
