Author manuscript; available in PMC: 2020 Dec 1.
Published in final edited form as: Child Youth Serv Rev. 2019 Sep 11;107:104498. doi: 10.1016/j.childyouth.2019.104498

Important issues in estimating costs of early childhood educational interventions: An example from the REDI Program

Damon E Jones 1, Karen L Bierman 1, D Max Crowley 1, Janet A Welsh 1, Julia Gest 1
PMCID: PMC6924610  NIHMSID: NIHMS1061675  PMID: 31866702

Abstract

Early childhood education (ECE) interventions hold great promise not only for improving lives but also for producing an economic return on investment linked to key outcomes of program effectiveness. Assessment of economic impact relies on accurate estimates of program costs, derived consistently so that programs can be compared across the field. This task is complicated by limited consensus on how best to determine program costs so that estimates reflect real-world implementation and capture how costs may vary across differing circumstances. Thorough and accurate cost analyses are vital for informing future implementations and for enabling analysis of potential return on investment. In this paper, we present five key issues most relevant to cost analysis for ECE programs that interventionists should address when estimating their programs’ costs. Attention to these issues can lead to comprehensive and thorough cost estimates and potentially increase consistency in cost analyses across the field. These issues are illustrated within the cost analysis of REDI (Research-based, Developmentally Informed), an enrichment program that seeks to extend the benefits of preschool through enhanced classroom and home visiting services. Implications for practice and policy are discussed.

Keywords: cost analysis, early intervention, program evaluation, home visiting

1. Introduction

Income-based gaps in school readiness are evident when children enter kindergarten and remain stable or increase over time, affecting academic attainment, employment, and future earnings (Macmillan, McMorris, & Kruttschnitt, 2004; Reardon, 2011). Motivated by findings suggesting that high-quality early childhood education (ECE) experiences reduce these gaps, federal and state governments have invested heavily in preschool and prekindergarten programs (Friedman-Krauss et al., 2018). Such investments are potentially very important to the lives of children and possibly economically sensible as well. High-quality preschool programs can improve developmental trajectories for some children, increasing the likelihood of eventual educational and workplace success and lowering the likelihood of emotional and behavioral problems (Council of Economic Advisers, 2014; Heckman, 2011; Reynolds, Temple, White, Ou, & Robertson, 2011). Such programs can pay for themselves in the long run, or even provide a positive return on investment to society. However, there is considerable variation in the short- and long-term benefits produced by different ECE interventions, with the quality of curriculum, teaching quality, and parent involvement all affecting impact (Bailey, Duncan, Odgers, & Yu, 2017; Farran & Lipsey, 2016). These variations raise questions about the benefits of different ECE programs for students and, in turn, their possible economic returns to society. Informed decision-making will require accurate estimates of the costs as well as the benefits of different intervention approaches.

Estimation of the costs of ECE interventions is complicated by a number of features that are not fully covered by existing cost assessment guides (Haddix, Teutsch, & Corso, 2003; Levin, McEwan, Belfield, Bowden, & Shand, 2018). Specifically, many of these new interventions aim to enrich existing ECE programs by, to varying degrees, building upon a current infrastructure, using existing staff, including multiple components, and being delivered over several years as they become embedded within existing programming (see also Bowden, Escueta, Muroga, Rodriguez, & Levin, 2018). Estimates of program costs can vary significantly, depending upon the timing of the cost assessment and the judgments made about which costs to include. It is crucial to consider these kinds of fundamental situational aspects of any ECE intervention when planning evaluation of costs and, in turn, potential economic benefits. In this paper, we discuss the key issues affecting the cost analyses of ECE programs, recommending practices that would increase the comparability and transparency of program cost estimates. To illustrate this process, we present a cost analysis for the Head Start Research-based, Developmentally Informed (REDI) intervention.

1.1. The Context of Public Investment in ECE

ECE intervention emerged as an optimal strategy to reduce the income-based achievement gap in the 1960s, when Head Start was initiated (Administration for Children and Families, 2010). Federal and state investments in ECE programs have grown substantially since then, based heavily on cost-benefit analyses of two model programs with long-term follow-up data (the Perry Preschool Project and Abecedarian Project). For example, compared to children who stayed home, children who attended the Perry Preschool or Abecedarian programs completed more years of education, had higher levels of employment and income as adults, and fewer lifetime arrests (Campbell, Ramey, Pungello, Sparling, & Miller-Johnson, 2002; Schweinhart, 2005). Overall, the Committee for Economic Development (2006) estimated that societal cost savings for these “stand alone” preschool programs were $16.14 (for Perry Preschool) and $3.78 (for Abecedarian) for every $1 invested.

The landscape of ECE in the U.S. has changed dramatically since those initial model studies were completed. For example, among American 4-year-olds, rates of attendance at center-based early education programs grew from 10% in 1960 to nearly 70% in 2015 (Rathbun & Zhang, 2016). In addition, the number of states funding public prekindergarten programs for disadvantaged children climbed steadily, rising to 28 states by 1991, 38 states by 2005, and 43 states by 2017. In 2017, states invested over $7.6 billion in prekindergarten programs, at an average cost of $5,008 per child, more than double the investment made just a decade earlier (Friedman-Krauss et al., 2018). In this current context, relevant program evaluations are most likely to compare enrichments to (and variations in) preschool programming, rather than evaluate model preschool programs relative to home care, as in the early studies.

1.2. The Importance of Estimating the Costs of ECE Enrichments

In the context of this dynamic and rapidly developing field of publicly funded ECE, program implementers, educators, and policy-makers are increasingly interested in understanding the economic implications of different program designs. The goals are both to document the potential return on investment (Conti & Heckman, 2014; Council of Economic Advisers, 2014; Heckman, 2012; Zaveri, Burwick, & Maher, 2014) and to maximize the efficiency of ECE programs (National Academies of Sciences Engineering and Medicine, 2016). Well-directed resources can increase the likelihood of a return on investment; more importantly, greater efficiency in program delivery could mean that more children are given the opportunity for the high-quality early education shown to be vital for healthy development. This increased attention is reflected in recent policy trends. The value of ECE is now being formalized in funding structures within the performance-based contracting endeavors of the past several years (e.g., Pay for Success). These initiatives secure funding for programs from private investors, with the expectation of a measured monetary impact based on participant benefits realized within several years of service (Crowley, 2014; Gaylor et al., 2016). Such initiatives are promising and recognize the economic relevance of effective preschool education.

At the same time, little attention has been paid to the complicated challenges of estimating program costs in the context of current ECE programs or enrichments. To do so, the actual costs of delivering program services must be tracked systematically and fully, with attention to expected variation across different settings and timeframes for implementation. In addition, decisions must be made and reported so that others know what was (and was not) included in the cost assessment. Relatively few ECE interventions provide cost estimates. Of those that do, cost information is often incomplete and reported as summary estimates, creating publicly available reports that can be difficult to interpret. Important efforts have been made toward providing such information. For instance, the Department of Health and Human Services established the Home Visiting Evidence of Effectiveness website to provide key information on home visiting programs that serve families with children from birth to age 5, including program costs broken down into a straightforward structure (e.g., average cost per participant, labor costs, materials and forms; http://homvee.acf.hhs.gov/implementations.aspx). Such efforts must rely on individual program leaders to provide cost estimates, however, which can lead to incomplete information based on varying methodologies. While organized information about program costs is needed, its utility may be limited if procedures for calculating and reporting costs are not consistent across programs.

There are resources available to provide guidance for cost analysis, although these may be unfamiliar to ECE researchers given that they originated in other disciplines. Expert panels have produced guidelines, including a step-by-step process for determining costs of social programs produced through the Children’s Bureau of ACF (Calculating the Costs of Child Welfare Services Workgroup, 2013), as well as other recommendations for necessary standards and procedures for sound economic assessment (Crowley et al., 2018; Levin et al., 2018; National Academies of Sciences Engineering and Medicine, 2016). On-line tool-kits also are available to help evaluators determine program costs through a guided process (e.g., CostOut, produced by the Center for Benefit-Cost Studies of Education). These general overviews are meant to apply to a broad area of programming that includes healthcare and education, and they provide a foundation to guide cost assessments of ECE programs. Following such resources should enable ECE researchers to characterize key information about program costs beyond single point estimates and to identify the stakeholders of the cost analysis in order to determine the appropriate perspective for the analysis (Crowley et al., 2018). The presented estimates should consider costs beyond primary materials and personnel costs. In the following example and discussion, we note five key issues that should be addressed when determining cost estimates for ECE enrichment programs, but that are sometimes complicated to assess: 1) phase of the intervention (e.g., how long it has been established), 2) setting (e.g., schools and/or homes), 3) nature of program personnel (including allocating percent time to personnel who participate in program implementation but have other duties as well), 4) intervention components (including the potential for varying levels of intervention, such as with or without a home visiting component), and 5) program recipients.
We note that consideration of these five issues can help ensure an accurate and comprehensive cost analysis for a specific program and, if more broadly adopted, could increase the consistency of cost estimates in the ECE field. These issues reflect and extend the need for more general attention to consistent cost analysis methodology, which is covered in cross-disciplinary resources elsewhere (Crowley et al., 2018).

In order to illustrate how a cost analysis might apply general standards and procedures for sound economic assessment, and also make decisions in the five identified areas that often complicate ECE cost assessments, we next describe the REDI program, which provides an example of a cost analysis within the setting of an ECE enrichment. Framing of the cost analysis sets the stage for listing and then valuing all resources necessary to carry out REDI. Here we adopt a process often labeled an ‘ingredients-based’ approach (Levin et al., 2018), in which all program inputs necessary for the intervention to be effectively delivered are first identified, including necessary activities or events. These ingredients may include resources that do not explicitly appear in program budget files but are still instrumental for successful delivery (e.g., volunteer time). Valuation of program inputs then incorporates direct costs that might be found in budget ledgers as well as unit costs that represent the true value of each resource.

1.3. Preschool Enrichments in the Battle against Fade Out: The REDI program

ECE intervention programs increasingly involve modifying existing programs rather than evaluating model stand-alone programs, primarily by enriching existing programs with stronger instructional curriculum components (Bailey et al., 2015) or expanding outreach to parents (Welsh, Bierman, & Mathis, 2014). Both strategies hold promise for strengthening children’s skills and reducing fade out and were incorporated in the REDI project. In order to boost child social-emotional skill development, REDI enriched Head Start classrooms with an evidence-based social-emotional learning program, the Preschool PATHS Curriculum (Domitrovich, Cortes, & Greenberg, 2007). In order to build child language and literacy skills, REDI included an interactive reading program, Sound Games to build phonological awareness, and print centers to strengthen alphabet knowledge (see Bierman et al., 2008 for more details.) In addition, REDI provided teachers with workshop training and mentored coaching to promote high-quality implementation and positive classroom management strategies (REDI-C program; Bierman et al., 2008). In the theory of change (Schindler, McCoy, Fisher, & Shonkoff, 2019) guiding the design of REDI, the pace of child skill acquisition in social-emotional and language-literacy domains was determined both by instructional materials and by the quality of teacher-child interactions in the classroom; hence providing enriched curriculum components in each area and fostering high-quality teaching and classroom management strategies were viewed as complementary intervention strategies. In a randomized, controlled trial, children in Head Start classrooms using the REDI enrichments relative to “business-as-usual” Head Start scored significantly better on measures of social-emotional skills, learning engagement, vocabulary, and emergent literacy skills at the end of the intervention year (Bierman et al., 2008). 
Three years later, after children transitioned into elementary school and reached second grade, the academic effects had faded, but other benefits remained in the areas of social-emotional skills and learning engagement (Bierman, Heinrichs, Welsh, Nix, & Gest, 2017); these benefits were further sustained through fifth grade (Welsh, Bierman, Heinrichs, Nix, & Gest, in press).

Subsequently, Head Start home visiting was extended and enhanced to reinforce the classroom program and provide families with support as children transitioned from Head Start into kindergarten (REDI-P program). REDI-P was designed to complement and extend REDI-C by bringing learning materials into the home and coaching parents in positive teaching and caregiving behaviors, reflecting a similar theory of change in which we anticipated that child skill acquisition would occur as a function of enriched home learning experiences, including high-quality learning materials and sensitive-responsive, language-rich parent-child interactions. In a randomized controlled trial, children who received REDI-C and REDI-P, relative to children who received REDI-C only, showed significantly better social-emotional skills and academic skills in kindergarten (Bierman, Welsh, Heinrichs, Nix, & Mathis, 2015). At follow-up assessments in second grade, effects on academic skills were sustained and positive effects on child self-perceptions emerged (Bierman et al., 2017). By third grade, significant impacts were found on academic performance, social understanding, reduced problems at home, and reduced need for educational and mental health services at school (Bierman, Welsh, Heinrichs, & Nix, 2018). The success of the REDI program is reflective of other programs that have documented sustained effects in later school years by enriching preschool classroom practice with evidence-based instructional materials and teaching strategies (Clements, Sarama, Wolfe, & Spitler, 2013) and by using more intensive preschool parent engagement programs (Brotman et al., 2011; Reynolds, Temple, Ou, Arteaga, & White, 2011).

Given this documented effectiveness, it is worth understanding the costs to carry out the REDI program, both to prepare for later analyses examining the potential return on investment and to guide future implementations. Understanding the program costs can also illuminate potential efficiencies that could reduce costs in future disseminations. In addition, REDI characterizes several challenges that frequently affect the cost analyses of ECE programs, as it went through initial and sustained phases of implementation, was embedded into existing Head Start programs, included personnel with dual roles, and included distinct components (classroom and home visiting programming) with different (though intertwined) costs and benefits. Thus, this endeavor illustrates what should be considered and offers potentially worthwhile approaches for addressing challenges in cost analysis of ECE enhancement interventions. We set out to examine these costs so that they represent what implementers can expect to incur in a real setting. For REDI, this involved making decisions about which resources should (and should not) be valued in this specific implementation or in consideration of alternative settings, decisions we sought to make transparent. We next describe our method for determining the costs of the program, and then present results with relevant implications given the ECE setting.

2. Materials and Methods

2.1. Participants

In the 2003–04 and 2004–05 school years, the REDI-C intervention was introduced into 25 Head Start centers in three Pennsylvania counties as part of an initial randomized trial. At the end of that trial, control group teachers and local program supervisors were provided with the REDI-C curriculum and trained in its use. Data used for this cost analysis were collected five years later, during the 2008–09 and 2009–10 academic years, during the randomized trial evaluating the REDI-P program. At this time, REDI-C was being implemented in all 64 of the participating classrooms, each led by two teachers (lead, assistant) and containing approximately 17 children. Within these classrooms, parents of prekindergarten children were sent letters describing the REDI-P study. Those who agreed to participate were randomized (within classroom) to receive the classroom plus the additional parent intervention (intervention group, REDI-P and REDI-C) or to receive the classroom intervention only (control group, REDI-C only). Cost assessments for REDI-P are based on 105 families who received intervention home visiting. Hence, cost assessments for REDI-P were conducted when the program was being introduced and implemented for the first time; cost assessments for REDI-C were conducted when the program was already established in classrooms.

2.2. REDI Intervention

REDI-C was implemented by Head Start teachers who were provided with detailed manuals and materials for each of the curriculum components. These included: 1) the Preschool PATHS Curriculum (Domitrovich et al., 2007), a 33-lesson program focused on promoting social-emotional skills (cooperation, emotional understanding, and self-control), 2) an interactive reading program, involving two books each week on topics aligned with the PATHS theme of the week, 3) a set of brief Sound Games designed to teach phonological awareness, organized developmentally, and 4) a set of activities and materials to use in print centers to promote letter knowledge. To assure the maintenance of high-quality implementation, teachers and local supervisors were provided with a one-day “booster” training workshop at the start of each year, and a REDI-C program consultant made monthly classroom visits to answer questions and support quality program implementation.

REDI-P was designed to strengthen the impact of REDI-C by increasing parent support for learning at home as children navigated the transition into kindergarten. Families received 16 home visits (10 during the prekindergarten year; 6 after the transition to kindergarten). Visits followed a well-specified manualized curriculum that included home learning versions of the classroom intervention components: interactive reading, learning games, and pretend play activities that taught PATHS skills and letter knowledge, and supported parent-child conversation (Bierman et al., 2015). To support implementation fidelity, home visitors participated in joint training workshops and weekly supervision calls. In addition, the REDI-P supervisor made a bi-monthly visit to each site, attending 20% of the home visits to provide individual feedback and guidance to each home visitor, and to assure standard intervention implementation across the various home visitors.

2.3. Procedures for Cost Analyses

The initial step in the cost assessment was to identify all program inputs or ingredients (Levin et al., 2018) that were needed for effective intervention delivery. Some ingredients may not explicitly appear in program budget files or have costs attached, but are still instrumental for successful delivery (e.g., volunteer time, donated space). Data collection included a review of all budget accounts for the project, along with interviews with key program staff located at Penn State University, where the program originated (Bierman et al., 2008). This included faculty involved in program development as well as the intervention program supervisor.

We valued program inputs using expenses tracked in detailed university budgets. Costs first needed to be separated based on whether they went toward program implementation versus program evaluation, as this version of REDI involved a research component. In many cases, budget files explicitly identified line item costs as intervention- or research-related. Ideally, the proportion of materials and equipment used for the intervention (versus some other use) can be accurately determined through tracking logs. Where amounts were not clearly identified, allocation was based on the percentage of total project personnel time going toward intervention implementation versus research, based on recommended strategies for valuing shared resources (Calculating the Costs of Child Welfare Services Workgroup, 2013; Foster, Dodge, & Jones, 2003). This allocation weighting was also used to separate costs for non-personnel items that were jointly used for evaluation and implementation (e.g., computers). Salary fringe amounts were allocated based on the proportion of salary dollars that went toward program delivery within each year. Budget detail was also sufficient to identify costs going toward REDI-C versus REDI-P, and to distinguish development costs for REDI-P. Once resources/ingredients were identified and costs determined, necessary cost adjustments were applied to align costs to a single base year (i.e., adjusting for inflation) (Corso & Filene, 2009). For this cost analysis, all dollar amounts were adjusted to 2008 (the first year of the cost assessments) using the Consumer Price Index from the Bureau of Labor Statistics.
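The two quantitative steps just described, allocation weighting for shared resources and adjustment of all amounts to 2008 dollars, can be sketched in a few lines. The dollar amounts and CPI index levels below are illustrative placeholders, not figures from the REDI budgets or the exact BLS index values.

```python
# Sketch of the two adjustments described above. All dollar amounts and
# CPI index levels are hypothetical illustrations, not REDI budget values.

def allocate_shared_cost(total_cost: float, intervention_share: float) -> float:
    """Allocation weighting: charge the intervention only the share of a
    jointly used resource (e.g., a computer) devoted to implementation."""
    return total_cost * intervention_share

def to_base_year(amount: float, cpi_source: float, cpi_base: float) -> float:
    """Restate a nominal dollar amount in base-year dollars via a CPI ratio."""
    return amount * (cpi_base / cpi_source)

# A $1,200 computer used 60% for implementation and 40% for research
implementation_cost = allocate_shared_cost(1200.00, 0.60)

# Restate this 2009 expense in 2008 dollars with placeholder CPI levels
cpi = {2008: 215.3, 2009: 214.5}  # illustrative index values only
adjusted = to_base_year(implementation_cost, cpi[2009], cpi[2008])
print(round(implementation_cost, 2), round(adjusted, 2))  # 720.0 722.69
```

In practice, the intervention share would come from tracking logs or the personnel-time percentages described above, and the CPI values from the published BLS series for the relevant years.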

Finally, we took steps to represent uncertainty in the final estimates. As noted above, the cost allocation decisions used here could differ in other implementation circumstances. This uncertainty can be characterized through sensitivity analyses conducted as part of the overall cost assessment, in order to represent the extent to which cost estimates may vary across settings (Foster, Porter, Ayers, Kaplan, & Sandler, 2007; Jones, Greenberg, & Crowley, 2015).

It is important to consider the format and phase of the intervention in order to define resource needs. Here we characterize costs for REDI-C sufficient for implementation across two years for two cohorts of children, based on cost information collected five years after the introduction of the REDI-C program, and costs for REDI-P during its first years. Two cohorts were involved given the nature of the funding for this project. For longer-term projects, costs could be based on more years of data to examine variation in costs over time (Calculating the Costs of Child Welfare Services Workgroup, 2013). We consider the costs based on these two cohorts to be representative of the typical state of intervention delivery. Program development costs were necessary to carry out REDI-P, but these costs would be unnecessary in subsequent implementations. Hence, we sought to isolate the personnel and non-personnel inputs for program development so they could be excluded from the final costs of the program. Through the ingredients-based method, we documented all program inputs covering the personnel, non-personnel, and travel resources necessary for this implementation of the program. A key element of this cost analysis was delineating the incremental costs necessary for REDI-P. Finally, in order to determine a ‘unit cost’ for REDI, we clarified the primary beneficiary of the program based on intervention logic models.

3. Results

Table 1 provides REDI-C and REDI-P program inputs, distinguished by personnel and non-personnel categories. Program ingredients were identified through interviews with the program developer and supervisors. We present costs for the two cohorts, which were separated by one year (academic years starting in 2008 and 2009). We tallied all necessary program ingredients and determined costs through valuation based on program budgets. Tables 2a and 2b provide the overall costs for REDI-C and REDI-P for the three years of the project, broken down by sub-category. Specifically, Table 2a presents the costs for REDI-C for two cohorts (two years), while Table 2b presents the incremental costs for REDI-P. The additional year for the latter covers the booster home visits that occurred for the second cohort. We also present a separate table (Table 3) that includes definitions of cost analysis terms used throughout this paper. Definitions of terminology for economic evaluation more broadly can be found elsewhere (National Academies of Sciences Engineering and Medicine, 2016).

Table 1.

Program inputs for REDI-C and REDI-P programs

REDI-C
  Personnel
    Intervention implementation: Head Start teachers; local coaches (Head Start staff); supervisor/trainers; administrative assistant
    Training: annual training meeting (3 days initially; booster trainings in subsequent years)
    Supervision/coaching meetings: periodic coach-teacher meetings; periodic coach-supervisor meetings
  Non-personnel
    Curriculum/materials: Preschool PATHS Curriculum; Language and Literacy Curriculum manual; Alphabet Sounds Photo Library; books for interactive reading
    Facilities: office space; equipment (computers, telephone, copier, postal)
    Travel: coach travel (local) to classrooms
    Training: travel costs for teachers (mileage, hotel, meals); training materials

REDI-P
  Personnel
    Intervention implementation: home visitors; supervisors/trainers; administrative assistant
    Training: annual training meeting for home visitors (5 days initially; booster trainings in subsequent years)
    Supervision/coaching meetings: weekly individual and group phone calls; supervisor (accompanies 20% of home visits)
  Non-personnel
    Curriculum/materials: REDI-P manual; activity boxes; handouts, memory books
    Facilities: office space; equipment (computers, telephone, copier, postal)
    Travel: regular home visitor travel costs; periodic supervisor travel costs
    Training: training materials

Table 2a.

Total Direct Costs of REDI-C by Project Year (2008 dollars; rounded)

                               Year 1 (2008)           Year 2 (2009)
                               Dollar      % of        Dollar      % of
                               Amount      Budget      Amount      Budget
Personnel:                                 32.3%                   46.2%
  Supervision/Administration   $27,171     20.9%       $18,391     24.6%
  Contract personnel           $14,869     11.4%       $16,095     21.6%
Non-personnel:                             32.9%                   32.8%
  Materials                    $14,397     11.1%       $9,771      13.1%
  Travel costs                 $685        0.5%        $397        0.5%
  Facilities                   $27,824     21.4%       $14,325     19.2%
Training:                                  34.8%                   21.0%
  Supervision/Administration   $4,619      3.5%        $410        0.5%
  Teacher payments             $22,050     16.9%       $8,625      11.6%
  Training materials           $2,145      1.6%        –           –
  Meeting costs                $10,702     8.2%        $4,509      6.0%
  Travel for training          $5,799      4.5%        $2,101      2.8%
Total Costs                    $130,263                $74,624

NOTES: Year 1 involved 26 classrooms where REDI-C occurred (in Huntingdon and York counties); Year 2 involved 38 classrooms (in Blair, Huntingdon and York counties). Contract personnel were three coaches from Head Start organizations in the counties where REDI was implemented. Supervisory and administrative personnel worked from university facilities, centrally located to the three REDI counties.

Table 2b.

Total Incremental Costs of REDI-P by Project Year (2008 dollars; rounded)

                               Year 1               Year 2               Year 3
                               Amount    % Budget   Amount    % Budget   Amount    % Budget
Personnel:                               81.7%                80.1%                86.6%
  Supervision/Administration   $27,215   31.9%      $23,026   17.2%      $2,822    6.7%
  Home visitors                $42,479   49.8%      $84,308   62.9%      $33,812   79.9%
Non-personnel:                           16.2%                17.9%                13.4%
  Materials                    $842      1.0%       $2,698    2.0%       $430      1.0%
  Travel for home visits       $6,800    8.0%       $17,180   12.8%      $5,227    12.4%
  Facilities                   $6,212    7.3%       $4,095    3.1%       –         –
Training:                                2.5%                 2.1%                 –
  Personnel                    $1,739    2.0%       $1,849    1.4%       –         –
  Meetings                     $389      0.5%       $930      0.7%       –         –
Total Costs                    $85,300              $134,017             $42,304

NOTES: Year 1 included 10 preschool visits for cohort 1 (N = 43 families). Year 2 included 6 kindergarten visits for cohort 1 (N = 43) and 10 preschool visits for cohort 2 (N = 62). Year 3 included 6 kindergarten visits for cohort 2 (N = 62). Materials include intervention curriculum and supplies; Facilities include costs for space and necessary equipment (including computers, copier, telephone, postal needs). Meetings include costs to carry out meetings on-site (including necessary materials and food).
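As a rough illustration of a point estimate derived from these tables, the incremental REDI-P cost per participating family can be computed from the Table 2b totals and the cohort sizes in the notes. This is only the arithmetic implied by the table; any per-child or per-family estimates the program reports elsewhere may differ because of additional adjustments not shown here.

```python
# Rough point estimate of incremental REDI-P cost per family, using the
# yearly totals from Table 2b and the cohort sizes from the table notes.
yearly_totals = [85_300, 134_017, 42_304]  # Years 1-3, 2008 dollars
n_families = 43 + 62                       # cohort 1 + cohort 2 (N = 105)

total_incremental = sum(yearly_totals)
cost_per_family = total_incremental / n_families
print(total_incremental, round(cost_per_family, 2))  # 261621 2491.63
```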

Table 3.

Glossary of Relevant Terms for Cost Analysis in Early Childhood Educational Settings

Allocation weighting: Technique used to estimate costs for resources that serve multiple purposes (only partially the program of interest). Used where program inputs cannot be valued directly from budgets; instead, costs are determined from the estimated proportion of shared resources that goes toward the intervention. For instance, amounts of shared materials can be estimated from the percentage of personnel time devoted to intervention activities.
Incremental costs: Program costs determined for additional components of the intervention above and beyond those going toward a primary component.
Ingredients-based method: Cost-analysis methodology that involves detailing all necessary program inputs and activities (Levin et al., 2018).
Marginal costs: The expected increase in program costs that would occur as the number of participants or services increases (e.g., due to the need for more materials or more project personnel time).
Opportunity costs: The value of a resource in its next best use. For instance, if an intervention relies on volunteer time, and the volunteer’s time could feasibly go toward another purpose (e.g., earning wages elsewhere), that forgone value is counted as a cost.
Point estimate (average cost): Program costs expressed as an average based on the determined unit of analysis; for instance, the estimated cost to deliver the program per participating child.
Program development costs: Costs required to initially design and refine the program that can be excluded from total program costs if they would not be necessary in subsequent implementations.
Sensitivity analysis: Process in cost analysis whereby anticipated alternative implementation arrangements are specified; alternative cost ingredients and amounts are used to determine an approximate range of costs across different implementation scenarios.

3.1. Program Definition and Phase

Defining the program as it existed for this implementation helped establish which resources and costs would be necessary (Foster et al., 2007). In this case, REDI was evaluated as a preschool intervention for children delivered through a partnership between regional Head Start administrators, teachers, and university-based early childhood developmental experts. The classroom intervention (REDI-C) had been introduced initially five years earlier and was evaluated as an on-going implementation. The home visiting intervention (REDI-P) was evaluated here during its initial implementation. This difference in phase of implementation had an impact on costs, particularly materials, training, and supervision costs, which are generally highest in the first year of implementation. (We factor in potential variation with different implementation characteristics at the end of the Results section.)

For example, preschools were already using REDI-C before this trial, which reduced training and material costs relative to the costs of initially introducing the intervention into the system. This has implications for how curriculum materials are obtained for the classroom and home visits. For REDI-C, this involves costs for the Preschool PATHS curriculum, interactive reading books, a language and literacy curriculum, and a sound games picture library (Bierman et al., 2008). Some of these intervention materials were purchased from vendors; the language and literacy manual was developed for the program and is available at no cost from the program developers, but must be printed and copied for teacher use. For this implementation of REDI-C, most classrooms already had core curriculum materials purchased earlier (e.g., the PATHS curriculum, books, and picture library), reducing the costs of materials relative to the initial year of implementation. Additional materials were purchased to provide the REDI-P curriculum, including books for interactive stories between parent and child, and Preschool PATHS activities. As shown in Table 2a, costs for materials for REDI-C were roughly 11% of total costs in the first year and 13% in the second year. Additional materials for REDI-P were only 1–2% of its total costs.

As noted, the cost assessment for REDI-P was based on its first years of implementation, while the intervention was still undergoing development and refinement. Hence, costs associated with intervention development were separated from the implementation costs because they would not recur in the future. These amounts were straightforward to allocate given that they involved only personnel hours and travel costs. The two project leaders who helped develop the home visiting procedures reported that roughly 25% of their time in the first year of REDI-P went toward program development; these personnel costs were identified but excluded from the costs presented here.

3.2. Setting

The setting of the intervention has implications for the space and supplies/materials necessary for intervention delivery. This version of REDI was delivered in three counties representing both rural and urban settings in central Pennsylvania (Blair, Huntingdon, and York counties). Many resources necessary for the program in these communities are likely to generalize to other settings, although certain differences may be expected. Required space for REDI included office space for supervisors/trainers and home visitors and classroom space where the program was implemented. There were three primary space needs. First, space for supportive personnel (trainers, home visitors, and the administrative assistant who prepared intervention materials) was covered under facilities and administrative (i.e., indirect) costs included in university budget files. Other non-personnel resources were also required in this setting, including computers and other miscellaneous costs (e.g., communications, postage, copier) necessary to support program activities; costs for these resources were extracted from budget files. Both facilities costs and these non-personnel costs associated with the intervention were determined using the allocation weight described above, i.e., based on the proportion of project personnel time devoted to intervention duties. This was necessary because overall facilities costs also covered resources for research and evaluation. A second space requirement was for the Head Start staff who provided on-site supervision and coaching to teachers (program-level curriculum supervisors). These staff were housed at local Head Start administrative offices, although primary intervention activities occurred within the classrooms, and space costs were absorbed in contract payments to the agencies where program consultants were based. A third space requirement was the Head Start classrooms where REDI instruction occurred.
In this case, no costs were allocated for classroom space, given that the curriculum was delivered as part of daily activities, with no additional space required beyond the primary classroom. Facilities costs for administration of REDI overall are shown in Table 2a and represent roughly 21% and 19% of total costs in years 1 and 2, respectively. Costs for facilities for REDI-P were allocated based on the percentage of time that intervention personnel (primarily supervisors) spent on REDI-P activities (described below); these costs were low relative to other aspects of the home-visiting program.

The setting also has implications for travel requirements. For REDI-C, travel costs for personnel were negligible given that all activities occurred within the preschool classroom, with no additional travel required of teachers to deliver the curriculum. Travel needs were mostly for training and supervisory purposes, including teacher travel for a 2-day training (covering both driving reimbursement and hotel costs). Classroom visits in both years of implementation required periodic local travel by curriculum supervisors, although such costs were negligible and absorbed into contracts with the local Head Start agencies. Travel costs to schools are not valued here for children or teachers, because the intervention was embedded in the school day and there were no additional transportation costs beyond the standard supports provided by Head Start. Travel was more substantial for REDI-P home visits, which required ten home visits for each cohort during the preschool year and six booster home visits the following year once the child was enrolled in kindergarten (thus REDI-P costs stretched across three years to cover both cohorts). These costs were reflected in project budget files as reimbursed driving costs for home visitors. As shown in Table 2a, travel costs for REDI-C outside of training were negligible (less than 1% of total costs). Incremental travel costs for REDI-P home visits comprised roughly 8% to 13% of total costs in the three years of implementation.

3.3. Personnel

Primary personnel for REDI-C were the Head Start teachers who implemented the curriculum, using about 3.5 hours of their classroom time each week to deliver the intervention as part of their normal teaching responsibilities. For REDI-P, key personnel included five home visitors (to cover all three counties). Supervisory support was provided by local coaches (Head Start staff) for REDI-C and by intervention trainers/supervisors located at the university for both REDI-C and REDI-P. Three local coaches made periodic visits to all classrooms and were also available on an as-needed basis. They were paid for this work through contracts with the county Head Start organizations where they were already engaged, using a standard rate of compensation based on the number of classrooms using REDI. Project trainers/supervisors provided oversight and supervision to both REDI-C local coaches and REDI-P home visitors, so personnel time had to be appropriately allocated. Supervisor activities for REDI-P ensured that home visits were planned appropriately given the needs of the families. Through interviews with the project developer and one supervisor, we were informed that 30% of supervisor time on the project went toward REDI-C while the other 70% was required for REDI-P. Finally, a university-based administrative assistant's time was required for obtaining and preparing intervention materials and other miscellaneous tasks associated with intervention implementation; this time was divided equally between REDI-C and REDI-P based on time breakdowns provided by program leaders.
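The allocation weighting applied to this shared supervisor time reduces to simple proportional arithmetic. A minimal sketch in Python; the shared salary figure here is purely hypothetical, and only the 30%/70% time split comes from the project interviews described above:

```python
def allocate(shared_cost, weights):
    """Split a shared resource cost across program components
    in proportion to estimated shares (e.g., personnel time)."""
    total = sum(weights.values())
    return {component: shared_cost * share / total
            for component, share in weights.items()}

# Hypothetical shared supervisor cost; the 30/70 split between the
# classroom and home-visiting arms is from the REDI interviews.
print(allocate(60_000, {"REDI-C": 30, "REDI-P": 70}))
# {'REDI-C': 18000.0, 'REDI-P': 42000.0}
```

The same proportional weighting can be applied to facilities and other non-personnel resources that serve both arms.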

As noted, resources for training personnel were vital for effective implementation of the program. For REDI-C, a two-day training workshop was held in the first year to introduce the intervention to new teachers, and another one-day “refresher” training workshop was held each year to review key intervention elements for returning teachers in order to promote high-fidelity implementation. These training workshops were held at a central location and served teachers from 26 classrooms in the first cohort and 38 classrooms in the second cohort. Training resources included travel and other meeting needs (including food and hotels for some teachers), and teachers were compensated for their time for training (roughly $150 per day per teacher). Supervisory personnel time (in salary amounts) was required to carry out the training sessions and was determined as a percentage of their salary during the training period. Separate training sessions were necessary to train home visitors to implement REDI-P. A two-day training occurred in the first year with three one-day booster trainings roughly every six months following the initial training. Training costs included personnel time, materials and meals.

While costs to compensate teachers for their time during training were included as REDI-C costs, no additional personnel costs were required for teachers to implement the program because implementation occurred during their usual work time. Lessons were integrated into existing activities rather than displacing alternative instruction (as might occur for programs directed at older children, for instance). For example, all classrooms included book reading; REDI-C enriched the way book reading was conducted. Moreover, the program provided professional development opportunities and introduced teaching strategies that improved classroom order, thus providing benefits to teachers (Domitrovich, Gest, Gill, Jones, & Sanford DeRousie, 2009). For cost purposes, this offset the need to include teacher opportunity costs in overall program costs. Likewise, no indirect valuation was necessary for parents' time to participate in the home visits. Program developers structured the home visits to occur when the parent would already be spending time caring for the target child, so the visits would likely not preclude employment time (i.e., time that could be represented as an opportunity cost). In addition, the program would never be carried out with alternative personnel taking the role of the parent in delivering the home instruction. To best represent the true costs of REDI-P as they would always occur, we do not present an estimate that considers such a scenario.

Tables 2a and 2b show the personnel costs for both arms of the program. For REDI-C, personnel costs were higher in the first year, although they made up the same percentage of overall costs in both years (roughly 33%). Personnel costs made up most of the incremental costs for REDI-P, comprising over 80% of costs in all three years. Training costs represented roughly a third of REDI-C costs in the first year of the program, and roughly a fifth of costs for booster trainings in the second year. For REDI-P, training costs were minimal (less than 3% of incremental costs) and consisted mostly of personnel time for meeting days.

3.4. Intervention Components

In this cost analysis it is important to distinguish the costs of delivering REDI-P over and above REDI-C, as REDI-P costs are incremental to those for REDI-C. It is important to delineate the costs that would be linked in subsequent analysis to intervention outcomes from REDI-C alone versus REDI-P. Additionally, there may be some efficiencies when REDI-P is included in the effort relative to REDI-C alone. Our cost analysis reveals that these efficiencies occur mostly through dual supervisory roles, where supervisors can divide their time as needed between the two arms of the intervention. In turn, space requirements for supervisors, as well as other non-personnel resources (e.g., computers), could be combined for both purposes.

The incremental costs for REDI-P, noted above, consisted predominantly of costs for personnel (home visitors and supervisors), facilities, and additional materials needed for home-based program activities. Although these costs are considered incremental, the costs for REDI-P exceeded those for REDI-C given the need for individual visits to each family. Indeed, the total incremental cost to deliver REDI-P to both cohorts was approximately $261,565, which exceeded the overall cost of REDI-C ($208,038).

3.5. Program Recipients and Cost Analysis Summary

The cost per participant, rather than the total cost, is often presented to represent program cost and may be used to compare across interventions. While this may seem straightforward, programs can differ in how they arrive at the key unit. The cost analysis should rely on a strong basis for determining this unit, such as the intervention logic model. We defined REDI's reach here as the children served by the intervention curriculum. Although teachers and parents were directly involved in delivery of the program and received intervention to support their skill development, we did not consider teachers or parents to be recipients of the program, nor were siblings of the target child considered recipients, even though these individuals may experience some benefits. Following the logic model of the intervention, we focused on the intended distal outcome: the skill acquisition and school adjustment of the children. Theoretically, an effective home-visiting program may have benefits that extend beyond the target participant; however, those benefits are not represented in the current logic model of the intervention and in turn are not considered in the economic assessment.

Overall costs for REDI-C were divided by all children served (i.e., all children attending preschool in the classrooms where REDI was implemented) to calculate the per-participant cost. The point estimate for classroom REDI cost was approximately $191 per child for one year of instruction. The additional cost per child for those who received home visiting services was determined by dividing the total incremental costs for REDI-P by the number of families receiving home visiting services, yielding approximately $2,491 per child. The combined per-child point estimate for children receiving both the classroom and home-visiting components was approximately $2,682 (in 2008 dollars).
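These point estimates follow directly from the totals and participant counts reported above; a quick check of the arithmetic in Python (all figures in 2008 dollars, taken from the text):

```python
# Reported totals and counts (2008 dollars)
redi_p_total_incremental = 261_565   # both cohorts, three years
families_served = 43 + 62            # cohort 1 + cohort 2
redi_c_per_child = 191               # reported classroom point estimate

redi_p_per_child = redi_p_total_incremental / families_served
combined_per_child = redi_c_per_child + redi_p_per_child

print(round(redi_p_per_child))    # 2491
print(round(combined_per_child))  # 2682
```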

3.6. Sensitivity Analysis

Anticipated variation in implementation characteristics for a program can provide important information for the cost analysis. A key part of the cost analysis results is the presentation of variation (or representation of uncertainty) around the point estimates presented above, based on factors for which we could reliably foresee alternative specifications of program inputs. We identified three primary components that would likely vary across implementations held in different settings (for example, childcare centers or public prekindergartens) or different geographical locations: costs related to curriculum, arrangements for classroom coaches, and facilities charges (Table 4). First, as noted above, in this case the classroom curriculum was already available from prior implementation, so materials costs involved generating copies of materials. An alternative would involve purchasing all materials separately; this would be necessary where the program had not been implemented before. Per-classroom curriculum costs listed by program developers include $839 to purchase Preschool PATHS, $45 to print the Language and Literacy curriculum, $50 for the Lakeshore Learning Alphabet Sounds Photo Library, and $425 for books for the Interactive Reading Program. For this analysis, these costs were multiplied by the total number of classrooms across the two cohorts, and the difference in costs was calculated over and above the copying costs that occurred in this implementation. The sensitivity analysis reflects the maximum costs that could occur if all materials had to be purchased to carry out REDI.

Table 4.

Sensitivity Analysis Details for this Implementation of REDI

Differences are shown for each anticipated alternative relative to this implementation, rounded to the nearest dollar (2008 dollars).

Curriculum
- This implementation: PATHS curriculum already available; costs include funds to copy curriculum already on hand in the schools, plus additional book costs to cover 38 classrooms.
- Anticipated alternative: Materials purchased originally, with enough purchased for 38 classes; includes the Preschool PATHS curriculum, Language and Literacy curriculum, Lakeshore Learning Alphabet Sounds photo library, and books for interactive reading.
- Difference: $32,208 (increase of approximately $29.60 per participant).

Coaching arrangement
- This implementation: Supervisory personnel at a central university location; coaching contracted through Head Start facilities (local to the three preschool locations).
- Anticipated alternative: Coaches fully covering supervision of classroom intervention activities; hiring enough coaches to cover 26 classes in the first year and 38 classes in the second year, plus start-up training costs for coaches to effectively supervise intervention activities.
- Difference: $185,266 (increase of approximately $170.28 per participant).

Facilities & administrative costs
- This implementation: Covered through university facilities and administrative costs (roughly 20% of total intervention costs).
- Anticipated alternative: Assuming a non-university setting for supervisory and administrative personnel (10% rate).
- Difference: ($23,882) (decrease of approximately $21.95 per participant).

We used similar methods to determine how much more local coaches could cost in an initial implementation. For this version of REDI, coaches' time was covered through contracts with regional Head Start administrators, and their role overseeing REDI was embedded within their weekly duties. Had coaches needed to be hired separately, costs would have exceeded those for this implementation. To consider this alternative, program developers determined the required coach staffing in terms of full-time equivalent (FTE) positions per classroom. Specifically, it was estimated that one full-time coach can cover 10 to 12 classrooms during the school year. From this, we determined that 2.17 FTE positions could cover 26 classes in year 1, and 3.17 FTE could cover 38 classes in year 2. Including an extra day of training for coaches (estimated to be $2,000), the total increase in coaching personnel costs above those determined here would be roughly $185,266. Considering the potential greater costs for both curriculum materials and alternative coach staffing, we estimate that costs for REDI-C could increase to roughly $391 per child, or $2,882 if the child also received REDI-P.
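The staffing figures above follow from the stated coverage ratio; a small sketch assuming the upper bound of 12 classrooms per full-time coach:

```python
classrooms = {"year 1": 26, "year 2": 38}
classrooms_per_coach = 12  # one full-time coach covers 10-12 classrooms

# FTE positions needed each year, rounded to two decimals
fte_needed = {year: round(n / classrooms_per_coach, 2)
              for year, n in classrooms.items()}
print(fte_needed)  # {'year 1': 2.17, 'year 2': 3.17}
```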

The third alternative involves facilities costs. For this analysis, facilities for supervisors and for activities related to overseeing both REDI-C and REDI-P were covered through indirect costs included in university budgets. These costs were roughly 20% of the overall costs to carry out the intervention and likely exceed what would occur in implementations where supervisors are local. We estimated that this percentage could go as low as 10% of overall costs; this is likely conservative given the nature of the intervention, where most activities occur within the Head Start classroom. This change would have produced a decrease of roughly $23,882 in total costs across the three years. From this, we determined the lowest expected cost to be roughly $169 per student for REDI-C or $2,660 for REDI-P. Overall, our sensitivity analysis determined that costs per child could range from $169 to $391 for REDI-C alone, and from $2,660 to $2,882 if REDI-P was also delivered.
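The reported range combines the base point estimates with the per-participant adjustments from Table 4; a minimal reproduction of that arithmetic:

```python
# Base point estimates and Table 4 per-participant adjustments (2008 dollars)
base = {"REDI-C": 191.00, "REDI-C + REDI-P": 2_682.00}
purchase_curriculum = 29.60   # buy all curriculum materials new
hire_local_coaches = 170.28   # dedicated coaching staff plus training
lower_facilities = -21.95     # 10% non-university facilities rate

for arm, cost in base.items():
    low = cost + lower_facilities
    high = cost + purchase_curriculum + hire_local_coaches
    print(f"{arm}: ${low:.0f} to ${high:.0f}")
# REDI-C: $169 to $391
# REDI-C + REDI-P: $2660 to $2882
```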

4. Discussion

Evaluation of the program costs of ECE interventions can provide crucial information for policy and funding decisions. Ideally, the results from such evaluations not only inform about specific program implementations, but also help reveal how costs would unfold in different implementations of that program. Additionally, cost analyses should provide information about resources required under different programming conditions (Bowden et al., 2018). Such comparison relies on equitable methodologies that consider necessary ingredients and resources in a comprehensive and transparent manner (Jones et al., 2015). The example presented here for the REDI project shows how, using established cost analysis approaches, one can carry out a comprehensive assessment of the resources necessary for an ECE enrichment intervention and estimate the expected budgetary needs for implementing the program as intended. The sensitivity analysis provides information on the variation in these costs expected across anticipated alternative arrangements; such variation may be driven by setting, timing, or geographical factors. We next examine how such issues should be considered when carrying out a cost analysis in this field.

4.1. Five Key Issues for Cost Analyses of ECE Interventions

The REDI cost analysis described in this paper highlights five key issues that are relevant to the cost assessments of many ECE interventions. We have identified these issues based on prior work in both the cost analysis and ECE fields, with background supported by prior research (Calculating the Costs of Child Welfare Services Workgroup, 2013; Corso & Filene, 2009; Levin et al., 2018). First, it is important to consider the range of costs that will occur at different phases of intervention implementation. Rarely do assessments compare "start-up" program costs to "maintenance" program costs to differentiate the (often higher) costs of initiating a new program from the ongoing costs of the program after it becomes a sustained component of usual practice. We recommend a clear analysis of the range of costs across intervention phases, as provided in the REDI illustration. Second, the setting of the intervention can affect costs, and a cost assessment needs to describe clearly the degree to which a particular setting (Head Start programs in the current analysis) is absorbing or offsetting costs that would otherwise need to be expended in an alternative implementation setting. For example, the REDI project was conducted in Head Start programs with educational program supervisors on staff who could take on the role of REDI-C supervision once the program was established in classrooms. Currently, we are evaluating REDI-C in community childcare centers that do not have this staff resource. In that context, we are training center directors to provide ongoing supervision for teachers and will evaluate the relative costs (and benefits) of this alternative delivery system. Third, the nature of program personnel should be specified, clarifying the differential costs associated with different kinds of personnel.
For example, in REDI, local Head Start supervisors for the REDI-C intervention were less costly than university supervisors for the REDI-P intervention, due to differences in proximity and corresponding travel costs and salaries. Providing clear estimates of these aspects of personnel selection is important, as it allows program implementers to determine their relative costs over time if they were to train local supervisors rather than rely on external trainers/supervisors. Personnel qualifications should also be considered: more highly qualified personnel may produce greater program impact, but for cost analysis may also translate into higher personnel costs. Fourth, a clear delineation of the costs of different intervention components (as in the analysis of the costs of REDI-C and the incremental costs of REDI-P) is also useful, particularly given the fast pace of ECE programming innovations and evaluations. Some ECE programs are tiered, in which all participants receive some intervention components (e.g., an initial assessment, a school program) and others receive additional, more intensive components (e.g., individualized intervention, home visits). The cost analysis should identify costs shared across the intervention components and demarcate any incremental costs associated with additional components, to clarify the expected variable costs associated with different levels of intervention delivery. Future programming can rely on a stronger foundation for decision-making if the incremental costs (and benefits) of particular program components have been well documented. Finally, it is important for cost analyses to be clear regarding the program recipients and the way that the "denominator" was determined for program costs per individual served. ECE interventions are often delivered within a classroom or family setting.
The cost analysis should be based on the targeted recipients of the intervention services, which are not necessarily the same units as in the research design for the primary evaluation. For example, in the REDI study, we provided intervention services to teachers and parents and evaluated the impact on their skill development. However, because the REDI logic model considered children to be the targeted recipients of the intervention, and teachers and parents to be implementers, only the children were included in the "denominator" representing the individuals served by the intervention.

Attending to these five issues will strengthen the evidence base of ECE enrichment program costs and help policy makers and administrative decision-makers understand what the costs represent and what can drive more efficient use of resources (Table 5). Theoretically, continued rigorous assessment of a program's costs can enable refined estimates of costs across multiple implementations (e.g., in different regions). Often, programs are represented as having a single, finite cost. However, as illustrated here, the cost of a particular program is rarely fixed; rather, it varies depending on multiple features of the implementation conditions. A thorough assessment of the phase, setting, personnel, component(s), and recipients of an ECE enhancement, along with use of the ingredients-based approach, can provide comprehensive cost estimates and allow for estimates of a range of costs based on likely variations in future implementations.

Table 5.

Key Issues for Establishing Cost Analyses for Early Childhood Education Interventions

Each key issue is listed with its guiding questions and the corresponding impact on the cost assessment.

Phase of the Intervention
- Questions: Is the cost assessment considering a new program or one that has been implemented before? Does the assessment consider the first year of implementation, or is it for an ongoing program? Are future costs likely to differ from costs at the beginning of implementation?
- Impact: Report initial costs for intervention materials, training, and supervision that are one-time only; separately report program maintenance costs that will occur in subsequent years. Include costs for trainers, payments to staff to attend training, training supplies, travel costs for training (supervisors and staff), and the amount of time and frequency (training days, supervision time) needed for training.
- Questions: Was the program undergoing development at the time?
- Impact: Estimate program development costs and remove them from overall intervention costs (i.e., those that would not occur in subsequent implementations).

Setting of the Intervention
- Questions: Where does the intervention occur? What resources are already available or must be purchased for the intervention? What personal time or other activities are displaced by the intervention?
- Impact: Use ingredients-based methods to assess costs for supplies/materials and space purchased for the intervention; separately include indirect valuation of supplies/materials and space needed (but not purchased) for the intervention. Indicate time spent by personnel and participants on intervention activities even if embedded in existing jobs or roles.

Nature of Program Personnel
- Questions: Who implemented this version of the program and how much time was involved? Who provided training/supervision and how often did they interact with staff?
- Impact: List all personnel involved in intervention implementation, total time involved, and the percentage of overall work time that goes toward the program. List all supervisors and their typical location. Include all costs for staff and supervisors to travel to carry out the program.

Delineation of Multiple Components
- Questions: Are there different components that at least some participants receive? What percentage of participants receive these components, and how often?
- Impact: List all personnel involved in additional components, including intervention implementers and supervisors, and the time involved. Include any additional supplies/materials, training costs, and travel for additional components.

Program Participants
- Questions: What is the proper base unit for cost analysis? Who receives the intervention and is expected to demonstrate benefits on measurable outcomes?
- Impact: List all individuals who should benefit from the intervention and the measured outcomes. The base unit for analysis should be based on the intervention logic model and on evaluated program outcomes.

This cost assessment of the REDI program produced per-child estimates of the cost of receiving REDI in the classroom, as well as the incremental costs of the home visiting services. This information can be useful for examining differential cost-effectiveness across the two arms of the intervention, and subsequently for informing whether one approach is a more efficient use of resources than the other (Foster, Olchowski, & Webster-Stratton, 2007). Home visits clearly require many more resources but could be more cost-effective if they have a greater impact on targeted outcomes. Follow-up assessments of study participants now underway will provide data for estimating the long-term benefits of the REDI classroom and parent programs, enabling cost-effectiveness and cost-benefit analyses.

4.2. Limitations

There are limitations to this study that should be noted. The cost analysis relied heavily on direct costs recorded in budgets, and certain sources of variation are harder to determine. For instance, the costs of program ingredients may change over time, dependent on broader economic factors. Other costs are subject to regional differences (e.g., travel costs for home visitors and program consultants, space costs for project administrators) that may be difficult to gauge given the vastly different settings in which REDI could be implemented. Some resource needs and personnel time relied on program leaders' estimates of the proportion of time spent on certain activities, and thus are subject to appraisal error. This includes amounts apportioned for development of REDI-P, where personnel time devoted to program development had to be distinguished from time devoted to program implementation. We did not include certain opportunity costs, such as costs for classroom space or teacher or parent time to implement the program, in order to best represent the expected resource requirements for future planning. In contrast to some educational interventions, the REDI-C curriculum was designed to be embedded into the usual activities of Head Start preschool, and the program would not occur in a location where additional space costs would be required; thus we feel that excluding such costs best represents resource requirements in a real setting. In addition, we believe the REDI-C and REDI-P programs enriched the educational quality of the time teachers and parents (respectively) spent reading and talking with children, but did not significantly supplant alternative educational activities, justifying our decision not to include opportunity costs in our analysis.
For example, parents in the REDI-P control and intervention groups reported spending similar amounts of time reading and playing with their children, but parents in the intervention group reported asking more questions and talking more with their children while reading. We acknowledge that alternative approaches to cost analysis may take a more conservative stance that places value on such resources regardless of whether a true cost would ever occur. We also do not consider any induced costs due to the implementation of REDI that may be relevant for certain educational interventions (e.g., an intervention that successfully influences participants to seek other needed services) (Bowden, Shand, Belfield, Wang, & Levin, 2017).

Our focus here involved understanding the costs of a specific program as it exists currently, while anticipating likely costs if fundamental elements (e.g., curriculum, coaching arrangements) differed. Of course, cost evaluation does not occur in a vacuum, and certain limitations should be acknowledged that relate to larger matters outside the control of program developers and interventionists. For instance, one should acknowledge key differences in regional policy that could affect variation in costs, an influential but ever-changing landscape. This may be related to how ECE programming is funded. For instance, blended funding for programs is common. Currently, twenty-nine states with pre-kindergarten programs allow a mixed service delivery model in which public schools, public and private preschools, Head Start, and community agencies all participate in serving children. Although schools house the majority of pre-kindergarten students, about 30 percent of all enrolled children receive services in community settings (Barnett et al., 2016). In some states, such as Connecticut and Oregon, rates are much higher, with 85% and 83% of children, respectively, receiving pre-kindergarten services in a program other than a public school. As a consequence, the costs of the “base program” being enriched vary tremendously, as public school and community settings operate under different licensing requirements, with different levels of teacher and staff salaries and different lengths of day and class sizes, all of which affect program costs. In this study, the costs we estimated were for the addition of REDI to the “base” program of Head Start; a total accounting of all costs incurred would include both the costs of REDI and the costs of the Head Start programs that housed it.

Larger issues related to funding for early childhood educational programming will influence the degree to which evidence-based interventions can be integrated into existing programming and sustained. Programs such as REDI depend on the educational systems in place within communities, and there will be vast differences in how programs can be installed and for how long they can be maintained. Any cost analysis should acknowledge the larger infrastructure within which programs will exist and the degree of volatility in that infrastructure over time. The dynamic nature of the systems through which early childhood interventions occur has implications for the ability to implement effective programs in general, and likewise has great implications for how resources are managed.

4.3. Conclusions and Future Directions

Detailed information on program costs can be useful in multiple ways. For research purposes, it provides a critical part of the equation for assessing program cost-effectiveness. It also provides key program information for policy and planning decisions more generally. For those considering implementing new versions of a program, it is essential to understand what type of funding will be necessary and how planned resources will be affected by the length of implementation, the region, the size of the population served, and other key factors. For all of these reasons, program cost estimates should be comprehensive, accurate, and, if possible, generalizable to other settings.

The REDI example presented here framed costs in terms of five key issues for cost analysis, with lower per-child costs where only the universal classroom curriculum is delivered and higher per-child costs where home visiting services are also provided. Future research will incorporate program evaluation results to further understand whether the more intensive home-visiting services are cost effective for REDI.

Policy trends toward global assessment of multiple programs in an area such as early childhood educational intervention – sometimes involving comparison between programs – should incorporate information that has been derived in equivalent ways. The reality is that cost estimation for early childhood interventions is not carried out uniformly across projects. This is understandable, since those running the programs are unlikely to be trained or experienced in evaluating costs. Costs may not be considered at all; in other cases, programs may include economic experts on the evaluation team to carry out rigorous cost evaluation. Such differences in approach may not be discernible to policy makers and administrators, and subsequent cost-effectiveness assessment can be undermined when the cost side of the equation rests on unequal footing across programs. This is especially problematic when policy decisions are made based on the simplest estimates, such as a return-on-investment ratio (benefits divided by costs). Greater attention to carrying out thorough and comprehensive cost analysis can better enable comparison between different programs. An important part of this is representing how costs might vary across different implementations, and the degree to which determined costs are unique to a single effort. Improved evaluation of the use of these resources will be crucial for informing program planning and the optimal use of scarce public funds for quality early childhood educational interventions.
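A return-on-investment ratio of this kind is sensitive not only to how costs are measured but also to how future benefits are discounted to present value, which is one reason comparisons across programs require equivalently derived estimates. The following sketch shows the basic calculation with standard present-value discounting; the dollar amounts and the 3% discount rate are hypothetical assumptions for illustration, not estimates from this study.

```python
# Sketch of a benefit-cost (return-on-investment) ratio with discounting.
# All dollar figures and the discount rate are hypothetical.

def present_value(amounts, rate=0.03):
    """Discount a stream of yearly amounts (year 0 first) to present value."""
    return sum(a / (1 + rate) ** t for t, a in enumerate(amounts))

def benefit_cost_ratio(benefits, costs, rate=0.03):
    """Ratio of discounted benefits to discounted costs; > 1 implies
    a positive net return under the stated assumptions."""
    return present_value(benefits, rate) / present_value(costs, rate)

# Hypothetical: $2,000 per-child program cost in year 0, and $500 per year
# in benefits for five years beginning in year 1.
costs = [2000.0]
benefits = [0.0] + [500.0] * 5
ratio = benefit_cost_ratio(benefits, costs)
print(ratio > 1)  # positive return under these invented assumptions
```

Note that the same benefit stream evaluated at a higher discount rate yields a smaller ratio, so two programs reported with different discounting conventions are not directly comparable.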

Acknowledgments:

This project was supported by National Institute of Child Health and Human Development Grants HD046064 and HD43763. We express our appreciation to the teachers, students, parents, and program personnel who served as partners in this project in the Huntingdon, Blair, and York County Head Start Programs of Pennsylvania.

Footnotes

Declarations of interest: none

References

  1. Administration for Children and Families. (2010). Head Start Impact Study: Final report. Washington, DC: U.S. Department of Health and Human Services.
  2. Bailey D, Duncan GJ, Odgers CL, & Yu W (2017). Persistence and fadeout in the impacts of child and adolescent interventions. Journal of Research on Educational Effectiveness, 10(1), 7–39.
  3. Barnett WS, Friedman-Krauss AH, Gomez RE, Horowitz M, Weisenfeld GG, Brown KC, & Squires JH (2016). The State of Preschool 2015. New Brunswick, NJ. Retrieved from http://nieer.org/state-preschool-yearbooks/the-state-of-preschool-2015
  4. Bierman KL, Domitrovich CE, Nix RL, Gest SD, Welsh JA, Greenberg MT, Blair C, Nelson K, & Gill S (2008). Promoting academic and social-emotional school readiness: The Head Start REDI program. Child Development, 79, 1802–1817.
  5. Bierman KL, Heinrichs BS, Welsh JA, Nix RL, & Gest SD (2017). Enriching preschool classrooms and home visits with evidence-based programming: Sustained benefits for low-income children. Journal of Child Psychology and Psychiatry, 58, 129–137.
  6. Bierman KL, Welsh JA, Heinrichs BS, & Nix RL (2018). Effect of preschool home visiting on school readiness and need for services in elementary school: A randomized clinical trial. JAMA Pediatrics, 172(8), e181029.
  7. Bierman KL, Welsh J, Heinrichs BS, Nix RL, & Mathis ET (2015). Helping Head Start parents promote their children’s kindergarten adjustment: The REDI parent program. Child Development, 86, 1877–1891.
  8. Bowden A, Escueta M, Muroga A, Rodriguez V, & Levin H (2018). Minnesota Reading Corps Pre-K Program cost analysis. Retrieved from https://www.cbcse.org/publications/mrc
  9. Bowden A, Shand R, Belfield C, Wang A, & Levin H (2017). Evaluating educational interventions that induce service receipt: A case study application of City Connects. American Journal of Evaluation, 38(3), 405–419.
  10. Brotman LM, Calzada E, Huang KY, Kingston S, Dawson‐McClure S, Kamboukos D, … Petkova E (2011). Promoting effective parenting practices and preventing child behavior problems in school among ethnically diverse families from underserved, urban communities. Child Development, 82(1), 258–276.
  11. Calculating the Costs of Child Welfare Services Workgroup. (2013). Cost analysis in program evaluation: A guide for child welfare researchers and service providers. Washington, DC: Children’s Bureau, Administration for Children and Families, U.S. Department of Health and Human Services.
  12. Campbell FA, Ramey CT, Pungello E, Sparling J, & Miller-Johnson S (2002). Early childhood education: Young adult outcomes from the Abecedarian Project. Applied Developmental Science, 6(1), 42–57.
  13. Clements DH, Sarama J, Wolfe CB, & Spitler ME (2013). Longitudinal evaluation of a scale-up model for teaching mathematics with trajectories and technologies: Persistence of effects in the third year. American Educational Research Journal, 50(4), 812–850.
  14. Committee for Economic Development. (2006). The economic promise of investing in high-quality preschool: Using early education to improve economic growth and the fiscal sustainability of states and the nation. Washington, DC: Committee for Economic Development.
  15. Conti G, & Heckman JJ (2014). Economics of child well-being. In Ben-Arieh A, et al. (Eds.), Handbook of Child Well-Being (pp. 363–401). Dordrecht, Netherlands: Springer.
  16. Corso P, & Filene JH (2009). Programmatic cost analysis of the Family Connections program. Protecting Children, 24(3), 78–88.
  17. Council of Economic Advisers. (2014). The economics of early childhood investments. Washington, DC: The White House.
  18. Crowley DM (2014). The role of social impact bonds in pediatric health care. Pediatrics, 134, e331–e333.
  19. Crowley DM, Dodge KA, Barnett WS, Corso P, Duffy S, Graham P, Greenberg MT, Haskins R, Hill L, Jones DE, Karoly LA, Kuklinski MR, & Plotnick R (2018). Standards of evidence for conducting and reporting economic evaluations in prevention science. Prevention Science, 19, 366–390.
  20. Domitrovich CE, Cortes RC, & Greenberg MT (2007). Improving young children’s social and emotional competence: A randomized trial of the preschool “PATHS” curriculum. The Journal of Primary Prevention, 28(2), 67–91.
  21. Domitrovich CE, Gest SD, Gill S, Jones D, & Sanford DeRousie R (2009). Individual factors associated with professional development training outcomes of the Head Start REDI Program. Early Education and Development, 20(3), 402–430.
  22. Farran DC, & Lipsey MW (2016). Evidence for the benefits of state prekindergarten programs: Myth & misrepresentation. Behavioral Science & Policy, 2(1), 9–18.
  23. Foster E, Dodge K, & Jones D (2003). Issues in the economic evaluation of prevention programs. Applied Developmental Science, 7, 76–86.
  24. Foster E, Olchowski A, & Webster-Stratton C (2007). Is stacking intervention components cost-effective? An analysis of the Incredible Years program. Journal of the American Academy of Child & Adolescent Psychiatry, 46, 1414–1424.
  25. Foster E, Porter M, Ayers T, Kaplan D, & Sandler I (2007). Estimating the costs of preventive interventions. Evaluation Review, 31(3), 261–286.
  26. Friedman-Krauss AH, Barnett WS, Weisenfeld G, Kasmin R, DiCrecchio N, & Horowitz M (2018). The State of Preschool 2017: State preschool yearbook. New Brunswick, NJ. Retrieved from http://nieer.org/state-preschool-yearbooks/yearbook2017
  27. Gaylor E, Kutaka T, Ferguson K, Williamson C, Wei X, & Spiker D (2016). Evaluation of Kindergarten Readiness in Five Child-Parent Centers: Report for 2014–15. Menlo Park, CA: SRI International.
  28. Haddix AC, Teutsch SM, & Corso PS (Eds.). (2003). Prevention effectiveness: A guide to decision analysis and economic evaluation (2nd ed.). Oxford: Oxford University Press.
  29. Heckman JJ (2011). The economics of inequality: The value of early childhood education. American Educator, 35(1), 31.
  30. Heckman JJ (2012). The case for investing in young children. In Falk B (Ed.), Defending Childhood: Keeping the Promise of Early Education (pp. 235–242). New York: Teachers College Press.
  31. Jones DE, Greenberg MT, & Crowley DM (2015). The economic case for SEL. In Durlak J, Domitrovich CE, Weissberg R, & Gullotta T (Eds.), Handbook for Social and Emotional Learning (pp. 97–113). New York, NY: Guilford Press.
  32. Levin HM, McEwan PJ, Belfield C, Bowden AB, & Shand R (2018). Economic evaluation in education: Cost-effectiveness and benefit-cost analysis. Thousand Oaks, CA: Sage Publications.
  33. Macmillan R, McMorris BJ, & Kruttschnitt C (2004). Linked lives: Stability and change in maternal circumstances and trajectories of antisocial behavior in children. Child Development, 75(1), 205–220.
  34. National Academies of Sciences, Engineering, and Medicine. (2016). Advancing the Power of Economic Evidence to Inform Investments in Children, Youth, and Families. Washington, DC: The National Academies Press.
  35. Rathbun A, & Zhang A (2016). Primary early care and education arrangements and achievement at kindergarten entry (NCES 2016–070). National Center for Education Statistics.
  36. Reardon SF (2011). The widening academic achievement gap between the rich and the poor: New evidence and possible explanations. In Duncan GJ & Murnane RJ (Eds.), Whither opportunity? Rising inequality, schools, and children’s life chances (pp. 91–116). New York: Russell Sage Foundation.
  37. Reynolds A, Temple J, Ou S, Arteaga I, & White BA (2011). School-based early childhood education and age-28 well-being: Effects by timing, dosage, and subgroups. Science, 333(6040), 360–364.
  38. Reynolds A, Temple J, White B, Ou S, & Robertson D (2011). Age 26 cost–benefit analysis of the Child‐Parent Center Early Education Program. Child Development, 82(1), 379–404.
  39. Schindler HS, McCoy DC, Fisher PA, & Shonkoff JP (2019). A historical look at theories of change in early childhood education research. Early Childhood Research Quarterly, 48, 146–154.
  40. Schweinhart LJ (2005). Lifetime effects: The High/Scope Perry Preschool study through age 40. Ypsilanti, MI: High/Scope Press.
  41. Welsh JA, Bierman KL, & Mathis ET (2014). Parenting programs that promote school readiness. In Boivin M & Bierman K (Eds.), Promoting school readiness and early learning: The implications of developmental research for practice (pp. 253–279). New York: Guilford Press.
  42. Welsh JA, Bierman KL, Nix RL, Heinrichs B, & Gest SD (in press). Sustained effects of a school readiness intervention: Fifth grade outcomes of the Head Start REDI program. Early Childhood Research Quarterly.
  43. Zaveri H, Burwick A, & Maher E (2014). Home visiting: The potential for cost savings from home visiting due to reductions in child maltreatment. Washington, DC: Casey Family Programs.