Author manuscript; available in PMC: 2011 Jul 1.
Published in final edited form as: Early Child Res Q. 2010 Jul 1;25(3):299–313. doi: 10.1016/j.ecresq.2009.05.003

Implementation of a Relationship-Based School Readiness Intervention: A Multidimensional Approach to Fidelity Measurement for Early Childhood

Lisa L Knoche 1, Susan M Sheridan 1, Carolyn P Edwards 1, Allison Q Osborn 1
PMCID: PMC2932638  NIHMSID: NIHMS120183  PMID: 20824112

Abstract

The implementation efforts of 65 early childhood professionals involved in the Getting Ready project, an integrated, multi-systemic intervention that promotes school readiness through parent engagement for children from birth to age five, were investigated. Digital video records of professionals conducting home visits with families in both treatment and comparison conditions were coded objectively using a partial-interval recording system to identify and record early childhood professionals' implementation of intervention strategies and their effectiveness in promoting parent engagement and interest in their child. Adherence, quality of intervention delivery, differentiation between groups, and participant responsiveness were assessed as multiple dimensions of fidelity. Relative to the comparison group, early childhood professionals in the treatment group adhered more frequently to some intervention strategies and showed higher rates of total strategy use. In addition, significant positive relationships were found between years of experience, education, and quality of intervention delivery. Quality of intervention delivery differed by program type (Early Head Start versus Head Start). Adherence in the treatment group was correlated with the rate of contact between parent and early childhood professional during the home visit.

School Readiness

In recent years, many debates have taken place about how best to promote young children's school readiness (Vernon-Feagans & Blair, 2006). Many of the most respected early childhood interventions depend on child-centered curricula, with teachers and other service providers working directly with children (Mahoney & O'Sullivan, 1990). Simultaneously, the emergence of universal prekindergarten programming and pressures toward accountability in many states have prompted the development and trial of curricula intended to promote specific language, literacy, socioemotional, and/or mathematical skills in preschool children through intensive, child-centered instructional experiences delivered in school- or child-care based instructional settings (Barnett et al., 2008; Bierman et al., 2008; Preschool Curriculum Evaluation Research Consortium, 2008; Raver et al., 2008). As such, early childhood educators provide enriched educational experiences for children who are expected to acquire and develop necessary skills for school success.

A broader, ecological perspective on school readiness argues for the necessity of considering the surrounding contexts, roles, and relationships that collectively interact to influence child and family development (Bronfenbrenner, 1977). Within an ecological approach, the emphasis on and responsibility for a child's readiness for school shift from early childhood programs alone to the articulation of specific responsibilities shared among professionals, home visitors, teachers, parents, and extended family members (Mashburn & Pianta, 2006). The ecological, family-centered approach to school readiness extends our concern from a focus simply on "child readiness" for school to one that addresses "parent, child, and school readiness" to engage in learning over time and across settings.

There are innovative interventions that support this child- and family-centered approach. For example, much of the current programming in early childhood special education (ECSE) derives from such a perspective (Dunst, Trivette, & Cross, 1986; Girolametto, Verbey, & Tannock, 1994; Mahoney & MacDonald, 2007; McCollum, Gooler, Appl, & Yates, 2001). These interventions are not characteristic of early childhood programming outside of ECSE (e.g., universal pre-K); many early childhood programs build effective relationships with families, but are not focused on participatory opportunities for families (Dunst, 2002). Child- and family-centered interventions represent a promising direction as opposed to interventions that target child-only pathways to school readiness. More research and evaluation studies are needed to reveal the potential of alternative ecological frameworks for promoting school readiness among vulnerable young children.

Getting Ready Intervention

The Getting Ready intervention is an integrated, multi-systemic, ecologically-based intervention that promotes school readiness through enhancing parent engagement for children from birth to age five. As a model of practice, Getting Ready (Sheridan, Marvin, Knoche, & Edwards, 2008) promotes professional behavior that supports parents' competence and confidence in their interactions with their children within the context of cultural and family practices and values. The model is focused on supporting the dyadic parent-child relationship, and an exchange of ideas and developmentally-appropriate expectations for children between parents and early childhood professionals (ECPs). Triadic (McCollum & Yates, 1994) and collaborative (Sheridan & Kratochwill, 2008) strategies are used by professionals (see Table 1), which involve establishing relationships, maintaining positive communication, asking for and building on parental observations of child development, recognizing parental strengths and offering affirmations, supporting parents via the provision of developmental information, and encouraging them to discuss and prioritize concerns and needs for their child while at the same time focusing on child strengths. Specifically, ECPs observe parent-child interactions frequently, often in home settings, in an effort to support parental practices and interactions with their children. Additionally, ECPs brainstorm effective approaches to support children's development, model strategies that have been shown to advance children's attention, motivation, and understanding of new concepts and skills, and provide feedback (affirmations, suggestions, and demonstrations) on parental efforts to engage children in natural and structured learning opportunities. Children's responses are noted, and the need for adjusting intervention components is discussed based on these shared observational data, and plans for future directions are developed.

Table 1.

Getting Ready Intervention Strategies and Operational Coding Definitions

Strategy Definition
Strategy Use (Adherence)
Establish/re-establish relationship with parent Meaningful interaction and conversation exchange between the Early Childhood Professional (ECP) and parent which convey support, caring, or interest in family activities and well-being on the part of the facilitator. This includes exchanging personal information, acknowledging parent's response, discussing topics outside the bounds of the home visit, and “small talk.” Coded for the duration of the conversation/topic by either the ECP or parent.
Asks parent to share observations and ideas The ECP directly or indirectly, through questions or supportive statements, invites the parents' input regarding the child's development, likes/dislikes, and supportive strategies. Focus is on observations the parent makes about the child.
Affirm parents' competence Developmentally supportive interactions are warmly recognized and expanded upon, as are characteristics of child competence.
Establish dyadic context Elements of the environment are intentionally and actively arranged or rearranged to increase the probability of developmentally matched, mutually enjoyable parent-child interaction. The ECP makes efforts (regardless of their success) to provide activities that support the dyad/parent-child interaction, either directly through the parent or indirectly through the child.
Help parents discuss and prioritize concerns/needs This will typically occur during agenda setting or planning for the next visit. Concerns for the child as seen by the ECP and parent are discussed, and the ECP and parent collaboratively select concerns to focus on. The ECP engages the parent in conversation about priorities and desires. The ECP might ask about concerns as a prompt for discussion; the parent might not have any.
Focus parent's attention on child strengths Verbal statements are used to comment upon the child's strengths and to draw the parent's attention to particular competencies or actions within the child. These comments may be made in retrospect or about behavior occurring during the home visit.
Provide developmental information Information about the child's developmental agenda is given by verbally labeling or interpreting the child's emotional, cognitive, language, and/or motor abilities within the context of play and interaction. ECP provides parents with education around developmental milestones and why or how to engage the child in the activity.
Brainstorm This process is collaborative, a back and forth between ECP and parent. The ECP invites the parent to brainstorm/select strategies that fit into their home and daily routine (collaborative).
Make suggestions/ provide directives This process is directive. The ECP makes explicit statements to the parent about behaviors to support the target child's development and/or the parent-child interaction. This is typically not done as part of a collaborative conversation with the parent.
Promote practice and interaction through modeling Dyadic interaction roles are momentarily taken on by the ECP to enhance the parent's repertoire of developmentally appropriate strategies for interacting with the child. Whether prompted directly or indirectly, the parent responds by trying out the modeled behavior. It is modeling only if ECP demonstrates and turns it back to parent to practice.
Help plan for future goals, directions Discussion of strategies that will be used at home and/or in the classroom to support the child's development and how those strategies will be carried out. Keeping track of progress toward goals and activities for the next contact/visit can also be discussed.
Effectiveness (Quality of Intervention Delivery)
Overall effectiveness of the early childhood professional A high rating on this item indicates that the ECP provides ample opportunities for collaboration and initiates meaningful conversation with parent during the home visit (e.g., no interruptions while conversing with parent). ECP's attention is focused on parent-child relationship, and when appropriate she brings child into the activity/discussion. The ECP is effective in initiating conversations and discussions with parent and asking open-ended questions, and frequently encourages active parental participation throughout the home visit. A working relationship between ECP and parent is evident.
Participant Responsiveness
Parental level of interest and engagement with the early childhood professional A high rating on this item indicates that the parent displays much interest in or initiates activities with the ECP and participates in a bidirectional discussion with the ECP related to meaningful issues for the child and family. Parent's participation with the ECP is active such that parent initiates and elaborates on topics of discussions. Parent also asks questions or provides information to the ECP related to the topic of discussion. Physically, parent is usually in close proximity to ECP during most of the visit. Parent appears interested and enjoys interacting with the ECP.
Parental interest and engagement with child A high rating on this item indicates the parent displays much interest in the child and in the proposed activities and materials of the visit, and initiates conversations/activities with the child. Parent engages in meaningful conversations and interactions with the child. Parent's participation is active such that parent initiates child participation and elaborates on and encourages the child's contributions to discussions. Physically, parent is usually in close proximity to child during most of the visit. Parent appears genuinely interested in interacting with the child and enjoys their interactions.

Empirical investigations of the Getting Ready intervention to date have indicated that the intervention has improved functioning and well-being in both children and families in the treatment group. For children, the intervention was effective in improving social-emotional outcomes (Sheridan, Knoche, Edwards, Bovaird, & Kupzyk, in press). Specifically, preschool children in the treatment group were rated by teachers as showing an increase in attachment and initiative behaviors, and a decrease in anxiety-withdrawal behaviors, over a 2-year intervention period (Sheridan et al., in press). These findings suggest the intervention is particularly effective in building social-emotional competencies beyond the effects experienced as a function of participation in early childhood programming alone. Similar social-emotional effects have been indicated in the infant-toddler sample (Sheridan, Knoche, Edwards, Bovaird, & Kupzyk, 2009). Findings also indicate enhanced parent engagement behaviors in parents of infants and toddlers (Cline, Knoche, Edwards, Sheridan, & Martinez, 2009). Specifically, after 8 months of participation in the intervention, treatment families demonstrated improved levels of parental responsiveness (quality of warmth and sensitivity) and guided support (support for children's learning), as well as a decrease in the amount of directives and demands placed on children. These findings indicate the initial effectiveness of the Getting Ready intervention; analyses, however, have not included the degree to which ECPs implemented the intervention strategies as intended (i.e., fidelity).

Importance of Implementation with Fidelity

Beyond selecting a focus for school readiness efforts (child and/or family), implementing intervention efforts with fidelity is critically important. To adequately and reliably test the efficacy of interventions or treatment programs, it is necessary to understand whether the intervention actually occurs as designed (Dane & Schneider, 1998). Research has illustrated that variations in implementation fidelity contribute to programming outcomes (Durlak, 1998; Dusenbury, Brannigan, Falco, & Hansen, 2003; Zvoch, Letourneau, & Parker, 2007); thus, impact studies of early childhood intervention must take into account factors associated with implementation fidelity to fully understand programmatic outcomes and the critical features of intervention that are linked to those outcomes.

There are multiple interrelated issues associated with implementation fidelity that, if ignored, may lead to significant misrepresentations of intervention outcomes in an era when careful and systematic scrutiny in efficacy trials is at its height (Dusenbury et al., 2003; Flay et al., 2005). Careful measurement of implementation fidelity is essential for at least four reasons. First, failure to measure the fidelity with which interventions are implemented may lead to an incorrect conclusion that an intervention is ineffective when in fact it is effective or, conversely, effective when it is not. For example, statistical analyses comparing effects for participants in treatment versus comparison conditions might indicate that a given intervention is not effective at improving child outcomes. However, careful scrutiny of the data might indicate that some intervention sites or participants failed to implement the intervention as designed, or with fidelity. When the level or degree of implementation fidelity is introduced into the analyses, it may be that those sites with low fidelity to the intervention were not significantly different from the comparison sites, thereby masking true treatment effects for sites or participants implementing with high fidelity. Second, lack of attention to fidelity may lead to further implementation of the "wrong" treatment. Without monitoring implementation, adjustments to ensure adherence to the intervention protocol are not possible. Third, insufficient assessment of implementation fidelity may hide the reality that an intervention was applied in an uncontrolled, variable fashion, precluding replication under similar conditions using identical procedures (Rossi & Freeman, 1985). Failure to assess treatment fidelity results in research practices that evaluate the effects of an intervention as described rather than as delivered, yielding unreliable results with little to no bearing on actual intervention implementation and possible effects (a conceptual "type III error," or mismatch between the research foci and research questions; Dobson & Cook, 1980). Finally, it is important to investigate fidelity not only within the treatment group, but also within the comparison group. Measuring and accurately describing the counterfactual condition, or "business as usual," allows the unique features of the intervention, as well as overlap in services, to be identified. In evaluating interventions, it is important and highly relevant to understand the processes and practices of the comparison group.

Currently, there are limited studies in education broadly, and early childhood specifically, that address implementation fidelity. Methods for collecting data to accurately assess implementation fidelity in early childhood programming are emerging (Brophy-Herb et al., 2008; Klute, Moreno, Sciarrino, & Anderson, 2008; Trivette & Swanson, 2008; Zvoch et al., 2007) but not prevalent in the field (O'Donnell, 2008). Studies that provide detailed approaches for tracking fidelity in early childhood programming, as well as the subsequent use of fidelity data in efficacy analyses, are needed. Specifically, studies that investigate implementation across both treatment and comparison conditions are necessary to describe the counterfactual.

Defining Fidelity of Implementation

Fidelity of implementation in program evaluation or intervention studies is a multidimensional construct and is commonly characterized along five dimensions: adherence, dosage, quality of program/intervention delivery, participant responsiveness, and program differentiation (Dane & Schneider, 1998; Dusenbury et al., 2003; O'Donnell, 2008). Adherence is conceptualized as the implementation of intervention strategies as designed by program developers. Dosage is the amount of intervention that is delivered to participants. Quality of intervention delivery goes beyond adherence and indicates the quality, or effectiveness, with which intervention strategies are delivered. Participant responsiveness describes the participants' level of engagement in and receptiveness to intervention programming. Finally, program differentiation indicates whether the characteristics of the intervention distinguish the treatment group from the comparison group during implementation in studies evaluating efficacy. These components can be collapsed into factors addressing fidelity to structure and fidelity to process (Mowbray, Holter, Teague, & Bybee, 2003; O'Donnell, 2008).

In a study testing the efficacy of the Getting Ready intervention, we collected objective, observational data across both treatment and comparison conditions that address participants' adherence to intervention strategies as delivered during home visits with families; the quality of intervention delivery as measured by a global rating of the effectiveness of the early childhood professionals in initiating parental interest and engagement; responsiveness to the intervention as measured by parents' level of engagement and rate of contact with their child and with the professional during home visits; and program differentiation in the form of measures of adherence, quality, and responsiveness. Following recommended practice (O'Donnell, 2008), multidimensional data were collected that address both fidelity to structure (adherence) and fidelity to process (quality of intervention delivery, program differentiation). Participant responsiveness informs both structure (in terms of frequency of interactions between parents and children) and process (vis-à-vis ratings of overall parent engagement with children and professionals).

Understanding implementation fidelity is important as we move forward in designing, developing, and implementing early childhood interventions. Investigation of fidelity to structure and process in early childhood intervention studies will help determine whether important implementation features are intact, allowing follow-up questions about an intervention's potential to produce desired child and family outcomes. Furthermore, knowledge about factors that influence implementation of interventions in early childhood, including the identification of characteristics of individuals administering intervention activities that might relate to effective implementation, is important. Factors such as amount of professional experience and level of education, and their associations with implementation fidelity, are relevant for determining the conditions under which interventions can be most effectively implemented (Greenberg, Domitrovich, Graczyk, & Zins, 2005). Additionally, programmatic characteristics, including the developmental focus of services (infants/toddlers; preschoolers) or setting (home- or center-based), may relate to participants' abilities to implement interventions with fidelity, and data on implementation within these variable contexts are of interest. A descriptive and univariate examination of fidelity data is a first step toward understanding these associations, though to fully understand the contribution of these characteristics to implementation, studies that are designed to investigate variations in implementation are needed.

Purpose of Study and Research Questions

The purposes of this study are to examine the implementation efforts of early childhood professionals (ECPs) across both treatment and comparison conditions for individuals involved in the Getting Ready intervention, and to specify the relationship between fidelity to structure/process and participant responsiveness as measured by parent engagement. By including findings from both treatment and comparison participants, we are able to better understand the counterfactual condition with regard to certain aspects of family-centered services provided as part of typical home visits, and identify elements of the intervention that are being implemented naturally by early childhood professionals in our comparison condition. Additionally, contextual factors that might contribute to implementation fidelity are explored. Three specific research questions are investigated (with dimensions of fidelity indicated).

1. Do early childhood professionals in the Getting Ready treatment group demonstrate greater frequency of intervention strategy use and more effectively engage parents than those in the comparison condition, following training? Specifically:

  1. What is the degree to which ECPs across treatment and comparison groups deliver triadic/collaborative strategies during home visits (i.e., adherence)?

  2. What is the degree to which ECPs across treatment and comparison groups implement triadic/collaborative strategies effectively to initiate parent interest and engagement (i.e., quality)?

  3. To what degree can participants in the treatment group be differentiated from those in the comparison group based on rate of strategy implementation and ratings of effectiveness (i.e., program differentiation)?

To further understand the utility of the Getting Ready intervention and associated strategies for participants in the intervention group (i.e., receiving active training and support in the intervention), we include some questions designed to investigate implementation for professionals in the treatment condition. First, for participants in the treatment group, it is important to understand the relationships between adherence to strategy use, quality of implementation, and participant (parent) responsiveness to the intervention. Second, for these trained participants, it is necessary to begin to explore how program and individual characteristics may relate to implementation.

2. What is the bivariate relationship between (a) the ECPs' adherence to Getting Ready intervention strategies and the quality with which they initiate parental interest and engagement, and (b) participant responsiveness (measured by the rate of parent-child/parent-professional interaction and the level of parent-child/parent-professional engagement in home visits), for participants in the treatment group? Do these relationships vary by program type (Early Head Start vs. Head Start)?

3. What is the bivariate relationship between implementation fidelity and professional characteristics (i.e., education, years of experience), program type (Early Head Start vs. Head Start) and time in intervention for participants in the treatment group?

Method

Participants & Settings

Participants in this project were 65 early childhood professionals (ECPs) in Early Head Start (n = 38) and Head Start (n = 27) settings involved in the Getting Ready project, a large, federally-funded longitudinal research study investigating the effects of a parent engagement intervention on school readiness. Upon project initiation, sites were randomly assigned to treatment or comparison conditions, with professionals in the same buildings and/or workgroups assigned to the same conditions. Participation was voluntary for ECPs, and informed consent was obtained from all participants (i.e., parents, early childhood educators), following University Institutional Review Board procedures. Thirty-three participants were in the treatment group and 32 were in the comparison group.

Table 2 presents demographic information for participating ECPs, including age, level of education, years of experience, and race/ethnicity. The average amount of early childhood experience for Early Head Start ECPs was six years, and their length of enrollment in the Getting Ready study averaged 14.2 months. Head Start ECPs averaged nine years of early childhood experience, and their involvement in the Getting Ready study averaged 21.3 months. Demographic characteristics, including age, years of experience in early childhood settings, and years of experience conducting home visits, did not differ significantly between comparison and treatment participants. Analyses also indicated that educational level was statistically equivalent between participants in the treatment and comparison groups within each program type.

Table 2.

Demographic Information of Early Head Start (EHS) and Head Start (HS) Early Childhood Professionals

EHS (n = 38) HS (n = 27) Overall (N = 65)
Treatment Sample n=19 n=14 n=33
Comparison Sample n=19 n=13 n=32
Mean Age (SD) 32.93 (10.08) 35.00 (11.00) 33.88 (10.47)
Mean Length of Employment (in months) (SD) 29.36 (43.15) 38.00 (46.67) 33.45 (44.63)
Mean Early Childhood Setting Experience (in months) (SD) 75.84 (66.85) 114.42 (98.26) 94.90 (85.03)
Mean Home Visiting Services Experience (in months) (SD) 33.26 (41.02) 49.23 (70.63) 40.81 (56.98)
Mean Length of Time in Intervention (in days) (SD) 352.26 (212.58) 452.49 (283.09) 396.48* (249.31)
Gender: Female 100% 100% 100%
Ethnicitya:
 Hispanic/Latino 72% 4% 42%
 Non-Hispanic/ Latino 28% 96% 59%
Racea:
 White/Caucasian 27% 92% 58%
 Hispanic/Latino 69% 8% 40%
 Other 3% 0% 2%
Level of Educationa:
 High School Diploma 4% 2%
 Some Training beyond High School but not a degree 35% 18%
 One-Year Vocational Training Certificate 10% 6%
 Two-Year College Degree 35% 18%
 Four-Year College Degree 17% 46% 31%
 Some Graduate College Coursework 35% 16%
 Graduate Degree 19% 9%
Child Development Related Degree 53% 100% 80%
Early Childhood Teaching Endorsement/Certificate 11% 100% 55%
Another Type of Endorsement or Certification 27% 78% 62%
Child Development Associate Credential 39% 10% 27%
a Chi-square analyses reveal statistically significant distributions between EHS and HS groups, p < .001.

* p < .05. ** p < .01. *** p < .001.

The programmatic elements described below for both Early Head Start and Head Start are characteristic of both treatment and comparison conditions. Early Head Start (EHS) programs serving families with children age birth to three years were located within three community service agencies in rural counties in a Midwestern state; each agency housed between five and twenty-one ECPs. ECPs in the Early Head Start agencies provided services through weekly home visits scheduled to last up to 90 minutes and monthly family group activities held at the community agency (socializations). The average size of ECPs' caseloads in EHS was 10 families.

Head Start classrooms involved in the Getting Ready study were housed within elementary schools in a school district in a midsized, Midwestern community. Center-based Head Start services were provided to children age three to five years, five days per week for half-day (four hour) programming. In addition to center-based services, Head Start ECPs visited families' homes five times per academic year, with visits lasting up to 60 minutes, and held group socializations three to four times per academic year. Their average classroom/caseload size was 18 children and families.

Getting Ready (Triadic/Collaborative) Intervention Strategies

The Getting Ready strategies employed by ECPs as part of the intervention were intended to strengthen parental responsiveness, confidence, and competence in the context of parent-child interactions. In addition, parents were encouraged to actively participate, set goals, and assist in educational decision making. The Getting Ready intervention strategies combine triadic practices (McCollum & Yates, 1994) with collaborative processes (Sheridan & Kratochwill, 2008). The 11 strategies that constitute the Getting Ready intervention represent an integration of triadic and collaborative strategies and are fully defined in Table 1. The strategies are based in family-centered principles and encourage early childhood professionals to partner with families to positively impact children's development and promote child and family readiness for school (Sheridan et al., 2008). The strategies are individualized, responsive, and applied uniquely with families in a dynamic process; as such, no specific, predetermined level of "good fidelity" was specified a priori. That is, we do not know at exactly what frequency the strategies must be delivered in order to be effective. Rather, as a first approximation, we define "good fidelity" by findings from the composite of indicators, expecting Total Strategy Use to exceed a 50% score (i.e., strategies observed in over half of the coded intervals).

Early childhood professionals in the treatment group administered the intervention in naturalistic contexts of home visits, socializations, and center activities following initial and ongoing training. All early childhood professionals in the treatment group received general training via a structured training institute, booster sessions, and on-going group and individualized coaching twice monthly, with efforts instituted to move professionals toward internalization and full conceptual and practical integration of the Getting Ready strategies.

Comparison Condition

ECPs in the comparison condition participated in training sessions that were child-focused, in contrast to the family- and child-focused training provided in the treatment group. Specifically, they received training on social-emotional development and learning competencies of young children. They did not receive specific training on working with parents or on strategies to involve parents in their children's development. Comparison ECPs received supervision at the agency or school level for case management. Practices to involve parents in programmatic activities were highly consistent with the national Head Start and Early Head Start philosophies and policies. The number of family contacts did not vary between comparison and treatment groups; family activities and expectations for home visit frequency were the same across conditions, as described above.

Data Collection

Digital video recordings of home visits completed by ECPs in both the treatment and comparison conditions were collected by members of the research team, initially after four months of involvement in the Getting Ready study and then at least twice per year (and up to four times per year in the home-based Early Head Start program) over the course of professionals' involvement in the study. Thus, home visits from all ECPs in the treatment and comparison conditions were videotaped. Between one and seven home visits were recorded for each professional, for a total of 154 taped visits; more visits were collected from professionals who were in the study for a longer period of time. Early childhood professionals selected the family to participate in the taping, and the family provided consent to be recorded. To control for familiarity between the ECP and parent, the family had to have been assigned to the ECP for at least four months, and the visit being recorded had to be representative of a typical home visit (e.g., usual family members were present; the child was healthy). Recorded home visits ranged from 20 to 90 minutes in length. Video recordings and demographic data used in this study include those collected between September 2004 and December 2007.

Measurement of Study Variables

Implementation fidelity

The key variables used in this investigation are derived from the Home Visit Coding Guide used to code the home visit videotapes described above. Study variables map onto indicators of implementation fidelity as specified below. Specifically, the Home Visit Coding Guide measured (a) ECP's use of individual strategies (i.e., rate with which each Getting Ready strategy was used during home visit) as well as the total rate of strategy use (i.e., sum of the rates of individual strategy use) (adherence); (b) ECP's effectiveness at initiating parental interest and engagement (quality of intervention delivery); (c) the interest and engagement levels between parents and children, and parents and ECPs (participant responsiveness); and (d) the rate of interaction between parents-children and parents-ECPs (participant responsiveness). Additional variables included demographic characteristics of early childhood professionals and the amount of time ECPs had been involved in the intervention. Dosage, defined as the number of home visits completed, was not included in the analyses because the rate of home visit completion was consistent across all professionals as a function of EHS and HS programming.

The Home Visit Coding Guide was adapted from the Home Visit Observation Form (HVOF; McBride & Peterson, 1997). The HVOF was originally used to determine the processes and content of home visits with families of children, newborn to three years, with disabilities. The original coding scheme was based on a family-centered approach to home visit delivery, with the purpose of quantitatively describing ECP practices within the home visit. Modifications to the HVOF were completed for the present study to allow for the assessment of triadic/collaborative strategies and measurement of important interactions and outcomes (i.e., parent-child/parent-professional engagement) unique to the Getting Ready intervention (see Table 1). The face and content validity of the adapted form, which identified specific triadic/collaborative strategies demonstrated by early childhood professionals during home visits, was established through review by two experts in the use of triadic/collaborative strategies during home visits. The experts reviewed definitions on the modified HVOF and provided suggestions for adjustment of the coding form prior to its use.

The modified Home Visit Coding Guide specified a partial-interval recording system to identify and record early childhood professionals' (a) implementation of intervention strategies, (b) effectiveness in promoting parent engagement, and (c) parental interest in and engagement with their child in the context of the home visit. Specifically, for each videotaped home visit collected from each ECP, one-minute partial interval recording procedures were used wherein strategy use (Table 1) was coded if it was observed to occur at all during the interval. During these same intervals, the interactions among participants were coded to identify who was interacting with whom (parent-child; parent-professional; professional-child). These procedures were used for the duration of each videotaped home visit.
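For readers interested in the mechanics of the partial-interval approach, the sketch below tallies hypothetical one-minute interval codes for a single visit. The interval contents, strategy labels, and variable names are illustrative assumptions, not the project's actual coding software.

```python
# Illustrative sketch (not the project's software): tallying partial-interval
# codes for one home visit. Each element of `intervals` represents one minute
# and lists the strategy codes observed at any point during that minute.
from collections import Counter

intervals = [
    {"establish_relationship", "share_observations"},   # minute 1
    {"establish_relationship"},                          # minute 2
    set(),                                               # minute 3: no strategy observed
    {"affirm_competence", "share_observations"},         # minute 4
]

visit_length = len(intervals)                 # total minutes coded
counts = Counter(code for minute in intervals for code in minute)

# Rate of each strategy = intervals containing the strategy / visit length in minutes
rates = {code: n / visit_length for code, n in counts.items()}
total_rate = sum(rates.values())              # analogous to Total Strategy Use

print(rates)
print(round(total_rate, 2))
```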

Additionally, observational ratings were made on three global scales at the end of each 10-minute segment of the home visit to assess general levels of effectiveness and engagement, with scores ranging from one (low) to four (high). The anchor points are assumed to be equally spaced, so the ratings are treated as interval-level data. The first scale assesses the overall effectiveness of the early childhood professional. This rating was based on the early childhood professional's ability to provide ample opportunities for collaboration, to initiate meaningful conversations and discussions with the parent, and to focus on the parent-child relationship. The second scale measures parental level of interest and engagement with the early childhood professional, coded in terms of parent engagement through bidirectional discussions, initiations and elaborations on meaningful issues for the child and family, and demonstrations of active participation in activities presented by the early childhood professional. The third scale assesses the level of parental interest and engagement with the child. Behaviors operationally defined in this scale include parents showing interest, participating actively, and engaging in and initiating meaningful conversations and interactions with their child. Table 1 presents additional information on the global scales.

Scores generated from the Home Visit Coding Guide for each early childhood professional are as follows: Individual Strategy Use (i.e., the rate with which each strategy was used during the home visit, computed for each strategy by dividing the cumulative strategy use over the course of the visit by the total length of the home visit in minutes); Total Rate of Strategy Use (i.e., the sum of the rates of individual strategy use); Parent-Child, Parent-Professional, and Professional-Child Rate of Interaction (the number of intervals the dyad spent interacting during the home visit, divided by the visit length in minutes); and Effectiveness Rating (i.e., the average Likert rating of effectiveness as coded by an independent coder, across all 10-minute intervals during a home visit; see above). Parents also received a score for their level of interest and engagement with the professional, and for their interest and engagement with their child, based on the same one (low) to four (high) Likert scale. Because most ECPs completed more than one home visit, total scores per professional were computed by averaging values across all of their respective home visits. The skewness and kurtosis values for each of the Likert ratings were examined and indicated approximately normal distributions.
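The following sketch illustrates, under assumed data, how per-professional scores of this kind can be derived by averaging segment-level Likert ratings within visits and then averaging across a professional's visits; the example values and field names are hypothetical rather than taken from the study data.

```python
# Illustrative sketch (assumed data layout, not the study's code): deriving
# per-professional scores by averaging across that professional's taped visits.
visits = [
    # one dict per coded home visit for a single ECP
    {"total_rate": 0.61, "effectiveness": [3, 3, 2],        # 10-min segment ratings (1-4)
     "parent_child_rate": 0.70},
    {"total_rate": 0.55, "effectiveness": [2, 3, 3, 3],
     "parent_child_rate": 0.66},
]

def mean(xs):
    return sum(xs) / len(xs)

# Effectiveness per visit = mean Likert rating across 10-minute segments;
# per-ECP score = mean across all of that ECP's visits.
effectiveness_score = mean([mean(v["effectiveness"]) for v in visits])
total_rate_score = mean([v["total_rate"] for v in visits])
parent_child_score = mean([v["parent_child_rate"] for v in visits])

print(round(effectiveness_score, 2), round(total_rate_score, 2), round(parent_child_score, 2))
```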

Six independent coders from the research team were trained to accurately and reliably code ECP behaviors and interactions with families during each of the videotaped home visits. Coders were naive to condition assignment. All trained coders viewed sample digital videos identifying the various triadic strategies, practiced coding behaviors in pairs using samples of video-recorded home visits, and independently coded ECPs' behaviors and interactions using a minimum of two samples of digital videos of home visits. To ensure integrity of the coding process, all coders were required to independently code at least three sample visits and obtain inter-rater reliability of at least 85% before proceeding to independent coding.

During the ongoing coding process, reliability checks were completed for approximately 33% of the home visits to assess inter-rater reliability, using an 85% cut-off, which is considered an acceptable level for behavioral data (Suen & Ary, 1989). In cases where inter-rater reliability fell below 85%, a refresher course in coding was conducted. Inter-rater agreement across codes for Getting Ready strategies ranged from 91.6% to 99.6%. In addition, Cohen's kappa was computed as an estimate of inter-rater reliability for exact agreement and was sizeable at .80; values higher than .60 are considered substantial (Landis & Koch, 1977). For the three global, Likert-type scales, inter-rater agreement for ratings within one point ranged from 93.5% to 97.3% (exact agreement ranged from 68.4% to 80.6%). Additionally, intraclass correlations (ICCs) were computed for each of the three global Likert scales for exact agreement. The average ICC for parent engagement with the ECP was .81, with a range across intervals of .63 to .92; the ICC for parent engagement with the child was .79, with a range of .68 to .94; and the ICC for ECP effectiveness was .69, with a range of .62 to .85.
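The agreement indices reported above (percent agreement and Cohen's kappa for exact agreement) can be illustrated with a minimal sketch on hypothetical interval-by-interval codes from two coders; the data shown are invented for illustration only.

```python
# Illustrative sketch of the agreement indices described above (hypothetical data):
# interval-by-interval presence (1) / absence (0) of a strategy from two coders.
coder_a = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
coder_b = [1, 1, 0, 0, 0, 0, 1, 0, 1, 1]

n = len(coder_a)
po = sum(a == b for a, b in zip(coder_a, coder_b)) / n    # percent (exact) agreement

# Chance agreement from each coder's marginal proportions
p1a, p1b = sum(coder_a) / n, sum(coder_b) / n
pe = p1a * p1b + (1 - p1a) * (1 - p1b)

kappa = (po - pe) / (1 - pe)                               # Cohen's kappa
print(round(po, 2), round(kappa, 2))
```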

Demographic information

At the beginning of their participation in the Getting Ready study, each Early Head Start and Head Start professional completed a demographic survey (Sheridan, Edwards, & Knoche, 2003). Items included topics such as educational background, professional experiences, certifications, gender, and ethnicity/race. Details regarding demographic characteristics of Early Head Start and Head Start ECPs are in Table 2.

Time in intervention

A variable was computed to establish the length of time a professional had participated in the Getting Ready study. The amount of time in days from the date of original consent (which was the time of training for treatment ECPs) to the date of the median home visit recording was computed and included in analyses as a possible correlate of implementation fidelity. The time an ECP spent in the Getting Ready project, however, is not a proxy for dosage of intervention. Dosage, rather, is defined as the total number of home visits completed with each family, which is specified by each program (i.e., based on program requirements, all Early Head Start families receive weekly home visits; Head Start families receive five home visits per year). Summary statistics for time in intervention are provided in Table 2.
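A minimal sketch of this computation, using hypothetical consent and recording dates, is shown below; the dates are invented, and the calculation simply counts the days from consent to the median recorded visit.

```python
# Minimal sketch (hypothetical dates) of the time-in-intervention computation:
# days from the original consent date to the date of the ECP's median recorded visit.
from datetime import date
from statistics import median

consent = date(2005, 9, 1)
visit_dates = [date(2006, 1, 10), date(2006, 5, 3), date(2006, 11, 20)]

ordinals = sorted(d.toordinal() for d in visit_dates)
median_visit = date.fromordinal(int(median(ordinals)))   # middle recorded visit
time_in_intervention = (median_visit - consent).days

print(median_visit, time_in_intervention)
```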

Results

The results for this study are presented in three parts, according to each primary research question. Both effect size estimates and tests of statistical significance are included. Effect sizes are considered moderate if d = .50 to .70 and large if d = .80 or greater (Cohen, 1988). Effect sizes were computed using Cohen's d = 2t/√(df); this effect size calculation is appropriate for interval-level data. Analyses included frequencies, means, t-tests and bivariate correlations. The analytic approach for each research question is specified in the sections that follow.
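As a worked illustration of the conversion, the sketch below applies d = 2t/√(df) to the rounded t and df reported for Total Strategy Use in Table 3; because the inputs are rounded, the result only approximates the tabled effect size.

```python
# Minimal sketch of the effect-size conversion described above: d = 2t / sqrt(df).
from math import sqrt

def cohens_d_from_t(t, df):
    return 2 * t / sqrt(df)

# Rounded t and df for Total Strategy Use from Table 3
print(round(cohens_d_from_t(2.15, 62.82), 2))   # ~0.54 (Table 3 reports 0.53 from unrounded values)
```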

Research Question 1: Do early childhood professionals in the Getting Ready treatment group demonstrate greater frequency of intervention strategy use and more effectively engage parents than those in the comparison condition, following training?

Descriptive statistics revealed that early childhood professionals in the treatment group used Getting Ready strategies during an average of 59% of home visit intervals (see Table 3). Participants in the comparison condition were observed engaging in strategy use during 48% of the home visit intervals. The strategies used most frequently by both groups were establishing/re-establishing positive relationships with parents, asking parents to share their observations, and making suggestions. When ECPs in the treatment group were coded as not engaging in Getting Ready strategies, they were observed to be providing updates to parents on classroom activities, sharing agency-required information, and collecting paperwork; these activities were coded in approximately 10% of the home visit intervals. The comparison ECPs engaged in these activities nearly twice as frequently, during 19% of the intervals. In addition, parents and children in the treatment group were observed interacting with each other during 69% of intervals, on average, compared to 54% for participants in the comparison group. Effectiveness was rated as 2.67 on the four-point Likert-type scale for professionals in the treatment group, indicating relatively high levels of quality in initiating parental interest and engagement with their children. This rating was 2.10 for ECPs in the comparison group.

Table 3.

Rate of Strategy Use, Interaction Among Participants and Effectiveness Ratings by Treatment and Comparison Groups

Comparison Treatment

M(SD) M(SD) t-test df Obtained p-value Critical p(i) Effect Size
Getting Ready Strategy Use (Adherence)

Total Strategy Use 0.48 (0.20) 0.59 (0.22) 2.15* 62.82 0.0358 0.017 0.53
Establish/Re-establish relationship with parent 0.12 (0.08) 0.14 (0.08) 0.81 62.98 0.4231 0.038 0.20
Ask parent to share observations and ideas 0.08 (0.07) 0.10 (0.07) 1.25 62.74 0.2172 0.029 0.31
Affirm parent's competence 0.02 (0.02) 0.06 (0.05) 3.94*** 46.67 0.0003 0.004a 0.97
Establish dyadic context 0.02 (0.03) 0.04 (0.03) 2.35* 62.95 0.0218 0.008 0.58
Help parents discuss and prioritize concerns/needs 0.01 (0.01) 0.01 (0.01) 1.48 63.00 0.1446 0.025 0.36
Focus parents' attention on child strengths 0.08 (0.09) 0.07 (0.06) −0.05 52.82 0.9626 0.050 −0.12
Provide developmental information 0.06 (0.06) 0.05 (0.05) −0.52 59.98 0.6080 0.042 −0.18
Brainstorm 0.001 (0.001) 0.01 (0.01) 2.27* 43.67 0.0281 0.013 0.55
Make suggestions/Provide directives 0.07 (0.06) 0.08 (0.04) 0.50 58.65 0.6187 0.046 0.29
Promote practice and interaction through modeling 0.001 (0.001) 0.01 (0.01) 1.84 37.44 0.0735 0.021 0.45
Help plan for future goals, directions 0.03 (0.03) 0.04 (0.03) 1.05 61.55 0.3005 0.033 0.26

Primary Interactors

Parent-child rate of interaction 0.54 (0.26) 0.69 (0.14) 2.77** 48.17 0.0080 0.017a 0.69
Parent-ECP rate of interaction 0.77 (0.18) 0.77 (0.14) 0.20 58.74 0.8439 0.033 0.05
ECP-child rate of interaction 0.65 (0.22) 0.64 (0.17) −0.18 58.10 0.8561 0.050 −0.05

Effectiveness (Quality of Intervention Delivery)

ECP effectiveness rating 2.10 (0.67) 2.67 (0.69) 3.36** 62.99 0.0013 0.017a 0.83

Participant Responsiveness

Rating of parent engagement with ECP 2.77 (0.55) 3.00 (0.55) 1.70 62.95 0.0936 0.050 0.42
Rating of parent engagement with child 2.50 (0.73) 2.94 (0.72) 2.43* 61.57 0.018 0.030a 0.61
a Significant with FDR control; obtained p must be ≤ p(i) to be significant.

* p < .05. ** p < .01. *** p < .001.

To determine the degree to which strategy use could be differentiated based on treatment group assignment, a set of 18 independent-samples t-tests (two-tailed) was conducted (Table 3). The t-tests formed three "families" of tests. The "strategy use" family was composed of 12 variables (i.e., eleven rates of individual strategy use and a total strategy use variable). The "interactors" family included the frequency of interaction between parent-child, parent-ECP, and child-ECP. The "quality" family included ratings of ECP effectiveness, parent-child engagement, and parent-ECP engagement. We used the Benjamini and Hochberg (1995) sequential Bonferroni-type procedure within each family to protect against inflation of Type I error (i.e., we implemented a procedure to control the false discovery rate, or FDR). The procedure has been proven to control the FDR for both continuous and discrete test statistics (Benjamini & Yekutieli, 2001).
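The sketch below illustrates the Benjamini-Hochberg step-up procedure applied to one family of tests, using the obtained p-values of the strategy-use family from Table 3 with q = .05; it is an illustration of the procedure, not the authors' analysis code.

```python
# Minimal sketch of Benjamini-Hochberg FDR control within one "family" of tests,
# using the 12 obtained p-values of the strategy-use family in Table 3 (q = .05).
def benjamini_hochberg(p_values, q=0.05):
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    critical = [0.0] * m
    max_rank = 0
    for rank, idx in enumerate(order, start=1):
        critical[idx] = rank * q / m                 # per-test critical p(i)
        if p_values[idx] <= critical[idx]:
            max_rank = rank                          # largest rank still under its cut-off
    rejected = [False] * m
    for rank, idx in enumerate(order, start=1):      # step-up: reject ranks 1..max_rank
        rejected[idx] = rank <= max_rank
    return critical, rejected

strategy_p = [0.0358, 0.4231, 0.2172, 0.0003, 0.0218, 0.1446,
              0.9626, 0.6080, 0.0281, 0.6187, 0.0735, 0.3005]
crit, rej = benjamini_hochberg(strategy_p)
print([round(c, 3) for c in crit])   # reproduces the critical p(i) column of Table 3
print(rej)                           # only the .0003 test (affirm competence) remains significant
```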

Additionally, traditional t-tests assume homogeneity of variance between the independent groups. Levene's test for equality of variance was used and resulted in significant test statistics on five of the 18 variables. In these cases, the corrected test statistics, degrees of freedom, and p-values are reported in Table 3 and these p-values were entered into the FDR control procedure.
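A minimal sketch of this decision rule, on hypothetical data and using SciPy, is shown below: Levene's test is run first, and Welch's unequal-variance t-test is substituted when the homogeneity assumption is rejected.

```python
# Illustrative sketch (hypothetical data) of the decision rule described above:
# run Levene's test, then fall back to Welch's unequal-variance t-test when it
# is significant. Requires scipy.
from scipy import stats

treatment = [0.59, 0.62, 0.71, 0.48, 0.66, 0.55, 0.63]
comparison = [0.48, 0.30, 0.52, 0.75, 0.41, 0.20, 0.58]

levene_stat, levene_p = stats.levene(treatment, comparison)
equal_var = levene_p >= 0.05                     # homogeneity holds if Levene's test is not significant
t_stat, p_value = stats.ttest_ind(treatment, comparison, equal_var=equal_var)

print(round(levene_p, 3), equal_var, round(t_stat, 2), round(p_value, 4))
```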

Although seven of the tests of significance had p-values less than .05, four fell below the incremental critical p-values for FDR control. After imposing control for FDR, findings indicated that ECPs in the treatment group demonstrated significantly more attempts to affirm parents' competence (t(46.67) = 3.94, p < .001) and had significantly higher ratings of overall effectiveness in intervention implementation (t(62.99) = 3.36, p = .001). Ratings of parent engagement with their children were also significantly higher in the treatment group (t(61.57) = 2.43, p = .018), as was the amount of interaction between parent and child (t(48.17) = 2.77, p = .008).

The effect sizes provided in Table 3 indicate seven effects that were moderate or large, providing another view of group differences. Relative to comparison participants, treatment group participants demonstrated higher rates of overall intervention strategy use, offered more affirmations of parents' competence, engaged in a higher rate of brainstorming with families during the home visit, and demonstrated more frequent efforts to establish the dyadic context between parent and child. In addition, the amount of interaction between parent and child occurring during home visits was higher in the treatment group, and ratings of parent engagement with their children were also higher in the treatment group. Early childhood professionals in the intervention group had higher ratings of overall effectiveness in intervention implementation than professionals not involved in the intervention, differentiating the two groups on this important dimension.

Research Question 2: What is the bivariate relationship between (a) the ECPs' adherence to Getting Ready intervention strategies and the quality with which they initiate parental interest and engagement, and (b) participant responsiveness (measured by the rate of parent-child/parent-professional interaction and the level of parent-child/parent-professional engagement in home visits), for participants in the treatment group? Do these relationships vary by program type (Early Head Start vs. Head Start)?

The second research question considered the bivariate associations between adherence and quality of intervention delivery, on the one hand, and participant responsiveness (i.e., ratings of parental engagement in home visits, rates of interaction), on the other. We were interested in understanding whether training in Getting Ready procedures and the subsequent use of strategies by treatment ECPs related to participant (parental) uptake (e.g., responsiveness as measured by ratings of parental engagement). Therefore, only professionals in the treatment group were included in the analyses. Because we presumed there might be differences between participants working in different programs (Early Head Start, Head Start), we investigated relationships within groups and differences between them. Two-tailed t-tests, bivariate correlations, and Fisher's z-tests for the difference between correlations were conducted. Total rate of strategy use over the course of the home visit was used as a measure of implementation adherence, and overall effectiveness ratings were used to assess quality of intervention delivery.

First, an independent-samples t-test analysis was conducted to assess group differences between Early Head Start and Head Start ECPs. The t-test analysis indicated statistically significant differences between Early Head Start and Head Start professionals for overall effectiveness (t(31) = −2.06, p < .05, two-tailed; d = 0.76) with HS professionals (M = 2.94; SD = 0.41) demonstrating higher levels than EHS professionals (M = 2.47; SD = 0.79). Results were not statistically significant between program types on overall strategy use, d = 0.19.

Because of suggested differences between program types in terms of quality (i.e., effectiveness) of intervention delivery as indicated by the t-test analysis and individual characteristics that varied between ECPs across the EHS and HS programs (see Table 2), bivariate correlation analyses were conducted for the sample overall, and for each program type separately. Correlations are presented in Table 4. In addition, Fisher's z-test was conducted to assess differences in correlations between program types.

Table 4.

Bivariate Correlations of Adherence, Quality of Intervention Delivery and Participant Responsiveness by Program (Early Head Start, Head Start) and Overall

Participant Responsiveness Effectiveness (Quality) Total Strategy Use (Adherence)

EHS (n=19) HS (n=14) Overall (N = 33) EHS (n=19) HS (n=14) Overall (N = 33)

Parent-ECP Rate of Interaction 0.05 0.59* 0.05 0.49* 0.67** 0.48**
ECP-Child Rate of Interaction −0.03 −0.05 0.11 0.29 −0.20 0.16
Parent-Child Rate of Interaction 0.44+ −0.16 0.25 0.08 −0.31 −0.01
Rating of Parent Engagement with ECP 0.01 0.85*** 0.19 −0.02 0.82*** 0.20
Rating of Parent Engagement with Child 0.20 0.19 0.19 −0.12 −0.05 −0.10
+ p < .10. * p < .05. ** p < .01. *** p < .001.

Overall, significant positive correlations were indicated between total strategy use (adherence) and the rate of parent-professional interaction during the visit (participant responsiveness). No statistically significant relationships were identified between effectiveness (quality of intervention delivery) and rates of interaction among professionals, parents, and children, or between effectiveness and average ratings of engagement among parents, professionals, and children (although some correlations were significant for the Head Start group). In general, total strategy use was significantly related to effectiveness of implementation for the combined group of ECPs (r(31) = 0.54, p < .001), as well as for each individual program (EHS, r(17) = 0.51, p < .05; HS, r(12) = 0.88, p < .001). The pattern of correlations between variables differed by program type (ECPs in Head Start versus Early Head Start). Although the relationship of quality of implementation to the rate with which parents and ECPs interacted varied between EHS and HS professionals, the Fisher's z-test indicated that the correlations were not significantly different (z = 1.6, p > .05). However, the test indicated that the correlations for the rating of parent engagement with the ECP were significantly different between program types, for both quality of implementation (z = 3.18, p < .01) and adherence to strategies (z = 3.0, p < .01), as was the relationship between total strategy use and effectiveness of implementation (z = 2.08, p < .05).
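The Fisher r-to-z comparison used above can be sketched as follows; the example reproduces the z = 3.18 contrast between the EHS (r = .01, n = 19) and HS (r = .85, n = 14) correlations of parent engagement with quality of implementation.

```python
# Minimal sketch of the Fisher r-to-z comparison of two independent correlations.
from math import atanh, sqrt

def fisher_z_difference(r1, n1, r2, n2):
    z1, z2 = atanh(r1), atanh(r2)                 # Fisher r-to-z transformation
    se = sqrt(1 / (n1 - 3) + 1 / (n2 - 3))        # standard error of the difference
    return (z1 - z2) / se

z = fisher_z_difference(0.01, 19, 0.85, 14)       # EHS vs. HS correlations from Table 4
print(round(abs(z), 2))                           # 3.18, matching the reported contrast
```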

Research Question 3: What is the bivariate relationship between implementation fidelity and professional characteristics (i.e., education, years of experience), program type (Early Head Start vs. Head Start) and time in intervention for participants in the treatment group?

To investigate the relationships between professional characteristics (education and years of experience), program characteristics, time in intervention, and implementation fidelity, two-tailed t-tests and bivariate correlation analyses were conducted. Again, we focused exclusively on treatment participants in this analysis because of our interest in associations for those who received training and ongoing support in the implementation of the intervention. As a result of the suggested differences between programs in terms of quality of intervention delivery, and because demographic characteristics of ECPs varied between programs, correlation analyses were conducted for each program in addition to the overall sample of ECPs in the treatment group. Bivariate correlations are provided in Table 5. Overall, significant positive correlations were indicated between effectiveness of strategy use and both experience in an early childhood setting (r(26) = 0.40, p < .05) and educational level (r(26) = 0.52, p < .01). Overall strategy adherence was not significantly related to any of the education and experience variables. Experience conducting home visits, length of time in current work settings, and time since training in the Getting Ready intervention did not significantly relate to professionals' adherence or quality of delivery. Despite differences indicated in the t-test analysis for program type, no significant relationships were identified specific to either Early Head Start or Head Start ECPs.

Table 5.

Bivariate Correlations of Professional Characteristics and Measures of Quality of Intervention Delivery and Adherence by Program (Early Head Start, Head Start) and Overall

Professional Characteristics Effectiveness (Quality) Total Strategy Use (Adherence)

EHS (n = 19) HS (n = 14) Overall (N = 33) EHS (n = 19) HS (n = 14) Overall (N = 33)

Months of home visiting experience 0.26 0.08 0.23 0.28 0.04 0.12
Months of employment 0.28 0.12 0.24 0.28 0.05 0.20
Months of early childhood experience 0.31 0.37 0.40* 0.28 0.40 0.26
Level of education 0.28 0.39 0.52** 0.08 0.39 0.04
Time in intervention 0.10 −0.09 0.18 0.07 −0.02 0.02
* p < .05. ** p < .01.

Discussion

To fully understand early childhood programming and intervention, studies of implementation fidelity are needed. Research has demonstrated that variations in the implementation of programming can create a host of interpretive difficulties for early childhood intervention researchers. Lack of implementation fidelity may reduce the "real" efficacy of an intervention, potentially masking significant intervention effects when they are present, owing to inconsistencies or inaccuracies in the delivery of intervention procedures (Dane & Schneider, 1998). To further study fidelity in early childhood programming, investigations of appropriate methodologies for assessing implementation fidelity, including multidimensional approaches, are needed. The current study used a multidimensional approach to the measurement of implementation fidelity that allowed for an investigation of the full implementation of the Getting Ready intervention, an ecologically based early childhood intervention to promote school readiness.

In this study, the structure of intervention implementation was assessed by investigating adherence (i.e., what was delivered in terms of specific Getting Ready strategies). We attempted to understand process elements by attending to how the intervention was delivered by early childhood professionals (i.e., quality, or the competence with which early childhood professionals delivered the strategies; and differentiation, or the uniqueness of the Getting Ready intervention strategies vis-à-vis other interventions in place in naturalistic early childhood education settings). Participant responsiveness (i.e., the extent to which recipients of the intervention are engaged and involved), which taps both structure and process (O'Donnell, 2008), was also assessed. Given that responsiveness shares important features with outcomes that we deem significant to our Getting Ready intervention (i.e., parent-child and parent-professional engagement; parent-child and parent-professional interactions), participant responsiveness was of keen interest to us.

A unique aspect and particular strength of the present study is the comprehensive approach with which we assessed the implementation fidelity of our early childhood intervention. Specifically, four of the five fidelity criteria specified by Dane and Schneider (1998) were measured in this study, and not surprisingly, each revealed a slightly different perspective on the implementation of this complex intervention. Most intervention research that addresses fidelity does so in a manner that either (a) focuses on methods to increase, rather than monitor, fidelity; or (b) assesses only a narrow and limited aspect of implementation (e.g., adherence; Dane & Schneider, 1998). A more complete picture of the adequacy of intervention delivery is possible by monitoring implementation using all or most of the five fidelity criteria, which together allow researchers to understand each aspect of fidelity (Dane & Schneider, 1998) and how these aspects jointly inform implementation more broadly.

A second strong feature unique to our study is our attention to implementation fidelity not only in the treatment group but also in the comparison group, thereby uncovering elements of service delivery under conditions of "business as usual." Indeed, to determine the "added value" of interventions aimed at enhancing outcomes for children and families, it is necessary to establish objectively what constitutes standard practice. We objectively assessed the active intervention ingredients delivered by early childhood professionals in both our treatment and comparison groups, as well as their effectiveness at initiating engagement and collaboration with parents during home visits. This allowed us to (a) specify "business as usual" with regard to certain aspects of family-centered services in Head Start and Early Head Start home visits, and (b) identify elements of our Getting Ready intervention that were being implemented naturally by ECPs in our comparison condition.

Scientifically, the assessment across treatment and comparison conditions will allow us to discern which key intervention ingredients are unique to the Getting Ready treatment group (those that differentiate the treatment from other early childhood practices) and which are common elements of good standard early childhood practice, such as prevailed in our participating agencies and schools. Indeed, some of what we considered active Getting Ready strategies were not unique to the treatment condition, as evidenced by similar rates of implementation within home visits across treatment and comparison groups. Some of the ECPs in the comparison group were delivering relatively high levels of parent-child (triadic) and parent-professional (collaborative) strategies, independent of exposure to the Getting Ready intervention training and support. These data will allow us to conduct analyses that go beyond simple assignment to condition and to further investigate the critical features of the Getting Ready strategies that relate to important child and family outcomes; this is an important next step in utilizing the implementation fidelity data, though not a focus of the current investigation.

Taken together, our findings suggest generally satisfactory implementation fidelity among participants in our treatment group. Treatment professionals were observed actively using Getting Ready strategies during 59% of the intervals within home visits, and a substantial proportion of intervals (69%) revealed evidence of parent and child interacting with one another. Furthermore, treatment professionals' performance can be differentiated from that of the comparison group in several ways. Specifically, early childhood professionals in our treatment group effectively initiated parental interest and engagement (quality) at a significantly higher level than comparison group participants. Effect size analyses indicated that treatment professionals used strategies at a higher rate than the comparison group; however, tests of significance that controlled the false discovery rate (Type I error) indicated a significant difference only for affirmation of parents' competence. Additionally, after controlling the false discovery rate, home visits delivered by members of the treatment group included significantly more time intervals in which parents and children interacted with each other, and parents were rated as demonstrating greater interest and engagement with their child, compared to home visits completed by comparison group professionals (participant responsiveness). Finally, the comparison group spent twice as much time as the treatment group explaining classroom behaviors and activities and reviewing paperwork.
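For context, the false discovery rate control referred to here follows the step-up logic of Benjamini and Hochberg (1995), cited in the reference list: the family of p-values is ranked, and each is compared against an increasing threshold. The sketch below is a minimal illustration with hypothetical p-values, not values from the study; in practice the same procedure is available in standard statistical libraries (e.g., statsmodels' multipletests with method='fdr_bh').

import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    # Mark which hypotheses are rejected while controlling the
    # false discovery rate at level q (Benjamini & Hochberg, 1995).
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)                        # rank p-values, smallest first
    thresholds = q * np.arange(1, m + 1) / m     # stepped thresholds (i/m) * q
    below = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()           # largest rank meeting its threshold
        reject[order[:k + 1]] = True             # reject all hypotheses up to that rank
    return reject

# Hypothetical p-values for a family of strategy-level group comparisons.
print(benjamini_hochberg([0.003, 0.021, 0.040, 0.260, 0.720]))
# -> [ True False False False False ]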

It is the composite set of fidelity indicators, including adherence, quality of program/intervention delivery, participant responsiveness, and program differentiation, that allows us to fully understand implementation fidelity of the Getting Ready intervention. No single fidelity metric can be interpreted in isolation, given the process-driven, individualized nature of the Getting Ready intervention. The intervention is fluid and dynamic; as such, individual child and family needs drive the behaviors of ECPs. Unlike a classroom curriculum that is intended to be implemented daily for a set amount of time covering a predetermined series of steps, the Getting Ready triadic/collaborative strategies are implemented in response to child and family needs. The process orientation of the intervention complicates efforts to define "good" fidelity; studies that are designed to take into account family factors as well as implementation practices are needed to define a threshold level of fidelity.

The relationship between fidelity to structure and fidelity to process, as studied in this investigation, is a model for future studies of implementation fidelity. At a general level, overall Getting Ready strategy use (adherence) and effectiveness (quality) across all participants were significantly, though not perfectly, related to one another, shedding light on important relationships among certain elements of structure and process. The findings suggest that each measure contained unique information as an indicator of treatment fidelity; they are not measuring the same construct when considering the entire sample of ECPs in the treatment group. The strong correlation between these two variables for Head Start ECPs indicates they are closely related; however, this was not the case for the Early Head Start ECPs. We believe that overall strategy use by professionals is a global indicator of adherence to the Getting Ready intervention; however, studies are needed to (a) discern the most critical strategies that define the intervention (e.g., item-total correlations), and (b) identify those that contribute most significantly to effectiveness of implementation (e.g., correlations between individual strategies and effectiveness, and between individual strategies and parent-child/parent-professional engagement). Because the current study was not designed to study the effects of fidelity specifically, we do not have sufficient statistical power to run these analyses with the current data set. Future investigations, including those that vary levels of implementation systematically, are needed.

In general, the effectiveness of ECPs in Head Start settings was positively related to the amount of time that professionals and parents spent interacting over the course of the home visit, and to parents' general engagement with the early childhood professional. Effectiveness was marginally positively related to the rate of parent-child interaction for Early Head Start ECPs. Similarly, adherence to intervention strategies was significantly related to the amount of parent-professional interaction during home visits for both Early Head Start and Head Start participants. Traditional child-centered interventions focus on professional-child exchanges, in which an intervention (or curriculum, for example) is delivered directly to a child by the professional, minimizing the role of the parent. In contrast, increased contact and communication between early childhood professionals and parents, as well as increased interaction between parents and children, are goals of the Getting Ready intervention. Indeed, professionals who used more triadic/collaborative strategies during the home visit were more likely to be interacting with parents in meaningful ways. These findings relate to participant responsiveness, an aspect of fidelity that mirrors the goals and desired outcomes of the Getting Ready intervention. In this way, the marginally significant relationship between effectiveness and rate of parent-child interaction suggests that the Getting Ready approach supports parent-child connections. Future research should explore whether the positive associations observed during home visits generalize to parent-child interactions in other naturalistic settings.

Finally, professional characteristics were significantly related to the quality with which early childhood professionals in the treatment group implemented the Getting Ready intervention. Overall, ECPs with higher levels of education and more experience working in early childhood received higher ratings of effectiveness than those with less education and experience. Furthermore, aspects of fidelity were found to correlate in unique ways for Head Start and Early Head Start professionals. The t-test analysis indicated that Head Start ECPs received higher overall effectiveness ratings; however, no significant difference between programs was indicated for overall adherence to intervention strategies. In addition, Early Head Start and Head Start ECPs differed in the manner in which ratings of quality and overall strategy use related to measures of participant responsiveness.

We speculate that the differences in effectiveness between professionals associated with Head Start and Early Head Start programs may be a function of the striking differences in education and experience among these professionals. Specifically, in our sample, 100% of Head Start ECPs held bachelor's degrees or higher, as compared to only 17% of Early Head Start professionals. Further, all Head Start professionals, but only 11% of Early Head Start professionals, held state teaching certifications or endorsements. Head Start professionals, on average, had 38.6 months (approximately 3.25 years) more experience in early childhood, and 16 months (1.3 years) more home visiting experience than those in Early Head Start. However, it is also possible that the differences are due to unique aspects of the Getting Ready intervention (e.g., a relationship-based, family-centered intervention as opposed to traditional child-focused interventions), which potentially draw on a combined set of important background features of professionals to support high levels of implementation, including training and field experience. Finally, there may be an interaction between characteristics of the professional (e.g., levels of education and/or experience) and characteristics of the agency or work setting; however, agency data necessary to explore this question are not currently available.

These findings have implications for the professional development necessary to support early childhood professionals involved in intervention delivery. It can be expected that the implementation of parent engagement interventions will vary across different groups of early childhood participants. In our study, variation in effectiveness was indicated for individuals with diverse professional characteristics (education, experience) and across different program types. Additionally, the findings highlight that length of time in the intervention (the notion that "longer is better") is not necessarily salient. Particularly in highly individualized interventions such as the Getting Ready intervention, characteristics of both professionals and families must be taken into account to fully support implementation efforts.

Limitations

The study has several limitations. First, the sample size of participating ECPs and recorded home visits was relatively small, preventing the use of advanced statistical analyses. Second, the averaging of scores across multiple home visits for each professional precluded us from investigating specific family characteristics that might also contribute to intervention implementation. Third, although some selection criteria were imposed, the early childhood professionals selected the home visits that would be recorded, and they were fully aware that their behaviors were being captured on video. Thus, the data collected were not a random sample of home visits. These factors would be expected to introduce some bias, though not necessarily a bias that differentiates the treatment from the comparison condition, as coders were naive to condition assignment. Additionally, research has indicated that selecting one's best work is most relevant when assessing program effects (Kibel, 1999).

Fourth, the home visiting coding guide that we developed was modified from another tool (HVOF; McBride & Peterson, 1997) and has therefore not been tested through prior research. Psychometric studies are needed to ensure that the measures assess fidelity of Getting Ready strategies reliably and with adequate content and construct validity (O'Donnell, 2008). We found that the codes could be applied quite reliably across raters, and the scale was validated with experts on triadic/collaborative strategies, but we have limited statistical information on validity. For example, the measure assumes that more is probably better when it comes to using triadic/collaborative strategies, but these strategies may actually have thresholds (which might vary when interacting with different types of parents) necessary for their usefulness. As such, it is not yet known empirically "how much is enough" or what constitutes "good" fidelity, and whether critical cut-offs exist for individual or collective groups of strategies. "Affirming parent competence," for instance, may reach a point of saturation and lose its effectiveness if applied too often. Furthermore, certain strategies may be more potent when used in particular combinations, or at certain critical junctures of the home visit. Rate-of-interaction measures take no account of these sequential and interactive effects. Because of the process orientation of the intervention, and the individualized approach ECPs take to unique family situations, a standard for "good" fidelity is difficult to define. In general, systematic research studies of fidelity are needed to understand the degree to which particular strategies must be used to influence change in the behavior of parents of young children. Relatedly, the manner in which different forms of fidelity data can be combined and interpreted is not known; further analysis of various combinations of composite data and their predictive ability is needed. The current study was not designed, and therefore not sufficiently powered, to answer these specific questions.
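As a point of reference for the inter-rater reliability noted above, chance-corrected agreement on interval-by-interval codes is commonly summarized with Cohen's kappa (Cohen, 1960). The sketch below is purely illustrative; the code categories and ratings are hypothetical and do not reflect the study's coding scheme or data.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    # Chance-corrected agreement between two raters coding the same intervals.
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in categories) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical interval-by-interval codes from two independent coders.
coder_1 = ["affirm", "focus", "none", "affirm", "share", "none", "affirm", "focus"]
coder_2 = ["affirm", "focus", "none", "share", "share", "none", "affirm", "none"]
print(round(cohens_kappa(coder_1, coder_2), 2))   # 0.67 with these toy data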

Finally, we were interested in identifying whether differences across program types were related to intervention implementation and fidelity. This led to investigations across Early Head Start and Head Start settings, with some significant differences found. Our findings do not indicate the superiority or effectiveness of one program over another; rather, they raise awareness of variation in implementation of our Getting Ready intervention across programs and suggest factors that might be contributing to implementation in our sample. Likewise, the data do not allow us to conclude that program type (Early Head Start, Head Start) is the primary variable driving the differences. It is possible that the location of services (center-based versus home-based), the rate of contact with the family, or the developmental level of the child (infant/toddler or preschooler) is a more potent influence on intervention implementation than program type alone.

Future Directions

The findings suggest several important directions for future research. We still have much to learn about the structure and process elements that have the greatest treatment utility (i.e., that predict positive child and family outcomes outside of home visits). Indeed, the triadic/collaborative strategies constituting the Getting Ready intervention and taught to early childhood professionals are complex. The effective use of Getting Ready strategies depends in large part upon characteristics of the child and family and on dynamic features of the interactions within home visits. Our gross measure of "total strategy use" does not entirely capture the essence of the intervention process, nor does the rate of various types of interactions among participants. Furthermore, the threshold level required for "good" fidelity, as it relates to outcomes for children and families, is not known; and whereas our ratings of effectiveness and engagement (parent-child; parent-professional) tap qualitative aspects of implementation fidelity, they are global ratings that may be insensitive to certain structural features. An integrated, mixed-methods approach may be necessary to fully capture the relevant structural and process elements of the Getting Ready intervention.

As more fidelity research is undertaken, it will be necessary to identify the most critical strategies that define interventions, the elements that contribute most significantly to effectiveness of delivery, and those that lead to meaningful outcomes. Components analyses are needed that identify specific strategies and processes that contribute empirically to parent and child treatment effects. Studies designed with implementation fidelity as the primary variable of interest are needed to provide sufficient statistical power for such analyses. The current investigation provides a first look at implementation fidelity data from the Getting Ready intervention and informs future studies. This investigation, however, was not intended to link implementation fidelity to child and parent outcomes; that is the focus of future work.

Finally, the next iteration of intervention studies investigating implementation fidelity of the Getting Ready intervention needs to relate data on ECP behavior during home visits to observed parental engagement in settings outside of the home visit in order to assess validity. Additionally, future analyses should include more home visits per ECP and take into account the hierarchical nature of the data, including the non-independence of ratings (multiple ratings per ECP).
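As an illustration of one conventional way to handle such nesting, the sketch below fits a random-intercept model with coded visits nested within ECPs using statsmodels. It is offered only as a hedged example; the data frame, variable names (visit-level quality ratings, ECP identifiers, condition indicator), and values are hypothetical and not drawn from the study.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical visit-level data: several coded home visits per ECP.
visits = pd.DataFrame({
    "ecp_id":    [1, 1, 1, 2, 2, 3, 3, 3, 4, 4, 5, 5],
    "condition": [1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 1, 1],   # 1 = treatment, 0 = comparison
    "quality":   [3.2, 3.8, 3.5, 2.9, 3.1, 2.4, 2.6, 2.2, 2.8, 3.0, 3.6, 3.9],
})

# A random intercept per ECP accounts for the non-independence of
# multiple rated visits within the same professional.
model = smf.mixedlm("quality ~ condition", data=visits, groups=visits["ecp_id"])
result = model.fit()
print(result.summary())

A real analysis would, of course, include many more visits and ECPs, as well as additional covariates (e.g., program type, family characteristics).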

Conclusions

In sum, the current study identified and implemented an objective, multidimensional approach to assessing structural and process elements of fidelity for a parent engagement intervention. The findings demonstrate the benefit of using multiple indicators to assess fidelity and identify elements of the Getting Ready intervention that differentiated the treatment and comparison groups; this approach provides direction for intervention researchers interested in innovative and comprehensive approaches to assessing fidelity. Additionally, the study specified the degree to which quality and adherence of intervention implementation related to parental engagement during home visits, and suggested professional (such as education) and programmatic factors that might relate to the implementation practices of early childhood professionals in Early Head Start and Head Start. As we learn more about the factors that contribute to the implementation practices of early childhood professionals, as well as the degree to which varying levels of implementation relate to parental engagement during interactions with professionals, advances in the development, implementation, and evaluation of collaborative, ecological, and relational interventions to promote school readiness in young children will become increasingly possible.

Acknowledgments

The development of this paper was supported by a grant awarded to Drs. Susan Sheridan and Carolyn Pope Edwards by the Department of Health and Human Services (DHHS) -- National Institute of Child Health and Human Development (NICHD), Administration for Children and Families (ACF), and Office of the Assistant Secretary for Planning and Evaluation (ASPE); and the Department of Education (ED) -- Office of Special Education and Rehabilitative Services (Grant #1R01H00436135). The opinions expressed herein are those of the investigators and do not reflect the views of the funding agencies.

Footnotes

1. All children and families involved in this study were enrolled in Head Start and Early Head Start programming and were thus receiving services regardless of condition assignment; we therefore use the term "comparison" condition so as not to imply a "no-treatment" control group.

References

1. Benjamini Y, Hochberg Y. Controlling the false discovery rate: A practical and powerful approach to multiple testing. Journal of the Royal Statistical Society, Series B. 1995;57:289–300.
2. Benjamini Y, Yekutieli D. The control of the false discovery rate in multiple testing under dependency. The Annals of Statistics. 2001;29:1165–1188.
3. Barnett WS, Jung K, Yarosz DJ, Thomas J, Hornbeck A, Stechuk R, et al. Educational effects of the Tools of the Mind curriculum: A randomized trial. Early Childhood Research Quarterly. 2008;23:299–313.
4. Bierman KL, Domitrovich CE, Nix RL, Gest SD, Welsh JA, Greenberg MT, et al. Promoting academic and social-emotional school readiness: The Head Start REDI program. Child Development. 2008;79:1802–1817. doi: 10.1111/j.1467-8624.2008.01227.x.
5. Bronfenbrenner U. Toward an experimental ecology of human development. American Psychologist. 1977;32:513–531.
6. Brophy-Herb H, Schiffman RF, Onaga E, Van Egeren LA, Horodynski M, Fitzgerald HE, et al. The "Building Early Emotion Skills (BEES)" reports on implementation, fidelity of implementation, and curricular effectiveness. In: DeCourcey W (Chair), Head Start university partnerships: Curriculum development. Symposium conducted at the meeting of the Head Start Ninth National Research Conference, Washington, DC, June 2008.
7. Cline KD, Knoche LL, Edwards CP, Sheridan SM, Martinez MM. Getting Ready: One-year effects of a parent engagement intervention on parenting behaviors in low-income families. Biennial meeting of the Society for Research in Child Development, Denver, CO, April 2009.
8. Cohen J. A coefficient of agreement for nominal scales. Educational and Psychological Measurement. 1960;20:37–46.
9. Cohen J. Statistical power analysis for the behavioral sciences. 2nd ed. Erlbaum; Hillsdale, NJ: 1988.
10. Dane AV, Schneider BH. Program integrity in primary and secondary prevention: Are implementation effects out of control? Clinical Psychology Review. 1998;18:23–45. doi: 10.1016/s0272-7358(97)00043-3.
11. Dobson D, Cook TJ. Avoiding type III error in program evaluation: Results from a field experiment. Evaluation and Program Planning. 1980;3:269–276.
12. Dunst C, Trivette C, Cross A. Mediating influences of social support: Personal, family, and child outcomes. American Journal of Mental Deficiency. 1986;90:403–417.
13. Dunst C. Family-centered practices: Birth through high school. The Journal of Special Education. 2002;36:139–147.
14. Durlak JA. Why program implementation is important. Journal of Prevention & Intervention in the Community. 1998;17:5–18.
15. Dusenbury L, Brannigan R, Falco M, Hansen WB. A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Education Research. 2003;18:237–256. doi: 10.1093/her/18.2.237.
16. Flay BR, Biglan A, Boruch RF, Castro FG, Gottfredson D, Kellam S, et al. Standards of evidence: Criteria for efficacy, effectiveness and dissemination. Prevention Science. 2005;6:151–175. doi: 10.1007/s11121-005-5553-y.
17. Girolametto L, Verbey M, Tannock R. Improving joint engagement in parent-child interaction: An intervention study. Journal of Early Intervention. 1994;18:155–167.
18. Greenberg MT, Domitrovich CE, Graczyk PA, Zins JE. The study of implementation in school-based preventive interventions: Theory, research, and practice. U.S. Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Center for Mental Health Services; Washington, DC: 2005.
19. Kibel BM. Success stories as hard data: An introduction to results mapping. Kluwer Academic; New York: 1999.
20. Klute MM, Moreno AJ, Sciarrino CA, Anderson S. Fidelity and implementation of 'Learning through Relating', a pre-literacy and social communication curriculum for infants and toddlers. In: DeCourcey W (Chair), Head Start university partnerships: Curriculum development. Symposium conducted at the meeting of the Head Start Ninth National Research Conference, Washington, DC, June 2008.
21. Mahoney G, MacDonald J. Autism and developmental delays in young children: The Responsive Teaching curriculum for parents and professionals. Pro-Ed; Austin, TX: 2007.
22. Mahoney G, O'Sullivan P. Early intervention practices with families of children with handicaps. Mental Retardation. 1990;28:169–176.
23. Mashburn A, Pianta R. Social relationships and school readiness. Early Education and Development. 2006;17:151–176.
24. McBride SL, Peterson CA. Home-based early intervention with families of children with disabilities: Who is doing what? Topics in Early Childhood Special Education. 1997;17:209–233.
25. McCollum JA, Gooler F, Appl DJ, Yates TJ. PIWI: Enhancing parent–child interaction as a foundation for early intervention. Infants and Young Children. 2001;14:34–45.
26. McCollum JA, Yates TJ. Dyad as focus, triad as means: A family-centered approach to supporting parent-child interactions. Infants and Young Children. 1994;6:54–63.
27. Mowbray CT, Holter MC, Teague GB, Bybee D. Fidelity criteria: Development, measurement, and validation. American Journal of Evaluation. 2003;24:315–340.
28. O'Donnell CL. Defining, conceptualizing, and measuring fidelity of implementation and its relationship to outcomes in K-12 curriculum intervention research. Review of Educational Research. 2008;78:33–84.
29. Preschool Curriculum Evaluation Research Consortium. Effects of preschool curriculum programs on school readiness (NCER 2008–2009). U.S. Government Printing Office; Washington, DC: 2008.
30. Raver CC, Jones SM, Li-Grining CP, Metzger M, Champion KM, Sardin L. Improving preschool classroom processes: Preliminary findings from a randomized trial implemented in Head Start settings. Early Childhood Research Quarterly. 2008;23:10–26. doi: 10.1016/j.ecresq.2007.09.001.
31. Rossi PH, Freeman HE. Evaluation: A systematic approach. Sage; Newbury Park, CA: 1985.
32. Sheridan SM, Edwards CP, Knoche LL. Getting Ready Demographic Questionnaire. University of Nebraska-Lincoln; 2003. Unpublished document.
33. Sheridan SM, Knoche LL, Edwards CP, Bovaird JA, Kupzyk KA. Parent engagement and school readiness: Effects of the Getting Ready intervention on preschool children's social-emotional competencies. Early Education and Development. In press. doi: 10.1080/10409280902783517.
34. Sheridan SM, Knoche LL, Edwards CP, Bovaird JA, Kupzyk KA. Getting Ready: Intervention effects on the social competence of low-income infants and toddlers. Biennial meeting of the Society for Research in Child Development, Denver, CO, April 2009.
35. Sheridan SM, Kratochwill TR. Conjoint behavioral consultation: Promoting family-school connections and interventions. Springer; New York: 2008.
36. Sheridan SM, Marvin CA, Knoche LL, Edwards CP. Getting ready: Promoting school readiness through a relationship-based partnership model. Early Childhood Services. 2008;2(3):21–45.
37. Suen HK, Ary D. Analyzing quantitative behavioral observation data. Lawrence Erlbaum; Hillsdale, NJ: 1989.
38. Trivette C, Swanson J. Windows of opportunity: Treatment fidelity results from a capacity building model with Early Head Start home visitors. In: DeCourcey W (Chair), Head Start university partnerships: Curriculum development. Symposium conducted at the meeting of the Head Start Ninth National Research Conference, Washington, DC, June 2008.
39. Vernon-Feagans L, Blair C. Measurement of school readiness. Early Education and Development. 2006;17:1–5.
40. Zvoch K, Letourneau LE, Parker RP. A multilevel multisite outcomes-by-implementation evaluation of an early childhood literacy model. American Journal of Evaluation. 2007;28:132–150.
