Abstract
This paper reports results from a student survey fielded using an experimental design with 14 Kentucky school districts. Seven of the fourteen districts were randomly assigned to implement the survey with active consent procedures; the other seven districts implemented the survey with passive consent procedures. We utilized our experimental design to investigate the impact of consent procedures on (a) participation rates, (b) demographic characteristics of the survey samples, and (c) estimates of alcohol, tobacco, and other drug (ATOD) use. We found that the use of active consent procedures resulted in reduced response rates, under-representation of male students and older students, and lower lifetime and past 30 day prevalence rates for most drugs and for most antisocial behaviors. Methodological implications of these findings are discussed, along with directions for further research.
Keywords: schools, student survey, active consent, data quality
INTRODUCTION
Surveys of student populations provide important data to policymakers and researchers on underage alcohol and drug usage, risky student behavior, and school safety. Because student respondents usually are minors, informed consent from parents is a required component of these surveys. Although parental consent can be obtained using either active (so-called “written” consent) or passive consent procedures, active consent procedures increasingly are required by school districts and Institutional Review Boards (IRBs) even when the surveys are anonymous.
Our research compares the impact of active versus passive consent procedures using data from a school survey conducted in 14 Kentucky school districts during the fall semester of 2007. Seven of the fourteen districts were randomly assigned to implement the survey with active consent procedures; the other seven districts implemented the survey with passive consent procedures. We focus on these two groups of school districts and utilize our experimental design to investigate the impact of consent procedures on (a) participation rates, (b) demographic characteristics of the responding samples, and (c) estimates of ATOD use. We discuss the results of our analyses and their implications for planning student surveys and for evaluating intervention programs. We also discuss important areas for future research on the impact of active consent procedures on data from student surveys.
BACKGROUND AND PREVIOUS LITERATURE
Most respondents in school-based student surveys are minors, and Title 45 Code of Federal Regulations Part 46 (45CFR46; Department of Health and Human Services, 1991) requires researchers to obtain informed consent from their parents. For many years, most research using school-based student surveys has relied on passive consent procedures to fulfill informed consent requirements. Under passive consent procedures, researchers send a consent form to parents and require parents to return the form only if they do not want their child to participate (i.e., they must actively opt out). Survey administrators assume that youth have permission to participate if they do not receive a returned “nonconsent” form from parents. Historically, many IRBs have held that passive consent procedures fulfill ethical and statutory requirements for informed consent when student surveys are anonymous (Severson & Biglan, 1989). The high participation rates and low costs that often accompany passive procedures have made passive consent the preferred strategy for many research efforts (Johnson et al., 1999), and it has been suggested that such procedures satisfy both the letter and the spirit of federal informed consent regulations (Severson & Biglan, 1989).
However, changing research environments have meant that active consent procedures increasingly are required by the U.S. Department of Education, IRBs, and local school boards for surveys, even when the surveys are anonymous (Mammel & Kaplan, 1995). Major reasons for this shift toward active consent include: (1) requirements from the U.S. Department of Education for active parental consent for all research the agency funds, (2) changing interpretations of federal consent guidelines by many IRBs, (3) increased scrutiny of research involving children in school settings by the U.S. Congress, and (4) increased legal liabilities for school districts and researchers conducting research on students and other minors.
Active consent procedures require parents to sign and return a form indicating that their child has permission to participate in a research effort (i.e., they must actively opt in). This strategy assumes that consent is granted only when parents explicitly give permission for their child to participate. Under active consent, an unreturned consent form is equivalent to a refusal of consent, meaning that nonparticipation of students (i.e., survey nonresponse) can occur from explicit refusals of consent or from parents who fail to return the consent form because (a) they did not receive the form, (b) they simply neglected to return it, or (c) the form was lost in transit back to the school (Dillman, Eltinge, Groves, & Little, 2002; Thompson, 1984).
Most research on active consent has been conducted as part of evaluations of prevention programs that use researcher-administered, nonexperimental pre- and post-test survey questionnaires. The research conducted to date suggests that the use of active consent can have three potential effects on survey data:
lower participation rates,
biased sample demographics, and
different estimates of alcohol, tobacco, and other drug use.
Widely varying approaches, measures, and non-experimental or other limited research designs mean the findings reviewed below are best interpreted as suggestive of the types of effects that may occur when active consent procedures are used. As we report, the experimental design used in our study provides more definitive evidence on the impact of active consent procedures on student survey data.
Survey Response Rates and Survey Nonresponse Bias
A major concern about active consent is that it can lead to low survey response rates. Many research studies have investigated how low survey response rates may lead to nonresponse bias in survey data (cf. Groves, 2006). A low response rate indicates that a small portion of the eligible sample completed a questionnaire, and this can result in a final sample that does not represent the target population from which it was drawn. Essentially, the concern is that low response rates from active consent procedures can increase the probability of nonresponse bias in survey data. However, nonresponse bias is “a function of both the nonresponse rate and the difference between respondents and nonrespondents on the statistic of interest” (Dillman, Eltinge, Groves, & Little, 2002, p. 3; Keeter et al., 2000, p. 126). Although one aim of efforts to increase response rates is to minimize nonresponse bias, low response rates do not necessarily mean that survey data will be characterized by a high level of nonresponse bias, just as high response rates do not guarantee low levels of bias (cf. Groves, 2006). Groves, Presser, & Dipko (2004) explain this relationship between response rates and nonresponse error using leverage-salience theory, which suggests that there is a wide variety of influences on respondents’ decisions to participate in a survey (such as interest in the topic of the survey), and that only those influences that are related to the key statistics are likely to add nonignorable nonresponse bias to the survey results.
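To make this relationship concrete, the standard deterministic expression for the nonresponse bias of a respondent mean (a textbook identity, not a formula given in the works cited above) can be written as:

```latex
\operatorname{bias}(\bar{y}_r) \;=\; \bar{y}_r - \bar{y}_n \;=\; \frac{n_{nr}}{n}\left(\bar{y}_r - \bar{y}_{nr}\right)
```

where $\bar{y}_n$ is the full-sample mean, $\bar{y}_r$ and $\bar{y}_{nr}$ are the respondent and nonrespondent means, and $n_{nr}/n$ is the nonresponse rate. As a purely hypothetical illustration, a 50% nonresponse rate combined with a 10 percentage point gap between respondents and nonrespondents in 30 day alcohol use biases the respondent-based prevalence estimate by 5 percentage points, whereas the same gap with a 5% nonresponse rate biases the estimate by only 0.5 points.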
Consent Procedures and Student Surveys
Many studies of student surveys have shown that passive consent procedures in school settings often produce markedly higher participation rates than active consent procedures (Lueptow et al., 1977; Severson & Ary, 1983; Kearney et al., 1983; Ellickson & Hawes, 1989; Esbensen et al., 1996; Fendrich & Johnson, 2001; White, Hill, & Effendi, 2004). Studies using active consent have reported student participation rates that have ranged from 40 to 60 percent on average (Lueptow et al., 1977; Severson & Ary, 1983; Kearney et al., 1983; Ellickson & Hawes, 1989; Esbensen et al., 1996), with some studies reporting markedly lower survey response rates (MacGregor & McNamara, 1995) and others markedly higher survey response rates (Ellickson & Hawes, 1989; Esbensen et al., 1999; Pokorny, 2001; Eaton, 2004). In contrast, studies using passive consent procedures routinely produce participation rates between 80 and 100 percent (Kearney et al., 1983; Ellickson & Hawes, 1989; Esbensen et al., 1996; Pokorny, 2001; Eaton, 2004; Langhinrichsen-Rohling et al., 2006).
Active consent procedures can result in lower student participation rates because the process is more complex and requires more parental involvement. Parents may never receive consent forms that are sent home with students. If the active consent form is delivered, a parent must first read it, decide to grant or deny consent, and then see that the consent form is returned—no small task, especially when the student is the designated courier. Because researchers can do little (aside from preparing visually appealing and persuasive consent materials) to influence parents’ decision to grant or deny consent, most researchers focus their efforts on increasing the delivery and return rates for parental consent forms. The literature describes a variety of labor-intensive and costly procedures implemented by researchers to boost the number of consent forms that parents return (Kearney et al., 1983; Thompson, 1984; Ellickson & Hawes, 1989; Esbensen et al., 1996; Johnson et al., 2000; Ji, Pokorny, & Jason, 2004). Nevertheless (and consistent with leverage-salience theory), these efforts have met with limited success, in part because researchers do not yet have a full understanding of the most important reasons why parents do or do not return consent forms. Research has also shown that the use of active consent procedures can affect the demographic characteristics of student samples.
Several studies (Kearney et al., 1983; Dent et al., 1993; Esbensen et al., 1999; Unger et al., 2004) found that the use of active consent resulted in an under-representation of African-American, Hispanic, and Asian minority students. Dent et al. (1993), Schuster et al. (1998), Pokorny (2001), and Unger (2004) found that active consent procedures resulted in biased gender distributions, with male students being proportionally underrepresented in their samples. Kearney et al. (1983) and Esbensen et al. (1999) found that parents of younger students returned consent forms and provided consent more often than did parents of older students. Kearney et al. (1983) also found that students whose parents granted consent had significantly higher standardized achievement test scores than students whose parents did not reply. Schuster et al. (1998) and Unger (2004) found that active consent procedures overrepresented students who had better grades and who expected to obtain a graduate or professional school education. Other research similarly suggests that using active consent procedures in a student survey can affect the sample distribution of key household characteristics (e.g., Severson & Ary, 1984; Dent et al., 1993; Esbensen et al., 1999; Baker, Yardley, & McCaul, 2001; Henry, Smith, & Hopkins, 2002).
A final and most important question for researchers is whether at-risk students (i.e., those more likely to use illegal substances) are more likely to be excluded from student surveys when active consent procedures are used than when passive consent procedures are used. Severson and Ary (1984) found that students whose parents provided consent differed significantly in a number of at-risk behaviors from those whose parents did not provide consent. They found that students whose parents did not provide consent were significantly more likely to smoke tobacco, smoke marijuana, and drink alcohol, meaning that the required provision of written parental consent biased findings toward underestimates of use. Dent et al. (1993) also found that active consent response/nonresponse was significantly associated with tobacco use. Students whose parents did not return a consent form reported higher percentages of tobacco use, higher rates of smoking, and less willingness to consider quitting smoking or stopping use of other forms of tobacco than those whose parents returned a consent form. They also found that students whose parents did not return a consent form tended to be involved in more “drug culture” activities such as rock concerts and were more likely to be latch-key kids. Finally, these authors also found that sampled students who did not participate in the survey were more likely to report having negative perceptions of their classroom's climate, disliking the subject matter of the class, and perceiving that their teacher did not take a personal interest in them as students (see also Esbensen et al., 1999, and Pokorny, 2001).
Finally, although most of the research on the influence of consent procedures has been conducted as part of larger evaluation studies of prevention or public health programs, one recent study (White, Hill, & Effendi, 2004) used an experimental design to assess the influence of active consent procedures on data from a student survey. White and colleagues assigned 81 schools in Victoria, Australia, to either an active or passive consent condition and administered a student survey instrument to 80 students in each school. Although this study found that the use of active consent reduced participation rates and had a modest impact on drug use estimates, the authors noted several problems that constrained the internal and external validity of their conclusions. Some schools opted out of their assigned condition, and the authors noted that they were unable to reliably document how many schools failed to follow the assigned consent and implementation protocols. Additionally, the authors noted that their design did not have sufficient statistical power to detect significant differences between the active and passive consent conditions.
In sum, the research suggests that compared to passive consent, active consent procedures often result in decreased student participation rates and can affect the distribution of key demographic, attitudinal, and ATOD variables. Baker, Yardley, and McCaul (2001) argue pointedly that, “by requiring adolescents’ parents to provide active consent, researchers run the risk of losing the very subjects that are the targets of their research or interventions” (p. 608). Because prevention practitioners increasingly rely on student survey data to inform strategies aimed at reducing ATOD use, the potential for nonresponse bias resulting from active consent has increased as a result of recent environmental and statutory changes (e.g., 2001's No Child Left Behind legislation) that require that active consent procedures be used when studying adolescent populations. As noted above, the present study used an experimental design in which seven yoked pairs of Kentucky public school districts were randomly assigned to either a passive consent or active consent condition to better understand the impact of active consent procedures on student survey data. Compared to our seven districts that used passive parental consent procedures, we hypothesized that the use of active parental consent procedures would:
H1: result in lower participation rates and higher refusal rates.
H2: affect sample distributions on key demographic variables such as race, gender, and parental education levels.
H3: affect distributions of variables measuring alcohol and other drug use and key risk and protective factors that have been found to be associated with drug use by students (minors), resulting in lower reported rates of alcohol and other drug use by minors, lower levels of risk factors, and higher levels of protective factors.
We hypothesize that these effects will occur because the use of active consent procedures causes a nonrandom subset of a student population (in particular, male and minority students) to be underrepresented in survey samples that use active consent (cf. Severson & Ary, 1984; Dent et al., 1993; Esbensen et al., 1999; Pokorny et al., 2001; Baker, Yardley, & McCaul, 2001). The hypothesized effects on substantive variables measuring drug use and risk and protective factors are likely to occur because the sampled students who are missing from the data sets when active consent is used are more likely than the responding students who are present in the data sets to (a) use alcohol, tobacco, and other drugs, (b) be at risk for drug use, and (c) have low levels of protective factors that can help prevent drug use by minors (cf. Baker, Yardley, & McCaul, 2001).
METHODS
Our data were collected as part of the 2007 Kentucky Youth Outcomes Survey (KYOS), a student survey that was administered in 14 Kentucky school districts during the fall semester of the 2007−08 school year. The survey took advantage of an infrastructure created for an established biennial survey (the “KIP Survey”) that has been administered in 134 school districts throughout Kentucky since 1999; however, the 2007 KYOS was conducted during an “off-year” when the biennial KIP survey was not being administered.
Survey questionnaire
The survey instrument was modeled after the Communities That Care Survey (Hawkins, Catalano, & Miller 1992) and was designed to measure risk and protective factors, behaviors related to alcohol, tobacco, and other drugs, and school safety issues (Arthur, Hawkins, & Catalano 1998). In keeping with established conventions for the biennial KIP survey, the 2007 Kentucky Youth Outcomes Survey was designed to be an anonymous, self-administered pencil and paper instrument, and was administered in the 6th, 8th, 10th and 12th grades in participating school districts. The questionnaire also included three questions measuring the lifetime, past year and past 30 day prevalence of a fictional drug called “Zycopan.” These questions were used as validity checks, and as described below, aided in the data cleaning process.
School district matching, recruitment and random assignment
The current analysis reports data from 14 Kentucky school districts that participated in the 2007 KYOS survey. The recruitment and randomization of school districts to active or passive consent conditions was conducted using a three-step process. In Step 1, we solicited interest in the proposed project from the 134 school districts that participated in the biennial KIP survey between 1999 and 2006. Districts were informed that they would receive a $500 cash incentive if they were one of the districts selected to participate in the study and if they administered the student survey with strict fidelity to the protocols specified by the research team. In Step 2, school districts were paired (resulting in a yoked design) based on their similarity on the following characteristics: (a) total student population; (b) proportion of ethnic minority students; (c) percentage of students participating in a free/reduced price lunch program; and (d) percentage of urban/rural population. Data for each of these criteria came from school district information and/or data provided by the Kentucky Department of Education. These matches were based on a Mahalanobis distance metric calculated from these four measures. The matching process was used to reduce the possibility that differences between conditions could be a function of systematic differences in community characteristics across the active and passive conditions. In addition, to be eligible to participate in the 2007 KYOS, districts could not have an existing policy requiring the use of active consent for student research. In Step 3, we then selected the seven best-matching pairs of school districts (14 districts) from the larger matched pool and randomly assigned one district in each pair to a passive consent condition and the other district to an active consent condition. A series of t-tests was conducted on variables measuring characteristics of districts (such as enrollment, graduation rate, dropout rate, percentage of non-white students, and percentage of students receiving free and reduced price lunch) and on variables measuring characteristics of surrounding communities (such as median income and percentage of high school graduates). These results are discussed below in our section on post hoc group comparability.
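As a rough illustration of how this kind of Mahalanobis-distance pairing and random assignment could be carried out, consider the minimal sketch below. It assumes a hypothetical district-level file and illustrative column names; it is not the research team's actual procedure or code.

```python
import numpy as np
import pandas as pd
from scipy.spatial.distance import cdist

# Hypothetical district-level file; column names are illustrative only.
districts = pd.read_csv("districts.csv")  # one row per eligible district
X = districts[["enrollment", "pct_minority", "pct_free_lunch", "pct_rural"]].values

# Mahalanobis distance between every pair of districts on the four matching variables.
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
dist = cdist(X, X, metric="mahalanobis", VI=cov_inv)
np.fill_diagonal(dist, np.inf)  # a district cannot be paired with itself

# Greedy pairing: repeatedly take the closest remaining (unused) pair.
pairs, used = [], set()
for i, j in sorted(((i, j) for i in range(len(X)) for j in range(i + 1, len(X))),
                   key=lambda p: dist[p]):
    if i not in used and j not in used:
        pairs.append((districts.loc[i, "district_id"], districts.loc[j, "district_id"]))
        used.update((i, j))

# Randomly assign one member of each of the seven best-matching pairs to active consent.
rng = np.random.default_rng()
assignments = [(a, b) if rng.random() < 0.5 else (b, a) for a, b in pairs[:7]]
for active, passive in assignments:
    print(f"active: {active}  passive: {passive}")
```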
In order to ensure that survey administration protocols were followed with fidelity, after the seven best-matching pairs were determined but before random assignment was made, letters of commitment to participate were gathered from the 14 participating school districts. Once both letters from a pair of school districts were received, random assignment into active and passive conditions was made. Districts then were notified of their assignment, and formal communications detailing the survey protocols and responsibilities involved in being part of the project were sent to each participating district. The recruitment and matching process was not complete until all districts confirmed that they understood the study protocols and had committed in writing to follow the consent procedures for their condition with fidelity.
Despite these precautions and the methodical approach to implementing the experimental design, after Step 3 activities had been completed (including random assignment to conditions and distribution of training materials to all districts), three of the 14 school districts notified the project team that they were unable to fulfill their commitment to participate in the project. Two of these districts had been randomly assigned to the active condition and one had been randomly assigned to the passive condition. The three districts indicated to the researchers that their decisions to withdraw from the study were unrelated to the study itself or to the condition to which they had been randomly assigned, and instead stemmed from internal concerns that the districts were not making satisfactory instructional progress to prepare students for state-mandated proficiency tests. Three additional school districts were recruited to replace the districts that dropped out and were randomly assigned to conditions in a manner that ensured that two of the three replacement districts were assigned to the active consent condition and one to the passive consent condition. To assess the similarity of the new districts to those that had been previously recruited, a series of t-tests (described above) was conducted on variables measuring a variety of district and community characteristics. The results of these t-tests demonstrated that even with the addition of the three replacement districts, there were no significant differences between districts in the active and passive consent conditions.
Survey administration protocols
Because the validity of the results from this study depended on the data being collected in a consistent and professional manner across the 14 participating districts, the research team developed rigorous and extensive survey administration procedures. We developed data collection, training, and technical assistance materials and protocols to assist school administrators in organizing and administering both the consent process and the survey. These survey administration protocols and procedures were assembled in a Student Survey Training Manual that was tailored to each district's consent condition. The forms, checklists, and management tools included in the Training Manual were:
Survey management forms to assist schools in organizing the survey process and assessing the number of students, classrooms, and survey administrators needed;
Sample letters to parents from the superintendent or principals, a survey Fact Sheet, consent forms, and other materials needed to develop the parental consent packet. Consent materials were tailored to the consent condition to which the school district had been randomly assigned, and consent packets were sent home with students at least three weeks before the survey was scheduled to be administered. The strategy of sending consent forms home with students was required by the funding agency, the National Institute on Drug Abuse (NIDA), in order to ensure that the current project utilized the consent form delivery mechanism most commonly used by school districts and other survey administrators.
Instructions to ensure that students received at least one reminder to return consent forms to the survey administrator, whether through an all-call system, emails to parents, additional forms sent home, and/or classroom reminders by teachers.
Survey administration procedures and management forms with step-by-step instructions on how to (1) prepare for the survey, (2) create an orderly and confidential survey environment, and (3) ship completed questionnaires. Instructions included specifications on the optimal group size for administering the survey and forms for keeping track of parents and students who decline participation so that these students would be dismissed to an alternate location or allowed to study or sit quietly during survey administration.
Confidentiality and professional ethics principles and guidelines that were designed to ensure that appropriate steps were taken to protect students’ rights to privacy. All survey administrators were asked to sign an Agreement of Confidentiality and a Professional Ethics form included in the manual.
Instructions on how to arrange students in the classroom to encourage privacy and discourage disclosure of responses.
A survey administrator script, which was required to be read to students prior to administration. The script explained the purpose of the survey, reiterated the voluntary nature of participation, stressed that students’ answers were strictly confidential, and provided instructions to students for marking and returning their survey forms. The Student Survey form came packaged with a plain envelope that students were instructed to use to cover their answers while completing the questionnaire. Students were instructed to place their completed questionnaire in the envelope, seal it, and then to drop it in a collection box provided by the survey administrator.
Instructions for read-aloud administration, so that students with reading difficulties could complete the questionnaire with assistance.
A Consent Procedures Checklist form that was completed by all survey administrators to document all of the protocols that were implemented by each classroom coordinator in distributing and collecting consent forms. These forms were used as a quality assurance mechanism to promote high levels of compliance with the consent and survey administration protocols.
Training, technical assistance, and QA monitoring
Prior to districts beginning the consent or survey administration process, the Principal Investigator conducted a training call with the survey administrator in each of the 14 school districts. These calls, which focused on reviewing the Training Manual, survey administration procedures, and the parental consent protocols, averaged one to two hours in length. The research team found that because the 2007 Kentucky Youth Outcomes Survey utilized infrastructure created for the ongoing biennial KIP survey, additional training was neither needed nor requested by districts.
Technical assistance was provided to all participating school districts via electronic mail and a toll-free telephone number. This technical assistance encouraged survey administrators to implement survey protocols as specified and to complete all survey administration forms and consent checklists consistently and accurately. Because the success of the proposed research hinged on school districts following the survey administration protocols with fidelity, the research team used the consent procedures checklist, electronic mail follow-up, and weekly telephone contact with each school district to monitor district fidelity to the administration protocols. In addition, to promote high quality survey implementation, survey administrators were able to submit questions regarding support, procedures, forms, or other issues to PIRE staff and receive an immediate response from a member of the research team. These protocols and quality assurance procedures resulted in good levels of fidelity to the survey administration process by the 14 participating districts.
Incentives
Because previous literature identified unreturned parental consent forms as the largest contributor to survey nonresponse in student surveys that utilize active consent, student-level incentives were used to motivate return of consent forms. The project provided funds to each of the active consent districts for incentives. The actual incentive schemes were developed in consultation with district staff to fit local district needs and culture. The most common incentive approach used in active consent districts was to enter each student who returned a consent form (signed or unsigned by their parents) into a drawing for one or more gift cards to local merchants. Some of the active consent districts also provided coupons for free soft drinks or other refreshments at school activities if a consent form was returned. In keeping with findings from previous literature (Johnson et al., 1999), school district survey administrators reported that these student-level incentives generated excitement among students and helped motivate the return of consent forms. Similar monetary incentives also were offered to districts that implemented the survey with passive consent procedures in order to provide “thank you” gifts to students.
All of the passive consent school districts declined the student-level incentives because they were not typically provided as part of the biennial KIP survey (and thus, the district was concerned that providing incentives for the 2007 KYOS could create an “entitlement” mentality that could impact future administrations of the ongoing biennial KIP survey). As such, the actual experimental treatment that was experienced by the various districts differed by more than simply whether they were assigned to the active or passive consent condition. That is, all districts that were assigned to the active condition used incentives to motivate student participation, whereas none of those assigned to the passive condition used incentives to motivate participation. Whether this difference affected the results of this study in nonignorable ways is unknown, but we do not consider it a major threat to the validity of our interpretation of the results.
ANALYSIS STRATEGY
Data Cleaning
Initial exploration of the data made it clear that some participants were not necessarily attentive when completing the survey (e.g., extreme response sets). As mentioned previously, the survey questionnaire included three items asking about lifetime, past year, and past 30 day use of a fictitious substance called Zycopan as a validity check. We used a decision rule to eliminate from all analyses the data from students who indicated use of 80% or more of the illicit substances asked about in the past 30 days and use of Zycopan at any of the three time intervals. This rule eliminated 47 participants (0.6%) who appeared to have provided poor quality data by “satisficing” or “flat-lining” on their responses to the survey questions.
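A minimal sketch of this decision rule is shown below, assuming a hypothetical student-level file and illustrative column names (the actual KYOS variable names are not reproduced here).

```python
import pandas as pd

# Hypothetical student-level data; file and column names are illustrative.
df = pd.read_csv("kyos_2007_raw.csv")

# Indicators (1 = yes) for past-30-day use of each real illicit substance asked about.
illicit_30day = ["mj_30day", "inhalant_30day", "cocaine_30day",
                 "narcotic_30day", "meth_30day", "otc_high_30day"]
# Indicators for any reported use of the fictitious drug Zycopan.
zycopan_items = ["zycopan_lifetime", "zycopan_year", "zycopan_30day"]

# Decision rule from the text: drop respondents who report past-30-day use of
# 80% or more of the illicit substances AND any use of Zycopan.
high_use = df[illicit_30day].mean(axis=1) >= 0.80
claims_zycopan = df[zycopan_items].any(axis=1)
suspect = high_use & claims_zycopan

print(f"Removing {suspect.sum()} of {len(df)} respondents "
      f"({suspect.mean():.1%}) as likely invalid.")
df_clean = df[~suspect].copy()
```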
Differential Attrition
As noted above, in order to ensure that our groups of seven active and seven passive consent districts were as equivalent as possible, the larger group of 14 districts was first matched into seven yoked pairs, and then random assignment to conditions was conducted. However, the original design was compromised by three districts deciding to drop out of the study after randomization to conditions had taken place. Two of these districts had been randomly assigned to an active consent condition, and one had been randomly assigned to a passive consent condition. Three replacement districts were recruited into the study and were randomly assigned to an active or passive consent condition. To assure that these replacements did not compromise the random assignment process, we ran all analyses described below with both the full set of 14 districts and with the original 11 school districts (i.e., only those districts that were not affected by the drop-out/replacement process). The pattern of results was nearly identical, differing only in nominal statistical significance due to the drop from 12 to 9 degrees of freedom. For this reason, the remainder of the manuscript focuses on the analyses using data from all 14 communities.
Post Hoc Group Comparability
Analyses also were conducted to ensure that the active and passive consent districts were similar, as random assignment was at the district level and not at the individual or classroom level. Data for these comparisons were available only for the counties or school districts in which participating schools resided. Comparisons were made between the two groups using independent-groups t-tests. Cohen's d was calculated for all comparisons; these effect sizes are conventionally interpreted as small (d = .20), medium (d = .50), or large (d = .80; Cohen, 1988). As can be seen in Table 1, none of the differences were significant or approached significance, and effect sizes were generally small.
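The comparability checks summarized in Table 1 amount to a series of independent-groups t-tests with accompanying Cohen's d values. A minimal sketch follows, using illustrative values rather than the study's actual district data.

```python
import numpy as np
from scipy import stats

def cohens_d(a, b):
    """Cohen's d computed with a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
                        / (na + nb - 2))
    return (np.mean(a) - np.mean(b)) / pooled_sd

# Illustrative attendance rates for seven districts per condition (not the study's data).
passive = np.array([91.2, 95.3, 93.8, 94.6, 93.1, 95.0, 94.8])
active = np.array([92.5, 96.1, 93.0, 95.9, 92.2, 94.7, 94.2])

t, p = stats.ttest_ind(passive, active)  # independent-groups t-test, 12 df
print(f"t({len(passive) + len(active) - 2}) = {t:.2f}, p = {p:.3f}, "
      f"d = {cohens_d(passive, active):.2f}")
```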
Table 1. District and community characteristics by consent condition.

| Characteristic | Passive Mean (SD) | Active Mean (SD) | t | d |
|---|---|---|---|---|
| Enrollment (in thousands) | 3.75 (31.64) | 2.19 (.97) | 1.25 | .76 |
| Attendance | 93.97 (.85) | 94.09 (1.45) | −.18 | −.10 |
| Dropout Rate | 2.01 (.98) | 2.17 (.80) | −.33 | −.18 |
| Graduation Rate | 84.29 (5.83) | 84.64 (6.59) | −.11 | −.06 |
| % of Students Proficient in Reading | 56.71 (6.10) | 60.86 (7.90) | −1.10 | −.59 |
| % of Students Proficient in Math | 33.14 (4.38) | 37.14 (8.13) | −1.15 | −.64 |
| % of College-Bound Students | 45.07 (11.74) | 51.33 (6.21) | −1.25 | −.70 |
| Teacher Experience Level | 11.61 (.86) | 11.80 (1.70) | −.26 | −.14 |
| Per Pupil Expenditures (in thousands) | 10.12 (.70) | 9.64 (.84) | 1.15 | .62 |
| % Eligible for Free/Reduced Lunch | 63.77 (3.94) | 59.94 (15.96) | .62 | .38 |
| % Rural | 83.92 (17.62) | 78.98 (23.83) | .44 | .24 |
| % of County Residents with HS Diploma or GED | 60.21 (4.85) | 64.51 (10.65) | −.97 | −.56 |
| % of County Residents with Bachelor's Degree or Higher | 10.49 (1.98) | 10.24 (2.72) | .19 | .10 |
| Median Income in County (in thousands) | 28.30 (3.55) | 31.94 (10.39) | −.88 | −.52 |
| Poverty Rate for County | 26.74 (4.23) | 26.00 (10.15) | .18 | .10 |
| Unemployment Rate for County | 6.41 (.86) | 7.33 (2.37) | −.96 | −.57 |
| Avg. # of Family Members Reported as in Household | 2.45 (.09) | 2.51 (.07) | −1.50 | −.81 |
| Avg. # of Immediate Family Members in Family | 2.92 (.05) | 2.96 (.06) | −1.44 | −.77 |
| Median Sale Price for Home (in thousands) | 54.12 (17.88) | 60.50 (26.31) | −.53 | −.29 |
| Number of Homes Sold in County | 85.00 (64.14) | 68.00 (56.44) | .53 | .28 |
| % White Students | 94.86 (7.64) | 92.73 (11.74) | .40 | .22 |
| % Black Students | 2.97 (5.58) | 4.60 (9.02) | −.41 | −.22 |
| % Hispanic Students | .78 (.77) | 1.92 (2.49) | −1.15 | −.70 |
| % Male Students | 52.19 (.44) | 51.36 (1.58) | 1.34 | .82 |
| Infant Mortality Rate (per 100 Births) | 8.27 (4.39) | 8.60 (6.10) | −.12 | −.06 |
| % Children in Poverty | 27.96 (5.60) | 27.80 (10.02) | .04 | .02 |
| Students in Single-Parent Family (Mother Only) | 11.41 (.89) | 11.70 (1.72) | −.39 | −.22 |
| Student–Teacher Ratio | 15.31 (.97) | 15.50 (1.61) | −.26 | −.14 |

NOTE: Statistical significance tests based on 12 df. + p < .10, * p < .05.
Student Sample Characteristics and Admission of Socially Undesirable Behavior/Characteristics
These data were analyzed using hierarchical linear modeling (HLM). All HLMs were run as random-intercept regressions, which assume that additional variability exists among communities on the dependent measure and that, when not statistically controlled, this variability can masquerade as a spurious group difference. Clustering effects were generally small, as suggested by the small intraclass correlation coefficients in the tables presented. Our student-level model was a simple equation with only an intercept:

$$Y_{ij} = \beta_{0j} + r_{ij},$$

where $Y_{ij}$ is the outcome for student $i$ in district $j$, $\beta_{0j}$ is the district mean, and $r_{ij}$ is a student-level error term.
Our level of inference for these models is at the district level, as districts (and not students) were randomly assigned to conditions. As such, the district-level equation included an intercept, a dummy variable representing assignment to condition, and an error term:

$$\beta_{0j} = \gamma_{00} + \gamma_{01}\,\mathrm{Active}_{j} + u_{0j},$$

where $\mathrm{Active}_{j}$ equals 1 for districts assigned to the active consent condition and 0 otherwise, and $u_{0j}$ is a district-level error term.
Whereas HLM was used for continuous outcomes, hierarchical generalized linear modeling (HGLM) was used for dichotomous outcomes, for which we assumed that the outcome was better described by a binomial distribution with a logit link function. Effect sizes were calculated for all comparisons: the odds ratio for dichotomous dependent measures and the effect size r (interpreted like a correlation coefficient; Cohen, 1988) for continuous dependent measures.
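A minimal sketch of this kind of random-intercept model is given below using Python's statsmodels with hypothetical column names; the study itself may have used different software. A true logit-link HGLM for the dichotomous outcomes would require a generalized mixed-model routine, so only the continuous case and the effect-size conversions are sketched here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cleaned student-level file: 'district' is the cluster id, 'active' is
# the 0/1 condition dummy (constant within district); column names are illustrative.
df = pd.read_csv("kyos_2007_clean.csv")

# Random-intercept model for a continuous outcome (students nested in districts).
m = smf.mixedlm("friends_drug_use ~ active", data=df, groups=df["district"]).fit()
print(m.summary())

# Intraclass correlation: between-district variance over total variance.
tau2 = m.cov_re.iloc[0, 0]
icc = tau2 / (tau2 + m.scale)
print(f"ICC = {icc:.3f}")

# Effect size r can be recovered from the condition t-statistic and its df
# (district-level inference: 14 districts - 2 = 12 df, per the paper).
t_active, df_resid = m.tvalues["active"], 12
r = np.sign(t_active) * np.sqrt(t_active**2 / (t_active**2 + df_resid))
print(f"effect size r = {r:.2f}")

# For a dichotomous outcome fit with a logit link, the odds ratio for the active
# condition would simply be exp(coefficient).
```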
Prior to presenting our results, it is important to acknowledge that HLM/HGLM modeling techniques depend on strong assumptions. For example, both HLM and HGLM assume that the random intercepts are independent of the disturbances in the outcome equation. This cannot be tested and may well be wrong. Freedman (2008) has formally shown that random assignment does not justify the use of regression in any of the usual forms, including logistic regression, and that biased estimates of the regression coefficients and standard errors can result. Recent work has suggested that there may be other, more robust modeling procedures (Small et al., 2008). However, HLM has been widely used to analyze cluster randomized experiments (Murray, 1998), and consistent with common practice we do the same. We believe that any biases that may result are not large enough to materially affect our findings. For example, if one simply compares the outcomes for the active and passive consent groups in a totally model-free manner, the estimates of treatment effects are very similar to those produced by the models. When we examined simple mean/percentage differences using the unadjusted means/percentages reported in Tables 2, 3, and 4 (below), the direction of the model-free findings was identical to the conclusions of the HLM/HGLM statistical tests. Further, in all but two cases where we found statistically significant differences, the 95% confidence intervals for the active and passive consent conditions did not overlap; lifetime and past 30 day cocaine/crack use were the exceptions.
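The model-free cross-check described above can be approximated by computing a 95% confidence interval for each condition's prevalence and checking for overlap. The sketch below uses illustrative counts rather than the study's actual cell sizes.

```python
import numpy as np

def wald_ci(p_hat, n, z=1.96):
    """95% Wald confidence interval for a proportion."""
    se = np.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# Illustrative counts only (e.g., lifetime marijuana use by condition).
passive_yes, passive_n = 340, 4000
active_yes, active_n = 120, 1400

lo_p, hi_p = wald_ci(passive_yes / passive_n, passive_n)
lo_a, hi_a = wald_ci(active_yes / active_n, active_n)
overlap = not (hi_a < lo_p or hi_p < lo_a)
print(f"passive: [{lo_p:.3f}, {hi_p:.3f}]  active: [{lo_a:.3f}, {hi_a:.3f}]  "
      f"overlapping: {overlap}")
```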
Table 2. Response and refusal rates (in percentages) by consent condition.

| | Passive Mean (SD) | Active Mean (SD) | t | d |
|---|---|---|---|---|
| All Districts | | | | |
| Response Rate | 78.59 (13.62) | 29.10 (24.04) | 4.74* | 2.63 |
| Refusal Rate | 1.03 (.89) | 2.66 (3.06) | −1.35 | −.82 |
| Original Districts Only | | | | |
| Response Rate | 77.58 (14.63) | 35.92 (25.61) | 3.40* | 2.07 |
| Refusal Rate | .73 (.47) | 3.00 (3.64) | −1.53 | −1.10 |

NOTE: Statistical significance tests based on 12 df for all districts and 9 df for original districts only. * p < .05.
Table 3. Demographic characteristics of responding students by consent condition.

| | Passive Mean (SD) | Active Mean (SD) | t | d |
|---|---|---|---|---|
| Age | 14.28 (2.22) | 13.51 (2.18) | −2.72* | −.35 |
| Grade | 8.86 (2.16) | 8.09 (2.13) | −2.62* | −.36 |
| % Male | 52.00 (50.00) | 44.44 (50.00) | −5.08* | −.15 |
| % Caucasian | 95.28 (40.88) | 92.31 (44.21) | −1.04 | −.07 |
| % Lives with Mother & Father | 52.19 (50.00) | 56.35 (50.00) | 1.53 | .08 |
| % Lives with Mother Only | 15.89 (48.16) | 14.32 (47.71) | −1.30 | −.03 |
| % Lives with Father Only | 3.56 (38.85) | 3.03 (37.70) | −.97 | −.01 |
| % Free/Reduced Lunch | 48.76 (50.00) | 41.49 (49.99) | −.97 | −.15 |

NOTE: Statistical significance tests based on 12 df. * p < .05.
Table 4. Antisocial behavior, substance use, and mediator variables by consent condition (unadjusted %/Mean (SD)).

| | Passive | Active | t | OR/r‡ | ICC |
|---|---|---|---|---|---|
| Antisocial Behavior (Past 12 Months) | | | | | |
| School Suspension | 10.20 (45.94) | 6.40 (43.00) | −2.60* | .58 | .02 |
| Carrying Handgun | 6.41 (43.00) | 4.01 (39.71) | −3.06* | .53 | .02 |
| Selling Illegal Drugs | 3.34 (38.40) | 1.29 (31.66) | −3.73* | .32 | .03 |
| Arrested | 2.97 (37.55) | 2.21 (35.43) | −1.42 | .69 | .02 |
| Attacking Someone | 10.20 (45.94) | 9.27 (45.38) | −1.05 | .90 | <.01 |
| Drunk/High at School | 7.63 (44.16) | 4.37 (40.33) | −3.79* | .51 | .01 |
| Carrying Handgun at School | .42 (24.54) | .36 (23.73) | −.30 | .86 | <.01 |
| Lifetime Substance Use | | | | | |
| Smokeless Tobacco | 26.68 (49.67) | 18.53 (48.74) | −3.68* | .59 | .01 |
| Cigarettes | 41.10 (49.99) | 30.94 (49.86) | −3.64* | .61 | .01 |
| Alcohol | 44.03 (50.00) | 34.12 (49.93) | −2.20+ | .62 | .04 |
| Marijuana | 16.04 (48.20) | 8.63 (44.95) | −4.18* | .46 | .02 |
| Inhalants | 9.66 (45.63) | 9.71 (45.66) | .05 | 1.01 | <.01 |
| Cocaine/Crack | 3.04 (37.71) | 1.23 (31.30) | −2.96* | .41 | .02 |
| Narcotics | 10.25 (45.97) | 8.60 (44.93) | −1.60 | .78 | .01 |
| Methamphetamines | 1.33 (31.86) | .80 (28.47) | −1.65 | .56 | .02 |
| OTC Drugs to Get High | 8.27 (44.67) | 6.39 (42.98) | −2.34+ | .76 | <.01 |
| 30 Day Substance Use | | | | | |
| Smokeless Tobacco | 13.67 (47.49) | 7.39 (43.96) | −4.45* | .50 | .01 |
| Cigarettes | 18.21 (48.68) | 11.41 (46.57) | −3.71* | .47 | .03 |
| Alcohol | 17.47 (48.53) | 12.17 (46.91) | −2.90* | .61 | .02 |
| Being Drunk | 15.15 (47.96) | 8.78 (45.05) | −3.90* | .47 | .02 |
| Marijuana | 5.71 (42.22) | 2.30 (35.72) | −4.27* | .33 | .02 |
| Inhalants | 2.68 (36.81) | 2.44 (36.12) | −.51 | .91 | <.01 |
| Cocaine/Crack | .93 (29.47) | .22 (21.20) | −2.43+ | .24 | <.01 |
| Narcotics | 3.57 (38.88) | 2.96 (37.53) | −.90 | .80 | .03 |
| Methamphetamines | .44 (24.89) | .15 (19.24) | −1.50 | .33 | <.01 |
| OTC Drugs to Get High | 2.94 (37.48) | 3.19 (38.08) | .49 | 1.09 | <.01 |
| Past 2 Week Binge Drinking | 12.02 (46.85) | 7.23 (43.81) | −5.07* | .57 | <.01 |
| School Perceived as Safe | 86.37 (47.48) | 88.79 (46.47) | 1.34 | 1.41 | .05 |
| Mediators of Antisocial & Using Behavior | | | | | |
| Friends' Use of Drugs | .55 (.74) | .37 (.61) | −4.04* | −.76 | .01 |
| Perceived Harm of Drug Use | 1.97 (.79) | 2.04 (.84) | 1.76 | .45 | .01 |
| # of Consequences Due to Substance Use | .69 (1.60) | .55 (1.42) | −2.10 | −.52 | <.01 |
| Perceptions of Substance Use Being Wrong | 1.44 (.57) | 1.32 (.51) | −2.55+ | −.59 | .03 |
| Perceptions of Alcohol Use Being Wrong | 1.14 (.36) | 1.10 (.31) | −2.39+ | −.57 | .01 |
| Average Age of First Use | 6.61 (1.82) | 7.01 (1.53) | 3.50* | .71 | .01 |
| Average Age of Delinquent Behavior | 7.39 (1.21) | 7.57 (1.06) | 2.84* | .63 | <.01 |

NOTE: Negatively signed rs and t-values, as well as odds ratios less than one, indicate that the passive condition had higher values than the active condition. Statistical significance tests based on 12 df. + p < .10. * p < .05. ‡ Odds ratios (OR) are reported for all variables except the mediators of antisocial and using behavior, for which the effect size r is reported.
RESULTS
Impact of Active Consent on Response and Refusal Rates
Survey management forms completed by survey administrators in each of the 14 participating districts allowed for calculation of district response rates and parental refusal rates. Table 2 presents information on the average response and refusal rates achieved by districts that were randomly assigned to active and passive consent conditions. Table 2 shows that districts that were randomly assigned to a passive consent condition on average achieved a response rate of 79% (calculated as AAPOR RR1). In contrast, districts that were randomly assigned to an active consent condition achieved an average response rate of only 29%. This difference was statistically significant at the .001 level.
Table 2 also presents information about average parental refusal rates for the active and passive consent districts. Districts that were randomly assigned to a passive consent condition reported, on average, that 1% of parents of students in grades 6, 8, 10, and 12 returned a signed consent form indicating that their child did not have permission to participate in the 2007 KYOS. In contrast, in districts that were randomly assigned to an active consent condition, 2.7% of parents returned a signed consent form indicating that their child did not have permission to participate in the survey. This difference in refusal rates approached, but did not reach, significance at the .10 level. The results were similar when the analyses were confined to the eleven districts that were unaffected by the replacement process.
As noted above, active consent requires that a non-returned consent form be treated as a refusal of consent. The 2.7% average explicit refusal rate for districts that were randomly assigned to use active consent procedures also confirms a result from previous literature (Dillman, Eltinge, Groves, & Little, 2002; Thompson, 1984): the vast majority of nonresponse in active consent districts stemmed not from explicit refusals by parents but from parental consent forms not being returned to district survey administrators.
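For illustration, district-level response and refusal rates could be tabulated from the survey management form counts roughly as follows. The counts and column names below are hypothetical; the study computed response rates as AAPOR RR1, treated here as completed surveys over eligible enrolled students.

```python
import pandas as pd

# Hypothetical per-district counts from the survey management forms.
forms = pd.DataFrame({
    "district": ["A", "B"],
    "condition": ["passive", "active"],
    "eligible_students": [1200, 1100],   # enrolled in grades 6, 8, 10, and 12
    "completed_surveys": [960, 310],
    "parental_refusals": [12, 30],       # signed forms denying permission
})

# RR1-style response rate: completes over all eligible students; under active
# consent, students with unreturned consent forms count against the rate.
forms["response_rate"] = forms["completed_surveys"] / forms["eligible_students"]
forms["refusal_rate"] = forms["parental_refusals"] / forms["eligible_students"]
print(forms[["district", "condition", "response_rate", "refusal_rate"]])
```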
Impact of Active Consent on Sample Demographics
The survey questionnaire for the 2007 Kentucky Youth Outcomes Survey included a number of demographic questions including age, grade, gender, race, and student living arrangements (both parents, mother-only, father-only). The questionnaire also included a measure of Hispanic ethnicity; however, there were too few Hispanic students in the sample to permit comparisons across active and passive consent districts. Table 3 (below) presents information on the sample demographic variables for the seven active consent and seven passive consent districts.
Table 3 shows that three of the eight variables examined had statistically significant effects in the hypothesized direction. Students who completed the survey in districts that were randomly assigned to the active consent condition tended to be younger and in lower grades than students completing the survey in districts that were randomly assigned to the passive consent condition. In addition, students completing the survey in districts that had been randomly assigned to the active consent condition were less likely to be male. These three effects are consistent with findings in previous literature that the use of active consent typically reduces the representation of male students and older students in survey samples. It also is worth noting that the active and passive sub-samples did not differ statistically on the other five variables and that, with the exception of free and reduced price lunch eligibility, the trends for the other demographic variables (student race, % living with both parents, % living with mother only, and % living with father only) were contrary to the hypothesized direction.
Impact of Active Consent on Substantive Survey Variables
The survey questionnaire for the 2007 Kentucky Youth Outcomes Survey focused on measuring student alcohol, tobacco, and other drug use, school safety and student engagement in a variety of antisocial behaviors, along with risk and protective factor variables associated with those behaviors. Table 4 presents information on risk and protective factor scale scores and on student engagement in antisocial behavior for students completing the survey in the seven active consent and seven passive consent districts. The variables in Table 4 include two types of measures—individual dichotomous measures of the prevalence of student engagement in antisocial behavior and substance use, and multi-item scales for risk and protective factors.1
Table 4 shows that our findings were in the predicted direction, with admissions of antisocial and substance-using behavior being more likely in the passive consent condition than in the active consent condition. All but two of the tests were in the predicted direction, and most of the differences were statistically significant. Considering only significant findings, students in the passive consent condition were more likely to admit having engaged in antisocial behavior in the past 12 months (having been suspended, having carried a handgun, having sold illegal drugs, and having been drunk/high at school), having used substances in their lifetime (smokeless tobacco, cigarettes, marijuana, and cocaine/crack), having used substances in the past 30 days (smokeless tobacco, cigarettes, alcohol, having been drunk, and marijuana), and having engaged in binge drinking in the past two weeks. Examination of the mediators of substance use indicated that students in the passive condition reported significantly more friends who used drugs and significantly earlier average ages of initiation for substance use and delinquent behavior.
DISCUSSION
The results presented above in Tables 2-4 support the three hypotheses that guided this study. First, we found evidence that the use of active parental consent procedures resulted in strikingly lower survey response rates (about 50 percentage points lower for our active consent subsample). We found that the overwhelming majority of this nonresponse in the active condition was due to consent forms not being returned. Second, we found that the use of active consent procedures also affected the demographic characteristics of the resulting sample, with students in the active consent sub-sample being significantly younger, in lower grades, and less likely to be male. These results are consistent with previous research and with theoretical expectations. However, the active and passive sub-samples did not differ statistically on the other five demographic variables tested. Third, our results supported our hypothesis that the use of active consent procedures resulted in lower lifetime and past 30 day prevalence rates for most drugs and for most antisocial behaviors measured in the survey instrument used for the 2007 Kentucky Youth Outcomes Survey. This last set of findings suggests that the use of active consent procedures can lead to nonresponse bias in prevalence rates and on substantive survey variables.
Additionally, it is important to note that there was one threat to the integrity of the experimental design. As noted above, three districts dropped out of the study after committing to participate. Maintaining the statistical power of the study required that replacement districts be recruited and randomly assigned to conditions. However, as noted above, the analyses were run two different ways (with just the 11 districts that were “original” to the study and unaffected by the drop-out issue and with the full seven pairs of original and “replacement” districts) and no significant differences between the “original” and the “replacement” districts were found (aside from minor power issues with the reduced complement of original districts).
Despite the challenge noted above, a great deal of effort went into maintaining the fidelity of the experimental design. For example, the present study provided two tangible incentives to districts to participate in the project—a $500 incentive and a survey report that was patterned after what they typically received as part of the ongoing biennial KIP survey (and which districts had experience and success using). The problem experienced in the current study, with three districts dropping out after being randomized to conditions, leads to the practical question of what, if anything, could have been done to prevent this from happening. Upon notification from each of the districts that dropped out, the lead author (and Principal Investigator) began a series of conversations with officials in each of those districts designed to address their concerns and keep them in the study. Although the present study was limited in its financial resources, the principal investigator offered to provide additional financial incentives to those districts and offered to adjust time frames for survey administration. In all three cases it became clear that the district concerns revolved around standardized testing and the KYOS causing a loss of instructional time—a concern that the study had no feasible way to resolve. Given that time demands on educators are not likely to decrease in the future, this experience leads to a practical recommendation that future experimental studies involving school districts build in sufficient budgetary resources to oversample districts, thus providing a "cushion" in case some districts drop out of the experimental conditions. However, this experience also emphasizes that additional research is needed to better understand how to encourage school districts to participate in research efforts.
Finally, there are two other areas in which additional research is needed. First, although many studies have examined the impact of active consent at the student level, far fewer studies have conducted research with parents to understand why they do or do not sign and return consent forms. Findings of this type can help in formulating language for consent forms and in developing practical mechanisms to raise the likelihood that parents receive and read them. We currently are conducting this type of study with two active consent districts and two passive consent districts.
A second important area for future research involves developing new mechanisms to collect data about students who do not have consent to participate in a student survey. This type of research is critically important because, no matter how well a student survey is planned and implemented, some students are not included. Understanding the demographic, attitudinal, and behavioral profiles of those "hard core" nonresponding students is needed to better assess the degree of nonresponse bias that persists in student survey data; this information also can help researchers further refine the recruitment strategies and consent protocols used in student surveys.
Although much research remains to be done, the contributions of our study are its findings that the use of active consent procedures can reduce survey response rates, affect sample demographics, and affect substantive survey variables. Future research focused on better understanding the parental dynamics of the active consent process will help researchers better target practical efforts to increase the return of consent forms, and in doing so optimize data quality in student surveys that are required to use active consent.
Acknowledgments
Funding for this project was provided by the National Institute on Drug Abuse (NIDA) under grant #1 R01DA019972-01A1, M. Courser, PI. We thank Paul Gruenwald, Linda Young, and the anonymous reviewers for their assistance and helpful comments on an earlier version of this paper. A previous version of this paper was presented at the 2008 Annual Meeting of the American Association for Public Opinion Research, May 15-18, New Orleans, LA.
Footnotes
1. Risk factors are variables that are associated with ATOD use or engagement in antisocial behavior; protective factors have been found to help reduce the likelihood that youth will use drugs or engage in antisocial behavior. Psychometric analyses (factor and reliability analyses) were conducted to ensure that the scales were robust.
Contributor Information
Matthew W. Courser, Associate Research Scientist, Pacific Institute for Research and Evaluation—Louisville Center, 1300 S. 4th Street, Suite 300, Louisville, KY 40208, (502) 634-3694, x7381 (ph), (614) 995-4223 (fax), mcourser@pire.org
Stephen R. Shamblen, Associate Research Scientist Pacific Institute for Research and Evaluation—Louisville Center
Paul J. Lavrakas, Research Methodologist, Stamford, CT
David Collins, Research Scientist Pacific Institute for Research and Evaluation—Louisville Center
Paul Ditterline, Research Associate Pacific Institute for Research and Evaluation—Louisville Center.
REFERENCES
- Arthur M, Hawkins D, Catalano R. Student survey of risk and protective factors and prevalence of alcohol, tobacco, & other drug use. 1998. Prepared by Developmental Research and Programs, Inc., Seattle, for the Diffusion Consortium Project.
- Baker J, Yardley J, McCaul K. Characteristics of responding, nonresponding, and refusing parents in an adolescent lifestyle choice study. Evaluation Review. 2001;25:605–618. doi: 10.1177/0193841X0102500602.
- Brehm J. The Phantom Respondents: Opinion Surveys and Political Representation. University of Michigan Press; Ann Arbor: 1993.
- Cohen J. Statistical power analysis for the behavioral sciences. 2nd ed. Erlbaum; Hillsdale, NJ: 1988.
- Curtin R, Presser S, Singer E. The effects of response rate changes on the index of consumer sentiment. Public Opinion Quarterly. 2000;64:413–428. doi: 10.1086/318638.
- Dempster A, Laird N, Rubin D. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B. 1977;39(1):1–38.
- Dent C, Galaif J, Sussman S, Stacy A, Burtun D, Flay B. Demographic, psychosocial, and behavioral differences in samples of actively and passively consented adolescents. Addictive Behaviors. 1993;18:51–56. doi: 10.1016/0306-4603(93)90008-w.
- Dillman D. Mail and Internet Surveys: The Tailored Design Method. 2nd ed. John Wiley and Sons; New York: 2000.
- Eaton DK, Lowry R, Brener ND, Grunbaum JA, Kann L. Passive vs. active parental permission in school-based survey research. Evaluation Review. 2004;28:564–577. doi: 10.1177/0193841X04265651.
- Ellickson P, Hawes J. An assessment of active versus passive methods of obtaining parental consent. Evaluation Review. 1989;13:45–55. doi: 10.1177/0193841X8901300104.
- Esbensen F, Miller M, Taylor T, He N, Freng A. Differential attrition rates and active parental consent. Evaluation Review. 1999;23:316–335. doi: 10.1177/0193841X9902300304.
- Fendrich M, Johnson T. Examining prevalence differences in three national surveys: Impact of consent procedures, mode, and editing rules. Journal of Drug Issues. 2001;31:615–642.
- Freedman DA. Randomization does not justify logistic regression. Statistical Science. 2008;23:237–249.
- Groves RM. Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly. 2006;70(5):646–675.
- Groves RM, Presser S, Dipko S. The role of topic interest in survey participation decisions. Public Opinion Quarterly. 2004;68:2–31.
- Groves RM, Dillman DA, Eltinge JL, Little RJA. Survey Nonresponse. John Wiley and Sons; New York: 2002.
- Groves RM, Singer E, Corning A. Leverage-saliency theory of survey participation: Description and an illustration. Public Opinion Quarterly. 2000;64:299–308. doi: 10.1086/317990.
- Hawkins JD, Catalano RF, Miller JY. Risk and protective factors for alcohol and other drug problems in adolescence and early adulthood: Implications for substance abuse prevention. Psychological Bulletin. 1992;112:64–105. doi: 10.1037/0033-2909.112.1.64.
- Henry K, Smith E, Hopkins A. The effect of active parental consent on the ability to generalize the results of an alcohol, tobacco, and other drug prevention trial to rural adolescents. Evaluation Review. 2002;26:645–655. doi: 10.1177/0193841X0202600604.
- Holbrook A, Pfent A, Krosnick J. Response rates in recent surveys conducted by non-profits and commercial survey agencies and the news media. Paper presented at the 2003 annual meeting of the American Association for Public Opinion Research; Nashville, TN. 2003.
- Hollmann C, McNamara J. Considerations in the use of active and passive parental consent procedures. Journal of Psychology. 1999;133:141–156.
- Johnson K, Bryant D, Rockwell E, Moore M, Straub B, Cummings P, Wilson C. Obtaining active parental consent for evaluation research: A case study. American Journal of Evaluation. 1999;20:239–249.
- Kearney K, Hopkins A, Mauss A, Weisheit R. Sample bias resulting from a requirement for written parental consent. Public Opinion Quarterly. 1983;47:96–102.
- Keeter S, Miller C, Kohut A, Groves R, Presser S. Consequences of reducing nonresponse in a national telephone survey. Public Opinion Quarterly. 2000;64:125–148. doi: 10.1086/317759.
- Langhinrichsen-Rohling J, Arata C, O'Brien N, Bowers D, Kilbert J. Sensitive research with adolescents: Just how upsetting are self-report surveys anyway? Violence and Victims. 2006;21:425–444.
- Lavrakas P. Telephone Survey Methods: Sampling, Selection, and Supervision. 2nd ed. Sage; Newbury Park: 1993.
- Leakey T, Lunde K, Koga K, Glanz K. Written parental consent and the use of incentives in a youth smoking prevention trial: A case study from Project SPLASH. American Journal of Evaluation. 2004;25:509–523.
- Lueptow L, Mueller S, Hammes R, Master L. The impact of informed consent regulations on response rate and response bias. Sociological Methods and Research. 1977;6:183–204. doi: 10.1177/004912417700600204.
- MacGregor E, McNamara J. Comparison of return procedures involving mailed versus student delivered parental consent forms. Psychological Reports. 1995;77:1113–1114.
- Mammel K, Kaplan D. Research consent by adolescent minors and institutional review boards. Journal of Adolescent Health. 1995;17:323–330. doi: 10.1016/1054-139x(95)00176-s.
- Murray DM. Design and Analysis of Group-Randomized Trials. Oxford University Press; New York, NY: 1998.
- Pokorny S, Jason L, Schoeny M, Townsend S, Curie C. Do participation rates change when active consent procedures replace passive consent? Evaluation Review. 2001;25:567–580. doi: 10.1177/0193841X0102500504.
- Raudenbush SW, Bryk A. Hierarchical Linear Models. 2nd ed. Sage; Newbury Park, CA: 2002.
- Rubin DB. Bias reduction using Mahalanobis-metric matching. Biometrics. 1980;36:293–298.
- Schuster M, Bell R, Berry S, Kanouse D. Impact of a high school condom availability program on sexual attitudes and behaviors. Family Planning Perspectives. 1998;30:67–72, 88.
- Severson H, Ary D. Sampling bias due to consent procedures with adolescents. Addictive Behaviors. 1984;8:433–437. doi: 10.1016/0306-4603(83)90046-1.
- Severson H, Biglan A. Rationale for the use of passive consent in smoking prevention research: Politics, policy, and pragmatics. Preventive Medicine. 1989;18:267–279. doi: 10.1016/0091-7435(89)90074-1.
- Small D, TenHave T, Rosenbaum P. Randomization inference in a group-randomized trial of treatments for depression: Covariate adjustment, noncompliance, and quantile effects. Journal of the American Statistical Association. 2008;103:271–279.
- Thompson T. A comparison of methods of increasing parental consent rates in social research. Public Opinion Quarterly. 1984;48:779–787.
- White V, Hill D, Effendi Y. How does active parental consent influence the findings of drug use surveys in schools? Evaluation Review. 2004;28:246–260. doi: 10.1177/0193841X03259549.