BMJ Open. 2021 Jan 12;11(1):e042133. doi: 10.1136/bmjopen-2020-042133

Protocol for the process evaluation of a complex intervention delivered in schools to prevent adolescent depression: the Future Proofing Study

Joanne R Beames 1, Raghu Lingam 2, Katherine Boydell 1, Alison L Calear 3, Michelle Torok 1, Kate Maston 1, Isabel Zbukvic 4, Kit Huckvale 1, Philip J Batterham 3, Helen Christensen 1, Aliza Werner-Seidler 1
PMCID: PMC7805380  PMID: 33436468

Abstract

Introduction

Process evaluations provide insight into how interventions are delivered across varying contexts and why interventions work in some contexts and not in others. This manuscript outlines the protocol for a process evaluation embedded in a cluster randomised trial of a digital depression prevention intervention delivered to secondary school students (the Future Proofing Study). The purpose is to describe the methods that will be used to capture process evaluation data within this trial.

Methods and analysis

Using a hybrid type 1 design, a mixed-methods approach will be used with data collected in the intervention arm of the Future Proofing Study. Data collection methods will include semistructured interviews with school staff and study facilitators, automatically collected intervention usage data and participant questionnaires (completed by school staff, school counsellors, study facilitators and students). Information will be collected about: (1) how the intervention was implemented in schools, including fidelity; (2) school contextual factors and their association with intervention reach, uptake and acceptability; (3) how school staff, study facilitators and students responded to delivering or completing the intervention. How these factors relate to trial effectiveness outcomes will also be assessed. Overall synthesis of the data will provide school cluster-level and individual-level process outcomes.

Ethics and dissemination

Ethics approval was obtained from the University of New South Wales (NSW) Human Research Ethics Committee (HC180836; 21st January 2019) and the NSW Government State Education Research Applications Process (SERAP 2019201; 19th August 2019). Results will be submitted for publication in peer-reviewed journals and discussed at conferences. Our process evaluation will contextualise the trial findings with respect to how the intervention may have worked in some schools but not in others. This evaluation will inform the development of a model for rolling out digital interventions for the prevention of mental illness in schools.

Trial registration number

ACTRN12619000855123; https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=377664&isReview=true.

Keywords: child & adolescent psychiatry, depression & mood disorders, preventive medicine


Strengths and limitations of this study

  • The methodology of this embedded process evaluation is underpinned by implementation frameworks and logic modelling.

  • Flexible and pragmatic quantitative and qualitative data collection methods will be used to balance research rigour with feasibility within the school delivery context.

  • Process data from a range of key stakeholders will be collected, including school staff, study facilitators and students.

  • To minimise burden on schools, fidelity data from teachers and in-depth qualitative data from students will not be collected.

  • The methodology and study processes were pilot tested to ensure appropriateness within the school context.

Introduction

Embedded in randomised controlled trials (RCTs), process evaluations provide insight into why interventions work in some contexts and not in others. These evaluations can help to demystify the ‘black box’ of complex intervention trials by taking into account contextual factors, differences in how the intervention is delivered and adaptations made when delivering the intervention into a particular system.1 2 Contextual factors typically include features of the organisation or broader environment that influence the delivery of the intervention (eg, leadership, engagement, culture, political landscape). Considering the contribution of contextual factors is necessary to aid interpretation of trial outcomes, maximise the knowledge gained from trials, identify optimal delivery processes across different settings and inform broader dissemination efforts. In recognition of this need, the UK’s Medical Research Council (MRC) set out a framework that emphasises the value of process evaluations in capturing both contextual and implementation factors associated with complex interventions.3

In line with best practice recommendations that study protocols prespecifying methods and approaches be published to maintain research integrity,4 we describe a protocol for a mixed-methods process evaluation embedded within a cluster RCT (cRCT) known as the Future Proofing Study (FPS). The FPS is a large school-based trial examining whether depression can be prevented using cognitive–behavioural therapy (CBT) delivered by smartphone application.5

There has been an increase in the availability of digital mental health programmes which show promise in addressing the significant disease burden associated with depression.6 Depression often first emerges during adolescence7 and treatment alone cannot adequately reduce this burden.8 Accumulating evidence indicates that early adolescence is the ideal developmental window during which to intervene because it captures young people before the incidence of depression increases exponentially around the age of 16–17 years.9 10 Therefore, increased efforts are being directed toward prevention approaches that are developed specifically for delivery among adolescents.

While there is evidence supporting the value and effectiveness of prevention programmes in schools,11 12 their uptake has been significantly limited by low levels of help-seeking13 as well as practical constraints of cost and scalability. A lack of understanding about how to deliver these interventions sustainably at scale, and how to engage those who stand to benefit, has hampered the translation of prevention approaches into the community.

Two ways to overcome the barriers of cost and scalability of implementing adolescent depression interventions are:

  • Delivering universal prevention programmes in schools.

  • Using technology to deliver programmes, which tends to be lower cost than traditional face-to-face methods.14

Working with schools to deliver universal prevention programmes to every student, regardless of their risk, circumstance or symptom profile, dramatically increases the potential reach of interventions; it also means that these programmes can eventually be integrated into the curriculum and delivered to every student, making this approach a sustainable implementation strategy for widescale delivery and dissemination. Additionally, working with schools to deliver such programmes reduces the need for young people to actively seek professional help. This is critically important because, despite having a significant need, only a minority of young people with mental illness seek help and receive the services they need.15 16 The scaffolding provided by schools can be leveraged to deliver prevention programmes to all students, which fits with a trend for schools to be designated the first point of contact for youth mental health problems when they first emerge.9 Process evaluations of face-to-face mental health programmes, including some delivered in school settings, have been documented (eg, 17). However, we were unable to find any published process evaluations of digital mental health programmes in school settings. Conducting process evaluations specific to digital methods of delivery is important because the contextual barriers and facilitators, as well as concerns around fidelity of the intervention, are likely to have unique characteristics. For example, facilitator training in intervention delivery will differ when supporting the use of an automated programme relative to a face-to-face programme. This process evaluation is an initial step towards addressing that gap.

Technology offers a promising way to deliver mental health interventions to the community. Digital mental health interventions offer two key advantages over face-to-face approaches: first, they are cheaper to access and more cost-effective,14 18 and second, they can reach people across vast geographical areas. The latter is particularly important given Australia’s geography, where remoteness and low population density mean that people in regional and rural areas often do not have the same level of access to mental health services as people in metropolitan areas. With more young people than ever using smartphones, mental health interventions delivered online or through applications represent an exciting avenue for reaching adolescents.

The FPS

The trial that is the subject of this process evaluation is the FPS. The FPS addresses barriers of reach, cost and scalability by delivering a depression prevention programme via a smartphone app to year 8 secondary school students aged 12–13 years. This study is being conducted with approximately 200 Australian schools (up to 10 000 participants), of which half will be allocated to the intervention condition. The primary outcome is symptoms of depression. Secondary outcomes include anxiety, psychological distress and insomnia, among others. Symptom outcomes will be assessed at baseline, post-intervention, 6 months (primary outcome only) and then annually for 5 years. The primary endpoint is 12 months following baseline. More details are available from the Australian New Zealand Clinical Trials Registry and protocol paper.5

The intervention being delivered is known as SPARX, a CBT-based programme incorporating gamification principles. The development and initial evaluation of SPARX have been documented in detail previously.19 SPARX has been tested in an Australian sample of secondary students (delivered via computer) and shown to prevent depression in the lead up to final school examinations.20 The gamified intervention teaches young people about the relationship between thoughts, feelings and behaviour. Skills learnt through SPARX include emotion identification, emotion regulation, behavioural activation (being active), recognising and challenging unhelpful thoughts, and practical problem-solving. SPARX consists of seven 20-minute modules. The intervention is fully automated, and the therapeutic components are standardised. See the trial protocol for full intervention details.5

School engagement and recruitment

The FPS requires multiple levels of approval, beginning with state department and independent school body approval (New South Wales (NSW) Department of Education; Catholic school dioceses), followed by individual school engagement. The engagement strategy involves sending electronic communication material to all schools across NSW and in other Australian capital cities, targeting school principals and well-being staff to invite them to take part in the study. Schools are invited to submit expressions of interest and are subsequently followed up by the research team over the phone to learn more about the study. Schools for which the FPS is a good fit (as determined by the school) are then signed onto the study, with support from the principal, the school counsellor and at least one other staff member (typically a teacher). In line with best practice in implementation science,21 this group of 2–3 staff members (typically not including the principal) forms the school-specific ‘study implementation team’. After signing on to the study, several webinars are scheduled in the lead up to the study start date, so that school staff and parents can listen to a 15-minute study overview from the trial manager and have their questions answered.

Preparation

In preparation for in-class assessment sessions, study facilitators (volunteer research assistants) are recruited to support the study. Facilitators attend schools to introduce the study to students and to ensure the technology is functioning so that students can download the SPARX app and complete the baseline (and post-intervention) questionnaires. All facilitators go through an interview and screening process prior to selection, then attend a half-day face-to-face training session before supporting the study in schools. This training provides an overview of the study and detailed information about their roles within schools, including a step-by-step guide to running the sessions. Discussion and practice are core components of the training.

Aims and objectives

We adapted the MRC framework for complex interventions to focus on the effectiveness of an evidence-based intervention within a specific delivery context (ie, schools). Overall, the objective of this process evaluation is to understand how SPARX is implemented and delivered in schools, and to identify systematic differences and variation in delivery. Specifically, the aims are:

  1. To evaluate the reach (including completion), uptake and acceptability of the intervention (school and student level).

  2. To understand the influence of contextual factors (eg, characteristics of the outer/inner setting, intervention, individuals) on:

    • School-level fidelity to the implementation strategy. For example, different schools will likely provide different levels of study support based on available resourcing.

    • Implementation outcomes (intervention reach, uptake, acceptability), as assessed from the perspectives of school staff, teachers and students. For example, young people’s openness to receiving mental health material via an app will likely impact intervention acceptability and completion.

  3. To examine the impact of school-level variation (in implementation fidelity and outcomes) on clinical effectiveness outcomes at the school and student level. School-level clinical effectiveness is defined as changes in clinical outcomes (eg, self-reported depression) for different schools. The differing ways in which schools support and deliver the intervention will inevitably affect its effectiveness, and this evaluation will assess these differences.3 22 For example, the degree to which senior school leaders treat participation in study activities as a priority for the school will likely affect effectiveness at the cluster level.

This process evaluation has been designed to capture important information from teachers, school staff and students at both the school and individual level, which may ultimately impact the effectiveness of the intervention on clinical mental health outcomes for students. Findings will provide insight into factors which support and/or hinder the implementation of digital universal mental health programmes in school settings. Knowledge gained from this process evaluation will help to inform the development of a model and guide for how to best deliver digital mental health programmes to young people in schools.

Methods and analysis

Design

This study uses a hybrid type 1 approach,23 with a focus on implementation process factors and outcomes in the context of an effectiveness trial. The evaluation is guided by the Consolidated Framework for Implementation Research (CFIR) and the RE-AIM (Reach, Effectiveness, Adoption, Implementation, Maintenance) framework.24 25 In keeping with Nilsen’s categorisation of implementation theories, models and frameworks, these frameworks help us to understand different parts of the implementation process.26 The CFIR will be used to identify barriers and facilitators to intervention implementation and effectiveness.26 The CFIR is a widely used determinant framework which has been applied in multiple settings,27 including schools (eg, 28). The CFIR identifies five major domains: (1) the outer setting, which includes the social, political and economic context within which the implementing organisation exists; (2) the inner setting, which includes features and characteristics of the organisation such as leadership and relative priority; (3) the characteristics of individuals, which include organisational staff knowledge and attitudes about the intervention, and their role and identification within the wider organisation; (4) the characteristics of the intervention itself and (5) implementation processes, which include the ways that the intervention will be delivered in a given context (including fidelity to the implementation strategy). Normalisation Process Theory (NPT) will be used to provide additional insights into implementation processes.29 NPT aims to identify and explain how new interventions are implemented and become integrated into routine care.

The RE-AIM framework will be used to evaluate the implementation outcomes, including intervention reach, uptake and acceptability/appropriateness.24 30 These outcomes also map onto the framework of implementation outcomes proposed by Proctor et al.31 32 Reach refers to the proportion of eligible participants who opened, used and completed the intervention, as well as the proportion of students from the entire cohort of eligible students in intervention schools who consented to participate. Uptake refers to the proportion of schools that were onboarded to the study (intervention and control arms), and of school staff who were willing to support the delivery of the intervention (intervention arm only). Reach and uptake also incorporate the representativeness of the sample (school level and individual level). Acceptability and appropriateness refer to the perceived agreeableness or fit of the intervention. This evaluation will incorporate how the barriers and facilitators identified through the CFIR impacted the implementation outcomes, and, in turn, how these implementation outcomes impacted effectiveness outcomes.

Logic model

Following MRC guidance,3 the research team developed a logic model for the FPS process evaluation in a series of participatory workshops. The logic model was prospectively informed by key CFIR constructs identified in previous literature as being important in school-based studies, but also feasible and appropriate to measure within the school context in the FPS. The key constructs included the outer setting, inner setting, individual characteristics and intervention characteristics. The logic model (figure 1) was developed to consider the key factors (mapped to the CFIR) that would potentially impact implementation of the intervention (mapped to RE-AIM), as well as its clinical effectiveness. The process evaluation methods, including the selection of dependent variables and design of surveys and semistructured interview guides, were derived from this logic model. See table 1 for CFIR/RE-AIM domains, key research questions, process data and data that will be collected within each domain.

Figure 1. Logic model. The model shows that CFIR constructs, including school context characteristics, school organisational characteristics and individual characteristics, will influence how staff engage with the implementation strategy. The intervention itself, which includes the core cognitive–behavioural therapeutic components, is conceptualised as standardised across individuals because it is delivered digitally, follows a fixed schedule and does not incorporate tailored content. The yellow input factors are expected to vary across schools and individuals, thus influencing engagement with and flexibility of the implementation strategy and, in turn, implementation outcomes and student-level outcomes. The logic model and implementation plan were externally peer reviewed by an experienced and internationally recognised implementation scientist outside the team within an implementation workshop. For details on assessment of these factors, see table 1. CFIR, Consolidated Framework for Implementation Research; FPS, Future Proofing Study.

Table 1.

Process evaluation details including process data, outcome data, data type and source

Each entry below lists the CFIR or RE-AIM construct, the research aim it addresses, the process or outcome data, and the data type and source.

Outer setting
  School contextual characteristics. Research aim: what was the broad context of the schools in which the SPARX intervention was delivered? (aim 2). Process data: school socioeconomic index; school location (metropolitan/regional). Data sources: publicly available information (ICSEA, GPS).

Inner setting
  School organisational characteristics. Research aims: what were the characteristics of the delivery environments (schools)? What were the barriers and facilitators that affected buy-in, delivery and student uptake? (aim 2). Process data: school size, type, composition and funding; school culture; implementation climate, relative priority, competing demands, leadership, school counsellor availability and level of support, networks and communication, school culture and climate, and school readiness for implementation. Data sources: publicly available information (school size, type, funding); SSPESH (assesses school culture); Implementation Climate measures (staff and students); Relative Priority measure; Competing Demands measure; interviews with school staff; checklists completed by the trial manager; other administrative data, including the number of staff allocated to assist with delivery and the consent process, and the level of communication with the research team.
  School leadership. Research aim: how supportive of delivering SPARX were school principals, deputy principals and executives? (aim 2). Process data: level of support and buy-in from school leaders. Data source: interviews with school staff.

Individual characteristics
  School staff. Research aims: what were the characteristics (including attitudes, beliefs, traits) of school staff supporting the delivery of the intervention? How supportive of delivery were school staff who were involved on the ground? (aim 2). Process data: age, gender, current employment, role, etc; leadership, skills, motivations, expectations, self-efficacy, time available, knowledge and beliefs about the intervention, and study buy-in. Data sources: demographics questionnaire; Implementation Leadership Scale; interviews with school staff and study facilitators.
  Study facilitators. Research aim: how well did study facilitators attending schools support the delivery of the intervention? (aim 2). Process data: age and employment; skills, self-efficacy or confidence, motivations and expectations. Data sources: demographics questionnaire; self-confidence measure; interviews with study facilitators.
  Students. Research aim: what were the characteristics of young people that affected intervention uptake and effectiveness? (aim 2). Process data: history of mental illness. Data source: reported by year 8 students as part of the online FPS survey.

Intervention characteristics
  SPARX. Research aims: were there any barriers to intervention use? (aims 2 and 3). What do staff think about the efficacy and advantage of using the intervention? (aim 2). Process data: technical issues; evidence strength and quality, relative advantage. Data sources: logs of technical issues sent through schools, parents and participants; IT data pertaining to technical problems; informal feedback provided by schools and research staff attending schools; Relative Advantages measure; Anticipated Benefits measure; interviews with school staff.

Implementation processes
  Normalisation and integration. Research aim: how did school staff perceive the implementation processes? (aim 2). Process data: coherence, cognitive participation and collective action. Data source: NoMAD.
  Fidelity to the implementation strategy. Research aim: to what extent was the intervention implemented as planned? (aim 2). Process data: school delivery of the FP programme, including changes to the plan. Data sources: completed implementation checklists, emails and feedback forms; interviews with school staff.

Implementation outcomes
  Reach. Research aim: to what extent did those who were eligible to receive SPARX use it? (aim 1). Outcome data: proportion of eligible participants who consented to participate; proportion who opened, used and completed the SPARX intervention; representativeness of the student sample. Data sources: administrative data about consent; digital analytic data including usage (app downloads, installs, opens), completion rate (number of modules completed) and time spent using SPARX; reported by year 8 students as part of the online FPS survey.
  Uptake. Research aim: how many eligible schools participated in the study, and within those schools, how many staff supported the delivery of SPARX? (aim 1). Outcome data: proportion of eligible schools that were onboarded to the study; proportion of school staff (in intervention schools) who supported SPARX; representativeness of the sample. Data sources: administrative data; publicly available information about schools and self-report demographic data from school staff.
  Acceptability/appropriateness. Research aims: how satisfied were participants with the intervention? How satisfied were school staff with supporting the FP programme (including SPARX)? (aim 1). Outcome data: acceptability/appropriateness of the intervention and expectations (students); acceptability/appropriateness of the FP programme (including SPARX) (staff). Data sources: reported by year 8 students as part of the online FPS survey; informal conversations and feedback provided by year 8 students; Intervention Appropriateness Measure; interviews with school staff; informal conversations and feedback provided by school staff.

Across domains
  • How might the relationship between the intervention, the staff supporting the programme and the context of each school shape variation in outcomes (implementation strength metric)? (aims 2 and 3)
  • How might school-level variation (in implementation fidelity and outcomes) affect clinical effectiveness outcomes (eg, self-reported depression)? (aim 3)
  • What key lessons emerge from this study that can be generalised to the implementation of digital mental health programmes in schools more broadly?

The process data and outcomes are mapped onto figure 2.

CFIR, Consolidated Framework for Implementation Research; FPS, Future Proofing Study; GPS, Global Positioning System; ICSEA, Index of Community Socio-Educational Advantage; IT, information technology; NoMAD, Normalisation Measure Development questionnaire; RE-AIM, Reach, Effectiveness, Adoption, Implementation, Maintenance; SSPESH, Survey of School Promotion of Emotional and Social Health.

A pilot study of eight schools conducted in 2019 was used to assess the suitability of the planned implementation strategy and process evaluation methods. We retrospectively applied our logic model to the collected data to evaluate whether our methodology captured the relevant constructs and sufficient variation in these constructs. We integrated learning from this pilot study using a dynamic feedback process to strengthen our methodology for the full-scale FPS phase. Some minor changes relating to the CFIR were made following this pilot, including a greater emphasis on subconstructs such as school climate (inner setting).

School implementation strategy

The school implementation strategy was developed by the study authors. The authors drew on their experience in school-based intervention delivery and integrated feedback from teachers and school staff from several completed school-based trials that delivered digital interventions to school students.20 33 Stakeholder consultation specifically for this study involved discussions with the Department of Education, consultation with several school parent committees and consultation with both youth and parent Lived Experience Advisory Panels. This strategy was also refined following the first pilot wave involving eight intervention schools.

The school ‘study implementation teams’ are principally responsible for the implementation of the intervention and liaising with the research team. As described earlier, these teams typically incorporate at least one classroom teacher and one school counsellor to assist with the intervention delivery. Study facilitators also support the delivery of the SPARX intervention by attending schools for the first school session.

Implementation strategy

During the active intervention phase, schools allocate a minimum of 4×20 min school class sessions during which students complete the SPARX intervention. The additional three sessions may be completed either in class if permitted by the schools, or in the students’ own time. The implementation strategy developed by the project team and stakeholders to support the completion of the SPARX intervention comprises:

  • Standardised facilitator training delivered face-to-face over a half day.

  • Study facilitators attend schools to support students in downloading the SPARX app and completing the baseline assessments.

  • The school implementation team is provided with a SPARX user guide and information booklet to refer to during the sessions.

  • Schools provide students with weekly verbal reminders in homeroom class to use the app regularly.

  • Schools publish brief information about the study and mental health tips in the weekly school newsletter.

  • Schools liaise with the research team weekly to troubleshoot problems.

This is the strategy outlined to schools by the research team, and adherence to it will be assessed. Whether schools schedule more than the four mandated in-class sessions for intervention completion is flexible and can be adapted to suit the preferences of the school. Student participants receive a $A20 voucher after the intervention period to cover any phone or data-related costs incurred. See figure 2 for details of training and delivery structure.

Figure 2. Details of implementation strategy training and delivery structure.

Data collection methods and participant groups for process evaluation

There are three participant groups taking part in the process evaluation: year 8 students, school staff members (eg, teaching and counselling staff) and study facilitators (see table 2). School staff members will be those responsible for leading the delivery of the study in their school, or teaching staff who have a supporting role (eg, homeroom teachers who provide reminders to students to complete the intervention).

Table 2.

Summary of data forms (and collection point) provided by each of the participant groups

Year 8 students. Questionnaire: ✓ (post-intervention). Individual interview: not conducted. Digital analytics: ✓ (ongoing).
School staff. Questionnaire: ✓ (post-intervention). Individual interview: ✓ (post-intervention). Digital analytics: not collected.
Facilitators. Questionnaire: ✓ (before first school visit and after final post-intervention visit). Individual interview: ✓ (after final post-intervention visit). Digital analytics: not collected.

Four types of data will be collected and triangulated: self-report questionnaire data, digital analytic data, administrative data and qualitative interview data. Self-report questionnaire data specific to the process evaluation will be collected from staff and study facilitators using online survey software (Qualtrics) programmed by one of the authors (JRB). These questionnaires assess demographic information, school organisational characteristics, individual characteristics, intervention characteristics, implementation processes and implementation outcomes. Where no published questionnaires were identified as suitable for this process evaluation, we adapted existing standardised measures or developed our own items (details below). Self-report questionnaire data from year 8 students about intervention use and feedback will be collected via an online survey (details described in 5). Digital analytic data about intervention use will be captured by the purpose-built Black Dog Institute research platform, which is being used for the broader FPS. Administrative data will be collected through a range of sources, including communications with the research team. Qualitative data will be collected using semistructured interviews. All data will be collected from intervention schools only, as no intervention is implemented in control schools. Student and school staff data will be collected immediately after the intervention period has been completed (ie, after the 6-week intervention stage); facilitator data will be collected both before and after the intervention period.

Implementation predictors

School organisational characteristics (inner setting)

Publicly available information

Publicly available information will be collected about school contextual characteristics, including socioeconomic level, size, location, type and funding.

General school culture

School staff will complete the Survey of School Promotion of Emotional and Social Health (SSPESH34) and a measure of organisational culture.35 The SSPESH assesses a school’s capacity to promote social and emotional well-being, and contains four subscales: Positive School Community, Student Social and Emotional Learning, Engaging Families and Supporting Students Experiencing Mental Health Difficulties. Items are rated on a 4-point Likert scale from 0 (not yet in place) to 3 (completely in place), and preliminary investigations support the scale structure and criterion-related validity.34

We adapted a 9-item questionnaire about general culture within a healthcare setting for use within the school setting.35 Items are rated on a 5-point Likert scale from 1 (strongly agree) to 5 (strongly disagree). Lower scores indicate a more positive working culture, which includes transparency, productive working relationships and receptivity to feedback. Previous investigation has shown that this measure has good internal consistency, although it overlaps somewhat with other subconstructs within the inner setting (eg, learning climate34).

Implementation climate

School staff will complete a measure of implementation climate that we adapted for use in the school context.35 Items are rated on a 5-point Likert scale from 1 (strongly agree) to 5 (strongly disagree). Lower scores indicate increased staff receptivity to the programme and support within the school, including rewards and recognition. This measure has acceptable internal consistency and good discriminant validity.35

Relative priority

School staff will complete one adapted item from the School Contextual Barriers subscale of the Perceived Attributes of the Healthy Schools Approach Scale.36 The item is rated on a 5-point Likert scale from 1 (strongly agree) to 5 (strongly disagree). Higher scores indicate that other activities did not interfere with implementation of the FP programme.

Competing demands

Staff will complete three items, developed specifically for this study, which assess how much time they allocated, and desired to allocate, to the FP programme relative to other competing workload demands, on visual analogue scales (anchored by 0=none of my work time, 100=all of my work time). Higher scores indicate increased allocation of time and prioritisation of the FP programme.

Individual characteristics

Leadership

School staff will complete the Implementation Leadership Scale (ILS37). The ILS is a 12-item scale that assesses leadership behaviours that support the implementation of evidence-based practices. The ILS contains four subscales, two of which will be included in the current study (Knowledge Leadership and Supportive Leadership). These subscales assess the degree to which the staff member was knowledgeable about and offered support to the programme. Items are rated on a 5-point Likert scale from 0 (very much so) to 4 (not at all) and will be recoded such that higher scores indicate more effective implementation leadership behaviours. This measure has excellent internal consistency, convergent validity and discriminant validity.37
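As an illustration of the planned recoding, a minimal sketch in Python is shown below: it reverse-scores hypothetical 0–4 Likert responses so that higher scores reflect more effective leadership, then averages items into a subscale score. The column names, values and item-to-subscale mapping are assumptions for illustration only, not the published ILS layout.

```python
import pandas as pd

# Hypothetical responses from three staff members; items are scored
# 0 (very much so) to 4 (not at all), as described in the protocol.
staff = pd.DataFrame({
    "ils_1": [0, 1, 4],
    "ils_2": [1, 0, 3],
    "ils_3": [0, 2, 4],
})

MAX_SCORE = 4

# Reverse-code so that higher scores indicate more effective
# implementation leadership behaviours.
recoded = MAX_SCORE - staff

# Subscale score as the mean of its (illustrative) items.
knowledge_leadership = recoded[["ils_1", "ils_2", "ils_3"]].mean(axis=1)
print(knowledge_leadership)
```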

Self-efficacy and confidence

Study facilitators will rate their level of confidence in performing 23 different tasks during school visits, based on the training programme they completed, on visual analogue scales (anchored by 0=not at all confident, 100=very confident). Higher scores indicate higher levels of confidence. A similar questionnaire will be repeated following their school visits, along with four short-answer questions that assess experiences of supporting staff and students during school visits.

Intervention characteristics

Relative advantage and evidence strength and quality

School staff will complete the 2-item Relative Advantages subscale and one item from the Anticipated Benefits subscale of the Perceived Attributes of the Healthy Schools Approach Scale.36 Adapted for the current study, items are rated on a 5-point Likert scale from 1 (strongly agree) to 5 (strongly disagree). Scores will be recoded such that higher scores indicate greater perceived advantage of the FP programme compared with others and greater perceived impact on mental health, respectively.

Implementation processes

Normalisation and integration into routine practice

School staff will complete NPT’s accompanying tool, the Normalisation MeAsure Development questionnaire (NoMAD38). The NoMAD is a 23-item measure that assesses how professionals involved in the implementation of a complex intervention perceive implementation processes. It is a flexible measure that can be altered to more accurately describe the adoption of new interventions at the provider level. Ten items of the NoMAD will be used in this study to assess intervention buy-in from school staff, and how staff members incorporated the initiative into their standard work responsibilities. These items are grouped into three categories: coherence (ie, making sense of an intervention), cognitive participation (ie, working with others to support an intervention) and collective action (ie, the type of work that people do to support an intervention). Items are rated on a 5-point Likert scale from 1 (strongly agree) to 5 (strongly disagree). Initial validation demonstrated that the NoMAD has good face validity, construct validity and internal consistency.39

Fidelity to the implementation strategy

Fidelity will be captured through implementation checklists, communications between schools and research staff, and interviews with school staff.

Implementation outcomes

Reach

SPARX app usage data from students will allow for the assessment of app use (downloads, installs and opens), completion (number of modules completed, out of a total of seven) and time spent using SPARX. Administrative data about the number of students in intervention schools with consent to participate will also be used as an indicator of reach. Self-report data about individual characteristics (eg, gender, mental health history) will be used to gauge the representativeness of the students in the intervention schools.
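To make the reach indicators concrete, the sketch below computes consent, open and completion proportions from a mocked-up analytics export for a single school. All field names and figures are assumptions for illustration, not the schema of the Black Dog Institute research platform.

```python
import pandas as pd

# Illustrative analytics export: one row per consenting student
# in an intervention school (hypothetical data).
usage = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5],
    "app_opened": [True, True, False, True, True],
    "modules_completed": [7, 3, 0, 5, 7],  # out of seven SPARX modules
})

n_eligible = 12           # eligible year 8 students (administrative data)
n_consented = len(usage)  # consenting students appear in the export

reach = {
    "consent_rate": n_consented / n_eligible,
    "open_rate": usage["app_opened"].mean(),
    "full_completion_rate": (usage["modules_completed"] == 7).mean(),
    "mean_modules_completed": usage["modules_completed"].mean(),
}
print(reach)
```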

Uptake

Administrative data about the proportion of schools that were onboarded to the study and the proportion of teachers who supported the intervention will be collected to provide an index of uptake by schools. Representativeness of the sample will be informed by publicly available information about school characteristics (eg, socioeconomic status, location) and self-report data about school staff characteristics (eg, role, gender).

Appropriateness

School staff will complete the Intervention Appropriateness Measure (IAM40). The IAM is a pragmatic 4-item measure of the perceived fit, relevance or compatibility of an evidence-based practice for a context, person or problem.32 Items have been adapted for this study and are rated on a 5-point Likert scale from 1 (strongly agree) to 5 (strongly disagree). Scores will be recoded such that higher scores indicate higher levels of perceived appropriateness. The IAM demonstrated strong psychometric properties in previous research.40 School staff will also complete one adapted item from the Agency Leadership Support subscale of the Barriers and Facilitators to Implementing Survey.41 The item is rated on a 5-point Likert scale from 1 (strongly agree) to 5 (strongly disagree) and will be recoded such that higher scores indicate greater compatibility of the FP programme within a particular school.

Acceptability

Year 8 students will complete an 11-item feedback questionnaire about SPARX. The questionnaire assesses three domains, including: (1) reasons for non-adherence; (2) intervention acceptability and (3) skills learnt from the intervention. Items will be quantified individually. This questionnaire has previously been used to assess the acceptability of the SPARX programme in the school context.42

Individual interviews

Interview guides for school staff and study facilitators were derived from the logic model, CFIR online resources (eg, https://cfirguide.org/evaluation-design/qualitative-data/), and the broader literature investigating the delivery of interventions in school settings43 44 (see online supplemental file 1 for interview guides). Interviews will provide information about both implementation predictors and outcomes. For school staff, questions focus on motivations and expectations about the intervention and study processes more broadly; knowledge and beliefs about the intervention; relative advantages of the intervention; self-efficacy; barriers and facilitators affecting the delivery of the intervention, including fidelity; appropriateness and acceptability of the intervention; and recommendations for future implementation. School counsellors will also be asked questions about their experiences of managing high-risk participants within the study and compatibility with existing workload. For study facilitators, questions focus on motivations and expectations about their role; confidence and competence in supporting the delivery of SPARX in schools; the quality of their training and perceptions about their ability to support the study as required.


The interview guides allow for flexibility in questioning and diversion in responses. Questions will primarily be open-ended, with specific prompts and follow-up questions being used as necessary to encourage respondents to elaborate on their ideas and provide examples.

Patient and public involvement

The FPS was developed with key stakeholders including school personnel, school counsellors, parents, adolescents and individuals with a lived experience of mental illness. All aspects of the study design, appropriateness of outcome measures and consent procedures were developed in consultation with these stakeholder groups. The current process evaluation involves consultation with school staff members to understand their experience. The SPARX intervention itself was codesigned with young people. All study results will be shared directly with participants and their schools through lay summaries and infographics.

Procedure

The FPS has three waves of delivery (October–December 2020; April–July 2021; July–September 2021). Process evaluation data will be collected at each wave. All school students will complete relevant survey questions at the same time as completing primary measures for the FPS. All study facilitators will be asked to provide informed consent for their data to be used for research purposes following the compulsory face-to-face training session they attend with the research team at the Black Dog Institute. Data provided by facilitators will be from two surveys, one completed immediately following the training and another completed after their final school visit. All school staff from intervention schools (on average, three from each school, estimated number of intervention schools=100) who were directly involved in the study will be invited to complete one survey following the 6-week SPARX intervention period. All surveys will be completed online.

After completing their respective online questionnaires, all school staff and study facilitators will be given the option to participate in a 60-minute individual interview (completed either in person or remotely). Purposive sampling will be used to capture a range of diverse school settings and experiences. Face-to-face interviews will be held in a quiet room on school grounds or at the Black Dog Institute. Virtual interviews, which may be required due to COVID-19 restrictions, will also be conducted. All interviews will be audio-recorded and transcribed verbatim. All contact with study facilitators and school staff, including the semistructured interviews, will be made by research staff who have had no previous contact with them during the trial. This independence minimises the risk of bias and demand effects.

Data analysis plan

Quantitative

Survey questionnaire data (from approximately 100 schools) will be exported into data analytical software for analysis. Descriptive statistics will be calculated for all participant groups and will provide information about intervention use and its acceptability (questionnaire data from year 8 participants), differential implementation, fidelity to the implementation strategy, school context factors within each school (questionnaire including several short answer questions from school staff), and competence and experience in schools from those facilitating the study (facilitator questionnaires). Differences between school clusters will be assessed using analysis of variance methods. Regression models will assess the effects of contextual factors on implementation outcomes (eg, reach, uptake, acceptability).
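As a minimal sketch of how these analyses could be specified, the following Python code fits a school-level regression of an implementation outcome on a contextual factor and runs a one-way analysis of variance across clusters, using statsmodels and SciPy. All variable names and values are hypothetical, and the final models will depend on the collected data.

```python
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

# Hypothetical school-level summary data (variable names are assumptions).
schools = pd.DataFrame({
    "completion_rate": [0.62, 0.41, 0.73, 0.35, 0.55, 0.80, 0.28, 0.66],
    "implementation_climate": [2.1, 3.5, 1.8, 4.0, 2.9, 1.5, 4.2, 2.4],
    "location": ["metro", "regional"] * 4,
})

# Regression: does a contextual factor (implementation climate) predict
# an implementation outcome (intervention completion rate)?
model = smf.ols(
    "completion_rate ~ implementation_climate + C(location)",
    data=schools,
).fit()
print(model.summary())

# One-way analysis of variance comparing completion across location strata.
groups = [g["completion_rate"].values for _, g in schools.groupby("location")]
print(stats.f_oneway(*groups))
```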

Qualitative

Interviews will be digitally audio-recorded and transcribed verbatim. The transcripts will be checked for accuracy against the sound files as per best practice in transcription.45 46 Qualitative data will then be imported into NVivo to aid data management and analysis. Thematic analysis will be undertaken to identify, interpret and report on the repeated patterns of meaning within the data, drawing on Braun and Clarke’s classic six-phase model.47 48 An iterative and reflexive approach will be used to analyse the data, incorporating themes from the data together with topics covered in the interview guide. Two coders will independently engage in a familiarisation phase before generating codes and initial themes for a subset of the data. These codes and themes will be reviewed and discussed by the two coders, with refinement occurring via an iterative process. A senior qualitative analyst will also review the first-stage coding framework and scheme before all transcripts are coded by the first coder. Refinement will continue via an iterative process until final codes and themes are realised and defined across the whole data set. Research rigour will be enhanced by a team approach to analysis, reflexive field notes and prolonged engagement with the subject matter.49

Triangulation of qualitative and quantitative data

Triangulation involves the use of multiple approaches to address a research question. The combination of several approaches increases confidence in the findings and provides a more comprehensive account of the results than individual approaches would alone.50 In this study, reliability, validity and confidence will be maximised through cross-verification and exploration of differences between the outcomes of the various methods. This will take place in several ways:

  • Maximising validity in analysis of qualitative data within the research team using techniques such as discussing coding, constant comparison, accounting for deviant cases and systematic coding.

  • Triangulation of school staff and research assistant interviews with results from the questionnaires, exploring and accounting for differences.

  • Triangulation of self-report and interview data with publicly available information relating to school contextual characteristics (eg, school socioeconomic level and size) and school delivery of the programme, including deviations from the implementation strategy.

  • Mapping the perspectives of different stakeholders across the study (school staff, study facilitators).

Additional analyses

We will generate an ‘implementation strength’ metric for each school and examine its relationship to primary and secondary trial outcomes. This is an emerging evaluation approach, used mainly in low-income and middle-income countries, which aims to understand the degree of implementation effort needed during intervention delivery to achieve the desired benefits.51 52

The implementation strength metric will provide funders and policymakers with an objective measure for monitoring the effectiveness of implementation if the programme continues beyond this trial as a sustainable approach. The metric can be used to assess whether the approach to implementation meets the minimum level required to prevent the onset of mental health problems in adolescents. The metric will be based on implementation inputs and contextual factors informed by the intervention logic model and process evaluation frameworks (CFIR and RE-AIM) (eg, adoption by teachers and other staff in the school and fidelity to the implementation strategy), and will be developed with the FP research team using principal component analysis.
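A minimal sketch of how such a metric could be derived is shown below: hypothetical school-level inputs are standardised and the first principal component is taken as a single strength score per school. The input set, names and values are illustrative assumptions, not the final metric specification.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical per-school implementation inputs standing in for the
# logic-model constructs (names and values are assumptions).
inputs = pd.DataFrame({
    "staff_adoption": [0.9, 0.5, 0.7, 0.3, 0.8],
    "fidelity_score": [0.8, 0.6, 0.9, 0.4, 0.7],
    "in_class_sessions": [6, 4, 5, 4, 7],
    "leadership_support": [4.0, 2.5, 3.5, 1.5, 3.0],
})

# Standardise the inputs, then take the first principal component as a
# single 'implementation strength' score per school.
z = StandardScaler().fit_transform(inputs)
pca = PCA(n_components=1)
strength = pca.fit_transform(z).ravel()

print(strength.round(2))
print("variance explained:", pca.explained_variance_ratio_[0].round(2))
```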

Ethics and dissemination

Ethics approval was obtained from the University of NSW Human Research Ethics Committee (HC180836; 21st January 2019) and the NSW Government State Education Research Applications Process (SERAP 2019201; 19th August 2019). Results will be submitted for publication in peer-reviewed journals and discussed at conferences. Our process evaluation will contextualise the trial findings with respect to how the intervention may have worked in some schools but not in others. This evaluation will inform the development of a model for rolling out digital interventions for the prevention of mental illness in schools.

Discussion

This paper describes the design of a mixed-methods process evaluation of a cRCT, the FPS. The FPS investigates the impact of a digital cognitive–behavioural therapy intervention when delivered at scale in school settings. Digital mental health programmes could prevent up to 22% of depression cases and, when delivered at scale, could have population-level impact.53 However, these programmes have not been translated into practice and policy because optimal ways to scale and deliver these interventions are not yet well understood.

As an initial step to address this issue, the current process evaluation will attend to contextual and implementation factors that vary across schools and provide a lens through which to interpret trial efficacy outcomes. We expect that results will provide a richly detailed and nuanced understanding of the key factors involved in the effective delivery of digital mental health programmes across different schools. We expect that results will not only contextualise our trial findings but will also be used as a model to guide the delivery of school-based interventions that focus on preventing mental illness more broadly. Findings from this process evaluation will indicate whether the approach used in the FPS trial is likely to be sustainable in the school environment going forward and, if so, the threshold level of support required in order to prevent depression and benefit student mental health.

The prospective publication of this protocol outlines our planned methodological approach. It also serves as a road map for other researchers on a practical way of carrying out process evaluations of complex interventions in the school setting. As is the case with the delivery of interventions across different contexts, we acknowledge that our approach has inbuilt flexibility to explore the data and make provisions for unexpected implementation factors that arise.

Limitations and strengths

There are several limitations to our process evaluation that warrant mention. First, we are not including direct observation of teachers in their role supporting the delivery of SPARX. While this would provide objective fidelity data, it requires resources beyond the scope of this project and is not representative of how the programme will be sustained following the conclusion of the trial. Second, given the complexity of the study and the high demand placed on students (eg, engaging with the study apps and completing the online surveys at multiple time points over 5 years), we are not collecting in-depth qualitative data from students by way of interviews. Instead, we collect information about students’ perceptions of the intervention (eg, acceptability) through short self-report questions in the online survey. Third, the qualitative interviewer is a member of the research team (but not the evaluation team). Care will be taken to ensure that this staff member has no contact with schools prior to the interview visit to minimise bias; however, there remains a risk that demand effects may impact the information that is shared. Fourth, the process evaluation itself (questionnaires and interviews) will undoubtedly add to the burden placed on school staff. Given that the FPS already places significant demands on the time of busy school staff, this additional component might contribute to low levels of participation.

On the point of burden on schools, one of the strengths of the design is that we have undertaken a pilot phase involving eight intervention schools and have been able to refine our processes (eg, introducing an incentive for school staff members to participate). This process evaluation also combines qualitative and quantitative methods, which will be triangulated to provide a coherent and comprehensive picture of the data. The use of the ‘implementation strength’ metric represents a novel approach in this field, borrowed from the low–middle-income country implementation science sector. The inclusion of this approach will provide important information to funders and policymakers following on from this trial, indicating the level of implementation support required to prevent mental illness and improve the well-being of adolescent school students.

Notwithstanding the limitations raised above, this process evaluation will contribute to the broader knowledge base and indicate how best to deliver digital mental health prevention programmes in school settings.

Trial status

Recruitment for the trial is underway. Data collection commences in October 2020 (delayed from April 2020 due to COVID-19).


Acknowledgments

We would like to thank Professor Melanie Barwick for her advice on this manuscript, together with all the facilitators from the Training Institute for Dissemination and Implementation Research in Health (TIDIRH; Australia 2020) for their comments and feedback on this project.

Footnotes

Twitter: @ACalear, @alizaws

Contributors: HC and AW-S conceived the study and secured the funding. AW-S and JRB led the design of the process evaluation, with input from all authors (including KH, IZ and PJB), and expert guidance from RL and KB. AW-S drafted the manuscript, with assistance from JRB. ALC, MT, KM, RL, KB and HC have a continuing role in monitoring the conduct and outcomes of the process evaluation. All named authors contributed substantially to the approved final manuscript.

Funding: Funding for this project came from an NSW Ministry of Health Early-Mid Career Fellowship awarded to AW-S, and a Black Dog Institute Post-Doctoral Fellowship awarded to JRB, secured by HC. ALC is supported by NHMRC fellowships 1122544 and 1173146. PJB is supported by NHMRC Fellowship 1158707. Funding for the randomised controlled trial within which this process evaluation is embedded came from an NHMRC Project Grant Awarded to HC GNT1120646.

Disclaimer: The funding bodies had no role in any aspect of the study design or this manuscript.

Competing interests: None declared.

Patient consent for publication: Not required.

Provenance and peer review: Not commissioned; externally peer reviewed.

Supplemental material: This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.
