Discover Mental Health. 2025 Mar 4;5(1):27. doi: 10.1007/s44192-025-00151-3

A scoping review of school-based expressive writing implementation reporting practices: missed opportunities and new research directions

Janet Amos 1, Justin Moase 2, Ingrid E Sladeczek 1
PMCID: PMC11880461  PMID: 40035928

Abstract

Background

Expressive writing (EW) interventions are an effective, flexible, and cost-efficient option for mental health promotion, making them ideally suited for resource-limited school settings. However, the effectiveness of EW interventions varies greatly across studies, which may be partly explained by how EW interventions are implemented. As school-based EW interventions become increasingly popular and more widely used, rigorous reporting of implementation can help advance this emerging field by informing how variation in implementation across studies influences intervention outcomes.

Purpose

The purpose of this scoping review was to evaluate the implementation reporting practices of EW interventions in school settings, as implementation practices can profoundly impact EW effectiveness.

Methods

The present scoping review assessed the current state of fidelity of implementation (hereafter, implementation) reporting in the school-based EW literature and identified areas where more rigorous reporting is needed. Out of an initial sample of 367 studies, 19 were eligible for inclusion in the review. Data were analyzed for critical issues and themes derived from Cargo et al.'s (2015) Checklist for Implementation (Ch-IMP).

Results

Overall, the results of this scoping review indicate that researchers who implement EW in school settings have not consistently assessed key implementation domains such as dose received and fidelity.

Conclusions

To address this problem, the present review adds a unique contribution to the literature by identifying how rigorous reporting of implementation can strengthen the evidence base for school-based EW interventions. Specifically, researchers can support the use of EW interventions in schools through increased implementation reporting to better understand how variability in fidelity of implementation affects treatment outcomes.

Keywords: Expressive writing, Implementation, School, Adolescents

Introduction

Adolescence marks a high-risk period for the onset of psychopathology, such as depression and anxiety [1]. At the same time, during adolescence, individuals are highly susceptible to environmental influences and are capable of remarkable adaptability [2]. As a result, adolescence is a time of considerable opportunity for intervention [3, 4]. Helping adolescents develop the tools for expressing, understanding and managing their emotions may help prevent mental health problems throughout the lifespan and support social and academic functioning in the classroom [3, 5–7].

Expressive Writing (EW) is a brief, individually focused psychosocial intervention that promotes emotional expression and processing of emotions through writing about stressful or emotional topics such as peer difficulties or traumatic events [8, 9]. Writing may help people interpret their experiences, search for explanations, promote insight and understanding, or reappraise their situation in a different light [9, 10]. EW interventions are an effective, flexible, and cost-effective option for mental health promotion, making them ideally suited for resource-limited settings, such as schools [9, 11]. School-based mental health programs have great potential to address young people’s mental health needs and reach students who would otherwise not have access to services [12]. In addition, promoting social and emotional development within schools also supports the academic functioning of all students [12, 13].

EW interventions for adolescents positively affect internalizing behaviour, problem behaviour, and school participation with mean g effect sizes ranging from 0.107 to 0.246 [9]. However, the effectiveness of EW varies considerably across studies with some individual studies obtaining negative results and others reporting g effect sizes as large as 1.488 [9]. This variation may be partly explained by how EW interventions are implemented [9, 10, 14–16]. However, in the EW literature, insufficient methodological information and significant paradigm adaptations make it difficult to determine which component of the intervention contributed to its success.

It is crucial to provide adequate contextual information on implementation to explain why an intervention succeeded or failed, refine theories, and determine which program components are fundamental to the intervention’s success to inform whether it is feasible to implement the intervention in practice [14, 17–19]. When interventions are free of serious implementation problems, effect sizes are two to three times larger than when problems are present [15].

School settings have characteristics and needs that can influence the successful implementation of an intervention [3, 18, 20–23]. The organizational leadership of the school, educational policies, school philosophy, teachers’ engagement, and the school schedule are all contextual variables that may influence implementation [18, 21–23]. Given the unique school context, it is important to examine the effectiveness of EW in real-world settings such as schools, rather than assume that success in research settings will carry over to community settings.

Much of the research on the effectiveness and mechanisms of EW has been conducted in highly controlled research settings [3]. Outside of research settings, implementation may take place in less-than-ideal circumstances, making it difficult to implement interventions with a high level of fidelity [18]. Thus, examining the effectiveness and implementation of interventions in the context in which they will be implemented provides a more accurate understanding of an intervention’s effectiveness [15].

Objectives

Despite the importance of assessing implementation, implementation domains are often not reported frequently enough in primary research to be examined in meta-analyses and systematic reviews [15]. As school-based EW interventions become increasingly popular and more widely used, rigorous reporting of implementation can help advance this emerging field by informing how variation in implementation across studies influences intervention outcomes. Thus, the purpose of the present scoping review is to (a) assess the current state of implementation reporting in the school-based EW literature, (b) identify areas where more rigorous reporting is needed, and (c) provide recommendations for future research and implementation reporting practices.

Method

Inclusion criteria

Primary research studies describing EW interventions conducted in school-based settings were considered for inclusion in this scoping review. Studies must have been published after 1986, the year of Pennebaker and Beall’s seminal research on EW, and include some variation of their original written emotional disclosure task [24, 25, p. 4]. All research designs that assessed at least one outcome and were published in English were eligible for inclusion. Theses and dissertations from the last five years were also included in the review to capture the full extent of the school-based EW literature that may not have been published yet [9, 10, 26].

Search strategy

Studies were identified by searching the following databases: PsycInfo (1806 Ovid), ERIC (EBSCO), Scopus, and ProQuest Theses and Dissertations Global (last 5 years). Within PsycInfo, searches were limited by age group, and within ERIC, searches were limited to elementary, middle, and high school populations. Appropriate search terms were identified by reviewing the indexing terms of several recent meta-analyses focusing on EW.

Searches were conducted using two groups of keywords. To identify EW interventions, the following keywords were used: expressive writing, expressive emotional writing, emotional writing, emotional disclosure, therapeutic writing, creative writing for therapeutic purposes, narrative writing, Pennebaker, written disclosure, written emotional disclosure, and written emotional expression. To identify school-based interventions, the second group of keywords was as follows: school, high school, middle school, elementary school, primary school, and secondary school. The searches were conducted on June 12th, 2018, updated in December 2022, and further studies were also identified in January 2023 by searching the reference lists of relevant articles. All papers were independently reviewed by two authors and discrepancies were discussed until an agreement was reached. Inter-rater reliability was assessed using Cohen’s kappa. Kappa tests produced a score of 0.91, indicating almost perfect agreement.
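To illustrate this reliability check, the sketch below computes Cohen's kappa in plain Python for two reviewers' independent include/exclude screening decisions. The function and the example decision lists are hypothetical illustrations and are not drawn from the authors' actual screening data.

```python
# Hypothetical illustration of Cohen's kappa for two reviewers'
# independent include/exclude screening decisions.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same set of records."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of agreement (p_o)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement (p_e) from each rater's marginal counts
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in categories) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical screening decisions for eight records
reviewer_1 = ["include", "exclude", "exclude", "include",
              "exclude", "exclude", "include", "exclude"]
reviewer_2 = ["include", "exclude", "exclude", "include",
              "exclude", "include", "include", "exclude"]
print(round(cohens_kappa(reviewer_1, reviewer_2), 2))  # 0.75
```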

Extraction of results

Using Arksey and O’Malley's charting technique, data were extracted, synthesized, and interpreted based on key issues and themes [27]. Data extraction was independently completed by two authors and discrepancies were discussed until an agreement was reached. Themes were derived from the Program Implementation (process evaluation) subsection of Cargo et al.'s Checklist for Implementation (Ch-IMP), a theoretically informed checklist for assessing implementation practices [14]. This study focused specifically on the Program Implementation component of the Ch-IMP, as it directly aligned with the primary aim to comprehensively assess the practical aspects of implementation related to the program delivery process.

The eleven domains of the Ch-IMP were assessed in this study: (a) Recruitment, which “refers to specific information on the procedures used to recruit participants into or attract participants to the intervention”; (b) Attrition, “a measure of drop-out rates, or the proportion of participants lost during the course of an intervention or during follow up”; (c) Reach, which “refers to the degree to which the intended audience participates in an intervention by ‘their presence’”; (d) Dose Delivered, which “refers to the proportion or amount of an intervention delivered to participants often measured through frequency (e.g., twice per week), duration (e.g., duration of program in months), and intensity (e.g., total of program delivery hours)”; (e) Dose Received, which “is a characteristic of the target populations’ engagement and active participation in an intervention and is an objective measure of the extent to which participants actually utilise and interact with program strategies, materials, or resources”; (f) Fidelity, which “measures the extent to which an intervention is implemented as originally intended by program developers (e.g., adherence)”; (g) Adaptation, “the extent to which program content is intentionally or purposefully changed during implementation, from the original standard, to enhance program effectiveness”; (h) Participant Engagement, which “refers to the subjective attributes that define their participation in, interaction with or receptivity to an intervention”; (i) Provider Engagement, which “refers to the subjective attributes of program staff that can influence their capacity to deliver intervention strategies”; (j) Co-Intervention, “when interventions other than the treatment under study are applied differently to the treatment and control/comparison groups”; and (k) Contamination, “when an intervention is unintentionally delivered to participants in the control group or inadvertent failure to deliver the intervention to the experimental group” [14]. For example, in the context of expressive writing, contamination would occur if the control group is asked to write about a neutral topic, but the participants do not adhere to the instructions as intended and instead write about meaningful or emotional topics. It would also occur if the participants in the experimental condition did not write about emotional or meaningful topics, despite being instructed to do so.

Reach, recruitment, dose delivered, dose received, and fidelity were identified as the five minimum domains of implementation needed for sufficient process evaluation. These five domains provide valuable insight into the conditions under which positive and negative intervention effects occur and their causal mechanisms [14]. Supplemental information was also examined when available. Variables related to study characteristics were also recorded.

In line with Cargo et al.’s definition, the present review operationalized reach as an individual-level variable, focusing on the extent to which individuals in the target population engage with the intervention [14]. Reach was calculated by dividing the total number of individuals who participated in the intervention by the total number of individuals in the target population and multiplying the result by 100 to express it as a percentage. It is important to acknowledge that other frameworks in implementation science research define reach as a policy- or organization-level variable, which assesses the extent to which the intervention is adopted or implemented within larger systems [28]. Although we recognize the broader conceptualization of reach and its relevance to policy and organizational contexts, the primary interest of this study was to examine individual participation and exposure to the intervention.
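Expressed as a formula (notation ours, matching the calculation described above):

\[
\text{Reach (\%)} = \frac{n_{\text{participated in the intervention}}}{n_{\text{target population}}} \times 100
\]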

Results

Study selection and study characteristics

Searches returned 367 records, which were imported into EndNote and screened for eligibility in three phases: initial title and abstract screening, a strict full-text screening, and a data extraction phase, resulting in 19 records meeting the eligibility criteria (see Fig. 1). Most studies involved adolescents and targeted outcomes related to mental health (e.g., anxiety). School settings spanned the primary, elementary, middle, and high school levels: three studies examined primary or elementary students, four examined middle school students, eight examined high school students, and three examined multiple age groups. Additional details about the included studies can be found in Table 1.

Fig. 1 PRISMA diagram of the study inclusion and exclusion process

Table 1.

Characteristics of included studies

Authors (Year) | School Setting | Implementor | Country | Population and Risk Status | Outcomes
Bray et al. (2006) [30] | One PS, two MS, and one HS | Not Reported | United States | Chronic Asthma | Physical health; mental health; quality of life
*Curry (2012) [31] | Public HS | Researcher | Peru | Shared Traumatic Events | Mental health
Facchin et al. (2014) [43] | Three vocational HS | Experimenters | Italy | Transition to High School | Self-concept
Giannotta et al. (2009) [44] | Public MS | Not Reported | Italy | Not at Risk | Peer victimization; mental health; coping strategies
Hines et al. (2016) [32] | Public HS | Researchers | United States | Failed Standardized Math Test | General/mathematics anxiety; somatic symptoms
Horn et al. (2011) [3] | Public MS | Researchers | Germany | Not at Risk | Grades; school absence; negative affect
Jones et al. (2018) [38] | Four public HS | Researchers | United States | Not at Risk | Goal persistence; grades
Kalantari et al. (2012) [39] | School set-up for refugee children | Researchers | Afghanistan | War Bereaved | Traumatic grief
Kliewer et al. (2011) [29] | Public MS | Researchers | United States | At-Risk Urban Adolescents | Adjustment; aggressive behavior; emotional lability
Margola et al. (2010) [33] | Rural public HS | Not Reported | Italy | Shared Trauma | Adjustment trajectories
Muris et al. (2002a) [40] | Four public PS | Researcher | Netherlands | Anxiety Disorders | Anxiety
Muris et al. (2002b) [41] | Two public PS | Researcher | Netherlands | Generalized Anxiety Disorder | Anxiety
Ramirez & Beilock (2011) [34] | Public HS | Exam Proctor | United States | Not at Risk | Test anxiety; exam performance
Reynolds et al. (2000) [42] | Two public PS and two public HS | Not Reported | United Kingdom | Not at Risk | Mental health; strengths and difficulties
Shen et al. (2018) [37] | Three public HS | Not Reported | China | Severe Test Anxiety | Test anxiety
Soliday et al. (2004) [45] | Public MS | Researchers | United States | Not at Risk | Mental health; medical visits
Travagin et al. (2016) [7] | Public MS | Not Reported | Italy | Not at Risk | Affect; social involvement; peer problems
Unterhitzenberger & Rosner (2014) [35] | Orphanage boarding school | Not Reported | Rwanda | Orphaned | Prolonged grief; depressive symptoms
*Walter (2018) [36] | Public PS | Teachers | United States | High Math Anxiety | Math achievement; math anxiety

Articles marked by an asterisk (*) indicate theses or dissertations

PS = Primary School(s); MS = Middle School(s); HS = High School(s)

Implementation rigour and reporting

While all 19 studies reported on at least one dimension of implementation, the number of implementation domains reported varied greatly, ranging from one to eight (M = 5.21, SD = 1.68). Notably, a single study reported on all five minimum domains of implementation (recruitment, reach, dose delivered, dose received, and fidelity) [29]. A comprehensive list of the domain(s) assessed by each study can be seen in Table 2.

Table 2.

Implementation domains reported in included studies

Authors (Year) RC AT R DD DR FD AD Par. EG Prov. EG Co-Int CT
Bray et al. (2006) [30]
*Curry (2012) [31]
Facchin et al. (2014) [43]
Giannotta et al. (2009) [44]
Hines et al. (2016) [32]
Horn et al. (2011) [3]
Jones et al. (2018) [38]
Kalantari et al. (2012) [39]
Kliewer et al. (2011) [29]
Margola et al. (2010) [33]
Muris et al. (2002a) [40]
Muris et al. (2002b) [41]
Ramirez & Beilock (2011) [34]
Reynolds et al. (2000) [42]
Shen et al. (2018) [37]
Soliday et al. (2004) [45]
Travagin et al. (2016) [7]
Unterhitzenberger & Rosner (2014) [35]
*Walter (2018) [36]

Articles marked by an asterisk (*) indicate theses or dissertations

RC = Recruitment; AT = Attrition; R = Reach; DD = Dose Delivered; DR = Dose Received; FD = Fidelity; AD = Adaptations; Par. EG = Participant Engagement; Prov. EG = Provider Engagement; Co-Int. = Co-Intervention; CT = Contamination

Adaptation. Almost all studies (95%, n = 18) made at least one adaptation to Pennebaker and Beall's original paradigm, with a mean of 1.7 adaptations per study (SD = 0.87) [24]. A complete list of adaptations made by each study can be seen in Table 3. Twelve studies altered the dose; thirteen studies adapted the writing instructions by asking participants to write about a specific upsetting experience [7, 29–33, 35, 36], positive emotions [37], or successes and failures [38], or by changing the writing instructions as the sessions progressed [39–41]. Four studies provided time for discussion in addition to writing, such as discussing their anxieties [40, 41], brainstorming topics to write about [36], or discussing writing in general [42], and one study added psychoeducation [3].

Table 3.

Intervention adaptations reported in included studies

Authors (Year) | Adaptation(s) | Description of Adaptation(s)
Bray et al. (2006) [30] | Writing instructions | Writing about most stressful life experience
*Curry (2012) [31] | Writing instructions | Instructions altered and simplified for youth
Facchin et al. (2014) [43] | Writing instructions | Writing topic included benefits during high school transition to facilitate the use of cognitive strategies in EW
Giannotta et al. (2009) [44] | Dose | Dose changed to four writing sessions (two per week)
Hines et al. (2016) [32] | Writing instructions | Writing about feelings on math and school
Horn et al. (2011) [3] | Dose; Psychoeducation | Six 45-min sessions over 10 weeks; education on ER
Jones et al. (2018) [38] | Writing instructions; Dose | Writing topic integrated competence-building themes in accounts of successes and failures; single writing session
Kalantari et al. (2012) [39] | Writing instructions | Writing topic progressed from feelings about a traumatic event to giving advice to another person in the same situation
Kliewer et al. (2011) [29] | Writing instructions; Dose; “Enhanced” writing | Writing instructions included excerpts of poems and stories; increased writing sessions; addition of “enhanced”, culturally relevant EW group for sample demographic
Margola et al. (2010) [33] | Writing instructions | Structured writing about a classmate’s death
Muris et al. (2002a) [40] | Writing instructions; Dose; Discussion time; Homework assignments | Writing about increasingly frightening situations; six 50-min sessions with discussion time; homework journaling
Muris et al. (2002b) [41] | Writing instructions; Dose; Discussion time | Anne Frank’s diary used as journaling referent; 12 30-min sessions; discussion of anxieties and writing topics
Ramirez & Beilock (2011) [34] | Writing instructions; Dose | Writing about an upcoming exam; single session
Reynolds et al. (2000) [42] | Discussion time | Discussion of feelings on writing about emotional events
Shen et al. (2018) [37] | Writing instructions; Dose | Writing about positive emotions; 30 consecutive sessions
Soliday et al. (2004) [45] | None | None
Travagin et al. (2016) [7] | Writing instructions; Dose | Writing about peer issue; one-week interval between sessions
Unterhitzenberger & Rosner (2014) [35] | Writing instructions; Dose | Writing about loss of parent; one-week interval between sessions
*Walter (2018) [36] | Writing instructions; Dose; Group brainstorming | Writing about math anxiety; five consecutive 10-min sessions; brainstorming writing topics

Articles marked by an asterisk (*) indicate theses or dissertations

ER = Emotion Regulation

Recruitment, attrition, and reach. Almost all studies (84%, n = 16) provided some information related to recruitment. Most commonly, information was provided on consent (n = 10), assent (n = 9), and incentives to participate (n = 3). Few studies discussed recruitment at the school level, and as a result, it remains unclear which subgroups of schools are more or less likely to be successfully recruited to participate in EW. There may be bias in the kinds of schools that choose to participate in EW interventions, as Walter notes that one school was chosen “due to the teachers’ and principal’s willingness to participate” [36, p. 6] and Jones et al. reported that “two of our schools were charter schools eager to try activities that increase social-emotional strengths” [38, p. 79].

Most studies (84%, n = 16) also provided some information on attrition [14]. However, only seven studies [3, 29, 36, 39, 43–45] provided an explanation of why students dropped out or missed part of the intervention, and only six studies [29, 37, 40–44] reported attrition rates separately for the intervention and control groups, making it possible to analyze whether attrition was uneven between the groups.

Just over half of the studies (53%, n = 10) provided enough information to calculate reach. Five studies calculated reach, and four provided reasons why participants from the eligible population did not participate. The recruitment rates reported ranged from 45 to 100%. Of the 15 RCTs in this review, very few (20%, n = 3) used the Consolidated Standards of Reporting Trials (CONSORT) flow diagram to depict rates of recruitment, attrition, and reach.

Dose delivered and dose received. The dose delivered was the most commonly reported implementation dimension. Nearly all studies (95%, n = 18) reported all three aspects of the dose delivered: frequency, intensity, and duration. The remaining study reported only duration, making it difficult to determine the total writing time. In contrast to the reporting of the dose delivered, only 11 studies reported the dose received. In some cases, the dose of intervention received was quite different from the dose of intervention delivered. For example, Kliewer et al. found that although the delivered intervention dose was eight writing sessions, the average dose received (i.e., the average number of sessions in which participants actively engaged in emotional writing) was six sessions [29]. Further information on the circumstances under which the dose delivered and the dose received diverged, or how this divergence influenced the intervention effects, was not reported.

Fidelity. Only two studies in this review examined fidelity as a distinct domain of implementation [29, 30]. The terminology used varied, with some authors using the term fidelity and others the term treatment reliability. Both Kliewer et al. and Bray et al. used a single indicator of fidelity, and Bray et al. relied on a self-report measure completed by the interventionist [29, 30].

Several studies did not formally assess fidelity but did provide contextual information suggestive of implementation difficulties. In one instance, Kalantari et al. described the environment as “overcrowded, noisy, with poor air-conditioning” and noted that students complained that “they had to sit on the floor” [39, pp. 143–144]. In another case, Curry described how “the absence of supervision set off an eruption of activity in which students moved desks, socialized, and shouted out to each other across the room. Many students left for the day. The unpredictability of the students’ behavior became a challenge for the researcher” [31, p. 58]. These descriptions suggest low levels of fidelity and participant engagement associated with contextual barriers to implementation in schools, which were not formally assessed.

Participant and provider engagement. Eight studies included a statement regarding participant engagement. However, half of these studies provided only brief contextual statements (e.g., that the students were receptive, became engaged, or that there was good cooperation) rather than collecting more systematic information [3, 31, 39, 45]. Two studies assessed provider engagement [29, 41].

Co-intervention. Only 11% of studies (n = 2) reported checks to ensure that the control and intervention groups did not have unequal exposure to interventions other than EW [33, 45].

Contamination. Just over half of the studies (58%, n = 11) assessed whether experimental participants completed the intervention, whether control participants unintentionally received the intervention, or both. Most studies assessed contamination by examining the content of the writing (e.g., use of emotional words, writing about the topic requested). Of the 11 studies that assessed contamination, six determined there may have been contamination [7, 29, 32, 42–44]. Four studies determined that some of the participants in the control group may have inadvertently received the intervention, as some participants assigned to neutral writing conditions wrote about emotional or meaningful topics [32, 34, 43, 44]. In addition, two studies found that some participants in the intervention group failed to receive the intervention because they wrote about neutral topics or were noncompliant [7, 42].

Discussion

Assessment of implementation is essential for building a robust and useful evidence base [20, 46]. The present scoping review assessed the extent and nature of implementation reporting in the school-based EW literature, with the goal of identifying areas where more rigorous reporting is needed and of providing recommendations for future research.

Current state of implementation reporting and implementation reporting needs

Within the EW literature, recruitment, attrition, dose delivered, and contamination were the most frequently reported implementation domains, while fidelity, provider engagement, participant engagement, and co-intervention were the least frequently reported. Very few studies within the school-based EW literature reported on the five domains of implementation suggested as the minimum requirements. Within each implementation domain, there were models of rigorous reporting and opportunities to improve reporting practices.

About half of the studies in this review made multiple adaptations to the traditional EW paradigm. Adaptations such as increasing the dose and providing participants with specific examples or questions as part of the writing instructions increased the effectiveness of EW [10]. However, modifying more than one aspect of EW per study makes it difficult to determine which intervention component contributes to its effectiveness, and for many studies it remains unclear which intervention component or adaptation was responsible for the outcomes obtained [16].

While participant recruitment was an area of rigour, school recruitment was underreported. Qualitative information suggests that the kinds of schools that agree to participate in EW research are eager for social-emotional programming and may consequently experience fewer problems with implementation due to a motivation for the intervention to succeed. It is possible that these schools experience better outcomes than schools that are not as eager to engage in social-emotional interventions. Assessing how schools are recruited can inform whether conclusions about EW effectiveness can be extended to all school populations.

Reporting on attrition was another area of strength among the studies in this review. The frequent reporting of attrition has allowed researchers to determine that studies with lower attrition rates produced larger effects [9]. However, it remains unclear whether effects were larger in studies with low attrition rates because participants in those studies received a larger dose of the intervention or because adolescents who found the intervention unhelpful, unpleasant, or harmful left the research study [9].

Few of the RCTs in this review that did not use a CONSORT flow diagram reported information on all three of recruitment, attrition, and reach. This is consistent with research by Turner, Shamseer, Altman, Schulz, and Moher, who found that when researchers followed the CONSORT statement when reporting the results of an RCT, they produced more complete reports [47]. EW researchers could therefore improve the reporting of reach by following the CONSORT statement. Mendelson et al. caution that some groups of participants may be more likely to consent than others; for example, students living in vulnerable contexts may have parents who are more difficult to reach to obtain consent for participation in social-emotional interventions [22]. Reporting the reasons why participants are not reached can inform whether certain subgroups of participants are systematically absent from EW research.

While dose delivered was the most frequently reported implementation dimension in this review, only 58% of studies also assessed dose received. In a school setting, where students may be absent from school during an intervention, it cannot be assumed that the dose delivered and the dose received will be equivalent. Meta-analyses of the effects of EW have found that dosage moderated the effects of the writing [9, 10], suggesting that poor outcomes may be due to participants receiving an insufficient dose.

One area where rigorous reporting could be improved is the assessment of fidelity. When researchers monitor fidelity, deliver an adequate dose, and ensure the control group does not experience contamination, effect sizes tend to be two to three times larger than when these aspects of implementation are not assessed [15]. To improve implementation reporting, Dusenbury et al. recommend that researchers use observations, use multiple measures that have demonstrated good reliability and validity, and base their estimates on multiple sessions [18]. Another consideration is that when the intervention is implemented by people other than research staff, such as teachers or other school professionals, levels of fidelity may be lower [20]. If EW interventions are to be scaled up in the future, it would be helpful to understand whether they can easily be implemented by persons other than the developers.

One of the 19 studies [36] stated that there were high levels of fidelity when EW was implemented by teachers trained by the researcher. This suggests that EW delivery by teachers, with supervision from researchers or school psychologists, may be feasible and may positively affect participant engagement compared to implementation by an external researcher. However, this interpretation is based on a single study in which no systematic measures of fidelity were reported, and it should therefore be considered with caution until further research provides additional evidence. Implementor education and relevant experience are important considerations for successful delivery by educational providers [14]. A key implication of these findings is that a more systematic process of fidelity evaluation and reporting would help future researchers understand how EW intervention fidelity differs when the intervention is delivered by teachers.

Assessing provider engagement will become increasingly important if EW is implemented by teachers or other school staff, as a growing body of research has identified that implementor characteristics are related to intervention fidelity [21, 48]. Similarly, participant engagement may be related to intervention outcomes [31]. In some cases, participants are not engaged in EW, which is potentially problematic because they may not be taking part in the intervention in a consequential way [22]. For example, in one study, participants who rated the intervention as not meaningful did not experience any significant changes in their ratings of somatic symptoms over time, whereas participants who rated the intervention as extremely meaningful did [31]. Future evaluations of EW interventions would benefit from assessing participant and provider engagement to determine their impact on study outcomes.

Co-intervention is another area where implementation reporting can be bolstered. It is common for schools to offer multiple programs and services, and consequently, it is worthwhile to monitor what interventions the control group may be receiving as well as any other interventions that may be taking place concurrently within the intervention school [46]. By contrast, contamination was one of the most frequently reported implementation domains. This review found that participants do not always follow writing directions, even when given explicit instructions. Half of the studies that assessed contamination found some evidence that not all participants followed the writing directions. Accordingly, it is useful for researchers to continue to assess contamination to determine if participants received the intervention as intended.

Strengths and limitations

A strength of this scoping review was the use of a systematic search and synthesis methodology that provided a clear picture of the current state of implementation reporting and identified areas where more rigorous reporting was needed. Another strength is the study’s focus on implementation strategies in addition to intervention effects, which provides a more comprehensive understanding of implementation and treatment outcomes in schools. Limitations include the restriction to peer-reviewed English-language journal articles and to theses and dissertations from the last five years, which may have excluded relevant research in other languages, unpublished theses and dissertations older than five years, and grey literature. Despite these limitations, this scoping review provides a comprehensive assessment of the information readily available to researchers and practitioners seeking to learn more about the effectiveness of EW interventions and how to implement them in school settings.

Implications and future research directions

First, rigour of implementation reporting could be increased by reporting the minimum recommended domains: recruitment, reach, dose delivered, dose received, and fidelity [14, 19, 49]. If space constraints are a concern, implementation information can be reported as supplemental materials or in an accompanying manuscript focusing on the process evaluation rather than the outcomes evaluation [20]. Consistent reporting of minimum implementation domains will allow researchers to answer questions related to the optimal dose of EW as well as how feasible it is to implement EW within a school setting with high levels of fidelity (i.e., 80% to 100% adherence to the fidelity checklist). Furthermore, future researchers can gain a more comprehensive understanding of implementation practices and their impact on intervention effects through utilizing the Program Implementation and Action Model components of Cargo et al.’s Ch-IMP in tandem [14]. Examining components of the action model will provide insight into how factors such as staffing, resources, and settings impact treatment outcomes. The combined use of these components can facilitate an examination of how the practical and organizational aspects of school-based implementation interact and affect intervention outcomes.
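As a minimal sketch of the adherence figure mentioned above, the following hypothetical example expresses fidelity as percent adherence to a session-level checklist; the function name and numbers are illustrative assumptions, not a reported measure from the reviewed studies.

```python
# Hypothetical illustration: fidelity expressed as percent adherence
# to a session-level fidelity checklist (not from the reviewed studies).
def percent_adherence(items_met: int, items_total: int) -> float:
    """Percent of fidelity checklist items observed as delivered."""
    return 100.0 * items_met / items_total

# e.g., 9 of 10 checklist items delivered as intended in one session
print(percent_adherence(9, 10))  # 90.0, within the 80-100% "high fidelity" range
```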

Second, researchers could examine the potential impact of provider engagement, participant engagement, and fidelity on EW outcomes, as these domains have not been examined regularly as potential moderators within EW meta-analyses. These areas are worthy of future study as participant engagement, provider engagement, and fidelity have all been demonstrated to moderate intervention outcomes [15, 50]. Previous EW research with adults has used qualitative measures (e.g., emotional content analysis) to operationalize participant engagement [51]. We recommend that future school-based EW research use qualitative measures in addition to self-report methods (e.g., Likert scales that measure student interest and perceived usefulness of the intervention) to systematically evaluate how variability in engagement affects implementation outcomes. In contrast, there is a need to develop reliable and valid measures of provider engagement to capture how different providers influence implementation in schools.

Third, rigorous reporting of implementation can be used to inform what specific characteristics make EW successful [14, 15, 18, 19, 50]. Many researchers have adapted EW by altering the writing instructions or adding components such as psychoeducation. When altering EW interventions, it is important to concurrently assess implementation so that the results obtained, positive or negative, can be interpreted in context. Ultimately, rigorous reporting of implementation will support the development of a robust evidence base regarding the utility and feasibility of EW, to inform the implementation of social-emotional interventions.

Conclusion

Overall, the results of this scoping review suggest that key implementation domains such as dose received and fidelity have been overlooked in the EW literature. This is problematic because, without assessing implementation, it is difficult to draw definitive conclusions regarding the efficacy and feasibility of implementing EW interventions, as effect sizes may be over- or underestimated [15, 19]. This scoping review comprehensively summarizes the current state of implementation reporting and identifies ways in which more rigorous reporting can be used to strengthen the evidence base for EW interventions.

Author contributions

JA conceptualized the review, screened articles, extracted and synthesized from included articles, and was a major contributor to writing the manuscript. JM screened articles, extracted, and synthesized information from included articles, and contributed to writing the manuscript. IES provided support and advice throughout the conceptualization and writing of the review. All authors reviewed and approved the final manuscript.

Funding

No funding was received to assist with the preparation of this manuscript.

Data availability

No datasets were generated or analysed during the current study.

Code availability

There is no analytic code associated with this study.

Declarations

Ethics approval and consent to participate

This study does not involve human participants and therefore ethics approval and informed consent were not required.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • 1.Walker EF. Adolescent neurodevelopment and psychopathology. Curr Dir Psychol Sci. 2002;11(1):24–8. [Google Scholar]
  • 2.Lee FS, Heimer H, Giedd JN, Lein ES, Estan N, Weinberger DR, Casey BJ. Adolescent mental health—opportunity and obligation. Science. 2014;346(6209):547–9. 10.1126/science.1260497. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Horn AB, Pössel P, Hautzinger M. Promoting adaptive emotion regulation and coping in adolescence: a school-based programme. J Health Psychol. 2010;16(2):258–73. 10.1177/1359105310372814. [DOI] [PubMed] [Google Scholar]
  • 4.Steinberg L. A behavioral scientist looks at the science of adolescent brain development. Brain Cogn. 2010;72(1):160–4. 10.1016/j.bandc.2009.11.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Ng ZJ, Huebner ES, Hills KJ, Valois RF. Mediating effects of emotion regulation strategies in the relations between stressful life events and life satisfaction in a longitudinal sample of early adolescents. J Sch Psychol. 2018;70(February):16–26. 10.1016/j.jsp.2018.06.001. [DOI] [PubMed] [Google Scholar]
  • 6.Smyth JM, Arigo D. Recent evidence supports emotion-regulation interventions for improving health in at-risk and clinical populations. Curr Opin Psychiatr. 2009;22:205–10. 10.1097/YCO.0b013e3283252d6d. [DOI] [PubMed] [Google Scholar]
  • 7.Travagin G, Margola D, Dennis JL, Revenson TA. Letting oneself go isn’t enough: cognitively oriented expressive writing reduces preadolescent peer problems. J Res Adolesc. 2016;26(4):1048–60. 10.1111/jora.12279. [DOI] [PubMed] [Google Scholar]
  • 8.Nazarian D, Smyth JM. An experimental test of instructional manipulations in expressive writing interventions: examining processes of change. J Soc Clin Psychol. 2013;32(1):71–96. 10.1521/jscp.2013.32.1.71. [Google Scholar]
  • 9.Travagin G, Margola D, Revenson TA. How effective are expressive writing interventions for adolescents? A meta-analytic review. Clin Psychol Rev. 2015;36(1):42–55. 10.1016/j.cpr.2015.01.003. [DOI] [PubMed] [Google Scholar]
  • 10.Frattaroli J. Experimental disclosure and its moderators: a meta-analysis. Psychol Bull. 2006;132(6):823–65. 10.1037/0033-2909.132.6.823. [DOI] [PubMed] [Google Scholar]
  • 11.Winslett PW. 2005. Positive and negative emotional writing in adolescence: gender differences, benefits, and effects on self-disclosure and intimacy. ProQuest Dissertations and Theses, Ph.D.(Dissertation/Thesis PG-). 10.3102/00346543067001043
  • 12.Paulus FW, Ohmann S, Popow C. Practitioner review: school-based interventions in child mental health. J Child Psychol Psychiatry. 2016;57(12):1337–59. 10.1111/jcpp.12584. [DOI] [PubMed] [Google Scholar]
  • 13.Durlak JA, Weissberg RP, Dymnicki AB, Taylor RD, Schellinger KB. The impact of enhancing students’ social and emotional learning: a meta-analysis of school-based universal interventions. Child Dev. 2011;82(1):405–32. 10.1111/j.1467-8624.2010.01564.x. [DOI] [PubMed] [Google Scholar]
  • 14.Cargo M, Stankov I, Thomas J, Saini M, Rogers P, Mayo-Wilson E, Hannes K. Development, inter-rater reliability and feasibility of a checklist to assess implementation (Ch-IMP) in systematic reviews: the case of provider-based prevention and treatment programs targeting children and youth. BMC Med Res. 2015;1:1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Durlak JA, DuPre EP. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol. 2008;41:327–50. 10.1007/s10464-008-9165-0. [DOI] [PubMed] [Google Scholar]
  • 16.Sloan DM, Marx BP. Maximizing outcomes associated with expressive writing. Clin Psychol Sci Pract. 2018;2017:1–4. 10.1111/cpsp.12231. [Google Scholar]
  • 17.Cargo M, Harris J, Pantoja T, Booth A, Harden A, Hannes K, Thomas J, Flemming K, Garside R, Noyes J. Cochrane qualitative and implementation methods group guidance series—paper 4: methods for assessing evidence on intervention implementation. J Clin Epidemiol. 2018;97:59–69. 10.1016/j.jclinepi.2017.11.028. [DOI] [PubMed] [Google Scholar]
  • 18.Dusenbury L, Brannigan R, Falco M, Hansen WB. A review of research on fidelity of implementation: implications for drug abuse prevention in school settings. Health Educ Res. 2003;18(2):237–56. 10.1093/her/18.2.237. [DOI] [PubMed] [Google Scholar]
  • 19.Steckler AB, Linnan L. Process evaluation for public health interventions and research. Hoboken: Wiley; 2002. [Google Scholar]
  • 20.Feagans Gould L, Dariotis JK, Greenberg MT, Mendelson T. Assessing fidelity of implementation (FOI) for school-based mindfulness and yoga interventions: a systematic review. Mindfulness. 2016;7(1):5–33. 10.1007/s12671-015-0395-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Forman SG, Barakat NM. Cognitive-behavioral therapy in the schools: bringing research to practice through effective implementation. Psychol Sch. 2011;48(3):283–96. 10.1002/pits. [Google Scholar]
  • 22.Mendelson T, Dariotis J, Feagans Gould L, Smith A, Smith A, Gonzalez A, Greenberg M. Implementing mindfulness and yoga in urban schools: a community-academic partnership. J Child Serv. 2013;8(4):276–91. 10.1108/JCS-12-2012-0014. [Google Scholar]
  • 23.Owens JS, Lyon AR, Brandt NE, Masia Warner C, Nadeem E, Spiel C, Wagner M. Implementation Science in School Mental Health: Key Constructs in a Developing Research Agenda. Sch Ment Heal. 2014;6(2):99–111. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Pennebaker J, Beall S. Confronting a traumatic event: toward an understanding of inhibition and disease. J Abnorm Psychol. 1986;95(3):274–81. 10.1037/0021-843X.95.3.274. [DOI] [PubMed] [Google Scholar]
  • 25.Pennebaker J. Writing about emotional experience as a therapeutic process. Psychol Sci. 1997;8(3):162–6. [Google Scholar]
  • 26.Baikie KA, Wilhelm K. Emotional and physical health benefits of expressive writing. Adv Psychiatr Treat. 2005;11(11):338–46. 10.1192/apt.11.5.338. [Google Scholar]
  • 27.Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol Theory Pract. 2005;8(1):19–32. 10.1080/1364557032000119616. [Google Scholar]
  • 28.Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Admin Pol Mental Health Mental Health Serv Res. 2011;38:65–76. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Kliewer W, Lepore SJ, Farrell AD, Allison KW, Meyer AL, Sullivan TN, Greene AY. A school-based expressive writing intervention for at-risk urban adolescents’ aggressive behavior and emotional lability. J Clin Child Adolescent Psychol Adolescent Psychol. 2011;40(5):693–705. [DOI] [PubMed] [Google Scholar]
  • 30.Bray MA, Kehle TJ, Peck HL, Margiano SG, Dobson R, Peczynski K, Gardner K, Theodore L. Written emotional expression as an intervention for asthma. J Appl Sch Psychol. 2006;22(1):141–65. 10.1300/J370v22n01_08. [Google Scholar]
  • 31.Curry SJ. The journal project: written expression of trauma as intervention for high school students in Ayacucho. Peru: Pepperdine University; 2011. [Google Scholar]
  • 32.Hines CL, Brown NW, Myran S. The effects of expressive writing on general and mathematics anxiety for a sample of high school students. Education. 2016;137(1):39–45. [Google Scholar]
  • 33.Margola D, Facchin F, Molgora S, Revenson TA. Cognitive and emotional processing through writing among adolescents who experienced the death of a classmate. Psychol Trauma Theory Res Pract Policy. 2010;2(3):250–60. 10.1037/a0019891. [Google Scholar]
  • 34.Ramirez G, Beilock SL. Writing about testing worries boosts exam performance in the classroom. Science. 2011;331(6014):211–3. [DOI] [PubMed] [Google Scholar]
  • 35.Unterhitzenberger J, Rosner R. Lessons from writing sessions: a school-based randomized trial with adolescent orphans in Rwanda. Eur J Psychotraumatol. 2014;5(1):24917. 10.3402/ejpt.v5.24917. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Walter H. The effect of expressive writing on second-grade math achievement and math anxiety. New York: George Fox University; 2018. [Google Scholar]
  • 37.Shen L, Yang L, Zhang J, Zhang M. Benefits of expressive writing in reducing test anxiety: a randomized controlled trial in Chinese samples. PLoS ONE. 2018. 10.1371/journal.pone.0191779. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.Jones BK, Destin M, McAdams DP. Telling better stories: competence-building narrative themes increase adolescent persistence and academic achievement. J Exp Soc Psychol. 2018;76(December 2017):76–80. 10.1016/j.jesp.2017.12.006. [Google Scholar]
  • 39.Kalantari M, Yule W, Dyregrov A, Neshatdoost H, Ahmadi SJ. Efficacy of writing for recovery on traumatic grief symptoms of Afghani refugee bereaved adolescents: a randomized control trial. OMEGA J Death Dying. 2012;65(2):139–50. 10.2190/OM.65.2.d. [DOI] [PubMed] [Google Scholar]
  • 40.Muris P, Meesters C, Gobel M. Cognitive coping vs emotional disclosure in the treatment of anxious children: a pilot-study. Cogn Behav Ther. 2002;31(2):59–67. 10.1080/16506070252959490. [Google Scholar]
  • 41.Muris P, Meesters C, Van Melick M. Treatment of childhood anxiety disorders: a preliminary comparison between cognitive-behavioral group therapy and a psychological placebo intervention. J Behav Ther Exp. 2002;33:143. [DOI] [PubMed] [Google Scholar]
  • 42.Reynolds M, Brewin CR, Saxton M. Emotional disclosure in school children. J Child Psychol Psychiatry. 2000;41(2):151–9. 10.1017/S0021963099005223. [PubMed] [Google Scholar]
  • 43.Facchin F, Margola D, Molgora S, Revenson TA. Effects of benefit-focused versus standard expressive writing on adolescents’ self-concept during the high school transition. J Res Adolesc. 2014;24(1):131–44. [Google Scholar]
  • 44.Giannotta F, Settanni M, Kliewer W, Ciairano S. Results of an Italian school-based expressive writing intervention trial focused on peer problems. J Adolesc. 2009;32(6):1377–89. 10.1016/j.adolescence.2009.07.001. [DOI] [PubMed] [Google Scholar]
  • 45.Soliday E, Garofalo JP, Rogers D. Expressive writing intervention for adolescents’ somatic symptoms and mood. J Clin Child Adolesc Psychol. 2004;33(4):792–801. 10.1207/s15374424jccp3304_14. [DOI] [PubMed] [Google Scholar]
  • 46.Durlak JA. Studying program implementation is not easy but it is essential. Prev Sci. 2015;16(8):1123–7. 10.1007/s11121-015-0606-3. [DOI] [PubMed] [Google Scholar]
  • 47.Turner L, Shamseer L, Altman DG, Schulz KF, Moher D. Does use of the CONSORT Statement impact the completeness of reporting of randomised controlled trials published in medical journals? A cochrane review. Syst Rev. 2012;1(1):1–7. 10.1186/2046-4053-1-60. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48.Carlson JA, Engelberg JK, Cain KL, Conway TL, Geremia C, Bonilla E, Kerner J, Sallis JF. Contextual factors related to implementation of classroom physical activity breaks. Transl Behav Med. 2017;7(3):581–92. 10.1007/s13142-017-0509-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49.Saunders RP, Evans MH, Joshi P. Developing a process-evaluation plan for assessing health promotion program implementation: a how-to guide. Health Promot Pract. 2005;6(2):134–47. 10.1177/1524839904273387. [DOI] [PubMed] [Google Scholar]
  • 50.Garvey C, Julion W, Fogg L, Kratovil A, Gross D. Measuring participation in a prevention trial with parents of young children. Res Nurs Health. 2006;29(3):212–22. 10.1002/nur.20127. [DOI] [PubMed] [Google Scholar]
  • 51.Sabo Mordechay D, Eviatar Z, Nir B. Emotional engagement in expressive writing: clinical and discursive perspectives. Narrat Inquiry. 2021. 10.1002/nur.20127. [Google Scholar]


