Abstract
Understanding how to design engaging unguided digital health interventions is key to our ability to utilize digital tools to improve access to care. Therapeutic persuasiveness (TP) is a design concept concerning how the features of a digital intervention, taken as a whole, should be designed to encourage users to make positive changes in their lives while reducing the effort required from them to engage in these activities. In our previous work, we examined the user traffic of publicly available programs and found programs' TP quality to be a reliable, robust, and stable predictor of real-world usage; however, these findings had not been subjected to experimental manipulation in a controlled trial. The current study examined the impact of TP quality in digital parent training programs (DPTs) aimed at treating child behavior problems. We conducted a pilot randomized controlled trial comparing two interventions that utilize the same evidence-based content of established DPTs but differ in the quality of TP (standard: DPT-STD; enhanced: DPT-TP). Altogether, parents from 88 families with a child with behavior problems were enrolled in the study. Compared to DPT-STD (n = 43), participants allocated to DPT-TP (n = 45) used the program significantly more (ps < 0.001; Cohen's ds = 0.91–2.22). In terms of program completion, 68.9 % of DPT-TP participants completed the program compared to 27.9 % of DPT-STD participants. Significant differences between the interventions were also found in reported improvements in child behavior problems, favoring DPT-TP (ps < 0.05; Cohen's ds = 0.43–0.54). The results point to the importance of adequate product design and the utilization of conceptual frameworks to address user engagement challenges.
Keywords: User engagement, Program usage, Persuasive, Digital, Parent training, Unguided
1. Introduction
One of the main recognized challenges in the digital health intervention field is that users adhere poorly to digital programs in their unguided format, that is, without added human support (e.g., Baumel et al., 2019a; Fleming et al., 2018). Aiming to address this challenge, increasing attention has been given over the years to how mechanisms of action reflected in the software's functions impact program usage, program completion, and the intervention's effectiveness (e.g., Perski et al., 2016; Ritterband et al., 2009; Graham et al., 2019). Systematic reviews examining the characteristics of digital interventions have suggested that user adherence (Kelders et al., 2012), positive behavior change (Hamari et al., 2014), and program efficacy (Webb et al., 2010) can be increased by embedding a persuasive system design focused on the incorporation of behavior change techniques. For brevity, we refer to this design approach as therapeutic persuasiveness (Baumel et al., 2017a).
1.1. Therapeutic persuasiveness
We coined the term ‘therapeutic persuasiveness’ while developing Enlight, a suite of eHealth quality rating scales that was created based on a rigorous systematic review of available eHealth quality criteria (Baumel et al., 2017a). Therapeutic persuasiveness refers to how the features of a digital program, taken together, are designed to encourage users to make positive changes in their lives. This stands in contrast to persuasive system design that might be targeted at achieving goals not necessarily beneficial to the person behind the user. We identified several criteria that define a therapeutically persuasive program. The main such criteria are call to action, monitoring, ongoing feedback, and program adaptation based on user state and goal achievements (Baumel et al., 2017a) (see Fig. 1 for a conceptual model of therapeutic persuasiveness).
Fig. 1.
An illustration of the therapeutic persuasiveness conceptual model. The intervention program is divided into 1-X phases, each with clear objectives related to the person's life. At each point in time the program helps the person achieve these objectives using the features above. These features are meant to optimize the effort required to change and to support the user's accountability during the change process.
Let us take, for example, the goal of helping parents to increase the number of positive interactions they foster with their child. From the standpoint of a traditional digital program, the main focus would be on providing psychoeducation and suggestions to increase the occurrence of positive interactions. From a therapeutic persuasiveness standpoint, however, the intervention designer should also strive to reach parents in their daily environment and make the notion of positive interactions salient in their mind above competing activities. This could be done by triggering parents at the right time to inspire them, monitoring the positive interactions with which they are engaged, and providing parents with appropriate acknowledgment and feedback based on their state (Baumel and Muench, 2021; Baumel and Faber, 2017).
From the perspective of behavior change theories, a program lacking these features cannot optimally support users' self-management of desired and undesired behaviors, or their ability to respond to areas of difficulty they may encounter during the therapeutic process (e.g., Abraham and Michie, 2008; Doshi et al., 2003). Congruent with these studies, meta-analyses have found that interventions that include self-monitoring components in conjunction with other components (e.g., feedback) are significantly more effective (Dombrowski et al., 2012; Michie et al., 2009). Conceptually, programs with a higher quality of therapeutic persuasiveness should therefore be used to a greater extent and prove more effective.
In line with these ideas, our team examined whether the quality of unguided eHealth interventions could predict a product's real-world usage, using the Enlight suite of quality ratings (usability, visual design, user engagement, content, therapeutic persuasiveness, therapeutic alliance), each comprising several criteria rated by trained reviewers who examine the product (Baumel et al., 2017a). Programs' usage metrics were gathered from datasets of anonymized logs from consenting users, comparing user traffic across 30 (Microsoft Internet Explorer add-on) and 70 (SimilarWeb Pro panel) different eHealth interventions. The incorporation of therapeutic persuasiveness within the software's functions was found to be the most robust and stable predictor of program usage, explaining 11 % to 42 % of the variance in program usage in the regression models (Baumel and Yom-Tov, 2018; Baumel and Kane, 2018). We also found that researchers can learn to reliably evaluate the therapeutic persuasiveness quality of a program, achieving high inter-rater reliability rates (>0.85; Baumel et al., 2017a). This finding demonstrates the transparency of this conceptual mechanism of action, which, if deemed relevant, could be useful for scholars worldwide.
Overall, past findings imply that enhancing the quality of therapeutic persuasiveness as a mechanism of action in unguided interventions may increase the intervention's acceptability and efficacy. Pilot testing whether this causal relationship exists was the focus of the current study.
1.2. The targeted intervention
This study focused on a digital parent training program (DPT) aimed at treating child behavior problems. This targeted intervention represents a classic case study for several reasons.
(a) It involves a common public mental health problem. Child behavior problems are among the most prevalent types of childhood mental health problems (Egger and Angold, 2006; Keenan et al., 2007). When left untreated, behavior problems impose significant social, emotional, and economic costs, and place a burden on individuals, families, and societies (Raaijmakers et al., 2011).
(b) Parent training aimed at treating child behavior problems is required in an unguided digital form to address the need for increased access to care (e.g., Owens et al., 2002; Kazak et al., 2010).
(c) Parents poorly adhere to DPTs in their unguided form (Dadds et al., 2019; Day and Sanders, 2018). For example, in an RCT of a DPT, Day and Sanders showed that a completely unguided DPT resulted in poor module completion rates compared to a human-supported condition of the same program (median module completion of 2 and 7, respectively) (Day and Sanders, 2018). There is a question as to whether enhancing the therapeutic persuasiveness quality of unguided DPTs could increase program completion rates as well.
1.3. The current study
This study employed a pilot randomized controlled trial to compare two interventions that utilize the same evidence-based content of established DPTs but differ in the quality of therapeutic persuasiveness. Comparing two active interventions enables us to preliminarily examine the causal link between this conceptual mechanism of action (embedded in the program prior to allocation) and beneficial outcomes.
Specifically, the main aims of this pilot study were to evaluate the impact of therapeutic persuasiveness on program usage and completion, and on the efficacy of a DPT for early-onset child behavior problems. We hypothesized that, compared to parents receiving DPT-STD, parents receiving DPT-TP would exhibit higher program usage metrics and report greater improvements in child behavior problems and parenting-related variables, as measured by change between pre- and post-intervention.
2. Methods
The pilot study design was a two-arm parallel-group randomized controlled trial with repeated measures. It consisted of two active intervention conditions. Randomization and trial procedures were carried out in accordance with recommended guidelines (e.g., Higgins et al., 2011; ClinicalTrials.gov registry number: NCT05344885). In this paper we report the differences between the two interventions in usage and outcomes, as measured between baseline and post-intervention (10 weeks from baseline). An oral presentation on this topic was given at the 11th ISRII conference.
2.1. Participants and recruitment procedure
The study protocol was approved by the institutional review board of University of Haifa (approval number: 058/22).
2.1.1. Eligibility
Parents were eligible to participate if they reported: (a) having a child between the ages of 3 and 7 with (b) high levels of behavior problems based on the ECBI subscales (ECBI Problem ≥15 & ECBI Intensity ≥132; (Burns and Patterson, 2001)); (c) having access to a smartphone device with cellular and Internet connection. Parents were excluded if they reported that: (a) their child is in regular contact with a professional or taking medications aimed at treating behavioral or emotional problems; (b) they are currently accessing parenting support elsewhere; or that (c) their child has been diagnosed with an intellectual disability or developmental delay. Parents not eligible to participate were referred to local services.
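For clarity, the inclusion and exclusion rules above can be summarized as a single screening check. The sketch below is a hypothetical illustration only; the field names and the function are not part of the study software.

```python
# Hypothetical illustration of the eligibility rules described above.
# Field names mirror the text; the function itself is not part of the study software.

def is_eligible(screen: dict) -> bool:
    inclusion = (
        3 <= screen["child_age"] <= 7                  # child aged 3-7 years
        and screen["ecbi_problem"] >= 15               # ECBI Problem subscale cutoff
        and screen["ecbi_intensity"] >= 132            # ECBI Intensity subscale cutoff
        and screen["smartphone_with_internet"]         # access to a connected smartphone
    )
    exclusion = (
        screen["child_in_treatment_or_medicated"]      # regular professional contact or medication
        or screen["parent_receiving_other_support"]    # accessing parenting support elsewhere
        or screen["intellectual_disability_or_delay"]  # diagnosed disability or developmental delay
    )
    return inclusion and not exclusion
```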
2.1.2. Recruitment procedure
Parents were recruited from May 1, 2022 to July 7, 2022 through a Facebook advertising campaign, relevant social media parenting groups, and digital banners shared through WhatsApp. Interested parents were directed to the project website, which offered basic information about the study. Parents who left their contact details received an email with a link to a brief eligibility screener consisting of the exclusion criteria and items concerning their child's behaviors. Items were drawn from the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) criteria for oppositional defiant disorder, and parents who marked concerns for at least 4 behaviors were deemed preliminarily eligible. Screening, interest, and parents' understanding of the terms of the study were then confirmed by a research assistant in a phone call. Prospective participants were then directed to a web-based informed consent form. Once consent was obtained, they completed the baseline assessment battery. Eligible participants then received login credentials (a text message to their cellphone and an email to the address they had to use in order to log in successfully). Parents were randomly assigned (using a computer-generated randomization procedure) to one of the two conditions (1:1), stratified by child gender, by an independent researcher who was blind to their assessments.
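The allocation procedure (computer-generated, 1:1, stratified by child gender) could be implemented along the lines of the sketch below; the permuted-block approach and the block size of 4 are assumptions made for illustration, not a description of the trial's actual randomization code.

```python
# A generic sketch of 1:1 allocation stratified by child gender.
# The permuted-block design and block size of 4 are assumptions for illustration only.
import random

def make_stratified_allocator(block_size=4, seed=None):
    rng = random.Random(seed)
    queues = {}  # one queue of pending assignments per stratum (child gender)

    def allocate(child_gender):
        queue = queues.setdefault(child_gender, [])
        if not queue:
            # Refill the stratum with a shuffled block holding equal numbers of each condition.
            block = ["DPT-TP", "DPT-STD"] * (block_size // 2)
            rng.shuffle(block)
            queue.extend(block)
        return queue.pop(0)

    return allocate

allocate = make_stratified_allocator(seed=2022)
print([allocate("male") for _ in range(4)], [allocate("female") for _ in range(4)])
```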
2.2. Overview of interventions
The parent training strategy for improving child behavior problems and associated dispositions leans on the notion that parenting practices play a significant role in directing children toward both appropriate and inappropriate behaviors (Holden, 2014; McMahon and Forehand, 2005). The program protocol incorporates common components of evidence-based parent training programs and DPTs for child disruptive behaviors (e.g., Baumel et al., 2017b; Sanders et al., 2012; Sourander et al., 2016). Both parents are encouraged to participate, and one is designated as the leader responsible for making sure they both adhere to the program. The protocol includes seven modules recommended to be completed within a nine-week period; each module discusses a specific theme: (1) introduction to parent training; (2) positive interactions and quality time; (3) parental emotion regulation; (4) effective routines and clear ground rules; (5) recognizing positive behaviors/ignoring minor negative behaviors; (6) overcoming disobedience; and (7) mindful parenting and conversation between partners.
Interventions are delivered through MindTools, an open-source eHealth platform originally developed under the name Serafin by Prof. Håvar Brendryen from the University of Oslo. The platform was adapted and further upgraded by the lead author (AB) and is available on GitHub (MindTools Israeli Upgraded Version of Serafin, n.d.). Intervention content is designed and created in the content management system, which enables the provision of web-based content, text and WhatsApp messaging, and emails. All content components can be delivered based on logic-driven rules (if-then statements; see Fig. 2).
Fig. 2.
An example of the system's admin user interface. Within each arrow, a condition can be created, which determines whether a certain node will be deployed. Blue arrows are between web pages/notifications that the participants view and interact with; grey arrows are back-end processes.
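For readers unfamiliar with this kind of authoring environment, the snippet below sketches, in plain Python, the kind of if-then rule the platform evaluates to decide which node to deploy next. It is not the MindTools/Serafin API; the Rule class, node names, and state fields are illustrative assumptions.

```python
# A platform-agnostic sketch of a logic-driven (if-then) content rule.
# This is NOT the MindTools/Serafin API; names and fields are illustrative only.
from dataclasses import dataclass
from typing import Any, Callable, Dict, List

@dataclass
class Rule:
    condition: Callable[[Dict[str, Any]], bool]  # evaluated against the user's current state
    next_node: str                               # node (page/message) deployed when it holds

def choose_next_node(state: Dict[str, Any], rules: List[Rule]) -> str:
    """Return the first node whose condition matches the user's state."""
    for rule in rules:
        if rule.condition(state):
            return rule.next_node
    return "default_page"

# Example: show feedback after a daily report, or send a reminder after two weeks of inactivity.
rules = [
    Rule(lambda s: s.get("daily_report_submitted", False), "positive_feedback_page"),
    Rule(lambda s: s.get("days_inactive", 0) >= 14, "reminder_message"),
]
print(choose_next_node({"daily_report_submitted": True}, rules))
```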
Standard DPT (DPT-STD).
The DPT-STD comprises seven 10- to 25-minute e-learning modules, each corresponding to one of the themes mentioned above. Each module's content includes videos, pictures, and texts guiding the parent through the training process, as well as interactive features such as multiple-choice questions with direct feedback and answers to frequently asked questions. At the end of each module, recommendations for practicing the skills are presented. Additional features include downloadable materials, the ability to view past modules, and reminders sent to parents who do not log in to the platform for two weeks.
DPT with enhanced Therapeutic Persuasiveness quality (DPT-TP).
The DPT-TP includes all the ingredients of DPT-STD, but with additional features that correspond to the conceptual model of therapeutic persuasiveness. Each theme/module in the program comprises the learning phase described above followed by a 1–2 week focusing phase. The focusing phases were designed to help the desired therapeutic activities become salient in the parent's mind and to help the parent acquire skills in a non-judgmental manner, while avoiding the burden and potential failures that may be associated with the idea of “training”. Accordingly, the program utilizes the following features:
(1) Call to action: Parents receive timely triggers (with tips or motivating notes) via text messages that are related to the specific goals and therapeutic activities of the modules they have completed. Triggers drew on accepted paradigms for the tailoring and adaptation of digital triggers (for a review see Muench and Baumel, 2017).
(2) Monitoring and ongoing feedback: Specific practices related to the therapeutic activities/skills being taught were documented within the system using a brief daily report that included no more than seven logic-driven multiple-choice questions and took less than two minutes to complete (see Fig. 3). These questions asked about the parental practices that parents were required to focus on during the day, based on a step-by-step script. This was meant to help the parent effortlessly implement the new way of thinking during interactions with their child (Baumel and Muench, 2021). The program offers nonjudgmental feedback and recognizes desirable achievements (Badami et al., 2011).
(3) Adaptation to user state: At the end of the first module (introduction), parents were asked to report on the current state at home, and this information was used to recommend whether they should complete three learning modules that were not deemed obligatory: positive interactions, emotion regulation, and effective routines. (That is, parents who reported having many positive interactions with their child were not recommended to complete the corresponding module.) During the focusing phases, parents' reports on their daily activities were used to acknowledge their success, to suggest additional actions or information, or to advance them to the next phase. Effort related to desired therapeutic activities was adapted based on graded tasks. For example, when parents learned how to overcome disobedience (module 6), they were guided to choose one specific behavior problem they would like to focus on first. A simplified sketch of how monitoring, feedback, and adaptation can be tied together is shown after Fig. 3.
Fig. 3.
Mobile screenshot samples of a daily brief monitoring questionnaire (left screen) and personalized daily positive feedback (right screen; texts are in Hebrew).
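To make features (2) and (3) concrete, the sketch below shows how a brief daily report could be mapped onto nonjudgmental feedback and a phase-advancement decision. The thresholds, field names, and message texts are illustrative assumptions and do not reproduce the study's actual rules.

```python
# Illustrative only: maps a parent's brief daily report onto feedback and a next step,
# in the spirit of the monitoring/feedback and adaptation features described above.
# Thresholds, field names, and message texts are assumptions, not the study's rules.

def respond_to_daily_report(report: dict, state: dict) -> dict:
    response = {}

    # Ongoing, nonjudgmental feedback: acknowledge reported positive practices,
    # and avoid criticism when nothing was reported.
    if report.get("positive_interactions", 0) > 0:
        response["feedback"] = "Nice work - you created a positive moment with your child today."
    else:
        response["feedback"] = "Thanks for checking in. Tomorrow is another opportunity to try."

    # Adaptation to user state: advance to the next focusing phase after enough practice days,
    # otherwise keep suggesting small, graded actions.
    state["practice_days"] = state.get("practice_days", 0) + 1
    if state["practice_days"] >= 7:
        response["action"] = "advance_to_next_phase"
    else:
        response["action"] = "suggest_additional_tip"
    return response

# Example usage
state = {}
print(respond_to_daily_report({"positive_interactions": 2}, state))
```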
Therapeutic persuasiveness fidelity.
We reviewed all new content found exclusively in the DPT-TP focusing phases (e.g., tips or new suggestions) and incorporated it into the DPT-STD e-learning modules, ensuring content consistency across both interventions. Using the Enlight expert quality rating scale, we evaluated both programs. The DPT-STD design received a TP score of 2.1, which is close to “poor”, while DPT-TP achieved a score of 4.5, falling between “good” and “very good”.
2.3. Measures
The study instruments included self-reported questionnaires and data on program usage. Parents completed a demographic questionnaire at baseline. The self-report measures were administered through Qualtrics. The parenting-related measures were completed by the parent who led the use of the intervention.
2.3.1. Eyberg Child Behavior Inventory (ECBI) (Eyberg, 1999)
Child behavior problems were assessed using the Intensity and Problem subscales of the 36-item ECBI (Burns and Patterson, 2001; Burns and Patterson, 1990). For each item, caregivers rate the intensity of the behavior (1 = never to 7 = always) and whether each behavior is a problem (0 = no; 1 = yes). In this study, internal consistency scores (coefficient alphas) were as follows: ECBI Problems, 0.82; ECBI Intensity, 0.80.
2.3.2. The Parenting Scale (PS) (Arnold et al., 1993)
Parental disciplinary behaviors in response to their child's misbehaviors were assessed using two PS subscale scores, Over-reactivity (11 items) and Laxness (10 items), which reflect effective discipline and discipline mistakes on either end, using a 7-point Likert scale. In this study internal consistency scores (coefficient alphas) were as follows: Laxness, 0.83; Over-reactivity, 0.75.
2.3.3. The Parenting Tasks Checklist (PTC) (Sanders and Woolley, 2001)
Task-specific self-efficacy was assessed using items taken from the setting self-efficacy (6 statements) and behavioral self-efficacy (6 statements) subscales. Item responses are given on a scale of 0 (Certain I can't do it) to 100 (Certain I can do it). In this study internal consistency scores (coefficient alphas) were as follows: PTC setting, 0.88; PTC behavioral, 0.91.
2.3.4. Parental Self Efficacy (Me as a Parent [MaaP])
Overall self-efficacy was assessed using the 4-item Self-Efficacy subscale of the MaaP. Each item is rated on a Likert scale (1 = strongly disagree to 5 = strongly agree) (Hamilton et al., 2015). In the current study the internal consistency score (coefficient alpha) of the subscale was 0.79.
2.3.5. Alabama Parenting Questionnaire (APQ) Positive Parenting Practices (Frick, 1991)
Positive parenting practices were assessed using the APQ Positive Parenting Practices subscale (6 items). Parents were asked to rate each item on a scale of 1 (never) to 5 (always) according to how often it typically occurs in their home. In this study the internal consistency score (coefficient alpha) of the subscale was 0.93.
2.3.6. Program usage and completion rate
Measures of program usage included the number of login days, unique logins, and total time of use. As both parenting programs utilize e-learning modules, we also present the percentage of participants completing the “overcoming disobedience” module (which was deemed the main obligatory component of the intervention) and the percentage of participants completing the whole program.
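As an illustration of how these metrics can be derived from raw platform logs, the sketch below aggregates login events per participant; the event format (user id, login timestamp, session minutes) is an assumption rather than the platform's actual log schema.

```python
# Sketch: derive the usage metrics listed above from raw login events.
# The (user_id, login_datetime, session_minutes) event format is assumed for illustration.
from datetime import datetime

def usage_metrics(events):
    per_user = {}
    for user_id, login_at, minutes in events:
        m = per_user.setdefault(user_id, {"days": set(), "logins": 0, "minutes": 0.0})
        m["days"].add(login_at.date())  # distinct calendar days with at least one login
        m["logins"] += 1                # each login event counts once (unique logins)
        m["minutes"] += minutes         # total time of use
    return {
        uid: {"login_days": len(m["days"]), "unique_logins": m["logins"], "total_minutes": m["minutes"]}
        for uid, m in per_user.items()
    }

print(usage_metrics([
    ("p1", datetime(2022, 5, 2, 9, 30), 12.5),
    ("p1", datetime(2022, 5, 2, 20, 10), 8.0),
    ("p1", datetime(2022, 5, 4, 21, 0), 15.0),
]))
```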
2.4. Statistical analyses
Demographic, usage, and clinical characteristics are reported as frequencies and percentages for categorical variables and means and standard deviations for continuous variables. Differences between intervention groups were tested using independent samples t-tests for continuous variables and chi-square tests for categorical variables. The dependent outcome variables were calculated as the within-subject difference between baseline and post-intervention. We also calculated the pre-post difference in outcomes over time for each intervention condition separately using dependent samples t-tests. Effect size estimates were calculated using Cohen's d for continuous variables and Cramer's V for categorical variables.
Given the pilot nature of this study, we present both a completer analysis (including only parents who completed the post-intervention measurements) and an intent-to-treat analysis (also including participants who did not complete the post-intervention measurements). Multiple imputations (5 sets) were generated using predictive mean matching as the imputation method, with treatment condition as a Level 2 variable. All analyses were performed using SPSS, version 27.
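The analyses were run in SPSS; for transparency, the sketch below shows equivalent effect-size computations (Cohen's d from the pooled standard deviation and Cramer's V from an uncorrected chi-square). Using the program-completion counts from Table 2 (31/45 vs. 12/43), it reproduces the reported χ²(1) ≈ 14.78 and V ≈ 0.41.

```python
# Equivalent effect-size computations to those reported (the paper's analyses used SPSS).
import numpy as np
from scipy import stats

def cohens_d(x, y):
    """Cohen's d for two independent groups, using the pooled standard deviation."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

def cramers_v(table):
    """Cramer's V from a contingency table (no continuity correction)."""
    chi2, _, _, _ = stats.chi2_contingency(table, correction=False)
    n = np.asarray(table).sum()
    k = min(np.asarray(table).shape) - 1
    return np.sqrt(chi2 / (n * k))

# Program completion counts from Table 2: 31/45 (DPT-TP) vs. 12/43 (DPT-STD).
completion = [[31, 45 - 31], [12, 43 - 12]]
print(round(cramers_v(completion), 2))  # ≈ 0.41
```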
3. Results
The participant flow diagram is presented in Fig. 4. Overall, parents of 88 children with behavior problems enrolled in the study, of whom 45 were allocated to DPT-TP and 43 to DPT-STD. At the 10-week post-intervention assessment, 8 participants in DPT-STD were lost to follow-up, twice the number of lost-to-follow-up participants in DPT-TP. Participant demographics by study group are presented in Table 1. Mean child age at the beginning of the intervention was 4.90 years (SD = 1.32), and 56.8 % of the children were males. The mean age of the leading parent was 36.52 years (SD = 3.61), and in 95.5 % of the families the leading parent was a mother. No significant differences in demographic characteristics or baseline measures were found.
Fig. 4.
Flow of participants through the trial.
Table 1.
Participant demographic characteristics by intervention condition at baseline.
| Variable | | Total (N = 88) | DPT-TP (N = 45) | DPT-STD (N = 43) | t (86) / χ² | p |
|---|---|---|---|---|---|---|
| Continuous | | M (SD) | M (SD) | M (SD) | t (86) | |
| Parent age (years)ᵃ | | 36.52 (3.61) | 36.42 (3.54) | 36.63 (3.72) | −0.27 | 0.79 |
| Child age (years) | | 4.90 (1.32) | 4.76 (1.24) | 5.06 (1.41) | −1.10 | 0.28 |
| Number of children in family | | 2.61 (0.91) | 2.73 (0.96) | 2.49 (0.85) | 1.26 | 0.21 |
| Categorical | | N (%) | N (%) | N (%) | χ² | |
| Child gender | Male | 50 (56.8 %) | 26 (57.8 %) | 24 (55.8 %) | 0.03 | 0.85 |
| | Female | 38 (43.2 %) | 19 (42.2 %) | 19 (44.2 %) | | |
| Leading parent | Male | 4 (4.5 %) | 2 (4.4 %) | 2 (4.7 %) | 0.00 | 0.96 |
| | Female | 84 (95.5 %) | 42 (95.6 %) | 41 (95.3 %) | | |
| Participating | Both parents | 55 (62.5 %) | 29 (64.4 %) | 26 (60.5 %) | 0.15 | 0.70 |
| | One parent | 33 (37.5 %) | 16 (35.6 %) | 17 (39.5 %) | | |
| Educationᵃ | High school | 13 (14.8 %) | 6 (13.3 %) | 7 (16.3 %) | 0.15 | 0.70 |
| | Above high school | 75 (85.2 %) | 39 (86.7 %) | 36 (83.7 %) | | |
| Household income levelᵇ | <15,000 | 16 (18.2 %) | 8 (17.8 %) | 8 (18.6 %) | 0.09 | 0.96 |
| | 15,000–18,000 | 30 (34.1 %) | 16 (35.6 %) | 14 (32.6 %) | | |
| | >18,000 | 42 (47.7 %) | 21 (46.7 %) | 21 (48.8 %) | | |
| Religiosity | Secular | 52 (59.1 %) | 28 (62.2 %) | 24 (55.8 %) | 4.50 | 0.11 |
| | Traditional | 25 (28.4 %) | 9 (20.0 %) | 16 (37.2 %) | | |
| | Religious | 11 (12.5 %) | 8 (17.8 %) | 3 (7.0 %) | | |
| Hours of work/study per weekᵃ | Under 10 | 16 (18.2 %) | 6 (13.3 %) | 10 (23.3 %) | 3.72 | 0.29 |
| | Between 10 and 29 | 13 (14.7 %) | 9 (20.0 %) | 4 (9.3 %) | | |
| | Between 30 and 39 | 23 (26.1 %) | 10 (22.2 %) | 13 (30.2 %) | | |
| | >39 | 36 (40.9 %) | 20 (44.4 %) | 16 (37.2 %) | | |
ᵃ Refers to the parent leading the intervention.
ᵇ In Israeli Shekel (ILS).
3.1. Program usage and completion
Program usage and completion rates by study group are presented in Table 2. DPT-TP participants outperformed DPT-STD participants in all usage metrics (ps < 0.001), with large effect size differences in usage (Cohen's d = 0.91–2.22) and medium to large effect size differences in program completion (Cramer's V = 0.39, 0.41). The mean usage time of DPT-TP participants was almost twice that of DPT-STD participants, and their number of unique logins was more than three times higher. Accordingly, 68.9 % of DPT-TP participants completed the program compared to 27.9 % of DPT-STD participants.
Table 2.
Differences between DPT-TP and DPT-STD in program usage and completion metrics.
| | DPT-TP (N = 45) | DPT-STD (N = 43) | t (df = 86) | p | Cohen's d |
|---|---|---|---|---|---|
| | M (SD) | M (SD) | | | |
| Number of login days | 22.49 (10.16) | 5.49 (3.50) | 10.40 | <0.001 | 2.22 |
| Unique logins | 24.78 (11.66) | 6.98 (5.38) | 9.26 | <0.001 | 1.95 |
| Usage time (minutes) | 136.16 (74.55) | 77.53 (52.49) | 4.25 | <0.001 | 0.91 |
| | % (n/n) | % (n/n) | χ² (df = 1) | p | Cramer's V |
| “Overcoming disobedience” module completers | 75.6 % (34/45) | 37.2 % (16/43) | 13.17 | <0.001 | 0.39 |
| Program completers | 68.9 % (31/45) | 27.9 % (12/43) | 14.78 | <0.001 | 0.41 |
3.2. Differences in outcome variables
Differences in reported changes between baseline and follow-up are reported in Table 3. Significant within-group improvements between baseline and follow-up were found in both interventions in the ECBI metrics (ps < 0.001; both completer and ITT analyses) and in most parenting measures.
Table 3.
Descriptive statistics and differences between DPT-TP and DPT-STD in reported changes between baseline and post-intervention.
| | DPT-TP baseline M (SD) | DPT-TP follow-up M (SD) | DPT-STD baseline M (SD) | DPT-STD follow-up M (SD) | t | p | Cohen's d |
|---|---|---|---|---|---|---|---|
| Completer (DPT-TP N = 39; DPT-STD N = 33; df = 70) | | | | | | | |
| ECBI Intensity | 157.05 (21.21) | 120.26 (27.01)⁎⁎⁎ | 155.51 (18.34) | 131.58 (25.97)⁎⁎⁎ | 1.97 | 0.026 | 0.47 |
| ECBI Problems | 22.17 (5.39) | 13.72 (8.11)⁎⁎⁎ | 22.24 (6.69) | 17.00 (8.95)⁎⁎⁎ | 1.83 | 0.035 | 0.43 |
| PS Laxness | 3.27 (0.94) | 2.55 (0.90)⁎⁎⁎ | 3.13 (0.96) | 2.72 (1.01)⁎⁎ | −1.18 | 0.12 | 0.28 |
| PS Over-reactivity | 3.49 (0.92) | 2.76 (0.57) | 3.50 (0.68) | 3.01 (0.99) | −1.00 | 0.16 | 0.24 |
| PTC Setting | 68.74 (23.00) | 79.76 (15.40)⁎⁎⁎ | 70.67 (13.76) | 76.54 (14.15)⁎ | −1.18 | 0.12 | 0.28 |
| PTC Behavioral | 55.16 (24.15) | 74.20 (17.73)⁎ | 54.43 (17.69) | 69.13 (21.48)⁎ | −0.78 | 0.22 | 0.18 |
| MaaP | 15.69 (2.49) | 16.33 (1.78)⁎⁎⁎ | 15.66 (2.39) | 16.21 (1.76)⁎⁎⁎ | −2.00 | 0.42 | 0.05 |
| APQ | 12.79 (2.39) | 13.43 (1.55)⁎ | 13.42 (1.78) | 14.00 (1.41)⁎⁎⁎ | −0.13 | 0.45 | 0.03 |
| Intent-to-treat (DPT-TP N = 45; DPT-STD N = 43; df = 86) | | | | | | | |
| ECBI Intensity | 157.51 (20.30) | 119.27 (26.98)⁎⁎⁎ | 155.33 (19.86) | 131.68 (22.68)⁎⁎⁎ | 2.54 | 0.007 | 0.54 |
| ECBI Problems | 21.98 (5.40) | 13.15 (7.94)⁎⁎⁎ | 22.40 (6.56) | 16.90 (7.83)⁎⁎⁎ | 2.10 | 0.02 | 0.45 |
| PS Laxness | 3.34 (0.94) | 2.65 (0.88)⁎⁎⁎ | 3.19 (1.00) | 2.70 (0.89)⁎⁎ | −1.02 | 0.15 | 0.22 |
| PS Over-reactivity | 3.49 (0.88) | 2.82 (0.57)⁎⁎⁎ | 3.52 (0.72) | 3.00 (0.87)⁎⁎ | −0.74 | 0.23 | 0.16 |
| PTC Setting | 67.83 (22.34) | 77.00 (16.5)⁎⁎⁎ | 70.95 (13.30) | 76.52 (12.37)⁎⁎ | −0.93 | 0.18 | 0.20 |
| PTC Behavioral | 53.87 (23.24) | 73.01 (16.97)⁎⁎⁎ | 54.57 (17.38) | 69.23 (18.75)⁎⁎⁎ | −0.94 | 0.17 | 0.20 |
| MaaP | 15.46 (2.42) | 16.43 (1.69)⁎⁎ | 15.70 (2.61) | 16.20 (1.55) | −0.96 | 0.17 | 0.20 |
| APQ | 12.78 (2.26) | 13.41 (1.46)⁎ | 13.51 (1.66) | 14.00 (1.26)⁎⁎⁎ | −0.38 | 0.35 | 0.08 |
Notes. When an asterisk is presented at follow-up, a significant within-group difference between pre- and post-intervention was found using a paired sample t-test.
⁎ Significant at p < .05.
⁎⁎ Significant at p < .01.
⁎⁎⁎ Significant at p < .001.
Significant differences between the interventions were found in reported ECBI improvements, favoring DPT-TP. The differences were found in both the completer (ps = 0.026, 0.035; Cohen's d = 0.47, 0.43) and ITT analyses (ps = 0.007, 0.02; Cohen's d = 0.54, 0.45), with medium effect size differences. No other significant differences were found between the two conditions, even though, descriptively, reported improvements in all parenting variables were larger in the DPT-TP condition.
4. Discussion
This study is the first to examine the causal impact of therapeutic persuasiveness quality on digital health program usage and efficacy. While most users of the therapeutic persuasiveness enhanced intervention completed it (68.9 %), less than a third (27.9 %) completed the standard program. Furthermore, while both interventions were found to be effective in fostering significant beneficial changes over time, the enhanced intervention resulted in a significantly greater reduction in child behavior problems (Cohen's ds = 0.43–0.54). The comparison between the two active interventions employed during the trial helps account for trial biases that exist in unguided interventions (Baumel et al., 2019b) and strengthens the promise of the findings.
The results of this study are congruent with previous research findings that point to the importance of fostering a therapeutic process that leverages the advantages of digital interventions (e.g., Lattie et al., 2016; Ritterband et al., 2012). We suggest that the fundamental difference that led to the current findings was that the enhanced product design targeted an overall quality of experience that included goal salience, monitoring and feedback, and adaptation to user state. To achieve a satisfying result (from the user experience perspective), the interplay between components matters: if users are explicitly required to achieve certain goals, it makes little sense for their goal achievements not to be monitored, or for the program to provide no feedback in return, and so on. When targeting a conceptual framework of design, the main challenges are its reliability and transparency, that is, whether developers can agree on what it means, adapt it, and use it independently. As presented in the introduction, the proposed conceptual framework is based on validated and available scales that address both of these challenges (Baumel et al., 2017a). Accordingly, the concept of therapeutic persuasiveness can be used to examine whether engagement with unguided programs that receive low therapeutic persuasiveness scores could be dramatically enhanced by upgrading the intervention's quality.
Considering the dearth of studies of dynamically tailored unguided interventions (that are not based solely on text messaging), one might question why there is a pronounced emphasis on guided interventions instead of exploring how unguided programs are designed. Our experience offers some insight into this. For every hour we spent on the e-learning modules of DPT-STD, we devoted ten hours to the sessions comprising DPT-TP's focusing phases. This does not account for the time and effort required to upgrade and adapt a platform capable of delivering programs with highly personalized pathways. In essence, a significant commitment is needed to develop automated pathways that drive engagement, which might be only partially replicable with human support.
The payoff for such an investment lies in the ability to provide these services to thousands of users or more, which is not the case in academic studies (which involve no more than a few hundred participants supported by limited research grants). However, as we learn more about non-specific engagement and outcome ingredients in multi-component digital interventions (e.g., Baumel and Muench, 2021; Nahum-Shani et al., 2022), we should expect more investment in the development of unguided interventions and in comparative effectiveness research on constructs such as therapeutic persuasiveness.
4.1. Limitations and future directions
The study has several limitations, which point to future directions. First, this pilot study needs to be replicated in a fully powered trial of DPTs, and ideally across intervention targets, as it is unclear whether the results are unique to DPTs.
Second, while the findings point to a moderate effect size difference between the interventions in child behavior problems (i.e., the primary outcome measure), a similar magnitude of difference was not found in parenting variables (i.e., the variables considered to be mechanisms of change). Therefore, the exact ways in which therapeutic persuasiveness fosters change remain unknown. It would be helpful to conduct a powered trial that examines mediating effects, and particularly how program usage translates into mechanisms of change (e.g., parenting variables), which then translate into desired clinical outcomes (e.g., child behaviors). It could be that mechanisms of change operate differently depending on the digital program design.
Third, the impact of product design on user engagement may have implications for the way we understand the role of human support in digital interventions. Leaning on the supportive accountability and efficiency models of support (Mohr et al., 2011; Schueller et al., 2016), it could be argued that human support compensates for different failure points of a program, and that guidance is therefore less needed in automated programs with enhanced interactive qualities. A possible future direction would be to carry out a factorial study design examining the impact of both therapeutic persuasiveness (with/without) and human support (with/without) on outcomes. This method of investigation could enable us to examine the extent to which the quality of the product reduces the need for human support.
Finally, more than half of the participants who left their contact details and received the screening link (456/755) did not complete it. While such numbers are not unusual in digital health campaigns, it would be interesting to examine the characteristics of these people and whether reach can be extended using a specific campaign targeting those early non-responders.
4.2. Conclusions
The results of this study point to the importance of intentional product design and the use of proper conceptual frameworks of non-specific factors to better address user engagement challenges. There is considerable evidence that guided interventions are as effective as face-to-face therapy (Cuijpers et al., 2019); however, investing more effort in the proper development of unguided interventions is needed, because it might result in unguided interventions that are as effective as guided ones. As the field of digital health interventions matures and technology keeps moving forward, developing expertise in product design and addressing it well during the development phase becomes increasingly important.
Declaration of competing interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Acknowledgements
This study was partially funded by the University of Haifa. The funder had no involvement in the study or preparation of the manuscript.
References
- Abraham C., Michie S. A taxonomy of behavior change techniques used in interventions. Health Psychol. 2008;27(3):379–387. doi: 10.1037/0278-6133.27.3.379.
- Arnold D.S., O'Leary S.G., Wolff L.S., Acker M.M. The Parenting Scale: a measure of dysfunctional parenting in discipline situations. Psychol. Assess. 1993;5(2):137–144.
- Badami R., VaezMousavi M., Wulf G., Namazizadeh M. Feedback after good versus poor trials affects intrinsic motivation. Res. Q. Exerc. Sport. 2011;82(2):360–364. doi: 10.1080/02701367.2011.10599765.
- Baumel A., Faber K. Evaluating Triple P Online: a digital parent training program for child behavior problems. Cogn. Behav. Pract. 2017;25(4):538–543.
- Baumel A., Kane J.M. Examining predictors of real-world user engagement with self-guided eHealth interventions: analysis of mobile apps and websites using a novel dataset. J. Med. Internet Res. 2018;20(12). doi: 10.2196/11491.
- Baumel A., Muench F.J. Effort-optimized intervention model: framework for building and analyzing digital interventions that require minimal effort for health-related gains. J. Med. Internet Res. 2021;23(3). doi: 10.2196/24905.
- Baumel A., Yom-Tov E. Predicting user adherence to behavioral eHealth interventions in the real world: examining which aspects of intervention design matter most. Transl. Behav. Med. 2018;5(5):793–798. doi: 10.1093/tbm/ibx037.
- Baumel A., Faber K., Mathur N., Kane J.M., Muench F. Enlight: a comprehensive quality and therapeutic potential evaluation tool for mobile and web-based eHealth interventions. J. Med. Internet Res. 2017;19(3). doi: 10.2196/jmir.7270.
- Baumel A., Pawar A., Mathur N., Kane J.M., Correll C.U. Technology-assisted parent training programs for children and adolescents with disruptive behaviors: a systematic review. J. Clin. Psychiatry. 2017;78(8):e957–e969. doi: 10.4088/JCP.16r11063.
- Baumel A., Muench F., Edan S., Kane J.M. Objective user engagement with mental health apps: systematic search and panel-based usage analysis. J. Med. Internet Res. 2019;21(9). doi: 10.2196/14567.
- Baumel A., Edan S., Kane J.M. Is there a trial bias impacting user engagement with unguided e-mental health interventions? A systematic comparison of published reports and real-world usage of the same programs. Transl. Behav. Med. 2019;9(6):1020–1033. doi: 10.1093/tbm/ibz147.
- Burns G.L., Patterson D.R. Conduct problem behaviors in a stratified random sample of children and adolescents: new standardization data on the Eyberg Child Behavior Inventory. Psychol. Assess. 1990;2(4):391–397.
- Burns G.L., Patterson D.R. Normative data on the Eyberg Child Behavior Inventory and Sutter-Eyberg Student Behavior Inventory: parent and teacher rating scales of disruptive behavior problems in children and adolescents. Child Fam. Behav. Ther. 2001;23(1):15–28.
- Cuijpers P., Noma H., Karyotaki E., Cipriani A., Furukawa T.A. Effectiveness and acceptability of cognitive behavior therapy delivery formats in adults with depression: a network meta-analysis. JAMA Psychiatry. 2019;76(7):700–707. doi: 10.1001/jamapsychiatry.2019.0268.
- Dadds M.R., Sicouri G., Piotrowska P.J., Collins D.A., Hawes D.J., Moul C., Lenroot R.K., Frick P.J., Anderson V., Kimonis E.R. Keeping parents involved: predicting attrition in a self-directed, online program for childhood conduct problems. J. Clin. Child Adolesc. Psychol. 2019;48(6):881–893. doi: 10.1080/15374416.2018.1485109.
- Day J.J., Sanders M.R. Do parents benefit from help when completing a self-guided parenting program online? A randomized controlled trial comparing Triple P Online with and without telephone support. Behav. Ther. 2018;49(6):1020–1038. doi: 10.1016/j.beth.2018.03.002.
- Dombrowski S.U., Sniehotta F.F., Avenell A., Johnston M., MacLennan G., Araújo-Soares V. Identifying active ingredients in complex behavioural interventions for obese adults with obesity-related co-morbidities or additional risk factors for co-morbidities: a systematic review. Health Psychol. Rev. 2012;6(1):7–32.
- Doshi A., Patrick K., Sallis J.F., Calfas K. Evaluation of physical activity web sites for use of behavior change theories. Ann. Behav. Med. 2003;25(2):105–111. doi: 10.1207/S15324796ABM2502_06.
- Egger H.L., Angold A. Common emotional and behavioral disorders in preschool children: presentation, nosology, and epidemiology. J. Child Psychol. Psychiatry. 2006;47(3–4):313–337. doi: 10.1111/j.1469-7610.2006.01618.x.
- Eyberg S.M. Eyberg Child Behavior Inventory and Sutter-Eyberg Student Behavior Inventory-Revised: Professional Manual. Psychological Assessment Resources; 1999.
- Fleming T., Bavin L., Lucassen M., Stasiak K., Hopkins S., Merry S. Beyond the trial: systematic review of real-world uptake and engagement with digital self-help interventions for depression, low mood, or anxiety. J. Med. Internet Res. 2018;20(6). doi: 10.2196/jmir.9275.
- Frick P.J. The Alabama Parenting Questionnaire. Unpublished rating scale. University of Alabama; 1991.
- Graham A.K., Lattie E.G., Mohr D.C. Experimental therapeutics for digital mental health. JAMA Psychiatry. 2019;76(12):1223–1224. doi: 10.1001/jamapsychiatry.2019.2075.
- Hamari J., Koivisto J., Pakkanen T. Do persuasive technologies persuade? A review of empirical studies. Paper presented at: International Conference on Persuasive Technology; 2014; Padua, Italy.
- Hamilton V.E., Matthews J.M., Crawford S.B. Development and preliminary validation of a parenting self-regulation scale: "Me as a Parent". J. Child Fam. Stud. 2015;24(10):2853–2864.
- Higgins J.P., Altman D.G., Gøtzsche P.C., Jüni P., Moher D., Oxman A.D., Savović J., Schulz K.F., Weeks L., Sterne J.A. The Cochrane Collaboration's tool for assessing risk of bias in randomised trials. BMJ. 2011;343:d5928. doi: 10.1136/bmj.d5928.
- Holden G.W. Parenting: A Dynamic Perspective. Sage Publications; London: 2014.
- Kazak A.E., Hoagwood K., Weisz J.R., Hood K., Kratochwill T.R., Vargas L.A., Banez G.A. A meta-systems approach to evidence-based practice for children and adolescents. Am. Psychol. 2010;65(2):85. doi: 10.1037/a0017784.
- Keenan K., Wakschlag L.S., Danis B., Hill C., Humphries M., Duax J., Donald R. Further evidence of the reliability and validity of DSM-IV ODD and CD in preschool children. J. Am. Acad. Child Adolesc. Psychiatry. 2007;46(4):457–468. doi: 10.1097/CHI.0b013e31803062d3.
- Kelders S.M., Kok R.N., Ossebaard H.C., Van Gemert-Pijnen J.E. Persuasive system design does matter: a systematic review of adherence to web-based interventions. J. Med. Internet Res. 2012;14(6). doi: 10.2196/jmir.2104.
- Lattie E.G., Schueller S.M., Sargent E., Stiles-Shields C., Tomasino K.N., Corden M.E., Begale M., Karr C.J., Mohr D.C. Uptake and usage of IntelliCare: a publicly available suite of mental health and well-being apps. Internet Interv. 2016;4:152–158. doi: 10.1016/j.invent.2016.06.003.
- McMahon R.J., Forehand R.L. Helping the Noncompliant Child: Family-based Treatment for Oppositional Behavior. Guilford Press; New York, NY: 2005.
- Michie S., Abraham C., Whittington C., McAteer J., Gupta S. Effective techniques in healthy eating and physical activity interventions: a meta-regression. Health Psychol. 2009;28(6):690–701. doi: 10.1037/a0016136.
- MindTools Israeli Upgraded Version of Serafin. Retrieved September 14, 2022, from https://github.com/inonit/serafin/commits/feature/israeli-version.
- Mohr D.C., Cuijpers P., Lehman K. Supportive accountability: a model for providing human support to enhance adherence to eHealth interventions. J. Med. Internet Res. 2011;13(1). doi: 10.2196/jmir.1602.
- Muench F., Baumel A. More than a text message: dismantling digital triggers to curate behavior change in patient centered health interventions. J. Med. Internet Res. 2017;19(5). doi: 10.2196/jmir.7463.
- Nahum-Shani I., Shaw S.D., Carpenter S.M., Murphy S.A., Yoon C. Engagement in digital interventions. Am. Psychol. 2022;77(7):836. doi: 10.1037/amp0000983.
- Owens P.L., Hoagwood K., Horwitz S.M., Leaf P.J., Poduska J.M., Kellam S.G., Ialongo N.S. Barriers to children's mental health services. J. Am. Acad. Child Adolesc. Psychiatry. 2002;41(6):731–738. doi: 10.1097/00004583-200206000-00013.
- Perski O., Blandford A., West R., Michie S. Conceptualising engagement with digital behaviour change interventions: a systematic review using principles from critical interpretive synthesis. Transl. Behav. Med. 2016;7(2):254–267. doi: 10.1007/s13142-016-0453-1.
- Raaijmakers M.A., Posthumus J.A., Van Hout B.A., Van Engeland H., Matthys W. Cross-sectional study into the costs and impact on family functioning of 4-year-old children with aggressive behavior. Prev. Sci. 2011;12(2):192–200. doi: 10.1007/s11121-011-0204-y.
- Ritterband L.M., Thorndike F.P., Cox D.J., Kovatchev B.P., Gonder-Frederick L.A. A behavior change model for internet interventions. Ann. Behav. Med. 2009;38(1):18–27. doi: 10.1007/s12160-009-9133-4.
- Ritterband L.M., Bailey E.T., Thorndike F.P., Lord H.R., Farrell-Carnahan L., Baum L.D. Initial evaluation of an internet intervention to improve the sleep of cancer survivors with insomnia. Psycho-Oncology. 2012;21(7):695–705. doi: 10.1002/pon.1969.
- Sanders M., Woolley M. Parenting Tasks Checklist. PFSC; Brisbane: 2001.
- Sanders M.R., Baker S., Turner K.M. A randomized controlled trial evaluating the efficacy of Triple P Online with parents of children with early-onset conduct problems. Behav. Res. Ther. 2012;50(11):675–684. doi: 10.1016/j.brat.2012.07.004.
- Schueller S.M., Tomasino K.N., Mohr D.C. Integrating human support into behavioral intervention technologies: the efficiency model of support. Clin. Psychol. Sci. Pract. 2016.
- Sourander A., McGrath P.J., Ristkari T., Cunningham C., Huttunen J., Lingley-Pottie P., Hinkka-Yli-Salomäki S., Kinnunen M., Vuorio J., Sinokki A. Internet-assisted parent training intervention for disruptive behavior in 4-year-old children: a randomized clinical trial. JAMA Psychiatry. 2016;73(4):378–387. doi: 10.1001/jamapsychiatry.2015.3411.
- Webb T., Joseph J., Yardley L., Michie S. Using the internet to promote health behavior change: a systematic review and meta-analysis of the impact of theoretical basis, use of behavior change techniques, and mode of delivery on efficacy. J. Med. Internet Res. 2010;12(1). doi: 10.2196/jmir.1376.




