Abstract
Accessibility of evidence-based behavioral health interventions is one of the main challenges in health care, and effective treatment approaches are not always available for patients who would benefit from them. Digitization has dramatically changed the health care landscape. Although mHealth has shown promise in addressing issues of accessibility and reach, there is vast room for improvement. The integration of technical innovation and theory-driven development is a key concern. Digital solutions developed by industry alone often lack a clear theoretical framework, and the solutions are not properly evaluated to meet the standards of scientifically proven efficacy. mHealth interventions developed in academia, on the other hand, may be theory-driven but lack user friendliness and are commonly technically outdated by the time they are implemented in regular care, if they ever are. In an ongoing project aimed at scientific innovation, the mHealth Agile Development and Evaluation Lifecycle was used to combine strengths from both industry and academia in the development of ACTsmart – a smartphone-based Acceptance and Commitment Therapy treatment for adults with chronic pain. The present study describes the early development of ACTsmart, in the process of moving the product from alpha testing to a clinical trial ready solution.
Subject terms: Lifestyle modification, Quality of life
Introduction
For many health conditions, such as chronic pain, access to evidence-based behavioral health interventions is limited for geographical and financial reasons. Treatments that may restore or improve functioning are not always available. Digital solutions may contribute to more successful health care by increasing access and reach, with effects comparable to those of face-to-face treatment.1 Digital solutions also make it possible to collect both passive (system-generated) and self-reported continuous data unbiased by retrospective recall, which may enable the aggregation of key information for further development of both treatment models and technical solutions.
Most digital health interventions developed within industry lack theory-based strategies known to drive behavior change,2–4 evidence-based content,5–8 and systematic efficacy-testing.9 In contrast, interventions developed within academia are usually derived from behavioral theory and evaluated scientifically.10 Unfortunately, methods such as randomized controlled trials are time-consuming and costly, which implies that the academic approach is less flexible than commercial mHealth development processes that utilize repeated rapid cycles of fine-tuning based on user feedback.11 The lengthy process of efficacy testing also prevents rapid dissemination, and novel digital interventions developed within academia are therefore at risk of being technically outdated when implemented in routine health care. Moreover, poor retention rates are common in digital solutions12 and negatively influence effect sizes.13
Chronic pain affects 18–30% of the adult population.14 Many individuals display significant reductions in daily functioning and quality of life, and chronic pain is associated with elevated risks of insomnia, depression, suicidality, and anxiety.14,15 Cognitive, behavioral, and emotional factors play an important role in the maintenance of pain-related impairment,16 and there is broad consensus concerning the utility of behavioral treatments for chronic pain.17,18 A vast number of smartphone applications targeting persons suffering from chronic pain have been developed during the past decade. However – as concluded in a review presenting 111 different digital pain applications19 – “Pain apps appear to be able to promise pain relief without any concern for the effectiveness of the product, or for possible adverse effects of product use”. A more recent systematic review20 found that the problem raised in the 2011 review remains: the majority of apps are simplistic, lack involvement of health care professionals in their development, have not been rigorously tested for efficacy on pain-related health outcomes, and lack a theoretical or evidence-based framework.20 Acceptance and Commitment Therapy (ACT; as first described by Hayes et al.21) has strong empirical support for chronic pain.22,23 Despite this scientific support, access to ACT treatment is still limited, in part because treatment is usually provided face-to-face by therapists with specific training.
mHealth interventions can improve access to, and the reach of, effective behavioral pain treatment for those suffering from chronic pain. The digital format makes it possible to use automated and tailored messages, to send reminders and provide instant feedback, and to collect both passive and self-reported data unbiased by retrospective recall, which allows for close follow-up and research on treatment.24 To ascertain that the intervention is user friendly, effective, and up to date, there is a need for an approach that combines the strengths of industry’s rapid development process with academia’s theory-based approach and efficacy testing. The mHealth Agile Development & Evaluation Lifecycle25 provides a framework for rapid and sustainable mHealth development, evaluation, and implementation.
The overall objective of the present project was to develop a digital solution (ACTsmart) that is user friendly, flexible, effective, accepted by the users, promotes retention, and is designed for continuous data collection while following individual participants in their daily life. The development team applied the mHealth Agile Development & Evaluation Lifecycle to promote continuous development, fine-tuning of treatment content, and data-driven decision-making. Development was based on a series of short iterations: alpha testing with a small sample of end-users was followed by multiple re-iterations and re-testing until a satisfactory level was reached, after which the solution was beta tested on a sample of naïve end-users.
The present study describes phases one and two (alpha and beta testing) of the lifecycle. The aim of the study was twofold: firstly, to document the development process (alpha and beta testing) and the insights gained, making this knowledge available to other developers and scientists; secondly, to scientifically evaluate and optimize ACTsmart to make it a clinical trial ready solution. In the beta testing, the following feasibility aspects were evaluated: (1) whether ACTsmart was accepted by users (patients and therapists) as a means to deliver treatment, (2) to what extent patients interacted with the solution, and (3) whether treatment delivered via ACTsmart was practically and technically feasible for patients and therapists.
Results
For a description of the features and functionality of the ACTsmart patient interface, including screenshots, see Methods.
Alpha testing (phase 1)
In total, 15 individuals – nine chronic pain patients and six therapists – participated in the alpha testing of development phase 1. The alpha tests focused on user friendliness and comprehensibility. Figure 1 shows one of three patient personas generated from early user experience (UX) interviews, describing demographic information, needs and motivations, characteristics, and pain behaviors. Personas help designers build an understanding of the potential end users and keep them in mind during the design process.
Figure 2 shows paper sketches that were used in alpha testing of the patient interface. Each alpha test was documented and used to guide following iterations of the tested function.
Figure 3 shows an example of documented insights after testing a specific feature. The feature, content, or design was reiterated and re-tested until all alpha testers no longer found critical or blocking issues, and the development team considered all core features and functionality to be implemented. The solution was then considered a beta-ready product.
Beta testing (phase 2)
The vast majority of patients included in the beta testing reported being satisfied with the treatment content and found it sufficiently short, comprehensible, and written in accessible language. Many patients appreciated being able to listen to all written content. However, some individuals perceived the number of exercises as overwhelming and suggested reducing it. Being able to save and revisit one’s own written reflections was also suggested as highly meaningful for monitoring change and progress. Some beta patients appreciated the flexible format of the treatment program (being able to choose the order of work sections and exercises), while others preferred clearer direction on when to do what.
Many patients in the beta testing perceived the values section as the least satisfying part of the treatment, due to difficulties understanding its purpose and function, instructions, and/or exercises.
Most patients found the treatment motivating, but which feature or part of the treatment they reported as most motivating varied among the exercises, the values work, and the contact with a therapist.
Insights for future development, needed to move ACTsmart from a beta-ready product to a clinical trial ready product, were gathered by the development team based on the beta testing. Some of these were considered research questions to address in future studies, while others were used to guide immediate improvements. Examples of the latter were adding the possibility to see previous reflections on exercises, clarifying the expected level of work effort during each week of the treatment program, and restructuring and simplifying the values section. An example of a post-beta iteration, based on beta user insights, can be seen in Figs. 4 and 5: Fig. 4 shows the steps through the formulation of values as approved by the alpha testers, while Fig. 5 shows the same steps and the iterations that followed based on the post-beta interviews. The changes included restructuring and simplifying the texts as well as adding more steps in the formulation of life values, goals, and sub-goals.
Therapists involved in the beta testing described lacking a clear structure to guide the patients through the treatment. They also reported that a number of therapist tasks were time-consuming, such as searching for specific content referred to by a patient. Still, all therapists involved in the beta testing found ACTsmart to be mostly user friendly and wanted to continue using it as a clinical tool. Implications for future development based on the therapist beta testers included making treatment content searchable from the therapist interface, developing the possibility to send both chat messages and reminders from the same platform, and improving the clarity of the treatment manual. Detailed results of the individual post-treatment UX-interviews, as well as implications for further development and testing, are summarized in Table 1 (patients) and Table 2 (therapists).
Table 1.
Acceptability question | Beta user insight | Implications |
---|---|---|
Satisfaction with treatment content | • Positive to the text-based content (sufficiently short, comprehensible, not too psychological). • Appreciated being able to listen to all written content. • Most participants found the exercises helpful. • The amount of exercises was overwhelming. • Annoying to not be able to see own previous reflections on exercises. | • Develop possibility to see previous reflections on exercises. Alpha-test and re-iterate. • Run beta trial with reduced amount of exercises. • Run beta trial that starts with fewer exercises, dispensing more during the course of the treatment. |
Satisfaction with treatment format | • Some appreciated the free format and had no difficulties navigating through the intervention; others lacked structure and clarity. • One participant felt stressed by not knowing how much time or effort was required. • Some participants found it hard to know when to proceed from theory and exercises to values and exposure. | • Develop “bulletin board” on the start page with “tip of the week” and current treatment week. Alpha-test and re-iterate. • In the therapist treatment manual, clarify expected work effort during the current week, with instructions to inform the participants continuously. Alpha-test and re-iterate. |
Satisfaction with values section | • Most respondents experienced the values section as difficult initially. • Some participants found layout and examples helpful, while others found them even more confusing and unclear. • A few participants did not understand the connection between the test of prioritized life values and the later values work. • Some found formulation of values and the possibility to tick goals and steps motivating, while others did exposure/behavior change without ticking goals and steps in the application. | • Restructure and simplify values section. Alpha-test and re-iterate. |
Treatment’s ability to motivate | • Many found the exercises motivating. • Many found the values work helpful as a direction for change. • Some emphasized the possibility of receiving support from their therapist as motivating. • Most were positive to ACT as a form of treatment. | • No immediate development, alpha testing or iterations planned based on this feasibility area. |
Table 2.
Acceptability question | Beta user insight | Implications |
---|---|---|
Satisfaction with treatment format | • Lacked a clear path that guided participants through treatment. • Too much work directed at suggesting to the patients what to do next. | • Develop “bulletin board” on the start page with “tip of the week” and which treatment week it is. Alpha-test and re-iterate. |
Intent to continue use in clinical work | • All therapists wanted to continue using ACTsmart as a clinical tool, as single treatment contact and/or supplement to face-to-face treatment to give/monitor homework and/or to reduce the number of face-to-face sessions. | • No further development, alpha testing or iterations planned based on this feasibility area. • Implementation studies in various clinical settings with varying levels of clinician expertise. • Studies on a blended care approach combining face-to-face treatment with ACTsmart. |
Use of therapist time | • Time consuming to scroll through treatment content to answer content-specific questions. • Inefficient to send text messages from a different platform. • Inefficient to need to log in to see new patient activities in treatment; notification function suggested. | • Make content available and searchable from the therapist interface. Alpha-test and re-iterate. • Investigate regulatory possibilities to send text messages from the treatment platform. • Investigate regulatory possibilities to use push notifications (to patients). • Decision not to notify therapists of all treatment activity by push notifications, to protect work/life balance. |
User friendliness | • Therapists perceived design, format and most content as user-friendly for participants, but not the on-boarding process. • Therapists perceived the expected work effort for the patients as unclear. • Therapists suggested emphasizing that treatment progress requires patient engagement, e.g. repeated exercises. | • Develop process for on-boarding, including expected work load and level of engagement for patients. Alpha-test and re-iterate. Beta test in clinical trial. |
Supports communication with patients | • Sparse communication from some (low-activity) patients. • Lacked total overview of patient’s treatment activity due to immaturity of therapist interface, which complicated providing specific/relevant feedback. | • Further technical development of therapist interface. Alpha-test and re-iterate. • Rewrite treatment manual with actions to identify and reach inactive patients at an earlier stage. • Develop technical solution to flag non-compliant patients. • Alpha-test and re-iterate the above. • Beta-test in clinical trial. |
Of the 31 patients who started treatment during beta testing, 28 (90.3%) completed treatment according to our pre-defined completion criteria. Patients completed on average 84% of the treatment content, and 26 (84%) of the patients formulated values and reported behavior changes towards at least one value. On average, patients sent 6.2 chat messages to their therapist (range 0–18). Many patients requested access to the application and material post treatment, but only 6 (21%) of completers logged in during the 12 months they had access to the system after the end of the treatment period. Usage data is summarized in Table 3.
Table 3.
Feasibility area | Result |
---|---|
Usage, n = 31 | |
Completion, n (%) | 28 (90%) |
Completed treatment content^a, mean (median) | 84% (90%) |
Formulated values and reported valued action, n (%) | 26 (84%) |
Number of chat messages to therapist, mean [range] | 6.2 [0–18] |
Logins after end of treatment among completers, n (%) | 6 of 28 (21%) |
Practicality, n = 31 | Mean [range] |
Therapist minutes per patient | 127 [17–254] |
Text message reminders outside platform (per patient) | 1.94 [0–6] |
Therapist phone calls (per patient) | 0.29 [0–1] |
Technical feasibility | n (percent) |
No. of cases that required second-line^b support | 18 |
Regarding patient interface | 11 (61%) |
Regarding therapist interface | 7 (39%) |
Reason for support need | |
Technical bug | 12 (67%) |
User error | 4 (22%) |
Missing function in therapist interface | 2 (11%) |
Device used, n = 16 | n (percent) |
Smartphone only | 7 (44%) |
Smartphone and computer | 4 (25%) |
Smartphone and tablet | 2 (13%) |
^a Refers to completion of all available content, text-based or exercises.
^b First-line support was the supervising psychologist; second-line support was technical staff.
All therapists interacted with patients and treatment content in a regular browser on a desktop computer (no additional software was required). Double authentication was used via the therapist’s smartphone.
In total, therapists spent on average ~2 h per patient throughout the treatment, and on average ~16 min per patient/week (range 2–32). Therapists sent in total 66 text message reminders outside the platform (push notifications were not possible at this stage of the development) (m = 1.94, range 0–6) and made in total ten phone calls (m = 0.29, range 0–1) during the course of the treatment. The most commonly used device by patients was a smartphone (44%), and the majority of the work performed in treatment was carried out at home (57%). In the beta testing, patients reported no technical issues that were specific for a certain device or brand. For further data on practicality, see Table 3.
Discussion
ACTsmart was found feasible with regard to usage, acceptability, and practicality, which warrants subsequent studies to evaluate the effects of this digital intervention. Importantly, the feasibility results suggest that the structure of the intervention was well received, but they also provided extensive feedback on what could be further improved to meet the needs of the end users before moving forward to test the effects in clinical trials. Positive aspects of the treatment and the digital solution reported by the end users were the micro-learning format, the use of everyday language, the opportunity to choose whether to read or listen to content, and the acceptance of ACT when delivered digitally via smartphone. Aspects of the treatment reported as less satisfactory included the inability to save and revisit one’s own responses, as well as difficulties in planning how much work to put in and when to do what work. The amount (or dose) of material and exercises also seemed to be important: too little was perceived as insufficient, and too much as overwhelming. Moreover, although many of the patients in beta testing requested continued access to the treatment material after the regular treatment phase was over, few continued to log in after the active treatment period. The therapist contact, as well as a clear treatment time frame, seems to be important for patients’ acceptability of, and retention in, treatment. However, this should be addressed in further studies, for example by comparing guided and unguided treatment.
The intervention and delivery format were also well received among the therapists, who reported several benefits of using ACTsmart in their clinical work. For example, the digital format promoted continuous work with behavior change more clearly than usual face-to-face sessions. However, therapists involved in the present study wanted a more detailed therapist manual. They also reported that further development should make it easier to navigate within the treatment content, in order to respond more quickly to content-specific questions from patients.
In summary, many of the issues that came up in both alpha and beta testing were universal in nature rather than specific to chronic pain and associated symptoms, and should therefore also be considered when developing digital solutions aimed at behavior change or management of other chronic diseases.
The present study also illustrates the usefulness of the mHealth Agile Development & Evaluation Lifecycle, where the agile process allowed for continuous development of the technical solution and the intervention until satisfactory levels of acceptance and practicality were reached. The first draft of the treatment and technical solution outlined by the development team in the preparation phase changed radically during alpha testing, whereas beta testing mainly provided information regarding minor adjustments and fine-tuning, testing of practical and technical aspects that were not detectable in the alpha test phase, as well as ideas for future research questions. Going directly to a clinical trial with a solution that has not been tested by, or guided by, actual end users could result in a costly, time-consuming, and non-user-friendly solution at risk of poor retention rates. There is also a heightened risk that a potentially effective treatment is falsely rejected when poor results are due to a poor technical solution or an unsatisfactory delivery of treatment.
In line with previous findings, the present study suggests that smartphone treatment can reduce therapist time spent per patient.26–28 Furthermore, this treatment format can bring the behavior change program closer to the patients’ everyday lives as it prompts and supports both practice and use of target behaviors in real-life situations where the behavioral change takes place.29 Also, mHealth solutions can facilitate a better understanding of patient behaviors through continuous data collection with high ecological validity.
A few limitations of the present study should be considered. The alpha testing was based on a convenience sample of highly motivated patients and therapists, and the beta testing utilized a sample of self-referred patients with an interest in undergoing a digital self-management behavioral intervention. However, the self-referred beta sample had a pain duration similar to that of participants in previous research recruited from a tertiary pain clinic.30,31 Still, it remains unclear to what extent the feasibility results are generalizable to the broader pain population.
Preliminary efficacy testing is required to evaluate the effects of the intervention. In addition, larger clinical trials with different samples are needed to scientifically assess the utility and external validity of ACTsmart, as well as the change mechanisms (mediators and moderators of treatment outcome), including for whom this intervention may be useful. The required level of therapist competence and the need for therapist support to obtain satisfactory treatment effects should also be addressed in future research. Furthermore, cost-effectiveness and dose-response relationships are important research objectives.
Although large clinical trials (RCTs) have traditionally been the method of choice for efficacy testing, research methods compatible with agile development should be considered. Studies addressing the utility of specific treatment components, change mechanisms, and tailored interventions may benefit from utilizing the mHealth Agile Development & Evaluation Lifecycle in combination with evaluation methods such as single-case experimental designs, A/B testing, small-group iterations, and continuous UX testing. Further development and evaluation of the utility of the specific components and technical functions within ACTsmart requires a series of optimization studies, which may benefit from approaches that allow close monitoring of individual trajectories and of relationships between interventions and change processes. Such a bottom-up approach will facilitate the development of tailored or flexible treatment programs, where specific components are combined to address individual needs. Individual-level data may shed light on important aspects such as dose-response relations and mechanisms of change (moderated mediation), which are critical to empirically driven personalized treatment and improved treatment effects in a larger proportion of patients.
Based on feedback from therapists involved in the beta testing, future research should also explore the possibility of integrating ACTsmart with face-to-face treatment. Combining standard treatment with a digital intervention may have several benefits. For health care organizations, digital interventions may facilitate standardized care across therapists or health care units, as well as quality assurance through improved protocol adherence and minimal deviation from the empirically supported practice (therapist drift). Furthermore, digital solutions may enhance treatment compliance and support behavior change between sessions.
To conclude, the results and completion of the first three phases of the mHealth Agile Development & Evaluation Lifecycle have provided the opportunity to further optimize ACTsmart and to validate that the form of delivery is feasible and acceptable. These steps are crucial before moving ACTsmart to clinical trials that evaluate the effects and change mechanisms of the therapeutic intervention.
Methods
Procedure and design
Within the present study, the first three phases (0–2) of the mHealth Agile Development & Evaluation Lifecycle25 were completed. Phase zero is described briefly, while phases one and two are described in more detail. Figure 6 illustrates the lifecycle, adapted to the ACTsmart development project. In phase zero, the project identification phase, the agile and user-centered development method Lean UX32 and scrum methodology33 were adopted as project management approaches. Phases one and two were divided into five sprints. Each sprint had specific objectives and lasted ~30 days. The project leader compiled all proposed changes to the solution and prioritized among possible functionality enhancements. Early in phase one, the decision was made to build an independent, cloud-based, and flexible technical solution, rather than to further develop an existing platform, to maximize flexibility in the development. It was also decided to focus on the patient interface during phase one and to prepare the patient interface for beta testing in phase two. Consequently, the therapist interface was still rudimentary when the beta testing/clinical feasibility trial was conducted.
Alpha testing
Alpha testing is acceptance testing, preferably carried out with potential end-users. It involves simulating a real user environment by carrying out tasks that actual users might perform. Alpha testing is done to identify as many potential problems as possible before releasing a product for beta testing, and can be performed on early versions of the product such as paper sketches, versions that do not yet include all features, and versions that are still too unstable for reliable use. Alpha testing also provides preliminary end-user feedback at a stage when adjustments are still easy to make.
In phase one, nine individuals (age 19–65 years, 78% women) with complex chronic pain (potential end-users) were recruited for alpha testing from a tertiary care pain clinic after completing standard face-to-face ACT treatment. Alpha testing began with end-user interviews to inform personas and workflow ideas. Based on these nine interviews, three personas were generated. The interface was then built based on the personas and continuously and repeatedly tested with the alpha users. See Table 4 for a detailed description of the test flow.
Table 4.
Phase 0 | Innovative | Organizational |
---|---|---|
 | Identification of challenge. Envision of product. User research. Identify end-users. Identify target market. | Gather expertise and resources needed. Create project organization. Identify strategic/operative goals. Secure funding and resource allocation for phase 1. Define work model. |

Phase 1 | Organizational | Technical development | Content development | End-user insights | Alpha testing and re-iterations | Beta testing |
---|---|---|---|---|---|---|
Sprint 0 | Establishment of effect map with goals, target groups and user needs. | Identify basic needs and functionality. Set limitations for phase 1 (focus on patient interface). Make basic technical choices (independent, cloud based, flexible). | Interviews with clinicians specialized in chronic pain. | | | |
Sprint 1 | Definition of operationalized short- and long-term goals. Identification of external risk factors and strategies to prevent/manage the risks. | Decision to use Microsoft Azure cloud-based platform for both content and data collection. Multidisciplinary design studio session to generate ideas for functionalities. | Identification of types of content (text, audio, video). | Interviews with patients. Creation of protopersonas. | | |
Sprint 2 | | Production of HTML prototype for weekly patient-reported outcome measures. | Paper sketches for content structure. Production of first versions of written content. | | HTML prototype. Content structure sketches. | |
Sprint 3 | | Creation of draft for system navigation and system levels. | Prototype for audio content. Prototype for written content. Draft of first animated video. | | Weekly patient-reported outcome measures. Solution content. Solution structure. | |
Sprint 4 | | | Production of first draft of the structure and content of the value module. Creation of more animations. Recording of audio tracks. First drafts of treatment illustrations. | | First draft of value module. Animations. Audio tracks. Illustration drafts. | |
Transition period | | Creation of messaging function. Bug testing. | | Collection of system-generated data. Interviews with alpha testers. Interviews with clinicians. | First compound prototype of the application. Weekly patient-reported measures. | Planning of beta testing in clinical feasibility trial. |

Phase 2 | Organizational | Technical development | End-user insights | Alpha testing and re-iterations | Beta testing |
---|---|---|---|---|---|
Sprint 0 | Joint application for further funding. Resource allocation for phase 2. | First sketches of therapist interface. Creation of double authentication login. | | | Recruitment of end-users (adult pain patients) for beta testing. |
Sprint 1 | | Development of web prototype for therapist interface. Ongoing debugging. First- and second-line technical support. | | Draft sketches of therapist interface. Web prototype of therapist interface. | Start of 8-week beta testing (clinical feasibility trial). |
Sprint 2 | | Ongoing debugging. First- and second-line technical support. | Preparation of UX-survey and UX-interviews with beta testers. | Web prototype of therapist interface. | Ongoing beta testing (clinical feasibility trial). |
Sprint 3 | | Production of differentiated access for therapist interface. | | Web prototype of therapist interface. | In-depth beta UX interviews. Beta UX survey. |
Sprint 4 | | Workshop on handling of support and bugs. Completion of first version of therapist interface. | Compilation of beta UX insights. | Patient interface. | |
Transition period | | | | Patient interface. Therapist interface. | |
Six therapists were continuously involved in the alpha testing of the therapist interface, which was carried out in the same way as for the patient alpha testers. See Table 4 for a detailed description of the test flow. Figure 7 shows an example of the development of one of the features in the therapist interface throughout the different phases of alpha testing.
Beta testing
The alpha testing phase is followed by beta testing (phase 2), in which the intervention is tested by real users in a real environment. In the present study, the beta testing sample consisted of 31 individuals (87% women, aged 25–57 years) with chronic pain (mean pain duration 19.74 years, range 0.5–40 years) who had no prior exposure to behavioral treatment for their chronic pain condition and who actively requested participation in a research study on internet-delivered ACT. In the beta sample, seven out of 31 (23%) underwent a UX-interview and 16 out of 31 (52%) answered a UX-survey. Four therapists trained in ACT and behavioral treatment of chronic pain delivered treatment during the beta testing.
After both alpha (phase 1) and beta (phase 2) testing were completed, preparation of the next phase included minor adjustments identified during the sprints and debugging. See Table 4 for detailed information on the development and evaluation of phase 0–2.
Intervention
ACTsmart is based on the clinical treatment program developed at Karolinska Institutet and Karolinska University Hospital during the past 18 years. To date, the ACT-based treatment program for chronic pain patients has been evaluated in nine clinical trials, including five RCTs,31,34–37 with results illustrating the efficacy of the protocol. Improvements are primarily seen in functioning/disability and psychological flexibility, with mostly large effect sizes. Results are consistent across all studies, supporting the external validity of the findings.
The overarching goal of the ACTsmart treatment is to improve functioning and quality of life by increasing the participants’ psychological flexibility, defined as the ability to act in accordance with life values in the presence of pain and distress.38 Psychological flexibility is established by promoting greater acceptance of negative inner experiences,22,31 as well as by increasing the ability to observe, rather than become entangled with, thoughts and to engage in valued action.38,39
In treatment, participants were encouraged, through content, solution design, and by their therapist, to redirect behaviors and shift focus from avoiding or reducing pain and distress to acting in alignment with values in the presence of interfering pain and distress. Thus, patients were encouraged to engage in value-based exposure. Treatment content was divided into themes that roughly correspond to the core processes of ACT:38 acceptance, creating distance to thoughts, creating distance to emotional and bodily experiences, noticing and changing behaviors, self-observation, and values. Content was also categorized by type: educational, exercise, or value work. See Figs. 8 and 9 for screenshots of the patient interface.
Completion
The pre-defined criterion of treatment completion was the combination of (a) completing at least eight exercises, (b) having formulated at least one value, and (c) reporting behavioral actions towards that value. This criterion was chosen because exercises and values work are based on experiential learning and are therefore expected to be the active ingredients in treatment,40 in contrast to educational content, which has the purpose of preparing for experiential learning.
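Because the criterion is a simple conjunction of three conditions, it can be checked directly against per-patient usage records. The following is a minimal sketch of such a check, assuming hypothetical record fields (exercises_completed, formulated_values, valued_actions_reported) that are illustrative and not part of the published ACTsmart system.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PatientRecord:
    """Hypothetical per-patient usage record; field names are illustrative only."""
    exercises_completed: int = 0
    formulated_values: List[str] = field(default_factory=list)
    valued_actions_reported: int = 0

def is_completer(record: PatientRecord) -> bool:
    """Apply the pre-defined completion criterion:
    (a) at least eight exercises completed,
    (b) at least one formulated value, and
    (c) at least one reported behavioral action towards a value."""
    return (
        record.exercises_completed >= 8
        and len(record.formulated_values) >= 1
        and record.valued_actions_reported >= 1
    )

# Example: a patient with 10 exercises, one value, and two valued actions counts as a completer.
print(is_completer(PatientRecord(10, ["family"], 2)))  # True
```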
Intervention structure
The intervention was arranged in a micro-learning format, which is the combination of micro-content delivery and micro interactions that seeks to enable the user to gain knowledge and skills without risking information overload.41 Micro-content learning through a mobile device can give the user personal control and ownership of the learning process.42,43 All content in the treatment could be either read or listened to, in order to accommodate different preferences and needs.
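To illustrate how such micro-learning content might be represented, the sketch below models a content item with the themes and types described in the Intervention section and with both a text and an audio form, so that each item can be read or listened to. The class and field names are illustrative assumptions, not the actual ACTsmart data model.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Theme(Enum):
    # Themes roughly corresponding to the core ACT processes described in Methods.
    ACCEPTANCE = "acceptance"
    DISTANCE_TO_THOUGHTS = "creating distance to thoughts"
    DISTANCE_TO_EXPERIENCES = "creating distance to emotional and bodily experiences"
    BEHAVIOR_CHANGE = "noticing and changing behaviors"
    SELF_OBSERVATION = "self-observation"
    VALUES = "values"

class ContentType(Enum):
    EDUCATIONAL = "educational"
    EXERCISE = "exercise"
    VALUE_WORK = "value work"

@dataclass
class ContentItem:
    """One micro-learning unit: short text plus an audio rendition of the same content."""
    title: str
    theme: Theme
    content_type: ContentType
    text: str                        # written version, kept short to avoid information overload
    audio_url: Optional[str] = None  # pre-recorded narration so the item can be listened to

# Example item (illustrative content only).
item = ContentItem(
    title="Willingness in the presence of pain",
    theme=Theme.ACCEPTANCE,
    content_type=ContentType.EDUCATIONAL,
    text="A short text introducing acceptance...",
    audio_url="https://example.org/audio/acceptance-01.mp3",
)
```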
Data collection
The present study was part of an open-label pilot trial with one intake. The pilot trial was approved by the Regional Ethics Committee in Stockholm, Sweden, on 3 November 2015 (2015/1638-31/2) and followed the Declaration of Helsinki. The trial was registered at clinicaltrials.gov on 17 November 2017 with registration number NCT03344926. Participants provided written informed consent prior to enrollment in the study.
Data security
Treatment was delivered on a secure platform, and logging in required double authentication from both patients and therapists. Data storage differed depending on the type of data collected. Personal data that could be traced back to a specific individual was stored on a secure server. Anonymized data was stored in a cloud solution using a cloud storage provider certified according to the security and auditing standards ISO 27001 and SAS 70/SSAE 16, connected to the Privacy Shield principles of data processing, and thus compliant with the European legislation (GDPR) requirements for the processing of personal data. Security was also managed through various levels of access; for example, therapists only had access to data regarding their own patients, while research administrators had access to all patient data.
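The access-level logic described above can be summarized as a simple role-based check: therapists may read only their own patients’ data, while research administrators may read all patient data. The sketch below illustrates this idea under assumed role names and data structures; it is not the production implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class User:
    user_id: str
    role: str                      # "therapist" or "research_admin" (assumed role names)
    assigned_patients: List[str]   # patient IDs the therapist is responsible for

def accessible_patients(user: User, all_patient_ids: List[str]) -> List[str]:
    """Return the patient records a user may read.
    Therapists only see their own patients; research admins see all patients."""
    if user.role == "research_admin":
        return list(all_patient_ids)
    if user.role == "therapist":
        return [pid for pid in all_patient_ids if pid in user.assigned_patients]
    return []  # unknown roles get no access by default

# Example: a therapist assigned two patients out of four in the system.
patients = ["p01", "p02", "p03", "p04"]
therapist = User("t01", "therapist", ["p02", "p04"])
print(accessible_patients(therapist, patients))  # ['p02', 'p04']
```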
Alpha testing
Data collected during the alpha testing consisted of qualitative data on user behaviors and experiences. All tests were documented by the test leader and then brought back to the development team to guide further development.
Beta testing
In the beta testing, system-generated quantitative user data was extracted during the course of the treatment/testing. Qualitative UX-data was derived from interviews with seven patients (23%) and all four therapists post treatment/testing, as well as through a UX-survey completed by 16 (52%) of the patient beta testers. Data was compiled, organized into themes, and sorted into material for immediate re-iteration, future research questions to address, or future development.
The main purpose of the beta testing was to examine feasibility. The variables of interest were acceptability, usage, and practicality. Acceptability concerns the extent to which the intervention program and format of delivery are suitable, satisfying, and attractive to the users, both recipients (patients) and deliverers (therapists).44 Usage data provides information on the extent to which participants accept the treatment and the level at which they use the solution throughout the course of treatment. Practicality refers to the possibility of carrying out the intervention with existing means, resources, and circumstances and without outside intervention,44 as well as to technical feasibility.
Acceptability data was collected pre-treatment (during alpha testing) and post treatment (as part of the beta testing) in qualitative UX-interviews. The alpha testing investigated user friendliness, comprehensibility, and usability, and led to iterations and continuous retests until these aspects reached a satisfactory level for all alpha testers and the intervention was ready for beta testing. The beta testing investigated satisfaction with treatment content, satisfaction with treatment format, satisfaction with the values section, and the treatment’s ability to motivate the user. Qualitative acceptability data was also collected from therapists in a post-treatment focus interview covering satisfaction with treatment format, intent to continue use, use of therapist time, user friendliness, and whether the intervention supports communication with patients.
Usage data was generated by the solution in the form of completion rate, number of chat messages to the therapist, and post-treatment logins.
Practicality data was collected during treatment and included therapist minutes per patient, text message reminders, and phone calls from therapist to patient. Technical feasibility, as part of the practicality aspect, concerned how well the system worked in real-life situations, measured through the need for technical support and the reasons for that need. Post-treatment quantitative data was collected in a UX-survey investigating which devices were used and in what context patients engaged in treatment.
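To illustrate how system-generated usage and practicality measures of the kind reported in Table 3 can be derived, the sketch below aggregates hypothetical per-patient logs into completion rate, mean and median share of completed content, chat-message statistics, and mean therapist minutes. The field names and example values are illustrative assumptions, not study data.

```python
from statistics import mean, median
from typing import Dict, List

def summarize_feasibility(patients: List[Dict]) -> Dict[str, object]:
    """Aggregate per-patient logs into the kind of usage/practicality summary shown in Table 3.
    Each dict is assumed to hold: 'completed' (bool), 'content_share' (0-1),
    'chat_messages' (int), and 'therapist_minutes' (float)."""
    n = len(patients)
    completers = [p for p in patients if p["completed"]]
    return {
        "completion_rate": len(completers) / n,
        "content_mean": mean(p["content_share"] for p in patients),
        "content_median": median(p["content_share"] for p in patients),
        "chat_mean": mean(p["chat_messages"] for p in patients),
        "chat_range": (min(p["chat_messages"] for p in patients),
                       max(p["chat_messages"] for p in patients)),
        "therapist_minutes_mean": mean(p["therapist_minutes"] for p in patients),
    }

# Example with three illustrative patient logs (not study data).
logs = [
    {"completed": True, "content_share": 0.9, "chat_messages": 5, "therapist_minutes": 110},
    {"completed": True, "content_share": 0.8, "chat_messages": 12, "therapist_minutes": 150},
    {"completed": False, "content_share": 0.4, "chat_messages": 1, "therapist_minutes": 60},
]
print(summarize_feasibility(logs))
```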
Reporting summary
Further information on research design is available in the Nature Research Reporting Summary linked to this article.
Acknowledgements
We would like to thank all the participants in all phases of the development process. We also want to thank the team at KIB: Sofia Samuelsson, Mats Ronquist, Mikael Jergefeldt, Åsa Jenslin, David Hansson, Fredrik Persson, and Erik Svallingson. Their dedication, knowledge, and curiosity made ACTsmart possible. In addition, we are grateful to Brjánn Ljótsson for generously allowing us to use his digital platform for baseline data collection. This work was partly supported by a grant from AFA insurance as part of a larger research program on chronic pain. Furthermore, Wicksell was provided funding through the regional agreement on medical training and clinical research (ALF) between Stockholm City Council and Karolinska Institutet. The funders had no influence on design, recruitment, analyses or manuscript. Open access funding provided by Karolinska Institute.
Author contributions
C.G. and J.R. collected all data. C.G., J.R., R.K.W. and V.Z. designed the study. C.G. compiled and processed data. C.G. prepared the manuscript with substantial contributions from L.H., J.R., L.E.S., R.K.W. and V.Z. All authors approved the final version of the manuscript.
Data availability
The dataset generated during the current study is not publicly available due to restrictions in the ethical permit, but may be available from the corresponding author on reasonable request.
Code availability
Given the descriptive nature of the analyses no custom code or mathematical algorithm was used in this study.
Competing interests
The authors declare no competing interests.
Footnotes
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Supplementary information is available for this paper at 10.1038/s41746-020-0228-4.
References
- 1. Andersson G, Cuijpers P, Carlbring P, Riper H, Hedman E. Guided Internet-based vs. face-to-face cognitive behavior therapy for psychiatric and somatic disorders: a systematic review and meta-analysis. World Psychiatry. 2014;13:288–295. doi: 10.1002/wps.20151.
- 2. Cowan LT, et al. Apps of steel: are exercise apps providing consumers with realistic expectations?: a content analysis of exercise apps for presence of behavior change theory. Health Educ. Behav. 2012;40:133–139. doi: 10.1177/1090198112452126.
- 3. Riley WT, et al. Health behavior models in the age of mobile interventions: are our theories up to the task? Transl. Behav. Med. 2011;1:53–71. doi: 10.1007/s13142-011-0021-7.
- 4. Neuhauser L, Kreps GL. eHealth communication and behavior change: promise and performance. Soc. Semiotics. 2010;20:9–27. doi: 10.1080/10350330903438386.
- 5. Hebden L, Cook A, van der Ploeg HP, Allman-Farinelli M. Development of smartphone applications for nutrition and physical activity behavior change. JMIR Res. Protoc. 2012;1:e9. doi: 10.2196/resprot.2205.
- 6. Backinger CL, Augustson EM. Where there’s an app, there’s a way? Am. J. Prev. Med. 2011;40:390–391. doi: 10.1016/j.amepre.2010.11.014.
- 7. Breton ER, Abroms LC, Fuemmeler BF. Weight loss—there is an app for that! But does it adhere to evidence-informed practices? Transl. Behav. Med. 2011;1:523–529. doi: 10.1007/s13142-011-0076-5.
- 8. Abroms LC, Padmanabhan N, Thaweethai L, Phillips T. iPhone apps for smoking cessation: a content analysis. Am. J. Prev. Med. 2011;40:279–285. doi: 10.1016/j.amepre.2010.10.032.
- 9. Azar KMJ, et al. Mobile applications for weight management: theory-based content analysis. Am. J. Prev. Med. 2013;45:583–589. doi: 10.1016/j.amepre.2013.07.005.
- 10. Payne HE, Lister C, West JH, Bernhardt JM. Behavioral functionality of mobile apps in health interventions: a systematic review of the literature. JMIR mHealth uHealth. 2015;3:e20. doi: 10.2196/mhealth.3335.
- 11. Baker TB, Gustafson DH, Shah D. How can research keep up with eHealth? Ten strategies for increasing the timeliness and usefulness of eHealth research. J. Med. Internet Res. 2014;16:e36. doi: 10.2196/jmir.2925.
- 12. Chen J, Cade JE, Allman-Farinelli M. The most popular smartphone apps for weight loss: a quality assessment. JMIR mHealth uHealth. 2015;3:e104. doi: 10.2196/mhealth.4334.
- 13. Thomas SL, Hyde J, Karunaratne A, Kausman R, Komesaroff PA. “They all work…when you stick to them”: a qualitative investigation of dieting, weight loss, and physical exercise, in obese individuals. Nutr. J. 2008;7:34. doi: 10.1186/1475-2891-7-34.
- 14. Breivik H, Collett B, Ventafridda V, Cohen R, Gallacher D. Survey of chronic pain in Europe: prevalence, impact on daily life, and treatment. Eur. J. Pain. 2006;10:287–333. doi: 10.1016/j.ejpain.2005.06.009.
- 15. Tunks, E. R., Crook, J. & Weir, R. Epidemiology of chronic pain with psychological comorbidity: prevalence, risk, course, and prognosis. Vol. 53, pp. 224–234 (2008).
- 16. Kerns RD, Sellinger J, Goodin BR. Psychological treatment of chronic pain. Annu. Rev. Clin. Psychol. 2011;7:411–434. doi: 10.1146/annurev-clinpsy-090310-120430.
- 17. Williams, A. C., Eccleston, C. & Morley, S. Psychological therapies for the management of chronic pain (excluding headache) in adults. Cochrane Database Syst. Rev. 11, 10.1002/14651858.CD007407.pub3 (2012).
- 18. Jensen MP, Turk DC. Contributions of psychology to the understanding and treatment of people with chronic pain: why it matters to ALL psychologists. Am. Psychologist. 2014;69:105. doi: 10.1037/a0035641.
- 19. Rosser BA, Eccleston C. Smartphone applications for pain management. J. Telemed. Telecare. 2011;17:308–312. doi: 10.1258/jtt.2011.101102.
- 20. Lalloo AC, Jibb NL, Rivera NJ, Agarwal NA, Stinson NJ. “There’s a pain app for that”: review of patient-targeted smartphone applications for pain management. Clin. J. Pain. 2015;31:557–563. doi: 10.1097/AJP.0000000000000171.
- 21. Hayes, S. C., Strosahl, K. D. & Wilson, K. G. Acceptance and Commitment Therapy: An Experiential Approach to Behavior Change. 1st edn (Guilford Press, 1999).
- 22. McCracken LM, Morley S. The psychological flexibility model: a basis for integration and progress in psychological approaches to chronic pain management. J. Pain. 2014;15:221–234. doi: 10.1016/j.jpain.2013.10.014.
- 23. McCracken LM, Vowles KE. Acceptance and commitment therapy and mindfulness for chronic pain. Am. Psychol. 2014;69:178–187. doi: 10.1037/a0035623.
- 24. Andersson G. Internet-delivered psychological treatments. Annu. Rev. Clin. Psychol. 2016;12:157–179. doi: 10.1146/annurev-clinpsy-021815-093006.
- 25. Wilson, K., Bell, C., Wilson, L. & Witteman, H. Agile research to complement agile development: a proposal for an mHealth research lifecycle. npj Digital Med. 1, 10.1038/s41746-018-0053-1 (2018).
- 26. Andrews G, et al. Computer therapy for the anxiety and depression disorders is effective, acceptable and practical health care: an updated meta-analysis. J. Anxiety Disord. 2018;55:70–78. doi: 10.1016/j.janxdis.2018.01.001.
- 27. Andrews G, Davies M, Titov N. Effectiveness randomized controlled trial of face to face versus internet cognitive behaviour therapy for social phobia. Aust. N. Z. J. Psychiatry. 2011;45:337–340. doi: 10.3109/00048674.2010.538840.
- 28. Andrews G, Williams AD. Up-scaling clinician assisted internet cognitive behavioural therapy (iCBT) for depression: a model for dissemination into primary care. Clin. Psychol. Rev. 2015;41:40–48. doi: 10.1016/j.cpr.2014.05.006.
- 29. Lindner P, Ivanova E, Ly K, Andersson G, Carlbring P. Guided and unguided CBT for social anxiety disorder and/or panic disorder via the Internet and a smartphone application: study protocol for a randomised controlled trial. Trials. 2013;14:437. doi: 10.1186/1745-6215-14-437.
- 30. Kemani MK, Zetterqvist V, Kanstrup M, Holmström L, Wicksell RK. A validation of the pain interference index in adults with long-standing pain. Acta Anaesthesiol. Scand. 2016;60:250–258. doi: 10.1111/aas.12599.
- 31. Wicksell RK, Olsson GL, Hayes SC. Psychological flexibility as a mediator of improvement in Acceptance and Commitment Therapy for patients with chronic pain following whiplash. Eur. J. Pain. 2010;14:1059.e1–1059.e11. doi: 10.1016/j.ejpain.2010.05.001.
- 32. Gothelf, J. Lean UX: Applying Lean Principles to Improve User Experience (O’Reilly Media, Inc., 2013).
- 33. Schwaber, K. & Beedle, M. Agile Software Development with Scrum. Vol. 1 (Prentice Hall, 2002).
- 34. Wicksell RK, Ahlqvist J, Bring A, Melin L, Olsson GL. Can exposure and acceptance strategies improve functioning and life satisfaction in people with chronic pain and Whiplash-Associated Disorders (WAD)? A randomized controlled trial. Cogn. Behav. Ther. 2008;37:169–182. doi: 10.1080/16506070802078970.
- 35. Wicksell RK, Melin L, Lekander M, Olsson GL. Evaluating the effectiveness of exposure and acceptance strategies to improve functioning and quality of life in longstanding pediatric pain – a randomized controlled trial. Pain. 2009;141:248–257. doi: 10.1016/j.pain.2008.11.006.
- 36. Wicksell RK, Olsson GL, Hayes SC. Mediators of change in Acceptance and Commitment Therapy for pediatric chronic pain. Pain. 2011;152:2792–2801. doi: 10.1016/j.pain.2011.09.003.
- 37. Wicksell RK, et al. Acceptance and commitment therapy for fibromyalgia: a randomized controlled trial. Eur. J. Pain. 2013;17:599–611. doi: 10.1002/j.1532-2149.2012.00224.x.
- 38. Hayes SC, Luoma JB, Bond FW, Masuda A, Lillis J. Acceptance and Commitment Therapy: model, processes and outcomes. Behav. Res. Ther. 2006;44:1–25. doi: 10.1016/j.brat.2005.06.006.
- 39. Vowles KE, McCracken LM. Acceptance and values-based action in chronic pain: a study of treatment effectiveness and process. J. Consult. Clin. Psychol. 2008;76:397–407. doi: 10.1037/0022-006X.76.3.397.
- 40. Morris E. Learning ACT: An Acceptance and Commitment Therapy Skills Training Manual for Therapists, by Jason B. Luoma, Steven C. Hayes and Robyn D. Walser (New Harbinger Publications, 2007) [book review]. Behav. Cogn. Psychother. 2008;36:634–635. doi: 10.1017/S1352465808004670.
- 41. Bruck, P. A., Motiwalla, L. & Foerster, F. Mobile learning with micro-content: a framework and evaluation. In BLED 2012 Proceedings, 2 (2012).
- 42. Wong LH. A learner-centric view of mobile seamless learning. Br. J. Educ. Technol. 2012;43:E19–E23. doi: 10.1111/j.1467-8535.2011.01245.x.
- 43. Ally, M. Mobile Learning: Transforming the Delivery of Education and Training (Athabasca University Press, 2009).
- 44. Bowen DJ, et al. How we design feasibility studies. Am. J. Prev. Med. 2009;36:452–457. doi: 10.1016/j.amepre.2009.02.002.