Translational Behavioral Medicine. 2022 Feb 22;12(5):693–701. doi: 10.1093/tbm/ibab165

If you personalize it, will they use it?: Self-reported and observed use of a tailored, internet-based pain self-management program

Lillian Reuman 1,2, Chelsey Solar 1, R Ross MacLean 3,4, Allison M Halat 2, Haseena Rajeevan 4, David A Williams 5, Alicia A Heapy 3,4, Matthew J Bair 6,7,8, Sarah L Krein 5,9, Robert D Kerns 3,4, Diana M Higgins 1,2
PMCID: PMC9154266  PMID: 35192703

Abstract

Little is known about how individuals with chronic pain use tailored internet-based interventions. This study is the first to compare self-reported skill module use to observed module access and to examine each in relation to tailored recommendations to access specific content. Participants (N = 58) enrolled in a 10-week trial of the Pain EASE program, a tailored internet-based intervention that includes 10 pain self-management skill modules. Participants completed a “Self-Assessment,” which was used to provide a “Personalized Plan” that encouraged accessing specific modules. Participants self-reported module use during weekly data collection telephone calls. Program log data were extracted to capture “observed” module use during the trial period. Findings indicated significantly greater self-reported use of the Pain EASE modules compared to observed access via log data. Further, log data revealed that participants accessed less than half of the modules recommended to them via tailoring.

Keywords: Pain, Cognitive behavioral therapy, Self-management, Tailoring


Implications

Practice: Patients’ level of participation in a tailored, self-guided, internet-based self-management intervention for chronic pain (CP) may be higher than the average number of sessions attended during in-person treatment and is consistent with a “dose” of treatment; patients’ use patterns, however, vary.

Policy: Multimethod assessment of internet intervention use is vital, and further inquiry with stakeholders, such as participants with CP, to understand preferences and strategies to improve use and engagement is also warranted.

Research: Future studies should explore person- and program-level factors that affect outcomes, whether tailoring (e.g., content matching) meaningfully affects outcomes, and the minimum or “sufficient” module use needed to affect outcomes.

Introduction

Chronic musculoskeletal pain is a leading cause of disability, affecting millions of veterans [1, 2]. In the context of the current opioid epidemic, a Veterans Health Administration (VHA) directive and other policy and clinical practice guidelines emphasize non-pharmacological approaches to treat chronic pain (CP) [3, 4]. Non-pharmacological approaches, including cognitive behavioral therapy for CP (CBT-CP), are often considered first-line treatments for chronic musculoskeletal pain such as chronic low back pain (cLBP) [3, 5]. Compared to treatment as usual, CBT-CP offers significant, positive effects on pain intensity for chronic musculoskeletal pain [6, 7]. Barriers to receiving non-pharmacological treatments, however, include limited access, cost, and time burden [8]. Strategies to facilitate and improve access to and engagement in multi-visit, non-pharmacological treatments such as CBT-CP are needed.

Internet-based interventions offer a viable, accessible alternative to traditional face-to-face interventions with demonstrated efficacy [9–12]. Internet-based CBT-CP interventions for CP conditions have been shown to reduce pain and related outcomes at short- and long-term follow-up [13, 14]. Although internet delivery can increase access to CBT-CP, individuals who begin internet-based interventions may infrequently use, or be unlikely to complete, these programs [11]. This low completion rate may be due to various factors, such as intervention design (e.g., not requiring skill module completion as a prerequisite for beginning the next skill module), lower commitment (e.g., fewer demands associated with the treatment), and no face-to-face interaction with a clinician or health coach [15].

One useful structured framework for exploring the design and development of internet-based interventions is Persuasive System Design (PSD). The PSD model outlines a framework to design and evaluate persuasive software solutions (i.e., systems that influence behavior) [16]. Composed of four broad categories and multiple elements, the PSD model addresses the primary task, human-computer dialogue, perceived system credibility, and social influence. Applying a particular element of this model (i.e., the primary task) to internet-based interventions such as Pain EASE can facilitate examination of intervention use. Design principles within the primary task category aim to support the user’s primary activities; commonly used principles include tailoring and self-monitoring. To enhance the use, and theoretically the outcomes, of internet-based interventions for CP conditions, “tailored” recommendations may be beneficial [17, 18]. Tailoring refers to using personal health data to create customized offerings and meet individual needs [18]. Within PSD, “tailoring” is a design principle that can help carry out the user’s main task (in the case of the Pain EASE program, participating in the intervention).

Although researchers have proposed multiple mechanisms (e.g., feedback and content matching) by which tailoring may improve engagement, content matching is thought to be “the essence of tailoring” ([19], p. 462). Content matching involves developing individualized treatment programs with content chosen based upon known determinants of the behavior (e.g., evidence-based processes of behavioral change, such as reported coping skill use) [19]. An algorithm or decision tree can be used to select the content that addresses these behavioral determinants. For example, individuals with cLBP who indicate that they often use relaxation skills to address increased pain would not be assigned content related to this skill, whereas those who report minimal relaxation skill use would be encouraged to receive this content.

Little is known, however, about participants’ use of tailored internet-based interventions and whether participants’ program use aligns with tailored recommendations. Methodological challenges, including self-report bias, may cloud results from studies of tailored intervention program use [20]. Specifically, participants may have difficulty self-reporting and describing past module content or skills use. Further, social desirability bias (e.g., wanting to appear favorable to healthcare providers) may lead participants to mischaracterize actual use. Accordingly, researchers have questioned the accuracy of self-reported use of internet-based interventions [21–23]. For internet-based interventions, this issue may be addressed via collection and use of log data (i.e., information about individuals’ interactions with a program [such as IP address, date, and time] that are passively and automatically collected and stored on an associated web server). Log data (also referred to as “observed use”) provide a unique opportunity to examine any discordance between participants’ self-reported program use—as is standardly measured—and documented program interaction. Understanding the discordance between self-reported versus log data can help researchers thoughtfully develop study methodology to ultimately understand how individuals use internet-based interventions and whether a particular program leads to meaningful change in outcomes of interest. If researchers only report one type of user data (e.g., self-report or log data), it may be difficult to interpret results from trials and draw comparisons across programs and studies. To the extent that the discordance suggests that one method is more reliable than another, however, researchers could prioritize a given method. Further, data collection methods that do not require additional efforts—such as log data—may be preferred.

Limited information is available regarding whether self-reported use is congruent with observed use obtained from log data, and whether self-reported and observed use align with tailored recommendations developed from a Self-Assessment. Accordingly, an aim of this paper was to provide insight into the program’s design features (i.e., tailored module recommendations) in the context of program use. The current study explored and compared self-reported pain self-management skill module access and log data obtained from the Pain EASE program, a tailored, internet-based intervention for veterans with cLBP. Given the aforementioned problems with self-reported data, we hypothesized that self-reported use would not match observed use obtained from log data; however, we did not develop module-specific hypotheses.

Methods

This is a secondary analysis of a single-arm feasibility and preliminary efficacy trial of the Pain EASE program [24, 25]. For a detailed CONSORT diagram describing participant recruitment, enrollment, withdrawal, and completion, see the primary outcome trial [24]. Baseline and posttreatment (i.e., 10 weeks post-baseline) data were collected via self-report questionnaires. Program log data and brief weekly data collection telephone calls to patients provided additional information and are described below.

Participants

Participants were veterans recruited from one northeastern VHA facility via flyers in patient care areas and a health education outreach table located in the hospital lobby. Veterans with an International Classification of Diseases (ICD)-9 diagnosis consistent with cLBP of moderate severity (as indicated by pain intensity numeric rating scale [NRS] scores of ≥4 out of 10 for a period of ≥3 months) were eligible to participate in the study. Veterans were excluded if any of the following criteria were present: (a) an acute or life-threatening medical condition (e.g., terminal cancer), (b) a psychiatric condition that could hinder participation (e.g., psychosis), or (c) planned surgery for low back pain scheduled to occur during study participation. Additional information regarding eligibility criteria is described in detail elsewhere [24].

The Pain EASE program

The Pain EASE program is a tailored, internet-based cognitive behavioral intervention for cLBP. Pain EASE includes 10 pain self-management skill modules, interactive activities, and additional resources. See Table 1 for a description of the skill modules. Within each module, participants receive a brief description of the topic with a combination of didactic text, audio, and/or graphics. Pain EASE also includes a “Tracking Your Progress” self-monitoring feature that facilitates input of personal numerical data such as pain intensity, sleep quality, and pedometer-recorded number of steps walked, as well as fillable self-monitoring forms to track and practice cognitive behavioral skills such as cognitive restructuring. The Pain EASE program is self-guided, as it does not involve contact with a clinician. Additional program details along with feasibility and preliminary efficacy data were previously reported [24].

Table 1.

Pain EASE skill module names and content

# | Module name | Module content
1 | Pain Education | Biopsychosocial model and pain self-management
2 | Setting Personal Goals | Goal setting tips for behavioral change
3 | Planning Meaningful Activities | Pleasant activity scheduling
4 | Physical Activity | Low impact exercise and stretching
5 | Relaxation | Deep breathing, imagery, and progressive muscle relaxation
6 | Developing Healthy Thinking Patterns | Identifying and changing unhelpful thoughts
7 | Pacing and Problem Solving | Pacing activities and problem-solving techniques
8 | Improving Sleep | Behavioral sleep tips
9 | Effective Communication | Anger management and communication styles
10 | Preparing for the Future | Treatment wrap-up

Personalized plan

Upon logging into the Pain EASE program for the first time, participants completed a “Self-Assessment,” which contained eight items from a brief version of the Chronic Pain Coping Inventory (CPCI) [26, 27]. In accordance with individual responses to CPCI items and an automated module mapping algorithm, participants were provided with a Personalized Plan comprised of tailored module recommendations (i.e., incorporating the “content matching” approach to tailoring), described below. In addition, all participants were automatically recommended four core modules that did not have corresponding CPCI items but that are important components of CBT-CP: Pain Education, Setting Personal Goals, Improving Sleep, and Preparing for the Future. Participants received suggestions for varying numbers of skill modules based on their Self-Assessment responses. The total number of modules recommended (range 4–10) was based upon endorsement of coping strategies on the CPCI; thus, participants could receive recommendations for a range of modules beyond the four core modules. Regardless of which modules were recommended in the “Personalized Plan,” participants were able to access all Pain EASE modules at any time and in any order during the 10-week intervention period. Participants were neither required to complete modules in a particular order nor required to complete a given number of modules per week or during the course of their 10 weeks of access to the program.

Measures and data sources

Demographics and clinical characteristics

At the in-person baseline visit, demographics and clinical characteristics were collected via self-report questionnaires and from participants’ electronic health record, with their permission. Variables such as race/ethnicity, age, sex, pain duration, pain intensity NRS ratings (where 0 = no pain and 10 = worst pain imaginable), and pain interference (collected via the West Haven-Yale Multidimensional Pain Inventory) [28] were included to describe the sample in the current study.

CPCI

The CPCI is a self-report measure designed to assess the use of coping strategies in response to pain [26, 27]. It demonstrates strong psychometric properties: adequate to excellent internal consistency and satisfactory test-retest stability [26]. In response to the question “During the past week, how many days did you use each of the following at least once in the day to cope with your pain?,” participants rated items (e.g., “focused on relaxing my muscles”) on an 8-point scale ranging from 0 to 7 days. CPCI items corresponded to the following measure subscales: Guarding, Resting, Asking for Assistance, Relaxation, Task Persistence, Exercise/Stretch, Seeking Social Support, and Coping Self-Statements. The single items addressing guarding, resting, and asking for assistance comprise the illness-focused coping subscale, and the single items addressing relaxation, task persistence, exercise/stretch, seeking social support, and coping self-statements comprise the wellness-focused coping subscale. Scores indicating limited (strategies used on three or fewer days per week) past-week adaptive strategy use on the CPCI items determined which module(s) were recommended to the participant in the Personalized Plan. For example, a participant who reported low use of Relaxation, Exercise/Stretch, and Coping Self-Statements would receive a Personalized Plan that included the four core Pain EASE modules and the Relaxation, Physical Activity, and Developing Healthy Thinking Patterns modules. Another participant reporting only low use of Exercise/Stretch would receive a Personalized Plan that included the four core Pain EASE modules and the Physical Activity module. See Table 2 for CPCI subscales and corresponding Pain EASE modules (i.e., the “module mapping algorithm”).

Table 2.

CPCI subscales and corresponding Pain EASE skill modules

CPCI subscale | Pain EASE module
(core) | Pain Education^a
(core) | Setting Personal Goals^a
Resting | Planning Meaningful Activities
Guarding; Resting; Exercise/Stretch | Physical Activity
Relaxation | Relaxation
Coping Self-Statements | Developing Healthy Thinking Patterns
Asking for Assistance; Task Persistence | Pacing and Problem Solving
(core) | Improving Sleep^a
Asking for Assistance; Seeking Social Support | Effective Communication
(core) | Preparing for the Future^a

CPCI = Chronic Pain Coping Inventory.

^a Recommended to all participants; did not correspond to a CPCI subscale.
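To make the module mapping algorithm concrete, the following is a minimal sketch of the content matching rule implied by Table 2 and the three-day threshold described above. All names are hypothetical (the Pain EASE source code is not published), and the rule for the illness-focused subscales is an assumption: the Methods state the threshold only for adaptive (wellness-focused) strategies, so high use is assumed here to trigger recommendations for the illness-focused subscales.

```python
# Hypothetical sketch of the Pain EASE "content matching" rule (Table 2).
# Not the published implementation; the illness-focused rule is assumed.

CORE_MODULES = [
    "Pain Education",
    "Setting Personal Goals",
    "Improving Sleep",
    "Preparing for the Future",
]

# Wellness-focused subscales: use on <= 3 days/week triggers the module(s).
WELLNESS_MAP = {
    "Exercise/Stretch": ["Physical Activity"],
    "Relaxation": ["Relaxation"],
    "Coping Self-Statements": ["Developing Healthy Thinking Patterns"],
    "Task Persistence": ["Pacing and Problem Solving"],
    "Seeking Social Support": ["Effective Communication"],
}

# Illness-focused subscales (assumed rule): use on > 3 days/week triggers.
ILLNESS_MAP = {
    "Resting": ["Planning Meaningful Activities", "Physical Activity"],
    "Guarding": ["Physical Activity"],
    "Asking for Assistance": ["Pacing and Problem Solving", "Effective Communication"],
}

THRESHOLD = 3  # days per week


def personalized_plan(cpci: dict[str, int]) -> list[str]:
    """Map CPCI item responses (0-7 days) to a list of recommended modules."""

    def add(plan: list[str], modules: list[str]) -> None:
        for module in modules:
            if module not in plan:
                plan.append(module)

    plan = list(CORE_MODULES)
    for subscale, modules in WELLNESS_MAP.items():
        if cpci.get(subscale, 7) <= THRESHOLD:   # limited adaptive strategy use
            add(plan, modules)
    for subscale, modules in ILLNESS_MAP.items():
        if cpci.get(subscale, 0) > THRESHOLD:    # frequent illness-focused coping
            add(plan, modules)
    return plan


# The example from the Methods: low use of Relaxation, Exercise/Stretch, and
# Coping Self-Statements yields the four core modules plus those three skills.
print(personalized_plan({
    "Relaxation": 1, "Exercise/Stretch": 2, "Coping Self-Statements": 0,
    "Task Persistence": 6, "Seeking Social Support": 5,
    "Resting": 2, "Guarding": 1, "Asking for Assistance": 0,
}))
```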

Log data

Pain EASE program log data, collected using linked SQL server databases, were extracted to capture “observed” use during the trial period. Individual participants’ log data were associated with a unique user ID (identification). Log data included individual participant activity (e.g., unique logins to the Pain EASE program, CPCI item responses, unique interactions with each skill module) with accompanying date- and time-stamps for each entry. For the current study, we extracted log data pertaining to CPCI item responses (used to create a tailored Personalized Plan) and module access. Additional user-provided data (e.g., responses to “Test Your Knowledge” module quizzes, pain intensity ratings, self-monitoring forms) were captured in this manner but are not included in the present study. Log data describing participant engagement with the Pain EASE program are described elsewhere [25]. Participants were neither informed about their log data nor confronted about possible differences between self-reported and log data.

Weekly data collection telephone call

Participants received brief (5–10 min) weekly calls from non-clinician study staff to assess several variables, such as difficulty accessing the program and program use during the previous week. To explore self-reported module use, staff asked, “Which skill or skills did you try this past week on the Pain EASE website?” The staff read each skill module name aloud to the participant and recorded a “yes” response in a study database if the participant responded affirmatively. Participants were also given an opportunity to share feedback about the Pain EASE program. At the conclusion of each telephone call, staff encouraged participants to log on in the coming week and try a new Pain EASE skill. Staff did not encourage intervention engagement based on the participant’s “Personalized Plan”; rather, they encouraged participants to access the Pain EASE program. The staff also reminded participants that they would call again the following week.

Procedure

All procedures were approved by the VA Connecticut Healthcare System Institutional Review Board. The study was registered at clinicaltrials.gov (NCT01918189).

Following telephone or in-person screening for eligibility criteria, participants attended an in-person visit to provide written informed consent, complete a baseline assessment, and receive instructions and a unique user ID for accessing the Pain EASE program. Upon accessing the Pain EASE program for the first time, participants completed a “Self-Assessment,” which informed their “Personalized Plan,” described above. Participants received instructions that they could choose to access any module in any order at any time. They were informed about which modules were recommended as part of their “Personalized Plan.” Participants were also notified that they would receive weekly telephone calls (described above) from a study staff member to confirm program access and collect additional limited data.

Data analytic plan

Prior to computing descriptive statistics, CPCI response data, weekly telephone call data, and log data were organized to facilitate interpretation and analysis. Using participants’ raw CPCI item responses and a module mapping algorithm, we created 10 binary variables to indicate whether a given module was recommended (0 = no, 1 = yes) to participants in their “Personalized Plan.” For weekly data collection telephone calls, we created a binary variable to indicate whether participants completed 0 calls or at least one call (0 = 0 calls, 1 = 1 or more calls) over the 10-week trial period. A telephone call was considered “completed” if the participant answered the telephone and responded to staff inquiries. Among participants who reported accessing at least one module on at least one weekly telephone call, we then created one binary variable per module to indicate whether participants reported—in any call—having accessed a given module during the past week (0 = no, 1 = yes).

As described above, raw log data extracted from the Pain EASE program indicated individual participant activity (e.g., number of logins), including an entry (with an accompanying date and time) for each instance a participant accessed a given module during the trial period. Accordingly, we merged these data to create one binary variable per module to indicate whether a participant ever accessed a given module (0 = no, 1 = yes) during the trial. See Table 3 for a description of variable sources, variable names, and operationalization.

Table 3.

Variable sources, names, and operationalization

Variable source | Variable | Operationalization
Log data | Recommended modules for Personalized Plan | Core modules and modules determined by the CPCI algorithm that appear in the participant’s plan
Log data | Modules accessed | Accessed the skill module over the course of the 10-week trial period
Self-report data (weekly telephone calls) | Modules used | Self-reported using a module on at least one call over the course of the 10-week trial period

CPCI = Chronic Pain Coping Inventory.
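As an illustration of the variable construction described above, the sketch below collapses event-level log rows and weekly call responses into the binary participant-by-module indicators summarized in Table 3. It is a minimal example using pandas; the column names and values are invented, not the study’s actual database schema.

```python
import pandas as pd

# Hypothetical raw log data: one row per module-access event, time-stamped.
log = pd.DataFrame({
    "participant_id": [1, 1, 1, 2],
    "module": ["Pain Education", "Pain Education", "Relaxation", "Pain Education"],
    "timestamp": pd.to_datetime(
        ["2015-03-02 10:15", "2015-03-09 18:40", "2015-03-09 18:55", "2015-03-04 09:05"]
    ),
})

# Collapse repeated events into one binary indicator per participant x module:
# 1 if the module was ever accessed during the trial, 0 otherwise.
accessed = (
    log.assign(flag=1)
       .pivot_table(index="participant_id", columns="module",
                    values="flag", aggfunc="max", fill_value=0)
)

# Hypothetical weekly telephone call data: one row per "yes" response.
calls = pd.DataFrame({
    "participant_id": [1, 1, 2],
    "call_week": [1, 2, 1],
    "module": ["Pain Education", "Relaxation", "Improving Sleep"],
})

# 1 if the participant reported using the module on at least one call.
reported = (
    calls.assign(flag=1)
         .pivot_table(index="participant_id", columns="module",
                      values="flag", aggfunc="max", fill_value=0)
)

print(accessed)
print(reported)
```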

Using the aforementioned variables, we calculated descriptive statistics (i.e., frequencies, percentages, averages, ranges) to characterize module recommendations, completed telephone calls, self-reported module use, and module access from log data in aggregate. We computed a variable to indicate the percentage of recommended modules accessed using the following equation: “number of recommended modules ever accessed according to log data” divided by “number of modules recommended in the Personalized Plan” for each participant. We also computed a variable to indicate the percentage of recommended modules reportedly accessed using the following equation: “number of recommended modules accessed according to telephone call data” divided by “number of modules recommended in the Personalized Plan” for each participant.
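Continuing the sketch above, the two percentage variables could be computed as follows. Here `recommended` stands in for the analogous binary participant-by-module matrix derived from each Personalized Plan; it is filled with placeholder values purely for illustration.

```python
# Placeholder recommendation matrix with the same shape as `accessed`;
# in the study this would come from the Personalized Plan log data.
recommended = accessed.copy()
recommended[:] = 1  # pretend every module was recommended to everyone

n_recommended = recommended.sum(axis=1)

# Percent of recommended modules ever accessed (log data).
pct_accessed = (recommended * accessed).sum(axis=1) / n_recommended * 100

# Percent of recommended modules reportedly used (telephone call data);
# reindex_like aligns the call matrix on the same participants and modules.
reported_aligned = reported.reindex_like(recommended).fillna(0)
pct_reported = (recommended * reported_aligned).sum(axis=1) / n_recommended * 100

print(pct_accessed)
print(pct_reported)
```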

Finally, we computed inferential statistics. Given that the normality assumption for parametric testing was not met, we used a Wilcoxon signed-rank test to compare total (i.e., overall) module access according to log data to total (i.e., overall) self-reported access according to telephone call data. We used a Mann–Whitney U-test to compare total module access via log data in individuals who self-reported module use in at least one weekly telephone call to individuals without telephone call data. To compare log data to self-reported access of recommended modules, we used McNemar tests.
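For illustration, the sketch below runs each of the named tests on small invented arrays using scipy and statsmodels; the counts are placeholders, not study data.

```python
import numpy as np
from scipy.stats import wilcoxon, mannwhitneyu
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical per-participant module counts.
total_logged = np.array([4, 2, 6, 1, 3, 5])    # modules accessed per log data
total_reported = np.array([7, 5, 8, 4, 6, 9])  # modules self-reported on calls

# Wilcoxon signed-rank test: paired comparison of logged vs. reported totals.
res_paired = wilcoxon(total_logged, total_reported)

# Mann-Whitney U test: logged totals for participants with vs. without call data.
with_calls = np.array([4, 2, 6, 1, 3, 5])
without_calls = np.array([1, 0, 2, 1])
res_groups = mannwhitneyu(with_calls, without_calls, alternative="two-sided")

# McNemar test for one module: 2x2 table of paired binary outcomes,
# e.g., rows = recommended (yes/no), columns = accessed (yes/no).
table = np.array([[30, 20],
                  [ 3,  5]])
res_mcnemar = mcnemar(table, exact=True)

print(res_paired.pvalue, res_groups.pvalue, res_mcnemar.pvalue)
```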

Results

Fifty-eight veterans with moderate to severe cLBP who enrolled in and completed baseline assessments in the Pain EASE feasibility and initial efficacy trial were included in analyses for the current study [24]. Participants were 93% male (n = 54) and 54.5 years old on average (SD = 11.9 years). A majority (60.3%) identified as White, and 32.8% identified as Black. Participants reported an average pain duration of 9.5 years (range: 0.67–47 years). Participants reported moderate to high past-week pain intensity (6.7 on a 0 = no pain to 10 = worst pain imaginable NRS; SD = 1.67, range: 4–10) and moderate pain interference at baseline (3.8 on a 0 = no interference to 6 = extreme interference scale on the 9-item WHYMPI Interference subscale; SD = 1.44; range 0.56–6.00). All 58 participants completed the Self-Assessment, and 55 participants accessed a Pain EASE module on at least one occasion during the trial.

Log data

Log data revealed that participants accessed an average of 3.41 modules (SD = 3.36; range: 0–10; mode = 1) during the trial. The first module, Pain Education, was accessed by 55 veterans (94.8%); however, access of the remaining modules ranged from 17.2% (n = 10; Effective Communication) to 39.7% (n = 23; Setting Personal Goals) of the sample. Table 4 presents log data reflecting the number and percentage of participants that accessed each module throughout the trial period.

Table 4.

Recommendations and use (self-report and log data) by module

Module | Recommended, n (%), N = 58 | Access (log data), n (%), N = 58 | Use (self-report), n (%), n = 42 | Recommended vs. access, McNemar p-value^a | Access vs. use, McNemar p-value^b
1. Pain Education^c | 58 (100) | 55 (94.8) | 40 (95.2) | .250 | 1.00
2. Setting Personal Goals^c | 58 (100) | 23 (39.7) | 36 (85.7) | <.001 | <.001
3. Planning Meaningful Activities | 35 (60.3) | 20 (34.5) | 33 (78.6) | .009 | .003
4. Physical Activity | 51 (87.9) | 21 (36.2) | 32 (76.2) | <.001 | .008
5. Relaxation | 23 (39.7) | 18 (31.0) | 33 (78.6) | .441 | <.001
6. Developing Healthy Thinking Patterns | 23 (39.7) | 13 (22.4) | 26 (61.9) | .100 | .001
7. Pacing and Problem Solving | 52 (89.7) | 15 (25.9) | 30 (71.4) | <.001 | .002
8. Improving Sleep^c | 58 (100) | 12 (20.7) | 27 (64.2) | <.001 | <.001
9. Effective Communication | 52 (89.7) | 10 (17.2) | 23 (54.8) | <.001 | .007
10. Preparing for the Future^c | 58 (100) | 11 (19.0) | 26 (61.9) | <.001 | .001

^a Compares the number of participants who were recommended a given module with the number who accessed that module according to log data during the 10-week trial period (N = 58).

^b Compares the number of participants who accessed a given module according to log data with the number who self-reported using that module during the 10-week trial period (n = 42).

^c Core modules recommended to all participants.

Self-reported module use via telephone calls

Forty-two participants (72%) reported using at least one module during at least one weekly telephone call. Self-reported module use data were missing for the remaining 16 (28%) participants (e.g., they did not take calls or did not report module use on a call). These 42 participants self-reported completing an average of 7.3 modules (SD = 3.07; range: 1–10) on weekly telephone calls during the 10-week trial period. Forty participants (95.2%) self-reported completing the first module (Pain Education) at some point during the trial period. The percentage of veterans self-reporting use of the other modules ranged from 54.8% (n = 23; Effective Communication) to 85.7% (n = 36; Setting Personal Goals).

Comparisons between overall self-reported module use and modules accessed according to log data (observed access) revealed that participants self-reported using significantly more modules than they accessed (7.29 vs. 4.05; Z = −4.16, p < .001). With the exception of the Pain Education module, McNemar tests indicated that module access according to log data was significantly lower than self-reported module use for all modules (p < .01). Table 4 presents data regarding the number and percentage of participants who self-reported ever using a given module during the trial period. Log data demonstrated that participants with self-reported module use data (n = 42) collected during completed telephone calls accessed significantly more modules over the course of the trial than participants for whom those telephone call data were missing (n = 16; 4.05 vs. 1.87, U = 209, p = .022).

Recommended modules

On average, participants were recommended 8.1 modules (range: 5–10). Nine participants were recommended all 10 modules. The four core modules (Pain Education, Setting Personal Goals, Improving Sleep, and Preparing for the Future) were recommended to all participants (n = 58), regardless of responses on the Self-Assessment. The Pacing and Problem Solving (n = 52), Effective Communication (n = 52), and Physical Activity (n = 51) modules were recommended most frequently. Table 4 presents log data regarding the number and percentage of participants who were recommended a given module, as well as p-values from McNemar tests (by module) comparing the proportion of participants who were recommended a given module with the proportion who accessed it.

Tailored module use congruence

On average, participants accessed 34.4% of the modules that were recommended to them (range: 0%–100%). Log data indicated that six participants (10.3%) accessed all modules assigned to them. According to log data, 17 participants (of the 49 participants assigned fewer than 10 modules) used at least one module that was not included in their Personalized Plan. Based upon data obtained during weekly telephone calls, participants (n = 42) self-reported, on average, completing 73.0% of the modules recommended to them (range: 11.1%–100%) during the trial period. Participants’ self-reported use of recommended modules was higher than observed access of recommended modules obtained from log data (for overall module use and for all individual modules). With the exception of three modules (Pain Education, Relaxation, and Developing Healthy Thinking Patterns), McNemar tests indicated a statistically significant difference between the proportion of participants who were recommended a given module and the proportion who accessed it (all ps < .01).

Discussion

Tailored internet-based interventions for cLBP present a unique opportunity to both customize patient offerings and track whether and how participants use these recommendations. To our knowledge, this is the first study to compare self-reported module use to observed module access and to examine both observed module access and self-reported module use in relation to tailored recommendations. Overall, the findings indicated significantly greater self-reported use of the Pain EASE modules compared to observed access with log data.

Several factors may have contributed to participants’ inflated reports of use. Participants may have been motivated to intentionally over-report use for social desirability reasons. Participants may also have falsely, but genuinely, believed that they accessed certain modules for a variety of reasons, such as familiarity with the terms from the Self-Assessment and Personalized Plan and/or extrapolation of content from one module to another. The accuracy of recall also depends on many factors, including the emotional salience of events, novelty, and implicit theories and schemas constructed over time [29, 30]. Prior research has shown that these factors can contribute to both over- and underreporting of healthcare utilization and access [31, 32]. Nevertheless, these findings highlight an apparent discrepancy between self-reported use and observed access. This key finding aligns with a recently published review by Parry et al. [23] regarding discrepancies between self-reported and log data reflecting digital media use. Given these discrepancies (i.e., under- and over-self-reporting), Parry et al. raise concerns about drawing conclusions (e.g., policy recommendations) from self-reported media use. Collectively, our primary finding suggests that sole reliance on participants’ self-reported use of an internet-based intervention likely overlooks passively captured data that may be more accurate. Participants’ self-reported use may unintentionally misrepresent actual use of internet-based interventions, which has implications for analyses that examine use in relation to outcome.

Both log data and self-report data revealed that participants accessed less than half of the modules that were recommended to them. Log data indicated that participants accessed an average of 3.41 modules. This level of participation may differ from the average number of sessions attended during in-person CBT-CP but is consistent with a “dose” of treatment reported for the CBT-CP content in Pain EASE [24, 33, 34]. Despite this modest use of recommended modules according to log data, both numerically and as a percentage of overall recommendations, participants benefited from the intervention [24]. Notably, prior research has demonstrated that a large proportion of improvement during 10 sessions of CBT-CP occurs during the first four sessions [35].

Although patterns of module use varied, the Pain Education module was accessed and used most frequently. This may have helped participants to reconceptualize pain as a manageable problem. This further highlights the potential positive impact of providing a rationale and explanation of the development and treatment of CP [36]. We acknowledge that the Pain Education module may have been accessed most frequently due to reasons other than its centrality in pain management (e.g., it was recommended to all participants and presented first). We did not examine the effect of accessing the Pain Education module on outcomes, as the trial was not powered to examine the relationship between specific module use and pain-related outcomes [25].

Further, nearly half of the participants self-reported that they used at least one module that was not recommended to them. The reason for this voluntary access remains unknown, but participants may have explored these additional modules out of perceived relevance or curiosity. The amount of time spent on a given module could not be derived from log data.

Lastly, we acknowledge that the majority of participants in the study identified as male. This is representative of the US Veteran population, from which this study drew participants. The gender breakdown in this sample, however, differs from other CP studies in which women are often over-represented.

The study has some limitations. Importantly, the study does not explore why the observed phenomena occurred or how they affect outcomes. The current study was underpowered to examine whether self-reported or observed use, in terms of specific modules (e.g., Physical Activity) or the overall number of modules used, mediated outcomes [25]. The findings also do not capture idiographic patterns of use (e.g., frequency, duration, or rate of use) or depth of engagement (e.g., did a participant review each content “page” of a module?). For example, we did not examine whether reporting style (e.g., self-reporting more modules than were actually accessed vs. reporting accurately) affected outcome. The method of capturing self-report data through weekly telephone calls has weaknesses, such as possible added participant burden. Additionally, we did not measure or control for social desirability, which may have played a role in participants’ self-reported use. The participants who provided self-report data (i.e., module use data from weekly telephone calls) represent only a portion of the full sample; self-reported use data were missing for 28% of participants, as previously described. Limitations of the method by which self-reported use was captured may have contributed to the primary finding.

These limitations invite opportunities for future research. First, multimethod assessment in the study of internet-based intervention use is vital. Specifically, the use of both self-report and log data will facilitate accurate comparisons within and across trials of internet-based interventions for CP and other conditions. Second, given that engagement in internet-based interventions is a dynamic process [37], it would be worthwhile to explore participants’ rate of use and the ways in which rate of use may be associated with user characteristics and clinical outcomes. Comparing outcomes for groups of participants who are more versus less accurate in their self-reports may be warranted. Additional predictors of use (e.g., computer literacy, illness burden) may also be worthy of consideration. Future studies with larger samples should examine these putative predictors, determine the minimum or “sufficient” module use needed to affect outcomes, and better characterize person- and program-level factors that affect outcomes. Qualitative studies exploring participants’ preferences and which aspects of the program were helpful may also be beneficial.

Although strategies such as tailored recommendations (i.e., content matching) may affect use, it remains unknown how use of a tailored internet-based program for cLBP compares to non-tailored internet-based programs. Martorella et al. [18] did not find significant differences between outcomes of tailored versus non-tailored internet-based interventions; however, the studies reviewed did not account for intervention use. Therefore, it remains unknown whether a discrepancy in actual and assumed use could account for this “null” finding between tailored versus non-tailored internet-based interventions. We used one form of tailoring (i.e., content matching); however, there are varied forms of tailoring and many PSD design principles. Additional tailoring approaches (e.g., tailoring the message to participants) may be worthwhile, and tailoring the design features available to a participant may be beneficial, as well. Intervention developers should also plan in advance to examine other aspects of PSD that are important in assessing a digital intervention.

Future studies that both include a comparison control (i.e., non-tailored) condition and use multiple methods to assess use would be best suited to consider whether tailoring is effective and whether tailoring (e.g., content matching) meaningfully affects outcomes. Further inquiry with stakeholders, such as participants with cLBP, to understand preferences and strategies to improve use/engagement is also warranted. For example, participants may recommend or benefit from added clinician contact, support, or updates to the program to include more frequent prompts/checks/reminders for accountability. Collectively, the findings encourage further development of strategies to improve use of self-directed, tailored internet-based interventions.

In conclusion, we found that participants self-reported greater use of the Pain EASE skill modules (compared to that observed from the log data) and accessed less than half of the modules that were recommended to them. Given these findings, researchers should thoughtfully consider how to measure and assess use of internet-based interventions. Without thoughtful consideration, it is difficult to capture if and how internet-based interventions are truly used, to compare interventions, and to understand possible effects of tailoring on intervention use.

Acknowledgments

This research was supported by a Veterans Health Administration Rehabilitation Research and Development Service Investigator Initiated Research award (RX000998-03). L. Reuman, PhD, was supported by National Institute of Mental Health grant T32MH019836.

Compliance With Ethical Standards

Conflicts of Interest: All authors declare that they have no conflicts of interest to disclose.

Human Rights: All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. The study was approved by the VA Connecticut Healthcare System Institutional Review Board.

Informed Consent: Informed consent was obtained from all individual participants included in the study.

Welfare of Animals: This article does not contain any studies with animals performed by any of the authors.

Transparency Statements

The study was registered at clinicaltrials.gov (NCT01918189). The analysis plan for this study was not formally pre-registered. De-identified data from this study are not available in a public archive. De-identified data and analytic code from this study may be made available (as allowable according to institutional IRB standards) by emailing the corresponding author. Some materials used to conduct the study, including study protocol and analysis plan, are available at: https://clinicaltrials.gov/ct2/show/NCT01918189?term=internet-based&cond=Low+Back+Pain&cntry=US&draw=2&rank=1.

References

• 1. Haskell SG, Heapy A, Reid MC, Papas RK, Kerns RD. The prevalence and age-related characteristics of pain in a sample of women veterans receiving primary care. J Womens Health (Larchmt). 2006;15(7):862–869.
• 2. Kerns RD, Otis J, Rosenberg R, Reid MC. Veterans’ reports of pain and associations with ratings of health, health-risk behaviors, affective distress, and use of the healthcare system. J Rehabil Res Dev. 2003;40(5):371–379.
• 3. Qaseem A, Wilt TJ, McLean RM, Forciea MA; Clinical Guidelines Committee of the American College of Physicians. Noninvasive treatments for acute, subacute, and chronic low back pain: a clinical practice guideline from the American College of Physicians. Ann Intern Med. 2017;166(7):514–530.
• 4. Simon LS. Relieving pain in America: a blueprint for transforming prevention, care, education, and research. J Pain Palliat Care Pharmacother. 2012;26(2):197–198.
• 5. Ehde DM, Dillworth TM, Turner JA. Cognitive-behavioral therapy for individuals with chronic pain: efficacy, innovations, and directions for research. Am Psychol. 2014;69(2):153–166.
• 6. Hoffman BM, Papas RK, Chatkoff DK, Kerns RD. Meta-analysis of psychological interventions for chronic low back pain. Health Psychol. 2007;26(1):1–9.
• 7. Williams AC, Eccleston C, Morley S. Psychological therapies for the management of chronic pain (excluding headache) in adults. Cochrane Database Syst Rev. 2012;11:CD007407.
• 8. Glombiewski JA, Hartwich-Tersek J, Rief W. Attrition in cognitive-behavioral treatment of chronic back pain. Clin J Pain. 2010;26(7):593–601.
• 9. Bender JL, Radhakrishnan A, Diorio C, Englesakis M, Jadad AR. Can pain be managed through the Internet? A systematic review of randomized controlled trials. Pain. 2011;152(8):1740–1750.
• 10. Barak A, Klein B, Proudfoot JG. Defining internet-supported therapeutic interventions. Ann Behav Med. 2009;38(1):4–17.
• 11. Macea DD, Gajos K, Daglia Calil YA, Fregni F. The efficacy of Web-based cognitive behavioral interventions for chronic pain: a systematic review and meta-analysis. J Pain. 2010;11(10):917–929.
• 12. El-Metwally A. Internet-based interventions for pain management: a systematic review of randomised controlled trials (RCTs) conducted from 2010 to 2014. J Public Health Epidemiol. 2015;7:170–182.
• 13. Moman RN, Dvorkin J, Pollard EM, et al. A systematic review and meta-analysis of unguided electronic and mobile health technologies for chronic pain: is it time to start prescribing electronic health applications? Pain Med. 2019;20(11):2238–2255.
• 14. Heapy AA, Higgins DM, Cervone D, Wandner L, Fenton BT, Kerns RD. A systematic review of technology-assisted self-management interventions for chronic pain: looking across treatment modalities. Clin J Pain. 2015;31(6):470–492.
• 15. Dowd H, Hogan MJ, McGuire BE, et al. Comparison of an online mindfulness-based cognitive therapy intervention with online pain management psychoeducation: a randomized controlled study. Clin J Pain. 2015;31(6):517–527.
• 16. Oinas-Kukkonen H, Harjumaa M. Persuasive systems design: key issues, process model, and system features. Commun Assoc Inf Syst. 2009;24:28.
• 17. Schubart JR, Stuckey HL, Ganeshamoorthy A, Sciamanna CN. Chronic health conditions and internet behavioral interventions. Comput Inform Nurs. 2011;29(2):81–92.
• 18. Martorella G, Boitor M, Berube M, Fredericks S, Le May S, Gélinas C. Tailored web-based interventions for pain: systematic review and meta-analysis. J Med Internet Res. 2017;19(11):e385.
• 19. Hawkins RP, Kreuter M, Resnicow K, Fishbein M, Dijkstra A. Understanding tailoring in communicating about health. Health Educ Res. 2008;23(3):454–466.
• 20. Danaher BG, Seeley JR. Methodological issues in research on web-based behavioral interventions. Ann Behav Med. 2009;38(1):28–39.
• 21. de Reuver M, Bouwman H. Dealing with self-report bias in mobile Internet acceptance and usage studies. Inf Manag. 2015;52(3):287–294.
• 22. Scharkow M. The accuracy of self-reported internet use: a validation study using client log data. Commun Methods Meas. 2016;10(1):13–27.
• 23. Parry DA, Davidson BI, Sewall CJR, Fisher JT, Mieczkowski H, Quintana DS. A systematic review and meta-analysis of discrepancies between logged and self-reported digital media use. Nat Hum Behav. 2021;5(11):1535–1547.
• 24. Higgins DM, Buta E, Williams DA, et al. Internet-based pain self-management for veterans: feasibility and preliminary efficacy of the Pain EASE program. Pain Pract. 2020;20(4):357–370.
• 25. Solar C, Halat AM, MacLean RR, et al. Predictors of engagement in an internet-based cognitive behavioral therapy program for veterans with chronic low back pain. Transl Behav Med. 2021;11(6):1274–1282.
• 26. Jensen MP, Turner JA, Romano JM, Strom SE. The Chronic Pain Coping Inventory: development and preliminary validation. Pain. 1995;60(2):203–216.
• 27. Jensen MP, Nielson WR, Turner JA, Romano JM, Hill ML. Readiness to self-manage pain is associated with coping and with psychological and physical functioning among patients with chronic pain. Pain. 2003;104(3):529–537.
• 28. Kerns RD, Turk DC, Rudy TE. The West Haven-Yale Multidimensional Pain Inventory (WHYMPI). Pain. 1985;23(4):345–356.
• 29. Bradburn NM, Rips LJ, Shevell SK. Answering autobiographical questions: the impact of memory and inference on surveys. Science. 1987;236(4798):157–161.
• 30. Ross M. Relation of implicit theories to the construction of personal histories. Psychol Rev. 1989;96(2):341–357.
• 31. Rhodes AE, Fung K. Self-reported use of mental health services versus administrative records: care to recall? Int J Methods Psychiatr Res. 2004;13(3):165–175.
• 32. Kaur N, Vedel I, El Sherif R, Pluye P. Practical mixed methods strategies used to integrate qualitative and quantitative methods in community-based primary health care research. Fam Pract. 2019;36(5):666–671.
• 33. Heapy AA, Higgins DM, Goulet JL, et al. Interactive voice response-based self-management for chronic back pain: the COPES noninferiority randomized trial. JAMA Intern Med. 2017;177(6):765–773.
• 34. Kerns RD, Burns JW, Shulman M, et al. Can we improve cognitive-behavioral therapy for chronic back pain treatment engagement and adherence? A controlled trial of tailored versus standard therapy. Health Psychol. 2014;33(9):938–947.
• 35. Burns JW, Nielson WR, Jensen MP, Heapy A, Czlapinski R, Kerns RD. Does change occur for the reasons we think it does? A test of specific therapeutic operations during cognitive-behavioral treatment of chronic pain. Clin J Pain. 2015;31(7):603–611.
• 36. Mar CM, Chabal C, Jacobson L, Mariano AJ, Vore A. Interactive computerized chronic pain education program. In: Proceedings of the AMIA Symposium; 2000:1077.
• 37. Yardley L, Spring BJ, Riper H, et al. Understanding and promoting effective engagement with digital behavior change interventions. Am J Prev Med. 2016;51(5):833–842.
