Author manuscript; available in PMC: 2014 Oct 1.
Published in final edited form as: Health Educ Behav. 2013 Oct;40(1 Suppl). doi: 10.1177/1090198113496787

Exploring Behavioral Markers of Long-term Physical Activity Maintenance: A Case Study of System Identification Modeling within a Behavioral Intervention

Eric B Hekler 1, Matthew P Buman 1, Nikhil Poothakandiyil 1, Daniel E Rivera 1, Joseph M Dzierzewski 2, Adrienne Aiken Morgan 3, Christina S McCrae 2, Beverly L Roberts 2, Michael Marsiske 2, Peter R Giacobbi Jr 4
PMCID: PMC3806212  NIHMSID: NIHMS493611  PMID: 24084400

Abstract

Efficacious interventions to promote long-term maintenance of physical activity are not well understood. Engineers have developed methods to create dynamical system models for modeling idiographic (i.e., within-person) relationships within systems. In behavioral research, dynamical systems modeling may assist in decomposing intervention effects and identifying key behavioral patterns that may foster behavioral maintenance. The Active Adult Mentoring Program (AAMP) was a 16-week randomized controlled trial of a group-based, peer-delivered physical activity intervention targeting older adults. Time intensive (i.e., daily) physical activity reports were collected throughout the intervention. We explored differential patterns of behavior among participants who received the active intervention (N=34; 88% women, 64.1±8.3 years of age) and either maintained 150 minutes/week of moderate to vigorous intensity physical activity (MVPA; n=10) or did not (n=24) at 18 months following the intervention period. We used dynamical systems modeling to explore whether key intervention components (i.e., self-monitoring, access to an exercise facility, behavioral initiation training, behavioral maintenance training) and theoretically plausible behavioral covariates (i.e., indoor vs. outdoor activity) predicted differential patterns of behavior among maintainers and non-maintainers. We found that maintainers took longer to reach a steady-state of MVPA. At week 10 of the intervention, non-maintainers began to drop whereas maintainers increased MVPA. Self-monitoring, behavioral initiation training, % outdoor activity, and behavioral maintenance training, but not access to an exercise facility, were key variables that explained patterns of change among maintainers. Future studies should be conducted to systematically explore these concepts within a priori idiographic (i.e., N-of-1) experimental designs.

Keywords: physical activity, maintenance, dynamical systems, system identification


Physical activity is a key behavior for preventing chronic diseases (Schroeder, 2007). Although advances have been made in best methods for promoting physical activity (A.C. King, Buman, & Hekler, in press; Marcus et al., 2006), there is still much we do not understand. There is a strong need to develop physical activity interventions that promote not just physical activity during an intervention but also maintenance of physical activity after the intervention ceases. Previous research on behavioral maintenance has relied largely on group mean-difference analyses (Fjeldsoe, Neuhaus, Winkler, & Eakin, 2011). Specifically, results from a recent systematic review suggested that trials that achieved maintenance (i.e., a statistically significant difference between intervention and control at the end of an intervention that persisted during the follow-up period) tended to focus more on women, used more pretrial screening, and included more intensive interventions (e.g., longer duration, face-to-face contact, more intervention components, and follow-up prompts) (Fjeldsoe et al., 2011).

Although the overall average in an intervention group may indicate significant differences relative to control, wide variability often exists that is largely lost when focusing only on mean-level differences (e.g., see: Buman, Hekler, Bliwise, & King, 2011). For example, even among intervention trials that promote 18-month maintenance relative to control based on mean-difference analyses (Buman, Giacobbi Jr., et al., 2011; A. C. King et al., 2007), there is wide variability of success and failure within both study arms. Research methods from other disciplines (e.g., engineering, computer science) are increasingly being explored within behavioral science (Collins, Murphy, & Strecher, 2007). In particular, a wide array of engineering methods, broadly labeled system identification, are used to create dynamical system models (Ljung, 1999), which have been proposed to better understand the impact of intervention components rather than intervention packages (Rivera, Pew, & Collins, 2007).

Because dynamical systems modeling has historically not been applied in behavioral settings, some considerations important to behavioral applications, such as the need to model the responses of multiple participants under difficult experimental and measurement conditions, may seem to limit the usefulness of the method. However, systems modeling is based on the same principles as regression; therefore, lessons from behavioral science (e.g., mixed-model analyses for controlling for clustering effects) can be used to address these issues and thus motivate future research (see the limitations section). These current limitations aside, dynamical systems modeling affords new opportunities beyond what traditional statistical approaches allow, including: a) a much wider range of mathematically modeled responses using low-order differential equations that are estimated from intensive longitudinal data, including responses that are often observed in behavioral interventions but are difficult to model via traditional behavioral science analytic techniques (see: Glass, Wilson, & Gottman, 1975; Hayes, Barlow, & Nelson-Gray, 1999; and see Figures 1 and 2 for examples from system identification); b) an explicit emphasis on creating mathematical models for understanding how a measured system functions, which results in strong internal validity for understanding exact mechanisms of change and dose-response relationships for an individual utilizing a behavioral intervention; and c) the direct utilization of these mathematical models in the creation of “closed-loop control systems,” which are central for systems that tailor an intervention based on individual response, thus providing increasingly personalized and adaptive interventions (Nandola & Rivera, 2013; Rivera et al., 2007).

Figure 1. Unit step response (i.e., change in y for a unit change in x) for a classical first-order system

Figure 2. Unit step response for a first-order plant with integrator

Addressing Classical Methodological Challenges

Behavioral science has classically used both methodological and statistical strategies to improve the overall precision of our behavioral models and intervention approaches. These strategies essentially seek to improve the signal-to-noise ratio (Ljung, 1999) by (a) reducing or redistributing the “noise” in a model (i.e., the measurement and extraneous factors occurring within the model) through randomization and statistical control techniques; or (b) increasing the “signal” in the model (i.e., the ground-truth information about a relationship of interest) through more potent interventions or more tightly controlled studies. Classically, these models then undergo null hypothesis testing procedures to determine “if” the proposed relationship(s) has met an a priori, sometimes arbitrary, threshold of significance.

System identification is a distinct yet complementary approach to traditional hypothesis testing, as its primary emphasis is on modeling variations and responses within temporal data, classically within a single “system” (a single individual or a small aggregation of individuals). In comparison to hypothesis testing endeavors, improving the signal-to-noise ratio is achieved by utilizing all of the available time-linked data for an individual and then modeling the variations in these data for that individual system over time. This allows for a stronger signal by focusing on the dynamics and interrelationships of model components within a single system over time. In contrast to hypothesis testing endeavors that explore whether a relationship exists, dynamical systems modeling allows for explication of exactly “how” an individual system is functioning overall, thereby offering an important complement to hypothesis testing.

More explicitly, system identification, and dynamical systems modeling in particular, explores the transient responses of manipulated input variables (i.e., intervention components) and disturbance variables (i.e., time-varying covariates known to influence the outcome, such as social norms for being active) on an outcome within a single-case or N-of-1 time series context (i.e., idiographic) (Deshpande, Nandola, Rivera, & Younger, 2011; Molenaar & Campbell, 2009; Rivera et al., 2007; Timms, Rivera, Collins, & Piper, 2012; Velicer, 2010). As the techniques are generally based on regression, they share some characteristics with more commonly used methods within behavioral science; however, dynamical systems modeling utilizes a wide variety of differential equation structures to mathematically model responses to intervention components and time-varying covariates. For example, it is plausible that the impact of an intervention component will initially be strong but gradually level off over time (e.g., see Figure 1). This response is called a 1st order system within system identification and is difficult to model with current behavioral science analytic techniques, but it is a very basic response pattern within dynamical systems modeling. Indeed, this is only the starting point of a much wider range of possible responses that can be mathematically modeled within system identification (e.g., see Figure 2 for a 1st order integrator system, which models a delayed but then continuous increase in response, along with models for understanding multiple-component feedback loop systems) (Ljung, 2009, 2011; Ogunnaike & Ray, 1994).

As such, system identification offers an exciting complementary methodology for modeling time-intensive data. Because of the much greater flexibility for modeling, the resulting models have strong internal validity for mathematically describing exactly what occurred within a single system, such as a single person. Further, many of the traditional requirements for statistical power to detect “significant” effects (i.e., large numbers of subjects) are not as relevant in dynamical systems. Because the analytic techniques were originally devised to model single systems, they are well suited to model individual response patterns to interventions when intensive repeated measures are gathered. Similar to how researchers conducting hypothesis tests recruit additional subjects to improve the signal-to-noise ratio, in system identification, increased “power” is achieved by gathering data more often and over a longer period of time within an individual (or via more potent signals, such as stronger interventions). As such, dynamical systems modeling offers promise as an important analytic technique for supporting N-of-1 style experimental designs, an increasingly popular area of experimental inquiry within behavioral science (Smith, 2012).

Perhaps most exciting, the underlying mathematical models, particularly if generated from strong within-person experimental designs (e.g., designs that are conceptually similar to alternating treatment designs, see: Barlow & Hayes, 1979) that include validation studies to determine the external validity of the models, can be used to create closed-loop control systems, something that is not possible within traditional null hypothesis endeavors. Control systems are algorithms that can monitor an outcome of interest (e.g., physical activity), a person’s responsiveness to different intervention components (e.g., self-monitoring, attending intervention sessions focused on behavioral initiation training), and the impact of uncontrolled but influential and measurable factors on physical activity (e.g., weather patterns, or the social norms of family and friends related to being active) and, based on all of those factors, provide the “right” intervention component at the “right” time for an individual. As such, dynamical systems modeling represents a potentially important enabler for realizing the dream of personalized behavioral medicine.

Although the norm in system identification is to explicate transient responses within one system (i.e., idiographic data), recent explorations of system identification within a behavioral context have examined ways to aggregate data across individuals (e.g., averaging values between individuals based on the assumption of ergodicity) to improve the signal-to-noise ratio even further, assuming there is strong a priori justification for the grouping. As discussed earlier, two key groups that require further exploration are individuals who maintain physical activity levels and those who do not. An increasingly popular approach for identifying variables that distinguish unknown groups within longitudinal data is growth mixture modeling (Duncan, Duncan, & Strycker, 2006). These analyses examine data to identify underlying or latent factors that explain and distinguish different groups and thus could be one fruitful method for understanding maintainers vs. non-maintainers.

The purpose of this study was to utilize dynamical systems modeling to explore how intervention components (i.e., self-monitoring, access to an exercise facility, behavioral initiation training, behavioral maintenance training) and exogenous variables (i.e., % of activity accomplished outdoors) may function differently among maintainers and non-maintainers. Specifically, our goal was to utilize common data exploration practices from system identification (e.g., data visualization, progressive model building using system identification) (Ljung, 2009, 2011) to generate empirically-based hypotheses (A. C. King, Ahn, Atienza, & Kraemer, 2008) that could inform later a priori N-of-1 experimental testing of key intervention components that may promote improved maintenance of physical activity.

Methods

Study Design

The AAMP (Active Adult Mentoring Program) study was a 16-week randomized controlled trial with an 18-month follow-up that tested the impact of peer volunteers as delivery agents of a social cognitive theory-based, group-mediated, physical activity intervention in sedentary adults aged 50 and older living in a university community in the southeastern United States. Enrollment occurred on a rolling basis throughout 2007 and 2008 to control for seasonality effects. The full methods and results have been reported elsewhere (Buman, Giacobbi Jr., et al., 2011) and are briefly reviewed here. Participants were randomized to one of two 16-week study arms: (1) peer-led advice and support for physical activity initiation and maintenance; or (2) a “standard” community-based physical activity promotion intervention. The main results suggested that, at 16 weeks, both arms had significant increases in physical activity, but with no significant differences between arms. At 18 months, the peer-led intervention arm had superior maintenance of physical activity relative to the standard arm, despite cessation of the intervention. Institutional approval was obtained for all aspects of the study protocol. The original study design was conceived to also test dynamical relationships with other key outcomes in older adults including sleep and cognition within the course of a physical activity promotion intervention; hence, an intensive repeated measures design was adopted. In this manuscript we leverage this intensive repeated measures design to explore additional questions related to the specific intervention components we had included in the study, to assess their impact on long-term maintenance of physical activity behavior. For these secondary analyses, we focus only on those individuals who were randomized to the active intervention group and completed the study (N=34).

Intervention

Project AAMP was a multi-component intervention grounded in social cognitive (Bandura, 1986) and self-determination (Deci & Ryan, 1985) theories. Prior to the intervention (week 0), all participants were given a pedometer and a daily log for physical activity self-monitoring and were encouraged to track their behavior daily. At session one, participants received access to the community exercise facility in which the intervention sessions were held. This facility was available for use from weeks 1–12. Weeks 2–7 addressed behavioral initiation training, with content on self-management skills for physical activity initiation including goal setting, problem solving, social support, and mental imagery exercises. Weeks 8–12 addressed maintenance training, with content on relapse prevention and developing a plan to transition to a home- or community-based exercise routine.

Measures

The Leisure-Time Exercise Questionnaire (LTEQ) was used to assess self-reported physical activity behavior throughout the intervention and for an additional week at the 18-month follow-up. The LTEQ is a 3-item scale that asks participants to report how many 20-minute bouts of mild, moderate, and strenuous leisure-time exercise they engaged in (Godin & Shephard, 1985). Although typically used as a 7-day recall, in the current study the LTEQ was administered as a daily measure to reduce recall bias and to allow for time-intensive data. Daily minutes of MVPA were computed from the LTEQ by adding the number of bouts reported in the moderate and strenuous categories and multiplying by 20. In a randomly selected subsample of study participants (n=22, 11 in each arm) we found evidence of adequate concurrent validity for the LTEQ against accelerometer-derived MVPA (r=.48, p < .001) as measured by the RT3 triaxial accelerometer (Stayhealthy, Monrovia, CA). Although pedometer data were available, we did not include them as a comparison to the LTEQ because the intervention targeted all types of MVPA, and pedometers are known to be less accurate at capturing non-ambulatory activities (Tudor-Locke, Williams, Reis, & Pluto, 2002). Immediately following daily completion of the LTEQ, participants were asked to indicate the number of exercise bouts that occurred indoors and outdoors. These data are expressed as the ratio of total bouts performed outdoors (outdoor bouts/[outdoor + indoor bouts]). Finally, for the intervention components, we dichotomously coded each day of the intervention to represent when each component was present or absent. For example, gym membership became available at day 8 and was then taken away at day 84; as such, days 0–7 were coded 0, days 8–84 were coded 1, and day 85 onward was coded 0 again. This temporally linked coding scheme of intervention components is visually summarized in Figure 3.
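To make the coding described above concrete, the following Python sketch (function names are our own, not from the study's materials) reproduces the daily MVPA computation from LTEQ bouts, the dichotomous gym-access input, and the outdoor-activity ratio:

```python
def mvpa_minutes(moderate_bouts, strenuous_bouts, bout_length=20):
    """Daily MVPA minutes from the LTEQ: (moderate + strenuous bouts) x 20."""
    return (moderate_bouts + strenuous_bouts) * bout_length

def gym_access_input(day):
    """Dichotomous intervention input: 1 on days 8-84, when exercise-facility
    access was available, and 0 otherwise."""
    return 1 if 8 <= day <= 84 else 0

def outdoor_ratio(outdoor_bouts, indoor_bouts):
    """Fraction of reported bouts performed outdoors (0 when no bouts reported)."""
    total = outdoor_bouts + indoor_bouts
    return outdoor_bouts / total if total else 0.0

print(mvpa_minutes(1, 2))                             # 60 minutes of MVPA
print([gym_access_input(d) for d in (7, 8, 84, 85)])  # [0, 1, 1, 0]
print(outdoor_ratio(3, 1))                            # 0.75
```

Each coded input then forms one daily time series aligned with the MVPA output, as plotted in Figure 3.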

Figure 3. Time series plots of MVPA and key inputs

Participants

We limited our sample to completers of the 16-week intervention in the peer-led intervention arm (N=41 were randomized, but N=34 completed the study) to test how the full set of behavioral and intervention components may have differentially impacted physical activity levels during the intervention among maintainers vs. non-maintainers (see Table 1). Specifically, we defined ‘maintainers’ as those who reported ≥150 minutes of MVPA during the 18-month follow-up week of monitoring and ‘non-maintainers’ as those who reported less MVPA. Finally, because all intervention components were delivered during the first 13 weeks of the intervention period, and response rates to daily surveys were lower during the final 2-week period, we focused on data only from the first 14 weeks of measurement (1 baseline week and 13 intervention weeks). In total, our sample for these analyses comprised 10 maintainers and 24 non-maintainers who reported their physical activity over 98 consecutive days (see Table 1).
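As a minimal sketch of the grouping rule just described (the function name is hypothetical), the maintainer/non-maintainer classification can be written as:

```python
def classify(weekly_mvpa_minutes, threshold=150):
    """Label a participant by reported MVPA during the 18-month follow-up
    week: >= 150 min/week -> 'maintainer', otherwise 'non-maintainer'."""
    return "maintainer" if weekly_mvpa_minutes >= threshold else "non-maintainer"

print(classify(180))  # maintainer
print(classify(120))  # non-maintainer
```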

Table 1.

Demographic characteristics of maintainers and non-maintainers in the active intervention (N=34).

Characteristic — Non-Maintainers (n=24), n (%) — Maintainers (n=10), n (%)
Women 21 (87.5) 9 (90.0)
Ethnicity
 Hispanic/Latino 2 (8.3) 0 (0)
Race
 White 20 (83.3) 10 (100)
 Black 2 (8.3) 0 (0)
 Other 2 (8.3) 0 (0)
Education
 < College graduate 9 (37.5) 3 (30)
 College graduate 6 (25.0) 2 (20)
 > College graduate 9 (37.5) 5 (50)
Age (yrs), m ± SD 63.1 ± 8.7 66.7 ± 6.9
 50–64 years 10 (41.7) 4 (40)
 65+ years 14 (58.3) 6 (60)
Marital status
 Married 13 (54.1) 4 (40)
 Single (divorced/never married) 7 (29.2) 5 (50)
 Widowed 4 (16.7) 1 (10)
Baseline PA (min/wk), m ± SD 19.1 ± 24.7 24.9 ± 17.2

Analysis Plan

First, we developed time-series visuals of our output variable (i.e., daily MVPA; see Figures 3a and 3b) and input/disturbance variables, using basic features of MATLAB. These included metrics for the time during the intervention when self-monitoring (i.e., use of a pedometer and daily recording of physical activity levels) occurred. Self-monitoring occurred throughout the entire study; however, we made the implicit assumption that self-monitoring was not occurring prior to the intervention and that MVPA values from the first measurement were similar to those of previous, unmeasured days. This assumption allowed us to model possible transient responses of MVPA to self-monitoring via system identification (Figures 3c–3d). A dichotomous metric was entered for the timeframe when access to the exercise facility was available (Figures 3e–3f). In addition, we visualized, separately for maintainers and non-maintainers, the percent attendance at behavioral initiation and/or maintenance training sessions and the self-reported % of activity accomplished outdoors (Figures 3g–3l). These visualizations aided in conceptualizing the most appropriate dynamical model response patterns.

We utilized functions provided by the System Identification Toolbox (SITB) in MATLAB to perform model estimation and validation (Ljung, 2011). In particular, we relied on the estimation of continuous-time linear models from sampled data using the SITB’s pem command, which is implemented as part of the Process Models option in the SITB’s graphical user interface. The model estimation employed in the software applies a continuous-time identification approach with a sophisticated filtering method that is well suited for demanding, noisy environments and can accommodate multiple-input estimation (Garnier, 2011; Ljung, 2009).
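The SITB's continuous-time estimation routines are beyond a short example, but the core idea of identifying a first-order model from daily samples can be sketched in Python (a simplified, noiseless stand-in for what such estimation does, with synthetic data and made-up parameter values): fit the discrete-time form y[k+1] = a·y[k] + b·u[k] by least squares, then convert back to the continuous gain K and time constant τ.

```python
import numpy as np

# Simplified, noiseless stand-in (not the SITB's algorithm): simulate a
# first-order system sampled daily, then recover its parameters.
K_true, tau_true = 10.0, 5.0          # hypothetical gain and time constant
a_true = np.exp(-1.0 / tau_true)      # discrete pole for a 1-day sample time
b_true = K_true * (1.0 - a_true)

u = np.ones(100)                      # unit-step input (e.g., component "on")
y = np.zeros(100)
for k in range(99):                   # simulate y[k+1] = a*y[k] + b*u[k]
    y[k + 1] = a_true * y[k] + b_true * u[k]

# Least squares: regress y[k+1] on the pair [y[k], u[k]].
X = np.column_stack([y[:-1], u[:-1]])
a_hat, b_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]

tau_hat = -1.0 / np.log(a_hat)        # back to continuous-time parameters
K_hat = b_hat / (1.0 - a_hat)
print(round(K_hat, 2), round(tau_hat, 2))  # recovers 10.0 5.0
```

With real, noisy behavioral data the estimation is harder, which is why the SITB's filtering-based continuous-time methods were used in the study itself.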

As dynamical systems modeling is not commonly used in behavioral research, a more thorough description of the response patterns explored in this study is in order. A dynamical system model may utilize the classical first-order system structure (see Figure 1). This model is represented in differential equation form as follows:

τ dy(t)/dt + y(t) = K x(t)    (1)

The signal x(t) denotes the input signal, which could be manipulated (customarily denoted as u in system identification) or an exogenous variable or disturbance (customarily denoted as d). K is the steady-state gain (in units of y/x), while τ is the time constant, in units of time. An effective means of understanding the first-order system is to observe the response of y to a unit step change in x, as seen in Figure 1. The gain parameter K denotes the change in the output y at final time per unit change in x. τ, meanwhile, indicates the speed of response of the system: the smaller the value of τ, the faster the system reaches a settling point where the output variable levels off; a larger time constant denotes a slower system. More precisely, at one time constant τ the response has reached 63.2% of the modeled final value for a step input change, while 3τ denotes T95%, or the 95% settling time. At t = 5τ the modeled step response is effectively settled, having reached over 99% of its final value. The first-order system displays a response curve that grows exponentially toward a new steady state; this is known as an overdamped response. For the purposes of this paper, we focused only on first-order systems based on a priori expectations. That said, dynamical systems modeling includes capacities for capturing a variety of other response patterns, such as oscillations, overshoots, or undershoots, via second-order or higher-order derivatives (Ogunnaike & Ray, 1994).
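A brief Python sketch (with a hypothetical gain and time constant, not estimates from this study) illustrates the first-order step response and the 63.2%/95%/99% settling benchmarks described above. The closed-form unit step response of equation (1) with y(0) = 0 is y(t) = K(1 − e^(−t/τ)):

```python
import numpy as np

def first_order_step(K, tau, t):
    """Unit step response of tau*dy/dt + y = K*x with y(0) = 0:
    y(t) = K * (1 - exp(-t / tau))."""
    return K * (1.0 - np.exp(-t / tau))

# Hypothetical values: gain of 10 MVPA min/day per unit input,
# time constant of 5 days.
K, tau = 10.0, 5.0
print(round(first_order_step(K, tau, tau) / K, 3))      # 0.632 at t = tau
print(round(first_order_step(K, tau, 3 * tau) / K, 3))  # 0.95 at t = 3*tau
print(round(first_order_step(K, tau, 5 * tau) / K, 3))  # 0.993 at t = 5*tau
```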

An additional dynamical model structure relevant to this work is the first-order system with integrator. This system is represented by a model that includes a second-order derivative and is expressed as:

τ d²y(t)/dt² + dy(t)/dt = K x(t)    (2)

Figure 2 depicts the response of y to a unit step change in x for a first-order system with integrator. For this model there is an initial “lag” in the response, whose length is influenced by τ, followed by a response that becomes asymptotically a ramp with slope equal to the gain K. In effect, following the initial lag period, K represents the slope of a linear relationship per unit of time within this type of model (e.g., an initial lag period of a few days followed by a 5 min/day increase of MVPA that continues for the rest of the measurement period). As with the classical first-order system, K is in units of y/x, while τ has units of time.
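The integrator response can be sketched the same way; under zero initial conditions the unit step response of equation (2) is y(t) = K(t − τ(1 − e^(−t/τ))), which barely moves during the lag and then approaches a ramp of slope K. Parameter values below are hypothetical:

```python
import numpy as np

def integrator_step(K, tau, t):
    """Unit step response of tau*y'' + y' = K*x with zero initial
    conditions: y(t) = K * (t - tau * (1 - exp(-t / tau)))."""
    return K * (t - tau * (1.0 - np.exp(-t / tau)))

# Hypothetical values: slope K of 1 min/day of MVPA per day, 9-day lag.
K, tau = 1.0, 9.0

# Well past the lag, the day-to-day increase approaches the gain K:
late_slope = integrator_step(K, tau, 60.0) - integrator_step(K, tau, 59.0)
print(round(late_slope, 2))  # 1.0

# Early in the lag period, the response has barely moved:
print(round(integrator_step(K, tau, 1.0), 2))  # 0.05
```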

Model validation in system identification comprises many tasks but ultimately corresponds to the researcher’s confidence that the estimated model will be suitable for an end-use application (which can range anywhere from prediction of future outcomes to the design of an optimized intervention based on control systems engineering) (Deshpande et al., 2011). For the purposes of this paper, we utilized a progressive model-building strategy whereby we added plausible inputs (i.e., self-monitoring, gym membership, behavioral initiation training, maintenance training, and % of activity accomplished outdoors) into a hierarchical series of models. Within these models we strove for the most parsimonious model with the best model fit.

The model’s goodness-of-fit index is expressed as a percentage from 0 to 100% and calculated according to:

Fit[%] = 100 · (1 − ‖ŷ_k − y_k‖₂ / ‖y_k − ȳ‖₂)    (3)

Here ŷ_k is the simulated output, y_k is the data to which the model is fit, ȳ is the average of all y_k values, and ‖·‖₂ denotes a vector 2-norm. As this is a secondary data analysis, these goodness-of-fit estimates are more appropriately viewed as a numerical indicator of internal validity for the model explaining the available data rather than a determination of the external validity of these results translating to other systems. Finally, similar to exploratory factor analysis (Thompson, 2004), some model parameters, when entered, may improve model fit even though the overall model does not accord with conventional wisdom (e.g., a strong negative response for self-monitoring) or produces implausible outcomes (e.g., an increase of over 1000 minutes/day of MVPA). As such, the models were also judged on conventional wisdom and the plausibility of the model fits. Because system identification emphasizes questions of how systems function, it does not include statistical significance testing for determining improved model fit. Therefore, no p-values are reported, as such testing is not in line with this methodology at present.
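Equation (3) can be implemented directly; the short Python helper below (our own, not an SITB function) also shows the two boundary cases: a perfect prediction scores 100% and predicting the mean scores 0%:

```python
import numpy as np

def fit_percent(y, y_hat):
    """Goodness of fit per Eq. (3):
    100 * (1 - ||y_hat - y||_2 / ||y - mean(y)||_2)."""
    y, y_hat = np.asarray(y, dtype=float), np.asarray(y_hat, dtype=float)
    return 100.0 * (1.0 - np.linalg.norm(y_hat - y) / np.linalg.norm(y - y.mean()))

y = [0, 10, 20, 30, 40]          # toy data, not from the study
print(fit_percent(y, y))         # 100.0 (perfect prediction)
print(fit_percent(y, [20] * 5))  # 0.0 (no better than the mean)
```

Note that, unlike R², this index can go negative when the model predicts worse than the mean.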

Results

Visualization results

As described earlier, Figure 3 presents a time-series visualization of the key output, input, and disturbance variables. Figures in the left column represent non-maintainers, whereas figures on the right represent maintainers. Visual inspection of Figures 3a and 3b suggests that maintainers had a slightly greater increase early in the intervention (days 0–40) and a greater decrease later (days 70–80) than non-maintainers. Specifically, non-maintainers showed a rapid increase in MVPA following completion of the baseline phase that then remained relatively steady for the rest of the 14 weeks plotted, with a possible slight reduction beginning around week 11. In contrast, maintainers appeared to have a somewhat slower increase to steady-state activity, but this state was higher overall than that of the non-maintainers. In addition, the maintainers appeared to show a slight further increase in MVPA around approximately week 10. A visual inspection of the inputs suggests plausible links between them and fluctuations in daily MVPA.

System Identification

Table 2 summarizes results from our system identification modeling. For non-maintainers, 1st order modeling suggested that self-monitoring, gym membership, and % outdoor activity produced the most parsimonious model that explained the most variance, with only a negligible impact from behavioral initiation training and maintenance training. Specifically, the overall model suggests that self-monitoring contributed an increase of 8.8 min/day of MVPA that was reached by approximately day 33.9 (i.e., 3τ). In addition, % activity outdoors contributed an increase of 5.3 min/day of MVPA that was reached almost instantaneously. Finally, gym membership contributed a small but notable increase of 3.5 min/day of MVPA that was reached at approximately day 16 of the intervention. Although model fit improved slightly for the non-maintainers as more inputs were added, the other inputs (i.e., behavioral initiation and maintenance training) did not have an appreciable impact on MVPA. The final model explained 18.39% of the variance in MVPA observed across time for the non-maintainer group (see Figure 4).

Table 2.

System Identification Results

For each model below, transient response gains are in min/day of MVPA, time constants are in days, the response type for each input is given in parentheses (1st = 1st order system; 1st-Int = 1st order system with integrator), and Fit is the % of variance explained.

Model 1 — Input: (1) Self-monitoring
  Non-maintainers: gain 12.1 (1st); time constant 11.1; Fit 16.45
  Maintainers: gain 15.4 (1st); time constant 16.2; Fit 12.6

Model 2 — Inputs: (1) Self-monitoring, (2) Initiation Training
  Non-maintainers: gains 12.0, 0.2 (1st, 1st); time constants 11.2, 0.001; Fit 16.44
  Maintainers: gains >1000, 13.6 (1st-Int, 1st); time constants >1000, 2.7; Fit 20.29

Model 3 — Inputs: (1) Self-monitoring, (2) Gym Membership
  Non-maintainers: gains 9.1, 3.4 (1st, 1st); time constants 11.6, 5.6; Fit 17.82
  Maintainers: gains 59.2, −64.1 (1st, 1st); time constants 49.4, 96.5; Fit 14.59

Model 4 — Inputs: (1) General Intervention, (2) % Activity Outdoors
  Non-maintainers: gains 11.9, 4.6 (1st, 1st); time constants 10.8, 0.001; Fit 16.9
  Maintainers: gains 16.7, 42.6 (1st, 1st); time constants 16.1, 3.6; Fit 17.91

Model 5 — Inputs: (1) General Intervention, (2) % Activity Outdoors, (3) Gym Membership
  Non-maintainers*: gains 8.8, 5.3, 3.5 (1st, 1st, 1st); time constants 11.3, 0.0, 5.4; Fit 18.39
  Maintainers: gains 2.4, 63.8, 16.4 (1st, 1st, 1st); time constants 0.0, 4.1, 16.0; Fit 16.34

Model 6 — Inputs: (1) General Intervention, (2) Initiation Training, (3) Maintenance Training
  Non-maintainers: gains 10.8, 1.6, 1.9 (1st, 1st, 1st); time constants 10.3, 3.8, 0.0; Fit 16.83
  Maintainers: gains 1.7, 22.5, 4.2 (1st-Int, 1st, 1st); time constants 1.2, 12.1, 57.1; Fit 21.92

Model 7 — Inputs: (1) General Intervention, (2) Initiation Training, (3) % Activity Outdoors
  Non-maintainers: gains 11.8, 0.4, 4.6 (1st, 1st, 1st); time constants 11.1, 2.7, 0.001; Fit 16.92
  Maintainers: gains 0.2, 10.7, 33.1 (1st-Int, 1st, 1st); time constants 0.3, 2.7, 3.2; Fit 22.73

Model 8 — Inputs: (1) General Intervention, (2) Initiation Training, (3) Maintenance Training, (4) % Activity Outdoors
  Non-maintainers: gains 11.3, 0.8, 1.2, 4.6 (1st, 1st, 1st, 1st); time constants 10.5, 0.001, 0.001, 0.001; Fit 17.06
  Maintainers: gains 2.9, 16.7, 41.2, 14.0 (1st-Int, 1st, 1st, 1st); time constants 0.001, 10.1, 195.4, 0.5; Fit 23.59
  Maintainers*: gains 3.0, 16.5, 1.0, 10.7 (1st, 1st, 1st-Int, 1st); time constants 0.0, 9.7, 9.1, 0.3; Fit 23.38

Notes:

  • Inp.: Input (may also be thought of as an independent variable)
  • 1st: 1st order system; 1st-Int: 1st order system with integrator. The step response plots for 1st order systems and 1st order systems with integrator are depicted in Fig. 1 and Fig. 2, respectively (‘K’ represents the gain and ‘τ’ the time constant).
  • MVPA (moderate to vigorous intensity physical activity): the key output (also known as the dependent variable), measured on a daily basis
  • Transient Resp. Gains: For a 1st order system, this value represents the value achieved when the influence of an input reaches a flattening steady state; for 1st order integrator systems, this value represents the slope and is conceptually similar to an unstandardized beta coefficient within regression. See Fig. 1 and Fig. 2, wherein the gain is indicated by ‘K’.
  • Time Constant (τ): For 1st order systems, the time constant represents the number of units (in this case, days) it takes to reach a steady state; a standard convention within system identification is to use 3× the time constant (3τ) as a benchmark of when the system has reached 95% of its gain. Time constants for 1st order integrator systems represent the amount of time it takes before the system settles into a linear relationship.
  • Non-maintainers: individuals in the intervention group who were not engaging in 150 minutes per week of MVPA at 18 months
  • Maintainers: individuals who were engaging in 150 min/wk of MVPA or more at 18 months
  • Fit %: an estimate of the % of variance in the output variable explained by the overall model
*

Denotes the best overall model.
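The two step-response forms described in these notes can be sketched numerically. The short Python sketch below (function names are our own illustrative choices, not part of the authors' MATLAB workflow; the example values are the non-maintainer self-monitoring gain and time constant from the table) shows both response shapes and checks the 3τ ≈ 95%-of-gain convention:

```python
import math

def first_order_step(K, tau, t):
    # 1st-order step response: rises exponentially toward the gain K,
    # y(t) = K * (1 - exp(-t / tau))
    return K * (1.0 - math.exp(-t / tau))

def first_order_integrator_step(K, tau, t):
    # 1st-order-with-integrator step response: after a lag governed by tau,
    # the output grows roughly linearly with slope K,
    # y(t) = K * (t - tau * (1 - exp(-t / tau)))
    return K * (t - tau * (1.0 - math.exp(-t / tau)))

# 3-tau convention: at t = 3*tau a 1st-order system has reached ~95% of K.
K, tau = 12.1, 11.1  # non-maintainer self-monitoring values from the table
fraction_of_gain = first_order_step(K, tau, 3 * tau) / K  # ~0.95
```

Under this convention, for example, an effect with τ = 11.1 days would settle at roughly 3 × 11.1 ≈ 33 days.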

Figure 4. Simulation fits for final models for non-maintainers and maintainers.

For maintainers, results suggest that the inclusion of self-monitoring, behavioral initiation training, maintenance training, and % activity outdoors yields the strongest, theoretically plausible model. Although not included in Table 2, inclusion of gym membership in the model resulted in either reduced model fit or implausible results (e.g., increases or decreases in daily MVPA of over 1,000 min/day). The final model for maintainers suggested the greatest gains occurred in response to the behavioral initiation training (i.e., a total gain over the intervention of 16.5 min/day, which settled around day 29.1). The next most impactful variable was % activity outdoors, with an almost immediate gain of 10.7 min/day of MVPA if individuals accomplished all of their MVPA outdoors. Although tested initially as a 1st-order model, a 1st-order integrator model appeared to better explain the impact of maintenance training on MVPA (i.e., rather than exponential growth that later settles, maintenance training appeared to influence MVPA in a delayed linear fashion). Specifically, results suggest the effects of maintenance training only began to settle into an effect 9 days following the training; after this initial 9 days, maintenance training produced a steady increase of 1 min/day of MVPA. Finally, self-monitoring also appeared influential, with a total gain of 3.0 min/day in MVPA that occurred rapidly (i.e., in less than 1 day) and then leveled off for the duration of the study. In all, this final model explained 23.4% of the variance in MVPA for maintainers (see Figure 4).
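As a hedged illustration, the best-fitting maintainer model can be approximated as a superposition of first-order step responses using these gains and time constants. In the sketch below, the helper names, the assumption that most inputs step on at day 0, and the day-70 onset for maintenance training are all our own illustrative choices, not the trial's actual component schedule:

```python
import math

def fo(K, tau, t):
    # 1st-order step response (zero before the input turns on)
    return 0.0 if t <= 0 else K * (1.0 - math.exp(-t / max(tau, 1e-9)))

def fo_int(K, tau, t):
    # 1st-order-with-integrator step response: delayed, then linear, slope K
    return 0.0 if t <= 0 else K * (t - tau * (1.0 - math.exp(-t / max(tau, 1e-9))))

# Gains/time constants from the best maintainer model; onset days are
# illustrative assumptions only.
def predicted_mvpa_change(day):
    return (fo(3.0, 0.001, day)           # self-monitoring (near-immediate)
            + fo(16.5, 9.7, day)          # behavioral initiation training
            + fo_int(1.0, 9.1, day - 70)  # maintenance training (assumed onset)
            + fo(10.7, 0.3, day))         # % activity outdoors (fully outdoors)
```

Summing component responses this way mirrors how the fitted model decomposes the overall MVPA trajectory into per-input contributions.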

Discussion

Results from these secondary, exploratory analyses suggest that maintainers in our sample exhibited different patterns of MVPA during the trial than non-maintainers. This observation was confirmed both via visual inspection and through system identification models quantifying these differential patterns. Results further suggest that, for those individuals who maintained, behavioral initiation training, % of activity outdoors, self-monitoring, and maintenance training each helped to explain observed variance in MVPA during the trial, although the magnitude of these effects varied. For maintainers, behavioral initiation training and % of activity outdoors appeared to be key factors driving MVPA, with more modest impacts of self-monitoring and maintenance training. In contrast, non-maintainers appeared to be most influenced by self-monitoring, % of activity outdoors, and access to an exercise facility.

To our knowledge, this is the first study to use system identification modeling to understand differential patterns of response to individual intervention components and thereby elucidate the best strategies for promoting maintenance of physical activity. Conceptually, these results suggest that, at least among the healthy older adults in our sample, interventions may better promote long-term maintenance at 18 months if they focus on performing activity outdoors rather than indoors in an exercise facility. This conclusion rests on results suggesting that non-maintainers in our sample relied on access to an exercise facility to increase their MVPA during the trial whereas, for maintainers, a higher percentage of time outdoors mapped to increases in physical activity at the end of the intervention that appeared to be maintained for up to 18 months. Future research should further explore the potential unintended consequences, for long-term maintenance of physical activity behavior, of providing access to an exercise facility during the behavioral initiation phase of an intervention.

These results highlight how interventions may function differently to impact MVPA for those who later maintain compared to those who do not. Although group mean-difference analyses can provide valuable insights for intervention packages (Fjeldsoe et al., 2011), they do not provide guidance on the influence of individual intervention components, which can lead to more efficient and impactful interventions (Collins et al., 2007). Dynamical system models provide guidance into how an intervention may function, which could later serve as an “open-loop” model in a just-in-time adaptive, time-varying intervention based on control systems engineering principles (Collins, Murphy, & Bierman, 2004; Rivera et al., 2007). An adaptive intervention designed on the basis of control systems engineering principles (Rivera et al., 2007) directly incorporates dynamical models relating intervention dosages to outcomes in order to synthesize optimal decision rules that maximize (or minimize) a performance objective reflective of desirable intervention outcomes. Deshpande et al. (2011), for example, showed that anxiety self-reports can be used to adjust naltrexone dosage for treatment of fibromyalgia symptoms. In the context of this paper, weather forecasts could be used in conjunction with the dynamical model relating outdoor activity to MVPA to anticipate periods of time amenable to outdoor activity; this information could then be communicated to intervention participants to enable better physical activity planning. It may also be fruitful in future studies to explore within-person variations in physical activity by weather and season, with seasonal impact varied experimentally for individuals.
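A minimal sketch of such an open-loop decision rule, assuming a daily weather forecast and using the fitted outdoor-activity gain (≈10.7 min/day of MVPA) as a planning heuristic, might look as follows; `plan_outdoor_prompts` and its stopping rule are our own hypothetical illustration, not a validated controller:

```python
def plan_outdoor_prompts(forecast, mvpa_deficit_min, outdoor_gain=10.7):
    """Pick forecast days suitable for outdoor activity until the projected
    extra MVPA (outdoor_gain min/day per prompted day) covers the deficit.

    forecast: list of (day, suitable) pairs in chronological order.
    """
    prompts, projected = [], 0.0
    for day, suitable in forecast:
        if projected >= mvpa_deficit_min:
            break  # enough projected MVPA; stop prompting
        if suitable:
            prompts.append(day)
            projected += outdoor_gain
    return prompts

# e.g., a 4-day forecast where days 1, 3, and 4 look suitable for outdoor MVPA
plan_outdoor_prompts([(1, True), (2, False), (3, True), (4, True)], 20.0)
```

A closed-loop version would additionally feed observed MVPA back into the rule, in the spirit of the control-engineering designs cited above.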

Limitations

This investigation has several limitations. The study focused on secondary analyses of data from a previously conducted trial. As such, the sample was not explicitly designed for system identification methods, nor was the sample large enough to allow for validation of the models generated, thereby limiting any interpretation of the generalizability of these results. Related to design, several intervention components started and/or ceased together, limiting our ability to parse out the differential impact of each intervention component. System identification techniques were explicitly designed for understanding response patterns in physical systems governed by fundamental laws of conservation (e.g., energy, matter, momentum). As such, the methods, by design, emphasize strong internal validity, particularly for modeling how a particular system functions. As alluded to above, because system identification procedures are largely based on regression, there is the potential to draw on important lessons from statistical analytic techniques used in behavioral science to improve the methods, particularly for determining the generalizability of results. With the current secondary data analyses, particularly without a priori experimental designs that reserve a subset of data for model validation, results from our models require replication within larger samples before any conclusions about generalizability can be drawn. Of particular importance, this study included only healthy older adults participating in a peer-led intervention. Additional research is therefore required to determine whether these results generalize to other intervention packages or populations, including older adults with chronic conditions such as arthritis.

In addition, the primary outcome measure was not a continuous variable; instead, participants reported daily physical activity levels in 20-minute increments. Although we attempted to reduce the impact of this by averaging across meaningful groups, future research, particularly research utilizing true N-of-1 “systems,” should incorporate continuous measures of MVPA, such as accelerometry data. While participants did receive a pedometer in this trial, it served as a tracking and feedback mechanism for participants. The pedometers were not designed to provide research-grade data (e.g., the step count was easy to reset, and steps were counted solely via the movement of a ball mechanism) and therefore were not appropriate for these analyses. In addition, self-report of key variables was poor during the final weeks of the trial and completely lacking from the end of the intervention until the 18-month mark, limiting our understanding of long-term patterns of change. This may have been particularly impactful for the maintenance training component, as it was activated only during the final few weeks of the trial, limiting our ability to fully map the response to maintenance training. In addition, self-monitoring was present throughout the measurement period. As such, conclusions about this particular intervention component are particularly tenuous. That said, based on the assumption that the effect of self-monitoring would begin as soon as self-monitoring becomes available, its inclusion in the model still provides some insights into its impact on physical activity.

As discussed previously, while technically possible within system identification, commercial tools such as the System Identification Toolbox in MATLAB (Mathworks, 2013) used in this work are not geared toward establishing statistical significance testing, particularly among clustered observations (e.g., participants who participated in the same group). As such, in these secondary analyses, which were primarily meant to drive hypothesis generation rather than hypothesis testing, we did not conduct formal statistical significance testing. As suggested by King et al. (2008), the type of exploratory analysis conducted in this study is a key first step for generating hypotheses and is conceptually distinct from hypothesis testing endeavors. The results provide guidance for subsequent hypothesis-testing endeavors but should not be considered confirmatory (A. C. King et al., 2008).

To fully capture the strength of system identification within a behavioral context, future research could explore improving the methods by developing techniques for better determining statistical significance of the models, applying analytic lessons from mixed model analyses to a system identification context, and use of more explicit N-of-1 experimental designs that are more in line with a system identification tradition. That said, these secondary data analyses provide an important “proof of concept” and bridge for linking these two traditions. We believe the melding of lessons from behavioral science (i.e., significance testing, mixed model analyses) and engineering (i.e., system identification, N-of-1 study designs) will result in particularly fruitful and robust methods for understanding behavioral interventions.

Future Directions

To fully utilize the strengths of system identification and dynamical systems models, a priori N-of-1 experiments would need to be devised to test the differential impact of intervention components during an intervention trial. These idiographic study designs are being explored more frequently within behavioral research (Deshpande et al., 2011; Molenaar & Campbell, 2009; Rivera et al., 2007; Smith, 2012; Timms et al., 2012; Velicer, 2010). A priori N-of-1 experimental designs whereby different intervention components are systematically activated and then deactivated, coupled with system identification methods, would allow for more appropriate testing of each individual intervention component; this could then be used to better explicate how each component influences (or does not influence) MVPA and to develop control systems for behavioral interventions. Future research should explore the key variables identified in this study but utilize stronger within-subject study designs for disaggregating intervention impact on patterns of change. For example, a mixture of an alternating-treatment design with a Latin square design to control for ordering effects across participants might be particularly fruitful (Barlow & Hayes, 1979; Grant, 1948). As these methods primarily emphasize how a system functions, additional research using latent growth modeling could be a valuable complement for determining whether different intervention components significantly differentiate maintainers from non-maintainers. Beyond this, to ensure these types of analyses can be replicated, a central tenet of science, interdisciplinary teams that include both behavioral scientists and engineers are, at present, required. This will be particularly important when designing studies that explore temporally optimizing behavioral interventions to foster adaptive interventions, as such studies will likely require a transdisciplinary team.

Conclusions

Overall, results suggested different patterns of response to intervention components between those who reported ≥150 minutes/week of MVPA at 18 months (i.e., maintainers) and those who did not (i.e., non-maintainers). Results further emphasize that, for behavioral maintenance, engaging in physical activity outdoors, and therefore in a place that can always be accessed (as opposed to a gym membership provided only for the intervention period), appeared to be important for differentially predicting behavioral maintenance. Future research should explore within-person experimental designs to better explicate the differential impact of each intervention component on physical activity levels. Overall, though, the techniques utilized in this paper offer a valuable “proof of concept” for the utility of system identification modeling within behavioral science, one that could open the door to improvements in theory, adaptive interventions, and the recognition of meaningful patterns for predicting long-term behavioral maintenance.

Acknowledgments

Drs. Hekler and Buman contributed equally to this work and thus the first two authors are listed at random. This work was supported in part by a Research Opportunity Fund in the College of Health and Human Performance at the University of Florida, an Age Network Multidisciplinary Research Enhancement grant at the University of Florida, a Mentorship Opportunity Grant from the Graduate Student Council at the University of Florida, institutional (T32-AG-020499, J. M. D.) and individual (F31-AG-032802, J. M. D.; 1R36AG029664, A. A. M.) training grants awarded to the University of Florida. Support for this work has also been provided by OBSSR and NIDA through grants R21 DA024266 and K25 DA021173. The content is solely the responsibility of the authors and does not necessarily represent the official views of OBSSR, NIDA, or the National Institutes of Health.

References

  1. Bandura A. Social Foundations of Thought and Action: A Social Cognitive Theory. Englewood Cliffs, NJ: Prentice Hall; 1986.
  2. Barlow DH, Hayes S. Alternating Treatment Designs: One Strategy for Comparing the Effects of Two Treatments in a Single Subject. Journal of Applied Behavior Analysis. 1979;12:199–210. doi: 10.1901/jaba.1979.12-199.
  3. Buman MP, Giacobbi P Jr, Dzierzewski J, Aiken MA, McCrae C, Roberts B, et al. Peer Volunteers Improve Long-Term Maintenance of Physical Activity with Older Adults: A Randomized Controlled Trial. Journal of Physical Activity & Health. 2011;8:S257–S266. doi: 10.1123/jpah.8.s2.s257.
  4. Buman MP, Hekler EB, Bliwise D, King AC. Exercise Effects on Night-to-Night Fluctuations in Self-Rated Sleep among Older Adults with Sleep Complaints. Journal of Sleep Research. 2011;20:28–37. doi: 10.1111/j.1365-2869.2010.00866.x.
  5. Collins LM, Murphy SA, Bierman KA. A Conceptual Framework for Adaptive Preventive Interventions. Prevention Science. 2004;5:185–196. doi: 10.1023/b:prev.0000037641.26017.00.
  6. Collins LM, Murphy SA, Strecher V. The Multiphase Optimization Strategy (MOST) and the Sequential Multiple Assignment Randomized Trial (SMART): New Methods for More Potent eHealth Interventions. American Journal of Preventive Medicine. 2007;32:S112–S118. doi: 10.1016/j.amepre.2007.01.022.
  7. Deci EL, Ryan RM. Intrinsic Motivation and Self-Determination in Human Behavior. New York: Plenum; 1985.
  8. Deshpande S, Nandola NN, Rivera DE, Younger J. A Control Engineering Approach for Designing an Optimized Treatment Plan for Fibromyalgia. Paper presented at the 2011 American Control Conference; San Francisco, CA, USA. 2011.
  9. Duncan TE, Duncan SC, Strycker LA. An Introduction to Latent Variable Growth Curve Modeling: Concepts, Issues, and Applications. Mahwah, NJ: Lawrence Erlbaum Associates; 2006.
  10. Fjeldsoe B, Neuhaus M, Winkler E, Eakin E. Systematic Review of Maintenance of Behavior Change Following Physical Activity and Dietary Interventions. Health Psychology. 2011;30:99–109. doi: 10.1037/a0021974.
  11. Garnier H. Teaching Data-Based Continuous-Time Model Identification with the CONTSID Toolbox for MATLAB. Paper presented at the 18th IFAC World Congress; Milano, Italy. 2011.
  12. Glass G, Wilson V, Gottman J. Design and Analysis of Time-Series Experiments. Boulder, CO: Colorado Associated University Press; 1975.
  13. Godin G, Shephard RJ. A Simple Method to Assess Exercise Behavior in the Community. Canadian Journal of Applied Sports Science. 1985;10:141–146.
  14. Grant D. The Latin Square Principle in the Design and Analysis of Psychological Experiments. Psychological Bulletin. 1948;45:427–442. doi: 10.1037/h0053912.
  15. Hayes S, Barlow DH, Nelson-Gray R. The Scientific Practitioner: Research and Accountability in the Age of Managed Care. Needham Heights, MA: A Viacom Company; 1999.
  16. King AC, Ahn DF, Atienza AA, Kraemer HC. Exploring Refinements in Targeted Behavioral Medicine Intervention to Advance Public Health. Annals of Behavioral Medicine. 2008;35:251–260. doi: 10.1007/s12160-008-9032-0.
  17. King AC, Buman MP, Hekler EB. Physical Activity Strategies in Populations. In: Green L, Kahan S, Gielen A, Fagan P, editors. Health Behavior Change in Populations: The State of the Evidence and Roles for Key Stakeholders. in press.
  18. King AC, Friedman R, Marcus B, Castro C, Napolitano M, Alm D, Baker L. Ongoing Physical Activity Advice by Humans Versus Computers: The Community Health Advice by Telephone (CHAT) Trial. Health Psychology. 2007;26:718–727. doi: 10.1037/0278-6133.26.6.718.
  19. Ljung L. System Identification: Theory for the User. 2nd ed. Upper Saddle River, NJ: Prentice Hall; 1999.
  20. Ljung L. Experiments with Identification of Continuous Time Models. Paper presented at the 15th IFAC Symposium on System Identification; Saint-Malo, France. 2009.
  21. Ljung L. System Identification Toolbox for Use with MATLAB: User’s Guide. The MathWorks Inc; 2011.
  22. Marcus BH, Williams DM, Dubbert PM, Sallis JF, King AC, Yancey AK, Claytor RP. Physical Activity Intervention Studies: What We Know and What We Need to Know: A Scientific Statement from the American Heart Association Council on Nutrition, Physical Activity, and Metabolism (Subcommittee on Physical Activity); Council on Cardiovascular Disease in the Young; and the Interdisciplinary Working Group on Quality of Care and Outcomes Research. Circulation. 2006;114:2739–2752. doi: 10.1161/CIRCULATIONAHA.106.179683.
  23. Molenaar P, Campbell C. The New Person-Specific Paradigm in Psychology. Current Directions in Psychological Science. 2009;18:112–117.
  24. Nandola NN, Rivera DE. An Improved Formulation of Hybrid Model Predictive Control with Application to Production-Inventory Systems. IEEE Transactions on Control Systems Technology. 2013;1:121–135. doi: 10.1109/TCST.2011.2177525.
  25. Ogunnaike B, Ray W. Process Dynamics, Modeling, and Control. Oxford: Oxford University Press; 1994.
  26. Rivera DE, Pew MD, Collins LM. Using Engineering Control Principles to Inform the Design of Adaptive Interventions: A Conceptual Introduction. Drug and Alcohol Dependence. 2007;88:S31–S40. doi: 10.1016/j.drugalcdep.2006.10.020.
  27. Schroeder SA. We Can Do Better — Improving the Health of the American People. New England Journal of Medicine. 2007;357:1221–1228. doi: 10.1056/NEJMsa073350.
  28. Smith JD. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards. Psychological Methods. 2012;17:510–550. doi: 10.1037/a0029312.
  29. Thompson B. Exploratory and Confirmatory Factor Analysis: Understanding Concepts and Applications. Washington, DC: American Psychological Association; 2004.
  30. Timms KP, Rivera DE, Collins LM, Piper ME. System Identification Modeling of a Smoking Cessation Intervention. Paper presented at the 16th IFAC Symposium on System Identification (SYSID 2012); Brussels, Belgium. 2012.
  31. Tudor-Locke C, Williams JE, Reis JP, Pluto D. Utility of Pedometers for Assessing Physical Activity: Convergent Validity. Sports Medicine. 2002;32:795–808. doi: 10.2165/00007256-200232120-00004.
  32. Velicer W. Applying Idiographic Research Methods: Two Examples. Paper presented at the 8th International Conference on Teaching Statistics; Ljubljana, Slovenia. 2010.
