Abstract
Objectives
To determine whether auditory and visual computer games yield transfer effects that (a) are modality-specific to verbal memory (auditory stimulus presentation) and visual-processing tests, (b) affect working memory and processing speed, (c) are synergistic for combined game-type play, and (d) are durable.
Method
A Pilot Study (N = 44) assessed visual transfer effects in a two-group pre–post design. The Main Study (N = 151) employed a 2 (visual games: yes, no) × 2 (auditory games: yes, no) × 3 (test session: pretest, post-test, follow-up) design, allowing different training groups to act as active controls for each other. Neuropsychological test scores were aggregated into verbal-memory (auditory presentation), visual-processing, working-memory, and processing-speed indexes.
Results
Visual-processing and working-memory pre–post-training change scores were differentially modulated across the four gameplay groups in the main sample, demonstrating transfer effects that differed relative to both active- and passive-control groups. Visual training yielded modality-specific transfer effects in both samples, transfer to working memory in the main sample, and transfer to processing speed in the pilot sample. There were no comparable transfer effects for auditory training. Combined visual-and-auditory training failed to yield synergistic effects or any significant transfer effects. Visual-processing transfer effects remained significant at follow-up.
Discussion
Visual and auditory games differentially modulated transfer effects. Domain-specific visual transfer effects were found at post-test and were durable at follow-up. Visual gameplay holds potential to ameliorate age-related cognitive decline in visual cognition.
Keywords: Brain training, Cognitive training, Video-game training
Age-related cognitive decline impacts well-being (Wilson et al., 2013), the ability to complete everyday activities, and a person’s independence (Burdick et al., 2005). Thus, there has been increasing interest in developing interventions that slow, or reverse, these declines. One seminal example is the ACTIVE study (Ball et al., 2002), which administered memory-strategy, reasoning-strategy, and computerized speed-of-processing training to over 2,800 older adults and found targeted cognitive-training effects in memory, reasoning, and processing speed, respectively. Such a pattern of effects suggests transfer on a domain-specific basis.
Importantly, cognitive aging is not monolithic. Age-related stability occurs in vocabulary, semantic memory, and autobiographical memory (Park et al., 2002), whereas age-related decline occurs in processing speed (Salthouse, 1996), working memory and cognitive control (Braver & West, 2008), and visuospatial processing and verbal memory (Park et al., 2002). Indeed, Salthouse and Davis (2006; see also Salthouse, 2009) found that much of the breadth of age-related change in cognition can be captured by a set of interrelated broad cognitive abilities common across the life span, with five of these (verbal memory, spatial memory, processing speed, working memory, and fluid intelligence) demonstrating unique age-related variation in older adults.
The present studies explore cognitive-training effects in healthy older adults for two computer-game suites emphasizing visual and auditory challenges, and potential synergistic effects of playing both suites. Our focus is on modality-specific transfer (defined as greater change in training vs comparison groups) from computer gameplay to paper-and-pencil neuropsychological tests that have been found to load on two broad age-related cognitive domains, namely, verbal memory (auditory presentation of items) and visual processing (Duff et al., 2006, 2009). We also expected modality-nonspecific transfer to working-memory and processing-speed tasks, based on widespread observation of transfer effects in these cognitive domains (Lampit, Hallock, & Valenzuela, 2014; Toril, Reales, & Ballesteros, 2014).
Computerized Cognitive Training and Age
Computerized cognitive training (CCT) has rapidly taken a prominent place in the literature on cognitive interventions in healthy older populations. Three types of CCT have been studied: cognitive-strategy training targeting a specific cognitive domain or process; neuropsychological-based training targeting broad cognitive domains; and traditional video games, often designed for entertainment (Kueider, Parisi, Gross, & Rebok, 2012). Our interest is in a hybrid of the latter two categories, often referred to as brain games. The fact that these games are typically offered as commercial products with strong marketing claims has been met with concern that such claims, and the theories used to motivate the games’ design, bear greater scrutiny (Simons et al., 2016).
Two recent meta-analyses have documented the effectiveness of CCT. Lampit et al. (2014) included 52 randomized controlled CCT studies encompassing a wide range of training methods, including cognitive-strategy, neuropsychological, and video-game approaches. They reported significant small-to-modest transfer effects (pre–post change statistically greater for the training than for the control group, mean d = .22) for verbal memory, visuospatial processing, working memory, and processing speed. Toril et al. (2014) included 20 randomized controlled computer-game studies with an overall mean transfer effect (d = .37) that varied significantly across the cognitive domains of memory, processing speed, attention, and global cognition. They failed to find significant modulation of transfer effects by game type (traditional video vs brain games) or number of training games (1–6 vs 7–12), but did report that shorter training periods (1–6 vs 7–12 weeks) and older participants (71–80 vs 60–70 years) yielded greater transfer effects.
Computer games have advantages of self-administration, potential for increased engagement and compliance, ease of dissemination, and low cost (Kueider et al., 2012; Toril et al., 2014). Furthermore, game suites allow for variation in game experience and differential targeting of skills. The complexity of individual games (Oei & Patterson, 2014), however, poses challenges to understanding the nature of gameplay transfer effects. One solution is to experimentally compare transfer effects for different types of games.
To date, only a few direct comparisons of training games have appeared in the cognitive-aging literature. Most have compared brain games to traditional video games such as Tetris (e.g., Belchior et al., 2013; Peretz et al., 2011; Wolinsky et al., 2013). An exception is Anguera et al. (2013), who compared two versions of a driving game designed on neuropsychological principles. There is a clear need for additional work comparing brain games’ effects on targeted cognitive domains, assessed by multiple tests covering broad domains of age-related cognitive decline.
Visual- and Auditory-Challenge Games
Previous work on auditory games has been encouraging. Zelinski and colleagues (Smith et al., 2009; Zelinski et al., 2011; Zelinski, Peters, Hindin, Petway, & Kennison, 2014; IMPACT study, N = 487) reported that the Brain Fitness games (Table 1; Posit Science Corporation, San Francisco, CA), which involve auditory challenges (Mahncke, Bronstone, & Merzenich, 2006), improved verbal-memory (auditory item presentation) and processing-speed scores above and beyond gains in the control group. At the 3-month follow-up (Zelinski et al., 2011), the transfer effects were somewhat durable but clearly in decline.
Table 1.
Game | Description
---|---
Auditory Games |
Sound Sweeps | Discriminate rising/lowering tones
Fine Tuning | Phoneme discrimination
To Do List | Remember order of actions in verbal list
Memory Grid | Match sounds associated with locations on a grid
Syllable Stacks | Recall syllable order
In The Know | Speech comprehension
Visual Games |
Target Tracker (Jewel Diver) | Track multiple moving targets
Double Decision (Road Tour) | Attend to central and peripheral locations simultaneously
Peripheral Challenge (Bird Safari) | Detect targets from center to periphery
Visual Sweeps (Sweep Seeker) | Identify direction of fast motion
Eye For Detail (Master Gardener) | Remember spatial locations of objects
Note: Auditory games (Brain Fitness, Posit Science, San Francisco, CA), visual games (InSight, Posit Science). Visual games were modified by Posit Science between the Pilot Study and the Main Study, but the basic task demands for each game remained constant.
A related suite of brain games, InSight (Table 1; Posit Science Corporation), involves visual challenges and has been evaluated as an intervention for cognitive decline. Edwards et al. (2013) reported near transfer to the Useful Field of View (UFOV), a visuospatial-attention test validated as predictive of driving performance (Ball et al., 2002). Wolinsky et al. (2013) found similar results using one game, Road Tour, from the InSight suite. Barnes et al. (2013) conducted the most comprehensive test of the InSight games thus far: their older adults played the visual games (InSight) for 6 weeks and then the auditory games (Brain Fitness) for 6 weeks. Barnes et al. reported significantly greater improvement on the UFOV test for the training group than for the control group, but no transfer effects for processing speed or verbal memory. The present study builds on this work with a focus on directly comparing visual and auditory games.
Brain Fitness and InSight (auditory and visual challenges, respectively; see Table 1) were originally designed on the basis of sensory-neuroscience findings in animals indicating that a decline in sensory processing can lead to negative neuroplastic brain changes, which have the potential to manifest differentially in tasks that rely on sensory input (Delahunt et al., 2008; Mahncke et al., 2006). Such appeals to neuroplasticity have been criticized as overly vague and incomplete (Simons et al., 2016), but the fact that these auditory and visual games have yielded transfer to tasks involving verbal memory for auditory items and visuospatial attention, respectively, suggests that it may be important to directly compare their patterns of transfer and their potential for combined training. Moreover, such a comparison could provide important information about the game elements most likely to yield transfer to broad age-related cognitive domains.
Transfer of Computerized-Game Training
Practicing a task generally improves performance on that task and on tasks with similar elements, suggesting a distinction between nearer and farther transfer based on the overlap between trained- and untrained-task elements (Gathercole, Dunning, Holmes, & Norris, 2019). Simons et al. (2016) reviewed the literature on brain-game training and argued that the evidence was stronger for near transfer than for far transfer to unrelated tasks. They noted, however, that not enough is known about the constraints that lead to transfer.
Knowledge of constraints on transfer is critical for theoretically driven predictions of transfer (Taatgen, 2013). Recently, Gathercole et al. (2019) performed a meta-analysis of near transfer between working-memory tasks to document training-task constraints on transfer, which they then used to develop a theoretical framework that predicted transfer in an additional set of studies not included in the original analysis. Modality was one trained-task constraint they considered in understanding transfer between working-memory tasks. Our goal is similar: to examine the modality of the game challenge as a potential constraint on transfer.
Understanding computer-gameplay transfer is challenging because (a) the games are complex and dynamic and (b) the overlap between game elements and outcome measures may not be apparent from surface relationships between tasks (Simons et al., 2016; Zelinski et al., 2014). Two reviews of traditional action games came to differing conclusions, for example. Bavelier, Green, Pouget, and Schrater (2012) argued that traditional action games produce far transfer to an ability to learn to learn. Oei and Patterson (2014), by contrast, argued that transfer is due primarily to multiple trained-game characteristics that each correspond to a subset of untrained-task elements in the battery used to assess a cognitive domain (broad-spectrum near-transfer effects). Given the possibility that multiple near-transfer effects can appear as far transfer, and our focus on broad domains of cognitive aging, we leave questions about transfer to specific cognitive abilities to future work.
Study Design and Hypotheses
The present study assessed modality-specific transfer following training with auditory and visual games (Brain Fitness and InSight game suites, Table 1, Posit Science). Using both suites allowed us to directly compare their transfer effects, and also, to examine possible synergistic effects by including a group that played both suites. By including three game-training groups (auditory games only, visual games only, and both game types) and a no-game-play control group, we address issues surrounding active- versus passive-control groups that often limit interpretation of training studies (Simons et al., 2016).
Our primary outcome measures were visual-processing and verbal-memory indexes computed from the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS; Randolph, 1998), as suggested by Duff et al. (2006, 2009), who found that a two-factor solution captured age-related variation for 9 of the 12 subtests. We chose these measures because they assess broad cognitive domains, making them appropriate for assessing interventions for cognitive aging, and because of the natural modality split between verbal-memory subtests based on auditory items and visual-processing subtests based on visual items. We recognize that these domains are broad and comprise more specific cognitive abilities (e.g., verbal working memory, spatial working memory). Moreover, many cognitive abilities can be conceptualized as having modality-specific and modality-general aspects (e.g., verbal and spatial working memory can be viewed as modality-specific components that share a modality-general component associated with the higher-order ability of working memory; Gathercole et al., 2019). Given the Salthouse (2009) work, however, it is reasonable to focus on broad verbal (auditory presentation) and visual domains. We expected a double-dissociation pattern of transfer from auditory and visual games to the verbal-memory and visual-processing indexes, respectively, and synergistic effects of multimodal training across both game suites. We also predicted modality-general transfer to working-memory and processing-speed tasks for both game suites.
Pilot Study
Method
Design
We used a 2 (game training: visual games, waitlist control) × 2 (test session: pretest, post-test) mixed design; test session was the only within-participants variable. A follow-up testing session was planned but dropped due to resource limitations; all waitlist-control participants were provided an opportunity for game-play following their post-test.
Participants
Participants were recruited from a list of community volunteers. Inclusion and exclusion criteria were modeled on prior studies (e.g., Smith et al., 2009). Inclusion requirements were: community dwelling, age ≥ 60 years, fluent English, willingness to make the time commitment, and a Mini-Mental State Examination score ≥ 24 (Folstein, Folstein, & McHugh, 1975). Exclusion criteria were: self-report of a serious neurological or psychiatric disorder, or of medications for such a disorder, and (for the training group) failure to accumulate at least 900 min of training in a 10-week period. Compensation was access to the games, which, at the time, were not publicly available. Of the 52 participants evaluated for inclusion, 8 decided not to participate, were excluded, or did not meet the inclusion criteria. The 44 participants who completed the pretest were randomly assigned to the visual-game-training (n = 25) or waitlist-control (n = 19) group; all 44 completed the post-test. The top of Table 2 presents the groups’ demographic data.
Table 2.
 | Visual games | Auditory games | Both games | Control
---|---|---|---|---
Pilot Sample (N = 44) | | | |
n | 25 | n/a | n/a | 19
Male, n (%) | 9 (36) | n/a | n/a | 8 (42)
Age (years), M ± SD | 78.0 ± 6.6 | n/a | n/a | 79.8 ± 5.4
Education (years), M ± SD | 17.2 ± 2.9 | n/a | n/a | 17.1 ± 2.7
MMSE (range 0–30), M ± SD | 28.8 ± 1.0 | n/a | n/a | 28.5 ± 1.3
Main Sample (N = 151) | | | |
n | 39 | 38 | 37 | 37
Male, n (%) | 17 (42) | 15 (40) | 17 (45) | 16 (43)
Age (years), M ± SD | 68.5 ± 6.1 | 70.2 ± 5.2 | 71.1 ± 5.7 | 69.8 ± 6.3
Education (years), M ± SD | 16.1 ± 3.2 | 16.6 ± 2.7 | 16.1 ± 2.5 | 16.4 ± 2.6
MMSE (range 0–30), M ± SD | 28.0 ± 1.6 | 27.8 ± 1.5 | 28.4 ± 1.6 | 27.9 ± 1.5
GDS (range 0–15), M ± SD | 0.5 ± 0.9 | 0.9 ± 1.5 | 0.7 ± 1.1 | 0.9 ± 1.5
Note: M = Mean; GDS = Geriatric Depression Scale; MMSE = Mini-Mental State Examination; SD = Standard deviation.
Procedure
This research was conducted under an IRB-approved protocol; all participants provided written consent. After completing the pretest, participants either played the visual games (InSight; Posit Science Corporation) for the next 8–10 weeks (training group) or played them after the second testing session (post-test) during the following 8–10 weeks (waitlist-control group). We asked participants to complete 30–40 training sessions of 30–40 min each over 8–10 weeks. All training participants completed at least 900 min of training (M = 1,519 min, SD = 144). All training was completed at the Davidson College Cognitive Aging Lab. Outcome measures were scored by two scorers; all conflicts were adjudicated by K. S. Multhaup.
Outcome measures
We selected outcome measures to cover the major domains associated with age-related cognitive decline. The RBANS (Randolph, 1998) is a neuropsychological battery validated for the diagnosis of dementia in a community-dwelling population. Duff and colleagues (2006, 2009) found that age-related declines measured by the 12 RBANS subtests can be largely captured by combined scores from nine subtests that load on two factors: visual processing (figure copy, figure recall, line orientation, coding) and verbal memory (list learning, list recognition, list recall, story recall, story memory). To cover processing speed and working memory, we chose the WAIS-III (Wechsler, 1997) Coding and Letter-Number Sequencing subtests, respectively. To add converging measurement of working memory, we combined RBANS Digit Span with WAIS-III Letter-Number Sequencing. We measured global cognitive function with the RBANS Total Scale Score, following the procedures in the manual (Randolph, 1998). We included a secondary verbal-memory index from the Hopkins Verbal Learning Test–Revised (HVLT-R; Brandt & Benedict, 2001).
With the exception of the RBANS Total Scale Score, each primary and secondary index was created by rescaling every subtest score to a common metric (M = 100, SD = 10) using the overall mean and standard deviation from the pretest administration of that subtest, and then averaging the rescaled subtest scores within the index. This procedure gives each included subtest equal weight in the index. Processing speed was assessed with a single subtest, so it was left in raw-score form.
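For concreteness, a minimal sketch of this index-construction step is shown below, assuming a simple table of subtest scores; the data, column names, and group size are hypothetical placeholders rather than the study data.

```python
import numpy as np
import pandas as pd

# Hypothetical pretest and post-test subtest scores (placeholders, not study data).
rng = np.random.default_rng(0)
subtests = ["figure_copy", "figure_recall", "line_orientation", "coding"]
pre = pd.DataFrame(rng.normal(50, 10, size=(44, 4)), columns=subtests)
post = pd.DataFrame(rng.normal(52, 10, size=(44, 4)), columns=subtests)

# Rescale every subtest to M = 100, SD = 10 using the *pretest* mean and SD,
# so that pretest and post-test scores share a common metric.
pre_mean, pre_sd = pre.mean(), pre.std(ddof=1)

def rescale(scores: pd.DataFrame) -> pd.DataFrame:
    return 100 + 10 * (scores - pre_mean) / pre_sd

# Index = average of the rescaled subtests, giving each subtest equal weight.
visual_index_pre = rescale(pre).mean(axis=1)
visual_index_post = rescale(post).mean(axis=1)
print(round(visual_index_pre.mean(), 1), round(visual_index_post.mean(), 1))
```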
Results
Table 3 reports the pretest means (and SDs) and pre–post-training mean change scores for both groups on the outcome measures. As expected, the control group showed no significant changes pre–post (repeated-measures F-tests, see Table 3 Control Change column). The visual-training group, by contrast, showed significant positive pre–post change for RBANS Total, visual processing, working memory, and processing speed.
Table 3.
Measure | Visual games (n = 25): Pretest M ± SD | Visual games: Change M | Control (n = 19): Pretest M ± SD | Control: Change M | Group × Test F^a | Effect size d^b
---|---|---|---|---|---|---
Primary | | | | | |
RBANS Total Scale Score | 102.8 ± 17.7 | 4.7* | 112.9 ± 17.4 | −1.3 | 4.90* | 0.33
Verbal Memory Index^c | 100.0 ± 8.0 | 1.1 | 100.9 ± 7.7 | −0.3 | 1.53 | 0.18
Visual Processing Index^d | 98.6 ± 7.1 | 2.9** | 102.5 ± 6.1 | −0.6 | 5.87* | 0.51
Secondary | | | | | |
Working Memory Index^e | 99.0 ± 6.4 | 3.7** | 101.7 ± 10.3 | 0.3 | 3.61† | 0.41
Verbal Memory 2 Index^f | 100.6 ± 8.7 | 0.8 | 100.5 ± 9.5 | 0.1 | 0.14 | 0.08
Processing Speed^g | 51.8 ± 13.2 | 3.2** | 57.3 ± 14.3 | −0.1 | 4.29* | 0.24

Note: HVLT-R = Hopkins Verbal Learning Test–Revised; M = Mean; RBANS = Repeatable Battery for the Assessment of Neuropsychological Status; SD = Standard deviation; WAIS = Wechsler Adult Intelligence Scale.
^aDegrees of freedom (1, 42). ^bEffect size based on Morris (2008): d = ([Post_Train − Pre_Train] − [Post_Control − Pre_Control]) / SD_Pooled pretest. ^cRBANS List Learning, List Recognition, List Recall, Story Recall, and Story Memory. ^dRBANS Figure Copy, Figure Recall, Line Orientation, and Coding. ^eWAIS-III Letter-Number Sequencing and RBANS Digit Span. ^fHVLT-R Total and Delayed. ^gWAIS-III Coding.
†p < .10. *p < .05. **p < .01.
The rightmost columns of Table 3 report the Group × Test interactions and the transfer effect sizes (Morris, 2008), computed as the difference in pre–post change between the training and control groups scaled by the pooled pretest standard deviation. There was a significant transfer effect (i.e., a statistically significant Group × Test interaction) for the visual-processing index, d = .51, suggesting a domain-specific transfer effect. There were also significant transfer effects for the RBANS Total and processing speed, with smaller effect sizes (d = .33 and .24, respectively).
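To make the effect-size computation concrete, the sketch below implements the Morris (2008) pretest–posttest–control formula used here; the example arrays are hypothetical placeholders, not the study data.

```python
import numpy as np

def morris_d(pre_train, post_train, pre_control, post_control):
    """Morris (2008) effect size: difference in mean pre-post change between
    the training and control groups, scaled by the pooled pretest SD."""
    change_diff = (np.mean(post_train) - np.mean(pre_train)) - \
                  (np.mean(post_control) - np.mean(pre_control))
    n_t, n_c = len(pre_train), len(pre_control)
    sd_pooled_pre = np.sqrt(((n_t - 1) * np.var(pre_train, ddof=1) +
                             (n_c - 1) * np.var(pre_control, ddof=1)) /
                            (n_t + n_c - 2))
    return change_diff / sd_pooled_pre

# Hypothetical example with 25 training and 19 control participants.
rng = np.random.default_rng(1)
pre_t, post_t = rng.normal(100, 10, 25), rng.normal(103, 10, 25)
pre_c, post_c = rng.normal(100, 10, 19), rng.normal(100, 10, 19)
print(round(morris_d(pre_t, post_t, pre_c, post_c), 2))
```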
Discussion
Modality-specific transfer was observed for the visual-processing index, with further transfer to processing speed. By contrast, Smith et al. (2009; IMPACT study) reported modality-specific transfer to verbal memory following auditory gameplay (Brain Fitness), as well as transfer to working memory and processing speed. These results suggest that direct comparison of these two game suites should produce a double-dissociation pattern of domain-specific transfer (i.e., verbal memory, visual processing) unique to auditory and visual games, respectively, and modality-general working-memory and processing-speed transfer for both suites.
The Main Study included a factorial manipulation of game type, resulting in visual-only, auditory-only, and both-game-play groups, plus a waitlist-control group. This factorial crossing allows tests of synergistic gameplay effects, as well as inclusion of both active- and passive-control groups.
Main Study
Method
Design
We used a 2 (visual training: yes, no) × 2 (auditory training: yes, no) × 3 (test session: pretest, post-test, follow-up) mixed design with test session as the only within-participants variable. The resulting four game-play groups were none, visual-only, auditory-only, and both-visual-and-auditory.
Participants
Recruitment strategies included newsletters, faith-community bulletins, community presentations, fliers, websites, and snowball sampling. Reimbursement was $50 per test session (maximum $150) plus two meetings with a geriatrician. Inclusion and exclusion criteria repeated the Pilot Study’s with two modifications: participants were excluded if they completed fewer than 20 training sessions, and additional exclusion criteria were current substance abuse and prior brain-game play exceeding roughly 1 week. In addition to self-reporting on the inclusion/exclusion criteria, participants were screened by a geriatrician, M. S. Ong. See Supplementary Figure 1 for the flow of participants. There were no significant differences across groups on any of the demographic measures (see bottom of Table 2). Waitlist-control participants were given an opportunity to play the games after the follow-up test. A power analysis based on the Pilot Study and the results of Smith et al. (2009) indicated that a total sample of N = 160 (n = 40 per training group) would provide an 85% chance of detecting the expected 2 × 2 group interactions on mean outcome-measure change scores.
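As a rough illustration of this type of power calculation, the sketch below computes power for a 1-df between-groups interaction contrast on change scores via the noncentral F distribution; the assumed Cohen's f is a placeholder, not the value the authors derived from the pilot data.

```python
from scipy.stats import f as f_dist, ncf

def interaction_power(f_effect, n_total, df_num=1, n_groups=4, alpha=0.05):
    """Power of an ANOVA contrast (e.g., a 2 x 2 interaction on change scores)
    for a given Cohen's f, using the noncentral F distribution."""
    df_denom = n_total - n_groups
    noncentrality = (f_effect ** 2) * n_total
    f_crit = f_dist.ppf(1 - alpha, df_num, df_denom)
    return 1 - ncf.cdf(f_crit, df_num, df_denom, noncentrality)

# Placeholder effect size; with f = 0.24 and N = 160, power comes out near .85.
print(round(interaction_power(f_effect=0.24, n_total=160), 2))
```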
We asked training-group participants to complete 30–40 sessions (~40 min each) in 8–10 weeks, and participants largely complied: 87.7% completed 30–40 sessions, 3.5% completed fewer than 30, and 8.8% completed more than 40. The visual-, auditory-, and both-game-training groups did not differ in mean number of sessions completed (M = 38.0, 37.9, and 38.0, respectively).
Procedure
This research was conducted under an IRB-approved protocol; all participants provided written consent. Participants were randomly assigned to one of four conditions (visual, auditory, both, or no training). Couples were assigned to the same condition to minimize awareness of the experimental manipulation. Participants used their own computers or computers in their apartment building. Training occurred through an online portal (Brain HQ; Posit Science Corporation). The visual games were updated versions of the computer-disk games used in the Pilot Study (InSight; Table 1), and the auditory games were updated versions of the Brain Fitness computer-disk games (Table 1). Participants who played both visual and auditory games alternated suites from session to session.
The coordinator who made the condition assignment handled all participant contact. A team of neuropsychological testers, approved by a clinical neuropsychologist (G. J. Demakis), was naive to experimental design and participant condition. Participants met a new tester at each testing session, with two exceptions when scheduling constraints required a repetition in testers. A separate team of test scorers was naive to experimental design, participant condition, and which test session they were scoring. All outcome measures were scored by at least two scorers and conflicts were adjudicated by M. E. Faust and K. S. Multhaup.
Outcome measures
All primary and secondary measures were the same as in the Pilot Study (Table 4), except that WAIS-III Letter-Number Sequencing and Coding were replaced with their WAIS-IV versions (Wechsler, 2008) and WAIS-IV Cancellation was added to create a processing-speed index based on two tests.
Table 4.
Outcome measure | Visual games: Change M | Visual games: Group × Test F | Visual games: Effect size^a | Auditory games: Change M | Auditory games: Group × Test F | Auditory games: Effect size^a
---|---|---|---|---|---|---
Pre–post change | | | | | |
Primary | | | | | |
RBANS Total Score | 5.9** | 6.34* | 0.42 | 3.0* | 1.10 | 0.18
Verbal Memory Index^b | 4.2** | 0.62 | 0.16 | 1.4 | 0.20 | −0.09
Visual Processing Index^c | 4.2*** | 5.58* | 0.35 | 1.9† | 0.70 | 0.12
Secondary | | | | | |
Working Memory Index^d | 4.7** | 5.82* | 0.45 | 3.1† | 2.97† | 0.32
Verbal Memory Index 2^e | −2.6† | 0.71 | −0.14 | 2.1 | 0.09 | 0.17
Processing Speed Index^f | 2.5 | 0.74 | −0.15 | 3.7* | 0.81 | −0.05
Pre-follow-up change | | | | | |
Primary | | | | | |
RBANS Total Score | 4.5** | 0.94 | 0.17 | 4.4* | 0.92 | 0.17
Verbal Memory Index^b | 3.4* | 0.04 | −0.04 | 2.2 | 0.62 | −0.14
Visual Processing Index^c | 3.6** | 4.21* | 0.35 | 3.3** | 3.57† | 0.32
Secondary | | | | | |
Working Memory Index^d | 2.9* | 2.87† | 0.31 | 2.7 | 2.70 | 0.30
Verbal Memory Index 2^e | 1.5 | 0.12 | −0.07 | 1.8 | 0.05 | −0.04
Processing Speed Index^f | 5.4** | 0.06 | 0.05 | 6.3*** | 0.45 | 0.12

Note: HVLT-R = Hopkins Verbal Learning Test–Revised; RBANS = Repeatable Battery for the Assessment of Neuropsychological Status; WAIS = Wechsler Adult Intelligence Scale. Group × Test interaction F-tests are based on (1, 147) degrees of freedom for the pre–post comparison and (1, 141) degrees of freedom for the pre-follow-up comparison.
^aEffect size based on Morris (2008): d = ([Post_Train − Pre_Train] − [Post_Control − Pre_Control]) / SD_Pooled pretest, that is, the mean change-score difference between the training and control groups scaled by the pooled pretest standard deviation. ^bRBANS List Learning, List Recognition, List Recall, Story Recall, and Story Memory. ^cRBANS Figure Copy, Figure Recall, Line Orientation, and Coding. ^dWAIS-IV Letter-Number Sequencing and RBANS Digit Span. ^eHVLT-R Total and Delayed. ^fWAIS-IV Coding and Cancellation.
†p < .10. *p < .05. **p < .01. ***p < .001.
Results
Pre–post comparisons
Each outcome measure was submitted to a 2 (visual games: yes, no) × 2 (auditory games: yes, no) × 2 (test: pretest, post-test) mixed-model analysis of variance. All outcome measures except the secondary verbal-memory index yielded a main effect of test (all F’s > 7.4, all p’s < .008). Planned pre–post repeated-measures F-tests were performed for each group on each outcome measure (Table 4 and Supplementary Table 1). The both-gameplay and control groups yielded no significant pre–post changes on any outcome measure (all p’s > .10), except for the control group’s significant improvement in processing speed (p < .05). The visual-gameplay group yielded significant pre–post change on all outcome measures except verbal-memory index 2 and processing speed; by contrast, the auditory-gameplay group yielded significant pre–post change only on the global-cognition measure and processing speed (Table 4).
There was a significant Visual Games × Auditory Games × Test interaction for the RBANS Total, F(1, 147) = 6.162, p = .014, η² = .040; the visual-processing index, F(1, 147) = 5.980, p = .016, η² = .039; and the working-memory index, F(1, 147) = 4.578, p = .034, η² = .030. None of the other three-way interactions was significant (all p’s > .724). Figure 1 shows the significant interactions with the test factor collapsed and expressed as a difference score. The depicted two-way interactions are equivalent to the three-way interactions in the full three-factor design and were explicitly tested as such; mean change scores are shown for clarity. Counter to our predictions, all three-way interactions depicted in Figure 1, marked by the large bracket including all four groups, are under-additive rather than over-additive and thus provide no evidence for a synergistic combination of visual and auditory games.
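Because the three-way interaction with test is equivalent to a 2 × 2 between-groups interaction on pre–post change scores, it can be tested with an ordinary factorial ANOVA on change scores. The sketch below illustrates that equivalence with statsmodels; the data frame and its columns are hypothetical placeholders, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical data: one row per participant (placeholders, not the study data).
rng = np.random.default_rng(2)
n = 151
df = pd.DataFrame({
    "visual": rng.integers(0, 2, n),    # 1 = played visual games
    "auditory": rng.integers(0, 2, n),  # 1 = played auditory games
})
df["change"] = rng.normal(2, 5, n) + 3 * df["visual"]  # pre-post change score

# Visual x Auditory interaction on change scores
# (equivalent to the Visual x Auditory x Test interaction in the mixed design).
model = smf.ols("change ~ C(visual) * C(auditory)", data=df).fit()
print(anova_lm(model, typ=2))
```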
Table 4 presents the Group × Test interaction results for the visual- and auditory-gameplay groups, each compared with the control group; no significant interactions involving the both-gameplay group were found (all p’s > .10). Table 4 also reports the transfer effect size (Morris, 2008; d = ([Post_Train − Pre_Train] − [Post_Control − Pre_Control]) / SD_Pooled pretest) associated with each interaction test. The three significant transfer effects were all found for the visual-training group: RBANS Total (d = .42), visual processing (d = .35), and working memory (d = .45); there were no significant transfer effects for the auditory-gameplay group. As in the Pilot Study, there is evidence for domain-specific transfer effects in the visual-training group. Unlike the Pilot Study, there was no transfer effect for processing speed. The working-memory and RBANS Total transfer effects may reflect a more general training effect.
Pre-follow-up comparisons
There were no significant Auditory Games × Visual Games × Test (pretest, follow-up) interactions (all p’s > .10) of the kind reported for the pre–post comparisons. As indicated in Table 4, the only significant transfer effect was for the visual-gameplay group on the visual-processing index (d = .35, p < .05). However, the visual-gameplay group yielded a marginal transfer effect for the working-memory index (d = .31, p < .10), and the auditory-gameplay group yielded a marginal transfer effect for the visual-processing index (d = .32, p < .10).
Discussion
The Main Study replicated and extended the Pilot Study’s finding of modality-specific transfer effects. Figure 1 shows that transfer effects were significantly modulated across the four game-training groups for the global-cognition, visual-processing, and working-memory indexes. Games with visual challenges yielded pre–post improvements on the visual-processing, working-memory, and global-cognition outcome measures above and beyond any test-learning effect in the control group. Only the targeted visual-processing transfer effect remained at the 3-month follow-up, with a stable effect size (d = .35), demonstrating durability. The observation of transfer to the global-cognition measure (which includes all 12 RBANS subtests) for the visual-gameplay group at the pre–post comparison, but not at the pre-follow-up comparison, suggests that the visual-processing and working-memory transfer effects were the primary drivers of the pre–post transfer to the global-cognition measure.
What the present data did not yield, however, was a double dissociation in which auditory and visual gameplay each produced modality-specific transfer to its targeted cognitive domain. The auditory-gameplay group did not produce any significant (control-group-adjusted) transfer effects. By contrast, Smith et al. (2009) reported auditory-gameplay transfer to verbal memory and working memory in older adults, with control-group-adjusted effect sizes in the d = .20–.30 range that were statistically significant with their large sample size (N > 240 in each of the training and control groups). Several of these transfer effects remained significant at the 3-month follow-up (Zelinski et al., 2011). The cross-study discrepancy may be due to the present study’s use of updated Brain Fitness games. Moreover, the present study did provide some marginal evidence of transfer to working memory in our auditory-gameplay group (d = .32 and .30 at post-test and follow-up, respectively, .05 < p’s < .11), consistent with Smith et al. (2009).
One strength of the Main Study’s design was the inclusion of a both-games group that allowed assessment of possible synergy from training in both the visual and auditory domains. The present lack of synergistic effects is surprising: even though there were no significant transfer effects for the auditory-gameplay group, we would still expect the visual gameplay of the both-game group to yield transfer to visual processing and working memory. One explanation for the lack of transfer in the both-game group is that there may be a threshold of modality-specific training that must be reached before transfer is possible. We chose to hold the total number of training sessions constant across groups, which means that the both-game group had roughly half as many visual-domain sessions as the visual-training group and roughly half as many auditory-domain sessions as the auditory-training group. If we had instead held constant the number of domain-specific sessions, the both-game group would have had twice as much training as the single-domain groups. In short, the observed under-additive interactions (Figure 1) could have been due to the amount of training. Another possibility is that alternating visual and auditory games across sessions in the both-gameplay group produced learning interference between sessions.
Future work could explore the influence of multiple gameplay types by controlling amount of training and the mix of games across groups. Synergistic effects may be detected if domains are trained within a session rather than across them.
General Discussion
To better understand the modality-specific and durable visual-gameplay transfer effects observed in both samples, we conducted a combined post-hoc analysis. A principal components analysis (N = 195) of the nine RBANS subtests revealed a two-component solution (54% of variance accounted for; loadings of .66–.85 for the five verbal-memory subtests and .52–.76 for the four visual-processing subtests), confirming the results of Duff et al. (2006, 2009). We also computed control-group-adjusted pre–post effect sizes and Group × Test interactions (N = 56 and 64 for the control and visual groups, respectively) for the four visual-processing subtests. Effect sizes were d = .4, .2, .2, and .2 for the Figure Copy, Figure Recall, Coding, and Line Orientation subtests, respectively, indicating that Figure Copy contributed more than the other subtests to the significant transfer on the modality-specific visual-processing index (confirmed by Figure Copy yielding the only significant Group × Test interaction, p = .022). However, the distribution of effect sizes for the other subtests suggests a sizeable and balanced contribution to transfer. Whether the modality-specific transfer from visual gameplay is dominated by a spectrum of specific near-transfer effects (e.g., Oei & Patterson, 2014), consistent with the dominant contribution of transfer to the Figure Copy task, or by a task-general process such as learning to learn (Bavelier et al., 2012), consistent with broad contributions from all of the subtests, remains to be determined.
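A minimal sketch of this kind of principal components check is shown below, using scikit-learn on standardized subtest scores; the data and column names are hypothetical stand-ins for the nine RBANS subtests, not the study data.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical scores for nine subtests (stand-ins for the RBANS subtests).
rng = np.random.default_rng(3)
verbal = rng.normal(size=(195, 1)) + rng.normal(scale=0.5, size=(195, 5))
visual = rng.normal(size=(195, 1)) + rng.normal(scale=0.5, size=(195, 4))
cols = [f"verbal_{i}" for i in range(1, 6)] + [f"visual_{i}" for i in range(1, 5)]
scores = pd.DataFrame(np.hstack([verbal, visual]), columns=cols)

# Two-component PCA on standardized subtests; loadings = eigenvector * sqrt(eigenvalue).
pca = PCA(n_components=2).fit(StandardScaler().fit_transform(scores))
loadings = pd.DataFrame(
    pca.components_.T * np.sqrt(pca.explained_variance_),
    index=cols, columns=["PC1", "PC2"],
)
print(f"Variance explained: {pca.explained_variance_ratio_.sum():.0%}")
print(loadings.round(2))
```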
Evidence of modality-general transfer was observed for working memory (Main Study) following visual gameplay, likely due to near transfer from the spatial-working-memory game included in InSight (Table 1). This is consistent with the literature on transfer following working-memory training, which is primarily confined to near transfer between working-memory tasks (Gathercole et al., 2019). Moreover, the transfer to working memory in the Main Study was not modality-specific: InSight included a spatial-working-memory game, but the working-memory index combined the RBANS Digit Span and WAIS-IV Letter-Number Sequencing subtests, both of which involve auditory presentation and serial retrieval of verbal items (Table 4).
Evidence of modality-general transfer was observed for processing speed in the Pilot Study, but not the Main Study, following visual gameplay. The InSight games (Table 1) were inspired, in part, by the speed-of-processing training used in the ACTIVE study (Ball et al., 2002), which yielded durable transfer to measures of processing speed (Edwards et al., 2013). Comparison of Tables 3 and 4 shows that the Main Study added WAIS-IV Cancellation as a second test of processing speed, and further examination revealed that the Cancellation test showed strong test effects in the control group. We conducted a post-hoc comparison of the combined visual-gameplay and control groups and found a marginal Group × Test interaction (p < .08), consistent with the Pilot Study.
Conclusion
Gameplay that provided visual challenges yielded durable modality-specific transfer to visual-processing tasks and weaker evidence of modality-general near transfer to working-memory and processing-speed tasks. Transfer from the auditory games did not reach statistical significance. No synergistic effects were observed for multimodal (visual and auditory) training. Future investigations of potential synergistic effects of multimodal training would benefit from mixing visual- and auditory-training games within sessions, as opposed to across sessions as in the present research.
Supplementary Material
Supplementary data is available at The Journals of Gerontology, Series B: Psychological Sciences and Social Sciences online.
Supplementary Figure 1. Flow of participants in the Main Study sample of the adaptive computer-games study.
Funding
This work was supported by the National Institute on Aging (1R15AG038879-01A1). The pilot sample study was supported by an internal award to K. S. Multhaup from the Department of Psychology, Davidson College.
Conflict of Interest
None of the authors received any monetary compensation from Posit Science Corporation, San Francisco, CA, nor did Posit Science fund any portion of the research.
Acknowledgments
The authors would like to thank our research coordinators, Sarah Daniels, Savannah R. Erwin, Meg DiDonato, and our testing and scoring teams. The authors would also like to thank the scientific support staff at Posit Science Corporation, San Francisco, CA, for consulting regarding use of their group gaming portals for scientific studies of the online versions of their games in the main study sample. The authors thank the Department of Psychology, Davidson College, for contributing funds to purchase computer game disks, and neuropsychology testing forms for this project.
References
- Anguera J. A., Boccanfuso J., Rintoul J. L., Al-Hashimi O., Faraji F., Janowich J.,…Gazzaley A (2013). Video game training enhances cognitive control in older adults. Nature, 501, 97–101. doi:10.1038/nature12486
- Ball K., Berch D. B., Helmers K. F., Jobe J. B., Leveck M. D., Marsiske M.,…Willis S. L.; Advanced Cognitive Training for Independent and Vital Elderly Study Group (2002). Effects of cognitive training interventions with older adults: A randomized controlled trial. JAMA, 288, 2271–2281. doi:10.1001/jama.288.18.2271
- Barnes D. E., Santos-Modesitt W., Poelke G., Kramer A. F., Castro C., Middleton L. E., & Yaffe K (2013). The Mental Activity and eXercise (MAX) trial: A randomized controlled trial to enhance cognitive function in older adults. JAMA Internal Medicine, 173, 797–804. doi:10.1001/jamainternmed.2013.189
- Bavelier D., Green C. S., Pouget A., & Schrater P (2012). Brain plasticity through the life span: Learning to learn and action video games. Annual Review of Neuroscience, 35, 391–416. doi:10.1146/annurev-neuro-060909-152832
- Belchior P., Marsiske M., Sisco S. M., Yam A., Bavelier D., Ball K., & Mann W. C (2013). Video game training to improve selective visual attention in older adults. Computers in Human Behavior, 29, 1318–1324. doi:10.1016/j.chb.2013.01.034
- Brandt J., & Benedict R. H. B (2001). Hopkins Verbal Learning Test–Revised. Lutz, FL: PAR.
- Braver T. S., & West R (2008). Working memory, executive control, and aging. In Craik F. I. M. and Salthouse T. A. (Eds.), The handbook of aging and cognition (3rd ed., pp. 311–372). New York, NY: Psychology Press. doi:10.4324/9780203837665.ch7
- Burdick D. J., Rosenblatt A., Samus Q. M., Steele C., Baker A., Harper M.,…Lyketsos C. G (2005). Predictors of functional impairment in residents of assisted-living facilities: The Maryland Assisted Living study. The Journals of Gerontology, Series A: Biological Sciences and Medical Sciences, 60, 258–264. doi:10.1093/gerona/60.2.258
- Delahunt P. B., Hardy J. L., Brenner D. F., Chan S. C., Dewey J. A., Mahncke H. W.,…Merzenich M. M (2008). InSight: Scientific principles of a brain-plasticity-based visual training program. San Francisco, CA: Posit Science Corporation.
- Duff K., Langbehn D. R., Schoenberg M. R., Moser D. J., Baade L. E., Mold J.,…Adams R. L (2006). Examining the repeatable battery for the assessment of neuropsychological status: Factor analytic studies in an elderly sample. The American Journal of Geriatric Psychiatry, 14, 976–979. doi:10.1097/01.JGP.0000229690.70011.cd
- Duff K., Langbehn D. R., Schoenberg M. R., Moser D. J., Baade L. E., Mold J. W.,…Adams R. L (2009). Normative data on and psychometric properties of Verbal and Visual Indexes of the RBANS in older adults. The Clinical Neuropsychologist, 23, 39–50. doi:10.1080/13854040701861391
- Edwards J. D., Valdés E. G., Peronto C., Castora-Binkley M., Alwerdt J., Andel R., & Lister J. J (2013). The efficacy of InSight cognitive training to improve useful field of view performance: A brief report. The Journals of Gerontology, Series B: Psychological Sciences and Social Sciences, 70, 417–422. doi:10.1093/geronb/gbt113
- Folstein M., Folstein S., & McHugh P (1975). “Mini-mental state”: A practical method for grading the cognitive state of patients for the clinician. Journal of Psychiatric Research, 12, 189–198. doi:10.1016/0022-3956(75)90026-6
- Gathercole S., Dunning D., Holmes J., & Norris D (2019). Working memory training involves learning new skills. Journal of Memory and Language, 105, 19–42. doi:10.1016/j.jml.2018.10.003
- Kueider A. M., Parisi J. M., Gross A. L., & Rebok G. W (2012). Computerized cognitive training with older adults: A systematic review. PLoS One, 7, e40588. doi:10.1371/journal.pone.0040588
- Lampit A., Hallock H., & Valenzuela M (2014). Computerized cognitive training in cognitively healthy older adults: A systematic review and meta-analysis of effect modifiers. PLoS Medicine, 11, e1001756. doi:10.1371/journal.pmed.1001756
- Mahncke H. W., Bronstone A., & Merzenich M. M (2006). Brain plasticity and functional losses in the aged: Scientific bases for a novel intervention. Progress in Brain Research, 157, 81–109. doi:10.1016/S0079-6123(06)57006-2
- Morris S. (2008). Estimating effect sizes from pretest-posttest-control group designs. Organizational Research Methods, 11, 364–386. doi:10.1177/1094428106291059
- Oei A. C., & Patterson M. D (2014). Are videogame training gains specific or general? Frontiers in Systems Neuroscience, 8, 54. doi:10.3389/fnsys.2014.00054
- Park D. C., Lautenschlager G., Hedden T., Davidson N. S., Smith A. D., & Smith P. K (2002). Models of visuospatial and verbal memory across the adult life span. Psychology and Aging, 17, 299–320. doi:10.1080/02724980443000179
- Peretz C., Korczyn A. D., Shatil E., Aharonson V., Birnboim S., & Giladi N (2011). Computer-based, personalized cognitive training versus classical computer games: A randomized double-blind prospective trial of cognitive stimulation. Neuroepidemiology, 36, 91–99. doi:10.1159/000323950
- Randolph C. (1998). Repeatable Battery for the Assessment of Neuropsychological Status. San Antonio, TX: Psychological Corporation.
- Salthouse T. A. (1996). The processing-speed theory of adult age differences in cognition. Psychological Review, 103, 403–428. doi:10.1037/0033-295X.103.3.403
- Salthouse T. A. (2009). Decomposing age correlations on neuropsychological and cognitive variables. Journal of the International Neuropsychological Society, 15, 650–661. doi:10.1017/S1355617709990385
- Salthouse T. A., & Davis H. P (2006). Organization of cognitive abilities and neuropsychological variables across the lifespan. Developmental Review, 26, 31–54. doi:10.1016/j.dr.2005.09.001
- Simons D. J., Boot W. R., Charness N., Gathercole S. E., Chabris C. F., Hambrick D. Z., & Stine-Morrow E. A (2016). Do “Brain-Training” programs work? Psychological Science in the Public Interest, 17, 103–186. doi:10.1177/1529100616661983
- Smith G. E., Housen P., Yaffe K., Ruff R., Kennison R. F., Mahncke H. W., & Zelinski E. M (2009). A cognitive training program based on principles of brain plasticity: Results from the Improvement in Memory with Plasticity-based Adaptive Cognitive Training (IMPACT) study. Journal of the American Geriatrics Society, 57, 594–603. doi:10.1111/j.1532-5415.2008.02167.x
- Taatgen N. A. (2013). The nature and transfer of cognitive skills. Psychological Review, 120, 439–471. doi:10.1037/a0033138
- Toril P., Reales J. M., & Ballesteros S (2014). Video game training enhances cognition of older adults: A meta-analytic study. Psychology and Aging, 29, 706–716. doi:10.1037/a0037507
- Wechsler D. (1997). Wechsler Adult Intelligence Scale (3rd ed.). San Antonio, TX: Pearson.
- Wechsler D. (2008). Wechsler Adult Intelligence Scale (4th ed.). San Antonio, TX: Pearson.
- Wilson R. S., Boyle P. A., Segawa E., Yu L., Begeny C. T., Anagnos S. E., & Bennett D. A (2013). The influence of cognitive decline on well-being in old age. Psychology and Aging, 28, 304–313. doi:10.1037/a0031196
- Wolinsky F. D., Vander Weg M. W., Howren M. B., Jones M. P., & Dotson M. M (2013). A randomized controlled trial of cognitive training using a visual speed of processing intervention in middle aged and older adults. PLoS ONE, 8, e61624. doi:10.1371/journal.pone.0061624
- Zelinski E. M., Peters K. D., Hindin S., Petway K. T. 2nd, & Kennison R. F (2014). Evaluating the relationship between change in performance on training tasks and on untrained outcomes. Frontiers in Human Neuroscience, 8, 617. doi:10.3389/fnhum.2014.00617
- Zelinski E. M., Spina L. M., Yaffe K., Ruff R., Kennison R. F., Mahncke H. W., & Smith G. E (2011). Improvement in memory with plasticity-based adaptive cognitive training: Results of the 3-month follow-up. Journal of the American Geriatrics Society, 59, 258–265. doi:10.1111/j.1532-5415.2010.03277.x