The Journals of Gerontology Series B: Psychological Sciences and Social Sciences
2025 Oct 4;81(1):gbaf189. doi: 10.1093/geronb/gbaf189

Cognitive dedifferentiation in later life: longitudinal findings from the Lothian Birth Cohort 1936

Joanna E Moodie, Janie Corley, Ian J Deary, Simon R Cox
Editor: Gali Weissberger
PMCID: PMC12779353  PMID: 41043000

Abstract

Objectives

In the cognitive aging literature, the dedifferentiation hypothesis refers to cognitive skills becoming more interrelated in older adulthood. Here, we report evidence for cognitive dedifferentiation in the Lothian Birth Cohort 1936 (LBC1936).

Methods

The LBC1936 is a narrow-age cohort assessed at 5 waves between ages 70 and 82. We analyzed data from 418 participants (49% male) who provided cognitive data at all 5 waves.

Results

In single-order structural equation models, the percentage of variance that general cognitive functioning (g) accounted for across 13 cognitive tests increased by wave (w1 to w5: 25%, 27%, 29%, 31%, 36%), and the group-level rate of dedifferentiation closely tracked the group-level rate of cognitive decline (r = −.991, p = .001). A hierarchical model, which included 4 cognitive domains as mid-level factors, provided evidence of cognitive dedifferentiation at the cognitive domain level: fluid cognitive domains (Visuospatial Skills, Processing Speed, and Verbal Memory) converged, and Crystallised Ability became less influential on the structure of g over time. We also show that this group-level measure of dedifferentiation reflects the individual-level measure of dispersion (people tend to score more similarly across different cognitive tests with advancing age), r = −.989, p = .001.

Discussion

The current results have implications for longitudinal g modeling choices: it cannot be assumed that g’s composition is the same over time. Future longitudinal research will be important in clarifying the incremental validity, determinants, mechanisms, and implications of cognitive differentiation and dedifferentiation across the lifespan.

Keywords: Cognitive aging, Dispersion, General cognitive functioning, Longitudinal cohort


Cognitive test scores tend to be positively correlated, such that performance level on one cognitive test predicts similar performance on all other tests across a broad range of cognitive domains. When scores from multiple cognitive tests spanning various cognitive domains are considered, a robust component or latent factor of general cognitive functioning (g) can be derived, which typically explains a large minority (∼30%–40%) of the variance (Johnson et al., 2004, 2008; Salthouse, 2005). This construct is one of the most replicated phenomena in psychological science (Deary, 2012; Panizzon et al., 2014), and its individual differences correlate with important life outcomes, including everyday functioning, health, illness, aging, dementia, and mortality (Deary et al., 2009; Jonas et al., 2022).

A key debate remains: to what extent does the interrelation between cognitive test scores change over the lifespan? Changes in the covariation among cognitive test scores over time are referred to as cognitive differentiation (where cognitive skills become more distinct from one another) and cognitive dedifferentiation (where cognitive skills become more interrelated). The cognitive differentiation-dedifferentiation hypothesis suggests that differentiation occurs in early life (e.g., Burt, 1954; Li et al., 2004), and dedifferentiation tends to occur in later life (e.g., Babcock et al., 1997; Baltes et al., 1980; Blum & Holling, 2017; Deary et al., 2004; de Frias et al., 2007; Ghisletta & de Ribaupierre, 2005; Ghisletta & Lindenberger, 2003, 2004; Hülür et al., 2015). However, some studies find little or no evidence of such effects, and others find more complex patterns (Anstey et al., 2003; Batterham et al., 2011; Breit et al., 2025; de Mooij et al., 2018; Hartung et al., 2018; Sims et al., 2009; Tucker-Drob, 2009; Tucker-Drob & Salthouse, 2008; Whitley et al., 2016).

One possible explanation for at least part of any cognitive dedifferentiation in older age is the general decline in cognitive functioning. Previous findings provide cross-sectional evidence of increased dedifferentiation at lower levels of cognitive performance (Deary & Pagliari, 1991; Deary et al., 1996; Detterman & Daniel, 1989; Spearman, 1927). On a biological level, there might be age-related declines in neuronal specificity, which could then affect cognitive performance (Cox et al., 2016; Li et al., 2001; Park & Reuter-Lorenz, 2009; Raz, 2000).

Several conceptual models are relevant to the phenomenon of cognitive dedifferentiation. Tucker-Drob et al. (2019) define the concepts of static and dynamic dedifferentiation. Static dedifferentiation refers to the cross-sectional observation that cognitive abilities become more highly correlated in older adults, suggesting convergence across cognitive domains in later life. Dynamic dedifferentiation focuses on intraindividual change over time, suggesting that, as individuals age, domain-based patterns of cognitive change become more synchronized across domains, which, over time, increases the proportion of variance accounted for by g. Tucker-Drob et al. (2019) showed that the shared variance in individual differences in cognitive change increased from about 45% at age 35 to about 70% by age 85. Accumulated effects of cognitive changes, across many years, could help to explain static dedifferentiation effects in later life. Another concept closely connected to dedifferentiation is that of dispersion (Lindenberger & Baltes, 1997), which refers to the intraindividual variability in performance across different cognitive tasks (in contrast to dedifferentiation, which is, by definition, a group-level construct). Changes in dispersion in older age may reflect dedifferentiation and have similarly been interpreted as a marker of neurological decline or reduced efficiency in neural processing (Buczylowska & Petermann, 2018; Lindenberger & Baltes, 1997; Mella et al., 2016; Rapp et al., 2005).

Understanding cognitive differentiation and dedifferentiation is meaningful because these phenomena help to characterize cognitive trajectories across the life course and could aid the identification of early markers of cognitive decline and distinguish healthy aging from pathological aging. For example, a study found that cognitive dedifferentiation was more pronounced in two samples with marked cognitive decline compared to controls, when controlling for age, sex, and education (Wallert et al., 2021), suggesting that measures of dedifferentiation could provide information of clinical relevance. However, the incremental predictive value of cognitive dedifferentiation in addition to cognitive decline is yet to be clearly demonstrated.

To date, much of the research on cognitive differentiation and dedifferentiation has relied on cross-sectional designs, comparing different individuals at different life stages. While often informative, cross-sectional designs are limited by potential confounds, for example, factors that differ across age groups, and cohort effects. Longitudinal designs, in contrast, allow for the measurement of changes in cognitive interrelations over time within the same individuals, enabling direct analysis (e.g., Hülür et al., 2015). Inconsistent results in this field might be due to tests covering different domains; samples with wide age ranges that include periods of differentiation, structural stability, and dedifferentiation (thus masking effects); small or unrepresentative samples; or unaccounted-for pathology. Thus, investigations in well-characterized cohorts of exclusively older adults across a wide range of cognitive tests and domains can substantially inform this line of enquiry.

In the current paper, we test for evidence of cognitive dedifferentiation using data from the Lothian Birth Cohort 1936 (LBC1936), a narrow-age longitudinal cohort with data from five waves collected approximately every 3 years between the ages of 70 and 82 years. By analyzing this informative data set, we aim to provide new insights into the dynamics of cognitive dedifferentiation in older adulthood.

Method

Participants

The LBC1936 is a longitudinal study of a sample of community-dwelling older adults who were born in 1936, most of whom took part in the Scottish Mental Survey of 1947 when they were ∼11 years old, and who volunteered to participate in this cohort study at ∼70 years old (Deary et al., 2007; Taylor et al., 2018; https://lothian-birth-cohorts.ed.ac.uk/). The inclusion criteria for the LBC1936 cohort were having sat the Moray House Test at school in Scotland at age 11 and living in the Edinburgh area at the start of cohort testing in 2004. The current sample were all White Scottish, and the data included here were collected in Edinburgh between 2004 and 2019. N = 418 participants completed at least one cognitive test at all five waves (waves 1 to 5, w1 to w5). These 418 participants constitute the present analytic sample (49% male). Data were collected approximately every 3 years. The mean age (years) per wave was as follows: w1 = 69.46 (SD = .85), w2 = 72.44 (SD = .72), w3 = 76.20 (SD = .68), w4 = 79.28 (SD = .63), and w5 = 82.00 (SD = .48). The minimum number of participants for any one test at any wave was N = 371 (89% of the sample), for both inspection time total in w4 and verbal paired associates in w5 (a grid showing the completeness of data by test is shown in Supplementary Figure 1 [see online supplementary material]).

There was an association between w1 cognitive test scores and participants having any missing waves from w2 to w5 (0 = complete data, 1 = some missing data on the relevant cognitive test w2 to w5), such that people with missing data had slightly lower cognitive scores at w1 (across tests, mean β = −.132, SD = .106, mean p = .0002, SD = .0007, the full results are in Supplementary Table 1 [see online supplementary material]). Therefore, the subset of participants with complete data (N = 418) that we include in this analysis tended to have slightly higher cognitive scores than the full sample. Further differences between completers and non-completers in the LBC1936 cohort were analyzed in a previous study (Corley et al., 2023).

To assess whether results reflected normative aging processes, we conducted sensitivity analyses excluding those with a subsequent dementia diagnosis. No participants had dementia at w1. Dementia diagnoses for the LBC1936 are ascertained through a three-step process: Electronic Health Records and death certificate data, a home visit to some participants, and a consensus review board meeting with experienced dementia experts (Mullin et al., 2023). The LBC1936 is an ongoing longitudinal study, and the dementia diagnoses included here were correct to the best of our knowledge as of the 6th of March 2025. Out of the N = 418 participants included in the current study, N = 58 have had subsequent dementia diagnoses. Therefore, N = 360 were included in our sensitivity analyses.

The LBC1936 study was given ethical approval by the Multi-Centre Research Ethics Committee for Scotland (MREC/01/0/56), the Lothian Research Ethics Committee (LREC/2003/2/29), and the Scotland A Research Ethics Committee (07/MRE00/58). All participants gave written informed consent.

Cognitive tests

The 13 cognitive tests are described in detail elsewhere (Deary et al., 2007; Ritchie et al., 2016; Tucker-Drob et al., 2014). They cover four cognitive domains: Crystallised Ability, Visuospatial Skills, Verbal Memory, and Processing Speed.

See Supplementary Tables 2 and 3 (see online supplementary material) for details of individual cognitive tests. Two cognitive tests had skewed distributions and so were transformed: inspection time total was squared, and choice reaction time was raised to the power of minus two. All variables were scaled before being entered into the structural equation modeling (SEM) analyses. Correlations between the 13 cognitive tests for each wave are shown in Supplementary Figure 2 (see online supplementary material).
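The preprocessing steps described above can be sketched as follows. The paper's analyses were run in R; this Python fragment is purely illustrative, and the function and variable names are our own, not the authors'.

```python
import numpy as np

def preprocess(inspection_time, choice_rt):
    """Illustrative preprocessing following the transforms described in the text.

    inspection_time, choice_rt: 1-D arrays of raw scores (hypothetical data).
    Returns the transformed, z-scaled scores.
    """
    it = np.asarray(inspection_time, dtype=float) ** 2    # squared to reduce skew
    crt = np.asarray(choice_rt, dtype=float) ** -2        # raised to the power of -2

    def zscale(x):
        # scale each variable (mean 0, SD 1) before entry into the SEM models
        return (x - x.mean()) / x.std(ddof=1)

    return zscale(it), zscale(crt)
```

After this step, every variable enters the models on a common standardized scale, which is what makes the loadings across tests directly comparable.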

Latent g models

We used SEM with the lavaan package (v0.6.17, Rosseel, 2012) in R (v4.2.0, R Team, 2022) to model a latent g factor at each wave. We ran two main types of models to assess and characterize any dedifferentiation effects: single-order models and a hierarchical model.

For the single-order models, the 13 cognitive tests from each wave directly loaded onto the relevant wave’s g factor (the 13 cognitive tests at w1 were loaded onto g_w1, those from w2 onto g_w2, etc.). Figure 1 shows a diagram of the single-order models. We ran four types of single-order models:

Figure 1.

A central latent factor labeled g points to 13 observed cognitive test variables, each represented in a box. Arrows extend from g to each test.

Diagram of single-order latent g model structure. RT = Reaction Time; NART = National Adult Reading Test; WTAR = Wechsler Test of Adult Reading.

  • Model A: g intercept estimated, cognitive test intercepts fixed to 0:

    • Model A (all waves; this is the main single-order model of interest)

    • Model A1, A2, A3, A4, A5 (separate waves, 5 models)

  • Model B: g intercept fixed to 0, cognitive test intercepts estimated:

    • Model B (all waves)

    • Model B1, B2, B3, B4, B5 (separate waves, 5 models)

Supplementary Figure 5 (see online supplementary material) illustrates the different model specifications of Model A and Model B. Model A (all waves) was the main model of interest for two reasons. First, obtaining relative g scores across waves (which enabled testing correlations between group-level dedifferentiation effects and group-level cognitive change) required estimating the latent intercept and fixing the cognitive test intercepts to zero. Second, the relevant tests (e.g., tests of measurement invariance) required all waves to be included in the same model. Conventionally, to estimate latent scores within SEM, the latent intercept is fixed to zero and the cognitive test intercepts are estimated (as in Model B; we include Model B to validate our results against this more conventional latent variable modeling approach). Supplementary Figure 6 (see online supplementary material) illustrates that the means of the g estimates from Model A change over time (reflecting that, generally, older adults do not score as highly as their younger selves), whereas in Model B, the group-level g estimates at each wave have a mean ≈ 0. For all-waves models, we set covariances between gs at different waves to zero because our aim was to calculate wave-specific latent g scores. We also ran g measurement models for each of the waves separately (models numbered 1 to 5, denoting each of the five waves) for validation and comparison purposes; the lavaan code and results for all models are presented in the Supplementary material.
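For concreteness, the "% of variance accounted for by g" statistic reported throughout the Results can be computed from a wave's standardized loadings as the mean of their squares (each squared standardized loading is the share of that indicator's variance captured by g). This is a sketch of that computation only; the authors' exact extraction from lavaan may differ slightly, e.g., where residual covariances are modeled.

```python
import numpy as np

def pct_variance_explained(std_loadings):
    """Percent of total indicator variance captured by a single latent factor,
    given standardized loadings: the mean of the squared loadings, times 100."""
    lam = np.asarray(std_loadings, dtype=float)
    return 100.0 * float(np.mean(lam ** 2))
```

For example, 13 tests all loading at .5 would give 25% of variance accounted for, close to the w1 figure reported below.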

In these single-order models, we included a covariance between NART and WTAR, which have particularly high within-wave correlations in all waves (at each wave, r > .794), as these are highly similar cognitive tests involving word pronunciation.

An additional hierarchical analysis was conducted where, for each wave, the 13 cognitive tests load onto four cognitive domains: Crystallised Ability, Visuospatial Skills, Verbal Memory, and Processing Speed, and the four domains load onto g (see Figure 2), as modeled previously in this cohort (Ritchie et al., 2016; Tucker-Drob et al., 2014). We used a typical structural equation modeling framework and fixed covariances between g factors from different waves to 0. This modeling framework is comparable to Model B listed above, because the specification used in Model A (fixing cognitive test intercepts to 0) can distort mid-level factor loadings in hierarchical SEM models, which are the loadings of interest in this analysis. We did not specify any test-wise covariances in the hierarchical model, as our aim was for the domain factors to capture the maximum shared variance among their respective indicators.

Figure 2.

Hierarchical latent g model diagram: general factor g points to four domain-specific factors, each linking to three or four observed cognitive tests, 13 total.

Diagram of the hierarchical latent g model structure. Proc. Speed = Processing Speed; RT = Reaction Time; NART = National Adult Reading Test; WTAR = Wechsler Test of Adult Reading.

Data availability

To access the Lothian Birth Cohort data, see https://lothian-birth-cohorts.ed.ac.uk/data-access-collaboration. The code for all structural equation models is included in the Supplementary material. R version 4.2.0 (R Team, 2022) and lavaan 0.6.17 (Rosseel, 2012) were used for analyses.

Results

Descriptive statistics of all cognitive tests for all waves are in Supplementary Table 3 (see online supplementary material). Their distributions are shown in Supplementary Figure 3 (see online supplementary material), and their mean changes across waves in Supplementary Figure 4 (see online supplementary material). The standardized loadings and related results from the single-order latent variable models are shown in Supplementary Tables 5–8 (see online supplementary material), and those from the hierarchical latent variable model are in Supplementary Table 9 (see online supplementary material).

Single-order models (without domains)

In this section, unless stated otherwise, we report the results from Model A (all waves), but results from all models are in the Supplementary material.

The model fits for models A, A1–A5, B, and B1–B5 are presented in Supplementary Table 4 (see online supplementary material). Common model-fit criteria (CFI > .95, TLI > .95, RMSEA < .06, and SRMR < .08; Hu & Bentler, 1999) are less relevant than usual here, due to our modeling choices. As expected, the model fit of the “all waves” models was poorer than for those that model each wave separately: the longitudinal covariance paths that we would expect to be strong across g scores at different waves were intentionally omitted from the “all waves” models, because our aim was to calculate wave-specific latent g scores. However, their strong correspondence to our separate per-wave measurement models (discussed in the next section) indicates no model misspecification. Additionally, the fits of the separate-wave models are somewhat poorer than classic criteria for model fit because, for the reasons described in the Methods section, within-domain covariances were not included in these single-order models.

Correlations between g estimates across waves

There are strong correlations between g estimates at each wave (Model A, all waves: r range = .79 to .93). These correlations decrease in magnitude as the time between waves increases. For example, w1 g scores correlate at r = .89 with w2, r = .87 with w3, r = .86 with w4, and r = .79 with w5. In other words, people score similarly across waves (e.g., people who score highly compared to the rest of the group at one wave are likely to score highly at other waves), although this relationship weakens slightly with increased time between testing sessions. A similar trend was found for Model B (all waves, see Supplementary Figure 6 [see online supplementary material]), corroborating this finding.

Outcome reliability between models

To test the impact of different model types on outputs, we tested the consistency of g scores between the model types, and the results suggest high reliability. First, we tested the correlations of extracted g scores between “all waves” models versus the separate waves models. All correlations for extracted g scores matched by waves were, for Model A (between all waves and separate models), r > .995, and all factor congruence for loadings > .98; and for all correlations for Model B (between all waves and separate models), r = 1, and all factor congruence for loadings > .97. Then, we tested the correlations between the extracted g scores for Model A (all waves) and Model B (all waves). The extracted g estimates from Model A and Model B (all waves) were highly correlated, at r > .983 for all five waves (all measures of factor congruence for loadings > .97). Therefore, whereas the model syntax and model fits are different between the model types, the outcomes in terms of g estimates and loadings have high reliability.
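The factor congruence values reported in this section compare loading vectors between models. Assuming these are Tucker's congruence coefficients (the standard choice, implemented for example in the psych package in R), the computation is a normalized inner product of the two loading vectors; this Python sketch is illustrative only.

```python
import numpy as np

def tucker_congruence(a, b):
    """Tucker's coefficient of factor congruence between two loading vectors:
    sum(a*b) / sqrt(sum(a^2) * sum(b^2)). Equals 1 when the vectors are
    proportional, i.e., when loadings agree up to a scaling constant."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.sum(a * b) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2)))
```

Because the coefficient is scale-free, two models can show congruence near 1 (same loading pattern) even when the absolute loading magnitudes differ, which is exactly the situation described in the measurement invariance results below.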

Lack of weak measurement invariance: loadings of tests on g are not stable across waves

Having established the validity and reliability of our measurement models, we began to examine potential dedifferentiation effects. We first conducted a formal test of measurement invariance to test whether the 13 cognitive tests have similar loadings on the g factor across the waves (testing weak measurement invariance). We compared the standard model to one where loadings were fixed for each test across waves. There was no evidence of weak measurement invariance. The results were significant for both Model A (all waves), p < 2.2e−16, and Model B (all waves), p = .0007 (see Table 1 for details), providing evidence that there are differences in how the cognitive tests load onto g between waves. However, across the waves, the factor congruence = 1 for Model A (all waves) and > .97 for Model B (all waves), suggesting that the rank-order of test loadings on g is well-matched between waves.

Table 1.

Results from the chi-squared difference test, testing weak measurement invariance (standard model vs model with fixed loadings for each test across waves).

Model type           Model specification   df     AIC     BIC     χ²      χ² diff.  RMSEA  df diff.  p
Model A (all waves)  Standard              2,070  −7,754  −7,189  21,224
                     Fixed loadings        2,218  −7,331  −6,960  21,742  518       .153   48        <2.2e−16
Model B (all waves)  Standard              2,010  −8,810  −8,002  20,048
                     Fixed loadings        2,058  −8,820  −8,207  20,133  85        .043   48        .0007

Note. These tests were significant, indicating that weak measurement invariance does not hold across waves. df = degrees of freedom; AIC = Akaike Information Criterion; BIC = Bayesian Information Criterion; diff = difference; RMSEA = Root Mean Square Error of Approximation.
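The p-values in Table 1 come from chi-squared difference (likelihood-ratio) tests of the fixed-loadings models against the standard models. As a rough plausibility check on the reported values (χ² diff. = 518 and 85, both on 48 df), the upper-tail probability can be approximated in pure Python with the Wilson-Hilferty normal approximation; the published p-values would come from the exact chi-squared distribution (e.g., via lavaan's lavTestLRT() in R), so small discrepancies in the third decimal are expected.

```python
import math

def chi2_sf_approx(x, k):
    """Approximate chi-square survival function (upper-tail p-value) with k df,
    using the Wilson-Hilferty result that (X/k)**(1/3) is roughly normal
    with mean 1 - 2/(9k) and variance 2/(9k)."""
    mu = 1.0 - 2.0 / (9.0 * k)
    sigma = math.sqrt(2.0 / (9.0 * k))
    z = ((x / k) ** (1.0 / 3.0) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))
```

With these inputs the Model A difference is vanishingly small (far below 2.2e−16) and the Model B difference lands near the reported .0007, consistent with Table 1.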

The % of variance accounted for by g increases with age

Consistent with the dedifferentiation hypothesis of cognitive aging, the variance accounted for by g and the g loadings increased by wave in all models (see Supplementary Figures 6 and 7 [see online supplementary material]). In Model A (all waves), the variance accounted for was 24.64% at w1, 27.11% at w2, 29.30% at w3, 31.01% at w4, and 35.92% at w5 (correlation with mean age per wave: r = .971, p = .006, see Figure 3A). The results for the other models are highly similar and are shown in Supplementary Figure 8 (see online supplementary material). This provides direct evidence for dedifferentiation: the amount of variance that the 13 cognitive tests have in common increased over time.
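The correlation between mean age per wave and the % variance explained can be reproduced directly from the values quoted above (Python here for illustration; the original analysis was run in R).

```python
import numpy as np

# Values reported in the text (Model A, all waves)
mean_age = np.array([69.46, 72.44, 76.20, 79.28, 82.00])  # mean age, w1 to w5
pct_var = np.array([24.64, 27.11, 29.30, 31.01, 35.92])   # % variance explained by g

# Pearson correlation across the five waves; matches the reported r = .971
r = float(np.corrcoef(mean_age, pct_var)[0, 1])
```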

Figure 3.

Results from the single-order latent g model: line graphs show that the % variance explained by g increases across waves, g estimates decline, and loadings on g strengthen. A heatmap illustrates the increasing loadings over time.

Results from the single-order model of cognitive dedifferentiation in LBC1936. (A) The % variance accounted for by latent g increases by wave. (B) Strong negative correlation between % of variance explained and g estimates across waves—as % variance increases across waves, g estimates decline. (C) Loadings on g tend to increase by wave. The black line represents the mean loading for each wave, plotted with the “lm” function. Equivalent figures, for the four different model types presented together, are available in Supplementary Figures 6 and 7 (see online supplementary material). (D) Heatmap of % of max loadings per cognitive test per wave, based on the same data as part (C), shows again that, across tests, loadings tend to increase between w1 and w5. RT = Reaction Time; NART = National Adult Reading Test; WTAR = Wechsler Test of Adult Reading.

Accordingly, the mean loadings increased over waves (Model A, all waves, w1 to w5 = .488, .511, .533, .550, and .593, see Figure 3C and D). The comparable results for all models are in Supplementary Figure 7 and Supplementary Tables 5–8 (see online supplementary material).

Correlations between % variance explained and mean g estimates

The % of the variance explained is strongly negatively associated with the group-level mean of individuals’ g score estimates, r = −.991, p = .0010 (see Figure 3B). The group-level means (SDs) for g score estimates were 1.06 (.11), 1.04 (.11), .99 (.11), .96 (.12), and .88 (.13) for w1 to w5, respectively. Therefore, the rate of group-level dedifferentiation appears to closely track the group-level rate of cognitive decline. Note that the dedifferentiation effect does not directly numerically depend on a decrease in g within the model, as it also occurred in Model B, where the mean of g at each wave is ≈ 0 (see above and Supplementary Figures 6 and 7 [see online supplementary material]).
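This correlation can likewise be approximately recovered from the rounded values quoted above; because the g means are reported to only two decimals, the coefficient comes out near −.99 rather than exactly the −.991 obtained from the unrounded estimates.

```python
import numpy as np

pct_var = np.array([24.64, 27.11, 29.30, 31.01, 35.92])  # % variance, w1 to w5
mean_g = np.array([1.06, 1.04, 0.99, 0.96, 0.88])        # mean g estimates, w1 to w5

# strong negative association: dedifferentiation tracks cognitive decline
r = float(np.corrcoef(pct_var, mean_g)[0, 1])
```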

Sensitivity analyses: excluding participants with subsequent dementia

We ran an additional analysis to test whether the dedifferentiation effect held when excluding people who received a subsequent dementia diagnosis from the analysis (no participants had confirmed dementia at w1). We ran Model A (all waves) with the N = 360 participants without a current dementia diagnosis (see results in Supplementary Figure 9 [see online supplementary material]).

As was the case for the full sample, the variance explained increased across the five waves (24.47%, 26.35%, 27.58%, 27.84%, and 31.58%; correlation with mean age per wave: r = .939, p = .018), although the values were somewhat lower than for the full sample (w1 to w5: 24.64%, 27.11%, 29.30%, 31.01%, and 35.92%). The correlation between loadings and the mean age per wave was accordingly lower than for the full sample: r = .221, p = .078 (compared with r = .358, p = .003).

To test the statistical significance of these reduced effects, we conducted a linear regression model, which showed that there was no significant interaction between sample type (full sample = 0, no dementia sample = 1) and wave (mean age per wave) when predicting loadings (standardized β = −.141, p = .406), and also no significant main effect of sample on loadings (standardized β = .002, p = .988). These results suggest that the sample composition (including or excluding people with dementia) did not significantly affect the results. Future research with a larger sample of participants who develop dementia is required to determine whether the (nonsignificant) reduction in the effect signals an acceleration of dedifferentiation in people who go on to develop dementia.

Dispersion and its relation to cognitive dedifferentiation

We also include an analysis of intraindividual dispersion to explore whether dedifferentiation (a group-level statistical phenomenon) is accompanied by changes in dispersion (a measure of the uniformity of cognitive test scores at the individual level). We first assessed whether levels of intraindividual dispersion changed across waves (that is, whether individuals became more or less variable in their performance across cognitive tests over time). To measure group-level dispersion at each wave, we first scaled each variable and then calculated the group-level mean of the individual-level SDs across the 13 cognitive tests for each wave. Group-level dispersion decreased over time, showing that individuals’ cognitive profiles became more homogeneous with age (Ms (SDs) from w1 to w5: .819 (.204), .806 (.187), .798 (.184), .788 (.192), .750 (.186)). The correlation between group-level dispersion and the proportion of variance explained by g from Model A (the dedifferentiation effect) was negative and very strong, r = −.989, p = .001. Analyses restricted to participants without missing data (N = 288) gave identical results to 3 decimal places (r = −.989, p = .001). Additionally, the correlation between group-level mean dispersion and group-level mean g scores for each wave was strong and positive, r = .955, p = .011, showing that cognitive profiles became flatter as g scores declined. These results indicate that here, the observed group-level dedifferentiation phenomenon strongly reflects the tendency for individuals’ test score performance across cognitive tests to become more similar with advancing age in later life.
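The dispersion measure described above can be sketched as follows: scale each test across participants, take each participant's SD across the 13 tests, then average over participants. The function name and any input data are our own; this is an illustration of the described procedure, not the authors' code.

```python
import numpy as np

def group_dispersion(scores):
    """Group-level dispersion for one wave.

    scores: (n_people, n_tests) array of raw test scores (hypothetical data).
    Each test (column) is z-scaled, each person's SD across tests (row-wise)
    is their intraindividual dispersion, and the group-level value is the
    mean of those SDs.
    """
    z = (scores - scores.mean(axis=0)) / scores.std(axis=0, ddof=1)
    person_sd = z.std(axis=1, ddof=1)  # intraindividual dispersion per person
    return float(person_sd.mean())
```

A perfectly "flat" cohort, in which everyone occupies the same relative position on every test, yields a dispersion of zero; rising homogeneity of profiles over waves shows up as the declining means reported above.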

Hierarchical model (including domains)

We conducted a hierarchical analysis to help clarify whether dedifferentiation effects from single-order models reflect age-related increases in the reliability of individual cognitive tests (as might be indicated by increased domain-to-test ­loadings) and/or whether they are better explained by convergence across cognitive domains (as indicated by increased g-to-domain loadings). The latter would strengthen the interpretation of a valid cognitive dedifferentiation effect. The model fit for the hierarchical model is in Supplementary Table 4 (see online supplementary material), and it fits similarly to Model B above (as in Model B, the model fit is poor by design, because covariances between g factors at different waves are set to 0).

Domain-to-test results

The proportion of variance accounted for from domains-to-tests generally increased over time: 45%, 47%, 48%, 49%, and 54% from w1 to w5 (correlation with mean age per wave: r = .904, p = .035; see change in proportion of variance for each domain in Supplementary Figure 10 and Supplementary Table 10 [see online supplementary material]). However, despite the general increase in the proportion of variance explained, we did not find evidence that individual test loadings change significantly with wave (no FDR Q values were < .05, see Supplementary Figure 10, panel D [see online supplementary material]) and, across all 13 cognitive tests, there was not a clear pattern of change in loadings over time (correlation with mean age per wave: r = .147, p = .242, see Supplementary Figure 10, panels B and C [see online supplementary material]). We further established that there was not a significant change in test-based residual variances across waves for individual cognitive tests (no FDR Q values < .05 for each of the 13 tests, or across all 13 cognitive tests, r = −.127, p = .312, see Supplementary Figure 11 [see online supplementary material]). Therefore, we concluded that increased test reliability cannot fully account for the dedifferentiation effect observed in the single-order models.

g-to-domain results

We then tested whether cognitive dedifferentiation is occurring at the g-to-domain level. The results show a general increase in the proportion of variance accounted for over time: 60%, 59%, 63%, 65%, and 66% (correlation with mean age per wave: r = .969, p = .007, see Figure 4A). Further investigations into the cross-wave g-to-domain loadings provide insights into cognitive dedifferentiation. When combined across all four domains, there was not a significant general increase in loadings across waves: r = .199, p = .401. However, at the domain-specific level, there were significant increases in g-to-domain loadings for the Visuospatial domain (r = .908, p = .033, FDR Q = .044) and the Speed domain (r = .993, p = .0006, FDR Q = .002), which is evidence of cognitive dedifferentiation effects in these fluid domains (see Figure 4B–D).

Figure 4.

(A) A line graph shows % variance explained from g-to-domains increases across waves, (B) and (C) show loadings across waves, (D) a bar chart indicates that there are significant changes for the domains apart from Verbal Memory.

g-to-domain results. (A) shows the proportion of variance explained from g-to-domains across waves, (B) shows the g-to-domain loadings across the five waves, (C) is a heat map showing the same data as in (B) for ease of comparison, and (D) is a bar chart showing that loading magnitudes at the g-to-domain level changed significantly (FDR Q < .05) across waves for Visuospatial, Processing Speed, and Crystallised Ability domains, and did not change for Verbal Memory. Proc. Speed = Processing Speed; FDR Q = False Discovery Rate Q.

In contrast, Crystallised Ability showed decreasing loadings on g over time (r = −.948, p = .014, FDR Q = .028). This domain remained equally well defined across waves, as indicated by its relatively stable domain-to-test loadings (p = .389; see Supplementary Figure 10, panel A), suggesting that its decreasing g-loading across waves is not attributable to a decline in construct or test quality. Rather, it appears that Crystallised Ability retains domain specificity and that the composition of g shifts away from crystallised skills and toward fluid skills in later life. A comparable decrease in loadings over time for Crystallised Ability tests can be seen in the single-order equivalent, Model B, in which NART and WTAR loadings both decrease over time (correlation with mean age per wave: r = −.924, p = .025 and r = −.939, p = .018, respectively). The verbal fluency test, which loads more weakly onto Crystallised Ability than the other two indicators, does not show a significant change over time in Model B (r = .311, p = .610). Therefore, domain-level shifts in g’s composition, away from Crystallised Ability, can also be detected in the single-order models.

For Verbal Memory, there was not a significant change in its g-to-domain loading (r = .216, p = .727, FDR Q = .727; see Figure 4B–D). This could be due to its already strong loading onto g at baseline (loading = .883, compared with .644 for Speed, .760 for Crystallised Ability, and .781 for Visuospatial Skills, all at w1). The strong association between g and Verbal Memory could reflect the fact that the tests in this domain are linked to both crystallised and fluid cognitive processes (see Figure 5 and Supplementary Table 12 for interdomain correlations [see online supplementary material]). Perhaps because of this somewhat mixed composition, Verbal Memory already shares a high degree of variance with g, leaving less scope for an observable dedifferentiation effect at the level of g-to-domain factor loadings.

Figure 5.


Interdomain correlations for (A) Visuospatial Ability, (B) Crystallised Ability, (C) Verbal Memory, and (D) Processing Speed. The interdomain correlation results relating to this figure are in Supplementary Tables 11 and 12 (see online supplementary material). Proc. Speed = Processing Speed.

Interdomain correlation results

Dedifferentiation effects can also be examined at the interdomain level (which could be conceptualized as horizontal effects, in contrast to the hierarchical g-to-domain effects). We calculated correlations between extracted domain scores per wave and tested how these interdomain correlations change over time (see Supplementary Table 11 [see online supplementary material]). The results indicate that Verbal Memory becomes increasingly correlated with the fluid domains across waves (Processing Speed: r = .951, p = .013, FDR Q = .026; Visuospatial Ability: r = .778, p = .122, FDR Q = .147; together: r = .905, p = .035; see Figure 5C and Supplementary Table 11 [see online supplementary material]), showing that dedifferentiation is occurring between fluid domains. In contrast, Verbal Memory shows a strong negative correlation with Crystallised Ability across waves (r = −.974, p = .005, FDR Q = .015), reflecting a shift in its alignment away from its crystallised elements and toward its fluid components as g shifts toward fluid domains (see Figure 5C). These interdomain correlations show that, despite Verbal Memory’s loadings onto g remaining fairly stable, its broader integration with the fluid domains increases in later life while its correlation with Crystallised Ability decreases, consistent with the cognitive dedifferentiation effects observed at the g-to-domain level.

Overall, the results of the hierarchical analysis suggest that dedifferentiation effects affect the composition of g, as fluid cognitive domains converge and Crystallised Ability becomes less central to the structure of g in later life.

Discussion

This longitudinal study provides evidence of aging-related, group-level cognitive dedifferentiation from ages 70 to 82 years. Single-order models showed increasing proportions of variance explained, and increasing loadings, across a multidomain battery of 13 cognitive tests measured on five occasions. Evidence for dedifferentiation was further supported by the failure of weak measurement invariance (equal loadings) in the measurement of g across waves. We also found that the group-level rate of dedifferentiation (measured by the % of variance explained and mean loadings per wave) was strongly negatively correlated with the group-level mean of individuals’ g estimates. In other words, the group-level rate of dedifferentiation closely tracked the group-level rate of cognitive decline. This finding is in line with previous work showing increased dedifferentiation at lower (cross-sectional) mean levels of cognitive performance (Deary et al., 1996; Deary & Pagliari, 1991; Detterman & Daniel, 1989). However, it is difficult to disentangle here whether dedifferentiation effects are better explained by lower cognitive performance or by age itself (Deary et al., 2004; Tucker-Drob et al., 2019).

The dedifferentiation effect was slightly, although not statistically significantly, reduced when analyses were conducted on the subset of the sample who had not gone on to receive a dementia diagnosis. This points to the possibility that, while dedifferentiation is a hallmark of normative cognitive aging (as also concluded by Tucker-Drob et al., 2019), its rate may accelerate with pathology such as dementia, as found in some other studies (e.g., Wallert et al., 2021). Again, whether such effects are wholly attributable to cognitive decline or involve additional mechanisms requires careful testing, and the answer will inform whether cognitive dedifferentiation can serve as an early marker of pathological cognitive decline, beyond traditional assessments of cognitive performance (i.e., individual test scores).

We also observed a decrease in intraindividual dispersion across the 13 cognitive tests over time, indicating that individuals’ cognitive profiles became more homogeneous in later life. This is consistent with some previous reports (Buczylowska & Petermann, 2018; Lindenberger & Baltes, 1997; Mella et al., 2016). The decrease in dispersion occurred alongside the increased proportion of variance explained by g and the decrease in cognitive scores. In other words, the cognitive dedifferentiation and cognitive decline effects found here are mirrored in intraindividual cognitive scores becoming increasingly interdependent with age. Because people differ in the amount of dispersion they exhibit, dispersion offers an attractive tool for better understanding the phenomenon of cognitive dedifferentiation (e.g., its determinants, neural correlates, and functional outcomes).
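Intraindividual dispersion is commonly operationalized as the spread of one person's standardized scores across tests at a single wave. A minimal sketch with hypothetical z-scores (the paper's exact operationalization may differ; see its supplementary material):

```python
from statistics import stdev

def dispersion(z_scores):
    """Within-person dispersion: the standard deviation of one person's
    standardized (z-scored) test scores at a single wave. Lower values
    indicate a flatter, more homogeneous cognitive profile."""
    return stdev(z_scores)

# Hypothetical z-scores for one participant across five tests at two waves:
wave1 = [1.2, -0.4, 0.8, -0.9, 0.3]   # uneven profile at the first wave
wave5 = [0.1, -0.2, 0.2, -0.3, 0.0]   # flatter profile at the final wave
d1, d5 = dispersion(wave1), dispersion(wave5)
# d5 < d1: this participant's scores have become more similar across tests,
# the individual-level analogue of group-level dedifferentiation.
```

Averaging this per-person statistic per wave gives a group-level dispersion trajectory that can then be correlated with the dedifferentiation indices described above.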

Crucial to our interpretation of the dedifferentiation effects, the hierarchical model showed that the proportion of variance tended to increase across waves at both the domain-to-test and g-to-domain levels, but the domain-to-test loadings did not generally increase. At the g-to-domain level, loadings increased for Visuospatial Skills and Processing Speed, decreased for Crystallised Ability, and remained stable for Verbal Memory. Interdomain correlations showed that Verbal Memory converged with the other fluid domains (Visuospatial Skills and Processing Speed), suggesting that although Verbal Memory showed no dedifferentiation effect at the g-to-domain level (perhaps due to its high baseline correlation with g), it still became more strongly correlated with the other fluid domains over time. Concurrently, Crystallised Ability may maintain or even increase its independence as the other domains converge into a more fluid-driven general factor. These findings have implications for longitudinal g modeling decisions: it cannot be assumed that the g factor’s composition is the same over time.

The current results contrast with the findings of Tucker-Drob et al. (2019), who found no evidence of static dedifferentiation in their meta-analysis of 22 longitudinal data sets, with mean baseline ages ranging from 35 to 85 years and between 2 and 17 cognitive outcomes per data set. Whereas their analyses aggregated across multiple heterogeneous cohorts, our findings are derived from a single narrow-age cohort, potentially allowing more consistent aging-related patterns to emerge. The results we find might be explained by dynamic dedifferentiation, a term coined by Tucker-Drob et al. (2019). Whilst the current paper did not conduct a direct test of dynamic dedifferentiation (that is, it did not test whether correlations in cognitive change increased with age, because the narrow age range of the LBC1936 and the five available waves of data limit the ability to estimate more than one slope parameter with at least three occasions each), we have previously reported that longitudinal cognitive changes in the LBC1936 show correlated patterns across domains (Corley et al., 2023; Tucker-Drob et al., 2014). It is possible that the dedifferentiation effects observed in our data (“static dedifferentiation”) reflect, in part, the cumulative effects of aging-related declines becoming more interrelated over time (“dynamic dedifferentiation”). The fact that cognitive domains are declining in tandem and increasingly sharing variance points to the potential presence of common underlying causes as an Occam’s razor explanation (see, e.g., Baltes & Lindenberger, 1997).

The LBC1936 cohort is well placed to characterize dedifferentiation between 70 and 82 years old, with a rich 13-test cognitive battery and five waves of data spanning ∼12 years. This data set allowed us to observe within-individual changes over time, a meaningful advantage over cross-sectional studies, which are limited by potential confounding factors such as cohort effects. Cross-sectional data can provide some insights into aging but, unlike longitudinal data, cannot capture the complex and dynamic within-person phenomenon of cognitive aging (Fjell & Walhovd, 2012; Raz & Lindenberger, 2011; Salthouse, 2011). Future longitudinal research covering different stages of the human lifespan will improve understanding of the developmental trajectory of differentiation and dedifferentiation effects, and research directly involving neurobiological properties could identify key mechanisms behind these phenomena.

It should be noted that a prevalent limitation of longitudinal studies is participant drop-out, particularly when it is systematic. There was some evidence of systematic drop-out here: people who had slightly lower cognitive test scores at w1 were more likely to have missing data between w2 and w5 than those with cognitive data at all five waves, and we have previously characterized other ways in which this cohort, like many others, suffers from healthy selection bias (Corley et al., 2023). Because we only included data from people with cognitive data at all five waves, our characterization of dedifferentiation is based on a subsample with slightly higher cognitive test scores than the full sample, which could have unknown effects on, for example, the reported rate of dedifferentiation. Future research is required to establish links between baseline cognitive performance levels and the rate of dedifferentiation in later life.

Due to the inclusion criteria of the LBC1936 cohort, all participants in this study are White Scottish. Future research incorporating more racially and ethnically diverse samples would be valuable in assessing the generalizability of dedifferentiation effects in older age.

Future research aimed at examining and understanding the day-to-day effects of cognitive dedifferentiation, in addition to cognitive decline, could prove beneficial for developing targeted interventions and strategies to support cognitive functioning and daily life in aging populations.

Conclusion

Overall, this study provides longitudinal evidence of cognitive dedifferentiation between ages 70 and 82, supporting the hypothesis that cognitive skills become increasingly interrelated in later life. We show that the group-level rate of this effect is closely linked to the observed rate of group-level cognitive decline. Over time, the composition of the general factor of cognitive functioning (g) shifts: fluid cognitive abilities become increasingly central to g, while crystallised abilities appear to contribute less. These findings challenge the assumption that the structure of g remains stable with age, and therefore, they also have important implications for how g is modeled across the lifespan. Future longitudinal research will be key in clarifying the incremental validity, determinants, mechanisms, and implications of cognitive differentiation and dedifferentiation across the lifespan.

Supplementary Material

gbaf189_Supplementary_Data

Acknowledgments

We thank the participants of the Lothian Birth Cohort 1936 (LBC1936) and the research team for their work in collecting, processing, and providing the data for these analyses. We also thank an anonymous reviewer whose suggestions significantly extended the scope of this study. Previous modeling of g slopes in the LBC1936 cohort has primarily focused on identifying common variance across longitudinal changes in cognitive tests using factor-of-curves models. However, as part of a consortium analysis (led by Dr James Roe and Professor Kristine Walhovd at the University of Oslo), we were required to structure the modeling approach in a manner that allowed for the extraction of a relative g score at each wave. In running and extending these analyses, we identified evidence of cognitive dedifferentiation, a result that is relevant to the ongoing debate in this field. Recognizing its potential implications, we considered it important to validate and disseminate these findings.

Contributor Information

Joanna E Moodie, Lothian Birth Cohorts, Department of Psychology, The University of Edinburgh, Edinburgh, United Kingdom.

Janie Corley, Lothian Birth Cohorts, Department of Psychology, The University of Edinburgh, Edinburgh, United Kingdom.

Ian J Deary, Lothian Birth Cohorts, Department of Psychology, The University of Edinburgh, Edinburgh, United Kingdom.

Simon R Cox, Lothian Birth Cohorts, Department of Psychology, The University of Edinburgh, Edinburgh, United Kingdom.

Supplementary material

Supplementary data are available at The Journals of Gerontology, Series B: Psychological Sciences and Social Sciences online.

Funding

The authors gratefully acknowledge funding from the BBSRC & ESRC (BB/W008793/1), Age UK (Disconnected Mind Project), the Medical Research Council (MR/M01311/1; MR/K026992/1), the US National Institutes of Health (R01AG054628; U01AG083829), the Milton Damerel Trust, and the University of Edinburgh. S.R.C. and J.E.M. are supported by a Sir Henry Dale Fellowship, jointly funded by the Wellcome Trust and the Royal Society (221890/Z/20/Z).

Conflict of interest

None declared.

Data Availability

To access the Lothian Birth Cohort data, see https://lothian-birth-cohorts.ed.ac.uk/data-access-collaboration. The R code for all structural equation models is included in the Supplementary Material. The study was not preregistered.

References

  1. Anstey K. J., Hofer S. M., Luszcz M. A. (2003). Cross-sectional and longitudinal patterns of dedifferentiation in late-life cognitive and sensory function: The effects of age, ability, attrition, and occasion of measurement. Journal of Experimental Psychology. General, 132, 470–487. 10.1037/0096-3445.132.3.470 [DOI] [PubMed] [Google Scholar]
  2. Babcock R. L., Laguna K. D., Roesch S. C. (1997). A comparison of the factor structure of processing speed for younger and older adults: Testing the assumption of measurement equivalence across age groups. Psychology and Aging, 12, 268–276. 10.1037//0882-7974.12.2.268 [DOI] [PubMed] [Google Scholar]
  3. Baltes P. B., Cornelius S. W., Spiro A., Nesselroade J. R., Willis S. L. (1980). Integration versus differentiation in fluid/crystallized intelligence in old age. Developmental Psychology, 16, 625–635. 10.1037/0012-1649.16.6.625 [DOI] [Google Scholar]
  4. Baltes P. B., Lindenberger U. (1997). Emergence of a powerful connection between sensory and cognitive functions across the adult life span: A new window to the study of cognitive aging?  Psychology and Aging, 12, 12–21. 10.1037//0882-7974.12.1.12 [DOI] [PubMed] [Google Scholar]
  5. Batterham P. J., Christensen H., Mackinnon A. J. (2011). Comparison of age and time-to-death in the dedifferentiation of late-life cognitive abilities. Psychology and Aging, 26, 844–851. 10.1037/a0023300 [DOI] [PubMed] [Google Scholar]
  6. Blum D., Holling H. (2017). Spearman’s law of diminishing returns. A meta-analysis. Intelligence, 65, 60–66. 10.1016/j.intell.2017.07.004 [DOI] [Google Scholar]
  7. Breit M., Brunner M., Preuß J., Daseking M., Pauls F., Walter F., Preckel F. (2025). The contribution of general intelligence to cognitive performance across the lifespan: A differentiation analysis of the Wechsler tests. Psychology and Aging, 40, 237–254. 10.1037/pag0000875 [DOI] [PubMed] [Google Scholar]
  8. Buczylowska D., Petermann F. (2018). Intraindividual variability in executive function performance in healthy adults: Cross-sectional analysis of the NAB executive functions module. Frontiers in Psychology, 9, 329. 10.3389/fpsyg.2018.00329 [DOI] [PMC free article] [PubMed] [Google Scholar]
  9. Burt C. (1954). The differentiation of intellectual ability. British Journal of Educational Psychology, 24, 76–90. 10.1111/j.2044-8279.1954.tb02882.x [DOI] [Google Scholar]
  10. Corley J., Conte F., Harris S. E., Taylor A. M., Redmond P., Russ T. C., Deary I. J., Cox S. R. (2023). Predictors of longitudinal cognitive ageing from age 70 to 82 including APOE e4 status, early-life and lifestyle factors: The Lothian Birth Cohort 1936. Molecular Psychiatry, 28, 1256–1271. 10.1038/s41380-022-01900-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Cox S. R., Ritchie S. J., Tucker-Drob E. M., Liewald D. C., Hagenaars S. P., Davies G., Wardlaw J. M., Gale C. R., Bastin M. E., Deary I. J. (2016). Ageing and brain white matter structure in 3,513 UK Biobank participants. Nature Communications, 7, 13629. 10.1038/ncomms13629 [DOI] [Google Scholar]
  12. de Frias C. M., Lövdén M., Lindenberger U., Nilsson L.-G. (2007). Revisiting the dedifferentiation hypothesis with longitudinal multi-cohort data. Intelligence, 35, 381–392. 10.1016/j.intell.2006.07.011 [DOI] [Google Scholar]
  13. de Mooij S. M. M., Henson R. N. A., Waldorp L. J., Kievit R. A. (2018). Age differentiation within gray matter, white matter, and between memory and white matter in an Adult Life Span Cohort. The Journal of Neuroscience, 38, 5826–5836. 10.1523/JNEUROSCI.1627-17.2018 [DOI] [PMC free article] [PubMed] [Google Scholar]
  14. Deary I. J. (2012). Intelligence. Annual Review of Psychology, 63, 453–482. 10.1146/annurev-psych-120710-100353 [DOI] [Google Scholar]
  15. Deary I. J., Corley J., Gow A. J., Harris S. E., Houlihan L. M., Marioni R. E., Penke L., Rafnsson S. B., Starr J. M. (2009). Age-associated cognitive decline. British Medical Bulletin, 92, 135–152. 10.1093/bmb/ldp033 [DOI] [PubMed] [Google Scholar]
  16. Deary I. J., Der G., Ford G. (2001). Reaction times and intelligence differences: A population-based cohort study. Intelligence, 29, 389–399. 10.1016/S0160-2896(01)00062-9 [DOI] [Google Scholar]
  17. Deary I. J., Egan V., Gibson G. J., Austin E. J., Brand C. R., Kellaghan T. (1996). Intelligence and the differentiation hypothesis. Intelligence, 23, 105–132. 10.1016/S0160-2896(96)90008-2 [DOI] [Google Scholar]
  18. Deary I. J., Gow A. J., Taylor M. D., Corley J., Brett C., Wilson V., Campbell H., Whalley L. J., Visscher P. M., Porteous D. J., Starr J. M. (2007). The Lothian Birth Cohort 1936: A study to examine influences on cognitive ageing from age 11 to age 70 and beyond. BMC Geriatrics, 7, 28. 10.1186/1471-2318-7-28 [DOI] [PMC free article] [PubMed] [Google Scholar]
  19. Deary I. J., Pagliari C. (1991). The strength of g at different levels of ability: Have Detterman and Daniel rediscovered Spearman’s “law of diminishing returns” ? Intelligence, 15, 247–250. 10.1016/0160-2896(91)90033-A [DOI] [Google Scholar]
  20. Deary I. J., Simonotto E., Meyer M., Marshall A., Marshall I., Goddard N., Wardlaw J. M. (2004). The functional anatomy of inspection time: An event-related fMRI study. NeuroImage, 22, 1466–1479. 10.1016/j.neuroimage.2004.03.047. [DOI] [PubMed] [Google Scholar]
  21. Deary I. J., Whiteman M. C., Starr J. M., Whalley L. J., Fox H. C. (2004). The impact of childhood intelligence on later life: Following up the Scottish Mental Surveys of 1932 and 1947. Journal of Personality and Social Psychology, 86, 130–147. 10.1037/0022-3514.86.1.130 [DOI] [PubMed] [Google Scholar]
  22. Detterman D. K., Daniel M. H. (1989). Correlations of mental tests with each other and with cognitive variables are highest for low IQ groups. Intelligence, 13, 349–359. 10.1016/S0160-2896(89)80007-8 [DOI] [Google Scholar]
  23. Fjell A. M., Walhovd K. B. (2012). Neuroimaging results impose new views on Alzheimer’s disease-the role of amyloid revised. Molecular Neurobiology, 45, 153–172. 10.1007/s12035-011-8228-7 [DOI] [PubMed] [Google Scholar]
  24. Ghisletta P., de Ribaupierre A. (2005). A dynamic investigation of cognitive dedifferentiation with control for retest: Evidence from the Swiss interdisciplinary longitudinal study on the oldest old. Psychology and Aging, 20, 671–682. 10.1037/0882-7974.20.4.671 [DOI] [PubMed] [Google Scholar]
  25. Ghisletta P., Lindenberger U. (2003). Age-based structural dynamics between perceptual speed and knowledge in the Berlin Aging Study: Direct evidence for ability dedifferentiation in old age. Psychology and Aging, 18, 696–713. 10.1037/0882-7974.18.4.696 [DOI] [PubMed] [Google Scholar]
  26. Ghisletta P., Lindenberger U. (2004). Static and dynamic longitudinal structural analyses of cognitive changes in old age. Gerontology, 50, 12–16. 10.1159/000074383 [DOI] [PubMed] [Google Scholar]
  27. Hartung J., Doebler P., Schroeders U., Wilhelm O. (2018). Dedifferentiation and differentiation of intelligence in adults across age and years of education. Intelligence, 69, 37–49. 10.1016/j.intell.2018.04.003 [DOI] [Google Scholar]
  28. Hu L., Bentler P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1–55. 10.1080/10705519909540118 [DOI] [Google Scholar]
  29. Hülür G., Ram N., Willis S. L., Schaie K. W., Gerstorf D. (2015). Cognitive dedifferentiation with increasing age and proximity of death: Within-person evidence from the Seattle Longitudinal Study. Psychology and Aging, 30, 311–323. 10.1037/a0039260 [DOI] [PubMed] [Google Scholar]
  30. Johnson W., Bouchard T. J. Jr., Krueger R. F., McGue M., Gottesman I. I. (2004). Just one g: Consistent results from three test batteries. Intelligence, 32, 95–107. 10.1016/S0160-2896(03)00062-X [DOI] [Google Scholar]
  31. Johnson W., Nijenhuis J. T., Bouchard T. J. (2008). Still just 1 g: Consistent results from five test batteries. Intelligence, 36, 81–95. 10.1016/j.intell.2007.06.001. [DOI] [Google Scholar]
  32. Jonas K., Lian W., Callahan J., Ruggero C. J., Clouston S., Reichenberg A., Carlson G. A., Bromet E. J., Kotov R. (2022). The course of general cognitive function in individuals with psychotic disorders. JAMA Psychiatry, 79, 659–666. 10.1001/jamapsychiatry.2022.1142 [DOI] [PMC free article] [PubMed] [Google Scholar]
  33. Lezak M. D., Howieson D. B., Loring D. W., Hannay H. J., Fischer J. S. (2004). Neuropsychological assessment (4th ed.). Oxford University Press. [Google Scholar]
  34. Li S. C., Lindenberger U., Hommel B., Aschersleben G., Prinz W., Baltes P. B. (2004). Transformations in the couplings among intellectual abilities and constituent cognitive processes across the life span. Psychological Science, 15, 155–163. 10.1111/j.0956-7976.2004.01503003.x [DOI] [PubMed] [Google Scholar]
  35. Li S.-C., Lindenberger U., Sikström S. (2001). Aging cognition: from neuromodulation to representation. Trends in Cognitive Sciences, 5, 479–486. 10.1016/S1364-6613(00)01769-1 [DOI] [PubMed] [Google Scholar]
  36. Lindenberger U., Baltes P. B. (1997). Intellectual functioning in old and very old age: Cross-sectional results from the Berlin Aging Study. Psychology and Aging, 12, 410–432. 10.1037/0882-7974.12.3.410 [DOI] [PubMed] [Google Scholar]
  37. Mella N., Fagot D., de Ribaupierre A. (2016). Dispersion in cognitive functioning: Age differences over the lifespan. Journal of Clinical and Experimental Neuropsychology, 38, 111–126. 10.1080/13803395.2015.1089979 [DOI] [PubMed] [Google Scholar]
  38. Mullin D. S., Stirland L. E., Buchanan E., Convery C. A., Cox S. R., Deary I. J., Giuntoli C., Greer H., Page D., Robertson E., Shenkin S. D., Szalek A., Taylor A., Weatherdon G., Wilkinson T., Russ T. C. (2023). Identifying dementia using medical data linkage in a longitudinal cohort study: Lothian Birth Cohort 1936. BMC Psychiatry, 23, 303. 10.1186/s12888-023-04797-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
  39. Nelson H. E., Wilson J. (1991). National Adult Reading Test (NART). NFER-Nelson. [Google Scholar]
  40. Panizzon M. S., Vuoksimaa E., Spoon K. M., Jacobson K. C., Lyons M. J., Franz C. E., Xian H., Vasilopoulos T., Kremen W. S. (2014). Genetic and environmental influences of general cognitive ability: Is g a valid latent construct?  Intelligence, 43, 65–76. 10.1016/j.intell.2014.01.008 [DOI] [PMC free article] [PubMed] [Google Scholar]
  41. Park D. C., Reuter-Lorenz P. (2009). The adaptive brain: aging and neurocognitive scaffolding. Annual Review of Psychology, 60, 173–196. 10.1146/annurev.psych.59.103006.093656 [DOI] [Google Scholar]
  42. Rapp M. A., Schnaider-Beeri M., Sano M., Silverman J. M., Haroutunian V. (2005). Cross-domain variability of cognitive performance in very old nursing home residents and community dwellers: Relationship to functional status. Gerontology, 51, 206–212. 10.1159/000083995 [DOI] [PubMed] [Google Scholar]
  43. R Core Team (2022). R: A language and environment for statistical computing. R Foundation for Statistical Computing. https://www.R-project.org/
  44. Raz N. (2000). Aging of the brain and its impact on cognitive performance: Integration of structural and functional findings. In F. I. M. Craik & T. A. Salthouse (Eds.), Handbook of aging and cognition (pp. 1–90). Erlbaum. [Google Scholar]
  45. Raz N., Lindenberger U. (2011). Only time will tell: Cross-sectional studies offer no solution to the age-brain-cognition triangle—comment on Salthouse. Psychological Bulletin, 137, 790–795. 10.1037/a0024503. [DOI] [PMC free article] [PubMed] [Google Scholar]
  46. Ritchie S. J., Tucker-Drob E. M., Cox S. R., Corley J., Dykiert D., Redmond P., Pattie A., Taylor A. M., Sibbett R., Starr J. M., Deary I. J. (2016). Predictors of ageing related decline across multiple cognitive functions. Intelligence, 59, 115–126. 10.1016/j.intell.2016.08.007 [DOI] [PMC free article] [PubMed] [Google Scholar]
  47. Rosseel Y. (2012). lavaan: An R Package for Structural Equation Modeling. Journal of Statistical Software, 48, 1–36. 10.18637/jss.v048.i02. [DOI] [Google Scholar]
  48. Salthouse T. A. (2005). Relations between cognitive abilities and measures of executive function. Neuropsychology, 19, 532–545. 10.1037/0894-4105.19.4.532 [DOI] [PubMed] [Google Scholar]
  49. Salthouse T. A. (2011). Neuroanatomical substrates of age-related cognitive decline. Psychological Bulletin, 137, 753–784. 10.1037/a0023262. [DOI] [PMC free article] [PubMed] [Google Scholar]
  50. Sims R. C., Allaire J. C., Gamaldo A. A., Edwards C. L., Whitfield K. E. (2009). An examination of dedifferentiation in cognition among African-American older adults. Journal of Cross-Cultural Gerontology, 24, 193–208. 10.1007/s10823-008-9080-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
  51. Spearman C. (1927). The abilities of man. MacMillan. [Google Scholar]
  52. Taylor A. M., Pattie A., Deary I. J. (2018). Cohort profile update: The Lothian Birth Cohorts of 1921 and 1936. International Journal of Epidemiology, 47, 1042–1042r. 10.1093/ije/dyy022 [DOI] [PMC free article] [PubMed] [Google Scholar]
  53. Tucker-Drob E. M. (2009). Differentiation of cognitive abilities across the life span. Developmental Psychology, 45, 1097–1118. 10.1037/a0015864 [DOI] [PMC free article] [PubMed] [Google Scholar]
  54. Tucker-Drob E. M., Brandmaier A. M., Lindenberger U. (2019). Coupled cognitive changes in adulthood: A meta-analysis. Psychological Bulletin, 145, 273–301. 10.1037/bul0000179 [DOI] [PMC free article] [PubMed] [Google Scholar]
  55. Tucker-Drob E. M., Briley D. A., Starr J. M., Deary I. J. (2014). Structure and correlates of cognitive aging in a narrow age cohort. Psychology and Aging, 29, 236–249. 10.1037/a0036187 [DOI] [PMC free article] [PubMed] [Google Scholar]
  56. Tucker-Drob E. M., Salthouse T. A. (2008). Adult age trends in the relations among cognitive abilities. Psychology and Aging, 23, 453–460. 10.1037/0882-7974.23.2.453 [DOI] [PMC free article] [PubMed] [Google Scholar]
  57. Wallert J., Rennie A., Ferreira D., Muehlboeck J. S., Wahlund L. O., Westman E., Ekman U.; ADNI consortium, and MemClin Steering Committee. (2021). Cognitive dedifferentiation as a function of cognitive impairment in the ADNI and MemClin cohorts. Aging, 13, 13430–13442. 10.18632/aging.203108 [DOI] [PMC free article] [PubMed] [Google Scholar]
  58. Wechsler D. (1997a). Wechsler Adult Intelligence Scale (3rd ed.). The Psychological Corporation. [Google Scholar]
  59. Wechsler D. (1997b). Wechsler Memory Scale (3rd ed.). The Psychological Corporation. [Google Scholar]
  60. Wechsler D. (2001). Wechsler Test of Adult Reading. The Psychological Corporation. [Google Scholar]
  61. Whitley E., Deary I. J., Ritchie S. J., Batty G. D., Kumari M., Benzeval M. (2016). Variations in cognitive abilities across the life course: Cross-sectional evidence from Understanding Society: The UK Household Longitudinal Study. Intelligence, 59, 39–50. 10.1016/j.intell.2016.07.001 [DOI] [PMC free article] [PubMed] [Google Scholar]


Data Availability Statement

To access the Lothian Birth Cohort data, see https://lothian-birth-cohorts.ed.ac.uk/data-access-collaboration. The R code for all structural equation models is included in the Supplementary Material. R version 4.2.0 (R Core Team, 2022) and lavaan 0.6.17 (Rosseel, 2012) were used for analyses.
