Abstract
Objectives
This paper examined the magnitude of differences in performance across domains of cognitive functioning between participants who attrited from studies and those who did not, using data from longitudinal ageing studies where multiple cognitive tests were administered.
Design
Individual participant data meta-analysis.
Participants
Data are from 10 epidemiological longitudinal studies on ageing (total n=209 518) from several Western countries (UK, USA, Mexico, etc). Each study had multiple waves of data (range of 2–17 waves), with multiple cognitive tests administered at each wave (range of 4–17 tests). Only waves at which cognitive tests were administered and for which participant dropout at the immediately following wave could be determined were used in the meta-analysis; analyses were restricted to adults aged 50 years or older.
Measures
For each pair of consecutive study waves, we compared the difference in cognitive scores (Cohen’s d) between participants who dropped out at the next study wave and those who remained. Note that our operationalisation of dropout was inclusive of all causes (eg, mortality). The proportion of participant dropout at each wave was also computed.
Results
The average proportion of dropouts between consecutive study waves was 0.26 (95% CI 0.18 to 0.34). People who attrited had significantly lower levels of cognitive functioning in all domains (at the wave 2–3 years before attrition) compared with those who did not attrit, with small-to-medium effect sizes (overall d=0.37, 95% CI 0.30 to 0.43).
Conclusions
Older adults who attrited from longitudinal ageing studies had lower cognitive functioning (assessed at the timepoint before attrition) across all domains as compared with individuals who remained. Cognitive functioning differences may contribute to selection bias in longitudinal ageing studies, impeding accurate conclusions in developmental research. In addition, examining the functional capabilities of attriters may be valuable for determining whether attriters experience functional limitations requiring healthcare attention.
Keywords: Dementia, Epidemiology, Aging
Strengths and limitations of this study
Individual participant data meta-analysis was conducted over 10 epidemiological longitudinal studies on ageing in several different countries (total n=209 518) to examine the magnitude of cognitive differences between participants who remained in these longitudinal studies and those who dropped out.
We reported the magnitude of cognitive functioning differences between attriters and those who remained across eight domains of cognitive functioning, providing more comprehensive coverage of cognitive domains than prior studies.
We are unable to distinguish the deceased from other attriters, an issue particularly relevant in studies with older respondents. Consequently, the cognitive functioning differences relative to those who remained are a weighted average of the differences within these two groups, and the computed average proportion of dropouts between consecutive study waves is a similarly weighted average.
Introduction
Epidemiological studies on ageing are used to investigate a variety of questions of relevance to interventions including factors increasing a person’s likelihood of developing dementia,1 the consequences of conditions such as dementia (eg, fall risk)2 and long-term outcomes after medical procedures.3 Given the importance of the research questions addressed with such studies, prior research has often investigated factors that could bias study findings and hence limit generalisability of results. Correlates of study attrition have frequently been examined4–6 due to concerns that attrition could lead to selection bias resulting in samples with greater health and functioning compared with the population of interest.
Several studies have found that, when participants who remained in longitudinal ageing studies were compared with those who dropped out (using data from the initial wave, at which both groups were assessed), individuals who dropped out had lower cognitive functioning.4 7 8 Inconsistent findings exist, however, regarding the specific areas of cognitive ability that differ between those who stayed in the study and those who attrited.4 7 Furthermore, no studies have yet attempted to integrate findings on this question from multiple epidemiological studies of ageing covering different countries and across multiple cognitive domains.4 5 7
The cognitive health of participants who drop out of longitudinal ageing studies versus those who remain may have implications not only for sample selection bias but also for future work needed to understand the functional capabilities of attriters. Completion of a survey is a task that can be as complex as engaging in instrumental activities of daily living (IADLs) such as medication management, shopping, and everyday technology use. Responding to a survey item involves cognitive processes such as comprehension, information retrieval, and formation of judgements9 that are also required for performing IADLs. Among older adults in particular, one possible cause of ceasing to participate in a study could be mild cognitive impairment, which has been associated with decreased capacity to perform IADLs.10 Of course, study attrition may also be attributable to a variety of other causes, including old age (and associated mortality), relocation, and loss of interest in participation.5 11 However, if respondents who attrit from studies are consistently found to have lower cognitive functioning, this may indicate that older individuals who attrit from longitudinal ageing studies may have greater healthcare needs.
We build on prior research on differences in cognitive health between participants who remained in longitudinal ageing studies and those who dropped out4 7 8 with an individual participant data meta-analysis to evaluate how closely cognitive problems and dropout are connected and to examine whether specific cognitive problems/domains are relevant for attrition. Use of multiple studies from different countries allowed for the examination of the degree to which differences in cognition (at the previous wave before dropout) between participants who remained in ageing studies and those who attrited were country- and sample-specific. Consideration of a wide array of cognitive domains allowed for a more comprehensive understanding of areas of cognitive ability that differ between those who remain in longitudinal studies and those who attrit.
Methods
Study design
To examine the potential adverse cognitive outcomes associated with attrition, individual participant data (IPD) meta-analysis12 was used to synthesise findings from multiple longitudinal epidemiological studies on ageing, each of which had several waves (range of 2–17 waves) with multiple cognitive tests administered at each wave (range of 4–17 tests, 3–8 cognitive domains covered). Ten studies were included (characteristics shown in table 1): the Australian Longitudinal Study of Ageing (ALSA),13 Canberra Longitudinal Study,14 Einstein Aging Study,15 English Longitudinal Study of Ageing,16 Health and Retirement Study,17 Long Beach Longitudinal Study,18 Mexican Health and Aging Study,19 Swedish Adoption/Twin Study of Aging (SATSA),20 Survey of Health, Ageing and Retirement in Europe,21 and the Irish Longitudinal Study on Ageing.22 For SATSA, when both twins participated in a specific wave, we randomly eliminated one of the twins from the analysis sample (to account for non-independencies in data from twin dyads). All participants included in the analyses were 50 years of age or older.
Table 1.
Study characteristics, proportion dropped out and Cohen’s d difference in cognition between people who remained in the study and those who dropped out, by study
| Study | Country | Age range | # waves used | Wave sample size range | Years between waves | Average age in waves used (95% CI) | Average proportion of all-inclusive dropout in each wave* (95% CI) | Cohen’s d† (95% CI) |
| Australian Longitudinal Study of Ageing | Australia | 56+ | 2 | 87–436 | 2 | 87.9 (85.0 to 90.8) | 0.41 (0.30 to 0.53) | 0.40 (0.16 to 0.63) |
| Canberra Longitudinal Study | Australia | 70+ | 3 | 209–631 | 4 | 82.8 (80.3 to 85.3) | 0.41 (0.32 to 0.51) | 0.51 (0.29 to 0.72) |
| Einstein Aging Study | USA | 70+ | 17 | 223–568 | 1 | 80.8 (79.0 to 82.6) | 0.31 (0.24 to 0.37) | 0.29 (0.11 to 0.47) |
| English Longitudinal Study of Ageing | UK | 50+ | 8 | 7898–10 310 | 2 | 66.7 (64.7 to 68.7) | 0.17 (0.12 to 0.23) | 0.38 (0.19 to 0.56) |
| Health and Retirement Study | USA | 50+ | 13 | 14 702–19 682 | 2 | 68.0 (66.1 to 69.8) | 0.14 (0.09 to 0.19) | 0.46 (0.28 to 0.64) |
| Long Beach Longitudinal Study | USA | 50+ | 4 | 396–1038 | 3 | 73.5 (71.1 to 75.8) | 0.47 (0.38 to 0.56) | 0.33 (0.13 to 0.53) |
| Mexican Health and Aging Study | Mexico | 50+ | 4 | 11 653–14 584 | 3 | 65.0 (62.6 to 67.3) | 0.18 (0.11 to 0.26) | 0.22 (0.02 to 0.42) |
| Swedish Adoption/Twin Study of Aging | Sweden | 50+ | 5 | 247–328 | 3 | 70.2 (68.0 to 72.4) | 0.15 (0.10 to 0.22) | 0.45 (0.25 to 0.65) |
| Survey of Health, Ageing and Retirement in Europe | European countries and Israel | 50+ | 4 | 35 899–74 183 | 2 | 66.9 (64.6 to 69.3) | 0.29 (0.22 to 0.38) | 0.22 (0.01 to 0.42) |
| The Irish Longitudinal Study on Ageing | Ireland | 50+ | 3 | 5715–7207 | 2 | 66.3 (63.8 to 68.9) | 0.14 (0.08 to 0.21) | 0.48 (0.27 to 0.70) |
| Overall average | – | – | 6.3 | – | 2.4 | 72.8 (67.6 to 77.9) | 0.26 (0.18 to 0.34) | 0.37 (0.30 to 0.43) |
*Includes dropout from all causes (death, refusal, etc) to provide a representation of the relative sizes of the dropout and non-dropout groups that were compared to examine differences in cognition.
†Difference in cognitive test performance at the prior wave between people who remained in the study and those who dropped out at the current wave.
This study is part of a larger scientific effort focused on examining the relationship between survey response behaviours and cognitive functioning.23 Only waves with an immediately preceding wave at which cognitive tests were administered were used in the meta-analysis.
Measures
Only standardised objective measures of cognitive performance were included in the analyses. Different cognitive tests were administered across studies, but they could be categorised into the domains of episodic memory, working memory, processing speed, executive functioning, spatial reasoning, verbal abilities, clinical neuropsychological functioning, and global cognitive status (eg, scores on the Mini-Mental State Examination).24 Tests within cognitive domains are shown in online supplemental table A.
For a given wave in a study, Cohen’s d—a measure of the standardised group mean difference—was computed for each cognitive test, comparing participants who remained in the study at the next wave with participants who dropped out at the next wave. A larger (ie, more positive) Cohen’s d indicates a better performance by participants who remained in the study as compared with those who attrited. Cohen’s ds of 0.20, 0.50 and 0.80 were interpreted as small, medium and large differences, respectively.25
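As a point of reference, the standardised group mean difference at a given wave takes the following form under the conventional pooled-standard-deviation definition (a sketch only; subscripts denote the group retained at the next wave and the group that dropped out):

```latex
d = \frac{\bar{x}_{\mathrm{retained}} - \bar{x}_{\mathrm{dropout}}}{s_{\mathrm{pooled}}},
\qquad
s_{\mathrm{pooled}} = \sqrt{\frac{(n_{\mathrm{retained}} - 1)\, s_{\mathrm{retained}}^{2} + (n_{\mathrm{dropout}} - 1)\, s_{\mathrm{dropout}}^{2}}{n_{\mathrm{retained}} + n_{\mathrm{dropout}} - 2}}
```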
For descriptive purposes, dropout rates were also calculated separately for each pair of consecutive waves as the proportion lost (ie, missing) at a given wave out of all participants observed at the previous wave. This includes dropout due to all causes and intermediate dropouts (ie, individuals who returned to the study at future waves). It is worth noting that this inclusive definition likely yields larger dropout rates compared with definitions that exclude particular types of attrition (eg, attrition due to death, intermediate dropouts).
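Written out, the wave-to-wave dropout proportion defined above is simply (a sketch of the definition; n denotes participant counts):

```latex
p_{\mathrm{dropout},\,t} = \frac{n_{t-1} - n_{(t-1)\cap t}}{n_{t-1}}
```

where \(n_{t-1}\) is the number of participants observed at wave t−1 and \(n_{(t-1)\cap t}\) is the number of those participants who were also observed at wave t.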
Statistical analyses
Several descriptive statistics were computed, including the range of sample sizes for waves in a study, the average participant age for each study, and the average proportion of dropouts. Note that these values were specific to the study waves included in analyses and are not representative of all the waves in each study (ie, waves without cognitive tests at the immediately preceding wave were excluded).
For the IPD meta-analysis, effect sizes (ie, dropout rates and Cohen’s d values) were aggregated across waves and studies with random effects meta-analysis models.26 Meta-analytic modelling provided the benefit of yielding mean values for effect sizes that considered non-independencies in the data (eg, observations from the same study) and differential precision of effect size estimates (eg, giving studies and waves with larger samples a greater weight in the overall mean). Because each study contributed multiple correlated effect sizes, multilevel meta-analysis models were estimated that took the ‘nested’ data structure into account (ie, Cohen’s d values for different cognitive tests were nested in waves, which were nested in studies; similarly, dropout rates for different waves were nested in studies). The meta-analysis models were conducted using the metafor package27 in the statistical software R.28
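To make the modelling approach concrete, a minimal sketch of such a multilevel model using metafor is given below; the data frame dat and its columns (study, wave, test, d, d_var) are hypothetical placeholders rather than the study's actual analysis code.

```r
# Minimal sketch: three-level random-effects meta-analysis of the Cohen's d values,
# with effect sizes (tests) nested in waves, which are nested in studies.
library(metafor)

res <- rma.mv(yi = d,                            # Cohen's d for each cognitive test
              V = d_var,                         # corresponding sampling variances
              random = ~ 1 | study/wave/test,    # nested random effects
              data = dat,
              method = "REML")
summary(res)  # pooled d, variance components at each level, Q test for heterogeneity

# Moderator (meta-regression) analyses add a mods argument, eg:
# rma.mv(yi = d, V = d_var, mods = ~ domain, random = ~ 1 | study/wave/test, data = dat)
```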
We also tested for potential heterogeneity in effect size estimates. The Q statistic was used to test for the presence of effect size heterogeneity, and the I² statistic was used to quantify the magnitude of effect size heterogeneity.29 For multilevel meta-analyses, I² captures the proportion of variance in the effect size explained by each of the levels (eg, between studies or between waves in a study) in the model.30 The I² for each level is computed as the level-specific effect size variance divided by total variance (ie, variance from each of the levels and error). In the event of significant heterogeneity, follow-up moderator analyses were conducted to investigate factors that may account for differences in effect sizes. Moderator analyses were performed with meta-regression models. For the IPD meta-analysis of study dropout rates, the following variables were considered as potential effect size moderators: country of study, years between study waves, study-specific average age, and deviations from study-specific average age within the waves of a study (ie, if a wave in a study was older or younger than typical for the study). For the IPD meta-analysis of cognitive functioning differences, the following variables were considered as potential effect size moderators: country of study, years between study waves, cognitive domain tested, wave-specific dropout rate, study-specific average age, and deviations from study-specific average age within the waves of a study. Computation of heterogeneity statistics and testing for effect size moderators were performed with the metafor27 package.
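Under this decomposition, the level-specific I² values reported in the Results can be written as follows (a sketch of the definition given above, with σ²e denoting the sampling-error variance and the τ² terms limited to the levels present in a given model, eg, study and wave for the dropout proportions; study, wave and test for the cognitive differences):

```latex
I^{2}_{\mathrm{level}} = \frac{\tau^{2}_{\mathrm{level}}}{\tau^{2}_{\mathrm{study}} + \tau^{2}_{\mathrm{wave}} + \tau^{2}_{\mathrm{test}} + \sigma^{2}_{e}} \times 100\%
```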
Patient and public involvement
None.
Results
Table 1 lists the descriptive information for the studies included in the analysis. Information for specific study waves (eg, wave-specific mean age, mean cognitive test Cohen’s d) is shown in online supplemental table B. The total number of unique individuals included was 209 518. Across all studies, from one wave to the next, the average proportion of dropouts was 0.26 (95% CI 0.18 to 0.34). Heterogeneity in the dropout rates was significant (Cochran’s Q=222 160.53, df 60, p<0.001). The estimated variance components were τ² Level 2=0.004, 95% CI (0.003 to 0.007) (wave level) and τ² Level 3=0.019, 95% CI (0.008 to 0.060) (study level). Heterogeneity statistics (I²) suggested that 81.69% of the variance in dropout proportions was attributable to differences at the study level, 18.20% to differences between study waves, and 0.12% to sampling error. Of the potential moderator variables tested (listed under ‘statistical analyses’), study-specific average age (β=0.015, 95% CI (0.006 to 0.025)) and deviations from study-specific average age (β=−0.014, 95% CI (−0.024 to −0.003)) were statistically significant. Figures 1 and 2 depict bubble plots illustrating the effects of moderation by study-specific average age and deviations from study-specific average age, respectively. There is a bubble for each study wave included in the meta-analysis of dropout, and the size of the bubble is proportional to the amount of weight given to the wave in the meta-analysis.
Figure 1. Bubble plot for proportion of dropout as a function of study-specific average age. There is one bubble for each study wave, with the size of the bubble proportional to the magnitude of its weight in the meta-analysis for proportion of dropout.
Figure 2. Bubble plot for proportion of dropout as a function of deviation from study-specific average age (for a wave).
The pooled average effect size for the difference in cognitive performance between participants retained in the next wave and those who attrited, across all cognitive tests, was d=0.37, 95% CI (0.30 to 0.43). This is a small-to-medium effect size.25 As shown in table 1, all 10 studies had positive and significant mean Cohen’s d values. Heterogeneity in the effect sizes was significant (Cochran’s Q=123 090.96, df 596, p<0.001). The estimated variance components were τ² Level 2=0.012, 95% CI (0.010 to 0.014) (cognitive test level), τ² Level 3=0.012, 95% CI (0.007 to 0.019) (wave level), and τ² Level 4=0.008, 95% CI (0.002 to 0.030) (study level). I² statistics indicated that 24.29% of the effect size variance was attributable to differences between studies, 35.23% to differences between study waves, 36.93% to differences between tests within study waves, and 3.56% to sampling error. Of the potential moderator variables tested (listed under ‘statistical analyses’), the cognitive domain categories (F(7, 589)=20.67, p<0.001) and deviations from study-specific average age (β=0.023, 95% CI (0.006 to 0.041)) were statistically significant. Figure 3 graphically depicts the relationship between deviations from study-specific average age and cognitive differences.
Figure 3. Bubble plot for cognitive functioning differences (between individuals who attrit and those who remain in longitudinal studies, with larger differences indicated by a greater Cohen’s d) as a function of deviation from study-specific average age (for a wave).
Cohen’s d values for specific cognitive domains are shown in table 2. Cohen’s ds differed significantly across cognitive functioning domains (F(7, 589)=20.67, p<0.001). The effect sizes ranged from d=0.27 (0.20 to 0.34) for working memory to d=0.51 (0.43 to 0.58) for global cognitive status. These findings were based on a comprehensive assessment of cognitive functioning in all 63 study waves included in analyses (online supplemental table B), with an average of 5.8 cognitive domains assessed (SD 1.49) and 9.86 cognitive tests administered (SD 4.14) at each wave.
Table 2.
Cohen’s d difference in cognition between dropouts and non-dropouts, by cognitive domain
| Overall (n=608 Cohen’s d values) | Episodic memory (n=178) | Working memory (n=69) | Processing speed (n=60) | Executive functioning (n=91) | Spatial reasoning (n=39) | Verbal abilities (n=69) | Clinical neuro.* measures (n=71) | Global cognitive status (n=44) |
| 0.37 (0.30 to 0.43) | 0.41 (0.34 to 0.47) | 0.27 (0.20 to 0.34) | 0.36 (0.28 to 0.43) | 0.38 (0.31 to 0.45) | 0.41 (0.33 to 0.49) | 0.33 (0.26 to 0.41) | 0.31 (0.24 to 0.38) | 0.51 (0.43 to 0.58) |
*Neuro.: neuropsychological.
Discussion
Participants who dropped out of longitudinal epidemiological studies at the next wave had worse cognitive performance than those who remained in the study at the next wave. This effect was present in all datasets analysed here. Notably, significant small-to-moderate differences in Cohen’s d values were found between people who attrited and those who did not across all eight cognitive domains tested. Because cognitive functioning was assessed 2–3 years prior to study attrition in most studies, the differences may have been even greater by the actual time of dropout, though this remains speculative until investigated. I² statistics indicated that the study-level, wave-level, and within-wave variance components for cognitive differences were all sizeable. In terms of attrition, the average proportion of people who dropped out from one wave to the next was slightly more than one quarter, which is considerable. Note, however, that this figure includes attrition due to death and intermediate dropouts. Differences in the proportion of dropouts were largely driven by study-level differences, as indicated by the large study-level I².
The magnitude of cognitive functioning differences between individuals who remained in longitudinal ageing studies and those who attrited varied by cognitive domain (as indicated by effect size moderator analyses) and study (as indicated by significant study-level effect size variance). Cognitive functioning differences may have varied by cognitive domain because of construct differences (eg, global cognitive status is a screen for cognitive impairment, while working memory tests capture the capacity for remembering small amounts of information while using it in cognitive tasks) and measurement scale differences including level of ability assessed (eg, sensitivity to cognitive impairment vs sensitivity to gradients of normal cognitive performance). Global cognitive status may have had the largest effect (d=0.51) because it represents a broad collection of cognitive functions (working memory, verbal abilities, etc) all of which are important for completion of surveys, whereas the other domains are more specific. Study-level variation in cognitive functioning differences may have stemmed from the varying combinations of tests that were administered in each study.
The two operationalisations of age (ie, study-specific average age vs within-study deviations from that average) had different relationships with the proportion of dropout and with cognitive functioning differences between participants who attrited and those who remained. Older individuals have often been found to have greater study dropout (secondary to sickness, mortality, etc),11 which was consistent with the finding here that a greater study-specific average age was associated with greater dropout. However, within a study, we found that waves with participants who were older than the study average had lower dropout rates. This finding may stem from selection effects operating within a study over time: as waves accumulate, the sample increasingly comprises long-term participants, so it becomes steadily older while dropout decreases. In terms of cognitive functioning, as the age of a sample increases, one would expect to observe a greater representation of individuals with poorer cognitive health,31 potentially leading to greater cognitive functioning differences between participants who attrit and those who remain. No relationship was found, however, between study-specific average age and cognitive functioning differences. This may be because studies that enrolled older participants recruited a subset of them with higher cognitive functioning (ie, selection effects), meaning that across studies with different average sample ages there may not have been a large difference in the representation of individuals with lower cognitive functioning. Conversely, within a study, waves with an average participant age higher than the study average showed greater cognitive functioning differences. A possible explanation is that, within a study, as the long-term participants became older, a greater proportion of them demonstrated poorer cognitive health, which in part may have manifested as study attrition.
Our finding of lower cognitive functioning across a wide array of domains in individuals who attrit from longitudinal studies (as compared with those who are retained) is consistent with the argument that a possible cause of study dropout is dementia/cognitive impairment. The diagnostic criteria for dementia/cognitive impairment include impairment in a broad range of cognitive domains including attention, executive functioning, language, and visuospatial functioning.32 Note however that we cannot conclude that older adult participants drop out of studies because of dementia or cognitive impairment, though we may at least posit them as possible causes. Further research is needed to examine the extent to which cognitive impairment and associated functional limitations drive study attrition.
Future directions
Consistent with prior research, we found that older adults who drop out of longitudinal ageing studies have lower cognitive health than those who remain.4 7 8 This may result in selection bias that may need to be accounted for in analyses of ageing studies. For instance, multiple imputation (MI) or full-information maximum likelihood (FIML) missing data treatments that include information on factors associated with missingness (eg, dropout) have been recommended to enhance the plausibility of the assumption that missing data are missing at random.4 Results of this study suggest that measurements from all domains of cognitive functioning may be relevant to missing data (ie, dropout), especially global cognitive status. Cognitive measures taken immediately prior to attrition may be useful in MI or FIML approaches to reduce the effects of selection bias on study findings.
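As an illustration only, an MI workflow of this kind might carry the prior-wave cognitive scores as auxiliary variables in the imputation model. The sketch below uses the R package mice; the data frame df and its variable names are hypothetical.

```r
# Hypothetical sketch: multiple imputation in which prior-wave cognitive scores
# (eg, memory_w1, speed_w1, mmse_w1) sit in the data frame alongside the analysis
# variables, so that they inform the imputation of values missing due to dropout
# (mice uses all other variables in the data frame as predictors by default).
library(mice)

imp <- mice(df, m = 20, seed = 123)              # 20 imputed datasets, default methods

# Fit the substantive model to each imputed dataset and pool with Rubin's rules.
fit <- with(imp, lm(outcome_w2 ~ age + sex))
summary(pool(fit))
```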
Given the cognitive health differences observed across all samples and cognitive domains, our results suggest that future studies with a patient and public involvement and engagement (PPIE) focus may be useful to examine whether attriters from longitudinal ageing studies experience meaningful functional deficits requiring attention. PPIE has many purported benefits, including helping researchers gain fresh insights into issues such as community health needs.33 Potential downsides of PPIE however include the time commitment involved and monetary costs.33 Our finding of consistent differences in cognition between individuals who remain and those who attrit suggests that the potential costs of PPIE to examine the functional state of people who drop out from ageing studies may be justified. There seems to be smoke (ie, consistent cognitive differences), so it may be prudent to check for a fire (ie, difficulties with IADLs and daily functioning).
As an example of a potential future PPIE-based mixed methods study, researchers may first conduct qualitative research (interviews, focus groups, advisory panels, etc) with people who drop out of ageing studies to gain perspectives on questions such as the degree to which cognitive health drove study attrition, participants’ ability to engage in everyday activities, and whether targeted information about resources to address cognitive issues would be appreciated. These insights can then be used to inform the generation of survey questions that can be administered to a larger group of people. Results of such a survey could help indicate the degree to which views communicated in the qualitative interviews generalise to a larger sample of older adults. Research on attriters would need to recruit (some of) them, an effort that might be particularly difficult and require additional incentives and more labour-intensive recruitment approaches (eg, calling the social contacts of the participant who attrited). Recruitment efforts could be rewarded with a better understanding of people who drop out from studies, including whether they are vulnerable to cognitive functioning deficits that require action by researchers.
Limitations
One limitation to note is that differences in cognitive functioning and dropout rates may have been inflated by our inability to partition out participants who attrited due to mortality, a particular concern in studies with older respondents (eg, in ALSA). It is possible that the observed differences in cognitive health and dropout were largely driven by age and frailty rather than by differences in cognitive health independent of these factors. Older and frailer individuals are expected to have lower cognitive functioning31 34 and greater dropout (secondary to sickness, mortality, etc).11 While we were not able to test for effects of frailty on effect sizes, study-specific average age and wave-specific deviations from it were examined as potential moderators of the effect sizes for cognitive functioning differences and dropout rates. Figures 1–3 illustrate how effect sizes changed as a function of age.
We did not conduct a systematic search of all potential panel studies that could have provided cognitive assessments in longitudinal research, which left some populations unrepresented. For instance, the datasets used originated primarily from Western countries. A more comprehensive and exhaustive investigation of differences in cognitive performance between people who remain and individuals who attrit from ageing studies would require future research using a more systematic data search strategy.
We cannot claim that a causal relationship exists between cognitive functioning deficits and study dropout. Future research is needed to examine this question.
Conclusion
Overall, our results suggest that older adults who attrited had lower cognitive functioning across all domains (at the timepoint before attrition) than individuals who remained in longitudinal ageing studies. Future investigations may be useful to examine whether selection bias can be reduced by using missing data treatments that draw information from cognitive functioning measures prior to study dropout. Investment of resources to use PPIE strategies to examine questions such as whether people who attrit from studies experience meaningful cognitive performance deficits, and if so how to address them, could also be valuable.
Supplementary Material
bmjopen-2023-079241supp001.pdf (66.8KB, pdf)
Footnotes
Contributors: SS, RH, EZ and HJ conceptualised the study. P-JL, SS and RH performed the data preparation and analysis, though the bulk of it was carried out by P-JL. HJ and EZ provided the conceptual background. AS, EM, HG, DM and DJ contributed to the data interpretation and discussion. RH and HJ drafted the manuscript after consideration of the conceptual background and viewpoints presented in group discussions. All authors provided feedback on various drafts of the manuscript. RH acts as a guarantor for this paper.
Funding: This work was supported by the National Institute on Aging at the NIH (grant number NIH/NIA R01AG068190).
Competing interests: AS is a senior scientist with the Gallup Organization and a consultant for HRA Pharma. The other authors have no conflicts of interest to declare.
Patient and public involvement: Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.
Provenance and peer review: Not commissioned; externally peer reviewed.
Supplemental material: This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.
Data availability statement
Data may be obtained from a third party and are not publicly available. The data that support the findings of this study came from multiple longitudinal epidemiological studies on ageing, each with its own procedures for accessing their data. We are not able to share these data directly.
Ethics statements
Patient consent for publication
Not applicable.
Ethics approval
The Institutional Review Board of the University of Southern California exempted this study (UP-22-00147) as it involved secondary data analysis of deidentified data.
References
- 1. Jayakody O, Blumen HM, Breslin M, et al. Longitudinal associations between falls and future risk of cognitive decline, the motoric cognitive risk syndrome and dementia: the Einstein ageing study. Age Ageing 2022;51:afac058. 10.1093/ageing/afac058
- 2. Zhang L, Wang J, Dove A, et al. Injurious falls before, during and after dementia diagnosis: a population-based study. Age Ageing 2022;51:afac299. 10.1093/ageing/afac299
- 3. Tommiska P, Korja M, Siironen J, et al. Mortality of older patients with dementia after surgery for chronic subdural hematoma: a nationwide study. Age Ageing 2021;50:815–21. 10.1093/ageing/afaa193
- 4. Kennison RF, Zelinski EM. Estimating age change in list recall in asset and health dynamics of the oldest-old: the effects of attrition bias and missing data treatment. Psychol Aging 2005;20:460–75. 10.1037/0882-7974.20.3.460
- 5. Rydén L, Wetterberg H, Ahlner F, et al. Attrition in the Gothenburg H70 birth cohort studies, an 18-year follow-up of the 1930 cohort. Front Epidemiol 2023;3. 10.3389/fepid.2023.1151519
- 6. Salthouse TA. Attrition in longitudinal data is primarily selective with respect to level rather than rate of change. J Int Neuropsychol Soc 2019;25:618–23. 10.1017/S135561771900016X
- 7. Salthouse TA. Selectivity of attrition in longitudinal studies of cognitive functioning. J Gerontol B Psychol Sci Soc Sci 2014;69:567–74. 10.1093/geronb/gbt046
- 8. Lo RY, Jagust WJ, Aisen P. Predicting missing biomarker data in a longitudinal study of Alzheimer disease. Neurology 2012;78:1376–82. 10.1212/WNL.0b013e318253d5b3
- 9. Tourangeau R. The survey response process from a cognitive viewpoint. QAE 2018;26:169–81. 10.1108/QAE-06-2017-0034
- 10. Mioshi E, Kipps CM, Dawson K, et al. Activities of daily living in frontotemporal dementia and Alzheimer disease. Neurology 2007;68:2077–84. 10.1212/01.wnl.0000264897.13722.53
- 11. Chatfield MD, Brayne CE, Matthews FE. A systematic literature review of attrition between waves in longitudinal studies in the elderly shows a consistent pattern of dropout between differing studies. J Clin Epidemiol 2005;58:13–9. 10.1016/j.jclinepi.2004.05.006
- 12. Hofer SM, Piccinin AM. Integrative data analysis through coordination of measurement and analysis protocol across independent longitudinal studies. Psychol Methods 2009;14:150–64. 10.1037/a0015566
- 13. Luszcz MA, Giles LC, Anstey KJ, et al. Cohort profile: the Australian longitudinal study of ageing (ALSA). Int J Epidemiol 2016;45:1054–63. 10.1093/ije/dyu196
- 14. Christensen H, Mackinnon A, Jorm AF, et al. The Canberra longitudinal study: design, aims, methodology, outcomes and recent empirical investigations. Aging Neuropsychol Cogn 2004;11:169–95. 10.1080/13825580490511053
- 15. Katz MJ, Lipton RB, Hall CB, et al. Age-specific and sex-specific prevalence and incidence of mild cognitive impairment, dementia, and Alzheimer dementia in blacks and whites: a report from the Einstein aging study. Alzheimer Dis Assoc Disord 2012;26:335–43. 10.1097/WAD.0b013e31823dbcfc
- 16. Steptoe A, Breeze E, Banks J, et al. Cohort profile: the English longitudinal study of ageing. Int J Epidemiol 2013;42:1640–8. 10.1093/ije/dys168
- 17. Juster FT, Suzman R. An overview of the health and retirement study. J Hum Resour 1995;30:S7. 10.2307/146277
- 18. Zelinski EM, Kennison RF. The Long Beach longitudinal study: evaluation of longitudinal effects of aging on memory and cognition. Home Health Care Serv Q 2001;19:45–55. 10.1300/J027v19n03_04
- 19. Wong R, Michaels-Obregon A, Palloni A. Cohort profile: the Mexican health and aging study (MHAS). Int J Epidemiol 2017;46:e2. 10.1093/ije/dyu263
- 20. Pedersen NL, McClearn GE, Plomin R. The Swedish adoption twin study of aging: an update. Acta Genet Med Gemellol: Twin Res 1991:7–20. 10.1017/S0001566000006681
- 21. Börsch-Supan A, Brandt M, Hunkler C, et al. Data resource profile: the survey of health, ageing and retirement in Europe (SHARE). Int J Epidemiol 2013;42:992–1001. 10.1093/ije/dyt088
- 22. Whelan BJ, Savva GM. Design and methodology of the Irish longitudinal study on ageing. J Am Geriatr Soc 2013;61 Suppl 2:S265–8. 10.1111/jgs.12199
- 23. Jin H, Junghaenel DU, Orriens B, et al. Developing early markers of cognitive decline and dementia derived from survey response behaviors: protocol for analyses of preexisting large-scale longitudinal data. JMIR Res Protoc 2023;12:e44627. 10.2196/44627
- 24. Arevalo-Rodriguez I, Smailagic N, Roqué I Figuls M, et al. Mini-mental state examination (MMSE) for the detection of Alzheimer’s disease and other dementias in people with mild cognitive impairment (MCI). Cochrane Database Syst Rev 2015;2015:CD010783. 10.1002/14651858.CD010783.pub2
- 25. Cohen J. Statistical power analysis for the behavioral sciences. 2013. 10.4324/9780203771587
- 26. Borenstein M, Hedges LV, Higgins JPT, et al. Introduction to Meta-Analysis. John Wiley & Sons, 2021. 10.1002/9781119558378
- 27. Viechtbauer W. Conducting meta-analyses in R with the metafor package. J Stat Softw 2010;36. 10.18637/jss.v036.i03 Available: http://www.jstatsoft.org/v36/i03/
- 28. R Core Team. R: a language and environment for statistical computing. R Foundation for Statistical Computing, 2020. Available: https://www.r-project.org/ [Accessed 14 Jan 2021].
- 29. Huedo-Medina TB, Sánchez-Meca J, Marín-Martínez F, et al. Assessing heterogeneity in meta-analysis: Q statistic or I² index. Psychol Methods 2006;11:193–206. 10.1037/1082-989X.11.2.193
- 30. Pastor DA, Lazowski RA. On the multilevel nature of meta-analysis: a tutorial, comparison of software programs, and discussion of analytic choices. Multivariate Behav Res 2018;53:74–89. 10.1080/00273171.2017.1365684
- 31. Deary IJ, Corley J, Gow AJ, et al. Age-associated cognitive decline. Br Med Bull 2009;92:135–52. 10.1093/bmb/ldp033
- 32. Hugo J, Ganguli M. Dementia and cognitive impairment: epidemiology, diagnosis, and treatment. Clin Geriatr Med 2014;30:421–42. 10.1016/j.cger.2014.04.001
- 33. Brett J, Staniszewska S, Mockford C, et al. A systematic review of the impact of patient and public involvement on service users, researchers and communities. Patient 2014;7:387–95. 10.1007/s40271-014-0065-0
- 34. Robertson DA, Savva GM, Kenny RA. Frailty and cognitive impairment—a review of the evidence and causal mechanisms. Ageing Res Rev 2013;12:840–51. 10.1016/j.arr.2013.06.004