Author manuscript; available in PMC: 2015 Oct 1.
Published in final edited form as: Psychooncology. 2014 Apr 3;23(10):1133–1141. doi: 10.1002/pon.3522

Self-reported Cognitive Concerns and Abilities: Two sides of one coin?

Jin-Shei Lai 1, Lynne I Wagner 2, Paul B Jacobsen 3, David Cella 4
PMCID: PMC4185008  NIHMSID: NIHMS623884  PMID: 24700645

Abstract

OBJECTIVE

Patient-reported cognitive function can be measured using negatively-worded (Concerns) and positively-worded (Abilities) items. It is possible that reporting Abilities is less subject to the influence of emotional states. This study evaluated the relationship between cognitive concerns and cognitive abilities.

METHODS

Cancer patients (N = 509; mean age=61 years; 50% male; 86% White) completed Concerns and Abilities items developed by the NIH Patient-Reported Outcomes Measurement Information System (PROMIS). Confirmatory factor analysis (CFA) was used to evaluate the extent to which items loaded on a single factor (unidimensionality). Multidimensionality was evaluated using bi-factor analysis (local factors: Concerns and Abilities). Slope parameters from multidimensional item response theory (MIRT) and unidimensional IRT (UIRT) models were compared to evaluate which factor solution fit best.

RESULTS

Acceptable fit indices were found in both one-factor CFA (CFI=0.96, RMSEA=0.062) and bi-factor analysis (CFI=0.98; RMSEA=0.043). Thus, Abilities and Concerns could be considered as a single dimension. Yet high loadings on the local factor in bi-factor analysis and slope discrepancies between UIRT and MIRT indicate that Abilities should be considered as a separate factor from Concerns.

CONCLUSIONS

Concerns and Abilities could be measured using one unidimensional item bank. Results also support measuring each construct separately. We recommend a conservative approach by measuring and reporting Concerns and Abilities separately. We therefore recommend two separate but co-calibrated item banks in the PROMIS network: Cognitive Function Item Bank - Concerns, and Cognitive Function Item Bank - Abilities. Both item banks showed good psychometric properties and are available for research and clinical purposes.

Keywords: Patient-reported cognitive function, Cognitive Concerns, Cognitive Abilities, Cancer, PROMIS, Dimensionality

INTRODUCTION

Advances in the treatment of cancer have led to a steady rise in the number of long-term cancer survivors.[1] Cancer survivors have reported concerns about changes in cognitive function following treatment, which compromise quality of life.[2] The contribution of cancer and its treatment to cognitive impairment has increasingly become a focus for research over the past decade.[3; 4] Disease- and treatment-related impairments have been documented using multiple approaches, including neuroimaging,[5; 6] neuropsychological tests,[3; 7] and patient report.[8]

Given the importance of cognitive dysfunction to survivors' quality of life[2] and ability to function, including return to work,[9; 10] obtaining the patient's perspective is critical to fully understand this symptom.[11; 12] Most studies that have examined patient-reported cognitive function have been conducted with breast cancer survivors. A recently published systematic review identified 27 studies with breast cancer survivors.[13] Of these studies, five utilized a longitudinal design that included pre-treatment measures of cognitive function.[14-18] These studies demonstrated strong evidence for increased cognitive problems immediately following systemic treatment compared to baseline, though results over time after treatment were less consistent. Compared to population-based controls, breast cancer survivors more than 20 years post-adjuvant chemotherapy reported more memory complaints, but fewer symptoms of depression.[19] Higher levels of the pro-inflammatory cytokine tumor necrosis factor-alpha (TNF-α) were associated with memory complaints among breast cancer survivors, and a decline in TNF-α over 12 months post-chemotherapy was associated with fewer memory complaints.[20]

A systematic review of studies conducted with survivors of various types of cancer that compared performance on neuropsychological tests to patient-reported cognitive dysfunction similarly found that a significant proportion of patients reported cognitive impairments following chemotherapy.[8] Although many studies included in this review did not find correlations between patient-reported cognitive dysfunction and neuropsychological test scores, Lai et al[21] found a correlation as high as 0.66 for children with a brain tumor one year post-diagnosis. Patient-reported cognitive problems have also been documented among survivors following stem cell transplant,[22] patients with CNS lymphoma in remission,[23] survivors of testicular cancer,[24; 25] and a cross-sectional sample of patients with various malignancies.[26]

Conceptual Differences between Concerns and Abilities

In many studies, patient-reported cognitive dysfunction has demonstrated a significant association with emotional distress.[13] An evaluation of the determinants of patient-reported cognitive dysfunction among breast cancer survivors identified chemotherapy regimen, a negative affectivity personality trait, and current depression as related to higher dysfunction.[27] In non-cancer populations, negative affectivity has been found to introduce a bias in the reporting of physical symptoms.[28] Proneness to negative affect is associated with reporting greater symptom severity, particularly for vague symptoms.[29] Based on this, it is possible that negatively-worded, problem-focused items (e.g., "I have had trouble concentrating") are more likely to elicit negative affect than positively-worded ability items (e.g., "I am able to concentrate"). In our prior research, we documented that negatively-worded items assessing cognitive dysfunction represent a factor distinct from positively-worded items that assess cognitive abilities.[30] This distinction between negatively- and positively-worded items has also been observed with regard to the emotional impact of cancer.[31] Given the association between patient-reported cognitive dysfunction and clinically significant emotional states including depression and anxiety,[13] we hypothesized that assessing cognitive abilities might contribute novel information to our understanding of the effects of cancer and treatment on cognitive function by reducing the bias introduced by negative emotional states.

Research to date has been limited by the variety of self-reported cognition measures used, which prohibits comparison of results across studies. Additionally, our review found that the Functional Assessment of Cancer Therapy - Cognitive Function (FACT-Cog)[32] is the only patient-reported outcomes measure to include items that assess cognitive abilities. Use of a common measure to assess patient-reported cognitive function would help to advance this line of research, and the inclusion of a scale assessing cognitive abilities may help us better understand cognitive function during and following cancer treatment from the patient's perspective. The Patient-Reported Outcomes Measurement Information System (PROMIS, www.nihpromis.org) is a National Institutes of Health (NIH) Roadmap initiative to develop item banks to measure patient-reported symptoms and other aspects of health-related quality of life across various condition and disease populations. An item bank is comprised of items calibrated by item response theory (IRT) models.[33-36] These items are concrete manifestations of positions along the underlying trait continuum, representing differing levels of that trait. A psychometrically-sound item bank can provide a basis for designing the best set of questions for any particular application, such as computerized adaptive testing (CAT) and static fixed-length short forms.[37; 38] Both CAT and short forms produce scores that are comparable regardless of the specific questions asked of a given individual or group of respondents, and thus provide brief-yet-precise measures that meet the needs of busy clinics.[34; 35; 39] This paper reports the development of the PROMIS cognitive function item banks, with a focus on the dimensionality of these banks through an evaluation of the relationship between self-reported cognitive concerns and abilities.
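As a rough illustration of how a CAT selects "the best set of questions" from a calibrated bank, the sketch below picks the unadministered item with maximum Fisher information at the current trait estimate under the graded response model. This is a minimal Python sketch with a hypothetical three-item mini-bank and made-up parameters; it is not the PROMIS CAT engine.

```python
import numpy as np

def grm_information(theta, a, b):
    """Fisher information of a graded-response-model item at trait level
    theta, via the standard result I(theta) = sum_k (P_k')^2 / P_k."""
    b = np.asarray(b, dtype=float)
    # P*(k): probability of responding in category k or higher
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    p_star = np.concatenate(([1.0], p_star, [0.0]))
    # derivative of each P* with respect to theta: a * P* * (1 - P*)
    d_star = a * p_star * (1.0 - p_star)
    p = p_star[:-1] - p_star[1:]    # category probabilities
    dp = d_star[:-1] - d_star[1:]   # their derivatives
    return float(np.sum(dp**2 / p))

def pick_next_item(theta_hat, items, administered):
    """One CAT step: choose the unadministered item with maximum
    information at the current trait estimate."""
    return max((name for name in items if name not in administered),
               key=lambda name: grm_information(theta_hat, *items[name]))

# Hypothetical mini-bank: item name -> (slope, ordered thresholds)
bank = {"concentrate": (2.5, [-1.0, 0.0, 1.0, 2.0]),
        "memory":      (1.5, [-2.0, -1.0, 0.0, 1.0]),
        "multitask":   (2.0, [0.5, 1.5, 2.5, 3.5])}
first = pick_next_item(0.0, bank, administered=set())  # most informative at theta=0
```

Here the highly discriminating item with thresholds near the current estimate is selected first; items whose thresholds sit far from the estimate contribute little information and are deferred.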

METHODS

Sample

Participants (N=509) were recruited from 1) the Duke Cancer Care Research Program in Durham, NC (n=72); 2) the Duke Tumor Registry (n=283); and 3) NexCura, a nation-wide online registry of more than 500,000 cancer patients (n=154). Participants were eligible if they were 18 years or older, had a diagnosis of cancer and were fluent in English. Our sampling strategy aimed for representation with regard to gender, race, tumor site, and treatment status (i.e., receiving treatment vs. in post-treatment follow-up). The Institutional Review Board approved the study, and all participants provided informed consent.

Development of the Cognitive Function Items

The item bank development processes are shown in Figure 1. In brief, we began by developing a conceptual model via a literature review on cognitive domains and HRQOL and feedback from experts. Semi-structured individual interviews with oncology providers and cancer patients, and focus groups conducted with cancer patients, were used to generate items appropriate for cancer populations. These items were reviewed by experts and patients and revised by the study team, resulting in 42 items, all of which were negatively worded (i.e., concern items). The psychometric properties of these 42 items were evaluated in general oncology outpatients, and a ceiling effect was identified. Consequently, 10 positively-worded items (i.e., ability items) were added in an attempt to minimize the ceiling effect, resulting in the FACT-Cog.[32] In 2009, we expanded the FACT-Cog as part of the PROMIS effort by conducting another series of cognitive interviews with patients, expert panel review, and translatability review. Based on the interview results, existing items were removed or revised, and new items were written to capture concepts missing from the original item pool. The final item pool used in the field testing consisted of 36 ability items and 42 concern items, covering cognitive domains including mental acuity, concentration, memory, verbal fluency, interference with quality of life, comments from others, change from previous functioning, and multi-tasking. Based on qualitative input from patients on item readability and comprehension, a 5-point intensity rating scale (e.g., "I have been able to concentrate"; 1=not at all; 5=very much) was used for the Abilities subset, whereas a 5-point frequency rating scale (e.g., "I have had trouble forming thoughts"; 1=very often; 5=never) was used for the Concerns subset.

Figure 1. Development of the PROMIS Cognitive Function Item Banks

Analysis

In this study, we evaluated the dimensionality of cognitive function items using two approaches. One was a traditional approach, confirmatory factor analysis (CFA), in which a single dimension was specified (i.e., Concerns alone, Abilities alone, or Concerns and Abilities together); that is, all items loaded on a single factor. The results were evaluated using common criteria,[40; 41] including: a) loadings of all items sufficiently large (criterion: R-square > 0.3); b) root mean squared error of approximation (RMSEA) < 0.1; c) comparative fit index (CFI) > 0.90; and d) modification index (MI) < 10.[42; 43] We calculated residual correlations between items and used a criterion of r < 0.2 to indicate local independence (i.e., no secondary factors among items).
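The local-independence screen described above (residual correlations below 0.2) amounts to flagging item pairs whose residual correlation exceeds the criterion. A hypothetical Python sketch with a made-up residual correlation matrix, not the MPlus workflow used in the study:

```python
from itertools import combinations

def flag_local_dependence(resid_corr, item_names, threshold=0.2):
    """Return item pairs whose absolute residual correlation exceeds
    the local-independence criterion (|r| > threshold)."""
    flagged = []
    for i, j in combinations(range(len(item_names)), 2):
        if abs(resid_corr[i][j]) > threshold:
            flagged.append((item_names[i], item_names[j], resid_corr[i][j]))
    return flagged

# Hypothetical 3x3 residual correlation matrix for three items
R = [[0.00, 0.25,  0.05],
     [0.25, 0.00, -0.10],
     [0.05, -0.10, 0.00]]
pairs = flag_local_dependence(R, ["item1", "item2", "item3"])
# Only the item1-item2 pair (r = 0.25) exceeds the 0.2 criterion
```

Flagged pairs would then be reviewed for content overlap, with one member removed if local dependence is judged substantive, as was done for the four items dropped in the Results.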

Cognition can be multidimensional in nature yet dominated by a single predominant factor, with only small factors beyond it. On the other hand, it is also possible that the data are multidimensional, with strong factors beyond the first. In the latter case, unidimensional IRT-based calibrations might result in reporting errors. We therefore evaluated potential multidimensionality among items using bi-factor analysis[43; 44] and multidimensional IRT (MIRT)[45; 46] to determine whether the degree of multidimensionality distorted the item parameters estimated by unidimensional IRT. Both bi-factor analysis and MIRT include two classes of factors: a general factor (i.e., Cognitive Function) and two local factors (or subdomains, i.e., Concerns and Abilities). In a bi-factor analysis, the general factor is defined by loadings from all of the items in the pool, and local factors are defined by loadings from pre-specified groups of items related to each local factor.[43; 44] The bi-factor model uses factor loadings to determine whether items are better treated as unidimensional or multidimensional by comparing the loadings of the same item on the local and general factors. Standardized loadings that are salient (i.e., >0.3) for all items on the general factor would indicate that the general factor is well defined even in the presence of the sub-domain factors.[44] Similarly, if the loadings of all items on a sub-domain factor are salient, the sub-domain is well defined even in the presence of the general factor. CFA and bi-factor analyses were conducted using MPlus version 6 (Muthén & Muthén, Los Angeles, CA) with a polychoric correlation matrix and weighted least squares with adjustments for mean and variance (WLSMV) estimation, which is appropriate for ordered categorical data.
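The salience criterion above (all standardized loadings > 0.3 on a factor) can be expressed as a small helper. This is a hypothetical Python sketch with made-up loadings for five items, not output from the study's MPlus models:

```python
import numpy as np

def salience_summary(general, local, cutoff=0.3):
    """Given standardized loadings of the same items on the general factor
    and on their local factor, report whether each factor is 'well defined'
    under the salience criterion (all loadings > cutoff)."""
    general = np.asarray(general)
    local = np.asarray(local)
    return {
        "general_well_defined": bool((general > cutoff).all()),
        "local_well_defined": bool((local > cutoff).all()),
        "n_salient_local": int((local > cutoff).sum()),
    }

# Hypothetical loadings for five Abilities-type items
out = salience_summary(general=[0.72, 0.81, 0.65, 0.77, 0.70],
                       local=[0.35, 0.42, 0.28, 0.38, 0.31])
```

In this made-up example the general factor is well defined (all loadings salient), while the local factor falls just short because one item loads below 0.3, mirroring the kind of item-by-item comparison shown in Figure 2.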

In the IRT analyses, Samejima's graded response model (GRM),[47; 48] as implemented in IRTPRO version 2.1 (Scientific Software International, Inc., http://www.ssicentral.com), was used for parameter estimation. For each item, the GRM estimates a slope or discrimination parameter (a), which indicates the degree of association between the item responses and the underlying construct, and four thresholds (bk) (for five-category items) that reflect the level of the measured trait (Concerns, Abilities, or the general factor Cognitive Function) at which a response in a given category or higher becomes most probable. We then conducted multidimensional IRT analyses to evaluate whether forcing a potentially multidimensional item set into a unidimensional IRT model distorted the parameter estimation. Distorted item parameter estimates adversely affect scoring and computerized adaptive testing (CAT) administration.[49; 50] To evaluate this, we compared the slope parameters obtained from unidimensional and multidimensional IRT.[51] The statistical procedure used in MIRT is so-called "item" factor analysis, an application of multiple factor analysis directly to the item responses rather than to test scores.[50] This model originated with Gibbons and Hedeker,[52] who derived an item response model for binary data exhibiting the bi-factor structure, as described in the previous paragraph, and developed a practical method of item parameter estimation. Gibbons et al[50] extended this model to polytomous items, which is the approach used in the current study. Technical details of applying the GRM to parameter estimation for multidimensional items can be found in Gibbons et al[53] and are not repeated here.
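The GRM's category probabilities can be sketched in a few lines of Python (a hypothetical illustration of the model's mechanics; the study itself used IRTPRO for estimation). The slope a and thresholds b below are made-up values, not parameters from the item banks:

```python
import numpy as np

def grm_category_probs(theta, a, b):
    """Samejima graded response model: probability of each response
    category for a respondent at trait level theta, given an item's
    slope a and ordered thresholds b (len(b)+1 categories)."""
    b = np.asarray(b, dtype=float)
    # P*(k): cumulative probability of responding in category k or higher
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    p_star = np.concatenate(([1.0], p_star, [0.0]))
    # Category probabilities are differences of adjacent cumulative terms
    return p_star[:-1] - p_star[1:]

# Hypothetical 5-category item: slope 2.0, thresholds spread across the range
probs = grm_category_probs(theta=0.0, a=2.0, b=[-1.5, -0.5, 0.5, 1.5])
# probs has five entries that sum to 1; the middle category is most
# probable for a respondent at the center of the trait continuum
```

The four thresholds correspond to the bk parameters described above: each marks the trait level at which responding in a given category or higher becomes the more probable outcome.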

Finally, we evaluated the convergent validity of the resulting item banks by estimating the correlations with the legacy measures, including two FACT-Cog sub-scales[32] (Interference with Quality of Life and Comments from Others), PROMIS global health - physical and mental, and cognition items included in the EORTC QOL-C30 (European Organization for the Research and Treatment of Cancer Quality of Life Questionnaire).[54] Analysis of variance (ANOVA) was used to evaluate whether Concerns and Abilities differentiated patients with different clinical (e.g., treatment and performance rating) and demographic (e.g., gender and education) categories. Results from these analyses will assist us in making a final decision regarding whether to treat cognitive function as one single item bank or as two separate banks that assess cognitive concerns and cognitive abilities.

RESULTS

Sample

Data from 509 cancer patients were analyzed. The average age of participants was 60.6 years (SD=11.8); 49.8% were male, 85.5% were White, 74.9% were married, and 85% had at least some college education. Cancer diagnoses for this sample included 27.9% breast, 18.2% colon or rectum, 15.7% prostate, and 10.4% lung. Sixty-six percent of patients reported lymph node involvement. Average time since diagnosis was 56.9 months (range: 1–373). The majority of participants reported normal activities without (58.9%) or with (31.2%) symptoms. In terms of treatment during the past month, 59.8% of respondents had not received any treatment for their cancer, 20.1% had received chemotherapy alone (15.6%) or combined with other treatment (4.5%), and 9.7% had received hormonal therapy (e.g., Tamoxifen, Arimidex, Aromasin, Lupron). The majority of the sample reported no problems walking about (82.1%), with self-care (97.5%), or with performing usual activities (71.2%). Approximately half reported no pain or discomfort (54.7%), and most denied feeling anxious or depressed (69.0%).

Evaluation of Unidimensionality

A series of factor analyses were conducted with Concerns and Abilities analyzed separately. Items with low R-square (<0.3) and/or high residual correlations (>0.2) were reviewed and discussed by the study team to determine their inclusion or exclusion based on item content. As a result, the Concerns and Abilities item banks consisted of 34 and 33 items, respectively, with borderline acceptable fit indices: CFI=0.92 and 0.94, and RMSEA=0.084 and 0.113 for Concerns and Abilities, respectively. These items were calibrated using the GRM and formed the PROMIS Cognitive Function item banks version 1 (formerly, Applied Cognition). Only the 67 items retained in version 1 were analyzed in this study.

We first evaluated the dimensionality of these 67 items as a whole. Four items (two Concerns and two Abilities) with residual correlations > 0.2 were removed owing to concerns about local dependence. Among the remaining 63 items, acceptable fit indices were found (CFI=0.983, RMSEA=0.041) using a two-factor CFA approach. A high correlation (r=0.872) was found between the Concerns and Abilities factors. When using a one-factor CFA, all fit indices met the pre-set criteria: CFI=0.961, RMSEA=0.062, R-squares ranged from 0.37 to 0.86, and MIs < 10. These results support the unidimensionality of these 63 items.

Evaluation of Multidimensionality

The same conceptual model was used for both the bi-factor analysis and MIRT: the general factor was Cognitive Function (i.e., all 63 items), and the two local factors were Concerns (i.e., 32 negatively-worded items) and Abilities (i.e., 31 positively-worded items). Satisfactory fit indices were found in the bi-factor analysis (CFI=0.982, RMSEA=0.043, MIs < 10), and all items had higher loadings on the general factor than on their own local factors. As shown in Figure 2, Abilities items generally had loadings > 0.3 on their local factor, indicating the potential to justify scaling Abilities as a factor separate from Concerns. When we evaluated variance explained, the general factor explained 86.6% of the variance while the local factors together explained only 14.4%. These results support a general cognitive function factor.

Figure 2. Comparisons of the factor loadings on the general factor against the local factors (i.e., Abilities or Concerns). All items had loadings greater than 0.5 on the general factor. Abilities items also showed salient loadings (>0.3) on the local factor (upper half of the figure), yet such loadings were not found for most Concerns items (lower half of the figure).

In the IRT-related analyses, slopes obtained from MIRT were compared with those obtained from the unidimensional IRT analysis. Though the two sets of slopes were highly correlated (r=0.976), more variability was found from the middle to the higher end of the continuum. For Concerns, the average discrepancy between the uni- and multidimensional slopes was 0.04 (SD=0.26), ranging from 0.04 to 0.52. For Abilities, the average discrepancy was 0.1 (SD=0.3), ranging from 0.01 to 0.56, indicating greater variability in Abilities than in Concerns; this variability, however, was cancelled out during reliability estimation. These slope discrepancies suggest a multidimensional structure underlying Concerns and Abilities. Forcing the Concerns and Abilities items to be calibrated together might result in distorted estimates, especially for Abilities, when computerized adaptive testing is administered. The correlation between the scaled scores produced by uni- and multidimensional IRT was 0.943 (Figure 3). As expected, more variability was found in the middle of the continuum. Cronbach's alpha was 0.97 for Concerns, 0.98 for Abilities, and 0.99 when Concerns and Abilities were combined.
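The slope-discrepancy summary used above (mean, SD, and range of the absolute differences between calibrations, plus their correlation) can be computed directly. The slopes below are hypothetical values for five items, not the study's estimates:

```python
import numpy as np

def slope_discrepancy(a_uni, a_multi):
    """Summarize discrepancies between item slopes estimated under
    unidimensional vs multidimensional IRT calibrations."""
    a_uni = np.asarray(a_uni, dtype=float)
    a_multi = np.asarray(a_multi, dtype=float)
    diff = np.abs(a_uni - a_multi)
    return {"mean": float(diff.mean()),
            "sd": float(diff.std(ddof=1)),
            "min": float(diff.min()),
            "max": float(diff.max()),
            "r": float(np.corrcoef(a_uni, a_multi)[0, 1])}

# Hypothetical slopes for five items under the two calibrations
out = slope_discrepancy(a_uni=[1.8, 2.1, 2.5, 1.6, 2.9],
                        a_multi=[1.7, 2.3, 2.4, 1.9, 2.5])
```

A high correlation with a non-trivial mean absolute discrepancy, as reported in this study, is exactly the pattern that motivates inspecting individual items rather than relying on the correlation alone.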

Figure 3. Comparisons of IRT scaled scores (theta) between uni- and multidimensional IRT models.

Based on the results of the unidimensionality analyses, Abilities and Concerns could be considered a single dimension. Yet given the high loadings on the local factor in the bi-factor analysis and the slope discrepancies between uni- and multidimensional IRT, especially for Abilities, we concluded that Abilities demonstrated the potential to be a factor independent of Concerns.

Relationship with Demographic and Clinical Factors

The correlation between Concerns and Abilities was 0.81. Both Concerns and Abilities were significantly (p<0.001) correlated with PROMIS Global - Physical Health, PROMIS Global - Mental Health, the FACT-Cognition "Comments from Others" subscale, the FACT-Cognition "Interference with quality of life" subscale, and the EORTC cognition items, with Spearman rho ranging from 0.41 to 0.72 (see Table 1). Abilities had stronger associations with general quality of life scores (PROMIS physical and mental health), while Concerns tended to have stronger associations with the cognition-specific items. Time since diagnosis was not significantly correlated with Concerns or Abilities.

Table 1.

Relationships (Spearman's rho) with legacy items (Concerns / Abilities)

PROMIS Global Health
  PROMIS Global Physical Health: 0.44 / 0.46
  PROMIS Global Mental Health: 0.56 / 0.60
FACT-COG "Comments from Others" subscale
  Other people have noticed that I had problems remembering information: −0.58 / −0.51
  Other people have noticed that I had problems speaking clearly: −0.44 / −0.41
  Other people have noticed that I had problems thinking clearly: −0.53 / −0.49
FACT-COG "Interference with quality of life" subscale
  I have been upset about these problems: −0.68 / −0.64
  These problems have interfered with my ability to work: −0.66 / −0.63
  These problems have interfered with my ability to do things I enjoy: −0.61 / −0.58
  These problems have interfered with the quality of my life: −0.63 / −0.60
EORTC Cognition subscale
  Have you had difficulty in concentrating on things, like reading a newspaper or watching television? −0.60 / −0.58
  Have you had difficulty remembering things? −0.72 / −0.66

As shown in Table 2, Concerns significantly differentiated patients by treatment type (F(3,504)=7.6, p<.001), performance rating (F(2,506)=35.84, p<.001), education level (F(2,346)=6.8, p=.001), and gender (t=2.93, p=.004). All four variables remained significant predictors of Concerns in a multiple regression analysis, F(8,310)=11.69, p<.001. Similar results were found for Abilities, which significantly differentiated patients by treatment type (F(3,504)=8.48, p<.001), performance rating (F(2,506)=44.13, p<.001), and education level (F(2,346)=13.8, p<.001). Gender-based differences in Abilities were also observed (t=2.68, p=.008). These four variables remained significant predictors of Abilities in a multiple regression analysis, F(8,310)=19.84, p<.001. Post-hoc Tukey's tests showed the same significant comparison groups for both Concerns and Abilities, with one exception based on treatment in the past month: patients who did not receive any treatment in the past month had better Abilities scores than those who received "other types of treatment", a difference not observed for Concerns.

Table 2.

Convergent validity of Concerns and Abilities on selected variables.

Treatment: no treatment in the past month (n=304); hormonal therapy (n=49); chemotherapy (n=99); other types of treatment (n=55)
  Concerns: F(3,504)=7.6, p<.0001 (note 1); Abilities: F(3,504)=8.48, p<.0001 (note 2)
ECOG (note 3): normal activities, no symptoms (n=300); normal activities, some symptoms (n=159); required bed rest (n=50)
  Concerns: F(2,506)=35.84, p<.0001; Abilities: F(2,506)=44.13, p<.0001
Education (note 4): high school or lower (n=74); some college or college graduate (n=310); advanced degree (n=125)
  Concerns: F(2,346)=6.8, p=.0013; Abilities: F(2,346)=13.8, p<.0001
Gender (note 5): male (n=232); female (n=234)
  Concerns: t=2.93, p=.0035; Abilities: t=2.68, p=.0076

NOTE: Higher scores indicate better function and fewer problems.

1. "No treatment" had significantly better Concerns scores than "chemotherapy".
2. "No treatment" had significantly better Abilities scores than "chemotherapy" and "other types of treatment".
3. Significant differences in Concerns and Abilities scores were found for all comparisons.
4. Significant differences in Concerns and Abilities scores were found for all comparisons except between "some college or college graduate" and "advanced degree".
5. Males had better Concerns and Abilities scores than females.

DISCUSSION

The contribution of cancer and its treatment to cognitive impairment has become a focus of research over the past decade.[3; 20; 55-58] "Cancer dyscognition" is a relatively new research area, and its underlying mechanisms are not yet well understood.[59; 60] We report our efforts in developing comprehensive self-reported cognition measures via PROMIS: the Cognitive Function-Concerns and Cognitive Function-Abilities item banks. These measures were developed via rigorous qualitative and quantitative approaches. Several waves of focus groups, individual interviews, and cognitive interviews were conducted to fully capture patients' experiences in their own words. After completing this qualitative work, we evaluated the dimensionality of self-reported cognition to ensure that we were accurately measuring and scoring self-reported cognitive function. These analyses extended beyond the current PROMIS methodology[41] to further evaluate the degree of multidimensionality within these items and how the multidimensional nature of the items affected the estimation of item parameters and, consequently, final scores. The results indicated that Concerns and Abilities could be reported as one single score (i.e., Cognitive Function) or as two separate scores, and both reporting methods are psychometrically sound. We recommend reporting Concerns and Abilities scores separately; our rationale for this decision is described below.

Both the one-factor CFA and the bi-factor analysis supported the idea that Concerns and Abilities are similar enough to be treated as a single dimension. For this reason, we proceeded to calibrate both sets of items on a single continuum, which we label Cognitive Function. However, as illustrated by Figure 2, there appears to be sufficient separation between the clusters of factor loadings for Concerns and Abilities to support separate reporting, even though both sets of items were calibrated together. Comparisons between uni- and multidimensional IRT indicated more variability among patients in Abilities than in Concerns, as shown by the slope discrepancies (average slope differences of 0.04 for Concerns and 0.1 for Abilities). Our cognitive interview data (not reported here) also indicated that patients used different thinking processes when responding to Abilities and Concerns items. Patients reported that it was relatively straightforward to estimate the number of times in the past 7 days they experienced a problem with cognitive function. However, they reported that answering the Abilities items required an extra step, because they had to consider which aspects of their cognitive experience to weigh when judging severity over a 7-day period. Despite this extra step, a subgroup of participants reported that they preferred the Abilities items because these items focused on positive aspects of cognitive function rather than on problems. Both the qualitative and quantitative data suggest that Abilities and Concerns might tap fundamentally different concepts, even if they are not psychometrically distinct.

Self-reported cognitive ability is a relatively new area; to our knowledge, the measure developed in this project is the first to tap this concept. Our previous study[30] suggested that Concerns and Abilities were two independent constructs; however, a limitation of that study was the imbalanced ratio of cognitive concerns items (32) to cognitive abilities items (10), which potentially introduced method effects. In this project, we ensured balance between Concerns and Abilities in both content coverage and number of items. As recommended by Gibbons and colleagues,[50] ". . .if a given set of items is found to be multidimensional, development of the test instrument can proceed in different ways according to the objective of the investigator. If the purpose is to obtain a profile of measurements describing the respondent's position on each dimension, the factor analysis will identify the subsets of items that best represent the corresponding dimensions. The subsets can then be presented to future respondents as separate measures, possibly with separate instructions and separately timed. Alternatively, all of the items can be presented as a single test and the respondents positions on the dimensions estimated directly in the form of factor scores."(p.28) We decided to take the more conservative approach of reporting Abilities and Concerns separately. We therefore recommend two separate but co-calibrated item banks in the PROMIS network: the Cognitive Function Item Bank - Concerns and the Cognitive Function Item Bank - Abilities. Future research should examine how these two concepts relate to patients' clinical features; for example, the extent to which Abilities may be less vulnerable than Concerns to the influence of emotional states among patients with depression and anxiety.

In clinical practice, self-reported cognitive impairments may signify many possible clinically significant events, and self- or proxy-report of cognitive problems is one of the most common reasons for conducting neuropsychological testing. The potential of self-report as a method for monitoring cognitive functioning among adults with cancer is supported by recent studies that have found significant correlations between patient self-reported cognitive concerns and neuroimaging findings.[57; 61; 62] Even in the absence of an association with neuropsychological testing, self-reported cognition reflects subjective concern or distress in patients, which, we suggest, is an important component of health-related quality of life. The item banks developed in this study are valid and psychometrically sound and can facilitate research on the role of self-reported cognitive function in patients' daily living.

Limitations of the current study are noted. The patients recruited might not be representative of the national cancer population. Future studies should evaluate the impact of patient characteristics, including, but not limited to, education level, financial status, and type of treatment, on Concerns and Abilities. It would also be important to evaluate self-reported cognition prospectively and longitudinally, from pre-treatment through long-term follow-up, to better understand the trajectory of cognitive function.

In conclusion, two PROMIS Cognitive Function item banks were developed in this study: Concerns and Abilities. Both showed good psychometric properties and were significantly correlated with PROMIS Global - Physical Health, PROMIS Global - Mental Health, FACT-Cognition, and EORTC cognition items. Both significantly differentiated patients by treatment type, performance status, education level, and gender. These item banks are ready for use and, by providing a psychometrically robust, brief, and precise approach to assessing cognitive function, will advance research on this salient and distressing symptom.

Acknowledgments

This work was supported by NIH grants U01 AR052177 (PI: David Cella) and R01 CA60068 (PI: David Cella). Data were collected by Duke University Medical Center via U01 AR052186 (PI: Kevin Weinfurt).

Footnotes

Author disclosures: The authors report no financial or personal relationships that might bias this work.

Contributor Information

Jin-Shei Lai, Department of Medical Social Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL.

Lynne I. Wagner, Department of Medical Social Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL

Paul B. Jacobsen, Department of Health Outcomes and Behavior, Moffitt Cancer Center, Tampa, FL

David Cella, Department of Medical Social Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL

References

  • 1. American Cancer Society. Cancer Facts & Figures 2012. Oncology Nutrition Connection. 2012;20(4):18–19.
  • 2. Baker F, Denniston M, Smith T, West MM. Adult cancer survivors: How are they faring? Cancer. 2005;104(S11):2565–2576. doi: 10.1002/cncr.21488.
  • 3. Ahles TA, Root JC, Ryan EL. Cancer- and Cancer Treatment–Associated Cognitive Change: An Update on the State of the Science. Journal of Clinical Oncology. 2012;30(30):3675–3686. doi: 10.1200/JCO.2012.43.0116.
  • 4. Ganz PA. Doctor, Will the Treatment You Are Recommending Cause Chemobrain? Journal of Clinical Oncology. 2012;30(3):229–231. doi: 10.1200/JCO.2011.39.4288.
  • 5. Deprez S, Amant F, Smeets A, Peeters R, Leemans A, Van Hecke W, Verhoeven JS, Christiaens MR, Vandenberghe J, Vandenbulcke M, Sunaert S. Longitudinal Assessment of Chemotherapy-Induced Structural Changes in Cerebral White Matter and Its Correlation With Impaired Cognitive Functioning. Journal of Clinical Oncology. 2012;30(3):274–281. doi: 10.1200/JCO.2011.36.8571.
  • 6. Silverman D, Dy C, Castellon S, Lai J, Pio B, Abraham L, Waddell K, Petersen L, Phelps M, Ganz P. Altered frontocortical, cerebellar, and basal ganglia activity in adjuvant-treated breast cancer survivors 5–10 years after chemotherapy. Breast Cancer Research and Treatment. 2007;103(3):303–311. doi: 10.1007/s10549-006-9380-z.
  • 7. Jim HSL, Phillips KM, Chait S, Faul LA, Popa MA, Lee YH, Hussin MG, Jacobsen PB, Small BJ. Meta-Analysis of Cognitive Functioning in Breast Cancer Survivors Previously Treated With Standard-Dose Chemotherapy. Journal of Clinical Oncology. 2012;30(29):3578–3587. doi: 10.1200/JCO.2011.39.5640.
  • 8. Hutchinson AD, Hosking JR, Kichenadasse G, Mattiske JK, Wilson C. Objective and subjective cognitive impairment following chemotherapy for cancer: A systematic review. Cancer Treatment Reviews. 2012;38(7):926–934. doi: 10.1016/j.ctrv.2012.05.002.
  • 9. Reid-Arndt S, Cox C. Stress, Coping and Cognitive Deficits in Women After Surgery for Breast Cancer. Journal of Clinical Psychology in Medical Settings. 2012:1–11. doi: 10.1007/s10880-011-9274-z.
  • 10. Boykoff N, Moieni M, Subramanian SK. Confronting chemobrain: an in-depth look at survivors’ reports of impact on work, social networks, and health care response. Journal of Cancer Survivorship: Research and Practice. 2009;3(4):223–232. doi: 10.1007/s11764-009-0098-x.
  • 11. Tannock IF, Ahles TA, Ganz PA, van Dam F. Cognitive Impairment Associated with Chemotherapy for Cancer: Report of a Workshop. Journal of Clinical Oncology. 2004;22(11). doi: 10.1200/JCO.2004.08.094.
  • 12. Castellon SA, Ganz PA, Bower JE, Petersen L, Abraham L, Greendale GA. Neurocognitive performance in breast cancer survivors exposed to adjuvant chemotherapy and tamoxifen. Journal of Clinical and Experimental Neuropsychology. 2004;26(7):955–969. doi: 10.1080/13803390490510905.
  • 13. Pullens MJJ, De Vries J, Roukema JA. Subjective cognitive dysfunction in breast cancer patients: a systematic review. Psycho-Oncology. 2010;19(11):1127–1138. doi: 10.1002/pon.1673.
  • 14. Jansen CE, Dodd MJ, Miaskowski CA, Dowling GA, Kramer J. Preliminary results of a longitudinal study of changes in cognitive function in breast cancer patients undergoing chemotherapy with doxorubicin and cyclophosphamide. Psycho-Oncology. 2008;17(12):1189–1195. doi: 10.1002/pon.1342.
  • 15. Jenkins V, Shilling V, Deutsch G, Bloomfield D, Morris R, Allan S, Bishop H, Hodson N, Mitra S, Sadler G, Shah E, Stein R, Whitehead S, Winstanley J. A 3-year prospective study of the effects of adjuvant treatments on cognition in women with early stage breast cancer. British Journal of Cancer. 2006;94(6):828–834. doi: 10.1038/sj.bjc.6603029.
  • 16. Hermelink K, Untch M, Lux MP, Kreienberg R, Beck T, Bauerfeind I, Münzel K. Cognitive function during neoadjuvant chemotherapy for breast cancer: results of a prospective, multicenter, longitudinal study. Cancer. 2007;109(9):1905–1913. doi: 10.1002/cncr.22610.
  • 17. Hurria A, Goldfarb S, Rosen C, Holland J, Zuckerman E, Lachs M, Witmer M, Gorp W, Fornier M, D’Andrea G, Moasser M, Dang C, Poznak C, Robson M, Currie V, Theodoulou M, Norton L, Hudis C. Effect of adjuvant breast cancer chemotherapy on cognitive function from the older patient’s perspective. Breast Cancer Research and Treatment. 2006;98(3):343–348. doi: 10.1007/s10549-006-9171-6.
  • 18. Quesnel C, Savard J, Ivers H. Cognitive impairments associated with breast cancer treatments: results from a longitudinal study. Breast Cancer Research and Treatment. 2009;116(1):129–130. doi: 10.1007/s10549-008-0114-2.
  • 19. Koppelmans V, Breteler MMB, Boogerd W, Seynaeve C, Gundy C, Schagen SB. Neuropsychological Performance in Survivors of Breast Cancer More Than 20 Years After Adjuvant Chemotherapy. Journal of Clinical Oncology. 2012;30(10):1080–1086. doi: 10.1200/JCO.2011.37.0189.
  • 20. Ganz PA, Bower JE, Kwan L, Castellon SA, Silverman DHS, Geist C, Breen EC, Irwin MR, Cole SW. Does tumor necrosis factor-alpha (TNF-α) play a role in post-chemotherapy cerebral dysfunction? Brain, Behavior, and Immunity. 2012. doi: 10.1016/j.bbi.2012.07.015.
  • 21. Lai J-S, Zelko F, Krull K, Cella D, Nowinski C, Manley P, Goldman S. Parent-reported cognition of children with cancer and its potential clinical usefulness. Quality of Life Research. 2013:1–10. doi: 10.1007/s11136-013-0548-9. Epub ahead of print.
  • 22. Jacobs SR, Jacobsen PB, Booth-Jones M, Wagner LI, Anasetti C. Evaluation of the functional assessment of cancer therapy cognitive scale with hematopoietic stem cell transplant patients. Journal of Pain and Symptom Management. 2007;33(1):13–23. doi: 10.1016/j.jpainsymman.2006.06.011.
  • 23. Fliessbach K, Helmstaedter C, Urbach H, Althaus A, Pels H, Linnebank M, Juergens A, Glasmacher A, Schmidt-Wolf IG, Klockgether T, Schlegel U. Neuropsychological outcome after chemotherapy for primary CNS lymphoma: a prospective study. Neurology. 2005;64(7):1184–1188. doi: 10.1212/01.WNL.0000156350.49336.E2.
  • 24. Skaali T, Fosså SD, Andersson S, Cvancarova M, Langberg CW, Lehne G, Dahl AA. Self-reported cognitive problems in testicular cancer patients: Relation to neuropsychological performance, fatigue, and psychological distress. Journal of Psychosomatic Research. 2011;70(5):403–410. doi: 10.1016/j.jpsychores.2010.12.004.
  • 25. Schagen SB, Boogerd W, Muller MJ, Huinink WT, Moonen L, Meinhardt W, Van Dam FS. Cognitive complaints and cognitive impairment following BEP chemotherapy in patients with testicular cancer. Acta Oncologica (Stockholm, Sweden). 2008;47(1):63–70. doi: 10.1080/02841860701518058.
  • 26. Poppelreuter M, Weis J, Külz AK, Tucha O, Lange KW, Bartsch HH. Cognitive dysfunction and subjective complaints of cancer patients: a cross-sectional study in a cancer rehabilitation centre. European Journal of Cancer. 2004;40(1):43–49. doi: 10.1016/j.ejca.2003.08.001.
  • 27. Hermelink K, Küchenhoff H, Untch M, Bauerfeind I, Lux MP, Bühner M, Manitz J, Fensterer V, Münzel K. Two different sides of ‘chemobrain’: determinants and nondeterminants of self-perceived cognitive dysfunction in a prospective, randomized, multicenter study. Psycho-Oncology. 2010;19(12):1321–1328. doi: 10.1002/pon.1695.
  • 28. Costa PT, Jr, McCrae RR. Neuroticism, somatic complaints, and disease: is the bark worse than the bite? Journal of Personality. 1987;55(2):299–316. doi: 10.1111/j.1467-6494.1987.tb00438.x.
  • 29. Mora PA, Halm E, Leventhal H, Ceric F. Elucidating the relationship between negative affectivity and symptoms: the role of illness-specific affective responses. Annals of Behavioral Medicine. 2007;34(1):77–86. doi: 10.1007/BF02879923.
  • 30. Lai JS, Butt Z, Wagner L, Sweet JJ, Beaumont JL, Vardy J, Jacobsen PB, Jacobs SR, Shapiro PJ, Cella D. Evaluating the dimensionality of perceived cognitive function. Journal of Pain and Symptom Management. 2009;37(6):982–995. doi: 10.1016/j.jpainsymman.2008.07.012.
  • 31. Lai JS, Garcia SF, Salsman JM, Rosenbloom S, Cella D. The psychosocial impact of cancer: evidence in support of independent general positive and negative components. Quality of Life Research. 2012;21(2):195–207. doi: 10.1007/s11136-011-9935-2.
  • 32. Wagner LI, Sweet J, Butt Z, Lai JS, Cella D. Measuring Patient Self-Reported Cognitive Function: Development of the Functional Assessment of Cancer Therapy–Cognitive Function Instrument. Journal of Supportive Oncology. 2009;7:W32–W39.
  • 33. Lai JS, Butt Z, Zelko F, Cella D, Krull K, Kieran M, Goldman S. Development of a Parent-Report Cognitive Function Item Bank Using Item Response Theory and Exploration of its Clinical Utility in Computerized Adaptive Testing. Journal of Pediatric Psychology. 2011;36(7):766–779. doi: 10.1093/jpepsy/jsr005.
  • 34. Lai JS, Cella D, Choi SW, Junghaenel DU, Christodolou C, Gershon R, Stone A. How Item Banks and Their Application Can Influence Measurement Practice in Rehabilitation Medicine: A PROMIS Fatigue Item Bank Example. Archives of Physical Medicine and Rehabilitation. 2011;92(10 Supplement):S20–S27. doi: 10.1016/j.apmr.2010.08.033.
  • 35. Cella D, Gershon R, Lai JS, Choi S. The future of outcomes measurement: item banking, tailored short-forms, and computerized adaptive assessment. Quality of Life Research. 2007;16(Suppl 1):133–141. doi: 10.1007/s11136-007-9204-6.
  • 36. Hays RD, Liu H, Spritzer K, Cella D. Item Response Theory Analyses of Physical Functioning Items in the Medical Outcomes Study. Medical Care. 2007;45(5 Suppl 1):S32–S38. doi: 10.1097/01.mlr.0000246649.43232.82.
  • 37. Lai JS, Cella D, Dineen K, Von Roenn J, Gershon R. An item bank was created to improve the measurement of cancer-related fatigue. Journal of Clinical Epidemiology. 2005;58(2):190–197. doi: 10.1016/j.jclinepi.2003.07.016.
  • 38. Lai JS, Cella D, Chang CH, Bode RK, Heinemann AW. Item banking to improve, shorten and computerize self-reported fatigue: an illustration of steps to create a core item bank from the FACIT-Fatigue Scale. Quality of Life Research. 2003;12(5):485–501. doi: 10.1023/a:1025014509626.
  • 39. Gershon R, Cella D, Dineen K, Rosenbloom S, Peterman A, Lai JS. Item response theory and health-related quality of life in cancer. Expert Review of Pharmacoeconomics & Outcomes Research. 2003;3(6):783–791. doi: 10.1586/14737167.3.6.783.
  • 40. Hu LT, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling. 1999;6(1):1–55.
  • 41. Reeve BB, Hays RD, Bjorner JB, Cook KF, Crane PK, Teresi JA, Thissen D, Revicki DA, Weiss DJ, Hambleton RK, Liu H, Gershon R, Reise SP, Lai JS, Cella D. Psychometric evaluation and calibration of health-related quality of life item banks: plans for the Patient-Reported Outcomes Measurement Information System (PROMIS). Medical Care. 2007;45(5 Suppl 1):S22–31. doi: 10.1097/01.mlr.0000250483.85507.04.
  • 42. Muthen LK, Muthen BO. Mplus User’s Guide. Los Angeles, CA: Muthen & Muthen; 2006.
  • 43. Lai JS, Crane PK, Cella D. Factor analysis techniques for assessing sufficient unidimensionality of cancer related fatigue. Quality of Life Research. 2006;15(7):1179–1190. doi: 10.1007/s11136-006-0060-6.
  • 44. McDonald RP. Test Theory: A Unified Treatment. Mahwah, NJ: Lawrence Erlbaum Associates, Inc; 1999.
  • 45. Reckase MD. The past and future of multidimensional item response theory. Applied Psychological Measurement. 1997;21:25–36.
  • 46. Reise SP, Morizot J, Hays RD. The role of the bifactor model in resolving dimensionality issues in health outcomes measures. Quality of Life Research. 2007;16(Suppl 1):19–31. doi: 10.1007/s11136-007-9183-7.
  • 47. Samejima F. Estimation of latent ability using a response pattern of graded scores. Psychometrika Monograph. 1969;(17).
  • 48. Samejima F. Graded Response Model. In: van der Linden WJ, Hambleton RK, editors. Handbook of Modern Item Response Theory. New York: Springer; 1997. pp. 85–100.
  • 49. Folk VG, Green BF. Adaptive Estimation When the Unidimensionality Assumption of IRT Is Violated. Applied Psychological Measurement. 1989;13(4):373–389.
  • 50. Gibbons RD, Immekus JC, Bock RD. The Added Value of Multidimensional IRT Models: Didactic Workbook. Chicago, IL: Center for Health Statistics, University of Illinois at Chicago; 2007.
  • 51. Reise SP, Cook KF, Moore TM. A Direct Modeling Approach for Evaluating the Impact of Multidimensionality on Unidimensional Item Response Theory Model Parameters. In: Reise SP, Revicki D, editors. Handbook of Applied Item Response Theory in Typical Performance Assessment. New York: Taylor & Francis; (In press).
  • 52. Gibbons R, Hedeker D. Full-information item bi-factor analysis. Psychometrika. 1992;57(3):423–436.
  • 53. Gibbons RD, Bock RD, Hedeker D, Weiss DJ, Segawa E, Bhaumik DK, Kupfer DJ, Frank E, Grochocinski VJ, Stover A. Full-Information Item Bifactor Analysis of Graded Response Data. Applied Psychological Measurement. 2007;31(1):4–19.
  • 54. Sprangers MA, Cull A, Groenvold M, Bjordal K, Blazeby J, Aaronson NK. The European Organization for Research and Treatment of Cancer approach to developing questionnaire modules: An update and overview. EORTC Quality of Life Study Group. Quality of Life Research. 1998;7(4):291–300. doi: 10.1023/a:1024977728719.
  • 55. Rosler A, Gonnenwein C, Muller N, Sterzer P, Kleinschmidt A, Frolich L. The fuzzy frontier between subjective memory complaints and early dementia: A survey of Patient Management in German Memory Clinics. Dementia and Geriatric Cognitive Disorders. 2004:17. doi: 10.1159/000076360.
  • 56. Saykin AJ, Wishart HA, Rabin LA, Santulli RB, Flashman LA, West JD, McHugh TL, Mamourian AC. Older adults with cognitive complaints show brain atrophy similar to that of amnestic MCI. Neurology. 2006;67(5):834–842. doi: 10.1212/01.wnl.0000234032.77541.a2.
  • 57. Ferguson RJ, McDonald BC, Saykin AJ, Ahles TA. Brain structure and function differences in monozygotic twins: Possible effects of breast cancer chemotherapy. Journal of Clinical Oncology. 2007;25(25):3866–3870. doi: 10.1200/JCO.2007.10.8639.
  • 58. Joly F, Rigal O, Noal S, Giffard B. Cognitive dysfunction and cancer: which consequences in terms of disease management? Psycho-Oncology. 2011;20(12):1251–1258. doi: 10.1002/pon.1903.
  • 59. Hess L, Insel K. Chemotherapy-Related Change in Cognitive Function: A Conceptual Model. Oncology Nursing Forum. 2007;34(5):981–994. doi: 10.1188/07.ONF.981-994.
  • 60. Raffa RB. Is a picture worth a thousand (forgotten) words?: neuroimaging evidence for the cognitive deficits in ‘chemo-fog’/‘chemo-brain’. Journal of Clinical Pharmacy and Therapeutics. 2010;35(1):1–9. doi: 10.1111/j.1365-2710.2009.01044.x.
  • 61. de Groot JC, de Leeuw FE, Oudkerk M, Hofman A, Jolles J, Breteler MM. Cerebral white matter lesions and subjective cognitive dysfunction: The Rotterdam Scan Study. Neurology. 2001;56(11):1539–1545. doi: 10.1212/wnl.56.11.1539.
  • 62. Mahone EM, Martin R, Kates WR, Hay T, Horska A. Neuroimaging correlates of parent ratings of working memory in typically developing children. Journal of the International Neuropsychological Society. 2009;15(1):31–41. doi: 10.1017/S1355617708090164.