Abstract
Accurate retrospective reporting of activities and symptoms has been shown to be problematic for older adults, yet standard clinical care relies on self-reports to aid in assessment and management. Our aim was to examine the relationship between self-report and sensor-based measures of activity. We administered an online activity survey to participants in our ongoing longitudinal study of in-home ubiquitous monitoring. We found a wide range of accuracy when comparing self-report with time-stamped sensor data. Of the 95 participants who completed the two-hour activity log, nearly one-quarter did not complete the task in a way that could potentially be compared with sensor data. Where comparisons were possible, agreement between self-reported and sensor-based activity was achieved by a minority of participants. The findings suggest that capturing real-time events with unobtrusive activity monitoring may be a more reliable approach to describing behavior patterns and meaningful changes in older adults.
Keywords: in-home monitoring, technology, self-report assessments
Self-report assessment is a cost-effective means of gathering large amounts of data, and it can be adapted for different populations and research applications. Thus, retrospective self-reports of health status, daily activity, clinical symptomatology, and other important health-related variables have been widely used for decades in health outcomes research. However, the poor or inconsistent reliability of self-report questionnaires and inventories has been well recognized in many contexts. Discrepancies between self-report and more objective measures have been documented in a wide variety of important behaviors, including medication adherence (Wild & Cotrell, 2003), health service utilization (Wallihan, Stump, & Callahan, 1999), dietary intake (O’Loughlin et al., 2013), and driving (Cotrell, Wild, & Bader, 2006). This is an increasingly important issue as more survey research moves online while, at the same time, potentially more accurate and objective methods of data collection are tested and developed.
For example, the importance of physical activity in maintaining health and independence in older adults has been well established (Concannon, Grierson, & Harrast, 2012; Sun, Norrman, & While, 2013). At the same time, self-reporting of types, intensities, and amounts of physical activity has been shown to be unreliable. Saelens and Sallis (2000), in a review of self-report physical activity measures, concluded that while such measures have been developed for use with older adults, impaired recall, misinterpretation of survey items, and social desirability biases can limit the accuracy of data collected by those methods. Similarly, Tudor-Locke and Myers (2001) describe limitations of physical activity questionnaires when administered to more sedentary adults. Most instruments fail to address lower levels of activity both in terms of time spent and level of intensity, and emphasize more vigorous pursuits. Given the documented limited reliability of such measures in assessing activities more typical of older adults, the authors recommend approaches that combine self-report with more objective measures derived from motion sensing devices.
With the development of applicable technologies, research has recently focused on achieving greater levels of accuracy in detecting and describing daily physical activity. Banda et al. (2010) found poor agreement between questionnaire responses and accelerometer data in middle-aged and older adults, with significantly higher levels of activity on self-report. Others have reported similar results with discrepancies between self-report and accelerometry (Dyrstad, Hansen, Holme, & Anderssen, 2014; Grimm, Swarz, Hart, Miller, & Strath, 2012; Hekler et al., 2012), pedometers (Harris et al., 2009), and doubly labeled water as a measure of total energy expenditure (Neilson, Robson, Friedenreich, & Csizmadi, 2008). In a review of 21 studies comparing self-report and direct measures of physical activity, Kowalski et al. (2012) found “weak to moderate” correlations between these data collection methods. Further analysis revealed stronger associations among direct measures than among indirect, or self-report, measures of activity. The authors suggest that the measurement of physical activity in older adults presents unique challenges, including intra-individual variability due to changes in health, and failure of indirect measures to appropriately assess intensity, type, and duration of activity relevant to this population.
In response to the demonstrated unreliability of self-reports of daily events, more direct reporting techniques have been developed to minimize the effects of memory lapses and recall biases. Ecological Momentary Assessment (EMA; Kahneman, Krueger, Schkade, Schwarz, & Stone, 2004) is based on real-time monitoring or sampling of behaviors of interest as they occur in one’s natural environment (Shiffman, Stone, & Hufford, 2008). In a review of EMA research with older adults, Cain et al. (2009) described the typical study as having collected data for a period of two weeks, related to physical and daily living activities. They found compliance to be acceptable across studies and recommended the use of such methods in combination with other continuous monitoring techniques and domains.
The Day Reconstruction Method (Kahneman et al., 2004) is a strategy for real-time behavior sampling in which a structured diary format elicits recall of a continuous sequence of events by means of a detailed description of the duration and location of activities and the related social interactions and affect states. Similarly, Milligan et al. (2005) used “solicited diaries” to collect data on the health and daily activities of older adults. They found that this method offered unique insights that were less influenced by memory lapses or biases because it captured events close in time to their actual occurrence. However, limitations of this technique included differences in response styles and possible burden on respondents with increasing duration of participation. Jacelon and Imperio (2005) also reported that solicited diaries were a productive means of tracking daily activities of older adults. However, they found that the richness of the data tended to decline over time, and they recommended a two-week maximum for such intense participation by older adults. Others have reported mixed results with the diary method. A large sample of older adults completed a 7-day daily diary symptom report (Aroian & Vander Wal, 2007), but respondents were found to endorse fewer symptoms with each successive day. The authors hypothesized that frequently occurring symptoms become “taken for granted” and thus are less likely to be reported on a daily basis. They recommended the use of this method primarily for quantifying the appearance of new symptoms or change in symptoms over time. In an effort to reduce burden on older adults, we used an abbreviated Day Reconstruction Method approach to record activities for a 24-hour period. Consistent with other studies, we found wide variability both across participants and within diaries in terms of level of detail and consistency of response (Wild, Maxwell, Campbell, Hayes, & Kaye, 2011).
In-home monitoring of daily activity and behavior has benefited from dramatic advances in related technologies. Brownsell et al. (2011), in a review of “lifestyle monitoring technologies,” concluded that such approaches are useful in tracking and detecting changes in activity, but that the important next step of applying those data to the identification of health care needs and interventions has not yet been reported. Similarly, Reeder et al. (2013) reviewed promising studies of smart home technologies and concluded that while prediction of functional decline based on changes in patterns of activity is readily attainable, future research should focus on the interpretation of those findings for improving individualized health care. Kaye et al. (2011) described a system of in-home continuous activity monitoring that has been deployed in the homes of over 250 older adults for an average of nearly three years. In this longitudinal study, activities and behaviors are captured in real time by strategically placed in-home motion sensors, providing ubiquitous and unobtrusive monitoring. At the same time, participants respond to regularly scheduled questionnaires via desktop computer. Continuously assessed metrics include walking speed, total activity, computer use, and time out of home, among others. With continued advances in monitoring capabilities and in the algorithms used to interpret the ensuing data, assessment by self-report of symptomatology, behaviors, and activities may become a useful adjunct rather than the sole means of data collection.
Given this increasing capability to integrate “of the moment” self-report data with objectively sensed data, we were interested in determining the degree of fidelity of self-reports of activity relative to sensor-detected activity in older adults. Further, we hypothesized that there might be limitations to the reliability of recalled events and behaviors, particularly when such reporting is implemented with older adults. To reduce burden and maximize accuracy, we developed an online survey to obtain activity information. In the present study, web-based self-reports of activity were compared with time-stamped, sensor-based activity data.
Method
Participants
All participants had provided written informed consent and were already enrolled in one of two ongoing studies of in-home monitoring: the Oregon Center for Aging and Technology (ORCATECH) Living Laboratory study and the Intelligent Systems for Assessing Aging Change (ISAAC) study. The protocol was approved by the OHSU Institutional Review Board (IRB #2353). Both studies use the same in-home sensor technology and computers to detect early behavioral and cognitive changes that occur with aging. Inclusion criteria were: 60 years and older for the Living Laboratory study and 80 and older for the ISAAC study, living independently, cognitively healthy (Clinical Dementia Rating score < 1; Mini-Mental State Exam score > 24), and of average health for age with no or well-controlled chronic health conditions. The details of the sensor system and its deployment in homes have been described previously (Kaye et al., 2011). All participants from these two cohorts who routinely completed a weekly electronic health form were asked to participate in this study. At the time of this study, that included 136 currently active participants who were regular computer users.
Procedures
At the time of completion of their online health form, participants were asked to complete an additional survey at a single time point (see Figure 1). In the survey, participants were asked to report their activity, location in the home, and time of activity for the two hours immediately preceding completion of the survey. Text boxes in the survey were designed to expand as needed for each entry. Examples were given to illustrate appropriate responses. Sensor-based activity data for the two hours prior to the date and time stamp of the activity survey were examined for each participant for comparison with self-reported activity. Standard clinical and cognitive measures are administered on a yearly basis to all ISAAC participants and were available for this cohort.
Figure 1.
Online Activity Survey
Recognizing that most participants could not be expected to document the time and location of their activities with a level of precision similar to that of motion sensors, we translated sensor data into sequences of room locations rather than relying on exact time stamps. For example, sensor firings in the living room followed by firings in the kitchen and bedroom could be assessed for agreement with self-reported sequences of activity for the same time frame. Determinations of concordance between self-report and sensor data were made by direct comparison of the room locations identified for the two-hour time frame by each method. Comparisons in which the locations identified in the self-report appeared in the same sequence as determined by the sensor data were termed a match; those in which at least one location was consistently identified were considered to be in partial agreement; and those in which there was no overlap between self-report and sensor data were termed “no match.”
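To make the classification rule concrete, the following is a minimal illustrative sketch in Python (not the code actually used in the study) of how a self-reported location sequence and a sensor-derived location sequence for the same two-hour window could be compared under the definitions above. The function name and the decision to collapse consecutive repeated rooms are assumptions made for the purpose of illustration.

```python
def classify_agreement(self_report, sensor_seq):
    """Classify agreement between a self-reported sequence of room locations
    and a sensor-derived sequence covering the same two-hour window.

    Returns "match" when the self-reported rooms appear in the same order
    within the sensor sequence, "partial agreement" when at least one room
    is common to both, and "no match" when there is no overlap.
    """
    def collapse(seq):
        # Collapse consecutive repeats (e.g., many kitchen firings in a row).
        out = []
        for room in seq:
            if not out or out[-1] != room:
                out.append(room)
        return out

    reported = collapse(self_report)
    sensed = collapse(sensor_seq)

    # Match: reported rooms occur as an ordered subsequence of the sensed rooms.
    remaining = iter(sensed)
    if reported and all(room in remaining for room in reported):
        return "match"

    # Partial agreement: at least one room appears in both sequences.
    if set(reported) & set(sensed):
        return "partial agreement"

    return "no match"


# Example: the sensor trace shows living room -> kitchen -> bedroom.
sensed = ["living room", "kitchen", "bedroom"]
print(classify_agreement(["living room", "kitchen"], sensed))  # match
print(classify_agreement(["kitchen", "living room"], sensed))  # partial agreement
print(classify_agreement(["garage", "office"], sensed))        # no match
```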
Statistical Analysis
Demographics, clinical characteristics, and neuropsychological test scores were compared between participants who were and were not able to complete the real-time activity reporting task. Student’s t-test or the Wilcoxon rank-sum test was used as appropriate for continuous variables, and Pearson’s chi-square test or Fisher’s exact test was used as appropriate for categorical variables. All analyses were performed using SAS 9.3 software (SAS Institute, Inc., Cary, NC).
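For readers who wish to see how such group comparisons can be run, the following is a brief illustrative sketch using open-source Python tools (pandas and scipy) rather than the SAS procedures actually used in this study; the data frame and its column names (able, mmse, female) are hypothetical.

```python
# Illustrative only: the analyses reported here were performed in SAS 9.3.
# Hypothetical columns: "able" (bool), "mmse" (continuous), "female" (bool).
import pandas as pd
from scipy import stats

def compare_groups(df: pd.DataFrame) -> dict:
    able = df[df["able"]]
    not_able = df[~df["able"]]
    results = {}

    # Continuous variable: Student's t-test, or the Wilcoxon rank-sum
    # (Mann-Whitney) test as a nonparametric alternative.
    _, p_t = stats.ttest_ind(able["mmse"], not_able["mmse"])
    _, p_w = stats.mannwhitneyu(able["mmse"], not_able["mmse"],
                                alternative="two-sided")
    results["mmse"] = {"t_test_p": p_t, "rank_sum_p": p_w}

    # Categorical variable: Pearson's chi-square test, or Fisher's exact
    # test when expected cell counts are small.
    table = pd.crosstab(df["able"], df["female"])
    chi2, p_chi, _, expected = stats.chi2_contingency(table)
    if (expected < 5).any():
        _, p_cat = stats.fisher_exact(table)
    else:
        p_cat = p_chi
    results["female"] = {"categorical_p": p_cat}

    return results
```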
Results
Of the 136 active participants who completed a weekly health form, 95 completed the activity survey as requested. Of those 95 older adults, nearly one-quarter (n = 22) did not complete the survey adequately, defined as failure to include at least one activity, location, and time that could potentially be compared with time-stamped, sensor-based data. Group differences between those who were and were not able to complete the real-time activity reporting task are presented in Table 1.
Table 1.
Characteristics of older adults who were and were not able to complete the real-time activity reporting task
Variable (Range) | Able (n = 73) | Not able (n = 22) | P value |
---|---|---|---|
Age, yrs (64–99) | 83.7 (7.1) | 85.6 (7.2) | 0.25 |
Gender, % Women | 62 (85%) | 15 (68%) | 0.12 |
Race, % White | 60 (82%) | 16 (73%) | 0.56 |
Education, yrs (10–20) | 15.0 (2.6) | 14.9 (2.4) | 0.82 |
MMSE (23–30) | 29.1 (1.0) | 28.1 (2.0) | 0.04 |
GDS (0–12) | 1.0 (2.1) | 0.8 (1.0) | 0.80 |
FAQ (0–7) | 0.5 (1.3) | 0.7 (1.5) | 0.56 |
CIRS (15–28) | 20.8 (2.6) | 20.7 (2.1) | 0.96 |
Logical Memory Delayed (1–23) | 13.3 (3.9) | 12.3 (4.9) | 0.35 |
Logical Memory Immed. (2–24) | 13.8 (4.0) | 14.0 (4.9) | 0.92 |
Word List Delayed (2–10) | 7.3 (1.7) | 6.1 (2.2) | 0.03 |
Word List Acquisition (11–29) | 21.4 (4.0) | 19.5 (3.8) | 0.06 |
Trail Making Test Part B* (58–300) | 114.6 (51.0) | 125.3 (64.2) | 0.41 |
Animal Fluency (6–39) | 18.3 (5.4) | 16.5 (5.3) | 0.38 |
Note. Values are mean (SD) for continuous variables and n (%) for categorical variables.
*Time to completion in seconds; higher scores reflect worse performance.
MMSE = Mini-Mental State Examination; GDS = Geriatric Depression Scale; FAQ = Functional Assessment Questionnaire; CIRS = Cumulative Illness Rating Scale.
Between-group comparisons showed significantly lower mental status scores in those who did not submit usable activity reports than in those who did. Verbal memory as measured by delayed recall on a word list learning task was also significantly lower for the participants who were unable to successfully complete the survey, while differences in word list acquisition approached but did not reach significance (p = 0.06). The two participants whose most recent Clinical Dementia Rating score was 0.5, indicative of mild cognitive impairment, were both in the “unusable” survey group.
Of the 73 respondents with usable surveys, 49 were residing alone and had usable sensor data (see Figure 2). Of the 49 instances of comparable data sets, roughly one-quarter (12) demonstrated a match between self-report and data based on sensor firings in a particular sequence of locations. Thirteen respondents had no agreement at all between self-reported activity and sensor firings for the same two-hour period (see Figure 3). For example, one respondent reported attending an exercise class outside the home, while in-home sensors were firing for much of the reported episode. Finally, about half of the reports had some overlap with sensor-based activity readings. There were no differences between these three groups (match, partial agreement, no match) on any demographic or cognitive variables.
Figure 2.
Participants in Survey Study
Figure 3.
Example of matching self-report (blue arrows) and sensor data.
Discussion
We have demonstrated the difficulties inherent in obtaining accurate assessments of daily activity from older adults’ self-reports. In a relatively structured daily activity reconstruction format, nearly one-quarter of participants in an online survey failed to record a usable activity entry. Even among the generally cognitively intact older adults of the present study, those who had difficulty completing an online survey had lower scores on objective measures of mental status and verbal memory. While the clinical significance of these small test score differences may be slight, the fact that such differences were statistically significant in a cohort of healthy adults without neurological diagnoses is of note. These individuals were not demented, lived independently, and had responded regularly to online surveys for the previous four years. Continued follow-up of these participants will serve to detect clinically meaningful changes in cognition as they occur. Finally, we found that agreement between self-reported and sensor-based activity was achieved by a minority of participants (24%). One-half of usable survey responses had partial overlap with data based on time-stamped sensor firings. Fully one-quarter of respondents reported activity that had no correspondence to sensor data for the same time interval.
While self-report instruments are a cost-effective means of gathering large amounts of data, and can be adapted for different populations and research applications, limitations of this type of assessment have been recognized and described in multiple research settings. Van Uffelen et al. (2011) reported “poor to fair agreement” between self-reported and objectively measured sedentary activity in older adults. They cited respondents’ difficulty recalling actual activities as opposed to typical or average time spent sitting, and failure to restrict their responses to the time span in question. Further, they noted a tendency to report activities that were given as questionnaire examples, and to omit activities that were not a regular part of their schedule. Similarly, in a comparison of self-reported and objective measures of driving experience, Blanchard et al. (2010) found that older adults made significant errors in both under- and over-estimating miles driven per week, and recommended the use of multiple measures from different sources to maximize the accuracy of data.
Bolstered by previous research documenting the limitations of self-report as a sole source of behavioral data, we suggest that this line of investigation as a whole highlights the need to seek more reliable methods of data collection whenever possible, especially in older populations. We are not the first to call for more cost-effective, scalable, and reproducible measures to describe daily activity and to detect subtle changes in those activities. “Smart home” technologies aimed at prolonging safety and independence in older adults, and at identifying the earliest signs of physical or cognitive decline, have been extensively described and reviewed elsewhere (Alwan, 2009; Brownsell et al., 2011; Mahoney et al., 2007; Reeder et al., 2013). However, few have been deployed in long-term, large-scale studies. Reeder et al. (2013) reported an attempt to define the relationship between self-report and sensor-based measures of activity in a small group of older adults living in an independent retirement community. While they gathered important data on the acceptability of in-home sensors, technical difficulties precluded the intended comparisons.
Physical activity assessed by remote sensing in the home is just one objective metric suitable for use as a marker of incipient decline in older adults. Recently, Kaye et al. (2014) demonstrated that daily, continuous monitoring of computer use can detect change in function over time despite relative stability on more traditional functional status measures. Similarly, subtle changes in medication use (Hayes, Larimer, Adami, & Kaye, 2009), walking speed (Dodge, Mattek, Austin, Hayes, & Kaye, 2012; Silbert, 2012), sleep patterns (Dodge et al., 2012), and overall activity level (Kaye et al., 2011) have been identified by in-home monitoring technologies as potential early signs of incipient cognitive decline. In a related finding, Thielke et al. (2014) reported associations between monitored in-home behaviors and self-reported low mood.
There are current limitations to the approach reported here. The current state of sensor-derived data analysis has yet to successfully address the issue of multiple-person dwellings; thus our comparison of self-reports with sensed data was limited to participants living alone. Future research will benefit from the development of analytic strategies that can reconstruct the activities and movements of multiple persons within the same space. Additionally, the participants recruited for these kinds of studies are generally “early adopters” of technology and may not be representative of the broader older adult population. As aging cohorts gain more experience with computers and in-home technology, findings such as these can be expected to become more generalizable to the older adult population as a whole. In particular, the inclusion of a more diverse group of older adults will be essential in moving this work forward.
We propose that the traditional methods of assessment are not ideally suited to detecting subtle but potentially meaningful changes in behavior. Continuous sensor-based assessment acquires data in a way that minimizes memory deficits and recall biases, by taking place in real time in the home environment rather than being subject to retrospective, episodic clinic-based reporting. While the difficulties inherent in self-report retrospective data collection are not limited to older adults, the prevalence of memory impairment in this cohort further weakens the reliability of such methods. We believe that due to the inherent inaccuracies of self-report data, behavioral research can be substantially advanced by using sensor-based pervasive computing assessment methods that have been shown capable of detecting subtle changes that might signal early cognitive decline. Finally, as applications of in-home technology move from research to the clinical practice setting, reliance on more sensitive and accurate behavioral profiles can serve to improve patient care.
Acknowledgments
We are grateful to all the participants and research staff involved in this project.
Funding
This study was funded by grants from the National Institutes of Health [P30AG024978, R01AG024059, P30AG008017].
References
- Alwan M. Passive in-home health and wellness monitoring: Overview, value and examples. Proceedings of the Annual International Conference of the IEEE EMBS; 2009. pp. 4307–4310.
- Aroian K, Vander Wal J. Measuring elders’ symptoms with daily diaries and retrospective reports. Western Journal of Nursing Research. 2007;29(3):322–337. doi: 10.1177/0193945906293814.
- Banda J, Hutto B, Feeney A, Pfeiffer K, McIver K, Lamonte M, et al. Comparing physical activity measures in a diverse group of midlife and older adults. Medicine and Science in Sports and Exercise. 2010;42(12):2251–2257. doi: 10.1249/MSS.0b013e3181e32e9a.
- Blanchard R, Myers A, Porter M. Correspondence between self-reported and objective measures of driving exposure and patterns in older drivers. Accident Analysis and Prevention. 2010;42:523–529. doi: 10.1016/j.aap.2009.09.018.
- Brownsell S, Bradley D, Blackburn S, Cardnaux F, Hawley M. A systematic review of lifestyle monitoring technologies. Journal of Telemedicine and Telecare. 2011;17:185–189. doi: 10.1258/jtt.2010.100803.
- Cain A, Depp C, Jeste D. Ecological momentary assessment in aging research: A critical review. Journal of Psychiatric Research. 2009;43(11):987–996. doi: 10.1016/j.jpsychires.2009.01.014.
- Concannon LG, Grierson JM, Harrast MA. Exercise in the older adult: From the sedentary elderly to the masters athlete. Physical Medicine and Rehabilitation. 2012;4(11):833–839. doi: 10.1016/j.pmrj.2012.08.007.
- Cotrell V, Wild K, Bader T. Medication management and adherence among cognitively impaired older adults. Journal of Gerontological Social Work. 2006;47(3/4):31–46. doi: 10.1300/J083v47n03_03.
- Dodge H, Mattek N, Austin D, Hayes T, Kaye J. In-home walking speeds and variability trajectories associated with mild cognitive impairment. Neurology. 2012;78(24):1946–1952. doi: 10.1212/WNL.0b013e318259e1de.
- Dyrstad S, Hansen B, Holme I, Anderssen S. Comparison of self-reported versus accelerometer-measured physical activity. Medicine and Science in Sports and Exercise. 2014;46(1):99–106. doi: 10.1249/MSS.0b013e3182a0595f.
- Grimm E, Swarz A, Hart T, Miller N, Strath S. Comparison of the IPAQ-Short Form and accelerometry predictions of physical activity in older adults. Journal of Aging and Physical Activity. 2012;20:64–79. doi: 10.1123/japa.20.1.64.
- Harris T, Owen C, Victor C, Adams R, Ekelund U, Cook D. A comparison of questionnaire, accelerometer, and pedometer: Measures in older people. Medicine and Science in Sports and Exercise. 2009;41(7):1392–1402. doi: 10.1249/MSS.0b013e31819b3533.
- Hayes T, Larimer N, Adami A, Kaye J. Medication adherence in healthy elders: Small cognitive changes make a big difference. Journal of Aging and Health. 2009;21(4):567–580. doi: 10.1177/0898264309332836.
- Hekler E, Buman M, Haskell W, Conway T, Cain K, Sallis J, et al. Reliability and validity of CHAMPS self-reported sedentary-to-vigorous intensity physical activity in older adults. Journal of Physical Activity and Health. 2012;9:229–236. doi: 10.1123/jpah.9.2.225.
- Jacelon C, Imperio K. Participant diaries as a source of data in research with older adults. Qualitative Health Research. 2005;15(7):991–997. doi: 10.1177/1049732305278603.
- Kahneman D, Krueger A, Schkade D, Schwarz N, Stone A. A survey method for characterizing daily life experience: The Day Reconstruction Method. Science. 2004;306:1776–1780. doi: 10.1126/science.1103572.
- Kaye J, Mattek N, Dodge H, Campbell I, Hayes T, Austin D, et al. Unobtrusive measurement of daily computer use to detect mild cognitive impairment. Alzheimer’s and Dementia. 2014;10(1):10–17. doi: 10.1016/j.jalz.2013.01.011.
- Kaye J, Maxwell S, Mattek N, Hayes T, Dodge H, Pavel M, et al. Intelligent systems for assessing aging changes: Home-based, unobtrusive, and continuous assessment of aging. The Journals of Gerontology, Series B: Psychological and Social Sciences. 2011;66B(S1):180–190. doi: 10.1093/geronb/gbq095.
- Kowalski K, Rhodes R, Naylor P, Tuokko H, MacDonald S. Direct and indirect measurement of physical activity in older adults: A systematic review. International Journal of Behavioral Nutrition and Physical Activity. 2012;9:148. doi: 10.1186/1479-5868-9-148.
- Mahoney D, Purtilo RB, Webbe FM, Alwan M, Bharucha AJ, Adlam TD, et al. In-home monitoring of persons with dementia: Ethical guidelines for technology research and development. Alzheimer’s and Dementia. 2007;3:217–226. doi: 10.1016/j.jalz.2007.04.388.
- Milligan C, Bingley A, Gatrell A. Digging deep: Using diary techniques to explore the place of health and well-being amongst older people. Social Science & Medicine. 2005;61:1882–1892. doi: 10.1016/j.socscimed.2005.04.002.
- Neilson H, Robson P, Friedenreich C, Csizmadi I. Estimating activity expenditure: How valid are physical activity questionnaires? American Journal of Clinical Nutrition. 2008;87:279–291. doi: 10.1093/ajcn/87.2.279.
- O’Loughlin G, Cullen S, McGoldrick A, O’Connor S, Blain R, O’Malley S, et al. Using a wearable camera to increase the accuracy of dietary analysis. American Journal of Preventive Medicine. 2013;44(3):297–301. doi: 10.1016/j.amepre.2012.11.007.
- Reeder B, Chung J, Lazar A, Joe J, Demiris G, Thompson H. Testing a theory-based mobility monitoring protocol using in-home sensors. Research in Gerontological Nursing. 2013;6(4):253–263. doi: 10.3928/19404921-20130729-02.
- Reeder B, Meyer E, Lazar A, Chaudhuri S, Thompson H, Demiris G. Framing the evidence for health smart homes and home-based consumer health technologies as a public health intervention for independent aging: A systematic review. International Journal of Medical Informatics. 2013;82:565–579. doi: 10.1016/j.ijmedinf.2013.03.007.
- Saelens B, Sallis J. Assessment of physical activity by self-report: Status, limitations, and future directions. Research Quarterly for Exercise and Sport. 2000;71(2 Suppl):S1–14.
- Shiffman S, Stone A, Hufford M. Ecological Momentary Assessment. Annual Review of Clinical Psychology. 2008;4:1–32. doi: 10.1146/annurev.clinpsy.3.022806.091415.
- Silbert L. In-home continuous monitoring of gait speed: A sensitive method for detecting motor slowing associated with smaller brain volumes and dementia risk. Paper presented at ICAD; 2012.
- Sun F, Norrman IJ, While AE. Physical activity in older people: A systematic review. BMC Public Health. 2013;13:1–17. doi: 10.1186/1471-2458-13-449.
- Thielke S, Mattek N, Hayes T, Dodge H, Quinones A, Austin D, et al. Associations between observed in-home behaviors and self-reported low mood in community-dwelling older adults. Journal of the American Geriatrics Society. 2014;62(4):685–689. doi: 10.1111/jgs.12744.
- Tudor-Locke C, Myers A. Challenges and opportunities for measuring physical activity in sedentary adults. Sports Medicine. 2001;31(2):91–99. doi: 10.2165/00007256-200131020-00002.
- van Uffelen J, Heesch K, Hill R, Brown W. A qualitative study of older adults’ responses to sitting-time questions: Do we get the information we want? BMC Public Health. 2011;11:458. doi: 10.1186/1471-2458-11-458.
- Wallihan D, Stump T, Callahan C. Accuracy of self-reported health services use and patterns of care among urban older adults. Medical Care. 1999;37(7):662–670. doi: 10.1097/00005650-199907000-00006.
- Wild K, Cotrell V. Identifying driving impairment in Alzheimer disease: A comparison of self and observer report versus driving evaluation. Alzheimer Disease and Associated Disorders. 2003;17:27–34. doi: 10.1097/00002093-200301000-00004.
- Wild K, Maxwell S, Campbell I, Hayes T, Kaye J. Validation of self-reported activity data: Application of Day Reconstruction methods to a cohort of monitored elders. Paper presented at the International Neuropsychological Society Annual Meeting; 2011.