Author manuscript; available in PMC: 2020 Jun 10.
Published in final edited form as: J Alzheimers Dis. 2017;60(4):1611–1620. doi: 10.3233/JAD-170444

Age and Graphomotor Decision Making Assessed with the Digital Clock Drawing Test: The Framingham Heart Study

Ryan J Piers a,b, Kathryn N Devlin c, Boting Ning d, Yulin Liu b, Ben Wasserman b, Joseph M Massaro b,d, Melissa Lamar e, Catherine C Price f, Rod Swenson g, Randall Davis h, Dana L Penney i, Rhoda Au b,j,1,*, David J Libon k,1
PMCID: PMC7286350  NIHMSID: NIHMS1587537  PMID: 29036819

Abstract

Background

Digital Clock Drawing Test (dCDT) technology enables the examination of detailed neurocognitive behavior as it unfolds in real time, a capability that cannot be obtained using a traditional pen and paper testing format.

Objective

Parameters obtained from the dCDT were used to investigate constructs related to higher-order decision making and information processing speed. The current research sought to determine the effect of age on combined motor and non-motor components of drawing and on higher-order decision making latencies.

Methods

A large group of stroke- and dementia-free Framingham Heart Study participants were administered the dCDT to command and copy with hands set for “10 after 11”. Six age groups (age range 28–98) were constructed.

Results

Differences between age groups were found for total time to completion, total pen stroke count, and higher-order decision making latencies in both command and copy test conditions.

Conclusion

Longer age-related decision making latencies may reflect a greater need for working memory and increased self-monitoring in older subjects. These latency measures have potential to serve as neurocognitive biomarkers of Alzheimer’s disease and other insidious neurodegenerative disorders.

Keywords: Boston Process Approach, cognition, digital clock drawing test, graphomotor decision making, normal aging

INTRODUCTION

There is a long history regarding the Clock Drawing Test (CDT) and its use for research and clinical assessment [1]. The utility of the CDT derives, in part, from the fact that clock drawing behavior has been shown to assess a host of cognitive abilities including, but not limited to, executive and motor functioning, access to semantic memory, and visuoconstruction and visuospatial skills [2–6]. The CDT is inexpensive, easy to administer, requires little time to complete [7], and is often used in conjunction with other neurocognitive screening tests such as the Mini-Mental State Examination (MMSE) because of its ability to assess complementary underlying neurocognitive deficits [8]. Because of the wide range of neurocognitive constructs required for successful test performance, different patterns of performance have been associated with various neurodegenerative disorders such as mild cognitive impairment (MCI) [9], Alzheimer’s disease (AD) [10], vascular dementia [2, 5, 11], and Parkinson’s disease [12–15].

A major innovation has been the integration of the traditional pen and paper administration of the CDT with digital technology and artificial intelligence. A commercially available digital pen (from Anoto Inc.) works as an ordinary ballpoint pen while measuring its position on the page 75 times/second with a spatial resolution of ±0.002 inch, and software created at the Massachusetts Institute of Technology and the Lahey Hospital and Medical Center analyzes the resulting digital data stream. The software classifies each pen stroke as an element of the clock (e.g., clock face, each number, hands), with any misclassifications easily corrected by the user. The spatial resolution of the pen also enables the drawing to be enlarged by up to 100x, making visually apparent phenomena a fraction of a millimeter in size that are not visible on the paper with the naked eye; these phenomena are detected and measured automatically by the software. Data obtained using the digital Clock Drawing Test (dCDT) [16] are time-stamped, enabling the program to capture and analyze behavior, i.e., the process by which the drawing was made, in addition to the final product.
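To make the nature of this time-stamped data stream concrete, the following minimal sketch (in Python) illustrates one way such data could be organized; it is not the actual dCDT software or its data format. It assumes a hypothetical stream of pen samples of the form (timestamp, x, y, pen_down) recorded at roughly 75 Hz and groups consecutive pen-down samples into strokes, the unit that the dCDT software then classifies into clock elements.

```python
# Illustrative sketch only (not the actual dCDT software or data format).
# Assumes a hypothetical stream of pen samples (timestamp_s, x, y, pen_down)
# sampled at ~75 Hz; consecutive pen-down samples are grouped into strokes.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Sample = Tuple[float, float, float, bool]  # (time in seconds, x, y, pen on paper?)

@dataclass
class Stroke:
    start: float                                   # pen-down time
    end: float                                     # pen-up time
    points: List[Tuple[float, float]] = field(default_factory=list)

def samples_to_strokes(samples: List[Sample]) -> List[Stroke]:
    """Group consecutive pen-down samples into discrete pen strokes."""
    strokes: List[Stroke] = []
    current: Optional[Stroke] = None
    for t, x, y, down in samples:
        if down:
            if current is None:
                current = Stroke(start=t, end=t)
            current.end = t
            current.points.append((x, y))
        elif current is not None:                  # pen lifted: stroke is finished
            strokes.append(current)
            current = None
    if current is not None:
        strokes.append(current)
    return strokes
```

Because every sample carries a timestamp, both the drawn product (the points) and the temporal process (when each stroke began and ended, and the pauses between strokes) are preserved for later analysis.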

A major innovation offered by this technology is the examination of detailed temporal cognitive processes obtained in real time that cannot be captured using the traditional pen and paper testing format. It has been well established that increasing age in adulthood is associated with slower motor and information processing speed [17]. However, preliminary data using the dCDT suggest that clock drawing total time to completion may not be a single or monolithic construct. Rather, clock drawing total time to completion can be decomposed into at least three components: 1) gross drawing behavior, 2) gross non-drawing behavior, and 3) higher-order decision latency measures. Gross drawing behavior is assessed by measuring the total strokes used to complete clock drawing and the percent of time participants devote to drawing (i.e., ‘ink time’). Gross non-drawing behavior is assessed by activity such as the percent of time when the pen is not moving on the paper (i.e., no drawing occurs), presumably devoted to thinking (i.e., ‘think time’) [18, 19]. Finally, higher-order decision latencies measure discrete time intervals or inflection points as participants transition from one portion of their drawing to the next [20, 22]. In this sense, coupling a traditional and popular neuropsychological test with digital technology and dCDT software [16] holds the promise of developing a variety of neurocognitive biomarkers able to detect subtle differences in cognitive performance previously unobtainable using traditional pen and paper assessment techniques.

In prior research, Libon et al. [21] used the dCDT to assess graphomotor decision making in relapsing-remitting multiple sclerosis (MS). When compared to normal controls, patients with MS exhibited longer higher-order decision making latencies, i.e., longer pauses when transitioning from one component of their drawing to the next. Moreover, these longer transitional latencies were related to greater impairment on neuropsychological tests that assessed both executive control and processing speed. On the other hand, patients with MS did not differ from healthy controls in their total proportion of time spent thinking (approximately 60%) versus drawing (approximately 40%). By comparison, in adults with depression, Cohen et al. [19] found that the dCDT was able to differentiate aspects of gross motor/nonmotor slowing, observing that younger depressed participants (age 40.98 ± 5.27) spent a larger proportion of time thinking (68%) relative to drawing compared to the older depressed participants (age 65.41 ± 4.92; 64% drawing) and to younger euthymic participants (64% drawing). A greater percent of time spent thinking was related to greater impairment on neuropsychological tests that assessed processing speed, but not those that assessed executive function. Thus, preliminary data suggest that latencies of gross speed-related behavior and higher-order decision making on the dCDT reflect dissociable neurocognitive constructs.

Our current work focuses on integrating digital technology with traditional cognitive assessment without changing the testing experience for participants, so that the traditional pen and paper experience is retained. In this paper, clock drawing behavior was captured using a digital pen rather than a common ink pen. The purpose of the current research is to use an a priori, knowledge-driven approach (1) to operationally define constructs that assess both motor and non-motor speed-related behavior and higher-order decision making latencies; and (2) to examine the differential effects of age as related to these measures.

MATERIALS AND METHODS

Participants

Established in 1948, the Framingham Heart Study (FHS) recruited 5,209 participants (the Original cohort) for a longitudinal study designed to identify common characteristics contributing to cardiovascular disease. In 1971, the biological children of the Original cohort and their spouses (the Offspring cohort) were recruited for participation [23]. Most recently, in 2001, a third generation of participants (Gen 3), the grandchildren of the Original cohort and children of the Offspring cohort, was recruited for studies of the genetic heritability of cardiovascular and cerebrovascular diseases [24].

In 1994, the FHS began recruitment of men and women between the ages of 40 and 74, consisting of Hispanic, non-Hispanic black, Asian, and Native-American residents of Framingham, Massachusetts and 24 surrounding towns. These participants make up Omni Cohort 1. In order to expand upon Omni Cohort 1 and to represent an ethnically diverse group at least 10% of the size of the Gen 3 cohort, enrollment of a second cohort of Omni participants started in 2003 and ended in July 2005. Omni Cohort 2 included many individuals related to the participants of Omni Cohort 1 as well as individuals unrelated to Omni Cohort 1 members.

Between October 2011 and July 2014, 1,906 participants from the FHS were invited for neuropsychological testing as part of a study on brain aging in which digital clock drawing data were also collected. Using standard test administration instructions, participants were asked to draw the clock to command and copy with hands set for “10 after 11”. Additional neuropsychological tests were administered at these visits but were not analyzed for the purposes of this study.

Of the 1,906 participants who agreed to participate in the dCDT, 1,791 non-demented participants were included in the present analyses (age = 62.01 ± 13.82; MMSE = 28.94 ± 1.36; 51.62% college degree; 46.68% women; 92.29% Caucasian). Participants were excluded based on the presence of probable dementia (n = 9), clinical stroke (n = 59), or other neurologic conditions (n = 12). An additional 35 protocols (1.95% of the total subject pool) were excluded because of digital pen malfunction.

The Institutional Review Board (IRB) at Boston University Medical Center (BUMC) approved the study protocol. Informed consent was obtained from all participants.

dCDT parameters

Summary parameter

  • Total time to completion. Total time needed to complete the clock drawing in both the command and copy test conditions.

Gross drawing parameters

  • Total strokes. Total strokes or graphomotor pen marks used to complete clock drawing.

  • Percent ‘ink time’. The total time the pen is in contact with the paper for each test condition, i.e., the command and copy clock. Percent ‘ink time’ was computed as (total ‘ink time’/total time to completion) × 100, indicating the percent of total time to completion devoted to drawing. This variable provides a means to operationally define motor execution, i.e., drawing, while taking into account individual differences in total drawing time (see the illustrative sketch following these definitions).

Gross non-drawing parameter

  • Percent ‘think time’. The total time the pen is not in contact with the paper, measured from the completion of the first pen stroke to the beginning of the last pen stroke. By definition, percent ‘think time’ = 1 – percent ‘ink time’. Percent ‘think time’ provides a means to operationally define non-drawing cognitive activity.
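As a concrete illustration of these two operational definitions, the following minimal sketch (in Python, not the actual dCDT code) computes the proportions of drawing and non-drawing time from a list of hypothetical strokes, each represented simply by its pen-down and pen-up times in seconds; ‘think’ time is treated as the complement of ‘ink’ time, as defined above.

```python
# Minimal sketch of the percent 'ink'/'think' computation, assuming each stroke
# is a hypothetical (pen_down_time, pen_up_time) pair in seconds; not the
# actual dCDT implementation.
from typing import List, Tuple

def ink_think_proportions(strokes: List[Tuple[float, float]]) -> Tuple[float, float]:
    """Return (proportion ink time, proportion think time) of total time to completion."""
    total_time = strokes[-1][1] - strokes[0][0]        # first pen-down to last pen-up
    ink_time = sum(up - down for down, up in strokes)  # time the pen touches the paper
    pct_ink = ink_time / total_time
    return pct_ink, 1.0 - pct_ink                      # 'think' time is the complement

# Example: three strokes spanning 10 s in total, 4 s of which are spent drawing
print(ink_think_proportions([(0.0, 1.5), (3.0, 4.5), (9.0, 10.0)]))  # (0.4, 0.6)
```

Multiplying these proportions by 100 yields the percentages described in the text.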

Higher-order dCDT decision making latencies

Three intra-component, higher-order decision making latencies were analyzed as defined by Libon et al. [21]. These latencies were identified because they are clearly associated with major transition points in executing the drawing, e.g., the transition from drawing the clock face to whatever is drawn next (most often the clock numbers) and the transitions preceding each of the clock hands.

  • Post-clock face circle latency. Time between drawing of clock face and whatever is drawn next.

  • Pre-1st hand latency. Time between the end of the pen stroke for whatever was drawn before the first clock hand and the beginning of the pen stroke starting to draw that clock hand.

  • Pre-2nd hand latency. Time between the end of the last stroke for whatever was drawn before the second clock hand and the beginning of the pen stroke used to draw that clock hand.

Pre-clock hand center dot and post-clock hand center dot latencies were not included for several reasons. First, many participants do not draw center dots as part of their clock drawings; therefore, missing data could pose a statistical problem. Moreover, the presence or absence of a center dot is not integral to the production of what otherwise might be judged to be an intact drawing. Second, center dot latencies were not analyzed in the current research because of a desire to control for the number of statistical tests.
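As an illustration of how the three intra-component latencies defined above could be extracted once strokes have been classified, the following is a minimal sketch in Python. It is not the dCDT implementation; the ClassifiedStroke structure and element labels are hypothetical, and the sketch assumes the clock face is drawn first and is not the last element drawn.

```python
# Minimal sketch of the three intra-component latencies, assuming strokes have
# already been classified into clock elements; the ClassifiedStroke structure
# and element labels are hypothetical, not the actual dCDT representation.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ClassifiedStroke:
    start: float    # pen-down time (s)
    end: float      # pen-up time (s)
    element: str    # e.g., 'face', 'number_12', 'hand_1', 'hand_2'

def intra_component_latencies(strokes: List[ClassifiedStroke]) -> Dict[str, float]:
    strokes = sorted(strokes, key=lambda s: s.start)

    def gap_before(i: int) -> float:
        # Pause between the end of the previous stroke and the start of stroke i
        return strokes[i].start - strokes[i - 1].end

    # Post-clock face latency: end of the last clock-face stroke to whatever is drawn next
    last_face = max(i for i, s in enumerate(strokes) if s.element == "face")

    # Pre-1st and pre-2nd hand latencies: pauses preceding the first stroke of the
    # first hand drawn and of the other hand drawn second
    hand_idx = [i for i, s in enumerate(strokes) if s.element.startswith("hand")]
    first_hand = hand_idx[0]
    second_hand = next(i for i in hand_idx
                       if strokes[i].element != strokes[first_hand].element)

    return {
        "post_clock_face_latency": gap_before(last_face + 1),
        "pre_1st_hand_latency": gap_before(first_hand),
        "pre_2nd_hand_latency": gap_before(second_hand),
    }
```

Each latency is simply the pause between the end of one classified stroke and the start of the next relevant stroke, which is why the timestamped, classified data stream is needed to compute them.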

Statistical analysis

To limit the number of analyses and to guard against statistical errors, dCDT variables were first analyzed in a multivariate analysis of variance (MANOVA) with subsequent univariate analyses of variance (ANOVA) if the MANOVA yielded a significant result. Six age groups were constructed: 20s & 30s, 40s, 50s, 60s, 70s, and 80s+ years of age. For all MANOVA analyses, the independent variable was age group. Post hoc pairwise comparisons of independent variables with more than two levels were conducted using the Bonferroni correction following significant results in the univariate analyses. Analyses were performed using the SPSS statistical package (v. 23, Chicago, IL). Significance was set at p < 0.05.
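The analyses reported here were run in SPSS. As a rough illustration of the same pipeline (omnibus MANOVA, follow-up univariate ANOVAs covarying education, and Bonferroni-corrected pairwise comparisons), the sketch below uses Python with statsmodels and scipy instead; the data frame and column names (age_group, education_years, and the dCDT variables) are hypothetical placeholders, not the study's actual dataset.

```python
# Rough sketch of the analysis pipeline (MANOVA -> univariate ANOVAs covarying
# education -> Bonferroni-corrected pairwise comparisons) using Python instead
# of SPSS; column names are hypothetical placeholders.
from itertools import combinations
from typing import List

import pandas as pd
from scipy.stats import ttest_ind
from statsmodels.formula.api import ols
from statsmodels.multivariate.manova import MANOVA
from statsmodels.stats.anova import anova_lm
from statsmodels.stats.multitest import multipletests

def analyze_dcdt(df: pd.DataFrame, dvs: List[str]) -> None:
    # 1) Omnibus MANOVA with age group as the independent variable
    manova = MANOVA.from_formula(" + ".join(dvs) + " ~ C(age_group)", data=df)
    print(manova.mv_test())

    # 2) Follow-up univariate ANOVAs for each dCDT variable, covarying education
    for dv in dvs:
        model = ols(f"{dv} ~ C(age_group) + education_years", data=df).fit()
        print(dv)
        print(anova_lm(model, typ=2))

    # 3) Post hoc pairwise age-group comparisons with Bonferroni correction
    groups = sorted(df["age_group"].unique())
    for dv in dvs:
        pairs = list(combinations(groups, 2))
        pvals = [ttest_ind(df.loc[df.age_group == a, dv],
                           df.loc[df.age_group == b, dv]).pvalue
                 for a, b in pairs]
        reject, p_adj, _, _ = multipletests(pvals, method="bonferroni")
        for (a, b), p, sig in zip(pairs, p_adj, reject):
            print(f"{dv}: {a} vs {b}: p_bonferroni = {p:.3f}{' *' if sig else ''}")

# Example call, assuming df contains the command-condition summary variables:
# analyze_dcdt(df, ["cmd_total_time", "cmd_total_strokes"])
```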

RESULTS

Demographic variables

All demographic information can be found in Table 1. Age ranged between 28 and 98 years. The 80s+ group included 6 participants aged 90 and older. Analysis of variance (ANOVA) found differences in education across age groups (F [5, 1788] = 11.31, p < 0.001). Education was covaried in all subsequent analyses.

Table 1.

Demographics of sample

Group (n = 1,791)
 20 & 30s 111 (6.20%)
 40s 294 (16.42%)
 50s 336 (18.76%)
 60s 483 (26.97%)
 70s 405 (22.61%)
 80s 162 (9.05%)
Age (mean±SD) 62.01±13.82
MMSE (mean±SD) 28.94±1.36
Education
 No high school degree 38 (2.13%)
 High school degree only 280 (15.66%)
 Some college 547 (30.59%)
 College degree 923 (51.62%)
Gender
 Male 955 (53.32%)
 Female 836 (46.68%)
Race
 Native American 1 (0.06%)
 Asian 65 (3.63%)
 Black/African American 42 (2.35%)
 White/Caucasian 1653 (92.29%)
 Unknown 30 (1.68%)

Summary, gross drawing, and gross non-drawing parameters

Total time to completion and total strokes

Command and copy total time to completion and total strokes were assessed with a single multivariate analysis of variance (MANOVA; Table 2). The multivariate effect for age group was significant (F [20, 3899.8] = 8.13; p < 0.001; partial eta squared = 0.040). Univariate ANOVAs found significant effects for age for both command and copy total time to completion (p < 0.001; both analyses), and command and copy total strokes (p < 0.024; both analyses).

Table 2.

Total time to completion and total strokes (means and standard deviations)

Age group | Command total time to completion | Copy total time to completion | Command total strokes | Copy total strokes
20 & 30s | 38.49 (16.73) | 28.33 (10.97) | 25.46 (6.46) | 23.93 (2.23)
40s | 34.75 (10.74) | 27.56 (9.23) | 24.65 (4.64) | 23.63 (2.61)
50s | 38.05 (15.34) | 28.22 (8.15) | 25.38 (5.59) | 24.16 (2.76)
60s | 40.03 (23.89) | 30.00 (12.54) | 26.00 (8.51) | 24.55 (3.11)
70s | 43.19 (21.80) | 31.80 (11.99) | 25.92 (6.23) | 24.79 (3.50)
80s+ | 46.28 (19.93) | 37.71 (17.96) | 26.67 (5.37) | 25.80 (4.37)
Statistics | F[5, 1776] = 10.03, p < 0.001 | F[5, 1779] = 19.23, p < 0.001 | F[5, 1776] = 2.60, p < 0.024 | F[5, 1779] = 11.46, p < 0.001

Post hoc Bonferroni analyses for command total time to completion revealed longer total time to completion for the 80s+ as compared to the 60s, 50s, 40s, and 20 & 30s (p < 0.019, all analyses); the 70s compared to the 50s and 40s (p < 0.006); and the 60s compared to the 40s (p < 0.004). In the copy condition, total time to completion was longer for the 80s+ compared to all other groups (p < 0.001), and for the 70s compared to the 50s and 40s (p < 0.001). For the command test condition, the 80s+ used more pen strokes compared to the 40s (p < 0.031). In the copy condition, the 80s+ used more pen strokes compared to all groups (p < 0.009); the 70s used more pen strokes compared to the 40s (p < 0.001); and the 60s used more strokes than the 40s (p < 0.002).

Percent ‘ink’ and percent ‘think’ time

Motor and non-motor behavior was also assessed for between- and within-group differences by comparing ‘ink’ versus ‘think’ time (Table 3). These analyses showed that for all age groups ‘think time’ accounted for approximately 55–60 percent of total time to completion, whereas ‘ink time’ accounted for approximately 40–45 percent. When analyzed between groups, there were no differences regarding ‘think’ and ‘ink’ time. However, consistent with prior research [21], within-age group comparisons revealed greater ‘think’ time compared to ‘ink’ time (p < 0.001, all analyses).

Table 3.

Percent drawing (‘ink’) and percent non-drawing (‘think’) time (means and standard deviations)

Age group | Command percent drawing (‘ink’) time | Command percent non-drawing (‘think’) time | Copy percent drawing (‘ink’) time | Copy percent non-drawing (‘think’) time
20 & 30s | 0.38 (0.087) | 0.62 (0.087) | 0.45 (0.066) | 0.55 (0.066)
40s | 0.40 (0.087) | 0.60 (0.087) | 0.45 (0.070) | 0.55 (0.070)
50s | 0.38 (0.090) | 0.62 (0.090) | 0.45 (0.069) | 0.55 (0.069)
60s | 0.40 (0.084) | 0.60 (0.084) | 0.46 (0.076) | 0.54 (0.076)
70s | 0.38 (0.091) | 0.62 (0.091) | 0.46 (0.070) | 0.54 (0.070)
80s+ | 0.40 (0.090) | 0.60 (0.090) | 0.45 (0.083) | 0.55 (0.083)
Total mean (SD) | 0.40 (0.088) | 0.60 (0.088) | 0.46 (0.073) | 0.54 (0.073)

Higher-order decision making latency

Intra-component latencies

In the command test condition, the multivariate effect of age across all three intra-component latencies was significant (Hotelling’s Trace; F [15.00, 3292.8] = 3.49; p < 0.001; partial eta squared = 0.016). Univariate ANOVAs were significant for pre-1st hand latency (p < 0.019) and pre-2nd hand latency (p < 0.001; Table 4). The ANOVA for post-clock face latency was not significant. Post hoc Bonferroni analyses for pre-1st hand latency found no significant differences between age groups. Post hoc Bonferroni analyses for pre-2nd hand latency found longer latency for the 80s+ compared to the 60s, 50s, and 40s (p < 0.029, all analyses), and for the 70s compared to the 50s and 40s (p < 0.039, both analyses).

Table 4.

Command and copy intra-component latency (means and standard deviations)

Age group | Command post-clock face latency | Command pre-1st hand latency | Command pre-2nd hand latency | Sum command intra-component latency
20 & 30s | 1.66 (1.22) | 5.09 (5.72) | 1.77 (1.18) | 8.50 (6.10)
40s | 1.71 (1.92) | 3.86 (3.08) | 1.62 (1.14) | 7.22 (3.86)
50s | 1.63 (1.85) | 4.09 (4.98) | 1.70 (1.36) | 7.45 (5.71)
60s | 1.48 (1.71) | 4.03 (3.28) | 1.87 (2.40) | 7.36 (4.64)
70s | 1.76 (1.85) | 4.38 (4.05) | 2.38 (4.86) | 8.47 (7.02)
80s+ | 1.69 (1.22) | 4.93 (4.96) | 2.75 (4.48) | 9.42 (8.04)
Statistics | ns | F[5, 1765] = 2.71, p < 0.019 | F[5, 1760] = 4.72, p < 0.001 | F[5, 1745] = 4.94, p < 0.001

Age group | Copy post-clock face latency | Copy pre-1st hand latency | Copy pre-2nd hand latency | Sum copy intra-component latency
20 & 30s | 1.40 (0.83) | 1.25 (0.72) | 1.22 (1.00) | 3.88 (1.57)
40s | 1.34 (0.73) | 1.25 (0.80) | 1.00 (0.57) | 3.61 (1.39)
50s | 1.43 (0.77) | 1.32 (0.84) | 1.12 (0.73) | 3.89 (1.57)
60s | 1.44 (1.07) | 1.38 (1.07) | 1.09 (0.81) | 3.93 (2.15)
70s | 1.55 (1.01) | 1.44 (1.26) | 1.15 (0.83) | 4.15 (2.14)
80s+ | 1.87 (1.37) | 2.24 (6.25) | 1.61 (3.34) | 5.71 (8.18)
Statistics | F[5, 1770] = 6.86, p < 0.001 | F[5, 1774] = 5.27, p < 0.001 | F[5, 1773] = 5.34, p < 0.001 | F[5, 1768] = 10.76, p < 0.001

In the copy test condition, the multivariate effect of age for the three copy intra-component latencies was significant (Hotelling’s Trace; F [15.00, 3336.2] = 5.21; p < 0.001; partial eta squared = 0.023). In subsequent univariate analyses, ANOVAs were significant for post-clock face latency (p < 0.001), pre-1st hand latency (p < 0.001), and pre-2nd hand latency (p < 0.001). Post hoc Bonferroni analyses for post-clock face latency found longer latency for the 80s+ compared to all other groups (p < 0.007, all analyses). Post hoc Bonferroni analyses for pre-1st hand latency found longer latency for the 80s+ compared to all other groups (p < 0.003, all analyses). Post hoc Bonferroni analyses for pre-2nd hand latency found longer latency for the 80s+ compared to the 70s, 60s, 50s, and 40s (p < 0.001, all analyses).

Total intra-component latency

The three intra-component latencies were summed to create single command and copy intra-component measures. Both variables were analyzed with MANOVA, where the overall effect of age was significant (Hotelling’s Trace; F [10.00, 2608.8] = 8.15; p < 0.001; partial eta squared = 0.03). Univariate ANOVAs found a significant effect of age in the command (F [5, 1745] = 4.94; p < 0.001; partial eta squared = 0.014) and copy conditions (F [5, 1768] = 10.76, p < 0.001, partial eta squared = 0.029). Post hoc analyses in the command test condition found longer total intra-component latency for the 80s+ compared to the 60s, 50s, and 40s (p < 0.008; all analyses). Post hoc analyses in the copy test condition found longer total intra-component latency for the 80s+ compared to all groups (p < 0.001, all analyses).

DISCUSSION

In the current research, we found significant effects of age for dCDT total time to completion, pen strokes, and higher-order decision making latencies. The effect of age on total time to completion and pen strokes is, perhaps, not unexpected. However, the ability to measure, in real time, higher-order decision making latencies constitutes one of the advantages of combining digital technology with traditional neuropsychological assessment. Prior clinical research found differences regarding higher-order decision making latencies [18, 21]. The current research extends these findings by showing that this behavior is also present within the realm of normal aging.

We believe that these intra-component decision latencies likely reflect the recruitment of specific neurocognitive abilities to bring the task to fruition. That is, when transitioning from one portion of their drawing to the next (e.g., post-clock face latency), a participant may be utilizing more and/or different neurocognitive resources beyond simple processing speed. Thus, the current research, combined with our previous reports [18–21], continues to demonstrate that digital technology can collect subtle and potentially clinically significant information beyond what can be obtained using traditional assessment methods. An issue to be addressed in future research is whether intra-component latencies do, indeed, provide greater information than total time to completion alone. The added value of obtaining intra-component latencies over and above total time to completion might be supported to the extent that these variables are associated with different underlying neurocognitive networks on MRI and/or different patterns of performance as related to fluid biomarkers for MCI and dementia.

Consistent with the oft-observed age-related decline in information processing speed, older participants required more total time to complete their drawings in both the command and copy test conditions. These results also reinforce previous findings from clinical samples [18, 21] in which the majority of drawing time was spent not drawing (and presumably ‘thinking’) rather than actually drawing or ‘inking’. Libon et al. [21] and Cohen et al. [19] described this behavior in patients with MS and normal controls, and in younger and older patients with depression. The reasons underlying this behavior require further research. The semantic knowledge and attributes associated with many components of the clock are likely overlearned and automatized in this healthy community-dwelling sample; therefore, the high proportion of non-drawing time likely reflects the higher-order mental planning necessary to bring the test to fruition. However, non-drawing time is not distributed evenly across the drawing. Thus, while certain individual components required to draw a clock may, indeed, tap into overlearned knowledge, the act of combining these component parts into the requested gestalt may not be as overlearned or automatized as presumed. Further, the higher-order decision making metrics described above may be able to distinguish components of the clock drawing test that tap into specific neuropsychological abilities related, perhaps, to attention and/or executive control. The construct of constructional apraxia as described by Kleist suggests that executive control underlies the ability to successfully combine components into a required or requested whole [25]. Additional research assessing how digital clock drawing behavior is related to other neuropsychological abilities may provide new insight regarding visuoconstruction in normal aging as well as in patients presenting with clinical disabilities.

The observation that the majority of the time necessary to produce a clock is spent not drawing is interesting as we turn to the age-related differences regarding intra-component latencies. Indeed, in clinical practice it is often the case that patients might struggle with time setting and/or how best to draw the numbers inside the clock face. However, the current research, obtained from a group of non-demented community dwelling participants, suggests that there are, in fact, several decision making inflection points where participants may pause before proceeding with their response; and that these inflection points, more so than planning time as a whole, are particularly susceptible to age-related changes. Thus, the process by which participants complete their drawings [26, 27] suggests that clock drawing is best viewed as a test composed of a variety of component parts.

Interestingly, greater age-related differences in intra-component latencies were found primarily in the copy rather than the command test condition. The reasons for this observation are not entirely clear at present. The relative paucity of age-related differences in intra-component latencies in the command test condition suggests that participants have access to the necessary neurocognitive strategies to combine clock components into the requested gestalt. However, age-related intra-component latency differences in the copy test condition suggest that older participants may be spending more time scanning from their drawing to the model in order to self-monitor their behavior. If this is true, then total intra-component latency might be related to working memory skills. Libon et al. [21] described a negative association between longer intra-component latency and poorer performance on executive tests in patients with MS. In any case, the demonstration of greater age-related effects on intra-component latency in the copy, rather than the command, test condition in this sample underscores the fact that the two clock drawing conditions measure complementary but different neuropsychological skills [4]. This has consistently been demonstrated in the analysis of clock drawings produced by patients with mild cognitive impairment and dementia [2, 5, 9, 11, 14].

There is now considerable interest in the CDT as a measure that can screen for the early emergence of neurodegenerative illness. However, the results of this research have been variable, likely because manual scoring systems are imprecise. For example, Ehreke et al. [28] studied a large sample of MCI and non-MCI patients in which clock drawing was scored using a variety of scoring systems. Although between-group differences were obtained, both receiver operating characteristic curves and inter-rater reliability tended to be modest. By contrast, Kato et al. [29] undertook a cross-sectional examination of patients with AD, MCI, and healthy controls and found that a combination of reduced performance on the CDT and the MMSE was best in differentiating both MCI and AD patients from controls. In a retrospective, longitudinal analysis of patients with MCI who converted to dementia, Nesset et al. [30] examined performance on the MMSE, CDT, and Neurobehavioral Cognitive Status Examination (Cognistat). These researchers found that, among these three tests, the combination of impaired CDT and Cognistat test performance was best in predicting conversion to dementia.

Given that issues regarding inter-rater reliability and precision of measurement are obviated by the digital capture of behavior and analysis via the dCDT software, it is possible that the latency measures used in the current research could serve as objective neurocognitive biomarkers with the potential to identify emerging changes in neurocognitive functioning before the appearance of the more pronounced changes that traditional paper and pencil tests can detect. One example is a cross-sectional study by Souillard-Mandar et al. [31], which showed that a machine-learning based classifier applied to dCDT data significantly outperformed existing clock scoring algorithms, even when those algorithms had been operationalized (eliminating interrater reliability issues) and optimized for performance, when used as a cognitive screener for diverse neurological conditions. Further cross-sectional as well as longitudinal research examining dCDT performance in patients with progressive cognitive disorders may help address this question.

There has been considerable interest in using fluid biomarkers to identify the emergence of MCI and/or dementia subtype. Henriksen et al. [32] have pointed out that treatment of AD has suffered because fluid biomarkers are not easily obtained. Moreover, until recently there has been a decided preference for biomarkers obtained from cerebrospinal fluid or PET imaging, which are not easily obtained in general clinical practice. Palmqvist et al. [33] found that, in predicting eventual conversion from MCI to dementia, the combination of clock drawing and MMSE test performance was as effective as CSF biomarkers. In sum, the findings described above suggest that coupling neuropsychological assessment with digital technology and sophisticated software could identify neurocognitive indices that serve as surrogates for expensive and clinically impractical CSF and PET imaging biomarkers.

The current research has a number of significant strengths, including a large, well-characterized community-based sample. An additional strength is the granularity, precision, and objectivity with which the data were collected. However, several limitations must be acknowledged, including modest effect sizes and the demographic homogeneity of the sample. Future studies with more ethnically and educationally diverse populations can test a priori hypotheses related to specific latency measures. Also, the dCDT parameters described in the current research need to be obtained from a wide range of clinical samples to assess criterion validity and to better understand how these data may relate to emerging and/or well-established neurodegenerative illness. Furthermore, while our sample included only non-demented participants, given the age and size of the sample, we cannot be sure that participants with mild cognitive impairment were not included. Finally, longitudinal studies are needed to determine the predictive efficacy of these potential digital cognitive markers. Despite these limitations, parameters obtained in the current research may significantly increase the sensitivity of cognitive tools to identify persons at risk for developing neurocognitive decline well before reaching the clinical symptom threshold.

ACKNOWLEDGMENTS

This work was supported by the Framingham Heart Study’s National Heart, Lung, and Blood Institute contract (N01-HC-25195), by grants (R01-AG016495, R01-AG008122, R01-AG033040) from the National Institute on Aging, by grant (R01-NS017950) from the National Institute of Neurological Disorders and Stroke, by grant IIS-1404494 from the National Science Foundation, and by the REW Research and Education Institution. The authors thank the extraordinary participants and families of the Framingham Heart Study who made this work possible. We also acknowledge the great work of all the research assistants and study staff.

All data were collected from participants who provided written consent under a protocol approved by the Boston University Institutional Review Board.

Footnotes

Authors’ disclosures available online (http://j-alz.com/manuscript-disclosures/17-0444r1).

DEDICATION

This research is dedicated to the memory of Edith Kaplan, PhD, who more than any other neuroscientist, understood the power of the Clock Drawing Test to elucidate brain-behavior relations in neuropsychiatric illness.

REFERENCES

  • [1] Libon DJ, Kaplan E, Swenson R, Penney DL (2010) Clock drawing. In The Encyclopedia of Clinical Neuropsychology, Kreutzer J, DeLuca J, Caplan B, eds. Springer, New York.
  • [2] Cosentino S, Jefferson AJ, Chute DL, Kaplan E, Libon DL (2004) Clock drawing errors in dementia: Neuropsychological and neuroanatomic considerations. Cogn Behav Neurol 17, 74–83.
  • [3] Shulman KI (2000) Clock-drawing: Is it the ideal cognitive screening test? Int J Geriatr Psychiatry 15, 548–561.
  • [4] Freedman M, Leach L, Kaplan E, Winocur G, Shulman KI, Delis D (1994) Clock drawing: A neuropsychological analysis. Oxford University Press, New York.
  • [5] Libon DJ, Malamut BL, Swenson R, Sands LP, Cloud BS (1996) Further analysis of clock drawings among demented and nondemented older subjects. Arch Clin Neuropsychol 11, 193–205.
  • [6] Royall DR, Cordes JA, Polk M (1998) CLOX: An executive clock drawing task. J Neurol Neurosurg Psychiatry 64, 588–594.
  • [7] Shulman K, Shedletsky R, Silver I (1986) The challenge of time: Clock-drawing and cognitive function in the elderly. Int J Geriatr Psychiatry 1, 135–140.
  • [8] Brodaty H, Moore CM (1997) The Clock Drawing Test for dementia of the Alzheimer’s type: A comparison of three scoring methods in a memory disorders clinic. Int J Geriatr Psychiatry 12, 619–627.
  • [9] Ahmed S, Brennan L, Eppig J, Price CC, Lamar M, Delano-Wood L, Bangen KJ, Edmonds EC, Clark L, Nation DA, Jak A, Au R, Swenson R, Bondi MW, Libon DJ (2016) Visuoconstructional impairment in subtypes of mild cognitive impairment. Appl Neuropsychol Adult 23, 43–52.
  • [10] Jorgensen K, Kristensen MK, Waldemar G, Vogel A (2015) The six-item Clock Drawing Test–reliability and validity in mild Alzheimer’s disease. Neuropsychol Dev Cogn B Aging Neuropsychol Cogn 22, 301–311.
  • [11] Libon DJ, Swenson R, Barnoski E, Sands LP (1993) Clock drawing as an assessment tool for dementia. Arch Clin Neuropsychol 8, 405–416.
  • [12] Seichepine DR, Neargarder S, Davidsdottir S, Reynolds GO, Cronin-Golomb A (2015) Side and type of initial motor symptom influences visuospatial functioning in Parkinson’s disease. J Parkinsons Dis 5, 75–83.
  • [13] Sallam K, Amr M (2013) The use of the Mini-Mental State Examination and the Clock-Drawing Test for dementia in a tertiary hospital. J Clin Diagn Res 7, 484–488.
  • [14] Price CC, Cunningham H, Coronado N, Freedland A, Cosentino S, Penney DL, Penisi A, Bowers D, Okun MS, Libon DJ (2011) Clock drawing in the Montreal Cognitive Assessment: Recommendations for dementia assessment. Dement Geriatr Cogn Disord 31, 179–187.
  • [15] De Pandis MF, Galli M, Vimercati S, Cimolin V, De Angelis MV, Albertini G (2010) A new approach for the quantitative evaluation of the Clock Drawing Test: Preliminary results on subjects with Parkinson’s disease. Neurol Res Int 2010, 283890.
  • [16] Davis R, Penney DL (2014) U.S. Patent No. 8,740,819. U.S. Patent and Trademark Office, Washington, DC.
  • [17] Salthouse TA (1996) The processing-speed theory of adult age differences in cognition. Psychol Rev 103, 403–428.
  • [18] Davis R, Libon D, Au R, Pitman D, Penney DL (2014) THink: Inferring cognitive status from subtle behaviors. Proceedings of the Conference on Innovative Applications of Artificial Intelligence, Québec City, Québec, Canada, pp. 2898–2905.
  • [19] Cohen J, Penney DL, Davis R, Libon DJ, Swenson RA, Ajilore O, Kumar A, Lamar M (2014) Digital Clock Drawing: Differentiating “thinking” versus “doing” in younger and older adults with depression. J Int Neuropsychol Soc 20, 920–928.
  • [20] Libon DJ, Penney DL, Lamar M, Price CC, Swenson R, Eppig J, Nieves C, Garrett KD, Davis R (2011) Assessing constructional decision making and response latencies to differentiate patients with dementia and mild cognitive impairment (MCI) using new digital technology. In Neuropsychology and Vascular Cognitive Impairment: Combining Neuropsychological Assessment, Digital Technology, and Brain Imaging to Differentiate Between Dementia, Depression, and Mild Cognitive Impairment, Libon DJ, ed. Symposium presented at the 5th Biannual Meeting of The International Society of Vascular Behavioural and Cognitive Disorders (VAS-COG), Lille, France.
  • [21] Libon DJ, Penney DL, Davis R, Tabby DS, Eppig J, Nieves C, Bloch A, Donohue JB, Brennan L, Rife KL, Wicas G, Lamar M, Price CC, Au R, Swenson R, Garrett KD, on behalf of the Clock Sketch Consortium (2014) Deficits in processing speed and decision making in relapsing-remitting multiple sclerosis: The Digit Clock Drawing Test (dCDT). J Mult Scler 1, 1–8.
  • [22] Penney DL, Libon DJ, Au R, Lamar M, Price CC, Swenson R, Macaulay C, Garrett KD, Devine S, Delano-Wood L, Scala S, Flanagan A, Davis R (2014) Working harder but producing less: The Digital Clock Drawing Test (dCDT) differentiates amnestic mild cognitive impairment and Alzheimer’s disease. Abstract presented at the 42nd annual meeting of the International Neuropsychological Society, Seattle, Washington.
  • [23] Kannel WB, McGee DL (1979) Diabetes and cardiovascular risk factors: The Framingham Study. Circulation 59, 8–13.
  • [24] Splansky GL, Corey D, Yang Q, Atwood LD, Cupples LA, Benjamin EJ, D’Agostino RB Sr, Fox CS, Larson MG, Murabito JM, O’Donnell CJ, Vasan RS, Wolf PA, Levy D (2007) The third generation cohort of the National Heart, Lung, and Blood Institute’s Framingham Heart Study: Design, recruitment, and initial examination. Am J Epidemiol 165, 1328–1335.
  • [25] Benton A, Tranel D (1993) Visuoperceptual, visuospatial, and visuoconstructional disorders. In Clinical Neuropsychology (3rd edition), Heilman KM, Valenstein E, eds. Oxford University Press, New York.
  • [26] Kaplan E (1988) A process approach to neuropsychological assessment. In Clinical Neuropsychology and Brain Function: Research, Measurement, and Practice, Boll T, Bryant BK, eds. American Psychological Association, Washington, DC.
  • [27] Kaplan E (1990) The process approach to neuropsychological assessment of psychiatric patients. J Neuropsychiatry Clin Neurosci 2, 72–87.
  • [28] Ehreke L, Luck T, Luppa M, König HH, Villringer A, Riedel-Heller SG (2011) Clock drawing test - screening utility for mild cognitive impairment according to different scoring systems: Results of the Leipzig Longitudinal Study of the Aged (LEILA 75+). Int Psychogeriatr 23, 1592–1601.
  • [29] Kato Y, Narumoto J, Matsuoka T, Okamura A, Koumi H, Kishikawa Y, Terashima S, Fukui K (2013) Diagnostic performance of a combination of Mini-Mental State Examination and Clock Drawing Test in detecting Alzheimer’s disease. Neuropsychiatr Dis Treat 9, 581–586.
  • [30] Nesset M, Kersten H, Ulstein ID (2014) Brief tests such as the Clock Drawing Test or Cognistat can be useful predictors of conversion from MCI to dementia in the clinical assessment of outpatients. Dement Geriatr Cogn Dis Extra 4, 263–270.
  • [31] Souillard-Mandar W, Davis R, Rudin C, Au R, Libon DJ, Swenson R, Price CC, Lamar M, Penney DL (2016) Learning classification models of cognitive conditions from subtle behaviors in the Digital Clock Drawing Test. Mach Learn 102, 393–441.
  • [32] Henriksen K, O’Bryant SE, Hampel H, Trojanowski JQ, Montine TJ, Jeromin A, Blennow K, Lonneborg A, Wyss-Coray T, Soares H, Bazenet C, Sjogren M, Hu W, Lovestone S, Karsdal MA, Weiner MW; Blood-Based Biomarker Interest Group (2014) The future of blood-based biomarkers for Alzheimer’s disease. Alzheimers Dement 10, 115–131.
  • [33] Palmqvist S, Hertze J, Minthon L, Wattmo C, Zetterberg H, Blennow K, Londos E, Hansson O (2012) Comparison of brief cognitive tests and CSF biomarkers in predicting Alzheimer’s disease in mild cognitive impairment: Six-year follow-up study. PLoS One 7, e38639.
