Author manuscript; available in PMC: 2019 Mar 27.
Published in final edited form as: MobileHCI. 2016 Sep;2016:465–477. doi: 10.1145/2935334.2935383

Mobile Manifestations of Alertness: Connecting Biological Rhythms with Patterns of Smartphone App Use

Elizabeth L Murnane 1, Saeed Abdullah 1, Mark Matthews 1, Matthew Kay 2, Julie A Kientz 3, Tanzeem Choudhury 1, Geri Gay 1, Dan Cosley 1
PMCID: PMC6436843  NIHMSID: NIHMS989846  PMID: 30931436

Abstract

Our body clock causes considerable variations in our behavioral, mental, and physical processes, including alertness, throughout the day. While much research has studied technology usage patterns, the potential impact of underlying biological processes on these patterns is under-explored. Using data from 20 participants over 40 days, this paper presents the first study to connect patterns of mobile application usage with these contributing biological factors. Among other results, we find that usage patterns vary for individuals with different body clock types, that usage correlates with rhythms of alertness, that app use features such as duration and switching can distinguish periods of low and high alertness, and that app use reflects sleep interruptions as well as sleep duration. We conclude by discussing how our findings inform the design of biologically-friendly technology that can better support personal rhythms of performance.

Keywords: J.3 Life and Medical Sciences: Health, Circadian Rhythms, Alertness, Sleep, Mobile App Use

INTRODUCTION

In today’s increasingly technological, constantly-connected, hustle-bustle societies, mounting pressures to boost alertness and work output are leading to a commonly encountered mindset that human performance can be optimized — even “hacked” [60] — in order to sustain maximum levels of lasting productivity. Personal productivity tools further reinforce an ideal of persistent “busyness” [49]. Additionally promoted is an “early bird gets the worm” attitude, which argues that the most productive people rise the earliest and implies that adopting such behavior is a reliable path to reaching high levels of achievement and success [2].

Such attempts at optimizing performance rarely take into account both inter- and intra-individual variability in biological characteristics [84] — namely the “internal timing” of the body clock [73], which produces individually-variable fluctuations in cognitive and physical performance. Similarly, technologies aimed at supporting productivity are typically designed on assumptions that our capabilities over the course of a day are steady (or could be made steady).

In reality, biological clocks influence our performance levels, which naturally rise and fall throughout the day [14]. In addition, the circadian rhythms these body clocks generate vary between individuals. A person’s “chronotype” represents his or her unique circadian profile [16], which manifests in biological and behavioral differences such as the timing of hormone secretions [73] and the predisposition to sleep and wake earlier or later [30] (i.e., “early birds” and “night owls”).

Circadian misalignment results when any behavior — including sleeping, waking, or doing cognitive tasks — occurs at the wrong phase with respect to one’s underlying circadian rhythms [79]. Problems rooted in circadian misalignment affect daily life for millions of people [58]. Chronic circadian misalignment is linked to increased mortality in animal studies; and while it is often confounded with sleep deprivation for humans, research associates detrimental cognitive, behavioral, and physiological consequences with even brief circadian misalignment [44]. Working out of sync with our individual circadian rhythms of performance can thus not only be frustrating and fruitless but actually harmful, with serious long-term consequences for our health and well-being [30].

We see an opportunity for “circadian-aware” technologies that account for individual, innate variations in performance and reduce the risks of circadian misalignment. Calendars, for instance, typically treat hours and tasks as commodities instead of helping people schedule in accordance with their own historical patterns of alertness. Similarly, notifications arrive at any time of day or night on the sender’s schedule, not the receiver’s — and though there has been much research around interruption management (e.g., [7]), it tends to focus on minimizing disruption rather than whether the person currently has the cognitive capability to respond to a particular kind of notice. A greater awareness of our innate biological rhythms could positively impact the way we design such technology, which could in turn support improved productivity and overall well-being on a broadly deployable scale.

Given that smartphones now mediate a wide range of daily behaviors from work to entertainment activities, mobile usage can provide informative cues for understanding individual performance. This is especially true for the young individuals comprising our population of interest in this study, for whom mobile ownership and usage — as well as the risks of circadian misalignment — are particularly high.

As the first study to investigate the relationship between circadian rhythms and mobile app use, this paper thus builds on chronobiological foundations about daily performance rhythms and uses logged mobile behaviors to make the following contributions:

  • Using data from 20 participants over 40 days, we illustrate daily and weekly rhythmic patterns in application use, particularly when it comes to productivity and entertainment app use, the volume and timing of which show associations with biological aspects including chronotype traits, alertness levels, and sleep behaviors. We incorporate qualitative interviews to further contextualize and interpret these statistical findings.

  • Through these analyses, we show how mobile-mediated behavioral traces can be leveraged as part of circadian-aware passive sensing of latent biological characteristics.

  • Finally, we provide future steps and discuss how our findings can inform the design of novel technologies for mobile and beyond. We consider both tools that increase awareness of personal rhythms and current alertness levels as well as adaptive systems that can model circadian aspects of daily functioning, personalize experiences accordingly, and overall more flexibly support productivity and well-being.

RELATED WORK

Understanding and Leveraging Technology Use

It is now well-established that ownership and use of smartphones are at all-time highs and climbing. In the United States, over 90% of people own cellphones and 72% own smartphones [66], and it is estimated 80% of adults globally will have a smartphone by 2020 [26]. The most recent statistics report that ownership has already reached 86% for our study population, U.S. “millennials” in the 18-34 year old age range, who are also the heaviest and most habituated users: 52% claim they could not last more than 24 hours without their phones; 90% sleep with or next to their phones; 54% report checking their phones “almost constantly”; and 90% check at least once an hour even during social situations such as meals, meetings, and conversations [11].

This increasing ubiquity of mobile phones has resulted in a growing interest to understand smartphone use. Many studies have focused on depicting common quantitative patterns in usage data. For instance, researchers have developed automated and manual usage sampling frameworks in order to report on prevalent types of applications, typical chains of application use, and when and where apps get used [10, 12, 27, 28]. Such contextual information has also been used to model app launching, navigation, organization, and revisitation behaviors [34, 40] as well as to make personalized app launching predictions [77, 87, 88]. Smaller scale qualitative studies have provided insights into how factors such as users’ socio-economic status [67], emotions [48], and social context [21] can all influence individuals’ smartphone use habits.

A related strand of research has focused on how usage data can reflect characteristics of users at both individual and aggregate levels. Particularly relevant to our work are studies that leverage usage patterns to investigate and model aspects of cognition or sleep.

For example, attention has been linked to the use of particular types of computer and mobile apps (e.g., email, messaging, notification trays) as well as certain usage behaviors (e.g., window switching and lapses in device use) [52, 64], while inattention has been associated with short bursts of smartphone use [62]. Such usage behaviors along with demographic information and contextual data (e.g., time, location, light levels) have also been used to model boredom [52, 65] and proneness to boredom based on the types and amount of smartphone app use [54]. Finally, phone use data (e.g., logs of app use, outgoing and incoming communication, and screen unlocking) have been used to predict sleep stages [55], duration [39], and quality [6].

Such studies have identified consistent patterns of technology use and have associated emotional or psychological states such as boredom and inattention with particular usage behaviors. However, research has yet to provide satisfying trait-based explanations of these patterns. In particular, we argue for considering individuals’ unique circadian rhythms. In this study, we investigate such biologically-rooted factors to offer novel insights into idiosyncratic use behaviors.

Circadian Rhythms of Alertness

For humans, nearly all physiological and neurobehavioral processes follow roughly 24 hour cycles, referred to as circadian rhythms [30]. Many aspects of cognitive performance including alertness, attention, reaction time, response inhibition, short-term and working memory, and higher executive skills also follow rhythmic patterns [9].

Numerous empirical studies have examined these variations in human performance, which are modulated by a two-process model of sleep regulation based on synchronous and opposite mechanisms. The homeostatic need for sleep follows an hourglass process that accumulates during wake time and abates during sleep, while the wake-promoting drive is a circadian process following an approximately 24 hour cycle [30].

As fatigue and the need for sleep accumulate while awake, an accompanying decrease occurs in cognitive ability and alertness [9]. These effects can become severe. For shift workers, the increased chance of accidents and injury due to fatigue is well established [68]. More generally, the impairment effects of fatigue coupled with the endogenous decrements in cognitive functioning over the day have been equated to alcohol intoxication [46]. Fatigue also hinders meta-cognition and one’s ability to self-assess and recognize performance reductions [23], which may lead people to rationalize the sacrifice of sleep and disregard the well-studied negative impacts of sleep loss on performance, further compounding fatigue-based performance losses [83].

In this study, we focus on alertness, a cornerstone of cognitive performance [76], since it correlates with a number of cognitive functions [4], displays substantial variation over the course of a day [14], and deteriorates considerably after lost and interrupted sleep [61].

Circadian-Aware Technology

Research in chronobiology typically studies facets of circadian rhythms and alertness through controlled laboratory experiments in artificial settings. Responding to the need for more in-situ, longitudinal, inexpensive, and broadly deployable strategies [70], the area of “circadian computing” has recently emerged within HCI with the aim of leveraging technology usage traces to passively sense biological rhythms and related behaviors. Preliminary work has focused on modeling sleep events (e.g., onset and duration) and sleep-related circadian misalignments (e.g., social jet lag and sleep inertia) based on screen on-off patterns [2] and social media and communication data [59].

These early successes in using technology-mediated behaviors for circadian assessment suggest that consideration of circadian rhythms will be powerful for both explaining patterns of technology use and for improving personalization of those technologies.

In this study, we go beyond sleep modeling to explore and interpret a number of relationships among full-day functioning, mobile use, and latent biological traits. In doing so, we also help contribute to a circadian-aware sensing framework by identifying ways to capture and analyze usage patterns in order to passively detect idiosyncratic biological rhythms.

METHOD

In this section, we describe characteristics of our sample and the data we collected in order to capture their biological traits, cognitive functioning, and mobile application use.

Participants and Procedure

In this study, we focus on university-aged individuals, who are at particular risk of circadian misalignment [73]. Additional negative impacts of circadian misalignment that face this population include learning deficits and impairments to cognitive performance [17], problems with attention and procrastination [20], and increased stress and risk of drug and alcohol consumption [80]. At the same time, these individuals also face a higher risk for developing anxiety, depression, and other mental and emotional health problems due to mounting academic demands and pressures to succeed [42]. Further, a mobile-usage focused methodology is particularly appropriate for this group given they are the largest, fastest growing, and most habituated users of mobile technologies, as described earlier [11, 78].

Our data collection framework, described below, was developed specifically for Android phones, so we required participants to be regular Android users. We recruited using public mailing lists, recruitment portals, and snowball sampling. Our final sample consisted of 20 participants (7 males and 13 females) who were all 18-29 years old, Android users, and willing to participate for the full duration of the study.

To onboard participants, we invited them to our lab, where we conducted an entry interview and installed, tested, and demonstrated our data collection tools on their phones. We also explained procedures, which were administered over the next 40 days. After that point, participants returned to the lab so that we could conduct an exit interview and download collected data. Compensation ranged up to $262 based on interviews, logged application use data, and the number of completed sleep diaries and momentary assessments. All collected data were anonymized and encrypted, and Cornell’s Institutional Review Board approved all study procedures.

Data

We now describe those data, which included a daily sleep diary, a 4-times-per-day alertness assessment, phone application use logs, and in-person interviews.

Sleep and Chronotype

Sleep (including lost and misaligned sleep) has well-known effects on cognitive performance and alertness [32]. Participants in our study completed once-daily sleep diaries based on diaries from prior work [2, 59]. Diary entries included questions about the prior night’s bedtime, minutes to fall asleep, number of wakeups during the night, wake time, total sleep duration, sleep disruptions experienced, perceived feelings upon waking, presence and duration of groggy feelings after waking, and overall alertness and sleepiness [19]. Participants received a mobile notification each day at 10:30am reminding them to complete the sleep diary, which had 73% compliance. Prior research validates the reliability of this self-report journaling for in-situ measurement of per-night sleep [63], and it is considered less intrusive than body- or environment-based sensors such as actigraphy.

As previously described, an individual’s chronotype reflects his or her unique circadian rhythms, which underlie numerous biological processes, including sleep. To measure chronotype, we administered the Munich ChronoType Questionnaire (MCTQ) [75] during recruitment. The MCTQ has been clinically validated against sleep logs, the Morningness-Eveningness Questionnaire [37], biochemical rhythms (e.g., melatonin and cortisol), and actimetry data [73, 74]. The MCTQ includes questions about sleep and activity timings on both work and work-free days in order to estimate chronotype based on the midpoint between sleep onset and waking on free days (MSF) — i.e., days without an externally-imposed work or school schedule (typically weekends). MSF is corrected (MSF_SC) to account for longer sleep durations taken on free days (SD_F) to compensate for sleep debt accumulated on work days (SD_W) [75]. Thus, chronotype is a continuous variable quantified as:

MSF_SC = MSF − 0.5 × (SD_F − (5 × SD_W + 2 × SD_F) / 7)
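
For concreteness, a minimal sketch of this correction in Python. The function and variable names and the example values are ours, not the study's code; the 5:00 early/late cutoff applied at the end is the split we describe for this sample below.

```python
def msf_sc(msf, sd_free, sd_work):
    """MCTQ chronotype (MSF_SC): mid-sleep on free days, corrected for the
    sleep debt accumulated over the five work days of the week.

    msf     -- mid-sleep on free days, in hours after midnight (e.g., 5.5 == 5:30)
    sd_free -- sleep duration on free days, in hours (SD_F)
    sd_work -- sleep duration on work days, in hours (SD_W)
    """
    weekly_avg_sleep = (5 * sd_work + 2 * sd_free) / 7
    return msf - 0.5 * (sd_free - weekly_avg_sleep)

# Illustrative values: MSF at 6:00, 9 h of sleep on free days, 7 h on work days
chronotype = msf_sc(6.0, 9.0, 7.0)   # ~5.29, i.e., roughly 5:17
late_type = chronotype > 5.0         # early/late split used for this sample (see below)
```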

Figure 1 illustrates the distribution of chronotypes for our participants along an established early-late spectrum for a general population [73]. For such a general population, average MSF_SC falls closer to the 4:00-6:00 range. Our participants trend later (average MSF_SC = 5:56), as expected given their ages. Considering the typically narrow chronotype range associated with this age group, our sample actually provides relatively wide variability in chronotypes [73, 75]. Our sample’s distribution is similar to that of a larger sample of university students (N=281, average MSF_SC = 5:46) from other work [59] as well as a larger sample of students to whom we deployed the MCTQ for further comparison (N=206, average MSF_SC = 6:16), altogether helping to verify that our participants are representative of our population of interest.

Figure 1. Distribution of participant chronotypes.

Since how “early” or “late” a person is considered depends on attributes of their population, such as age, as well as other factors like timezone, chronobiologists use the chronotype distribution of the particular population of interest to determine a fair early vs. late threshold for that group [72]. Thus we follow prior work [69] and treat MSF_SC <= 5:00 as early and MSF_SC > 5:00 as late, given our sample’s young age range and corresponding later-skewed chronotype distribution [35]. This split allows us to obtain groups acceptably balanced in size and produces an allocation of early and late types similar to that in prior research focusing on the same age group [2]. It also provides the highest level of agreement between MCTQ-measured chronotype, self-perceived lateness reported during interviews, and earliness/lateness assessed via the Morningness-Eveningness Questionnaire [37], which we also administered to participants during recruitment as an additional check on their early/late classifications.

Momentary Assessments

To objectively assess alertness throughout the day, participants completed a brief ecological momentary assessment (EMA) delivered through a smartphone application we developed. Specifically, the EMA included a 3-minute version of PVT-Touch [45], a validated smartphone-based psychomotor vigilance task (PVT). The PVT, which is sensitive to changes in alertness [8] and is immune to practice or learning effects [47], measures alertness by displaying a visual stimulus and recording the elapsed milliseconds before a tactile response. Alertness performance for a given session is computed according to its percent deviation from that individual’s baseline, where individual baseline is computed as the mean reaction time across all test sessions, after removing false starts and outliers 2.5 standard deviations above or below the mean [84].
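
A minimal sketch of this baseline normalization follows, assuming reaction times in milliseconds; the 100 ms false-start cutoff and all names are illustrative assumptions rather than details reported above.

```python
import numpy as np

def pvt_deviation(session_rts, all_rts, false_start_ms=100):
    """Percent deviation of one session's mean reaction time from the
    individual's baseline (mean RT across all of that person's sessions),
    after removing false starts and outliers beyond 2.5 standard deviations.

    Positive values mean slower than baseline; negative values mean faster.
    """
    def clean(rts):
        rts = np.asarray(rts, dtype=float)
        rts = rts[rts >= false_start_ms]              # drop false starts (assumed cutoff)
        mu, sd = rts.mean(), rts.std()
        return rts[np.abs(rts - mu) <= 2.5 * sd]      # drop outliers beyond 2.5 SD

    baseline = clean(all_rts).mean()
    return (clean(session_rts).mean() - baseline) / baseline
```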

We delivered the EMA four times daily at the start of time windows defined by prior work [1] for morning, afternoon, evening, and late night to increase the breadth of coverage across the day. Participants could complete the assessment anytime within the time window, providing further variation in the collection times. For the morning, afternoon, and evening windows, the average compliance rate was over 75%; the late night window overlaps with sleep [41] and thus had an (expected) lower coverage of 14%.

Application Usage Logs

Participants also installed the AWARE framework, which senses mobile application usage [29]. We captured the timestamped log of any app coming to the foreground (e.g., app launches or switches), which we refer to as “usage events”, along with the duration of use. Most of our analyses focus on the instances of app foregrounding since prior work shows this is an informative portrayal of usage behaviors [10, 12, 34, 40, 77], facilitates comparison with related research using the same metric, and is less prone to measurement errors than metrics like duration [21]. As they are not indicative of user behaviors [10], we disregard background apps with which the user does not interact as well as system-generated activity such as automated notifications.

To categorize participants’ logged applications, we followed prior research [10] and used the app’s developer-specified category in the official Android application market, Google Play, where each app is associated with a single category.

We filtered out the Tools category since its apps relate to launcher processes, system activities, and settings, which either are not user-originated actions or do not provide the sorts of insight we desire into individuals’ app usage behavior [10], with the exception of clock and weather apps, which we relabeled into a new Time & Weather category. We also filtered out Health & Fitness apps since they comprised less than 0.1% of all usage events and were used by only 3 participants — a proportion of our sample similar to other findings that only 16% of 18-29 year olds use mobile health apps [31]. Future work would do well to recruit Health & Fitness app users in order to explore the relation between app usage and physical performance, which also exhibits well-known circadian fluctuations [24].

Again following prior work [10], we separated web browsers and email apps from Communication into more fine-grained Browser and Email categories. Then, to facilitate analyses and since we are particularly interested in how the use of productivity versus entertainment-oriented applications might reflect circadian rhythms of performance, we consolidated a number of related apps into higher level categories: Entertainment contains apps originally categorized as Entertainment, Games, Media & Video, Music & Audio, Photography, or Shopping; and Productivity contains apps originally categorized as Productivity, Business, Education, or Finance.
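
As an illustration only, a minimal sketch of this consolidation expressed as a lookup table; the dictionary mirrors the category lists in the text, while the function name is ours.

```python
# Developer-specified Google Play category -> consolidated parent category
PARENT_CATEGORY = {
    "Entertainment": "Entertainment", "Games": "Entertainment",
    "Media & Video": "Entertainment", "Music & Audio": "Entertainment",
    "Photography": "Entertainment", "Shopping": "Entertainment",
    "Productivity": "Productivity", "Business": "Productivity",
    "Education": "Productivity", "Finance": "Productivity",
}

def consolidate(play_category):
    """Map an app's Google Play category to the analysis category; categories
    not listed (e.g., Browser, Email, Social Media, Communication,
    Time & Weather) pass through unchanged."""
    return PARENT_CATEGORY.get(play_category, play_category)
```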

Manually inspecting all apps within each category, two authors independently verified, discussed, and came to full agreement that similar kinds of apps were folded together and that each resultant parent category fairly represented its contained applications. These categories are shown in Table 1, along with information about the unique number of apps participants used from each category and the total number of usage events.

Table 1. Categories of applications used by participants along with examples and amounts of applications and usage events.

| Category | Example Apps | # of Apps | # of Usage Events |
|---|---|---|---|
| Browser | Chrome, Firefox | 10 | 17683 |
| Communication | Facebook Messenger, GroupMe, Phone, SMS | 33 | 32906 |
| Email | Gmail, Inbox | 3 | 5142 |
| Entertainment | Clash of Clans, Ebay, Netflix, YouTube | 60 | 9863 |
| Productivity | Evernote, OfficeSuite, To Do Reminder, Piazza | 47 | 3146 |
| Social Media | Facebook, Twitter, Yik Yak | 14 | 27693 |
| Time & Weather | Clock, Timely, Weather Channel | 12 | 1702 |

Interviews

To help interpret patterns of alertness, technology usage, and potential links between the two, we conducted in-person interviews with each participant at the start and end of the study. Interviews included questions about sleep-wake behaviors and perceived connections among one’s alertness, fatigue, sleep, and time of day. We also asked about technology usage habits, including thoughts about technology’s impact on alertness, fatigue, or sleep. Finally, we asked about experiences with productivity software and reactions to the idea of circadian-aware tools. We contextualize quantitative results with representative quotes throughout the paper.

RESULTS

Application Usage

We begin by exploring the types of applications our participants use along with temporal trends in those use patterns, across individuals and both within and over days. Wherever possible, we compare our findings to those from prior work, both to help assess whether our sample is representative of larger populations and to highlight and interpret new findings.

Daily Rhythms in Application Usage

Aggregating participants’ usage events, we find the trends illustrated in Figure 2, with app use overall at its lowest in early morning, steadily rising and remaining relatively high from approximately noon until late evening, and then dropping off. These trends are similar to those observed in prior studies on daily mobile, computer, and Internet usage (e.g., [10, 53]).

Figure 2. Hourly app usage by category.

The most heavily used types of applications across all hours of the day are communication and social media apps. Communication app use is highest between late morning and midnight, with peaks mid-afternoon and evening, similar to trends other research has found for phoning and texting [59]. Social media apps — which, compared to those in the communication category, are used more for consuming social content than for communicating — reach maximum usage between 7pm and midnight, similar to prior findings that social media is most heavily used in the late evening [10, 53, 59].

Browser use is relatively stable from morning until late night, except for dips around 3pm and 10pm, as is email use, which gradually declines from late morning onward. The use of time & weather apps spikes around 8–9am, which makes sense since participants’ sleep diaries indicate nearly 60% of wake times are within an hour of 8:30am and more than 75% of participants use the phone as their alarm (similar to findings that over 80% of individuals use a workday alarm [71]).

Finally, entertainment apps are used more during the same morning period as well as in mid-afternoon and late night, while productivity apps show usage peaks later in the morning, in the afternoon, and in the evening, with a mid-day dip and a drop-off past late evening. These patterns are similar to those found in prior work [10, 52], though shifted an hour or so later, likely because our sample is younger and therefore trends later in terms of activity timing [75].

Weekly Usage Trends

We also find a distinction in the use of entertainment and productivity apps across days of the week. Figure 3 presents the percentage of use for each category on each day, showing a reversed “scissor” pattern also found in other work [69].

Figure 3. Use across the week of productivity and entertainment apps shown with standard error.

For our participants, “work days” (i.e., days on which alarms are used [71]) are Monday through Friday, and “free days” correspond to the weekend (Saturday and Sunday). Over 40% of productivity-based usage events occur at the beginning of the work week, on Monday and Tuesday, while Friday and the weekend see the least use of productivity apps — apart from Wednesday, when only 8% of usage events are productivity-related. Inversely, Wednesday is the day when entertainment apps are used the most, followed by Friday and the weekend. This mid-week dip resembles a common mid-week sentiment dip found in other work [3], and our participants express experiencing a high degree of fatigue on Wednesdays related to their class schedules — though further study is required to see whether this mid-week effect holds consistently inside and outside of college populations.

Circadian Rhythms Reflected in Application Usage

Overall, these patterns replicate and expand past findings and provide descriptive insight into types and temporal patterns of mobile application usage. Yet, while this helps increase our understanding of what individuals are doing with their phones over the course of hours and days, more work is necessary to understand why. In this section, we thus look to chronobiology to add explanatory bite.

Usage Relative to Internal Time and Alertness

We first explore how usage patterns vary for different chronotypes. As mentioned, chronotype modulates nearly all biological functions [84], including alertness performance [16]. Simply put, earlier chronotypes are more alert earlier in the day, and later chronotypes function at their peak alertness later [38].

Comparing the amount of usage events between early and late types across parts of the day suggests this distinction might be reflected in differing usage patterns, particularly of productivity and entertainment apps. Figure 4 shows these statistically significant differences in app use (p < .05 using Wilcoxon signed-rank tests) between early and late types, broken down by application category and time of day. Bars with positive (or negative) y values indicate early types use that type of app at that time of day the indicated percentage more (or less) than late types.

Figure 4. Percentage increase (positive y value) or decrease (negative y value) in amount of usage by early types compared to usage by late types of productivity and entertainment apps across the day.

That is, in the earlier half of the day, we see early types use approximately 25% more productivity apps than late types and 19–29% fewer entertainment apps; while the opposite effect is observed for evening and night usage, when early types use 15–50% fewer productivity apps and 22–68% more entertainment apps than late types.

As mentioned, alertness exhibits well-known fluctuations over the course of a day [4]. The pattern of peaks and dips in alertness is roughly the same for everyone, but there are individual differences in the phase of these rhythms that are reflected by one’s chronotype. To align the phases of circadian rhythms for different chronotypes, we can shift temporal analyses of usage patterns to a measure of time that is adjusted to take chronotype into account.

“External time” (ExT, also known as “clock time” or “local time”) is the number of hours that have elapsed since midnight (the midpoint of night-time) [18]. “Internal time” (InT, also known as “body clock time” or “biological time”) is the number of elapsed hours since an individual’s sleep midpoint, MSF_SC (the midpoint of a person’s biological night) [73, 84]. Internal time is therefore a corrected measure of time that reflects individual chronotype, calculated as:

InT = ExT − MSF_SC
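
A minimal sketch of this conversion in decimal hours; the modulo simply keeps InT in [0, 24), and the names and example values are ours.

```python
def internal_time(ext_hour, msf_sc):
    """Convert external clock time (hours since midnight) into internal time:
    hours elapsed since the individual's mid-sleep point on free days (MSF_SC)."""
    return (ext_hour - msf_sc) % 24

# Example: 9:00 clock time for a chronotype with MSF_SC = 5:30
internal_time(9.0, 5.5)   # 3.5 hours after the midpoint of biological night
```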

Considering mobile app use in terms of internal time rather than external time, we see associations with our participants’ innate biological rhythms of alertness (based on PVT performance). Of all app types in our sample, productivity and entertainment apps show the strongest associations with performance. Specifically, we find a strong positive correlation between performance and productivity app usage (r = 0.52, p < .001) — that is, higher alertness performance is associated with more usage of productivity apps. We also find an inverse relationship between performance and entertainment apps (r = −0.31, p < .05), indicating that lower alertness relates to increased entertainment app use, though this correlation is more moderate. Importantly, these strong to moderately strong, statistically significant associations between alertness and app use do not hold when computed using external time.
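
As an illustration of one way such a correlation could be computed, a minimal sketch that bins alertness measurements and co-occurring productivity app events by internal-time hour and correlates the two; the aggregation choice, column names, and toy numbers are ours, not the study's data or pipeline.

```python
import pandas as pd
from scipy.stats import pearsonr

# Toy records: one row per alertness measurement, with the internal-time hour
# it fell in and the number of productivity app events sensed in that hour.
records = pd.DataFrame({
    "int_hour":     [7, 7, 9, 9, 12, 12, 18, 18, 22, 22],
    "performance":  [0.10, 0.14, 0.08, 0.12, -0.04, -0.02, 0.05, 0.07, -0.09, -0.06],
    "productivity": [5, 6, 4, 5, 2, 3, 4, 4, 1, 1],
})

hourly = records.groupby("int_hour").mean()        # average per internal-time hour
r, p = pearsonr(hourly["performance"], hourly["productivity"])
```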

Figure 5 illustrates alertness together with usage of productivity and entertainment apps over the course of the “biological day”. Inspecting these trends beginning with the midpoint of biological night (hour 0 of Internal Time), we see alertness levels gradually rising from the end of sleep through the wakeup phase. During this same period, usage of entertainment apps is over 2.4 times higher compared to productivity apps. These findings resonate with the concept of “sleep inertia”, which can last for hours, reflects the transition period from sleep to full wakefulness, and is characterized by diminished alertness and vigilance in attention [75]. Nearly three quarters of interviews support this idea of an association between groggy wakeups and morning entertainment app use (e.g., “I’ll stay on the phone longer, browsing YouTube, etc, if I’m more tired.”)

Figure 5. Temporal trends in application use (Usage) and alertness performance (Performance) across internal body-clock time (InT). Usage axis is proportion (normalized to [0,1] scale) of all an app category’s usage events that occurred in a given hour. Performance axis is percent deviation from individual baseline of alertness measured in a given hour. Internal time axis is number of hours since biological midnight, and accompanying spectrum indicates periods of the biological day.

Following this wakeup period, alertness performance eventually peaks approximately 7 hours after sleep midpoint, which agrees with trends found in prior research [14, 83, 84]. At the same time, the use of productivity applications also ramps up and reaches its own daily maximum, while the use of entertainment apps falls to one of its minimum levels. Both the well-studied mid-day alertness dip (during which productivity is known to drop [14, 56, 57]) and evening rebound [22, 50] are also observed in our participants’ alertness patterns and align with a productivity app use dip and peak, respectively.

Finally, as biological night approaches, alertness is known to fade [84]. Our data show this same trend in diminished alertness. (The outlier spikes at InT=1:00 and InT=23:00 result from sparse data since this period overlaps with sleep). In parallel, productivity app use also falls off while entertainment app use stays more elevated. In interviews, participants commonly mentioned nightly habits related to watching videos or playing games (e.g., “Every time before I go to bed, I play a card game until I feel sleepy.”)

Gauging Alertness Level from App Use Features

We next explore how alertness may be reflected through additional usage features beyond the time of day an app is used. Using technology for a longer amount of time has been associated with procrastination, inattention, and lack of devoted concentration [59]. In addition, switching among different tasks and computer windows has been linked to capacity for sustained attention, distractibility, and boredom [52, 53]. Such prior work suggests that the metrics of app use duration, diversity, and switching (defined as follows) may therefore be particularly relevant to alertness.

  • Duration: Mean # of seconds per usage session during T

  • Diversity: Total # of distinct apps used during T

  • Switching: Total # of app switches during T

We calculate these features based on usage in a given hour window (T) surrounding an alertness measurement and use Mann-Whitney-Wilcoxon tests to compare these features during low and high alertness states. Guided by prior work, we set thresholds for low and high alertness according to whether a PVT measurement is above or below that participant’s individual baseline — i.e., is a positive value in the range (0, 1] or is a negative value in the range [−1, 0), respectively [84].
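
A minimal sketch of the feature extraction and comparison, assuming a pandas DataFrame of foreground events with `timestamp`, `app`, `session_id` (one id per unlock-delimited interaction), and `duration_s` columns; the column names and the toy values at the end are ours, not the study's pipeline.

```python
import pandas as pd
from scipy.stats import mannwhitneyu

def usage_features(events, t_center, window="60min"):
    """Duration, diversity, and switching within the window T surrounding
    one alertness measurement."""
    half = pd.Timedelta(window) / 2
    win = events[(events["timestamp"] >= t_center - half) &
                 (events["timestamp"] < t_center + half)]
    apps = win["app"].to_numpy()
    return {
        "duration":  win.groupby("session_id")["duration_s"].sum().mean(),  # mean seconds per session
        "diversity": win["app"].nunique(),                                   # distinct apps used
        "switching": int((apps[1:] != apps[:-1]).sum()),                     # foreground switches
    }

# Compare a feature (here, duration) between low- and high-alertness states
low  = [120.3, 98.7, 140.2, 110.5, 131.8]   # toy values, seconds per session
high = [80.1, 95.4, 70.9, 88.2, 92.6]
stat, p = mannwhitneyu(low, high, alternative="two-sided")
```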

To clarify, a usage “session” represents a period of interaction marked by unlocking the phone and is comprised of any number of app foreground events. Our participants’ overall durations of usage show good agreement with durations found in prior work [10, 28, 88]. During periods of low alertness, we find that duration of use is over 20% higher, as seen in Table 2. Interviews agree that usage becomes more “bottomless”, “stuck”, and “harder to get off”. Participants also switch apps 33% more when alertness is low, though they do not necessarily switch among a larger set of distinct apps, as app diversity shows no significant difference between alertness states.

Table 2. Median values of use features during low vs. high alertness. Significant differences in medians marked on variable name.

| | Low Alertness | High Alertness |
|---|---|---|
| Duration* | 103.4 seconds | 85.8 seconds |
| Diversity | 2.87 apps | 2.82 apps |
| Switching* | 32 switches | 24 switches |

* p < .05

Connecting App Use and Alertness with Sleep

Lastly, we study app use, alertness, and sleep. Variation in performance is made most evident by sleep loss [32], with the largest effects on alertness, working memory, and cognitive throughput [51]. In addition to impairing performance directly, inadequate sleep is also associated with subsequent feelings of fatigue [46]. Conversely, extending sleep enhances learning and problem solving [86], with adequate sleep duration improving energy, alertness, and reaction time [43, 75].

Comparing sleep duration according to participants’ sleep journals with their app use the following day, we find that sleep duration correlates positively with productivity-oriented usage (r = 0.43, p < .05) and negatively with entertainment app use (r = −0.19, p < .05) — that is, less sleep goes with less use of productivity apps and more use of entertainment apps, on both weekdays and weekends.

A coarser-grained measure of sleep duration adequacy also shows statistically and practically significant differences. Specifically, if we consider sleep lasting 7–9 hours as “adequate”, following established guidelines and prior research [15, 59], then we find participants use productivity apps an average of 61% more after nights of adequate as opposed to inadequate sleep (Cohen’s d = 0.48, p < .05), while entertainment apps are used 33% more on average after an inadequate amount of sleep (Cohen’s d = 0.24, p < .05).
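
For reference, Cohen's d here is the difference in group means divided by the pooled standard deviation; a minimal sketch with placeholder inputs, not the study's data.

```python
import numpy as np

def cohens_d(x, y):
    """Effect size: difference in group means over the pooled standard deviation."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled_var)

# e.g., productivity app events per day after adequate vs. inadequate sleep (toy values)
cohens_d([12, 15, 9, 14, 11], [7, 8, 10, 6, 9])
```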

Interviews provide qualitative detail about sleep loss and subsequent fatigue manifesting through increased usage of entertainment-based apps, which participants described as enabling “mindless”, “passive” interactions; while on the other hand, they associated feeling rested, energized, and alert with more “intentional”, “directed”, and “productive” usage.

Finally, recent studies continue to suggest links between nightly technology usage and sleep problems — particularly when it comes to usage delaying sleep onset and cutting into sleep time [13, 59, 81, 85]. We find a mild relationship between the number of experienced sleep interruptions according to daily diaries and the number of app usage events sensed between sleep onset and waking (r_s = 0.46, p < 0.05). This indicates app use data might help to assess sleep disruptions — though the phone itself might be a culprit of disruption in the first place. In interviews, the majority of our participants described turning their phone to silent overnight to avoid such sleep interruptions. However, even phantom notifications can sometimes awaken them (e.g., “Sometimes I wake up as if I’m expecting something, like an email or a text, and will check my phone. I imagine that if I didn’t have technology, I’d have a sound sleep.”)

These results demonstrate how app usage can provide informative signals when assessing sleep duration, interruption, and associated feelings of alertness and fatigue. However, they also reveal a disruptive potential of mobile devices and the corresponding adherence-overuse tension, which deserves careful consideration from system designers as well as researchers who leverage digital traces from these systems.

DISCUSSION

Our goal in this research was to bring a biological perspective to the interpretation of how and why individuals use their phones in particular ways. In this study, we focused on using mobile application logs to explore how patterns of mobile use relate to and can reflect chronotype, alertness, and sleep. Going beyond prior works’ descriptions of diurnal variations in app use, we offer biological factors behind these variations, which both hold explanatory power and provide opportunities for modeling and intervention.

Connecting Usage Patterns and Alertness Rhythms

A central contribution of our work is using biological rhythms to provide a foundation for explaining rhythms of alertness and how they may manifest in technology-mediated behaviors. Higher level theoretical constructs such as cognitive engagement and 2-axis models of attentional states have been used in prior work studying digital activity [52], which has employed ‘Boredom’ as an explanatory variable. However, boredom is a higher order construct that is underpinned by lower level processes like sustained attention and vigilance in alertness [25], and chronobiology offers an explanation of individual daily rhythms in these aspects of performance.

We expect that throughout the day, people fluctuate to an extent across these attentional state boundaries depending on their tasks, interactions, and other contextual factors — but that circadian rhythm factors present a consistent limit to our cognitive performance and hence are most helpful in understanding individual performance (and in turn, could be practically useful in the planning of cognitively demanding activities).

The Key Role of Chronotype

Aggregating temporal trends in mobile app use, we observed patterns similar to those found in prior work, with overall usage elevated in late morning and late evening. Focusing more closely on the inverse patterns of productivity and entertainment apps, we found productivity apps reach usage peaks in late morning, late afternoon, and evening, as well as on most workdays; while entertainment apps are most used in early morning, mid-day, and late night, as well as on weekends and during the mid-week dip on Wednesday. Further, given the known impact of sleep on performance, we examined sleep in relation to app use and found that less sleep correlates with less productivity app use and more entertainment app use.

Interested in why individuals might be using mobile applications in these ways, we looked to the chronobiology behind cognitive functioning and first compared the app use of early and late chronotypes. We found that during earlier parts of the day, early types use productivity apps a statistically significant amount more and entertainment apps a significant amount less than late types, who exhibited the opposite patterns.

We further took the internal time of the body clock into consideration by analyzing usage trends with an adjusted timescale based on biological time rather than external clock time. Findings affirmed that both early and late types use productivity apps during their optimal performance times. We then compared alertness states using additional usage features, finding that when alertness is lower, individuals use their phones for more time and switch back and forth more between apps.

Our finding that accounting for internal time makes temporal relationships between app use and alertness clearer has broad implications for studies that model human behavior across time. As an example, consider research that aggregates sentiment using external clock time to study daily rhythms in mood expressed in Twitter (e.g., [33]). The same analyses, but corrected to internal time, would be intriguing; and given the extent mood correlates with alertness and circadian rhythms [43], we suspect the results would be even more striking.

Alertness, Social Media, and Communication Apps

Though many of our analyses focused on the app types that consistently exhibited the strongest patterns (productivity and entertainment apps), we would like to point out that social media and communication apps, which together accounted for over half of all usage events, can also send informative signals. Specifically, we found these apps’ usage to be elevated during biological morning, and nearly all participants described using social media apps as a way to “ease” themselves into the day (e.g., “To wake myself up, I’ll have to look at things on the phone like Facebook or Tumblr.”)

Also notable is the fact that participants’ morning classes often fall within their phase of sleep-inertia. The majority of participants described using their phones in lectures when tired, bored, or unable to concentrate, for instance to help keep themselves awake — which they explained was particularly necessary for morning classes (e.g., “In morning classes, I have less attention and am very tired so I’ll browse the phone. Using tactics like social media, I focus on the screen to try to keep my eyes open.”) Such findings suggest that the learning impairments associated with lectures being scheduled at biologically unsuitable times may be further compounded by this compensating phone usage, given the negative impacts on learning associated with this type of distracting technology usage in the classroom [36].

Throughout the day, participants appear to continue turning to social media and communication apps when experiencing low alertness (e.g., “I go for apps that don’t require much mental energy when fatigued. Facebook, YikYak.”), including during the mid-day dip when these apps are used more than any other category.

Reaching the day’s end, we found social media and communication apps also interplayed significantly with behaviors before and during sleep, with over 50% of sensed sleep interruptions corresponding to social media app use alone. All participants but two reported in interviews that they use their phone within 30 minutes of sleep, and half of participants reported using it “immediately” before (e.g., “I use my phone directly before bed: Messenger, email, Facebook. Any notification.”) This suggests that social media use is a major driver of our finding that the number of nightly usage events moderately correlates with experienced sleep interruptions.

Altogether, these findings align with usage patterns found in other research in this area, with circadian rhythms providing explanatory power. Our work builds on such research by providing evidence that biological rhythms exert a strong influence on patterns of alertness and by demonstrating how these patterns could potentially be detected automatically through smartphone app use.

Biologically-Friendly Productivity Technology

The ability to leverage these mobile traces to model alertness and other circadian rhythms can support circadian-aware systems: technologies that account for individual, biologically-driven levels of alertness and that would likely be practically useful for improving productivity. Particularly promising classes of technology include scheduling tools, circadian-adaptive systems, and tools that raise awareness of individual rhythms. Below, we consider each in turn.

First, technology for improving scheduling could take alertness rhythms into account. For instance, a circadian-aware calendar could match alertness models with tasks that are more or less cognitively intensive. It might also recommend group meetings or study sessions at times when most participants are likely to be closer to peak alertness, suggest groups whose members share similar chronotypes and might synchronize more easily, or make recommendations for class scheduling that align with students’ daily and weekly alertness patterns.

Another fertile area is the development of adaptive systems that could automatically personalize experiences or alter interface displays based on circadian profiles and current alertness levels. For instance, mobile notification delivery could better align distractions and interruptions with circadian rhythms of cognition. Productivity tools that block access to potentially distracting websites or applications might adjust their restricted usage times to match those when a particular user needs to protect periods of high alertness.

A third line of research could pursue designs that increase awareness of both current state and the impact our biological clock has on alertness. Since people may not be aware of their alertness in the moment [23] nor have a good sense of why and when they experience alertness fluctuations, systems capable of momentary alertness detection and longer term alertness prediction could help individuals make more informed choices — for instance when it comes to deciding when to study or when contemplating whether an “all-nighter” will be productive in the long run or instead lead to diminishing returns.

More generally, these tools might help both students and educators become more attuned to how individual rhythms relate to the times students are expected to cognitively perform. Already at-risk students under academic pressures are increasingly turning to stimulants to artificially heighten performance and extend working hours [5]; however, a chronobiological perspective suggests that consistently elevated alertness is ill-advised and contradicts our biology. Thus, we argue that our research and design ideas should not be posed as helping us work harder, longer hours. Rather, by incorporating an awareness of internal biological timing into research on alertness and technology, we hope to move towards a vision of systems that are designed in a way to support individually flexible work timing, healthy productivity goals, and overall well-being.

Considerations, Limitations, and Future Work

We focused on smartphone usage because of a number of advantages outlined earlier; however, depending solely on behavioral signals from phones does ignore useful data from other sources. In particular, we are likely missing some use of productivity tools that work better on devices with larger screens and better input methods (e.g., laptops), which would be desirable to consider in future work. Further, our mobile-based momentary assessment approach is itself a source of potential confounds as it can impact attention [82] and usage, suggesting the value of exploring more unobtrusive sensing. Such alternative sensing strategies could also capture aspects of alertness that may manifest through non-use of phones.

Likewise, it would be worthwhile to study phone usage behaviors beyond instances of app use (i.e., an app taking the foreground). Many of our analyses focused on this starting point of app use since it is a well accepted, easily comparable, and reliably captured metric, as described earlier. We note that this does mean a 15 second interaction and a 15 minute interaction could potentially appear the same way as individual usage events; however, the former case would likely look different because other app use events would tend to happen during the additional 14.75 minutes, especially considering the amount of app switching we observed. Still, we recognize the value in future work to look deeper into other metrics of usage such as duration, revisitations, or chains of app use that may offer additional insights.

In addition, although our application categorization is broadly useful, it is unable to account well for apps that can be used in ways that map to both low and high alertness. For instance, elevated use of entertainment apps is observed in both low and high alertness states. A potential explanation suggested by interviews is that “lightweight” games that do not require much mental energy (e.g., “mindless puzzle games”) are used primarily when fatigued, bored, distracted, and for procrastination; whereas games played when feeling energetic and alert tend to require more focus and attention (e.g., “active strategy games”). Similarly, correlations between our participants’ email use and alertness, together with interview data, suggest that checking email may be more productivity-oriented earlier in the day and more about “killing time” or socializing later on and especially before bed. Modeling the actual behaviors enacted in apps, though challenging, might therefore give a clearer picture of the relationships between biology, alertness, and technology use.

Finally, it would be desirable to investigate how our findings generalize beyond this study’s population of interest. For reasons provided earlier, we focused on college students as a vulnerable and valuable population to study, but patterns are likely different for people of other age groups or who have different roles and work responsibilities. It is also possible that users of non-Android phones behave differently. Further, we did not explicitly control for characteristics like class schedules, course load, or a number of other factors that might exert an influence. Thus extending this work to larger and more diverse samples would be a natural future direction, as would expanding models to include additional characteristics of participants and their contexts. Still, as the first study looking at mobile application use as a soft measure for studying circadian rhythms of alertness, we have obtained a variety of useful findings relevant to researchers interested in chronobiology, mobile sensing, or personalized technology design.

CONCLUSION

In this study, we combined passively collected smartphone data with qualitative data and an understanding of circadian rhythms of performance to gain a richer picture of both individual patterns of smartphone use as well as how usage trends reflect latent biological characteristics.

Drawing on a body of work from chronobiology to factor in biological influences, our study complements prior research on how external factors such as work environment, type of task, or individual motivational factors can affect focus and performance. Accounting for underlying biological rhythms sheds light on how app usage logs can reflect chronotype, alertness, and sleep — particularly in terms of productivity and entertainment app usage, which have daily and weekly rhythms, differ in amount and timing for different chronotypes, align with trends in alertness performance, and correlate with adequate and inadequate sleep.

This increased understanding of innate, idiosyncratic patterns of alertness and how they may manifest through app use behaviors introduces possibilities for modeling alertness and other biological characteristics from unobtrusively captured mobile application data. In turn, such circadian-aware passive sensing can lead to opportunities for systems that support smarter scheduling, provide circadian-adaptive experiences, and raise individual and institutional awareness of the lack of synchronicity between timetables and optimal cognitive performance.

Given that biological rhythms of alertness vary from individual to individual and that working contrary to them can have serious negative health and productivity consequences, these visions of biologically-friendly technology can have considerable impacts on individual, organizational, and societal levels.

ACKNOWLEDGMENTS

This work was partially supported by a grant from the Robert Wood Johnson Foundation and the Health Data Exploration Project, by the Intel Science & Technology Center for Pervasive Computing (ISTC-PC), and by the National Science Foundation under grant SCH-1344613. Elizabeth Murnane was supported by the National Science Foundation Graduate Research Fellowship under grant DGE-1144153.

REFERENCES

  • 1.Khaled Abdel-Kader, Manisha Jhamb, Lee Anne Mandich, Jonathan Yabes, Keene Robert M, Scott Beach, Buysse Daniel J, and Unruh Mark L. 2014. Ecological momentary assessment of fatigue, sleepiness, and exhaustion in ESKD. BMC nephrology 15, 1 (2014), 29. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Abdullah Saeed, Matthews Mark, Murnane Elizabeth L., Gay Geri, and Choudhury Tanzeem. 2014. Towards circadian computing: early to bed and early to rise makes some of us unhealthy and sleep deprived. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing ACM, 673–684. [Google Scholar]
  • 3.Abdullah Saeed, Elizabeth L Murnane Jean MR Costa, and Choudhury Tanzeem. 2015. Collective Smile: Measuring Societal Happiness from Geolocated Images. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing ACM, 361–374. [Google Scholar]
  • 4.Anderson John AE, Campbell Karen L, Tarek Amer, Grady Cheryl L, and Lynn Hasher. 2014. Timing is everything: Age differences in the cognitive control network are modulated by time of day. Psychology and aging 29, 3 (2014), 648. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Arria Amelia M and DuPont Robert L. 2010. Nonmedical prescription stimulant use among college students: why we need to do something and what we need to do. Journal of addictive diseases 29, 4 (2010), 417–426. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Bai Yin, Xu Bin, Ma Yuanchao, Sun Guodong, and Zhao Yu. 2012. Will you have a good sleep tonight?: sleep quality prediction with mobile phone. In Proceedings of the 7th International Conference on Body Area Networks 124–130. [Google Scholar]
  • 7.Bailey Brian P and Iqbal Shamsi T. 2008. Understanding changes in mental workload during execution of goal-directed tasks and its application for interruption management. ACM Transactions on Computer-Human Interaction (TOCHI) 14, 4 (2008), 21. [Google Scholar]
  • 8.Basner Mathias, Mollicone Daniel, and Dinges David F. 2011. Validity and sensitivity of a brief psychomotor vigilance test (PVT-B) to total and partial sleep deprivation. Acta astronautica 69, 11 (2011), 949–959. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Blatter Katharina and Cajochen Christian. 2007. Circadian rhythms in cognitive performance: methodological constraints, protocols, theoretical underpinnings. Physiology & behavior 90, 2 (2007), 196–208. [DOI] [PubMed] [Google Scholar]
  • 10.Böhmer Matthias, Hecht Brent, Schöning Johannes, Krüger Antonio, and Bauer Gernot. 2011. Falling asleep with Angry Birds, Facebook and Kindle: a large scale study on mobile application usage. In Proceedings of the 13th international conference on Human computer interaction with mobile devices and services 47–56. [Google Scholar]
  • 11.Braun Research Center. 2015. Trends in Consumer Mobility Report. (2015). [Google Scholar]
  • 12.Brown Barry, McGregor Moira, and McMillan Donald. 2014. 100 days of iPhone use: understanding the details of mobile device use. In Proceedings of the 16th international conference on Human-computer interaction with mobile devices & services ACM, 223–232. [Google Scholar]
  • 13.Brunborg Geir Scott, Mentzoni Rune Aune, Molde Helge, Myrseth Helga, Skouverøe Knut Joachim Mår, Bjørvatn Bjørn, and Pallesen Ståle. 2011. The relationship between media use in the bedroom, sleep habits and symptoms of insomnia. Journal of sleep research 20, 4 (2011), 569–575. [DOI] [PubMed] [Google Scholar]
  • 14.Carrier Julie and Monk Timothy H. 2000. Circadian rhythms of performance: new trends. Chronobiology international 17, 6 (2000), 719–732. [DOI] [PubMed] [Google Scholar]
  • 15.Chen Mei-Yen, Wang Edward K, and Jeng Yi-Jong. 2006. Adequate sleep among adolescents is positively associated with health status and health-related behaviors. BMC Public Health 6, 1 (2006), 59. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Correa Angel, Lara Tania, and Madrid Juan Antonio. 2013. Influence of Circadian Typology and Time of Day on Temporal Preparation. Timing & Time Perception 1, 2 (2013), 217–238. [Google Scholar]
  • 17.Curcio Giuseppe, Ferrara Michele, and Gennaro Luigi De. 2006. Sleep loss, learning capacity and academic performance. Sleep medicine reviews 10, 5 (2006), 323–337. [DOI] [PubMed] [Google Scholar]
  • 18.Daan Serge and Merrow Martha. 2002. External time-internal time. Journal of biological rhythms 17, 2 (2002), 107–109. [DOI] [PubMed] [Google Scholar]
  • 19.Dement William C and Vaughan Christopher. 1999. The promise of sleep: A pioneer in sleep medicine explores the vital connection between health, happiness, and a good night’s sleep. Dell Publishing Co. [Google Scholar]
  • 20.Digdon Nancy L and Howell Andrew J. 2008. College students who have an eveningness preference report lower self-control and greater procrastination. Chronobiology international 25, 6 (2008), 1029–1046. [DOI] [PubMed] [Google Scholar]
  • 21.Do Trinh Minh Tri, Blom Jan, and Gatica-Perez Daniel. 2011. Smartphone usage in the wild: a large-scale analysis of applications and context. In Proceedings of ICMI. 353–360. [Google Scholar]
  • 22.Doran SM, Van Dongen HPA, and Dinges David F. 2001. Sustained attention performance during sleep deprivation: evidence of state instability. Archives italiennes de biologie 139, 3 (2001), 253–267. [PubMed] [Google Scholar]
  • 23.Dorrian Jillian, Lamond Nicole, Holmes Alexandra L, Burgess Helen J, Roach Gregory D, Fletcher Adam, Dawson Drew, and others. 2003. The ability to self-monitor performance during a week of simulated night shifts. SLEEP 26, 7 (2003), 871–877. [DOI] [PubMed] [Google Scholar]
  • 24.Drust B, Waterhouse J, Atkinson G, Edwards B, and Reilly T. 2005. Circadian rhythms in sports performance—an update. Chronobiology international 22, 1 (2005), 21–44. [DOI] [PubMed] [Google Scholar]
  • 25.Eastwood John D, Frischen Alexandra, Fenske Mark J, and Smilek Daniel. 2012. The unengaged mind defining boredom in terms of attention. Perspectives on Psychological Science 7, 5 (2012), 482–495. [DOI] [PubMed] [Google Scholar]
  • 26.Evans Benedict. 2014. Mobile is Eating the World. [Google Scholar]
  • 27.Falaki Hossein, Mahajan Ratul, Kandula Srikanth, Lymberopoulos Dimitrios, Govindan Ramesh, and Estrin Deborah. 2010. Diversity in smartphone usage. In Proc. of the 8th international conference on Mobile systems, applications, and services ACM, 179–194. [Google Scholar]
  • 28.Ferreira Denzil, Goncalves Jorge, Kostakos Vassilis, Barkhuus Louise, and Dey Anind K. 2014. Contextual experience sampling of mobile application micro-usage. In Proceedings of the 16th international conference on Human computer interaction with mobile devices & services ACM, 91–100. [Google Scholar]
  • 29.Ferreira Denzil, Kostakos Vassilis, and Dey Anind K. 2015. AWARE: mobile context instrumentation framework. Frontiers in ICT 2 (2015), 6. [Google Scholar]
  • 30.Foster R and Kreitzman L. 2011. The Rhythms Of Life: The Biological Clocks That Control the Daily Lives of Every Living Thing. (2011). [Google Scholar]
  • 31.Fox Susannah and Duggan Maeve. 2013. Tracking for health. Pew Research Center. [Google Scholar]
  • 32.Goel Namni, Basner Mathias, Rao Hengyi, and Dinges David F. 2013. Circadian rhythms, sleep deprivation, and human performance. Progress in molecular biology and translational science 119 (2013), 155. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Golder Scott A and Macy Michael W. 2011. Diurnal and seasonal mood vary with work, sleep, and daylength across diverse cultures. Science 333, 6051 (2011), 1878–1881. [DOI] [PubMed] [Google Scholar]
  • 34.Hang Alina, Luca Alexander De, Hartmann Jonas, and Hussmann Heinrich. 2013. Oh app, where art thou?: on app launching habits of smartphone users. In Proc. of the 15th international conference on Human-computer interaction with mobile devices and services 392–395. [Google Scholar]
  • 35.Hasher Lynn, Goldstein David, and May Cynthia P. 2005. It’s About Time: Circadian Rhythms, Memory, & Aging. [Google Scholar]
  • 36.Hembrooke Helene and Gay Geri. 2003. The laptop and the lecture: The effects of multitasking in learning environments. Journal of computing in higher education 15, 1 (2003), 46–64. [Google Scholar]
  • 37.Horne James A and Ostberg Olov. 1976. A self-assessment questionnaire to determine morningness-eveningness in human circadian rhythms. International journal of chronobiology 4, 2 (1976), 97–110. [PubMed] [Google Scholar]
  • 38.Horne James A and Ostberg Olov. 1977. Individual differences in human circadian rhythms. Biological Psychology 5, 3 (1977), 179–190. [DOI] [PubMed] [Google Scholar]
  • 39.Huang Ke, Ding Xiang, Xu Jing, Chen Guanling, and Ding Wei. 2015. Monitoring Sleep and Detecting Irregular Nights through Unconstrained Smartphone Sensing. (2015). [Google Scholar]
  • 40.Jones Simon L, Ferreira Denzil, Hosio Simo, Goncalves Jorge, and Kostakos Vassilis. 2015. Revisitation analysis of smartphone app use. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing ACM, 1197–1208. [Google Scholar]
  • 41.Juda Myriam, Vetter Céline, and Roenneberg Till. 2013. The Munich chronotype questionnaire for shift-workers (MCTQShift). Journal of biological rhythms 28, 2 (2013), 130–140. [DOI] [PubMed] [Google Scholar]
  • 42.Kadison Richard and DiGeronimo Theresa Foy. 2004. College of the overwhelmed: The campus mental health crisis and what to do about it. Jossey-Bass. [Google Scholar]
  • 43.Kamdar Biren B, Kaplan Katherine A, Kezirian Eric J, and Dement William C. 2004. The impact of extended sleep on daytime alertness, vigilance, and mood. Sleep medicine 5, 5 (2004), 441–448. [DOI] [PubMed] [Google Scholar]
  • 44.Karatsoreos Ilia N, Bhagat Sarah, Bloss Erik B, Morrison John H, and McEwen Bruce S. 2011. Disruption of circadian clocks has ramifications for metabolism, brain, and behavior. Proceedings of the national Academy of Sciences 108, 4 (2011), 1657–1662. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45.Kay Matthew, Rector Kyle, Consolvo Sunny, Greenstein Ben, Wobbrock Jacob O, Watson Nathaniel F, Kientz Julie, and others. 2013. PVT-touch: Adapting a reaction time test for touchscreen devices. In Pervasive Computing Technologies for Healthcare. IEEE, 248–251. [Google Scholar]
  • 46.Lamond Nicole and Dawson Drew. 1999. Quantifying the performance impairment associated with fatigue. Journal of sleep research 8, 4 (1999), 255–262. [DOI] [PubMed] [Google Scholar]
  • 47.Lamond Nicole, Jay Sarah M, Dorrian Jillian, Ferguson Sally A, Roach Gregory D, and Dawson Drew. 2008. The sensitivity of a palm-based psychomotor vigilance task to severe sleep loss. Behavior research methods 40, 1 (2008), 347–352. [DOI] [PubMed] [Google Scholar]
  • 48.Lee Hosub, Choi Young Sang, and Kim Yeo-Jin. 2011. An adaptive user interface based on spatiotemporal structure learning. Communications Magazine, IEEE 49, 6 (2011), 118–124. [Google Scholar]
  • 49.Leshed Gilly and Sengers Phoebe. 2011. I lie to myself that i have freedom in my own schedule: productivity tools and experiences of busyness. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems ACM, 905–914. [Google Scholar]
  • 50.Lim Julian and Dinges David F. 2008. Sleep deprivation and vigilant attention. Annals of the New York Academy of Sciences 1129, 1 (2008), 305–322. [DOI] [PubMed] [Google Scholar]
  • 51.Lim Julian and Dinges David F. 2010. A meta-analysis of the impact of short-term sleep deprivation on cognitive variables. Psychological bulletin 136, 3 (2010), 375. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52.Mark Gloria, Iqbal Shamsi T, Czerwinski Mary, and Johns Paul. 2014a. Bored mondays and focused afternoons: The rhythm of attention and online activity in the workplace. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems 3025–3034. [Google Scholar]
  • 53.Mark Gloria, Wang Yiran, and Niiya Melissa. 2014b. Stress and multitasking in everyday college life: an empirical study of online activity. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems 41–50. [Google Scholar]
  • 54.Matic Aleksandar, Pielot Martin, and Oliver Nuria. 2015. Boredom-computer interaction: Boredom proneness and the use of smartphone. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing ACM, 837–841. [Google Scholar]
  • 55.Min Jun-Ki, Doryab Afsaneh, Wiese Jason, Amini Shahriyar, Zimmerman John, and Hong Jason I. 2014. Toss’n’turn: smartphone as sleep and sleep quality detector. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems ACM, 477–486. [Google Scholar]
  • 56.Monk Timothy H. 2005. The post-lunch dip in performance. Clinics in sports medicine 24, 2 (2005), e15–e23. [DOI] [PubMed] [Google Scholar]
  • 57.Monk Timothy H, Buysse Daniel J, Reynolds Charles F, and Kupfer David J. 1996. Circadian determinants of the postlunch dip in performance. Chronobiology international 13, 2 (1996), 123–133. [DOI] [PubMed] [Google Scholar]
  • 58.Moturu Sai T, Khayal Inas, Aharony Nadav, Pan Wei, and Pentland Alex. 2011. Using social sensing to understand the links between sleep, mood, and sociability. In Privacy, Security, Risk and Trust (PASSAT) and 2011 IEEE Third International Conference on Social Computing (SocialCom) IEEE, 208–214. [Google Scholar]
  • 59.Murnane Elizabeth L, Abdullah Saeed, Matthews Mark, Choudhury Tanzeem, and Gay Geri. 2015. Social (media) jet lag: how usage of social technology can modulate and reflect circadian rhythms. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing ACM, 843–854. [Google Scholar]
  • 60.O’Brien Danny. 2004. Life Hacks: Tech Secrets of Overprolific Alpha Geeks. In Emerging Tech Conference. [Google Scholar]
  • 61.Oginska Halszka and Pokorski Janusz. 2006. Fatigue and mood correlates of sleep length in three age-social groups: School children, students, and employees. Chronobiology international 23, 6 (2006), 1317–1328. [DOI] [PubMed] [Google Scholar]
  • 62.Oulasvirta Antti, Rattenbury Tye, Ma Lingyi, and Raita Eeva. 2012. Habits make smartphone use more pervasive. In Personal & Ubiquitous Computing, Vol. 16 105–114. [Google Scholar]
  • 63.Patel Sanjay R, Ayas Najib T, Malhotra Mark R, White David P, Schernhammer Eva S, Speizer Frank E, Stampfer Meir J, and Hu Frank B. 2004. A prospective study of sleep duration and mortality risk in women. SLEEP 27, 3 (2004), 440–444. [DOI] [PubMed] [Google Scholar]
  • 64.Pielot Martin, Oliveira Rodrigo de, Kwak Haewoon, and Oliver Nuria. 2014. Didn’t you see my message?: predicting attentiveness to mobile instant messages. In Proceedings of the 32nd annual ACM conference on Human factors in computing systems ACM, 3319–3328. [Google Scholar]
  • 65.Pielot Martin, Dingler Tilman, Pedro Jose San, and Oliver Nuria. 2015. When attention is not scarce-detecting boredom from mobile phone usage. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing ACM, 825–836. [Google Scholar]
  • 66.Poushter Jacob. 2016. Smartphone Ownership and Internet Usage Continues to Climb in Emerging Economies. Pew Research Center. [Google Scholar]
  • 67.Rahmati Ahmad, Tossell Chad, Shepard Clayton, Kortum Philip, and Zhong Lin. 2012. Exploring iPhone usage: the influence of socioeconomic differences on smartphone adoption, usage and usability. In Proceedings of the 14th international conference on Human-computer interaction with mobile devices and services 11–20. [Google Scholar]
  • 68.Roach Gregory D, Dawson Drew, and Lamond Nicole. 2006. Can a Shorter Psychomotor Vigilance Task Be Used as a Reasonable Substitute for the Ten-Minute Psychomotor Vigilance Task? Chronobiology international 23, 6 (2006), 1379–1387. [DOI] [PubMed] [Google Scholar]
  • 69.Roenneberg Till. 2012. Internal time: Chronotypes, social jet lag, and why you’re so tired. Harvard University Press. [Google Scholar]
  • 70.Roenneberg Till. 2013. Chronobiology: the human sleep project. Nature 498, 7455 (2013), 427–428. [DOI] [PubMed] [Google Scholar]
  • 71.Roenneberg Till, Allebrandt Karla V, Merrow Martha, and Vetter Céline. 2012. Social jetlag and obesity. Current Biology 22, 10 (2012), 939–943. [DOI] [PubMed] [Google Scholar]
  • 72.Roenneberg Till, Keller Lena K, Fischer Dorothee, Matera Joana L, Vetter Céline, and Winnebeck Eva C. 2015. Human Activity and Rest In Situ. Methods in enzymology 552 (2015), 257–283. [DOI] [PubMed] [Google Scholar]
  • 73.Roenneberg Till, Kuehnle Tim, Juda Myriam, Kantermann Thomas, Allebrandt Karla, Gordijn Marijke, and Merrow Martha. 2007. Epidemiology of the human circadian clock. Sleep medicine reviews 11, 6 (2007), 429–438. [DOI] [PubMed] [Google Scholar]
  • 74.Roenneberg Till, Kuehnle Tim, Pramstaller Peter P, Ricken Jan, Havel Miriam, Guth Angelika, and Merrow Martha. 2004. A marker for the end of adolescence. Current Biology 14, 24 (2004), R1038–R1039. [DOI] [PubMed] [Google Scholar]
  • 75.Roenneberg Till, Wirz-Justice Anna, and Merrow Martha. 2003. Life between clocks: daily temporal patterns of human chronotypes. Journal of biological rhythms 18, 1 (2003), 80–90. [DOI] [PubMed] [Google Scholar]
  • 76.Schmidt Christina, Collette Fabienne, Cajochen Christian, and Peigneux Philippe. 2007. A time to think: circadian rhythms in human cognition. Cognitive Neuropsychology 24, 7 (2007), 755–789. [DOI] [PubMed] [Google Scholar]
  • 77.Shin Choonsung, Hong Jin-Hyuk, and Dey Anind K. 2012. Understanding and prediction of mobile application usage for smart phones. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing ACM, 173–182. [Google Scholar]
  • 78.Smith Aaron and Page Dana. 2015. The Smartphone Difference. Pew Research Center. [Google Scholar]
  • 79.Smith Mark R and Eastman Charmane I. 2012. Shift work: health, performance and safety problems, traditional countermeasures, and innovative management strategies to reduce circadian misalignment. Nature and science of sleep 4 (2012), 111. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 80.Taylor Daniel J and Bramoweth Adam D. 2010. Patterns and consequences of inadequate sleep in college students: substance use and motor vehicle accidents. Journal of Adolescent Health 46, 6 (2010), 610–612. [DOI] [PubMed] [Google Scholar]
  • 81.Thomée Sara, Härenstam Annika, and Hagberg Mats. 2011. Mobile phone use and stress, sleep disturbances, and symptoms of depression among young adults-a prospective cohort study. BMC public health 11, 1 (2011). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 82.Thornton Bill, Faires Alyson, Robbins Maija, and Rollins Eric. 2015. The Mere Presence of a Cell Phone May be Distracting. Social Psychology (2015). [Google Scholar]
  • 83.Van Dongen HPA and Dinges David F. 2005. Circadian rhythms in sleepiness, alertness, and performance. In Principles and practice of sleep medicine. 435–443. [Google Scholar]
  • 84.Vetter Céline, Juda Myriam, and Roenneberg Till. 2012. The influence of internal time, time awake, and sleep duration on cognitive performance in shiftworkers. Chronobiology international 29, 8 (2012), 1127–1138. [DOI] [PubMed] [Google Scholar]
  • 85.Rideout Victoria J, Foehr Ulla G, and Roberts Donald F. 2010. Generation M2: Media in the lives of 8- to 18-year-olds. Kaiser Family Foundation (2010). [Google Scholar]
  • 86.Wagner Ullrich, Gais Steffen, Haider Hilde, Verleger Rolf, and Born Jan. 2004. Sleep inspires insight. Nature 427, 6972 (2004), 352–355. [DOI] [PubMed] [Google Scholar]
  • 87.Xu Ye, Lin Mu, Lu Hong, Cardone Giuseppe, Lane Nicholas, Chen Zhenyu, Campbell Andrew, and Choudhury Tanzeem. 2013. Preference, context and communities: a multi-faceted approach to predicting smartphone app usage patterns. In Proceedings of the 2013 International Symposium on Wearable Computers. ACM, 69–76. [Google Scholar]
  • 88.Yan Tingxin, Chu David, Ganesan Deepak, Kansal Aman, and Liu Jie. 2012. Fast app launching for mobile devices using predictive user context. In Proceedings of the 10th international conference on Mobile systems, applications, and services ACM, 113–126. [Google Scholar]