Abstract
Passive sensing technology shows promise in capturing elements of adolescent mental health. However, research testing whether there is a signal between adolescents’ self-reports of same-day mental health indicators and passively sensed variables has yielded inconsistent results, particularly for metrics pertaining to digital behaviors. Moreover, little is known about whether adolescent participation in passive sensing research is biased with respect to demographics and general metrics of mental health. The current research addressed these aims among adolescents recruited from a large and diverse sample participating in an ongoing longitudinal study. Adolescents (aged 15–20; N = 131) participated in a 90-day passive sensing study, which collected data on both digital (keystroke, app usage) and offline (sleep, physical activity) behaviors. Although correlations indicated a small signal between same-day mental health indicators and several passively sensed variables (e.g., proportion of typed negative words, call behaviors), associations typically disappeared when disaggregating between- from within-person associations. Additionally, participation uptake was low, but there was little evidence of bias in participation or data coverage based on mental health risk or demographics. Results demonstrate the feasibility of collecting passive sensing data with a diverse sample of adolescents, but barriers remain regarding adolescents’ willingness to engage in this research and the strength of the signal between passively sensed variables and self-report constructs.
Keywords: Adolescence, Mental Health, Passive Sensing, Digital Technology, Smartphone
General Scientific Summary
Passive sensing technologies are potentially transformative in understanding adolescent mental health; however, the extent to which passively sensed data maps onto self-report constructs and the degree of bias in passive sensing samples requires further study. This study found a small and inconsistent signal between passively sensed behaviors and self-reported mental health constructs. The recruited sample, although small, was not systematically biased on demographics or mental health characteristics.
Mental health symptoms among U.S. adolescents are increasing (Keyes et al., 2019), and experts are searching for novel ways to respond to what is perceived to be a mental health crisis (Uhlhaas et al., 2023). Adolescents engage frequently with digital devices, such as smartphones (Faverio & Sidoti, 2024), raising the possibility that digital mental health assessments could be tailored to meet adolescents in the online spaces where they are spending much of their time (Nisenson et al., 2021). Passive sensing offers a potentially transformative method for collecting real-time metrics on markers of mental health risk (Modecki et al., 2019). These methods provide opportunities to capture both offline (e.g., sleep, physical activity) and online experiences (e.g., time spent, types of online engagement), which may be linked to mental health.
The current research reports on a 90-day passive sensing study in which adolescents were recruited from an existing large and diverse sample. First, we tested whether adolescents’ reports of metrics relevant to mental health (positive affect, negative affect, sleep quality) were related to same-day offline and online passively sensed behaviors, allowing us to test for a signal between passively sensed data and the mental health constructs with which they are predicted to be associated. Second, we leveraged the existing sampling frame to test for bias in passive sensing study participation. We asked whether adolescents who belonged to more vulnerable subgroups based on their socio-economic status, prior mental health status, and demographic features were less likely to participate in passive sensing. Understanding potential bias in participation is important as it informs which adolescents may be most likely to be included, and eventually to benefit from, studies that rely on advances in passive sensing.
Passive Sensing to Capture Offline and Online Behaviors Relevant for Mental Health
The current research employed the Effortless Assessment Research System app (EARS) to measure a host of online and offline adolescent behaviors (Lind et al., 2018, 2023). Passively sensed online behaviors included keystrokes (i.e., total words and proportion of self-referential, absolute, negative, and positive words transmitted), call status (i.e., incoming and outgoing call duration, number of calls placed or received), and app usage (i.e., total time spent on apps). Passively sensed offline experiences included motion (i.e., sleep duration, physical activity, stationary time) and GPS (i.e., time traveled, time at home).
Most passive sensing research examining mental health outcomes among young people has focused on college student and young adult samples (Beames et al., 2024). Less attention has been paid to middle-to-late adolescence, a period in which the onset of numerous mental disorders rises and peaks (Uhlhaas et al., 2023). It is critical to target those in this adolescent stage to help inform prevention efforts. Through unobtrusive sensing methods, just-in-time interventions can be deployed to provide support to adolescents experiencing mental health difficulties (Coppersmith et al., 2022). Targeting typically developing adolescents is necessary to successfully implement prevention efforts, as passive sensing studies focusing on clinical subgroups may miss at-risk adolescents who are not yet showing heightened symptomology.
Passive sensing studies among young people indicate that associations between general social communication features (i.e., call status; smartphone screentime) and mental health are inconsistent (Beames et al., 2024). Among adolescents, a within-person study found that daily outgoing calls are associated with greater same-month (but not next-month) anxiety and depressive symptoms, and daily incoming calls are associated with greater same-day and next-day anxiety symptoms (Rodman et al., 2021). In contrast, a between-person study found that outgoing calls are associated with lower depressive symptoms (MacLeod et al., 2021). Likewise, smartphone screen time has been associated with poorer mental health indicators in between-person (MacLeod et al., 2021; Marin-Dragu et al., 2023) but not necessarily within-person research (Rodman et al., 2024). Differences in analytic approach, such as adopting a within-person versus a between-person design, could explain these conflicting results (Fisher et al., 2018).
Given the heterogeneity of digital experiences (Maheux et al., 2025), broad assessments of online behaviors should be complemented with specific assessments of how adolescents are utilizing technologies, such as by measuring specific content transmitted. Keystroke data may be especially valuable. A 90-day study with adolescents found that typing a greater proportion of positive words predicted more positive next-day mood, and typing a greater proportion of negative words predicted more negative next-day mood; in addition, baseline internalizing symptoms predicted typing a greater proportion of negative words, a lower proportion of positive words, and greater first-person pronoun use (Li et al., 2023). Similar patterns have been observed among adult samples (Byrne et al., 2021; Liu et al., 2022), although other work with adolescents suggests that these associations are not always consistent (Funkhouser et al., 2023; McNeilly et al., 2023). Greater use of absolutist words has been linked with internalizing symptoms in research examining online forums (Al-Mosaiwi & Johnstone, 2018), although this association was not observed in social app usage (Funkhouser et al., 2023).
Among young people in general, shorter sleep duration and poorer sleep quality are linked with greater internalizing symptomology (Beames et al., 2024), with rigorous smartphone passive sensing research supporting these linkages among adolescents at the daily-level (Šutić & Novak, 2024). In addition, objectively-measured physical activity is associated with better mental health; however, the existing body of research is mainly cross-sectional, and fewer studies have examined this relation among adolescents (Gianfredi et al., 2020). In one exception featuring a small clinical sample (N = 22) of late adolescents, daily vigorous physical activity was linked with lower daily stress and higher daily positive affect (Baynham et al., 2025).
Although a growing body of research has applied passive sensing methods among typically developing adolescents, there is still much to be learned about how offline and online experiences relate to adolescent mental health. The existing literature is often inconsistent and/or relies on small, selective samples. One source of these inconsistencies could be how passively sensed data are examined analytically, as some studies examine within-person linkages whereas others aggregate passively sensed data, thereby exploring between-person linkages. Identical effects are often not observed at the between- and within-person level (Fisher et al., 2018), necessitating the use of rigorous, multi-day studies to carefully parse these associations and understand two distinct questions. First, by examining within-person associations, we can use each adolescent as their own control and test whether reporting higher levels relative to their own average on a focal predictor predicts passively sensed behavior measured that day. In this study, we tested whether an adolescent’s morning-reported indicators of positive affect, negative affect, and sleep quality predicted their same-day passively sensed behaviors. Second, by testing between-person associations, we can examine how adolescents who report higher levels of affect and sleep quality across the 90-day observational period, relative to their peers, differentially engage in passively sensed behaviors.
An advantage of this study is that we had daily self-reports from the adolescents captured each morning alongside their passive sensing data. This allowed us to test whether adolescents’ morning affect and sleep quality predicted a variety of behaviors as recorded throughout the day. Considering directionality is particularly important in the context of passively sensed digital behaviors (i.e., keystroke, app usage, call status). It is often assumed that the relation between digital behavior and mental health is unidirectional, in that greater engagement in a particular digital activity predicts poorer mental health (for a review, see Odgers & Jensen, 2020). However, longitudinal research calls into question directionality (Rodman et al., 2024; Hancock et al., 2022), and emerging research suggests that pre-existing mental health status is predictive of how individuals engage with digital spaces (Kelly & Sharot, 2025). Tests of how mental health indicators predict subsequent online behaviors and passively sensed behaviors are needed.
Considering Sampling Bias in Adolescent Passive Sensing Studies
Passive sensing research requires unique privacy and time expectations from participants, as participants must be willing to provide access to personal data for weeks or months (Cooper et al., 2023). Studies have illustrated the promise in recruiting adolescents who are willing to participate in such studies (Domoff et al., 2021; Elavsky et al., 2022; Hamilton et al., 2024; Kadirvelu et al., 2023; Wade et al., 2021). However, it remains unclear what bias, if any, is present in adolescent samples who self-select to participate in passive sensing research. Missingness is a documented issue in digital mental health interventions (Goldberg et al., 2021). As with any study, data can be missing completely at random (i.e., missingness not contingent on observed or unobserved variables), missing at random (i.e., missingness contingent on an observed variable), or missing not at random (i.e., missingness contingent on an unobserved variable), with data missing not at random particularly problematic.
Existing studies tell us what types of participants are likely to experience data attrition over the course of a study. Among 179 adolescents followed over six months, missingness on both daily survey and passive sensing data was generally not significantly associated with baseline mental health and sociodemographic indicators, except that lifetime non-suicidal self-injury predicted missingness on suicidal ideation self-reports and Black participants had higher missingness on GPS data compared to other participants (Bloom et al., 2024). In a meta-study, Black adults had higher rates of accelerometer missingness compared to white adults; otherwise, there was little difference in missingness on GPS or accelerometer data by key demographics (e.g., gender, age, education; Kiang et al., 2021). These studies do not speak to who, or which types of participants, are more likely to opt in to passive sensing studies. One study explored this by utilizing data from the ABCD study, finding that adolescents who opt in to passive sensing (30% of the invited sample) are slightly older, more likely to be female, more likely to be white, and less likely to be Hispanic (Alexander et al., 2024). Additional research is needed to systematically explore this potential bias and inform which populations of adolescents may be best positioned to benefit from advances in passive sensing technologies, and whether certain adolescents may be left out.
Because the sample in the current research was drawn from a prospective longitudinal study, we are uniquely positioned to test predictors of study participation and missingness. This is particularly noteworthy for study participation, as we can examine sampling bias on variables that are often unobserved for those who were invited to participate but declined. In addition to key socio-demographics (age, gender, socioeconomic status, race/ethnicity, urbanicity), we tested how participation may be linked to previous assessments of mental health risk, as past research suggests that study attrition is higher among those at greater risk (Dupuis et al., 2019). Individuals with greater mental health concerns report lower energy and greater fatigue and may perhaps be more hesitant to share sensitive data; in turn, this reduces their willingness to participate in research (Bixo et al., 2021). As a result, lengthy passive sensing studies may be particularly daunting. Finally, we tested predictors of data coverage (i.e., the extent to which passive sensing data was missing) for those who opted in to the passive sensing study.
The Current Study
This study advances the literature in several ways. First, we aimed to replicate and extend existing adolescent passive sensing studies by carefully assessing between- and within-person associations between morning-reported affect and sleep quality with passively sensed metrics of daily online and offline experiences. To date, many passive sensing studies have aggregated data across measurement points (e.g., MacLeod et al., 2021) or examined within-person associations at coarser intervals (e.g., monthly associations; Rodman et al., 2021, 2024). Fine-grained associations for some passive metrics (e.g., keystroke; Funkhouser et al., 2023; Li et al., 2023; McNeilly et al., 2023) have been tested but are inconsistent, necessitating more research to understand how these associations may unfold. Second, we advance the literature by testing reverse directionality, in that morning-recorded indicators relevant to mental health predict offline and online behaviors as assessed that day. This is an important direction given that mental health is more often studied as an outcome of these behaviors, rather than a predictor. Third, we leveraged an existing data set to systematically test how demographics and mental health risk may relate to passive sensing study participation and data coverage. A comprehensive understanding of this bias in the context of adolescent passive sensing research is still needed (but see Alexander et al., 2024), as this can inform what risk of bias, if any, there is among adolescents who opt into these studies and how data coverage may vary depending on the constructs that researchers are often most interested in exploring. Our research aims are as follows.
Aim 1: Is there a signal between adolescents’ morning reports of mental health metrics (affect, sleep quality) and same-day passively sensed behaviors?
Aim 2: Do adolescents’ demographic characteristics and mental health indicators predict a) participation in a passive sensing study, and b) passive sensing data coverage?
Method
Participants
Participants were recruited from an ongoing longitudinal study. The first study wave (in 2015) involved 2,104 adolescents who were selected to be representative of public school-attending adolescents in North Carolina on gender, race/ethnicity, and economic disadvantage. Participants in the current research were recruited from those still participating at Wave 4 of the study (in 2020, N=891). Of these adolescents, 808 were invited to participate in the current research, which took place between August 2020 and March 2021. The 808 adolescents who were invited to participate were, on average, 17.09 years old (SD=1.16), 58% female, 58% white, 21% Black/African American, 11% Hispanic, and 10% Multiracial/Other. Forty-seven percent were classified as economically disadvantaged.
Participants were contacted using information (phone numbers, email addresses, mailing addresses) provided during Wave 4, with parents providing adolescent contact information for those who had since turned 18 (53% of the sample). Participants were contacted approximately four times with an invitation to participate, with recruitment taking place in seven batches from July 2020 through December 2020; recruitment was done in batches to ensure proper onboarding and monitoring for each participant. Three full-time staff were available for recruitment and onboarding, with staff assisting with the assent/consent procedure and providing materials (i.e., instructional documents, troubleshooting videos) to aid EARS app installation and sensor completion. Personalized consultations were provided when needed. To help with privacy concerns, during the consent and onboarding process, participants were assured that only deidentified data would be analyzed, that no information typed into secure fields (e.g., credit card information) would be collected, and that they could drop out of the study at any point. Study procedures were approved by the Duke University Institutional Review Board. Informed consent was obtained for all participants and, for those under 18, their parents/legal guardians. Participants were compensated $10 for installing the app, $10 for each month of participation (totaling $30), and $10 for completing at least 80% of the survey prompts, for a possible total payment of $50. Data are not publicly available due to the sensitive nature of the study. We report how we determined our sample size, all data exclusions, all manipulations, and all relevant measures in the study.
In all, 192 adolescents installed the EARS app, 131 uploaded data for at least one month, and 126 participants uploaded data over the 90-day period (Figure 1). The supplement provides information on initial recruitment, demographics of follow-up waves, and how the Wave 4 sample (from which the passive sensing participants were recruited) differed from the original population representative sample; overall, any significant differences were small (Table S1).
Figure 1. Flow Chart of Participant Retention
Measures
EARS
Full sensor information is in Table 1. A key challenge of passive mobile sensing methods is integrating data from different time scales (e.g., 4,000,000 accelerometer points per day versus sporadic call logs) and from different platforms (Android and iOS); once these data are integrated, meaningful features must be derived from them to suit the research question. In the present study, the EARS app developer, Ksana Health, cleaned and aggregated the raw sensor data and performed the data integration and feature extraction with guidance from the research team, then provided the resulting data to the research team.
Table 1.
Sensor Information
| | Keyboard | Motion | GPS | Survey | Call Status | App Use |
|---|---|---|---|---|---|---|
| Constructs | Total Words; Self-referential Words; Negative Words; Positive Words; Absolute Words | Sleep Duration; Cycling; Walking; Running; Stationary | Location; Distance Traveled; Time Traveled; Time at Home; Travel Events | Sleep Quality; Pos Affect; Neg Affect | Incoming Call Duration; Outgoing Call Duration; Number of Calls | Minutes App on Phone Foreground |
| **Average number of days with sensor data among those with 1+ month completion** | | | | | | |
| Mean | 42.82 | 68.78 | 74.98 | 69.46 | 41.56 | 86.92 |
| SD | 36.30 | 32.46 | 23.56 | 23.81 | 28.97 | 12.11 |
| Minimum | 0 | 0 | 0 | 1 | 0 | 35 |
| Maximum | 90 | 90 | 90 | 90 | 90 | 90 |
| **All Installs (n=192)** | | | | | | |
| Android (n=44) | 68% | 80% | 73% | 72% | -- | 81% |
| iOS (n=148) | 16% | 46% | 54% | 49% | 65% | -- |
| **1+ Month Completion (n=131)** | | | | | | |
| Android (n=36) | 81% | 96% | 87% | 86% | -- | 97% |
| iOS (n=95) | 24% | 69% | 82% | 74% | 88% | -- |

Note. Values distinguishing Android from iOS users for Keyboard, Motion, GPS, Survey, and App Use represent the average percentage of study days (out of 90) on which sensor and feature data were collected. Values for Call Status represent the percent of total participants for whom data were collected. Differences between iOS and Android users were significant at p < .05 for every comparison, except GPS among those who completed at least one month of data collection (p = .265).
Keystroke.
The EARS keyboard logged all information typed, except for information typed in secure fields (e.g., passwords, credit card information). Variables were extracted for keystroke data typed in social media and messaging apps. The Android and iOS keyboards differed: iOS users were required to use a custom keyboard, whereas the Android keyboard adapted the existing keyboard. The raw text data were analyzed via dictionary-based methods to derive the following counts per day: total words, self-referential words (e.g., “I”, “I’m”), negative words (e.g., “mad”, “tired”), positive words (e.g., “love”, “happy”), and absolute words (e.g., “always”, “never”; Byrne et al., 2021). The self-referential dictionary was based on the LIWC dictionary of first-person singular pronouns (Tausczik & Pennebaker, 2010). The “bing” dictionary was used to capture affective words (i.e., positive and negative words; Liu, 2012). The absolutist dictionary was based on work by Al-Mosaiwi and Johnstone (2018). Daily proportions of self-referential, negative, positive, and absolute words were then calculated. As in past research (McNeilly et al., 2023), we excluded daily cases if fewer than thirty words were captured. In a study of adolescent typed text captured by EARS, McNeilly and colleagues (2023) found that 30 words was the threshold at which daily proportions of language features stabilized; days with fewer than 30 words produced more outlying proportions (see detailed analysis in Supplement A of McNeilly et al., 2023).
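As an illustration, the dictionary-based feature extraction described above can be sketched as follows. This is a minimal sketch: the word lists shown are tiny stand-ins for the actual LIWC, “bing,” and absolutist dictionaries, and the function name is for illustration only.

```python
# Minimal illustrative sketch of dictionary-based language features.
# The word sets below are tiny stand-ins for the LIWC first-person pronoun
# list, the "bing" sentiment lexicon, and the absolutist dictionary.
SELF_REFERENTIAL = {"i", "i'm", "me", "my"}
NEGATIVE = {"mad", "tired", "sad"}
POSITIVE = {"love", "happy"}
ABSOLUTE = {"always", "never"}

MIN_WORDS = 30  # days with fewer typed words are excluded (McNeilly et al., 2023)

def daily_language_features(words):
    """Return daily word-proportion features, or None if below the 30-word threshold."""
    words = [w.lower() for w in words]
    total = len(words)
    if total < MIN_WORDS:
        return None  # exclude this day from analysis
    return {
        "total_words": total,
        "prop_self_referential": sum(w in SELF_REFERENTIAL for w in words) / total,
        "prop_negative": sum(w in NEGATIVE for w in words) / total,
        "prop_positive": sum(w in POSITIVE for w in words) / total,
        "prop_absolute": sum(w in ABSOLUTE for w in words) / total,
    }
```

A day with 30 captured words of which 3 are positive would, for example, yield a positive-word proportion of .10.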
Motion and sleep.
Using the phone’s existing motion sensors and Android and iOS’s built-in motion-sensing algorithms, the EARS app measured daily duration of cycling, walking, and running events (summed as a measure of physical activity). The built-in motion-sensing algorithms also measured the duration of stationary events, including sleep.
GPS.
Using the phone’s motion capabilities and location services, the following variables were collected at the daily level: number of GPS location captures, distance traveled, time spent traveling between stop locations, time spent at home, furthest point from home location, and number of travel events. These variables were computed based on the participant’s home location, which was the most prevalent location between the hours of 2:00am and 6:00am.
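The home-location step described above can be sketched as follows. This is a hedged illustration, not the study’s exact pipeline: home is taken as the modal location among GPS captures between 2:00am and 6:00am, and rounding coordinates to three decimal places (roughly 100m) to bin nearby captures is an assumption made here for simplicity.

```python
# Illustrative sketch: infer home as the most prevalent location among
# GPS captures recorded between 2:00am and 6:00am. The coordinate-rounding
# bin size is an assumption for illustration.
from collections import Counter

def infer_home(captures, precision=3):
    """captures: list of (hour, lat, lon) tuples for one participant.
    Returns the modal (lat, lon) bin among nighttime captures, or None."""
    night = [
        (round(lat, precision), round(lon, precision))
        for hour, lat, lon in captures
        if 2 <= hour < 6  # restrict to the 2:00am-6:00am window
    ]
    if not night:
        return None
    return Counter(night).most_common(1)[0][0]
```

Daily features such as time at home or distance traveled can then be computed relative to the inferred home bin.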
Daily survey.
Participants completed a 3-item survey each morning assessing their sleep quality and mood. Surveys were available starting at 7:00am and ending at 12:00pm, with notifications prompting survey completion. Items were randomized across days to reduce survey burden, with participants reporting on two of eight possible affect items each day (one positive, one negative). Participants reported on their sleep quality every day. Items assessing positive affect included accepted by others, good about the self, optimistic, and respected. Items assessing negative affect included sad, anxious, lonely, and worried. Response scales ranged from 1–5. Single items are commonly used in intensive longitudinal research to minimize survey burden, with the benefits of multiple items often only modest (Song et al., 2023).
Call Status (iOS Only).
For iOS users only, the EARS app captured the call log for a given day, including the frequency of incoming and outgoing calls, and the duration of incoming and outgoing calls. Call information was only captured on a given day if at least one call was initiated or received; given that it is unclear whether missingness reflects no calls placed or a sensing error, analyses with this variable are limited to the first research aim.
App Usage (Android Only).
Duration of app usage was available only for Android users, due to iOS privacy protections. The EARS app pulled information logged by the Android operating system assessing the daily amount of time a given app was in the foreground.
Data from Baseline (2015)
Demographics.
Participants self-reported age at all waves. Geographical features (urban versus rural) and neighborhood income were computed from census tract data from the initial wave of the study (2015). Adolescents were grouped into five income groups (1=Lowest income; 5=Highest income). Race/ethnicity and binary gender as self-reported at Wave 1 were used.
Administrative Data.
Administrative data from school records including economic disadvantage and test scores (reading and math) were available from the initial wave of the study (2015).
Mental Health Data from Wave 4 (2020)
The following measures were collected, on average, six months prior to participants’ EARS start date as part of the Wave 4 survey collection for the larger study.
Internalizing Problems.
Participants responded to six items measuring past month psychological distress, using a 1 (None of the time) to 5 (All of the time) scale (Kessler et al., 2002). Sample items include “nervous” and “hopeless” (α=.88). Item response theory model testing confirms the strong psychometric properties of this measure (Kessler et al., 2002).
Externalizing Problems.
Participants responded to 25 items assessing conduct problems (Farrell et al., 2000), including aggression and deviant behavior over the past month (e.g., “Thrown something at someone to hurt them”). The response scale ranged from 0 (Never) to 5 (20 or more times; α=.88). Past research has demonstrated the factorial and convergent validity of this scale (Farrell et al., 2000).
Stressful Life Events.
Participants reported whether they experienced 13 different stressful life events (e.g., “The breakup of a close friendship”; Merikangas et al., 2009). Participants received a score of ‘1’ for each item they endorsed, with items summed to reflect higher levels of stressful life events. The measure is based on best practices suggesting checklists to indicate the presence or absence of a particular experience, with items comprising face valid experiences commonly asked when assessing youth stress (Merikangas et al., 2009).
Perceived Technological Impairment.
Participants responded to six items assessing their perceived technological impairment using a 0 (Never) to 2 (Often) scale (Burnell & Odgers, 2023). A sample item includes, “Do you find it difficult to stop using technology, such as the internet or your cell phone, once you start?” (α=.78).
Analytic Plan
Aim 1 was addressed in two ways. First, we report on bivariate correlations between daily survey data and passively sensed data. Second, we ran random intercept multilevel models in which adolescents’ morning reports of positive affect, negative affect, and sleep quality predicted the same-day passively sensed metric of interest. These metrics were: total words typed, proportion of self-referential words, negative words, positive words, and absolute words; total app usage time, call frequency, call duration, stationary time, and time spent engaging in physical activity. Each predictor and outcome variable were modeled individually, resulting in 30 distinct models. A similar analysis was run for sleep, except that prior night’s sleep predicted morning-recorded affect and sleep quality (resulting in three additional models). The predictor of interest was person-centered at the within-person level (Curran & Bauer, 2011); additionally, the predictor was averaged across the 90-day study period and included as a grand-mean centered predictor at the between-person level. All multilevel models were run in MPlus (Version 8.11; Muthén & Muthén, 1998–2017), and Full Information Maximum Likelihood was used to adjust for missingness on predictor variables. Robust standard error estimation was used. Given the large number of models, the p-value for interpreting significance was set to .01.
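The predictor decomposition described above can be sketched as follows. This is a minimal illustration of the centering logic (the models themselves were estimated in MPlus); the data-frame layout and column names are assumptions for the example.

```python
# Illustrative sketch of within/between decomposition of a daily predictor:
# within-person = that day's value minus the adolescent's own mean;
# between-person = the adolescent's mean, centered at the grand mean of
# person means. Column names here are illustrative.
import pandas as pd

def decompose_predictor(df, person_col, value_col):
    """Add within-person-centered and grand-mean-centered between-person columns."""
    df = df.copy()
    person_means = df.groupby(person_col)[value_col].transform("mean")
    df[value_col + "_within"] = df[value_col] - person_means
    # Grand mean taken over one mean per person, not over all daily rows
    grand_mean = df.groupby(person_col)[value_col].mean().mean()
    df[value_col + "_between"] = person_means - grand_mean
    return df
```

The within-person column then captures daily deviations from each adolescent’s own average, while the between-person column captures stable differences among adolescents.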
For Aim 2, we examined a) how those who did not participate in the passive sensing study differed from those who consented and installed the EARS app, b) how those who consented and installed the app differed from those who consented but did not install, and c) how those who installed the app but had 30 days or fewer of data differed from those who participated for the full 90 days. T-tests were used to examine how the two groups in each comparison differed based on demographic characteristics (age, gender, socioeconomic status [based on neighborhood income and economic disadvantage], urbanicity, academic achievement) and mental health risk (internalizing problems, externalizing problems, stressful life events, perceived technological impairment). For those who installed the EARS app, we conducted t-tests, ANOVAs, and bivariate correlations to examine how these same demographic and mental health constructs related to number of days of successful data coverage. Data coverage was defined as the number of days in which data was successfully captured on a given sensor, with a day labeled as no coverage if there was no sensor data present for that day.
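The coverage metric described above amounts to a simple per-sensor day count, which can be sketched as follows (the data layout is an assumption for illustration):

```python
# Illustrative sketch of the data-coverage metric: coverage for a sensor is
# the number of study days (out of 90) with any captured sensor data.
def coverage_days(daily_records, study_days=90):
    """daily_records: dict mapping day index (1..study_days) to that day's
    sensor payload; a missing, None, or empty payload counts as no coverage."""
    return sum(
        1 for day in range(1, study_days + 1)
        if daily_records.get(day)  # truthy payload counts as covered
    )
```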
Results
Aim 1: Do adolescents’ reports of morning affect and sleep quality predict same-day passively sensed behaviors?
Full results are in Tables S2–S5. Bivariate correlations are reported across all study days (not accounting for nesting) in the within-person column. Several significant correlations emerged; however, most were quite small (r<.10), suggesting a small signal between the survey and passive sensing data. Correlations in which the passive sensing and survey data are aggregated across a participant’s study days are reported in the between-person column. For the multilevel modeling analyses, very few within-person associations emerged that met our significance threshold of p<.01. On days in which adolescents reported greater sleep quality than their own average, they logged lower stationary time (β=-.09). On nights in which adolescents logged longer sleep duration than their own average, they reported greater sleep quality in the morning (β=.16). Several between-person associations emerged. Adolescents who reported greater positive affect over the course of the passive sensing period logged a lower proportion of negative typed words (β=-.27) and engaged in more call activity (β=.38). Adolescents who logged greater sleep duration over the course of the passive sensing period reported, on average, greater positive affect (β=.27), lower negative affect (β=-.26), and greater sleep quality (β=.26).
Aim 2: Do adolescents’ demographics and mental health indicators predict passive sensing study participation and data coverage?
Among the 808 adolescents who were invited to participate in the passive sensing study, 226 (28%) provided consent/assent to participate (Figure 1). Twenty-three percent of those under 18 consented, compared to 32% of those over 18, χ2 (1)=7.23, p=.007. Of the 226 who provided consent/assent to participate, 192 installed the EARS application on their mobile device (24% of total invites). Of the 192 who installed the app, 131 completed at least 30 days of data collection (16% of total invites). The majority of those who completed 30 days completed the full 90 days (96%, n=126). There were few differences between adolescents who were invited to participate in the passive sensing study but did not, and those who consented to participate and installed the EARS app (Table S6). Those who consented and installed EARS reported fewer stressful life events (d=.22); otherwise, no differences were observed on demographics or mental health risk. Likewise, few differences emerged between those who installed EARS and those who consented to participate but did not install (Table S7). Those who consented but did not install were more likely to be female and to be classified as Multiracial or a race/ethnicity other than white, Black, or Hispanic. They also reported more stressful life events (d=.75). Finally, those who installed EARS but had fewer than 30 days of data had lower reading (d=.39) and math (d=.31) achievement scores than those who participated for the full 90-day period; otherwise, these two groups did not differ (Table S8).
There was a slight decline in successful data capture for most sensors over the 90-day period (Figure 2). Despite a few small differences, there was no strong support for systematic bias in data coverage among the 192 adolescents who installed EARS (Table S9). Girls completed more surveys and had greater GPS coverage. Adolescents with higher academic achievement had greater motion and GPS coverage. Rural adolescents had greater keystroke coverage than urban adolescents. Among Android users, adolescents who lived in more affluent neighborhoods had greater app usage coverage.
Figure 2. Proportion of participants with data coverage across the 90-day study period
Discussion
This study leveraged an existing large and diverse sample of adolescents to address two questions. First, we tested whether there is a signal between adolescents’ morning reports of indicators relevant to mental health (affect, sleep quality) and their passively sensed behaviors. Although a small signal was detected for several behaviors, these associations generally disappeared under a more rigorous statistical approach (i.e., parsing between- from within-person associations). Second, we examined whether adolescents’ demographic characteristics and mental health indicators predicted their participation in a passive sensing study and their data coverage within the study. There was some evidence that more vulnerable adolescents were less likely to participate and/or had lower coverage; however, associations were small and inconsistent.
The Signal between Adolescent Mental Health and Same-Day Passively Sensed Behaviors Varies by Analytic Strategy
Initial bivariate correlations suggested several associations between morning reports of affect and sleep quality and same-day passively sensed behaviors. However, within-person linkages largely disappeared after disaggregating within- from between-person associations, particularly for digital behaviors. Scholars have emphasized the importance of separating these two types of associations (Fisher et al., 2018), and prior digital technology research has found evidence of linkages between digital technology use and mental health at the between- but not within-person level (Stavrova & Denissen, 2021). Indeed, among the digital behavior analyses, the only two robust associations were at the between-person level: adolescents who used a greater proportion of negative words relative to their peers reported lower positive affect, and adolescents who made more calls reported greater positive affect.
The lack of within-person associations pertaining to linguistic features conflicts with previous studies that found a stronger signal between mood and digital communication patterns, although findings in past studies are not always consistent (Funkhouser et al., 2023; Li et al., 2023; McNeilly et al., 2023). For example, McNeilly and colleagues (2023) measured daily mood using a single item assessing general feelings over the last day and utilized an all-female sample. Digital technology experiences vary by gender and other individual characteristics (Maheux et al., 2025), and future research utilizing a larger sample should explore further whether the signal between mental health and passively sensed behaviors changes based on these characteristics. The between-person association between positive affect and call frequency is in line with past between-person research finding that outgoing calls were associated with lower depressive symptoms (MacLeod et al., 2021), perhaps reflecting how well-adjusted individuals typically engage in greater socialization (Mehl et al., 2010). Nonetheless, this finding did not extend to the duration of calls, and there were no within-person linkages observed with call variables, despite past research finding that more calls are associated with poorer well-being at the within-person level (Rodman et al., 2021). It is possible that heterogeneity in call communication may attenuate effects, as associations likely differ depending on whether an adolescent is placing a call to discuss a stressor or another topic (e.g., sharing good news). Finally, there were no within- or between-person associations with overall app usage, in line with past research suggesting minimal associations between broad metrics of digital technology use and well-being (Odgers & Jensen, 2020).
A stronger signal with mental health emerged for the offline behaviors, particularly sleep. Adolescents who reported greater sleep quality than their own average recorded less stationary time that day, and adolescents who recorded greater sleep duration than their own average reported higher sleep quality in the morning. Between-person associations also emerged: adolescents who recorded greater sleep duration relative to their peers reported greater positive affect, lower negative affect, and greater sleep quality over the passive sensing period. These findings are in line with well-documented conclusions on how integral sleep is for mood and well-being (Beames et al., 2024; Šutić & Novak, 2024) and provide support for the utility of passive sensing technologies to assess sleep outcomes. Just-in-time interventions could then be employed to support adolescents who may be struggling with sleep (e.g., sending notifications to prompt bedtime), which in turn could benefit adolescent mental health. The lack of association with physical activity could be due to the broad assessment of the variable; distinguishing types of physical activity (e.g., moderate versus vigorous) may be critical for understanding nuanced associations with mental health (Baynham et al., 2025).
Adolescents’ Demographic Characteristics and Mental Health Indicators are Weak Predictors of Passive Sensing Study Participation and Data Coverage
Only one-quarter of adolescents who were invited to participate in the study installed EARS. This low participation rate is striking, especially given that the current research utilized a research-friendly sample (with adolescents having participated in up to four previous waves). This rate is similar to that observed in the ABCD study, which also employed EARS in an existing sample (Alexander et al., 2024). With most adolescents opting out of participating in a passive sensing study, it is critical to understand if there is bias in those who opt in, which in turn can inform bias in the associations tested in our first aim. Ultimately, adolescents who opted in only differed slightly from the larger pool from which they were recruited, with adolescents who opted in reporting fewer stressful life events than those who did not participate. Likewise, adolescents who consented but did not install EARS reported greater stressful life events than those who consented and installed. This could be driven by several factors. First, the life challenges that these adolescents are navigating may make it difficult for them to commit to a lengthy 90-day study, or to be able to engage in the multitude of steps required to participate (e.g., provide consent/assent, install the app, ensure that sensors are capturing data). Second, these life challenges may make some adolescents hesitant to allow sensitive data access to researchers, as their digital behaviors may reflect communication that details these challenges.
In addition, these findings could reflect greater privacy concerns among certain groups (Hofstra et al., 2016), although markers traditionally associated with research participation hesitancy (e.g., income, race/ethnicity; Alexander et al., 2024) were generally not significant. Interestingly, when examining data coverage, keystroke coverage was higher for rural versus urban adolescents, despite rural participants often being harder to reach. This bears promise for harnessing remote technologies such as passive sensing to recruit these hard-to-reach populations. Collectively, despite only a fraction of eligible adolescents participating in the passive sensing study, these adolescents remained a fairly diverse and heterogeneous group that reflected the pool from which they were initially recruited.
This group’s willingness to engage in the intensive data sharing that characterizes a passive sensing study could be due to several reasons. First, adolescents had an extended relationship dating back five years with our recruitment team, a relationship that we took careful steps to maintain through personal, positive contact. Thus, participants were already familiar with and trusted our recruitment team. Second, this team was heavily involved throughout the consenting, installing, and monitoring process, providing hands-on help as needed. Respect for privacy was articulated at every step, and a fundamental priority for all contact was maintaining a cordial relationship with each individual participant. We believe that these actions were a critical component in maintaining a fairly heterogeneous sample (despite its small size) that included vulnerable adolescents who may be more hesitant about data sharing. Nonetheless, we acknowledge the difficulties in accessing existing sampling frames and maintaining a fairly permanent staff to allow positive relationships to be sustained over time. Notably, participant compensation is unlikely to have played a large role in recruitment and retention, as compensation was fairly modest (up to $50 total). It is plausible that higher compensation may result in a larger, more compliant sample, although this may pose coercion problems if vulnerable populations are of interest.
Finally, there were relatively few predictors of data coverage among the participating sample. Successful data capture declined slightly across sensors over time, possibly indicating study fatigue, and this decline highlights the need for daily monitoring to ensure that uploads are made. A few differences emerged based on demographics and academic achievement: higher test scores predicted greater motion and GPS coverage, higher neighborhood income was associated with greater app usage coverage, and girls had greater GPS and survey coverage (in line with past work; e.g., Dunton et al., 2007). As can often be expected with any study, those who are more conscientious may be more likely to provide extensive and high-quality data. A more important predictor of data coverage was operating system. Android data coverage was superior on nearly all sensors, in line with past research indicating fewer technical issues implementing passive sensing on Android versus iOS devices (Cornet & Holden, 2018). Recent developments in EARS have reduced these differences (Lind et al., 2023); however, given that Android and iOS users systematically differed (Android users were more likely to be male, economically disadvantaged, and to have lower math test scores), special attention to operating system differences is needed in passive sensing studies. Since passive sensing apps are easier to develop for Android (Cornet & Holden, 2018), this may work favorably for gathering data from a diverse population, but capturing both types of users is necessary for representativeness.
Limitations and Conclusions
The data were collected during the COVID-19 pandemic, which may have influenced passively sensed behavior across all metrics (e.g., increased smartphone usage). The GPS sensor may have been particularly affected: travel was restricted due to stay-at-home orders and school closures, so we did not analyze GPS data in relation to mental health. It is also possible that data coverage was altered by the pandemic, if adolescents were more willing to allow data capture while travel was restricted. Prior research suggests a willingness to share location data when the purpose of research is clearly communicated and there is trust in the research team (Kolovson et al., 2020). Because we took these steps, we are more confident that data coverage was not inflated by the pandemic, although additional research comparing coverage during versus after the pandemic is needed to confirm this. Likewise, mobility sensors may have been affected by the pandemic. Although research shows that sports activity declined during this time, overall physical activity increased (Schmidt et al., 2020), suggesting that restrictions on leisure activities did not increase sedentary behavior. Nonetheless, given that the type of physical activity may be differentially associated with mental health indicators (Baynham et al., 2025), future research should further explore these findings.
Although participants were recruited from a large and diverse sample, the sample was not population-representative (see supplement), and estimates of bias may therefore be underestimated; however, differences from a population-representative sample were small. Passive sensing technologies are also vulnerable to frequent updates and changes; indeed, the EARS app as used in the current research has since undergone additional updates and advancements (Lind et al., 2023). Of particular note is the difference in coverage for Android versus iOS devices, with passive sensing traditionally easier to implement on Android devices due to iOS privacy protections (Cornet & Holden, 2018). Keystroke data were difficult to capture on iOS, and some features (i.e., app usage) were only available for Android users, whereas others (i.e., call status) were only available for iOS users. These differences in data coverage suggest that the signal between mental health indicators and passively sensed behaviors may be more comprehensive for Android than iOS users, with Android users comprising only a small fraction of our sample. Systematically testing whether operating system moderates these associations was beyond the scope of this study, given the lopsided makeup of Android versus iOS users; future research should explore whether associations vary.
Despite these limitations, this study advances the literature by exploring two distinct questions. First, we found that a signal between self-reported metrics relevant to mental health and passively sensed behavior is not consistent, particularly for digital behaviors, although some signal was observed. Second, we found that although study uptake in a passive sensing study is low, it is possible to obtain a diverse sample. The unobtrusive nature of passive sensing maximizes the ability to capture heterogeneous behavior over short intervals and has been championed as a key feature of just-in-time interventions that, if properly implemented, could deliver service where needed (Coppersmith et al., 2022). As the field blazes forward with innovations in the collection and analysis of large volumes of streaming data relevant to mental health, it is critical that adolescents, a population in need of services, are not left behind.
Supplementary Material
Funding Support
This research was funded by the National Institute on Drug Abuse, Grant P30DA023026, and a Jacobs Foundation Research Award. Kaitlyn Burnell is currently supported by the Winston Family Foundation.
Conflicts of Interest
Kaitlyn Burnell has served as a paid consultant for ongoing social media litigation.
Monika Lind holds an equity interest in Ksana Health Inc, the company that has the sole commercial license for certain versions of the Effortless Assessment Research System (EARS) app.
Footnotes
CRediT Statement
Kaitlyn Burnell: conceptualization, data curation, formal analysis, investigation, project administration, writing – original draft
Jesus A. Beltran: visualization, writing – review and editing
Monika N. Lind: writing – review and editing
Gillian R. Hayes: funding acquisition, writing – review and editing
Candice L. Odgers: conceptualization, funding acquisition, investigation, methodology, project administration, supervision, writing – review and editing
Analytic Code
Study code and output can be found at https://osf.io/p7c8t/
We did not assess linkages between passively sensed GPS data and mental health outcomes as data collection occurred during the COVID-19 pandemic, when traveling was restricted.
The EARS app collects data on other characteristics (e.g., selfies, battery level); however, due to restrictions from the approving Institutional Review Board, not all features were fully available at the time of this writing.
These 808 adolescents included 803 with valid Wave 4 data (5 did not have valid data). The remaining 88 participants were not recruited as they completed the Wave 4 survey late and could not complete 90 days of data collection within the project’s timeline. Comparisons between the Wave 4 and baseline sample are in the supplement.
Additional analyses were run in which the prior day passive sensing variable was controlled. Results did not change.
Because of technical differences in EARS for iOS versus Android, we compared differences in data coverage, with Android users superior in coverage for nearly every metric (Table 1). Android and iOS users also differed on several demographics and mental health indicators. Females comprised 64% of iOS users and 33% of Android users, χ2 (1)=15.14, p<.001. iOS users were also less likely to be classified as economically disadvantaged (45% versus 65%), χ2 (1)=6.21, p=.013. Android users had lower math test scores (M=452.59 versus 455.99; t(215)=2.04, p=.043; d=.34, 95% CI [.01, .68]).
References
- Al-Mosaiwi M, & Johnstone T (2018). In an absolute state: Elevated use of absolutist words is a marker specific to anxiety, depression, and suicidal ideation. Clinical Psychological Science, 6, 529–542.
- Alexander J, Linkersdörfer J, Toda-Thorne K, Sullivan RM, Cummins KM, Tomko RL, Allen NB, Bagot KS, Baker FC, Fuemmeler BF, Hoffman EA, Kiss O, Mason MJ, Nguyen-Louie TT, Tapert SF, Smith CJ, Squeglia LM, & Wade NE (2024). Passively sensing smartphone use in teens with rates of use by sex and across operating systems. Scientific Reports, 24, 17982. 10.1038/s41598-024-68467-8
- Baynham R, Camargo A, D’Alfonso S, Zhang T, Munoz Z, Davies P, Alvarez-Jimenez M, van Berkel N, Kostakos V, Schmaal L, & Tagliaferri SD (2025). The dynamic association between physical activity and psychological symptoms in young people with major depressive disorder: An active and passive sensing longitudinal cohort study. Early Intervention in Psychiatry, 19, e70018. 10.1111/eip.70018
- Beames JR, Han J, Shvetcov A, Zheng WY, Slade A, Dabash O, Rosenberg J, O’Dea B, Kasturi S, Hoon L, Whitton AE, Christensen H, & Newby JM (2024). Use of smartphone sensor data in detecting and predicting depression and anxiety in young people (12–25 years): A scoping review. Heliyon, 10, e35472.
- Bixo L, Cunningham JL, Ekselius L, Öster C, & Ramklint M (2021). ‘Sick and tired’: Patients reported reasons for not participating in clinical psychiatric research. Health Expectations, 24, 20–29.
- Bloom PA, Lan R, Galfalvy H, Liu Y, Bitran A, Joyce K, Durham K, Porta G, Kirschenbaum JS, Kamath R, Tse TC, Chernick L, Kahn LE, Crowley R, Trivedi E, Brent D, Allen NB, Pagliaccio D, & Auerbach RP (2024). Identifying factors impacting missingness within smartphone-based research: Implications for intensive longitudinal studies of adolescent suicidal thoughts and behaviors. Journal of Psychopathology and Clinical Science. Advance online publication. 10.1037/abn0000930
- Burnell K (2025, June 10). RAISE-UP Methods. Retrieved from osf.io/p7c8t. 10.17605/OSF.IO/P7C8T
- Burnell K, & Odgers CL (2023). Trajectories of perceived technological impairment and psychological distress in adolescents. Journal of Youth and Adolescence, 52, 258–272.
- Byrne ML, Lind MN, Horn SR, Mills KL, Nelson BW, Barnes ML, Slavich GM, & Allen NB (2021). Using mobile sensing data to assess stress: Associations with perceived and lifetime stress, mental health, sleep, and inflammation. Digital Health, 7, 1–11. 10.1177/20552076211037227
- Cooper JRH, Scarf D, & Conner TS (2023). University students’ opinions towards mobile sensing data collection: A qualitative analysis. Frontiers in Digital Health, 5, 1125276. 10.3389/fdgth.2023.1125276
- Coppersmith DDL, Dempsey W, Kleiman EM, Bentler KH, Murphy SA, & Nock MK (2022). Just-in-time adaptive interventions for suicide prevention: Promise, challenges, and future directions. Psychiatry, 85, 317–333.
- Cornet VP, & Holden RJ (2018). Systematic review of smartphone-based passive sensing for health and wellbeing. Journal of Biomedical Informatics, 77, 120–132. 10.1016/j.jbi.2017.12.008
- Curran PJ, & Bauer DJ (2011). The disaggregation of within-person and between-person effects in longitudinal models of change. Annual Review of Psychology, 62, 583–619. 10.1146/annurev.psych.093008.100356
- Domoff SE, Banga CA, Borgen AL, Foley RP, Robinson C, Avery K, & Gentile DA (2021). Use of passive sensing to quantify adolescent mobile device usage: Feasibility, acceptability, and preliminary validation of the eMoodie application. Human Behavior & Emerging Technologies, 3, 63–74. 10.1002/hbe2.247
- Dreier MJ, Low CA, Fedor J, Durica KC, & Hamilton JL (2024). Adolescents’ self-regulation of social media use during the beginning of the COVID-19 pandemic: An idiographic approach. Journal of Technology in Behavioral Science. Advance online publication. 10.1007/s41347-024-00465-z
- Dupuis M, Strippoli M-PF, Gholam-Rezaee M, Preisig M, & Vandeleur CL (2019). Mental disorders, attrition at follow-up, and questionnaire non-completion in epidemiologic research: Illustrations from the CoLaus|PsyCoLaus study. International Journal of Methods in Psychiatric Research, 28, e1805. 10.1002/mpr.1805
- Elavsky S, Blahošová J, Lebedíková M, Tkaczyk M, Tancos M, Plhák J, Sotolář O, & Smahel D (2022). Researching the links between smartphone behavior and adolescent well-being with the FUTURE-WP4 Project: Protocol for an ecological momentary assessment study. JMIR Research Protocols, 11, e35984. 10.2196/35984
- Farrell AD, Kung EM, White KS, & Valois RF (2000). The structure of self-reported aggression, drug use, and delinquent behaviors during early adolescence. Journal of Clinical Child Psychology, 29, 282–292.
- Faverio M, & Sidoti O (2024, December). Teens, social media and technology 2024. Retrieved from https://www.pewresearch.org/internet/2024/12/12/teens-social-media-and-technology-2024/
- Fisher AJ, Medaglia JD, & Jeronimus BF (2018). Lack of group-to-individual generalizability is a threat to human subjects research. PNAS, 115, e6106–e6115.
- Funkhouser CJ, Trivedi E, Li LY, Helgren F, Zhang E, Srithan A, Cherner RA, Pagliaccio D, Durham K, Kyler M, Tse TC, Buchanan SN, Allen NB, Shankman SA, & Auerbach RP (2023). Detecting adolescent depression through passive monitoring of linguistic markers in smartphone communication. The Journal of Child Psychology and Psychiatry. Advance online publication. 10.1111/jcpp.13931
- Gianfredi V, Blandi L, Cacitti S, Minelli M, Signorelli C, Amerio A, & Odone A (2020). Depression and objectively measured physical activity: A systematic review and meta-analysis. International Journal of Environmental Research and Public Health, 17, 3738. 10.3390/ijerph17103738
- Hamilton JL, Dreier MJ, Caproni B, Fedor J, Durica KC, & Low CA (2024). Improving the science of adolescent social media and mental health: Challenges and opportunities of smartphone-based mobile sensing and digital phenotyping. Journal of Technology in Behavioral Science. Advance online publication. 10.1007/s41347-024-00443-5
- Hancock J, Liu SX, Luo M, & Mieczkowski H (2022). Psychological well-being and social media use: A meta-analysis of associations between social media use and depression, anxiety, loneliness, eudaimonic, hedonic and social well-being. SSRN.
- Kadirvelu B, Bel TB, Wu X, Burmester V, Ananth S, Branco BCC, Girela-Serrano B, Gledhill J, De Simplicio MD, Nicholls D, & Faisal AA (2024). Mindcraft, a mobile mental health monitoring platform for child and young people: Development and acceptability pilot study. JMIR Formative Research, 7, e44877. 10.2196/44877
- Kelly CA, & Sharot T (2024). Web-browsing patterns reflect and shape mood and mental health. Nature Human Behavior, 9, 133–146. 10.1038/s41562-024-02065-6
- Kessler RC, Andrews G, Colpe LJ, Hiripi E, Mroczek DK, Normand S-LT, Walters EE, & Zaslavsky AM (2002). Short screening scales to monitor population prevalences and trends in non-specific psychological distress. Psychological Medicine, 32, 959–976. 10.1017/S0033291702006074
- Keyes KM, Gary D, O’Malley PM, Hamilton A, & Schulenberg J (2019). Recent increases in depressive symptoms among US adolescents: Trends from 1991 to 2018. Social Psychiatry and Psychiatric Epidemiology, 54, 987–996. 10.1007/s00127-019-01697-8
- Kiang MV, Chen JT, Krieger N, Buckee CO, Alexander MJ, Baker JT, Buckner RL, Coombs G, Rich-Edwards JW, Carlson KW, & Onnela J-P (2021). Sociodemographic characteristics of missing data in digital phenotyping. Scientific Reports, 11, 15408. 10.1038/s41598-021-94516-7
- Kolovson S, Pratap A, Duffy J, Allred R, Munson SA, & Areán PA (2020, May). Understanding participant needs for engagement and attitudes towards passive sensing in remote digital health studies. In Proceedings of the 14th EAI International Conference on Pervasive Computing Technologies for Healthcare (pp. 347–362).
- Li LY, Trivedi E, Helgren F, Allison GO, Zhang E, Buchanan SN, Pagliaccio D, Durham K, Allen NB, Auerbach RP, & Shankman SA (2023). Capturing mood dynamics through adolescent smartphone social communication. Journal of Psychopathology and Clinical Science, 132, 1072–1084. 10.1037/abn0000855
- Lind MN, Byrne ML, Wicks G, Smidt AM, & Allen NB (2018). The Effortless Assessment of Risk States (EARS) tool: An interpersonal approach to mobile sensing. JMIR Mental Health, 5, e10334. 10.2196/10334
- Lind MN, Kahn LE, Crowley R, Reed W, Wicks G, & Allen NB (2023). Reintroducing the Effortless Assessment Research System (EARS). JMIR Mental Health, 10, e38920. 10.2196/38920
- Liu B (2012, May). Sentiment Analysis and Opinion Mining. Morgan & Claypool Publishers.
- Liu T, Meyerhoff J, Eichstaedt JC, Karr CJ, Kaiser SM, Kording KP, Mohr D, & Ungar LH (2022). The relationship between text message sentiment and self-reported depression. Journal of Affective Disorders, 302, 7–14. 10.1016/j.jad.2021.12.048
- MacLeod L, Suruliraj B, Gall D, Bessenyei K, Hamm S, Romkey I, Bagnell A, Mattheisen M, Muthukumaraswamy Vv., Orji R, & Meier S (2021). A mobile sensing app to monitor youth mental health: Observational pilot study. JMIR mHealth and uHealth, 9, e20638. 10.2196/20638
- Maheux AJ, Burnell K, Maza MT, Fox KA, Telzer EH, & Prinstein MJ (2025). Annual research review: Adolescent social media use is not a monolith: Toward the study of specific social media components and individual differences. The Journal of Child Psychology and Psychiatry, 66, 440–459. 10.1111/jcpp.14085
- Marin-Dragu S, Forbes A, Sheikh S, Iyer RS, dos Santos DP, Alda M, Hajek T, Uher R, Wozney L, Paulovich FV, Campbell LA, Yakovenko I, Stewart SH, Corkum P, Bagnell A, Orji R, & Meier S (2023). Associations of active and passive smartphone use with measures of youth mental health during the COVID-19 pandemic. Psychiatry Research, 326, 115298. 10.1016/j.psychres.2023.115298
- McNeilly EA, Mills KL, Kahn LE, Crowley R, Pfeifer JH, & Allen NB (2023). Adolescent social communication through smartphones: Linguistic features of internalizing symptoms and daily mood. Clinical Psychological Science. Advance online publication. 10.1177/21677026221125180
- Miller-Johnson S, Sullivan TN, Simon TR, & Multisite Violence Prevention Project (2004). Evaluating the impact of interventions in the Multisite Violence Prevention Study: Samples, procedures, and measures. American Journal of Preventive Medicine, 26, 48–61. 10.1016/j.amepre.2003.09.015
- Modecki KL, Goldberg RE, Ehrenreich SE, Russell M, & Bellmore A (2019). The practicalities and perils of ambulatory assessment’s promise: Introduction to a special section. Journal of Research on Adolescence, 29, 542–550. 10.1111/jora.12532
- Nisenson M, Lin V, & Gansner M (2021). Digital phenotyping in child and adolescent psychiatry: A perspective. Harvard Review of Psychiatry, 29, 401–408. 10.1097/HRP.0000000000000310
- Odgers CL, & Jensen MR (2020). Annual research review: Adolescent mental health in the digital age: Facts, fears, and future directions. The Journal of Child Psychology and Psychiatry, 61, 336–348. 10.1111/jcpp.13190
- Rodman AM, Burns JA, Cotter GK, Ohashi Y-GB, Rich RK, & McLaughlin KA (2024). Within-person fluctuations in objective smartphone use and emotional processes during adolescence: An intensive longitudinal study. Affective Science. Advance online publication. 10.1007/s42761-024-00247-z
- Rodman AM, Vidal Bustamante CM, Dennison MJ, Flournoy JC, Coppersmith DLD, Nook EC, Worthington S, Mair P, & McLaughlin KA (2021). A year in the social life of a teenager: Within-persons fluctuations in stress, phone communication, and anxiety and depression. Clinical Psychological Science, 9, 791–809. 10.1177/2167702621991804
- Schmidt SC, Anedda B, Burchartz A, Eichsteller A, Kolb S, Nigg C, … & Woll A (2020). Physical activity and screen time of children and adolescents before and during the COVID-19 lockdown in Germany: A natural experiment. Scientific Reports, 10, 21780.
- Song J, Howe E, Oltmanns JR, & Fisher AJ (2023). Examining the concurrent and predictive validity of single items in ecological momentary assessments. Assessment, 30, 1662–1671.
- Šutić L, & Novak M (2024). Daily variation in sleep duration, affect and emotions in Croatian youth: An ambulatory assessment. European Journal of Mental Health, 19, e0022. 10.5708/EJMH.19.2024.0022
- Tausczik YR, & Pennebaker JW (2010). The psychological meaning of words: LIWC and computerized text analysis methods. Journal of Language and Social Psychology, 29, 24–54.
- Uhlhaas PJ, Davey CG, Mehta UM, Shah J, Torous J, Allen NB, Avenevoli S, Bella-Awusah T, Chanen A, Chen EYH, Correll CU, Do KQ, Fisher HL, Frangou S, Hickie IB, Keshavan MS, Konrad K, Lee FS, Liu CH, … & Wood SJ (2023). Towards a youth mental health paradigm: A perspective and roadmap. Molecular Psychiatry, 28, 3171–3181.
- Wade NE, Ortigara JM, Sullivan RM, Tomko RL, Breslin FJ, Baker FC, … & ABCD Novel Technologies Workgroup (2021). Passive sensing of preteens’ smartphone use: An Adolescent Brain Cognitive Development (ABCD) cohort substudy. JMIR Mental Health, 8, e29426.