Abstract
Background
TikTok is a global social media platform with over 1 billion active users. At present, there are limited data on how TikTok users navigate the platform for mental health purposes and on the content they view.
Objective
This study aims to understand the patterns of mental health-related content on TikTok and to assess the accuracy and quality of the advice and information provided.
Methods
We performed a summative content analysis on the top 1000 TikTok videos with the hashtag #mentalhealth between October 12 and 16, 2021. Six content themes were developed to code the data: (1) a personal story, perspective, or confessional, (2) advice and information, (3) emoting, (4) references to death, (5) references to science or research, and (6) a product or service for sale. Advice and information were further assessed by clinical experts.
Results
A total of 970 mental health-related videos were included in our analysis (n = 30 removed due to non-English content). The most prevalent content themes were a personal story, perspective, or confessional (n = 574); advice and information (n = 319); emoting (n = 198); and references to death (n = 128). Advice and information were considered misleading in 33.0% of videos (n = 106), with misleading content outperforming non-misleading content on engagement metrics. Few videos included references to scientific evidence or research (n = 37).
Conclusion
Healthcare practitioners and researchers may consider increasing their presence on the platform to promote the dissemination of evidence-based information to a wider and more youth-targeted population. Interventions to reduce the amount of misinformation on the platform and increase people's ability to discern between anecdotal and evidence-based information are also warranted.
Keywords: TikTok, mental health, social media, health communication, misinformation
Introduction
TikTok is a global social media platform that allows users to create and share short-form videos. During the COVID-19 pandemic and the new reality of physical distancing and public health measures, over 1.1 billion people worldwide reported using TikTok each month as an important platform to communicate, learn, and connect with others, with 80% of users being 12–30 years of age.1, 2 TikTok users currently spend an average of 1.5 h/day on the app and approximately 83% of all TikTok users have posted at least one video. 1
In the last few years, #mentalhealth and derivations of this term (e.g., #mentalhealthawareness, #mentalillness, #mentalhealthmatters) have been accessed tens of billions of times on TikTok. 3 The use of social media platforms and mobile phone-based apps in identifying and monitoring mental health symptoms and disorders is increasing each month, especially among youth.4, 5 Young people view mental health-related content organically, through self-directed searches, or via direction from complex algorithms that learn and predict a user's interest in specific content types, such as mental health advice and information. Despite the app's popularity among youth, only a few studies summarize how youth navigate TikTok information to guide decision-making for health and wellness.6,7–8 While research on the effectiveness and validity of these platforms is scarce, there have been encouraging results supporting their capacity to engage youth and increase mental health awareness, much of which transpired during the COVID-19 pandemic.8,9,10,11–12
While the use of social media for youth mental health awareness is promising, there are ethical and contextual considerations to be noted, such as how to best connect youth to developmentally- and age-appropriate services when problems are identified, and how to best address the proliferation of health-related misinformation online.13,14–15 A recent content analysis 15 found that TikTok proliferated negative content about cognitive behavioral therapy (CBT), despite the empirical evidence on CBT as an effective treatment for post-traumatic stress disorder (PTSD). Anecdotal reports also document several concerns about the content TikTok users view and their attitude towards the platform, such as influencers self-diagnosing personality disorders 16 and content related to self-harm and eating disorders. 17 TikTok's algorithm is also criticized for promoting unrealistic beauty standards 18 and transphobic views. 19 Conversely, TikTok has also been described as a valuable outlet for reducing stigma surrounding mental health and for equity-seeking groups to find community.8, 20 Therefore, while TikTok represents a potential public health tool, concerns about its safety are also evident.
At present, there is little evidence investigating how TikTok users navigate the platform for mental health purposes and the quality of the content they view. This remains a gap in the understanding of how to maximize TikTok's public health potential, while also minimizing its harms. As such, this study aims to better understand mental health-related content on TikTok by identifying sentiments, patterns, and themes across the top 1000 videos with the #mentalhealth, while also assessing the quality of mental health advice and information conveyed on the platform.
Methods
Data sample
We collected the top 1000 videos with the most views on TikTok (desktop version) with the hashtag #mentalhealth from October 12 to October 16, 2021. The metadata (views, likes, comments, shares, creator name, creator profile URL, video description, and video number) were scraped and videos were manually downloaded with a unique identifier. All data were collected and reported according to the terms and conditions of TikTok, which state that content posted by individuals is publicly available to syndicate, broadcast, distribute, repost, promote, or publish, excluding private information (e.g., home addresses or identity documents).
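The scraped metadata and unique identifiers described above can be kept together as simple per-video records. The sketch below is illustrative only — the field names mirror the metadata listed in this section, not any TikTok API, and the example values are hypothetical:

```python
from dataclasses import dataclass, field
import uuid

@dataclass
class VideoRecord:
    # Metadata fields scraped for each video (names are illustrative)
    views: int
    likes: int
    comments: int
    shares: int
    creator_name: str
    creator_profile_url: str
    description: str
    video_number: int
    # Unique identifier assigned when the video is manually downloaded
    uid: str = field(default_factory=lambda: uuid.uuid4().hex)

# Hypothetical record for one downloaded video
record = VideoRecord(
    views=3_600_000, likes=740_000, comments=9_000, shares=20_000,
    creator_name="example_creator",
    creator_profile_url="https://www.tiktok.com/@example_creator",
    description="coping strategies #mentalhealth",
    video_number=1,
)
```

Storing the engagement metrics alongside a stable identifier allows the coded themes (described below) to be linked back to each video's reach.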
Data analysis
We applied a summative content analysis approach, a method commonly used to understand media-related communicative messages and phenomena.21,22–23 This type of analysis involves developing a list of codes (also referred to as a coding frame) to categorize or theme the data. The codes are then quantified, described, and interpreted. 24 For instance, this may include counting the number of TikTok videos relating to a specific code and describing the content, its meaning, and quality. Authors MZ, NO, and SB reviewed approximately 100 videos to identify and characterize the videos’ themes of importance related to mental health and interpreted their underlying content. MZ then test-coded the first 100 videos using the draft coding frame and made minor improvements. After the initial team approved the coding frame, it was presented to AM for review and validation. A final discussion was held among authors MZ, NO, SB, and AM and minor changes were made. Once finalized, the coding frame was presented to the six-member coding team (authors KM, CW, AO, AK, XD). Each coder attended a meeting to discuss the list of codes and provide their input, which led to minor changes in demographic collection strategies. The final coding frame sought to collect demographics of people featured in the videos, including age, gender, and race.
Demographic information was coded based on visual perceptions identified by the coders unless the information was provided within the content. Age was categorized by the following age groups: children (ages 0–12), teenagers (13–18), young adults (19–29), adults (30–64), and seniors (over 65). Race was categorized as white, racialized, or unable to identify, and gender was categorized as men, women, nonbinary, or unable to identify. We identified four initial content themes, comprising (1) a personal story, perspective, or confessional, for any videos where people share and/or display their personal experiences with mental health, such as sharing their personal struggles, coping strategies, and/or recovery journeys through verbal and nonverbal communication (e.g., skits); (2) advice and information, for any videos offering advice or information related to mental health; (3) emoting, for any videos where people display intense emotion (i.e., anger, confusion, disgust, fear, happiness, sadness, shame, surprise); and (4) references to death, for any videos discussing death and/or thoughts of death. Two additional content themes were added as exploratory queries: references to scientific evidence or research, for any videos including terms like research, evidence-based, proven, leading scientists, scientific studies, and related terms; and a product or service for sale, for any videos mentioning a product or service for sale, for a total of six content themes. Videos could be coded into multiple content themes if more than one applied.
After final approval, the coding team each coded a subset of the total videos collected. Only videos in English were included, resulting in a final sample of 970 videos. To ensure interrater agreement, a separate research team member audited all coding to test the trustworthiness of the analysis and to review coding flagged as uncertain. All coding decisions received at least 80.0% agreement, 25 except the advice and information theme, which received 78.4% agreement between coders. Authors MZ and SB acted as tiebreakers.
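Percent agreement of the kind reported above is a straightforward item-by-item comparison. A minimal sketch (our illustration with hypothetical labels, not the study's actual tooling):

```python
def percent_agreement(coder_a, coder_b):
    """Share of items (in %) on which two coders assigned the same code."""
    if len(coder_a) != len(coder_b):
        raise ValueError("coders must rate the same set of items")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

# Hypothetical example: two coders agree on 8 of 10 videos -> 80.0%
a = ["advice", "story", "story", "emoting", "advice",
     "story", "death", "advice", "story", "emoting"]
b = ["advice", "story", "advice", "emoting", "advice",
     "story", "death", "story", "story", "emoting"]
print(percent_agreement(a, b))  # 80.0
```

Items below the agreement threshold would then be routed to the tiebreakers, as described above.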
A second coding team of clinical experts, consisting of a licensed clinical psychologist (JS) and two psychiatrists (SM, RS), assessed the quality and accuracy of videos giving advice and information (n = 319). Authors JS and RS viewed each video and determined whether its content was misleading, non-misleading, or unable to determine, and labeled it accordingly. During coding, if a true disagreement occurred (23.1%, n = 74), where one coder labeled the video misleading and the other non-misleading, it was sent to the third coder (author SM) for tiebreaking. If one coder labeled a video “unable to determine” while the other applied either “misleading” or “non-misleading,” the former was prompted to determine whether they agreed with the other coder's assessment (26.5%, n = 85). The third coder acted as a tiebreaker where disagreements occurred (72.3% agreement). The identified content themes, their underlying meanings, and their reach by TikTok metadata were then analyzed and reported.
Results
Six main content themes were identified across mental health-related videos, which are subsequently discussed in order of popularity (Table 1). Data stratified by coder-identified gender, race, and age are listed in Table 2. Content themes stratified by coder-identified demographics can be found in the Supplemental Material. Overall, coder-identified women were in more videos than coder-identified men (58.0% vs. 34.3%) but received fewer average views, likes, comments, and shares. Across coder-identified race, white individuals were featured in more videos compared to racialized individuals. White men had the highest average views and shares, followed by racialized women, racialized men, and white women. Although white women had the highest number of videos (n = 445), they scored lower on average across all engagement metrics (views, likes, comments, and shares). Videos most prominently featured adults (45.0%) and young adults (33.0%), with some teenagers (13.7%). There were few videos including children (0.9%) or seniors (0.7%) (Table 2).
Table 1.
Content theme by TikTok engagement metric.
Metric | Story, confessional, or perspective | Advice and information | Emoting | References to death | References to scientific evidence or research | Product or service for sale | Sample total |
---|---|---|---|---|---|---|---|
Video count a | 574 (59.1%) | 319 (32.9%) | 198 (20.4%) | 128 (13.2%) | 37 (3.8%) | 16 (1.6%) | 970 |
Total views | 2,079,645,800 | 1,120,477,300 | 739,542,600 | 464,794,700 | 132,296,000 | 111,400,000 | 3,551,686,000 |
Mean views | 3,623,076 | 3,512,468 | 3,735,064 | 3,631,209 | 3,575,568 | 6,962,500 | 3,661,532 |
Total likes | 429,341,800 | 215,895,600 | 152,456,900 | 97,526,800 | 21,783,200 | 9,764,600 | 713,707,000 |
Mean likes | 747,982 | 676,789 | 769,984 | 761,928 | 588,735 | 610,288 | 735,780 |
Total comments | 4,503,533 | 2,737,795 | 2,000,325 | 1,298,998 | 260,685 | 44,989 | 8,689,193 |
Mean comments | 7846 | 8582 | 10,103 | 10,148 | 7046 | 2812 | 8958 |
Total shares | 8,953,266 | 9,781,228 | 3,146,323 | 2,870,206 | 1,714,660 | 185,394 | 19,943,582 |
Mean shares | 15,598 | 30,662 | 15,891 | 22,423 | 46,342 | 11,587 | 20,560 |
a Videos could be coded under more than one theme; thus, the sum of videos across each theme is greater than the sample total.
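Because one video can carry several themes, the per-theme totals in Table 1 are computed over overlapping subsets. A minimal sketch of this aggregation with illustrative data (not the study's real records):

```python
from collections import defaultdict

# Illustrative records: (set of themes assigned to the video, view count)
videos = [
    ({"story", "emoting"}, 1_000_000),  # counted under both themes
    ({"advice"}, 2_000_000),
    ({"story"}, 3_000_000),
]

totals = defaultdict(lambda: {"count": 0, "views": 0})
for themes, views in videos:
    for theme in themes:  # a multi-themed video contributes to each of its themes
        totals[theme]["count"] += 1
        totals[theme]["views"] += views

# Mean views per theme = summed views / number of videos carrying the theme
means = {t: d["views"] / d["count"] for t, d in totals.items()}
print(means["story"])  # 2000000.0 (mean of the 1M- and 3M-view videos)
```

Note that the theme counts (2 + 1 + 1 = 4) exceed the number of videos (3), mirroring the table footnote about multi-coded videos.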
Table 2.
Ascribed gender, race, and age by TikTok engagement metric.
Gender a | Metric | Racialized | White | Children (0–12) | Teenagers (13–18) | Young adults (19–29) | Adults (30–64) | Seniors (over 65) | Total |
---|---|---|---|---|---|---|---|---|---|
Men | Video count | 80 (8.2%) | 265 (27.3%) | 7 (0.7%) | 35 (3.6%) | 83 (8.5%) | 224 (23.1%) | 5 (0.5%) | 333 (34.3%) |
Total views | 301,073,300 | 1,150,983,900 | 27,900,000 | 153,856,600 | 275,573,300 | 977,927,300 | 20,900,000 | 1,369,257,200 | |
Mean views | 3,763,416 | 4,343,335 | 3,985,714 | 4,395,903 | 3,320,160 | 4,365,747 | 4,180,000 | 4,111,883 | |
Median views | 2,700,000 | 2,900,000 | 3,800,000 | 3,400,000 | 2,300,000 | 3,000,000 | 2,700,000 | 2,900,000 | |
Total likes | 58,119,800 | 207,289,000 | 4,354,400 | 29,659,100 | 56,395,700 | 173,051,100 | 3,297,400 | 250,124,700 | |
Mean likes | 726,498 | 782,223 | 622,057 | 847,403 | 679,466 | 772,550 | 659,480 | 751,125 |
Median likes | 457,200 | 545,800 | 406,400 | 628,300 | 486,400 | 526,500 | 384,300 | 515,700 |
Total comments | 919,290 | 2,374,367 | 56,566 | 270,841 | 674,604 | 2,291,837 | 47,522 | 3,173,340 | |
Mean comments | 11,491 | 8960 | 8081 | 7739 | 8128 | 10,231 | 9504 | 9530 | |
Median comments | 5720 | 5858 | 5042 | 5284 | 5701 | 5990 | 10,100 | 5904 | |
Total shares | 1,541,541 | 6,707,666 | 61,929 | 394,893 | 1,675,652 | 6,050,860 | 184,581 | 8,120,581 | |
Mean shares | 19,269 | 25,312 | 8847 | 11,283 | 20,189 | 27,013 | 36,916 | 24,386 | |
Median shares | 14,250 | 14,400 | 4464 | 4061 | 13,800 | 17,050 | 43,900 | 14,800 | |
Women | Video count | 122 (12.6%) | 445 (45.8%) | 8 (0.8%) | 109 (11.2%) | 240 (24.7%) | 235 (24.2%) | 4 (0.4%) | 563 (58.0%) |
Total views | 503,595,700 | 1,516,670,400 | 25,200,000 | 373,851,700 | 781,970,800 | 919,843,600 | 12,300,000 | 1,973,966,100 | |
Mean views | 4,127,834 | 3,408,248 | 3,150,000 | 3,429,832 | 3,258,212 | 3,914,228 | 3,075,000 | 3,506,156 | |
Median views | 2,550,000 | 2,400,000 | 2,700,000 | 2,300,000 | 2,300,000 | 2,700,000 | 2,300,000 | 2,400,000 | |
Total likes | 100,829,500 | 315,642,500 | 4,633,100 | 80,450,300 | 170,587,700 | 177,094,800 | 2,464,600 | 408,010,200 | |
Mean likes | 826,471 | 709,309 | 579,138 | 738,076 | 710,782 | 753,595 | 616,150 | 724,707 | |
Median likes | 511,000 | 502,100 | 451,800 | 524,000 | 498,700 | 503,500 | 451,450 | 503,500 | |
Total comments | 958,231 | 3,303,612 | 63,771 | 716,275 | 1,722,537 | 1,944,442 | 15,231 | 4,199,109 | |
Mean comments | 7854 | 7424 | 7971 | 6571 | 7177 | 8274 | 3808 | 7458 | |
Median comments | 4142 | 4311 | 6208 | 4072 | 4220 | 4739 | 4111 | 4311 | |
Total shares | 2,591,849 | 6,783,306 | 53,399 | 855,966 | 4,100,438 | 4,876,354 | 26,964 | 6924 | |
Mean shares | 21,245 | 15,244 | 6675 | 7853 | 17,085 | 20,750 | 6741 | 17,018 | |
Median shares | 9670 | 5919 | 6334 | 4100 | 5953 | 10,700 | 2924 | 6924 | |
All b | Video count | 193 (19.9%) | 683 (70.3%) | 9 (0.9%) | 133 (13.7%) | 320 (33.0%) | 437 (45.0%) | 7 (0.7%) | 970 (100%) |
Total views | 722,469,000 | 2,509,817,000 | 32,500,000 | 458,085,600 | 1,049,844,100 | 1,771,256,300 | 28,600,000 | 3,551,686,000 | |
Mean views | 3,743,363 | 3,674,695 | 3,611,111 | 3,444,253 | 3,280,763 | 4,053,218 | 4,085,714 | 3,661,532 | |
Median views | 2,500,000 | 2,500,000 | 3,000,000 | 2,300,000 | 2,300,000 | 2,800,000 | 2,700,000 | 2,500,000 | |
Total likes | 145,046,400 | 502,500,700 | 5,733,100 | 97,412,600 | 228,687,200 | 332,490,400 | 5,016,000 | 713,707,000 | |
Mean likes | 751,536 | 735,726 | 637,011 | 732,426 | 714,648 | 760,848 | 716,571 | 735,780 | |
Median likes | 475,200 | 516,000 | 497,200 | 524,000 | 496,100 | 516,000 | 518,600 | 506,400 | |
Total comments | 1,732,528 | 5,615,308 | 73,655 | 892,439 | 2,468,235 | 4,088,591 | 54,531 | 8,689,193 | |
Mean comments | 8978 | 8222 | 8184 | 6710 | 7713 | 9356 | 7790 | 8958 | |
Median comments | 4732 | 5042 | 7373 | 4311 | 4803 | 5393 | 5065 | 5082 | |
Total shares | 4,066,064 | 13,699,383 | 85,099 | 1,187,639 | 5,959,749 | 10,920,632 | 206,064 | 19,943,582 | |
Mean shares | 21,068 | 20,058 | 9455 | 8930 | 18,624 | 24,990 | 29,438 | 20,560 | |
Median shares | 11,800 | 9169 | 8203 | 4100 | 8029 | 14,400 | 20,100 | 10,300 |
a Only 16 videos included individuals categorized as nonbinary; thus, they have been omitted from the table as there were too few videos to reliably compare their metrics with the other genders. No videos were coded as “unable to identify.”
b Videos could be coded under more than one gender if multiple people appeared in the videos.
Personal story, perspective, or confessional
The most common theme identified was the sharing of a personal story, perspective, or confessional (59.1%) (Table 1). These videos often incorporated another related mental health theme identified through this analysis, particularly the emoting theme (n = 100, 17.4%) and the advice and information theme (n = 79, 13.8%). The most common emotions expressed were sadness (n = 191, 33.3%), followed by confusion (n = 132, 23.0%), and happiness (n = 130, 22.6%). Common narratives observed included testimonials of survival, struggle, journey, and inspiration. In some cases, videos were an outlet to share a difficult journey and reflect on recovery or living with illness. In other cases, videos were used as a creative outlet, such as skits that employ humor to connect over relatable mental health issues. Women were more represented compared to men (63.8% of videos with a woman vs. 30.0% of videos with a man) (Table S1) and white people were more represented than racialized individuals (72.6% vs. 19.9%) (Table S2). Adults (39.5%) and young adults (37.6%) were prominently featured compared to other age groups (Table S3).
Advice and information
Advice and information were offered in 32.9% of videos (Table 1). This included advice on managing difficult feelings, dealing with trauma and symptoms of mental health disorders, and identifying signs of a mental health disorder. For instance, one creator shared hidden signs of obsessive-compulsive disorder (OCD), including (1) making things even, (2) counting things you look at, and (3) handwashing. Some videos broke down mental health myths, encouraged individuals to seek help to address their mental health concerns, and provided information about psychiatric wards and crisis lines. Meanwhile, a few content creators shared their views on mental health medication, with some suggesting that medications are not necessary to treat certain mental health conditions, with statements such as “ADHD [attention-deficit/hyperactivity disorder] is a trait; meds are for profit only.” Groups giving the most advice across videos were women (56.1% vs. 36.4%) (Table S1), white individuals (73.7% vs. 18.8%) (Table S2), and adults (57.4%) (Table S3). Finally, 19.1% of videos (n = 61) providing advice and information were from qualified medical providers or counselors. Advice given was not always specific to a mental illness but also included ways to generally improve one's mental health.
After resolving any coder disagreement, advice given was coded as non-misleading in 67.0% (n = 215) of videos and misleading in 33.0% (n = 106). Some misleading videos contained overtly inaccurate facts; others lacked sufficient context and risked leading viewers to inappropriate conclusions or self-diagnoses. Examples of misleading videos included misrepresentative facts about mental health disorders (e.g., signs of ADHD), attachment styles, gaslighting, and trauma responses. Misleading content performed better than non-misleading videos in views, likes, comments, and shares (Figure 1). Both healthcare professionals (HCPs) (n = 61) and non-healthcare professionals (non-HCPs) (n = 260) were more likely to give non-misleading than misleading advice, but HCPs did so at a higher rate (HCPs: 75.4% non-misleading, 24.6% misleading; non-HCPs: 65.0% non-misleading, 35.0% misleading). There was little difference in viewership between HCPs and non-HCPs (HCPs’ mean views, 3,460,727; non-HCPs’ mean views, 3,462,445).
Figure 1.
Comparison between misleading and non-misleading videos across TikTok engagement metrics (i.e., views, likes, comments, and shares).
Emoting
Videos themed as emoting, characterized by intense visual or audio expressions of emotion, were found in 20.4% of videos (Table 1). Most often, videos expressed a difficult event, experience, or feeling and shared the intense emotion through crying, screaming, or other audio or visual representation. Men and women were observed emoting in similar video counts (41.9% of videos with a man vs. 46.5% with a woman) (Table S1). Meanwhile, white people emoted more than racialized individuals in the sample (66.2% vs. 20.4%) (Table S2). Adults emoted in the greatest number of videos (42.9%), followed by young adults (26.3%) and teenagers (16.7%) (Table S3).
References related to death
References related to death were found in 13.2% of videos (Table 1). Videos included individuals expressing their own thoughts of death (e.g., hopelessness, suicidal ideation, suicidal intent), sharing concerns about friends or family members dying by suicide, recounting experiences related to the topic of suicide (e.g., their own suicide attempts, losing a friend or family member to suicide), and providing advice to avoid acting on thoughts of suicide. Most videos contained personal narratives. For example, one video recounted a suicide attempt: “I will never forget when I was sitting on the floor with the paramedic. All I had to say was ‘I don’t want to do this anymore.’” Another video discussed losing a loved one: “I lost my 16-year-old sister to suicide two months ago. I miss her so much every day. Long live [sister's name]. Baby sister, I love you so much.” Videos by women (63.3% vs. 33.6%) (Table S1), white people (69.5% vs. 23.4%) (Table S2), and adults (44.5%) (Table S3) contained the most references to death.
Exploratory queries
Few videos in our sample included references to scientific evidence or research (3.8%), characterized by any mention or allusion to the scientific process or in-depth biological discussion (Table 1). Additionally, there were few videos selling a product or service (1.6%) (Table 1). Of the few videos under this theme, almost half were advertisements for mental health applications (43.8%), including BetterHelp (37.5%), which provides direct online counseling and therapy services, 26 and Alan Mind (6.3%), a guided journaling platform. 27 Other videos in this category promoted and discussed healing products to hide self-harm scars and other miscellaneous products. There were too few videos under the references to scientific evidence or research and product or service for sale themes to reliably compare across gender, race, and age.
Discussion
Our results demonstrate how popular and engaging mental health-related content is on TikTok. Specifically, TikTok is mainly used by teenagers (ages 13–18), young adults (ages 19–29), and adults (ages 30–64) as an outlet to share personal experiences and stories, give advice and information, and share intense emotions, which acquire on average 3,661,532 views per video. Thus, TikTok has immense potential for promoting mental health-related messages and education to a broad audience, especially among younger populations. However, this study demonstrates several barriers to disseminating reliable and equitable mental health-related content.
For instance, our study found fewer videos created by racialized individuals when pulling the top listed TikTok videos, and videos created by white men had more average views and shares compared to white women and racialized individuals. While TikTok has moderation guidelines to prevent hate speech and harassment, these same guidelines have been used to censor content from marginalized communities, including people of color, LGBTQ2S+ people, and people with disabilities, which can reinforce harmful biases.28,29,30–31 Although TikTok claims to have banned such practices, its censoring decisions remain unclear. 28 As such, it is imperative that TikTok develops and enforces more robust and equitable content moderation policies and data privacy protections, which require transparency about the algorithm's content curation process and data handling practices. Policymakers may also consider implementing regulations that mandate regular audits of content moderation systems, require clear disclosure of data collection and sharing practices, and impose penalties for violations. Only by addressing these foundational issues can TikTok and other social media platforms serve as responsible and trustworthy mental health resources.
Another drawback of TikTok's algorithm concerns content quality. For example, the algorithm is more likely to promote shorter videos that tend to receive full views and higher engagement metrics. 32 This makes it challenging to discuss nuanced topics like mental health and limits the quality of the content shared on the platform. Content can often lead to misconceptions due to the platform's inability to tailor advice based on an individual's needs and the influence of its algorithm on people's beliefs and behaviors, regardless of scientific validation. 33 For instance, videos sharing trends such as the “put a finger down challenge” tend to generalize symptoms associated with a specific mental health condition. Though speculative, such generalizations can lead viewers to pathologize healthy behaviors and self-diagnose, which can contribute to health anxiety. 34 Nevertheless, such content can also encourage people to reach out for specialist care to receive a clinical diagnosis, which they may not have otherwise sought without such information at hand. 35 Furthermore, while sharing personal stories can foster hope and connection, it can be difficult for viewers to discern how these stories apply to their own lives, particularly among people who lack media and health literacy and are less attuned to actively considering the context.
Other studies evaluating the quality of advice and information provided on TikTok relating to various health conditions, including prostate cancer, 36 erectile dysfunction, 37 and ADHD, 38 have also found a high percentage (47–90%) of videos containing misinformation. A cross-sectional study assessing the quality of ADHD content classified 51% of the top 100 videos with #adhd as misleading. 38 Our coders also noted concerns about content quality: initial coding yielded true disagreement on 23.1% of videos and at least one “unable to determine” label on another 26.5%. This suggests a strong rationale for evidence-based health professionals to expand their scope of practice to engage on social media platforms such as TikTok, share accurate information, and dispel misinformation.39,40 For example, Al-Rawi and Zemenchik 41 report how frontline workers used TikTok to dispel misinformation during the COVID-19 pandemic and disseminated educational information to prevent the spread of COVID-19. TikTok may consider elevating qualified provider content by providing free ads and prioritizing algorithmic recommendations to ensure all have access to accurate mental health information.
Future research is needed to understand how users themselves view the role of TikTok in obtaining mental health information and sharing feelings or experiences, as well as the steps needed to ensure their safety on the platform. Focused attention is needed to gauge the experience of children, teenagers, and young adults, as they represent the majority of TikTok users. 1 A recent qualitative study exploring how youth (ages 12–24) used TikTok during the COVID-19 pandemic found that mental health-related content increased youths’ access to mental health information and support and gave them a safe space to share their experiences and connect with others about mental health challenges and successful coping strategies and treatments. 8 However, youth expressed concerns about the amount of misinformation on the platform and the barriers hindering their ability to recognize and report it (e.g., limited education, inability to embed evidence links, paid sponsorships, lack of verified information), affirming that youth themselves would appreciate strategies to increase their access to accurate mental health information. Conversely, youth also discussed the toll TikTok can have on youths’ mental health when being consistently exposed to negative experiences and comments. 8
While the majority of TikTok users represent a younger generation, social media usage varies widely across countries, age, gender, and social values.42,43 For example, TikTok and Instagram are used more frequently by Gen Z in the United States, 43 while YouTube and Instagram are more popular among this population in Canada. 42 Further, Millennials and Gen Z are more likely to use multiple social media platforms, 42 highlighting the need for future studies comparing mental health content across various social media platforms, especially with the fate of TikTok hanging in the balance around the world. 44 This would provide a more comprehensive understanding of the social media landscape's influence on mental health discourse and information dissemination, allowing for a more integrative synthesis of findings that could inform platform-specific and cross-platform interventions and policies.
The substantial proportion of “emoting” videos in our sample also raises questions about the nature and context of these highly emotional expressions and the potential implications of repetitive exposure to traumatic stories. Future research should investigate whether these videos are indicative of acute distress or crisis situations that may pose safety risks to the creators or viewers. This line of inquiry is crucial for understanding how to best connect TikTok users with appropriate mental health services and support resources, especially in cases where videos may signal imminent danger or the need for immediate intervention.
While this study has important implications, several limitations are noted. The sample was retrieved from the top-listed TikTok videos with the hashtag #mentalhealth, which does not necessarily include all of the top videos related to mental health. This is a consistent limitation across TikTok research.4,38,45–47 Our analysis was also limited to English-language videos, which may not fully represent the global diversity of mental health content on TikTok given its worldwide popularity. As such, the findings may not be generalizable to TikTok content in other languages and cultural contexts. Furthermore, as demographic coding relied on visual identification by our research team, it was subject to unconscious bias and may have inaccurately represented the identity of creators. While we took several steps to increase accuracy, such as refining coding categories to high-level categories, double coding when details were questioned, and creating an “unable to code” category, the differences highlighted across age, gender, and ethnicity are highly speculative and do not provide insight on the specific experiences of gender-diverse and racialized individuals. As these individuals tend to experience poorer mental health and access to mental health services compared to cisgender and white individuals in primarily English-speaking countries including Canada 48 and the United States, 49 further research is warranted to understand how these groups utilize social media platforms such as TikTok to discuss mental health. Researchers may have to rely on qualitative research and/or add additional hashtags related to gender identity and ethnicity when conducting content analyses, as these metrics are not available for extrapolation on TikTok profiles.
Conclusion
TikTok has the potential to normalize mental health discourse and promote public health messages to a widespread audience. While it is uniquely positioned to relay large amounts of health advice and information, there is an urgent need to minimize the potential risks the platform poses, such as reinforcing biases, sharing personal information, spreading misinformation, and negatively impacting people's mental health through excessive screen time, including the developmental health of younger populations. This may be achieved through a multifaceted approach involving governments, TikTok, healthcare organizations, school systems, service providers, and platform users. Government regulations requiring transparent content moderation and data privacy protections by TikTok are needed, as well as mechanisms to limit misinformation and the negative mental health impacts of the platform. Meanwhile, qualified medical providers, HCPs, and mental health and health organizations should consider becoming active on TikTok to contribute to an evidence-based information ecosystem and to advocate for public health values across the platform's policies, features, and design. Platform users should also be mindful of the amount of time they spend on the platform and remain critical of the content they consume, given the lack of verified information on the platform. These skills should be taught across all levels of the education system, as digital information plays a large part in people's everyday lives.
Supplemental Material
Supplemental material, sj-docx-1-dhj-10.1177_20552076241297062 for Do you have depression? A summative content analysis of mental health-related content on TikTok by Roxanne Turuba, Marco Zenone, Raman Srivastava, Jonathan Stea, Yuri Quintana, Nikki Ow, Kirsten Marchand, Amanda Kwan, Anna-Joy Ong, Xiaoxu Ding, Cassia Warren, Alessandro R Marcon, Jo Henderson, Steve Mathias and Skye Barbic in DIGITAL HEALTH
Acknowledgments
We thank Dr. Krista Glowacki and Ana Flechas for assisting with the study and providing direct technical support.
Footnotes
Contributorship: Conceptualization: Marco Zenone, Raman Srivastava, Jonathan Stea, Yuri Quintana, Nikki Ow, Xiaoxu Ding, Alessandro Marcon, Jo Henderson, Steve Mathias, and Skye Barbic. Project administration: Marco Zenone, Roxanne Turuba. Data Collection: Marco Zenone, Nikki Ow, Skye Barbic, and Alessandro Marcon. Analysis and interpretation: Raman Srivastava, Jonathan Stea, Steve Mathias, Kirsten Marchand, Cassia Warren, Amanda Kwan, Anna-Joy Ong, Xiaoxu Ding, Marco Zenone, Roxanne Turuba. Writing—original draft preparation: Roxanne Turuba and Marco Zenone. Writing—review and editing: Roxanne Turuba, Marco Zenone, Raman Srivastava, Jonathan Stea, Yuri Quintana, Nikki Ow, Kirsten Marchand, Amanda Kwan, Anna-Joy Ong, Xiaoxu Ding, Cassia Warren, Alessandro Marcon, Jo Henderson, Steve Mathias, and Skye Barbic.
Data availability: The data that support the findings of this study are publicly available on TikTok. TikTok. #MentalHealth [video file]. 2021, Oct [2021, Oct 16–21]. Available from: https://www.tiktok.com/tag/mentalhealth?lang=en.
Declaration of conflicting interests: The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Ethical considerations: All data were collected and reported in accordance with the terms and conditions of TikTok, which state that content posted by individuals is publicly available to syndicate, broadcast, distribute, repost, promote, or publish, excluding private information (e.g., home addresses or identity documents).
Funding: The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the Canadian Institutes of Health Research (Operating Grant: Emerging COVID-19 Research Gaps), Michael Smith Health Research BC (Research Trainee award, author KM; Scholar award, author SB), and the University of British Columbia's Institute of Mental Health (Marshall Fellowship).
Guarantor: SB.
ORCID iDs: Roxanne Turuba https://orcid.org/0000-0002-5355-7932
Marco Zenone https://orcid.org/0000-0003-4201-6070
Alessandro Marcon https://orcid.org/0000-0001-5018-423X
Supplemental material: Supplemental material for this article is available online.
References
- 1.Doyle B. TikTok statistics – Everything you need to know. Wallaroo Media. https://wallaroomedia.com/blog/social-media/tiktok-statistics/ (2024, accessed 27 March 2024).
- 2.Samji H, Dove N, Ames M, et al. Impacts of the COVID-19 pandemic on the health and well-being of young adults in British Columbia. British Columbia Centre for Disease Control COVID-19 Young Adult Task Force. July 2021.
- 3.TikTok. #MentalHealth hashtag videos on TikTok. https://www.tiktok.com/tag/mentalhealth?lang=en (accessed 1 April 2024).
- 4.Chandrashekar P. Do mental health mobile apps work: evidence and recommendations for designing high-efficacy mental health mobile apps. mHealth 2018; 4: 6.
- 5.Wang K, Varma DS, Prosperi M. A systematic review of the effectiveness of mobile apps for monitoring and management of mental health symptoms or disorders. J Psychiatr Res 2018; 107: 73–78.
- 6.Zenone M, Ow N, Barbic S. TikTok and public health: a proposed research agenda. BMJ Glob Health 2021; 6: e007648.
- 7.Basch CH, Donelle L, Fera J, et al. Deconstructing TikTok videos on mental health: cross-sectional, descriptive content analysis. JMIR Form Res 2022; 6: e38340.
- 8.Turuba R, Cormier W, Zimmerman R, et al. Exploring how youth use TikTok for mental health information in British Columbia, Canada: a qualitative study. JMIR Infodem 2024; 4: 53233.
- 9.Hawke LD, Sheikhan NY, MacCon K, et al. Going virtual: youth attitudes towards and experiences of virtual mental health and substance use services during the COVID-19 pandemic. BMC Health Serv Res 2021; 21: 340.
- 10.Sala LL, The Z, Lamblin M, et al. Can a social media intervention improve online communication about suicide? A feasibility study examining the acceptability and potential impact of the #chatsafe campaign. PLOS One 2021; 16: e0253278.
- 11.Latha K, Meena KS, Pravitha MR, et al. Effective use of social media platforms for promotion of mental health awareness. J Educ Health Promot 2020; 9: 124.
- 12.Cauberghe V, Van Wesenbeeck I, De Jans S, et al. How adolescents use social media to cope with feelings of loneliness and anxiety during COVID-19 lockdown. Cyberpsychol Behav Soc Netw 2021; 24: 250–257.
- 13.Cinelli M, Quattrociocchi W, Galeazzi A, et al. The COVID-19 social media infodemic. Sci Rep 2020; 10: 16598.
- 14.The Lancet Infectious Diseases. The COVID-19 infodemic. Lancet Infect Dis 2020; 20: 875.
- 15.Lorenzo-Luaces L, Dierckman C, Adams S. Attitudes and (mis)information about cognitive behavioral therapy on TikTok: an analysis of video content. J Med Internet Res 2023; 25: e45571.
- 16.Mohamed Z. Young women are self-diagnosing personality disorders, thanks to TikTok. https://www.elle.com/uk/life-and-culture/a39573245/young-women-self-diagnose-personality-disorder-tiktok/ (2022, accessed 27 April 2024).
- 17.Ledwith M. TikTok helps children find videos on self-harming and anorexia. The Times. https://www.thetimes.co.uk/article/tiktok-helps-children-find-videos-on-self-harming-and-anorexia-bdkxxpwcm (2022, accessed 27 April 2024).
- 18.Open Access Government. TikTok algorithm harms mental health via “unrealistic beauty standards”. https://www.openaccessgovernment.org/tiktok-algorithm-harms-mental-health-via-unrealistic-beauty-standards/134241/ (2022, accessed 27 April 2022).
- 19.Little O, Richards A. TikTok’s algorithm leads users from transphobic videos to far-right rabbit holes. Media Matters for America. https://www.mediamatters.org/tiktok/tiktoks-algorithm-leads-users-transphobic-videos-far-right-rabbit-holes (2021, accessed 20 October 2021).
- 20.MacKinnon KR, Kia H, Lacombe-Duncan A. Examining TikTok’s potential for community-engaged digital knowledge mobilization with equity-seeking groups. J Med Internet Res 2021; 23: e30315.
- 21.Kleinheksel AJ, Rockich-Winston N, Tawfik H, et al. Demystifying content analysis. Am J Pharm Educ 2020; 84: 7113.
- 22.Murdoch B, Marcon AR, Downie D, et al. Media portrayal of illness-related medical crowdfunding: a content analysis of newspaper articles in the United States and Canada. PLOS One 2019; 14: e0215805.
- 23.Rachul C, Marcon AR, Collins B, et al. COVID-19 and ‘immune boosting’ on the internet: a content analysis of Google search results. BMJ Open 2020; 10: e040989.
- 24.Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res 2005; 15: 1277–1288.
- 25.Geisler C, Swarts J. Chapter 5: Achieving reliability. In: Coding streams of language. The WAC Clearinghouse: University Press of Colorado, 2019, pp. 155–202. doi: 10.37514/PRA-B.2019.0230.
- 26.BetterHelp. Professional therapy with a licensed therapist. https://www.betterhelp.com/ (accessed 1 April 2024).
- 27.Alan Mind. Care for your team’s mental health. Simply and completely. https://mind.alan.com/ (accessed 1 April 2024).
- 28.Biddle S, Ribeiro PV, Dias T. Invisible censorship. TikTok told moderators to suppress posts by “ugly” people and the poor to attract new users. The Intercept. https://theintercept.com/2020/03/16/tiktok-app-moderators-users-discrimination/ (2020, accessed 26 February 2024).
- 29.Murray C. TikTok algorithm error sparks allegations of racial bias. NBC News. https://www.nbcnews.com/news/us-news/tiktok-algorithm-prevents-user-declaring-support-black-lives-matter-n1273413 (2021, accessed 26 February 2024).
- 30.Woods M. It’s queer versus TikTok moderation. Xtra. https://xtramagazine.com/power/tiktok-censorship-queer-moderation-200629 (2021, accessed 16 February 2024).
- 31.Harris C, Johnson AG, Palmer S, et al. “Honestly, I think TikTok has a vendetta against black creators”: understanding black content creator experiences on TikTok. Proc ACM Hum-Comput Interact 2023; 7: 1–31.
- 32.Sheikh M. How the TikTok algorithm works in 2024. Sprout Blog. https://sproutsocial.com/insights/tiktok-algorithm/ (2024, accessed 1 April 2024).
- 33.Chevalier O. “It starts on TikTok”: looping effects and the impact of social media on psychiatric terms. Philos Psychiatry Psychol 2024; 31: 163–174.
- 34.Brooks K. TikTok leads people to self-diagnosing mental health conditions. WHSV3. https://www.whsv.com/2022/01/12/tiktok-leads-people-self-diagnosing-mental-health-conditions/ (2022, accessed 16 April 2024).
- 35.Boseley M. TikTok accidentally detected my ADHD. For 23 years everyone missed the warning signs. The Guardian. https://www.theguardian.com/commentisfree/2021/jun/04/tiktok-accidentally-detected-my-adhd-for-23-years-everyone-missed-the-warning-signs (2021; accessed 13 Sep 2024).
- 36.Xu AJ, Taylor J, Gao T, et al. TikTok and prostate cancer: misinformation and quality of information using validated questionnaires. BJU Int 2021; 128: 435–437.
- 37.Babar M, Loloi J, Patel RD, et al. Cross-sectional and comparative analysis of videos on erectile dysfunction treatment on YouTube and TikTok. Andrologia 2022; 54(5). doi: 10.1111/and.v54.5.
- 38.Yeung A, Ng E, Abi-Jaoude E. TikTok and attention-deficit/hyperactivity disorder: a cross-sectional study of social media content quality. Can J Psychiatry 2022; 67: 899–906.
- 39.Stellefson M, Paige SR, Chaney BH, et al. Evolving role of social media in health promotion: updated responsibilities for health education specialists. Int J Environ Res Public Health 2020; 17: 1153.
- 40.Zenone M, Kenworthy N, Barbic S. The paradoxical relationship between health promotion and the social media industry. Health Promot Pract 2023; 24: 571–574.
- 41.Al-Rawi A, Zemenchik K. Tiktoking COVID-19 with frontline workers. Digit Health 2023; 9.
- 42.Denham B. Trends: Social media in Canada. Environics Research. https://environics.ca/article/2024-trends-social-media-in-canada/ (2024, accessed 13 Sep 2024).
- 43.Zote J. Social media demographics to inform your 2024 strategy. Sprout Blog. https://sproutsocial.com/insights/new-social-media-demographics/ (2021; accessed 13 Sep 2024).
- 44.Which countries have banned TikTok and why? Euronews. https://www.euronews.com/next/2024/03/14/which-countries-have-banned-tiktok-cybersecurity-data-privacy-espionage-fears (2024; accessed 13 Sep 2024).
- 45.Rutherford BN, Sun T, Johnson B, et al. Getting high for likes: exploring cannabis-related content on TikTok. Drug Alcohol Rev 2021; 41: 1119–1125.
- 46.Herrick SSC, Hallward L, Duncan LR. “This is just how I cope”: an inductive thematic analysis of eating disorder recovery content created and shared on TikTok using #EDrecovery. Int J Eat Disord 2021; 54: 516–526.
- 47.Fraticelli L, Smentek C, Tardivo D, et al. Characterizing the content related to oral health education on TikTok. Int J Environ Res Public Health 2021; 18: 13260.
- 48.Kia H, Rutherford L, Jackson R, et al. Impacts of COVID-19 on trans and non-binary people in Canada: a qualitative analysis of responses to a national survey. BMC Public Health 2022; 22: 525.
- 49.Moore KL. Mental health service engagement among underserved minority adolescents and young adults: a systematic review. J Racial Ethn Health Disparities 2018; 5: 1063–1076.