Abstract
In this viewpoint, we introduce the term ‘screenwashing’ to describe the phenomenon whereby social media platforms, such as TikTok and Instagram, pretend to be more socially responsible than they actually are. That is, social media platforms present themselves as thoughtful about children's health and the prevention of problematic social media use, but this often turns out to be nothing more than “a lick of paint”. We describe why we consider features such as the one-hour notification on TikTok and Instagram to be screenwashing. Screenwashing is an unethical practice that has the potential to mislead parents and young users. Consequently, we advocate for increased government intervention to protect our youth from the hazards associated with problematic social media use.
Keywords: screenwashing, tech industry, adolescents, time notification
‘Screenwashing’ describes the phenomenon whereby social media platforms, such as TikTok and Instagram, pretend to be more socially responsible than they actually are. Awareness of this phenomenon is imperative, as the popularity and reach of online media platforms have grown dramatically in recent years and continue to grow. For example, TikTok more than doubled its worldwide user base between 2019 and 2021 (from 291.4 million to 655.9 million users; Ceci, 2023), and Instagram's worldwide user base grew by 383 million over the same period (Insider Intelligence, 2022). These platforms are particularly popular among younger generations (Wallaroo Media, 2023), and although they impose an age limit of 13 years, an estimated 24–40% of users on TikTok, Snapchat, and Instagram are under 13 years of age (Ofcom, 2023; Thorn, 2021). Social media platforms are designed to capture and sustain the attention of (young) users, resulting in U.S. youth spending an average of 113 min per day on TikTok and approximately 90 min per day on Snapchat (Ceci, 2023), and receiving an average of 237 notifications per day on Instagram (Commonwealth of Massachusetts, 2023).

While social media offers enjoyment and entertainment to the majority of users, through activities such as exploring creative content and connecting with peers (Shapiro & Margolin, 2014), recent insights shed light on its adverse effects. Specifically, there is growing awareness that some youth engage with social media in a problematic and compulsive manner, exhibiting behaviors akin to addiction (Boer, Stevens, Finkenauer, Koning, & Van den Eijnden, 2021; Van den Eijnden, Lemmens, & Valkenburg, 2016). While only a small fraction of youth (approximately 4–5%) can be categorized as problematic users, exhibiting more than five symptoms (Boer et al., 2021), a concerning observation is that more than one-third of young individuals display two or more symptoms, placing them in the at-risk group (Boer et al., 2021; Geurts, Koning, Vossen, & van den Eijnden, 2022). These problematic and at-risk users are more likely than normative users (0 or 1 symptoms) to report poorer mental health (life satisfaction, self-esteem, depression, and loneliness; Huang, 2020), lower sleep quality (Van den Eijnden, Geurts, Van der Rijst, Ter Bogt, & Koning, 2021), and poorer social relations and school grades (Sharif et al., 2010; Van den Eijnden, Koning, Doornwaard, Van Gurp, & Bogt, 2018).

This knowledge about the negative impact of problematic social media use on youth development has generated societal and political debate about the level of regulation that should be required of these social media platforms. Governments are currently taking the first steps toward limiting the playing field of Very Large Online Platforms (VLOPs) such as Meta and TikTok through new legislation, such as the Digital Services Act (DSA; European Commission, 2021) and its British counterpart, the Online Safety Act (UK Parliament, 2023). In addition, several countries around the world have recently banned TikTok from the work phones of government employees, and the European Commission introduced a similar measure to “increase cyber security”. Although these discussions about the safety of TikTok are an important step forward in the political arena, we wonder why they have not led to any protective measures for our children and youth in general. Why would one expect social media platforms to be harmful to politicians but not to our youth?
How can we expect young people to use social media in a safe manner? It is imperative to introduce governmental regulations regarding access to and usage of social media platforms, as well as their design, in order to safeguard youngsters from the inherent risks associated with these platforms. In the U.S., for example, Meta is being sued for ‘intentionally designing its social media platforms, i.e. Instagram, to be addictive to youth’ (Commonwealth of Massachusetts, 2023, p. 1). Presently, the safeguarding of our youth hinges predominantly on the ‘preventive’ initiatives introduced by social media platforms themselves. In this paper, however, we contend that placing excessive trust in these platforms to implement truly effective preventive measures for fostering healthy digital media usage is unwarranted. Instead, we assert that these initiatives often amount to ‘screenwashing’, a concept we will substantiate through our arguments.
What is screenwashing?
Screenwashing derives from the concept of greenwashing, which typically refers to companies pretending to be more environmentally responsible than they actually are. For instance, Shell highlights the sustainability of its investments by classifying natural gas as renewable energy, even though genuinely renewable sources account for only 1.5% of its investments (Washington Post, 1/2/2023), thereby overshadowing its involvement in environmentally damaging practices. We argue that this concept of ‘greenwashing’ can also be applied to social media platforms, a practice we will henceforth refer to as screenwashing.
Screenwashing refers to tech companies pretending to prioritize their users' health more than they genuinely do. In response to societal concerns about problematic social media usage, and to uphold a positive brand image, social media platforms have recently introduced features ostensibly aimed at safeguarding young users from harmful platform use and promoting a healthier experience. Examples include TikTok's time notification (Keenan, 2023), Messenger's Parental Supervision Tools (Meta, 2023), and Snapchat's Parental Content Controls (Team Snap, 2023). Nevertheless, it is crucial to acknowledge that these social media platforms operate on an attention-based business model, wherein users' attention serves as the product sold to advertisers and other buyers (Williams, 2018). Social media companies are therefore highly invested in capturing and keeping the attention of their users in order to increase usage time and thus their market share. Given this attention-economy business model, they will always be inclined to increase usage rather than decrease it. This is exemplified by the changes to Instagram's ‘daily time limit’ options (Lomas, 2022): tech journalists argue that, faced with a decrease in market share, Instagram tried to increase the time users spend on its platform by quietly changing the minimum daily time limit from 10 to 30 min. The supposedly protective measures implemented by the social media platforms themselves are thus marketed on a disproportionately large scale to create a positive brand image rather than to stimulate healthy online behaviors.
Why is it screenwashing?
While the features introduced by social media platforms may seem promising for preventing problematic social media use, below we elaborate on why we believe they amount to screenwashing.
A first argument is that these preventive features are unlikely to be effective. Youth have limited resources (i.e., literacy skills, knowledge, self-control) and limited motivation to act on prompts to control their social media use (Helsper & Smahel, 2020; Throuvala, Griffiths, Rennoldson, & Kuss, 2019). Because the adolescent brain is still developing, adolescents have difficulty regulating their actions, thoughts, and emotions (i.e., self-control) and tend to seek immediate gratification (Casey, 2015; Du, Kerkhof, & van Koningsbruggen, 2019). Research has shown that self-control is challenged even more during social media use (Siebers, Beyens, Pouwels, & Valkenburg, 2021), owing to increased online vigilance and preoccupation with social media (Johannes, Veling, Verwijmeren, & Buijzen, 2019). Moreover, social media fulfill several developmental needs, such as identity formation, emotional engagement with peers, and the search for role models (Bossen & Kottasz, 2020; Nadkarni & Hofmann, 2012). Youth need to belong to a peer group, and being able to join in conversations about the daily content of social media platforms such as TikTok is important for that sense of belonging. Consequently, not all children are able or motivated to stop using social media after seeing a notification that can be dismissed (with a passcode) fairly easily; because their brains are still developing, children are all the more susceptible to the developmental needs that social media fulfill. This is exemplified by research demonstrating that such notifications are unlikely to lower smartphone screen time or self-reported problematic smartphone use among young adults (Loid, Täht, & Rozgonjuk, 2020), although similar messages may reduce potentially harmful gambling behaviors among adults (Bjørseth et al., 2021). Thus, it is not realistic to assume that a simple time notification will change behaviors that satisfy adolescents' basic needs for immediate gratification and peer connection.
This brings us to our next argument: children cannot beat an entire team of engineers who design these platforms in ways that make them addictive (Bhargava & Velasquez, 2021). Particularly addictive features of social media platforms include: (1) intermittent variable rewards (the so-called slot machine effect), such as pop-up notifications and “pull-to-refresh” buttons (Clark & Zack, 2023); (2) design features that exploit our need for social validation and social reciprocity, such as snapstreaks (Harris, 2017) and likes (Lee et al., 2020); (3) the elimination of natural stopping cues through features such as endless scrolling (Bhargava & Velasquez, 2021); and (4) powerful algorithms that grab users' attention and keep them hooked (user flow) by showing content based on youths' active liking (e.g., thumbs up), their search terms, and the length of time they spend watching certain content. Together, the design of and algorithms behind social media platforms prolong usage time (Montag, Lachmann, Herrlich, & Zweig, 2019), thereby increasing the risk of problematic social media use (Qin, Omar, & Musetti, 2022; Tian, Bi, & Chen, 2023). The impact of the one-hour notification, as implemented by platforms like TikTok, is thus vastly overshadowed by the multitude of other addictive features present on the platform.
How can we protect youth?
When it comes to protecting our children against the negative impact of problematic social media use, it is crucial not to depend solely on social media platforms voluntarily taking responsibility. As potentially effective measures directly oppose their commercial interests, the current actions taken by social media companies are simply examples of screenwashing, i.e., pretending to be more socially responsible than a tech company actually is. This is exemplified by the discrepancy between the reassuring figures Meta shared publicly about views of self-harm content and the actual statistics: less than 0.05% as opposed to 6.7% (or even 16.9% among 13- to 15-year-olds) in the past seven days. Nor can we expect all underage users or their parents to develop the resilience required for safe online behaviors to outweigh the powerful addictive features of these social media platforms.
Governments should take responsibility and provide the necessary conditions for a safe digital environment for our youth (OECD, 2021) by limiting the playing field of tech companies. First and foremost, governments should take the lead in setting regulations concerning age limits and the addictive features of social media platforms. Second, governments must monitor compliance with these regulations. While efforts to increase the digital skills of children, parents, and teachers remain important, clear regulations for tech companies may reduce digital inequality and protect the (mental) health of our future generations. Coherent evidence-based policies and interventions are needed to strike a balance between harnessing the opportunities the digital environment can bring to all children and protecting them from its risks. In our view, governments all over the world have the obligation to protect the rights of the child against the rapidly expanding influence of social media companies.
Funding sources
No funding sources.
Authors' contribution
Koning conceived the idea of the study, after which Koning and Vossen jointly drafted, wrote, and revised the paper. Van den Eijnden reviewed the final paper.
Conflict of interest
We have no known conflict of interest to disclose.
References
- Bhargava, V. R., & Velasquez, M. (2021). Ethics of the attention economy. Business Ethics Quarterly, 31(3), 321–359.
- Bjørseth, B., Simensen, J. O., Bjørnethun, A., Griffiths, M. D., Erevik, E. K., Leino, T., & Pallesen, S. (2021). The effects of responsible gambling pop-up messages on gambling behaviors and cognitions: A systematic review and meta-analysis. Frontiers in Psychiatry, 11, 601800.
- Boer, M., Stevens, G. W. J. M., Finkenauer, C., Koning, I. M., & Van den Eijnden, R. J. J. M. (2021). Validation of the Social Media Disorder Scale in adolescents: Findings from a large-scale nationally representative sample. Assessment. 10.1177/10731911211027232.
- Bossen, C. B., & Kottasz, R. (2020). Uses and gratifications sought by pre-adolescent and adolescent TikTok consumers. Young Consumers, 21(4), 463–478.
- Casey, B. J. (2015). Beyond simple models of self-control to circuit-based accounts of adolescent behavior. Annual Review of Psychology, 66, 295–319.
- Ceci, L. (2023). TikTok: Number of global users 2020–2025. Retrieved September 13, 2023, from https://www.statista.com/statistics/1327116/number-of-global-tiktok-users/.
- Clark, L., & Zack, M. (2023). Engineered highs: Reward variability and frequency as potential prerequisites of behavioural addiction. Addictive Behaviors, 140, 107626. 10.1016/j.addbeh.2023.107626.
- Commonwealth of Massachusetts (2023). Meta Platforms Inc. and Instagram LLC: Complaint and jury demand. Retrieved December 20, 2023, from https://www.mass.gov/files/documents/2023/10/25/2023-10-24-Meta%20Complaint%20-%20REDACTED.pdf.
- Du, J., Kerkhof, P., & van Koningsbruggen, G. M. (2019). Predictors of social media self-control failure: Immediate gratifications, habitual checking, ubiquity, and notifications. Cyberpsychology, Behavior, and Social Networking, 22(7), 477–485. 10.1089/cyber.2018.0730.
- European Commission (2021). Digital Services Act. Retrieved December 19, 2023, from https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en.
- Geurts, S. M., Koning, I. M., Vossen, H. G. M., & van den Eijnden, R. J. J. M. (2022). Rules, role models or overall climate at home? Relative associations of different family aspects with adolescents' problematic social media use. Comprehensive Psychiatry, 116, 152318. 10.1016/j.comppsych.2022.152318.
- Harris, T. (2017). How a handful of tech companies control billions of minds every day. Retrieved July 20, 2023, from http://cases.frsgym.dk/ny-2stx211-ENG_B-31052021/files/193760_non_fiction_4.pdf.
- Helsper, E. J., & Smahel, D. (2020). Excessive internet use by young Europeans: Psychological vulnerability and digital literacy? Information, Communication & Society, 23(9), 1255–1273. 10.1080/1369118X.2018.1563203.
- Huang, C. (2020). A meta-analysis of the problematic social media use and mental health. International Journal of Social Psychiatry, 68(1), 12–33. 10.1177/0020764020978434.
- Insider Intelligence (2022). Instagram in 2022: Global user statistics, demographics and marketing trends to know. Retrieved September 4, 2023, from https://www.insiderintelligence.com/insights/instagram-user-statistics-trends/.
- Johannes, N., Veling, H., Verwijmeren, T., & Buijzen, M. (2019). Hard to resist? The effect of smartphone visibility and notifications on response inhibition. Journal of Media Psychology: Theories, Methods, and Applications, 31(4), 214. 10.1027/1864-1105/a000248.
- Keenan, C. (2023). New features for teens and families on TikTok. Retrieved September 13, 2023, from https://newsroom.tiktok.com/en-us/new-features-for-teens-and-families-on-tiktok-us.
- Lee, H. Y., Jamieson, J. P., Reis, H. T., Beevers, C. G., Josephs, R. A., Mullarkey, M. C., … Yeager, D. S. (2020). Getting fewer “likes” than others on social media elicits emotional distress among victimized adolescents. Child Development, 91(6), 2141–2159. 10.1111/cdev.13422.
- Loid, K., Täht, K., & Rozgonjuk, D. (2020). Do pop-up notifications regarding smartphone use decrease screen time, phone checking behavior, and self-reported problematic smartphone use? Evidence from a two-month experimental study. Computers in Human Behavior, 102, 22–30. 10.1016/j.chb.2019.08.007.
- Lomas, N. (2022). Instagram quietly limits ‘daily time limit’ option. Retrieved September 9, 2023, from https://techcrunch.com/2022/02/21/instagram-limits-daily-time-limits/.
- Meta (2023). Giving teens and parents more ways to manage their time on our apps. Retrieved July 20, 2023, from https://about.fb.com/news/2023/06/parental-supervision-and-teen-time-management-on-metas-apps/.
- Montag, C., Lachmann, B., Herrlich, M., & Zweig, K. (2019). Addictive features of social media/messenger platforms and freemium games against the background of psychological and economic theories. International Journal of Environmental Research and Public Health, 16(14), 2612. 10.3390/ijerph16142612.
- Nadkarni, A., & Hofmann, S. G. (2012). Why do people use Facebook? Personality and Individual Differences, 52(3), 243–249. 10.1016/j.paid.2011.11.007.
- OECD (2021). Recommendation of the Council on children in the digital environment. Retrieved April 20, 2023, from https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0389.
- Ofcom (2023). Children and parents: Media use and attitudes report 2023. Retrieved from https://www.ofcom.org.uk/__data/assets/pdf_file/0027/255852/childrens-media-use-and-attitudes-report-2023.pdf.
- Qin, Y., Omar, B., & Musetti, A. (2022). The addiction behavior of short-form video app TikTok: The information quality and system quality perspective. Frontiers in Psychology, 13. 10.3389/fpsyg.2022.932805.
- Shapiro, L. A. S., & Margolin, G. (2014). Growing up wired: Social networking sites and adolescent psychosocial development. Clinical Child and Family Psychology Review, 17(1), 1–18. 10.1007/s10567-013-0135-1.
- Sharif, I., Wills, T. A., & Sargent, J. D. (2010). Effect of visual media use on school performance: A prospective study. Journal of Adolescent Health, 46, 52–61. 10.1016/j.jadohealth.2009.05.012.
- Siebers, T., Beyens, I., Pouwels, J. L., & Valkenburg, P. M. (2021). Distracted or not? An experience sampling study on adolescents’ social media use and self-control failure. OSF Preprints.
- Team Snap (2023). Introducing content controls on Family Center. Retrieved September 13, 2023, from https://values.snap.com/en-GB/news/introducing-content-controls-on-family-center.
- Thorn (2021). Responding to online threats. Retrieved September 4, 2023, from https://info.thorn.org/hubfs/Research/Responding%20to%20Online%20Threats_2021-Full-Report.pdf.
- Throuvala, M. A., Griffiths, M. D., Rennoldson, M., & Kuss, D. J. (2019). Motivational processes and dysfunctional mechanisms of social media use among adolescents: A qualitative focus group study. Computers in Human Behavior, 93, 164–175. 10.1016/j.chb.2018.12.012.
- Tian, X., Bi, X., & Chen, H. (2023). How short-form video features influence addiction behavior? Empirical research from the opponent process theory perspective. Information Technology & People, 36(1), 387–408.
- UK Parliament (2023). Online Safety Act 2023. Government Bill. Retrieved December 19, 2023, from https://bills.parliament.uk/bills/3137.
- Van den Eijnden, R., Geurts, S., Van der Rijst, V., Ter Bogt, T., & Koning, I. M. (2021). The impact of social media disorder symptoms and social media use on sleep, and the preventing role of parental rules. International Journal of Environmental Research and Public Health, 18(3), 1346. 10.3390/ijerph18031346.
- Van den Eijnden, R. J. J. M., Koning, I. M., Doornwaard, S., Van Gurp, F., & Bogt, T. T. (2018). The impact of heavy and disordered use of games and social media on adolescents’ psychological, social, and school functioning. Journal of Behavioral Addictions, 7(3), 697–706. 10.1556/2006.7.2018.65.
- Van den Eijnden, R. J. J. M., Lemmens, J. S., & Valkenburg, P. M. (2016). The Social Media Disorder Scale. Computers in Human Behavior, 61, 478–487. 10.1016/j.chb.2016.03.038.
- Wallaroo Media (2023). TikTok statistics. Retrieved April 25, 2023, from https://wallaroomedia.com/blog/social-media/tiktok-statistics/.
- Williams, J. (2018). Stand out of our light: Freedom and resistance in the attention economy. Cambridge University Press.