Analyzing Social Media Claims regarding Youth Online Safety Features to Identify Problem Areas and Communication Gaps
By: Renkai Ma, Dominique Geissler, Stefan Feuerriegel, and more
Social media platforms have faced increasing scrutiny over whether and how they protect youth online. While prior research has well documented the online risks children face, how platforms communicate about these risks and about their efforts to improve youth safety has not been holistically examined. To fill this gap, we analyzed N=352 press releases and safety-related blog posts published between 2019 and 2024 by four platforms popular among youth: YouTube, TikTok, Meta (Facebook and Instagram), and Snapchat. Leveraging both inductive and deductive qualitative approaches, we developed a comprehensive framework of seven problem areas where risks arise and a taxonomy of safety features that platforms claim address these risks. Our analysis revealed uneven emphasis across problem areas: most communications focused on Content Exposure and Interpersonal Communication, whereas Content Creation, Data Access, and Platform Access received less attention. We also identified three problematic communication practices around the described safety features: discrepancies between feature implementation and availability, unclear or inconsistent explanations of how safety features operate, and a lack of evidence that safety features effectively mitigate risks once implemented. Based on these findings, we discuss the communication gaps between risks and the described safety features, as well as the tensions in achieving transparency in platform communication. Our analysis of platform communication informs guidelines for responsibly communicating about youth safety features.
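As a purely hypothetical illustration (not the authors' method, which is qualitative), a deductive coding pass over platform communications against the problem areas named in the abstract might look like the Python sketch below; the keyword lists and sample posts are invented for demonstration.

from collections import Counter

# Hypothetical keyword dictionary for the problem areas named in the
# abstract; the real study used human qualitative coding, not keywords.
PROBLEM_AREA_KEYWORDS = {
    "Content Exposure": ["recommended content", "sensitive content", "feed"],
    "Interpersonal Communication": ["messaging", "comments", "contact"],
    "Content Creation": ["posting", "livestream", "upload"],
    "Data Access": ["privacy", "data collection", "personal information"],
    "Platform Access": ["age verification", "account creation", "screen time"],
}

def code_document(text: str) -> list[str]:
    """Return every problem area whose keywords appear in the text."""
    lower = text.lower()
    return [
        area
        for area, keywords in PROBLEM_AREA_KEYWORDS.items()
        if any(kw in lower for kw in keywords)
    ]

# Invented sample communications, stand-ins for press releases or blog posts.
posts = [
    "New age verification checks at account creation.",
    "Stricter defaults for who can send messaging requests to teens.",
]

counts = Counter(area for post in posts for area in code_document(post))
print(counts)  # Counter({'Platform Access': 1, 'Interpersonal Communication': 1})

Tallies like these could surface the uneven emphasis the paper reports, but keyword matching is only a crude proxy for the inductive and deductive coding the authors describe.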
Similar Papers
Towards an Automated Framework to Audit Youth Safety on TikTok
Computers and Society
Finds that TikTok exposes minors to as much harmful content as adults.
Protecting Young Users on Social Media: Evaluating the Effectiveness of Content Moderation and Legal Safeguards on Video Sharing Platforms
Social and Information Networks
Finds that younger users encounter harmful videos more quickly.
Social Media for Activists: Reimagining Safety, Content Presentation, and Workflows
Human-Computer Interaction
Proposes designs that make social media safer for activists.