Social Media Giants Accused of Hiding Research on Teen Mental Health Harm

Internal documents and statements from Meta, YouTube, TikTok, and Snapchat reveal that these social media giants were aware of the addictive nature of their platforms and the potential harm to teens, yet continued to target them. According to a newly unsealed legal filing, internal communications show executives acknowledging that the platforms’ designs could harm users’ mental health, with one internal message comparing Instagram to a drug and another noting that minors lack the executive function to control their screen time. The lawsuit, brought by several school districts and individuals, alleges that the companies prioritized profit over user safety by deliberately designing features to maximize youth engagement and advertising revenue. While the companies deny the allegations, the filing raises questions about the effectiveness of their safety features and the extent of their awareness of the platforms’ negative impact.

Instagram’s Algorithm: Fueling Vulnerability with “Eating Disorder Adjacent” Content

Instagram serves more “eating disorder adjacent” content to vulnerable teens, according to internal Meta research, and that’s a disturbing reality we need to grapple with. Driven by algorithms designed to maximize engagement, the platform appears to actively push users toward content that preys on their insecurities, fears, and vulnerabilities. This isn’t just about showing someone more of what they like; it’s about exploiting their weaknesses for profit.

The core issue seems to be that Meta’s algorithm is fundamentally built to capitalize on our obsessions, often those related to our perceived shortcomings. It’s almost as if the system is designed to identify and then amplify negative thought patterns.