TW: eating disorders and self-harm.

Friend,

Imagine you're a 13-year-old girl who just created a YouTube account. You watch a video promoting eating disorders. It's the first time you've encountered such content. What does YouTube's algorithm do?

Instead of directing teens away from this dangerous content, it starts recommending more harmful videos - including content about self-harm and suicide.

CCDH's disturbing new report shows that YouTube's algorithm is pushing young girls toward videos that glorify eating disorders and promote self-harm, content that could lead to fatal consequences.

How do we know this? CCDH created a test account posing as a 13-year-old girl. Then we simulated the experience of this girl encountering an eating disorder video for the first time. We repeated this test 100 times, clearing the account's history and cookies between simulations. In each of the 100 simulations, we analyzed the top 10 videos in YouTube's "Up Next" panel displayed next to the video we were watching.
In total, we analyzed 1,000 video recommendations (100 simulations x 10 recommendations each). This is what we found:

- 1 in 3 promoted harmful eating disorder content.
- 2 in 3 were related to eating disorders or weight loss.
- 1 in 20 involved self-harm or suicide content.