Users Alarmed by Surge in Eating Disorder Content on X

Debbie was scrolling through X in April when her feed was unexpectedly flooded with disturbing posts she had not subscribed to. One image showed an alarmingly underweight person asking whether they were ‘thin enough’. Another detailed a user’s daily calorie deprivation. Both posts originated from a group with more than 150,000 members that she had never followed.

Out of curiosity, Debbie clicked into the group. ‘As you scroll down, it’s all pro-eating-disorder messages,’ she said. ‘People asking for opinions on their bodies, people asking for advice on fasting.’ In some posts, administrators urged members to ‘remember why we are starving’.

The Observer found seven further groups with a combined membership of nearly 200,000, all openly sharing content glorifying eating disorders. Each of the groups was created after Elon Musk’s 2022 takeover of Twitter, which he later rebranded as X.

Eating disorder campaigners say the surge of harmful content points to serious moderation failures on X. Wera Hobhouse MP said: ‘These findings are most concerning… X should be held accountable for allowing this harmful content to be promoted on its platform, which puts many lives at risk.’

Pro-eating-disorder content, sometimes called ‘pro-ana’, has circulated online for decades, from early message boards to platforms such as Tumblr and Pinterest, which banned content promoting eating disorders and self-harm in 2012 after a public outcry.

Debbie remembers the old pro-ana message boards. ‘But you’d have to search to find them,’ she said. Now such content is more accessible than ever, and critics argue that social media algorithms push increasingly extreme material to users, escalating the danger.

Calls for stronger social media safeguards have grown in response to tragedies linked to harmful online content. The death of 14-year-old Molly Russell, who took her own life in 2017 after viewing suicide and self-harm posts, brought increased scrutiny, and in 2019 Instagram, now owned by Meta, banned graphic self-harm content. The Online Safety Act, passed in 2023, requires tech companies to protect children from harmful content or face substantial fines. However, Baroness Parminter noted that the act does not adequately safeguard adults. ‘The duties on social media providers are only for content that children might see… And of course eating disorders don’t stop when you’re 18,’ she said.

X’s policies prohibit content promoting self-harm, including eating disorders. Users can report violations and filter their timelines, but concerns about enforcement have grown since Musk’s takeover, when thousands of staff were laid off, including much of the content moderation team.

Hannah Whitfield, a mental health activist, deleted her social media accounts in 2020 to aid her recovery from an eating disorder, but later returned to some platforms, including X. She found ‘thinspiration’ posts in her For You feed. ‘What I found with [eating disorder content] on X was that it was much more extreme and more radicalized,’ she said. ‘It definitely felt a lot less moderated and a lot easier to find really graphic stuff.’

Eating disorder charities stress that social media is not the root cause of eating disorders, but acknowledge it can worsen the problem for people already struggling. Researchers have likened the pull of pro-eating-disorder communities to a form of radicalization. A study by the University of Southern California found that ‘content related to eating disorders can be easily reached via tweets about “diet”, “weight loss” and “fasting”’.

Paige Rivers, who was diagnosed with anorexia at the age of 10 and is now training to be a nurse, found that X’s settings for blocking certain hashtags are easily bypassed. ‘People started using hashtags that were slightly different, like anorexia altered with numbers and letters, and it would slip through,’ she said.

Tom Quinn, a director at the eating disorder charity Beat, said: ‘The fact that these so-called “pro-ana” groups are allowed to proliferate shows an extremely worrying lack of moderation on platforms like X.’

For people like Debbie, social media promised support but delivered the opposite. ‘It puts me off using social media, which is really sad because I struggle to find people in a similar situation, or people that can offer advice for what I’m going through,’ she said. X did not respond to requests for comment.

The unchecked spread of eating disorder content on X has spotlighted the platform’s moderation failures and caused real distress for users. As the debate over social media’s role in mental health continues, the need for accountable, proactive moderation has never been clearer.

Source: The Guardian
