A new study has found that half of girls were shown harmful online material, including posts about self-harm, suicide and eating disorders, on social media apps within a single week.
Analysis of data from nearly 2,000 young people revealed that girls are exposed to significantly more harmful posts than boys, and that teenagers were twice as likely to encounter “high-risk” content on TikTok and X compared with other major platforms.
The suicide prevention charity the Molly Rose Foundation, which conducted the research in the weeks before the implementation of the Online Safety Act, said its findings showed that teenagers were being algorithmically recommended harmful content on an “incredibly disturbing scale”.
The study said that high-risk posts were reaching children without being searched for: more than 50 per cent of teens surveyed were algorithmically served potentially high-risk content in recommended feeds such as TikTok’s “For You” page.
The charity accused the algorithms of delivering potentially dangerous content to vulnerable teenagers and “targeting those at greatest risk of its impact”. It said 68 per cent of children categorized as having poor mental health encountered high-risk suicide, self-harm, depression or eating disorder content within a week.
The charity said children experiencing poor mental health or with special educational needs and disabilities (SEND) were also more likely to encounter high-risk content, with two in five reporting that it appeared in their feeds.
Named after 14-year-old Molly Russell, who died in 2017 from an act of self-harm while suffering from depression and “the negative effects of online content”, the Molly Rose Foundation said the data showed that levels of exposure to suicide and self-harm material before the Act came into force were “much higher than previously understood”.

Introduced in 2023, the Online Safety Act aims to regulate and curb harmful online content, requiring major platforms to block these high-risk types of content from appearing in children’s feeds or to reduce how often they appear.
But the charity said its findings should act as a “wake-up call” to the “urgent need” to strengthen the law.
Andy Burrows, chief executive of the Molly Rose Foundation, said: “This unprecedented study shows that just weeks before the Online Safety Act came into force, teenagers were being exposed to high-risk suicide, self-harm and depression material on an incredibly disturbing scale, with girls and vulnerable children facing a significant increase in the risk of harm.
“The extent to which girls are being bombarded with harmful content far exceeds our previous understanding and raises our concerns that Ofcom’s current approach to regulation ultimately fails to match the urgency and ambition needed to save lives.
“Technology Secretary Liz Kendall must now seize the opportunity to build on and strengthen the Online Safety Act and act decisively to put children and families before the Big Tech status quo.”
An Ofcom spokesperson said that under new measures in the Online Safety Act designed to protect children, any site that allows suicide, self-harm and eating disorder content must have highly effective age checks to prevent children from viewing it. It said tech companies should also restrict other harmful content from appearing in children’s feeds.
“Later this year, we will also publish new guidance on the steps sites and apps can take to help women and girls live safe lives online – recognizing the harms that disproportionately affect them,” it said.
X declined to comment but directed The Independent to its policies prohibiting content that promotes or encourages self-harm. TikTok was also contacted for comment.
A Department for Science, Innovation and Technology (DSIT) spokesperson said: “Although this research predates the enforcement of child protection duties on July 25, we expect young people will now be protected from harmful content, including content promoting self-harm or suicide, as platforms comply with the legal requirements of the Act. This means safer algorithms and less toxic feeds.
“Services that fail to comply can expect tough enforcement from Ofcom. We are determined to hold tech companies accountable and keep children safe.”