Children first see violent content online in primary school and believe it is “inevitable”, a study by Ofcom has revealed.
All 247 children interviewed by the watchdog said they had seen violent content online, including violence in adult-rated video games, fighting and verbal abuse.
Social media and group chats were the most common ways they discovered the content, with many saying they viewed it when they were below the site’s minimum age.
The research found that sharing videos of school fights had become normalised for many children.
Others said they had seen more extreme violence, such as that involving gangs, but much less frequently.
Some children aged 10 to 14 said they found violent content "funny", while others said they felt pressure to watch it and feared being singled out if they did not.
Ofcom said teenage boys were most likely to share such videos, often doing so to become more popular by attracting comments or likes, or simply to “fit in”.
Some children said they encountered violence through strangers' posts on their news feeds or through what they called "algorithms". Many said they felt they had little control over what they saw, and some felt uneasy, scared or anxious as a result.
Separate Ofcom research – conducted by Ipsos and social research agency Tonic – found that young people who had viewed content about self-harm, suicide and eating disorders described such material on social media as "prolific".
This amounts to a “collective normalization and general desensitization” to these issues, the study said.
Some children said it made their symptoms worse, while others said it exposed them to new self-harm methods.
A third study, conducted for Ofcom by the UK's National Centre for Social Research and City University, found that cyberbullying occurs wherever children interact online, with comment functions and direct messaging being key drivers.
Some children said they were bullied by being added to group chats without their consent.
Ofcom said a key factor in all three studies was children’s lack of confidence and trust in reporting their concerns.
Those who did report said they often received only generic information in response, while others said the reporting process was too complicated or worried about their anonymity.
Activists have long urged social media companies to do more to prevent children from seeing harmful content.
Ian Russell has accused the companies of still "pushing harmful content to millions of young people", six years after the death of his daughter Molly. The 14-year-old took her own life after viewing posts about suicide, depression and anxiety.
Esther Ghey, the mother of murdered teenager Brianna Ghey, has also said mobile phones should be made specifically for children under 16 to protect them from online harm.
The Online Safety Act, passed last year, requires online service providers to minimise the reach of illegal and harmful content.
However, a parliamentary committee said last month that the benefits may not be felt for some time, as full implementation of the law has been delayed to 2026.
The report also states that Ofcom is unable to act on individual complaints and only steps in if there are “systemic issues” with providers.
Gill Whitehead, director of Ofcom’s online safety group, said of the recent research: “Children should not consider seriously harmful content – including content that depicts violence or promotes self-harm – to be an inevitable part of their online lives.
“Today’s research sends a strong message to tech companies that the time to act is now so they are ready to meet their child protection duties under new online safety laws.
“Later this spring we will consult on how the industry can ensure children have age-appropriate, safer online experiences.”