‘Tame your toxic algorithms to protect children online’, tech firms told

Ofcom, the UK’s communications regulator, says under-18s may be banned from using social media if providers do not comply with new online safety rules.

The regulator has published draft children’s safety codes of practice, which require social media firms to put strict measures in place for checking the age of their users.

Technology firms ‘must act to stop their algorithms recommending harmful content to children’, says Ofcom, which has set out some 40 practical measures outlining how it expects companies to meet their legal responsibilities to children.

The Online Safety Act imposes strict new duties on services that can be accessed by children, including popular social media sites, apps and search engines. Firms must first assess the risk their service poses to children and then implement safety measures to mitigate those risks.

‘Legal responsibilities to children’

These duties include preventing children from encountering the most harmful content relating to suicide, self-harm, eating disorders, and pornography. Services must also minimise children’s exposure to other serious harms, including violent, hateful or abusive material, online bullying, and content promoting dangerous challenges.

Dame Melanie Dawes, Chief Executive of Ofcom, said: “We want children to enjoy life online. But for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.

“In line with new online safety laws, our proposed codes firmly place the responsibility for keeping children safer on tech firms. They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that’s right for their age.

“Our measures – which go way beyond current industry standards – will deliver a step-change in online safety for children in the UK. Once they are in force we won’t hesitate to use our full range of enforcement powers to hold platforms to account. That’s a promise we make to children and parents today.”

A spokesperson for Meta, which owns Facebook and Instagram, said the company wanted young people ‘to connect with others in an environment where they feel safe’.

‘Additional responsibilities’

“Content that incites violence, encourages suicide, self-injury or eating disorders breaks our rules and we remove that content when we find it,” they added.

A spokesperson for Snapchat said it acknowledged that it had ‘additional responsibilities to create a safe and positive experience’, adding: “We support the aims of the Online Safety Act and work with experts to inform our approach to safety on Snapchat.”

Author: Simon Weedy
