Video-sharing apps face heavy fines if they fail to protect children
Video-sharing apps like Snapchat and TikTok, which are hugely popular with young people, could be fined millions of pounds if they fail to tackle inappropriate content, under new rules from Ofcom, the UK’s broadcasting regulator.
Ofcom said companies now have a legal duty to crack down on anything from child sexual abuse material to hate speech found on their apps, ensure underage children cannot sign up for accounts, and make it easier for users to report inappropriate content.
Video sharing platforms (VSPs) are a type of online video service where users can upload and share videos with other members of the public. They allow people to engage with a wide range of content and social features.
Ofcom says its research shows that a third of users have witnessed or experienced hateful content; a quarter say they have been exposed to violent or disturbing content; and one in five have seen videos that encouraged racism.
It has set out guidance to help VSPs meet their obligations:
- Provide clear rules around uploading content. Uploading content relating to terrorism, child sexual abuse material or racism is a criminal offence. Platforms should have clear, visible terms and conditions which prohibit this – and enforce them effectively.
- Have easy reporting and complaint processes. Companies should implement tools that allow users to flag harmful videos easily. They should signpost how quickly they will respond, and be open about any action taken. Providers should offer a route for users to formally raise concerns with the platform, and to challenge its decisions. This is vital to protect the rights and interests of users who upload and share content.
- Restrict access to adult content. VSPs that host pornographic material should have robust age-verification in place to prevent under-18s from accessing it.
Dame Melanie Dawes, Ofcom Chief Executive, said: “Online videos play a huge role in our lives now, particularly for children. But many people see hateful, violent or inappropriate material while using them.
“The platforms where these videos are shared now have a legal duty to take steps to protect their users. So we’re stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future.”