UK puts free speech ahead of ‘legal but harmful’ online content
The UK government will not force social media firms and other tech giants to remove ‘legal but harmful’ content from their platforms, following concerns that such a move could curtail free speech.
Online safety laws would instead, says the government, focus on protecting children and ensuring companies remove content that is illegal or prohibited in their terms of service.
Any incentives for social media firms to over-remove people’s legal online content will be taken out of the much-vaunted Online Safety Bill, which has been in development for several years.
Firms will still need to protect children and remove content that is illegal or prohibited in their terms of service; however, the Bill will no longer define specific types of legal content that companies must address.
‘Firms will still need to protect children’
Michelle Donelan, the UK’s Digital Secretary, said that it was her aim to stop unregulated social media platforms damaging children.
“I will bring a strengthened Online Safety Bill back to Parliament which will allow parents to see and act on the dangers sites pose to young people,” she said. “It is also freed from any threat that tech firms or future governments could use the laws as a licence to censor legitimate views.”
Britain, like the European Union and other countries, has been grappling with the problem of legislating to protect users, and in particular children, from harmful user-generated content on social media platforms without damaging free speech.
The revised Online Safety Bill, which returns to Parliament next month for further scrutiny, puts the onus on tech companies to take down material in breach of their own terms of service and to enforce their user age limits to stop children circumventing authentication methods, the government said.
If users were likely to encounter controversial content such as the glorification of eating disorders, racism, anti-Semitism or misogyny not meeting the criminal threshold, the platform would have to offer tools to help adult users avoid it, it said. Only if platforms failed to uphold their own rules or remove criminal content could a fine of up to 10 per cent of annual turnover apply.
‘This is the reason I am on a mission’
Lucy Alexander’s son, Felix, took his own life aged just 17 after what his mother calls a ‘barrage of bullying’ online. “This is the reason I am on a mission to make sure no other child feels as much pain as he did. One death is one too many,” she said.
“The Online Safety Bill is a step in the right direction; it will hold social media accountable for protecting children online,” added Lucy. “The new changes to the bill will also see social media firms forced to publish risk assessments so that parents can see for themselves the dangers and risks that face children on these sites. It is time to prioritise the safety of our children and young people online.”
However, Ian Russell, the father of teenager Molly Russell, who took her own life after viewing suicide and self-harm content online, said the bill had been watered down. He told the BBC that he believed the content most harmful to his daughter could be described as ‘legal but harmful’.
“It is very hard to understand that something that was important as recently as July, when the bill would have had a full reading in the Commons and was included in the bill, this legal but harmful content, it is very hard to understand why that suddenly can’t be there,” he added.