TikTok fined £12.7 million by UK information watchdog for misusing children’s data


The data watchdog for the United Kingdom has fined social media giant TikTok £12.7 million ($15.9 million) for breaching data protection laws, including using the personal data of children under 13 without parental consent.

TikTok is estimated to have allowed as many as 1.4 million UK children under 13 to use its platform in 2020, even though it sets 13 as the minimum age to create an account, said the Information Commissioner’s Office (ICO).

That is despite UK data protection laws making it clear that organisations which use personal data when offering information society services to children under 13 must have consent from their parents or carers.

The ICO said the breaches occurred between May 2018 and July 2020, during which the Chinese-owned video-sharing app, which is hugely popular with teenagers, did not do enough to check who was using the platform and to remove the underage children who were.

TikTok failed to meet its legal obligations even though it ought to have been aware that under-13s were using its platform, and it did not carry out adequate checks to identify and remove them.

The ICO investigation found that a concern was raised internally with some senior employees about children under 13 using the platform and not being removed. In the ICO’s view TikTok did not respond adequately.

John Edwards, UK Information Commissioner, said: “There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws.

“As a consequence, an estimated one million under 13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

‘TikTok ought to have been aware under-13s were using its platform’

In a statement, a TikTok spokesperson said: “TikTok is a platform for users aged 13 and over. We invest heavily to help keep under-13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.

“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”

Since the conclusion of the ICO’s investigation of TikTok, the regulator has published the Children’s Code, designed to help protect children in the digital world. It is a statutory code of practice aimed at online services, such as apps, gaming platforms and web and social media sites, that are likely to be accessed by children.

The code sets out 15 standards to ensure children have the best possible experience of online services.

Sources: Reuters UK and The Guardian

Author: Simon Weedy
