Social media giant investigated over concerns of being too addictive for young people


The European Commission is investigating whether Meta, through its Facebook and Instagram social media platforms, is creating addictive behaviour and damaging the mental health of youngsters.

Formal proceedings are underway to assess whether Meta has breached the Digital Services Act (DSA) in areas specifically linked to the protection of minors.

The commission says it is concerned that the algorithms used by these social media platforms may ‘stimulate behavioural addictions’ in children, as well as create what it calls ‘rabbit-hole effects’, where young people are fed increasingly negative content.

An in-depth investigation is now underway ‘as a matter of priority’, says the commission, which will continue to gather evidence, for example by sending additional requests for information and conducting interviews or inspections.

These proceedings also allow the commission to take further enforcement steps to establish whether Meta has indeed breached the Digital Services Act, a landmark law that came into effect last year and makes digital companies liable for various forms of ‘online harm’.

‘Mitigate the risk of negative effects’

Announcing the opening of proceedings against Meta, Thierry Breton, European Commissioner for Internal Market, said: “We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram.

We will now investigate in-depth the potential addictive and “rabbit hole” effects of the platforms, the effectiveness of their age verification tools, and the level of privacy afforded to minors in the functioning of recommender systems. We are sparing no effort to protect our children.”

The investigations aim to specifically address the following:

  • Meta’s compliance with DSA obligations on the assessment and mitigation of risks caused by the design of Facebook’s and Instagram’s online interfaces, which may exploit the weaknesses and inexperience of minors, cause addictive behaviour, and/or reinforce so-called ‘rabbit hole’ effects. Such an assessment is required to counter potential risks to the exercise of children’s fundamental right to physical and mental well-being, as well as to respect for their rights.
  • Meta’s compliance with DSA requirements in relation to the mitigation measures to prevent access by minors to inappropriate content, notably age-verification tools used by Meta, which may not be reasonable, proportionate and effective.
  • Meta’s compliance with DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of their recommender systems.

Margrethe Vestager, Executive Vice-President for a Europe Fit for the Digital Age, said: “With the Digital Services Act we established rules that can protect minors when they interact online. We have concerns that Facebook and Instagram may stimulate behavioural addiction and that the methods of age verification that Meta has put in place on its services are not adequate, and we will now carry out an in-depth investigation. We want to protect young people’s mental and physical health.”



Author: Simon Weedy
