There is growing evidence of harm to children online, but addressing the risks is complex because the risks are often highly sensitive, their causes vary, long-term effects are hard to anticipate, and multiple stakeholders are required for the solutions.

The European Parliament has done a great deal to protect the rights of children in the digital environment, as have other stakeholders operating in the single market. But significant harm continues, and calls for action are escalating. So more must be done.

Recent policy developments

The past two decades have prioritised self-regulation, public awareness-raising, technological tools and solutions, and the fight against child sexual abuse online. Recent efforts to protect minors must be understood in relation to wider efforts by the European Commission to further the Digital Single Market. Our policy brief discusses:

  • The General Data Protection Regulation (GDPR) includes several provisions aimed at enhancing the protection of children’s personal data online, such as obliging service providers to use clear and plain language that children can easily understand in all information society services that require personal data processing. Although the GDPR’s goal is not specifically to protect children from harm, it may have consequences for child protection.
  • Seeking to create a more level playing field, the recently revised Audiovisual Media Services Directive (AVMSD) updates content and advertising rules to create a single unified standard for the obligations of audiovisual media service providers regarding content that might harm minors. This means that video-sharing platforms such as YouTube fall under the revised directive, as does audiovisual content shared on social media services such as Facebook.
  • The Better Internet for Kids (BIK) Policy Map documented wide support for the BIK strategy, demonstrating many successes for child online safety policies. But many gaps remain in policy governance and stakeholder participation, with disappointingly few improvements since the last BIK mapping exercise in 2014.
  • Other developments include the EU Human Rights Guidelines on Freedom of Expression online and offline, which include “media and internet literacy,” and the “Code of conduct on countering illegal hate speech online,” which seems to offer a successful model for co-regulation, although there is no specific focus on children.

The challenges

Pressing challenges facing GDPR, AVMSD and BIK centre on the implementation of legislation (especially GDPR), the effectiveness of self-regulation (especially AVMSD) and the media literacy of the public (presumed, one way or another, in most initiatives and instruments).

  • Most obviously, parents and children struggle to understand the available options and tools, as well as the risks they face and their responsibilities. Many are also frustrated with and worried by the sense of an unresponsive digital environment that doesn’t cater to their needs, respond to their concerns, or provide the tools they need. And they are confused by the different approaches to provision of options and tools offered by different companies, with parents often unable to find the support they want and children often able to evade the protections in place.
  • While the requirement in the AVMSD that video-sharing platforms put in place measures to protect minors and others is a welcome move, it places a considerable burden on providers to self-regulate in a transparent and effective manner. Any measures implemented must remain compatible with digital intermediaries’ liability exemptions under the E-Commerce Directive, but there is no clear guidance on how this is to be achieved.
  • Media literacy is often cited as a solution to societal problems that involve the media. As new issues continue to arise (e.g. the need for critical information literacy given the rise of disinformation and ‘fake news’), it is widely agreed that the need for media literacy is only likely to grow. However, little is known about actual levels of media and information literacy, it is difficult to see how media literacy could be delivered effectively to the adult population, and current legislation does not address the issue.

What should be done

We suggest that the European Parliament should establish and promote clear common standards and assist coordination among stakeholders. Parents, children, teachers and the wider public, as well as the industry, need clarity on what can be expected, and more needs to be done to help the market develop kitemarks, effective filters and other needed measures.

As is widely said but rarely implemented, this must be complemented by stepping up educational and awareness-raising efforts to promote the media literacy of the population. All this not only needs to be done but also to be seen to be done, to build public trust at a time of crucial change.

Specifically, we recommend:

  • The creation of a comprehensive Code of Conduct for the converged digital environment that sets minimum standards for providers of services used by children. Ideally these would be embedded into the design of devices and services with the child’s best interests as paramount. To address the accumulating problems of self-regulation, the Code should be underpinned by strong backstop powers, independent monitoring and evaluation, and a trusted and sufficiently-resourced body to ensure compliance. It should guide intermediaries in their child protection responsibilities and provide clear consumer information and protections if services are not intended for children. Thus it should be useful to the industry and thereby support the digital single market, reducing business uncertainty and standardising norms and practice across member states.
  • The adoption of a Recommendation that promotes an integrated approach to media literacy, defined broadly to support critical understanding, creative production and participation, as well as protective actions and technical skills. The scope should be updated as the digital environment evolves. It should be promoted consistently through all relevant EU policies and applied in national contexts from nursery years onwards, encompassing both formal and informal education and relevant cultural and information institutions, as well as encouraging wider voluntary participation. The reporting obligation in the AVMSD is vital, as is appropriate follow-up action.
  • For effective coordination, which is significantly lacking at present, we recommend that the Commission should convene a permanent High Level Expert Group to integrate the Code of Conduct and the Recommendation on media literacy, and to encourage beneficial actions by Member States. This would provide the essential coordination across multiple stakeholder actions and ensure clear common standards. Further, its work should be inclusive, accountable, timely, independently evaluated and evidence based. It (or a related body) should also be public-facing, with a single and well-publicised point of contact to reach and support the public.
  • We recommend the provision of dedicated European funding for regular pan-EU data collection, to ensure robust, up-to-date evidence to guide the development of EU policy on the protection of minors in the digital age.

All these actions must include the meaningful participation of children themselves (as is their right to be consulted) and those relevant experts able to represent children’s best interests.

Unless we take action on online child protection, legal uncertainty and disputes will continue, and we run the risk of excessive legislation being enacted in response.

The full policy brief is available here: http://eprints.lse.ac.uk/90731/

Courtesy LSE Media Policy Project.