YouTube to hire thousands of human moderators to remove videos that harm children

YouTube plans to hire moderators to remove content that is not suitable for children, since disturbing and harmful videos are easily accessible on YouTube, both via the YouTube Kids app and directly on the site.

Last month The New York Times reported how disturbing scenes, such as “Mickey Mouse in a pool of blood while Minnie Mouse watches, aghast”, are easily accessible on the YouTube Kids app. Now, in an attempt to make the video platform family friendly again, Susan Wojcicki, CEO of YouTube, announced in an official blog post that the company will increase its number of human moderators in 2018, as “some bad actors are exploiting our openness to mislead, manipulate, harass or even harm.”

YouTube usage among children

A newly released research report shows that the use of “YouTube has increased since 2016 by 11 percentage points for children aged 3-4, by 17 percentage points for 5-7s and by eight percentage points for 8-11s.” The report also revealed that the use of YouTube, whether via the website or app, “increases with the age of the child, accounting for forty-eight percent of ages 3-4, seventy-one percent of 5-7s, eighty-one percent of 8-11s and ninety percent of 12-15s.”

Wojcicki also discussed how YouTube is working “with NCMEC, the IWF, and other child safety organizations around the world to report predatory behavior and accounts to the correct law enforcement agencies.”

On Tuesday 6 February 2018, the world celebrates Safer Internet Day. The goal is to bring together a range of stakeholders, such as children, parents, companies, and policymakers, to “promote the safe, responsible and positive use of digital technology for children and young people”.

Author: Julia Zvobgo

Julia Zvobgo is the Community Manager of Child in the City.
