
YouTube Will Grow Video Moderation Team to Over 10,000 in 2018

As YouTube's problems with extremist and child-unsafe content drag on, the company has found that there's no substitute for human moderators, and that it takes a large team to sift through the vast volume of video it receives every hour. In a blog post yesterday, YouTube CEO Susan Wojcicki announced that the company will grow its content moderation team to over 10,000 people in 2018. While some publications are reporting that YouTube will add 10,000 staff, that's not quite right: The post didn't say how many people already work on moderation at YouTube, or how many will be transferred from other positions.

With such a huge volume of content, automated processes are still YouTube's primary defense. Wojcicki said 98 percent of the videos removed for violent extremist content are identified by machine-learning algorithms. YouTube started using this system in June, and automation has since done the work of 180,000 people working 40-hour weeks. Nearly 70 percent of violent extremist content is removed within eight hours of upload, she said.

YouTube is taking these aggressive actions to convince advertisers that it's still a safe choice. The company will apply more stringent criteria for which channels and videos can carry ads, Wojcicki wrote, although she didn't say when that would happen or what the new criteria would be. It can't come soon enough: Digiday reported today that ad buyers are moving to guaranteed brand-safe environments like Hulu and ESPN, occasionally driving prices higher than those for primetime TV slots.
