
YouTube Execs: Take Responsibility for the Mess You've Created


If you haven't read Mark Bergen's "YouTube's Executives Ignored Warnings, Letting Toxic Videos Run Rampant" in Bloomberg yet, you should. It's a damning exposé that reveals that YouTube has not only been reluctant to deal with misinformation on its service, but has indirectly encouraged it by placing engagement and revenue above all else. It's by no means the first article of its kind, but it's by far the best researched and most exhaustively reported.

Historically, YouTube's response to criticisms that it's been complicit in spreading lies, conspiracy theories, and hate speech has wavered between dismissive defensiveness and half-hearted attempts to address the problems. At SXSW in 2018, CEO Susan Wojcicki said YouTube was "really more like a library" than a broadcaster, trying to align the company with institutions that have come under attack while ignoring the fundamental difference: Librarians actively assess the value of the content on their shelves, and can assist their patrons in separating the wheat from the chaff.

YouTube employs thousands of content moderators, but according to Bergen, they often haven't been given clear guidance on how to deal with conspiracy videos, and only recently did the site address "borderline" content—videos that don't quite violate YouTube's guidelines, but come close. Those videos remain on the site but no longer appear in viewers' recommendations, implementing a proposal first considered (and rejected) in 2016. Likewise, YouTube's effort to counter the impact of anti-vaxxer videos by displaying a Wikipedia entry alongside the clips is hit-and-miss.

As a journalist, I'm about as close to a free speech absolutist as you can get. But these aren't free speech issues—the government is not, as yet, telling YouTube or any other platform what it can or cannot publish, nor should it. For conspiracy theorists, hate speech mongers, and the trust-fund libertarians so prevalent in Silicon Valley to hide behind free speech as a defense is disingenuous. Unless someone proposes government involvement—and I'm certainly not proposing it—this isn't a free speech or censorship issue.

But it's long past time that YouTube—along with Twitter, Facebook, and other social media sites that now act as news sources—take these issues more seriously. For all its faults, Facebook has taken the lead, saying it will ban white separatist and white nationalist content, while YouTube and Twitter remain reluctant. Likewise, Facebook is said to be close to announcing policies and procedures to prevent a repeat of what we saw during the shooting in New Zealand last month. YouTube and Twitter, we're waiting.
