
YouTube CEO Susan Wojcicki says vetting videos before they go up isn’t the right answer

“I think we would lose a lot of voices,” Wojcicki said.

Rani Molla
Rani Molla was a senior correspondent at Vox, where she focused her reporting on the future of work. She has covered business and technology for more than a decade — often in charts — including at Bloomberg and the Wall Street Journal.

YouTube CEO Susan Wojcicki is okay with taking content down, but she doesn’t think it’s a good idea to review it before it goes up on the massive video-sharing platform.

That’s one big takeaway from her interview with Recode senior correspondent Peter Kafka at this year’s Code Conference.

“I think we would lose a lot of voices,” Wojcicki said. “I don’t think that’s the right answer.”

She also warned that it could be difficult to come up with criteria as to what could be uploaded in the first place: “What are the factors that you’re [using to] determine that? How are you deciding who is getting to be on the platform and have speech and who’s not?”

When Kafka pointed out the company is already making such decisions — but only after content is online on YouTube’s platform — Wojcicki emphasized the importance of reviewing content after it publishes on the site. “We see all these benefits of openness, but we also see that that needs to be married with responsibility,” she said.

The YouTube CEO admitted that there will likely always be content on YouTube that violates its policies.

“At the scale that we’re at, there are always gonna be people who want to write stories,” she said, suggesting that journalists will always choose to focus on the negative aspects of YouTube in their reporting.

“We have lots of content that’s uploaded and lots of users and lots of really good content. When we look at it, what all the news and the concerns and stories have been about is this fractional 1 percent,” Wojcicki said. “If you talk about what the other 99 point-whatever-that-number is — that’s all really valuable content.”

“Yes, while there may be something that slips through or some issue, we’re really working hard to address this,” she said.

Last week, YouTube updated its hate speech policy and said it will take down “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.” The policy directly mentioned removing videos that promote neo-Nazi content or videos that deny commonly accepted violent events, like the Holocaust or the Sandy Hook school shooting — but lots of other conspiracy theories and “borderline content” are still allowed on the platform.

Instead of approving videos ahead of time, Wojcicki suggested using tiers in which creators get certain privileges over time, like more distribution and monetization of their content.

“I think this idea of like not everything is automatically given to you on day one, that it’s more of a — we have trusted tiers,” she said.

In recent weeks, the company has confronted numerous issues. Last week, the video platform decided that YouTube creator Steven Crowder wasn’t violating its rules when he kept posting videos with homophobic slurs directed at Vox journalist Carlos Maza, though the company eventually demonetized Crowder’s channel.

YouTube has said that by limiting recommendations, comments, and sharing, it has reduced views of white supremacist videos by 80 percent since 2017. Only now has it banned that content altogether. The company is one of several prominent tech companies trying to figure out how to deal with hateful content proliferating on their platforms. Facebook banned white supremacist content on its main platform and Instagram in March. Twitter says it is looking into it. But even when these companies do make rules prohibiting harmful content, the sheer volume of uploads and posts on their platforms makes it difficult to exclude content that breaks those rules. YouTube's army of content creators uploads 500 hours of video to the site every minute of every day.

Wojcicki instead wanted to focus on the improvements the video company has made in the past few years.

“Two years ago there were a lot of articles, a lot of concerns about how we handle violent extremism. If you talk to people who are experts in this field, you can see that we’ve made tremendous progress.”

“We have a lot of tools, we work hard to understand what is happening on it and really work hard to enforce the work that we’re doing. I think if you look across the work you can see we’ve made tremendous progress in a number of these areas,” Wojcicki said. “If you were to fast-forward a couple years and say, well, what that would look like in 12 months and then in another 12 months, what are all the different tools that have been built, I think you’ll see there will be a lot of progress.”

