Mark Zuckerberg wants you — and your government — to help him run Facebook

Remember when Silicon Valley’s giants scoffed at regulation? Now they see it as a protective shield.

Facebook CEO Mark Zuckerberg looking at his cellphone.
Drew Angerer / Getty Images
Peter Kafka
Peter Kafka covered media and technology, and their intersection, at Vox. Many of his stories can be found in his Kafka on Media newsletter, and he also hosts the Recode Media podcast.

Mark Zuckerberg built one of the most powerful companies in the world. Now he says he needs help running it.

In a Washington Post op-ed, the Facebook CEO is calling on “governments and regulators” around the world to help rein in the internet, and his own company.

“By updating the rules for the Internet, we can preserve what’s best about it — the freedom for people to express themselves and for entrepreneurs to build new things — while also protecting society from broader harms,” Zuckerberg writes.

Zuckerberg goes on to ask for new regulation addressing four topics: “harmful content, election integrity, privacy, and data portability.” But the bigger point is that he’s asking for regulation at all: For years, Silicon Valley’s tech leaders assumed that governments and regulators were anachronistic speed bumps to be avoided.

What’s changed, of course, is that governments and regulators around the world are now intent on creating new rules around the internet (or, at least, saying that they’re intent on doing so).

And Facebook would rather get out in front of it by suggesting the kinds of rules it would like to see implemented.

Facebook isn’t alone in this mindset. Lots of Silicon Valley’s biggest companies assume new regulations are coming and are working with regulators to shape rules in their favor. They don’t have to love the rules, as long as the rules give them a clear framework that spells out what they’re responsible for — and what they don’t need to do.

An obvious example, reiterated by Zuckerberg in his op-ed: Getting more countries to adopt the European Union’s General Data Protection Regulation. It’s not so much that Facebook et al think GDPR is particularly good at protecting consumer privacy. But they know how to work with GDPR, and they would rather have a consistent set of laws to follow instead of a patchwork of country-by-country laws.

Some of this may happen organically. In the US, for instance, Silicon Valley leaders expect individual states to enact their own regulations around internet privacy and other issues, and assume those rules will prompt the federal government to eventually create its own nationwide rules — which is what Silicon Valley would prefer.

On the other hand, it’s very hard to imagine a global consensus around … anything, let alone rules governing “distribution of harmful content,” as Zuckerberg floats here.

And there are plenty of people in the US government who are raising eyebrows at Zuckerberg’s ask, including the chief of staff of the Federal Communications Commission, who pushed back publicly on Saturday.

Regardless of how practical it is, Facebook’s impulse to ask people who don’t run Facebook for help running Facebook looks like the new normal for Facebook.

Facebook — along with all of the other Silicon Valley companies that depend on individual users for content or inventory — has always asked other people to police its platform. If someone uploads a video or song you own onto the site, it’s up to you to tell Facebook to take it down. And if you think that Pulitzer Prize-winning photograph of a nude Vietnamese girl running from a napalm attack shouldn’t be on the site, you should tell Facebook that, too.

In the wake of the 2016 election, Facebook has leaned even harder in this direction: It outsourced the detection of fake news to third-party fact-checkers (who have since complained that Facebook wasn’t serious about the work). It asked readers to tell it which news sites are trustworthy. And now it wants an independent Facebook Court to rule on controversial content decisions.

Facebook is also spending billions on software and humans to help police its own property. (Casey Newton argues persuasively that Facebook should be spending much more on the humans it employs to look at some of the ghastly things people upload to the site.)

But Facebook’s fundamental positioning of itself as neutral ground, where people happen to show up and do things (as opposed to software that’s specifically designed to entice people to show up and do things) means that it’s always going to ask outsiders — users, copyright owners, regulators — to help keep it in line.

This article originally appeared on Recode.net.