
Facebook Will Stop Playing With Your Emotions (Sort Of)

The social network changed its research guidelines and established a review board in the wake of a much-maligned study.

Facebook updated its research guidelines Thursday, a policy change that comes three months after news surfaced that Facebook had purposely tried to manipulate user emotions in a 2012 study. The result for Facebook was a slew of pissed-off users, and the company hopes these changes will prevent similar issues in the future.

Facebook is always testing something in its attempt to improve the service, and Thursday’s update confirms the company will continue doing research. Facebook has lots of user data (lots and lots and lots), and it uses that information to change products, like Messenger or News Feed, and experiences, like which posts/ads you might see.

So what will Facebook do differently moving forward?

The company has essentially added a new level of checks and balances. Before, when groups within Facebook conducted research, the research plans were approved by those group leaders. For example, News Feed research was discussed and approved within the News Feed team, according to a spokesperson.

Now, Facebook has added a more expansive internal review panel that includes senior members from teams across Facebook. If a group plans research on sensitive topics — for example, a specific group of users (such as women of a particular age) or “content that may be considered deeply personal (such as emotions)” — that research must be reviewed by the panel.

Facebook isn’t sharing names of individuals on the panel, but heads of different areas — communications, marketing, policy, etc. — will be on it, according to a spokesperson. No outside academics will sit on the panel, but Facebook consulted a number of professional researchers while assembling the group, the spokesperson added.

The idea is that opening the review and approval process to a wider range of employees will keep internal teams from moving forward with research that may not align with Facebook’s goals (at least the goals they want users to understand).

Facebook also says that, beginning this week, research practices have been added to its six-week training program for new engineering hires, and that all employees will learn about company research practices during the annual privacy and security training Facebook requires.

Regarding the emotion-manipulation study that came to light in June, Facebook has previously apologized for failing to communicate better with users about the research, and it reiterated that apology in a blog post Thursday. “We should have considered other non-experimental ways to do this research,” wrote Mike Schroepfer, Facebook’s CTO. “The research would also have benefited from more extensive review by a wider and more senior group of people.”

That second part — the more extensive review — is what Facebook’s new policy should fix. The company has also added a new website where all published academic research will live in the future.

This article originally appeared on Recode.net.
