
Facebook says it will use image recognition software to fight revenge porn

The move comes a few months after it was reported that a Facebook group was being used to share revenge porn of women in the military.

Facebook CEO Mark Zuckerberg. | Facebook

Facebook will use image recognition software to keep users from sharing revenge porn across its services, including Facebook, Instagram and Messenger, CEO Mark Zuckerberg announced in a post Wednesday morning.

“It’s wrong, it’s hurtful, and if you report it to us, we will now use AI and image recognition to prevent it from being shared,” Zuckerberg wrote.

Revenge porn is an intimate image or video shared online without the subject’s consent, usually by a former spouse or partner, with the intent of harassing or embarrassing them.

Facebook’s plan here is slightly vague, but it sounds like the company will build a database of reported images that its algorithms can match against and remove automatically across its different apps. Tech companies use a similar approach to fight the spread of child pornography. We’ve asked Facebook for clarity and will update once we hear back.*

Revenge porn is an issue in lots of corners of the internet, but the move on Facebook’s part comes just a few months after it was reported that hundreds of U.S. Marines were using a Facebook group to share photos of fellow service members. That group was shut down, but it has since moved to Snapchat, according to BuzzFeed.

* Update: It turns out Facebook shared more about these efforts in a blog post. According to the post, users can report an inappropriate image, which will be reviewed by a human on Facebook’s community operations team. If the image violates Facebook’s community standards, the company will use “photo-matching technologies” to block people from sharing that same image in the future.
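Facebook hasn’t published the details of its “photo-matching technologies,” but the general technique it describes — hash a reported image, then block new uploads whose hashes are close to a known one — can be sketched with a simple perceptual “average hash.” Everything below (the 8x8 grid, the `BlockList` class, the Hamming-distance threshold) is an illustrative assumption, not Facebook’s actual system:

```python
# Illustrative sketch of photo-matching via a perceptual "average hash".
# NOT Facebook's actual system: real deployments (e.g. PhotoDNA-style
# matching) are far more robust, but the shape is the same — reported
# images are hashed into a database, and new uploads whose hashes are
# near a known hash are blocked.

def average_hash(pixels):
    """Hash an 8x8 grayscale image: one bit per pixel, set if the
    pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

class BlockList:
    """Database of hashes taken from reported images."""

    def __init__(self, threshold=5):
        self.hashes = []
        self.threshold = threshold  # max bit-distance to count as a match

    def report(self, pixels):
        """A human reviewer confirmed this image violates policy."""
        self.hashes.append(average_hash(pixels))

    def is_blocked(self, pixels):
        """Would this upload match any previously reported image?"""
        h = average_hash(pixels)
        return any(hamming(h, known) <= self.threshold
                   for known in self.hashes)
```

The point of a perceptual hash (versus an exact checksum) is that small edits — recompression, slight brightness changes — move only a few bits, so a near-duplicate still lands within the Hamming-distance threshold, while unrelated images land far away.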


This article originally appeared on Recode.net.
