Image Credits: GongTo / Shutterstock

Facebook defends revenge porn pilot that has people upload nude images of themselves

Facebook Global Head of Safety Antigone Davis has clarified some things about how Facebook’s pilot program to combat revenge porn in Australia works. The strategy entails uploading your nude photos or videos to Messenger so that Facebook can tag them as non-consensual explicit media.

“With this new small pilot, we want to test an emergency option for people to provide a photo proactively to Facebook, so it never gets shared in the first place,” Davis wrote. “This program is completely voluntary. It’s a protective measure that can help prevent a much worse scenario where an image is shared more widely.”

Facebook is doing this in partnership with Australian government agency eSafety in order to try to prevent people from sharing intimate images without consent. If someone fears they are at risk of revenge porn, they can contact eSafety. The organization might then tell them to send a nude photo of themselves to themselves via Messenger. Facebook’s hashing system would then be able to recognize those images in the future without needing to store them on its servers.
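Facebook hasn’t published the details of its matching algorithm, but the general idea behind recognizing an image without storing it is perceptual hashing: derive a compact fingerprint that stays stable when the image is recompressed or slightly altered, then compare fingerprints instead of pixels. The sketch below is a toy “average hash” in Python — an illustration of the technique, not Facebook’s actual system (real systems like PhotoDNA are far more robust):

```python
def average_hash(pixels):
    """Toy perceptual hash: threshold each pixel against the mean.

    `pixels` is a flat list of grayscale values (0-255) from an
    already-downscaled image (real implementations use e.g. 8x8 = 64
    values; a 2x2 image is used below to keep the example tiny).
    """
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# Hypothetical pixel data: two encodings of the "same" picture, plus
# an unrelated one.
original = [200, 50, 60, 210]
recompressed = [198, 52, 61, 209]   # slightly different bytes, same look
unrelated = [10, 240, 230, 20]

h_orig = average_hash(original)
h_recomp = average_hash(recompressed)
h_other = average_hash(unrelated)

print(hamming_distance(h_orig, h_recomp))  # 0: match survives recompression
print(hamming_distance(h_orig, h_other))   # 4: clearly a different image
```

Note that the hash is non-reversible in the sense Stamos describes below: a short bit string can’t be turned back into the photo, which is why Facebook can keep the fingerprint after deleting the image itself.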

In her post, Davis clarifies that eSafety does not have access to the actual image. However, a “specially trained representative from [Facebook’s] Community Operations team” does need to first review the image before hashing it. Once the image has been hashed, Facebook notifies the person who submitted the photo and asks them to delete the photo from Messenger. At that point, Facebook will delete the image from its servers.

Facebook Chief Security Officer Alex Stamos acknowledged on Twitter that there are some risks involved with people self-reporting their photos, but “it’s a risk we are trying to balance against the serious, real-world harm that occurs every day when people (mostly women) can’t stop NCII from being posted.”

NCII is short for non-consensual intimate imagery. Stamos went on to say that Facebook takes steps to protect the data and only retains non-reversible hashes.

Some critics, however, suggest a better method would be one that doesn’t require uploading the image in the first place. One suggestion is to hash the image locally and then upload the hash to determine a match. It’s also not clear if this is age-gated to prevent minors from sending their photos.
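The critics’ suggestion would look something like the sketch below: compute a digest on the user’s device and upload only that. Using an exact cryptographic hash such as SHA-256 (chosen here purely for illustration) also shows the catch — saving the same picture with different compression settings changes the bytes, so an exact hash no longer matches, which is part of why services rely on server-side perceptual matching instead:

```python
import hashlib

def local_hash(image_bytes):
    """Digest computed on the user's device; only this would be uploaded."""
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder byte strings standing in for two JPEG encodings of the
# same photo: the pixels look identical, but the files differ.
photo_v1 = b"\xff\xd8 original-encoding \xff\xd9"
photo_v2 = b"\xff\xd8 re-encoded-copy \xff\xd9"

exact_copy_matches = local_hash(photo_v1) == local_hash(photo_v1)
reencoded_matches = local_hash(photo_v1) == local_hash(photo_v2)

print(exact_copy_matches)  # True: byte-identical copies match
print(reencoded_matches)   # False: re-encoding breaks an exact hash
```

A perceptual fingerprint computed locally would tolerate re-encoding, but, as Stamos notes below, shipping the fingerprinting algorithm to clients makes it easier for bad actors to craft images that evade it.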


Stamos responded to the suggestion of calculating the hash locally, saying “photo fingerprinting algorithms are usually not included in clients to prevent the development of circumvention techniques” and “humans need to review to prevent adversarial reporting.”
