The naked truth about Facebook’s revenge porn tool

The company is having a hard time getting its story straight. Again.

Facebook has announced it's trialling a tool in Australia to fight revenge porn on its platform, one that requires victims to send the company a copy of the violating images. Amazingly, this is true, and not a Clickhole story. It's the kind of thing that makes you wonder if there are human people at Facebook, and do they even understand what words mean? Because as we unravel the details of this tool -- totally not conceived by actual robots or a company with a zero percent trust rating among users -- we realize it's a very confusing tool indeed.

According to the press that wrote about it in detail (forcing Facebook to come back with panicked explanations that made it sound even worse), the revenge porn tool works like this:

A person starts a conversation with themselves in Facebook Messenger and uploads a nude.

Apparently, someone might want to do this if they see one of Facebook's monstrous users publishing nonconsensual sexytime photos of them, or fear a revenge porn scenario may someday come to pass. The process presumes the victim has these photos in the first place, and cavalierly ignores that this person is living in a nightmarish hellscape of trauma, one re-experienced all over again by handing the instrument of their terror to an anonymous, unaccountable, possibly grey alien Facebook employee.

The idea is that the user flags it as a "non-consensual intimate image." The photo is copied to Facebook, because that is what computers do: they copy files.

This apparently sends a copy of the image to the probable-Cybermen behind the scenes at Facebook, who momentarily pause from massaging advertisers with whale tears, laughing at people worried about Holocaust denial, high-fiving over scenes of unbelievable human devastation, and destroying democracy.

Then a person who works for Facebook, and totally not a heartless tech bro, looks at it. They decide if it is revenge porn, or if on that day you are just shit out of luck for getting your nonconsensual nudes removed.

At some point, according to what Facebook told Motherboard, the image has portions of it blurred out. This may happen with magic grey alien technology in transit, somehow preserving the privacy and dignity of the revenge porn victim. Maybe the employee just blurs their eyes over the sensitive parts by squinting really hard or rubbing their eyelids. Perhaps a superhacker Facebook cyber-script blurs the private bits so quickly you can feel a breeze come off the Facebook employee's computer.

But probably not. A Facebook spokesperson told Motherboard that when the image is blurred, a highly specialized and incredibly trained team is the only group of people with access to it for a few days. It is my personal hope that their training is in martial arts.

Yet when asked how and when the blurring process happens, a Facebook spokesperson told Engadget that, to clarify the blurring process, the photos are hashed. We were then directed to this post, which doesn't talk about blurring images at all.

So the exact process protecting the privacy of revenge porn victims -- the one Facebook told Motherboard happens in its offices, and claimed to clarify for Engadget -- may or may not happen like this at all. This is what one might call "a bad sign."

Anyway. As best we know, after employees look at the photo (and it may or may not be altered for the privacy and dignity of its subject), Facebook's machines take over. Facebook makes a hash of the photo and stores it in a repository that's cross-checked against photo uploads on the service. We can rest assured that this part will work perfectly because Facebook has never made a mistake.

Once the hash is made, only then does Facebook delete the photo from its servers. A Thursday post from Facebook stated:

Once we hash the photo, we notify the person who submitted the report via the secure email they provided to the eSafety Commissioner's office and ask them to delete the photo from the Messenger thread on their device. Once they delete the image from the thread, we will delete the image from our servers.

Actually, hashes are how photo sites and indexes check for child porn. When those illegal photos are seized, they're hashed and put into databases that scan for matching images, helping authorities find violators and victims. The neat thing about photo hashes is that the photos can't be reconstructed from just the hashes.
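For the curious, here's a minimal Python sketch of how that kind of hash matching works in principle. This is an illustration, not Facebook's actual pipeline: production systems like Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-compression, while this toy version uses a plain cryptographic digest.

```python
# A minimal sketch of hash-based image matching -- not Facebook's real system.
import hashlib

def hash_image(image_bytes: bytes) -> str:
    """Return a fixed-length digest of the image. The original pixels
    cannot be reconstructed from this string."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical repository of digests flagged as non-consensual images.
blocked_hashes: set[str] = set()

def flag_image(image_bytes: bytes) -> None:
    """Store only the digest; the image bytes can then be discarded."""
    blocked_hashes.add(hash_image(image_bytes))

def is_blocked(upload_bytes: bytes) -> bool:
    """Check a new upload against the repository of flagged digests."""
    return hash_image(upload_bytes) in blocked_hashes
```

The point is the one-way property: a digest can confirm that two files match, but the photo itself can't be rebuilt from it.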

In theory, this would make Facebook's process pretty slick. Except for the part where you're uploading the image to Facebook, of course, and the image is being looked at, transmitted, processed, and stored by this particular company. In contrast to its Thursday update, Facebook had assured Motherboard that the images are discarded after a few days of review. None of which makes us feel better. And it shouldn't if you take a minute to learn how easy it is to recover deleted files.

Facebook did not respond to my question as to whether or not the image or its hash is included in a user's shadow profile, or falls under Facebook's photo Terms, which are:

(...) a non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content that you post on or in connection with Facebook (IP License). This IP License ends when you delete your IP content or your account unless your content has been shared with others, and they have not deleted it.

After all, Facebook is in the business of collecting data.

The laws around the publication of intimate and private photos online without the subject's consent are a mess. The rules themselves, and the results, differ from country to country (and from state to state in the US), and even between civil and criminal federal laws. But technically -- and it's curious that Facebook isn't mentioning this -- if you created the images, you own the rights to them. If you own the copyright, you can (and should) ask for removal with a DMCA request.

The problem here, obviously, is trust. Lesley Carhart, whose specialty is digital forensics, told Motherboard: "I literally recover deleted images from computer systems all day—off disk and out of system memory. It's not trivial to destroy all trace of files, including metadata and thumbnails."

Facebook is asking people to trust it. The company that said Russian propaganda advertising only reached 10 million people, then was forced to admit the true number was 126 million. The company that reached into people's address books on phones and devices and altered Facebook users' contact information, re-routing communications to Facebook. The company that enforces a "real names" policy on users even though the National Network to End Domestic Violence found that Facebook is the social media platform most misused by abusers. The company that let advertisers target users by race, outed sex workers, said "fake news" wasn't a real problem, and experimented on its users' mental health.

Trust is something Facebook literally has none of.

Getting revenge porn taken down is hard, as well as emotionally and psychologically grueling for victims. It feels horrible, and it is a fresh trauma every time the victim is confronted with a new violation. The police won't do it for them, and victims are tasked with finding all the images and videos themselves and sending each website and its host a takedown request.

If Facebook wanted to implement a truly trustworthy system for revenge porn victims, it could put the photo hashing on the user's side of things -- so only the hash is transferred to Facebook. To verify the claim that the image is truly a revenge porn issue, the victim could have the images verified through a trusted revenge porn advocacy organization. Theoretically, the victim would then have a verified, privacy-safe version of the photo, and a hash that could also be sent to Google and other sites.
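A back-of-the-napkin sketch of what that user-side flow could look like, in Python. The reporting endpoint and payload format here are invented for illustration; the hard part is getting Facebook and others to accept externally computed hashes, not the code.

```python
# A minimal sketch of the client-side alternative: hash the photo on the
# victim's own device and send only the digest. The URL and payload are
# hypothetical, not a real Facebook API.
import hashlib
import json
import urllib.request

def local_hash(path: str) -> str:
    """Compute the digest locally; the photo never leaves this machine."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def submit_hash(digest: str, reporting_url: str) -> None:
    """Send only the digest to the (hypothetical) reporting endpoint."""
    body = json.dumps({"image_hash": digest}).encode("utf-8")
    req = urllib.request.Request(
        reporting_url,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # the raw image is never transmitted
```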

Facebook plans to roll out its fabulous new program in other countries soon. As they say in Menlo Park, "May the odds be ever in your favor!"

Disclosure: Violet Blue is an Advisor for Without My Consent, a nonprofit dedicated to helping victims of online privacy violations find paths to justice.

Images: PA/PA WIRE (Blurred smartphone image); REUTERS/Stephen Lam (Mark Zuckerberg)