Facebook is teaming up with four countries to test a preemptive system to detect and defend against “revenge porn.” Users are being asked to upload nude photos of themselves to Messenger.
The Australian Office of the eSafety Commissioner announced they were partnering with the social media giant last week on a pilot scheme that will allow anyone to report sensitive images being shared online without their permission.
The eSafety office, which works primarily to prevent the online abuse of minors, asked any Australian who fears that intimate images of them may be shared online to send a nude photo of themselves via Messenger. The office will then notify Facebook, which will use image-matching technology to stop those images from being uploaded to Facebook, Messenger, Facebook Groups or Instagram.
Facebook’s Head of Global Safety, Antigone Davis, said that the “industry-first” pilot will use “cutting-edge technology to prevent the re-sharing of images on its platforms.”
Australia's eSafety Commissioner, Julie Inman Grant, said that revenge porn, or “image-based abuse” (IBA), can be an “incredibly devastating experience” for victims.
Victims of revenge porn often consent to sharing their photos with one other person, only to find out later that the images have fallen into the hands of strangers. Sexually explicit photos, and the threat of publishing them for the world to see, can then be used for blackmail.
A recent study found that one in five Australians has been a victim of revenge porn. Both men and women were found to be victims, but people in marginalized groups were found to be at the greatest risk.
“This lets the victim take control and be proactive in their own safety, when so often the burden is on the victim to report to multiple platforms and trace where the image has been. Our vision is that these images could be taken down from every website simultaneously,” Inman Grant told the Australian Financial Review.
Inman Grant said that sharing your nudes with Facebook is safe. Users are told to get in contact with the eSafety Commissioner, who will tell them to send the images to themselves via Messenger. Once the image is sent, Facebook will “hash” it, creating a digital fingerprint which is used to prevent the image from being uploaded in the future.
The user is then told to delete the image from their Facebook Messenger.
“They’re not storing the image; they’re storing the link and using artificial intelligence and other photo-matching technologies. So if somebody tried to upload that same image, which would have the same digital footprint or hash value, it will be prevented from being uploaded,” Inman Grant told the Australian Broadcasting Corporation.
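Facebook has not published the details of its matching system, but the description above resembles perceptual hashing, in which similar images produce similar fingerprints that can be compared by counting differing bits. A minimal sketch of one common technique, the “average hash,” assuming toy 8×8 grayscale data in place of real image decoding:

```python
# Minimal sketch of perceptual "average hash" matching. This is an
# illustration of the general technique, not Facebook's actual method.
# An image is reduced to a tiny grayscale grid; each cell becomes one
# bit depending on whether it is brighter than the grid's mean.

def average_hash(pixels):
    """pixels: a flat list of grayscale values (0-255) for a small grid."""
    mean = sum(pixels) / len(pixels)
    # One bit per pixel: 1 if brighter than the mean, else 0.
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(h1, h2):
    """Number of bits that differ between two hashes."""
    return bin(h1 ^ h2).count("1")

def is_match(h1, h2, threshold=5):
    """Hashes within a few bits of each other count as the same image."""
    return hamming_distance(h1, h2) <= threshold

# A toy 8x8 "image" and a lightly altered (e.g. re-compressed) copy.
original = [i * 4 for i in range(64)]
recompressed = [min(255, p + 2) for p in original]

h_orig = average_hash(original)
h_copy = average_hash(recompressed)
print(is_match(h_orig, h_copy))  # prints True: the near-duplicate matches
```

Production systems such as Microsoft's PhotoDNA work on the same principle with far more robust features, so a re-compressed or lightly edited copy still falls within the matching threshold.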
However, it has been demonstrated that machine vision systems can be tricked by slight changes that are virtually indistinguishable to the human eye. A team of Google researchers showed that making “imperceptibly small perturbations” to the pixels of an image can cause a neural network to misclassify it.
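The fragility is easiest to see with an exact cryptographic hash: change a single pixel by one bit and the fingerprint no longer matches at all, which is why matching systems need fuzzier, perceptual comparisons. A short illustration, using SHA-256 over raw bytes standing in for image data:

```python
import hashlib

# A one-byte "perturbation" that a viewer would never notice still
# produces a completely different cryptographic fingerprint, which is
# why exact hashing alone cannot stop re-uploads of altered copies.
original = bytes(range(256))       # stand-in for raw image bytes
perturbed = bytearray(original)
perturbed[0] ^= 1                  # flip one low-order bit in one "pixel"

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(perturbed)).hexdigest()
print(h1 == h2)  # prints False: the fingerprints no longer match
```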