
Facebook has just made it harder for your disgruntled ex to share your naked photos.

Victims of such non-consensual posts, often referred to as ‘revenge porn’, now have some help in preventing their spread. The social media platform has announced new artificial intelligence tools designed to keep such content, once flagged, off its site for good.

“It’s wrong, it’s hurtful, and if you report it to us, we will now use AI and image recognition to prevent it from being shared across all of our platforms,” said Mark Zuckerberg, the social network’s founder and chief executive.


When a user reports that an intimate photo of them has been published without their permission, the image is removed and the company’s photo-matching technology is then used to flag the image if it’s posted again — not only on Facebook, but on Messenger and Instagram too.
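Facebook hasn't published the details of its photo-matching technology, but a common family of techniques for recognising re-uploads of a known image is perceptual hashing. The sketch below uses a simplified "average hash" purely for illustration; it is an assumption about the general approach, not Facebook's actual implementation:

```python
# Illustrative sketch of perceptual hashing, one common approach to
# photo matching. Facebook has not disclosed its algorithm; this
# simplified "average hash" is a stand-in for illustration only.

def average_hash(pixels):
    """Compute a bit-string hash from a small grayscale image,
    given as a 2D list of 0-255 intensity values."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the mean,
    # so the hash captures the image's rough structure, not exact bytes.
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Count the positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

# A reported image and a lightly altered re-upload (e.g. recompressed)
# produce similar hashes, so a small Hamming distance flags a match
# even though the files themselves are not byte-identical.
original = [[10, 200], [220, 30]]
reupload = [[12, 198], [215, 35]]  # slightly different pixel values

h1 = average_hash(original)
h2 = average_hash(reupload)
print(h1, h2, hamming_distance(h1, h2))
```

Unlike a cryptographic hash, which changes completely if a single pixel changes, a perceptual hash stays nearly the same under small edits, which is what lets a banned image be caught again after cropping or recompression.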

Flagging an image is as easy as hitting the ‘Report’ link on the post, then ticking the newly added check-box labelled ‘nude photo of me’. The image is then reviewed by Facebook’s Community Operations team and, if it is found to be in violation, removed; in many cases, Facebook will also deactivate the offender’s account.

Although Facebook has enabled people to report images for some time, the language around revenge porn is now clearer and specific to this type of intimate image.

To read more about this new process, check out Facebook Newsroom’s Using Technology to Protect Intimate Images and Help Build a Safe Community.

What’s more, the social media giant has also partnered with safety organisations “to offer resources and support to the victims of this behaviour.”

The announcement comes less than a week after Instagram said it was rolling out a new feature that blurs sensitive photos and videos on millions of users’ timelines.
