Facebook Fights Revenge Porn
Stephen Yagielowicz

LOS ANGELES — Sex and technology are colliding once again, amidst a report that social media giant Facebook is using facial recognition to fight instances of so-called “revenge porn.”

An unintended consequence of our digitally open society, revenge porn stems from jilted lovers sharing sexually provocative selfies and other intimate images far and wide in an effort to exact revenge on a former partner for one grievance or another.

This often involves posting personal photos to social media sites such as Facebook. In a follow-up to its initiative applying artificial intelligence (AI) to seek out objectionable images, the company is now working to prevent the re-posting of images that had previously been reported and tagged as revenge porn, and is extending this prohibition to Instagram and Messenger as well.

The company undertook the effort in response to the significant emotional distress experienced by victims of this digital bullying, with one in 25 Americans said to be affected, and takes the issue seriously enough to deactivate offenders' accounts.

“We’ve focused in on this because of the unique harm that this kind of sharing has on its victims,” Facebook Global Head of Safety Antigone Davis told TechCrunch writer Megan Rose Dickey. “In the newsroom post, we refer to a specific piece of research around the unique harm this has for victims. I think that’s where the focus was for this moving forward.”

While the system seems limited to matching uploaded images against a database of previously reported photographs, it is one more example of how Facebook uses facial recognition to identify specific users, a controversial practice that is also increasingly being used by adult entertainment websites to match photos of a fan's fantasy girls with real-world performers.
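Facebook has not published the details of its matching system, but re-upload detection of this kind is commonly built on perceptual hashing, which fingerprints an image so that near-duplicates produce near-identical hashes. Below is a minimal sketch of one such technique, difference hashing (dHash), using hypothetical pixel data and an assumed match threshold purely for illustration:

```python
# Hypothetical sketch only: Facebook has not disclosed its matching
# algorithm. This shows the generic "difference hash" (dHash) idea often
# used to detect re-uploads of a known image, even after minor edits.

def dhash(pixels):
    """Compute a difference hash from a grayscale image, given as a list
    of rows; each bit records whether brightness rises left-to-right."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left < right else 0)
    return bits

def hamming(a, b):
    """Count differing bits between two hashes of equal length."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x5 grid standing in for a downscaled grayscale image.
reported = [
    [10, 20, 30, 25, 15],
    [50, 40, 30, 35, 45],
    [ 5, 15, 25, 20, 10],
    [60, 55, 50, 52, 58],
]
# A re-upload with uniformly brightened pixels.
reupload = [[p + 3 for p in row] for row in reported]

MATCH_THRESHOLD = 2  # assumed tolerance, in differing bits
distance = hamming(dhash(reported), dhash(reupload))
print(distance <= MATCH_THRESHOLD)  # True: brightness shift preserves the hash
```

Because dHash encodes only relative brightness between neighboring pixels, a uniform brightness change leaves the fingerprint intact, which is why such hashes can flag a re-upload that is not byte-identical to the reported original.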

It is also a process that, especially when integrated with AI, drives Big Data, and it may eventually lead to predictive warnings that a posted image is likely to be inappropriately shared in the future, based on unique contextual signals including friends and personal messages. For example, Facebook knows Johnny is in a relationship with Jane but is also messaging on the sly with Sally and Suzy, and so is perhaps more likely to inappropriately share Jane's private pics with his Instagram buddies.

It’s a troubling prospect, and newly relaxed rules on data sharing may make it even easier and more commonplace.

“At this moment, we’re not using AI to go through this particular content,” Davis adds, noting, “There is significant context that’s required for reviewing non-consensual sharing.”
