Facebook Is Going After Revenge Porn With New Artificial Intelligence Tool

Facebook’s first privacy-related announcement following Zuckerberg’s overhaul plan is about the sensitive topic of revenge porn.

Facebook has unveiled its latest privacy pilot program, this time to help victims of revenge porn. Luis Acosta/Getty Images

Facebook (META)’s new privacy-focused pivot is all about protecting users.

Earlier this month, CEO Mark Zuckerberg announced a strategy overhaul of the social network’s approach to privacy. In his 3,000-plus-word blog post, Zuckerberg described how the 15-year-old platform will start to do better by its users when it comes to their data.

On Friday, Radha Iyengar, the company’s head of product policy research, and global safety policy programs manager Karuna Nain published a follow-up post about a pilot program targeting revenge porn.

In a post titled “A Research-Based Approach to Protecting Intimate Images,” the pair highlighted the ways Facebook’s new artificial intelligence (AI) program will help automatically find and report any intimate, non-consensual content being shared.  

“Discovering intimate images of yourself online when you didn’t consent to them being shared is devastating. That’s why we’ve taken a careful, research-based approach that concentrates on the victims—what they experience and how we can better protect them,” the duo wrote.

It’s interesting that Facebook’s first privacy-related announcement following Zuckerberg’s blog post tackles a topic as sensitive as revenge pornography, especially given that criticism of Facebook in recent years has centered on data breaches and the spread of “fake news.” However, the company’s advocacy against revenge porn shows that its definition of privacy protection extends beyond the issues dominating the current news cycle.

What Iyengar and Nain have outlined is an effort to stem the spread of what Facebook calls “non-consensual intimate images” (NCII).

With the help of its automated detection program, Facebook says it wants to “[b]uild clear, accessible tools to support victims in reporting a violation.” So instead of having to use the platform’s standard reporting procedure, victims of revenge porn will have dedicated tools for blocking and reporting users who share non-consensual images of them. Facebook’s preventive measures will also include an “emergency option” that lets people alert Facebook to a specific photo to help prevent it from spreading.
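Facebook hasn’t published the technical details of how a reported photo is recognized if someone tries to share it again, but proactive systems of this kind typically rely on some form of image fingerprinting: the reported image is reduced to a compact perceptual hash, and new uploads are compared against stored hashes rather than against the image itself. The sketch below is a minimal, purely illustrative Python example using a simple average hash; the `REPORTED_HASHES` store, the `MATCH_THRESHOLD` value, and all function names are assumptions made for illustration, not Facebook’s actual implementation.

```python
# Illustrative sketch of perceptual-hash matching (average hash).
# NOT Facebook's system; thresholds and names are hypothetical.
from PIL import Image

HASH_SIZE = 8          # 8x8 grid -> 64-bit hash
MATCH_THRESHOLD = 10   # max differing bits to count as a match (assumed)

def average_hash(path: str) -> int:
    """Grayscale, shrink to 8x8, then set each bit by comparing to the mean."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical store of hashes derived from images victims have reported.
REPORTED_HASHES: set[int] = set()

def report_image(path: str) -> None:
    """A victim reports an image; only its fingerprint needs to be kept."""
    REPORTED_HASHES.add(average_hash(path))

def is_blocked(path: str) -> bool:
    """Check a newly uploaded image against the reported fingerprints."""
    h = average_hash(path)
    return any(hamming_distance(h, r) <= MATCH_THRESHOLD for r in REPORTED_HASHES)
```

One appeal of this kind of design is that the platform can match re-uploads, including lightly resized or recompressed copies, without having to retain the sensitive image itself, only a short fingerprint.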

Unsurprisingly, the social media behemoth admits the latest tool is already proving controversial, given how sensitive the combination of Facebook and private information has become. However, the company claims that its “research with victims and feedback from organizations indicates this was an option victims generally wanted, and they wanted it built into the reporting process more specifically.”

In the coming months, expect Facebook to showcase its focus on privacy with more tool and policy announcements like this one.