
Facebook Wants Your Nude Pictures to Fight Revenge Porn

Non-consensual intimate images are a problem that Meta's Facebook is teaming up with 50 NGOs to tackle.

Apple isn't the only Silicon Valley tech company looking to curb the spread of sexual images among children. Meta Platforms (FB), the recently renamed Facebook, is entering partnerships with 50 non-governmental organizations to stop the non-consensual sharing of intimate images (NCII).

Facebook Takes on Revenge Porn

Meta is supporting the launch of StopNCII.org with the UK Revenge Porn Helpline and other organizations to stop the spread of explicit images. 

"At the heart of the work developing this tool have been the needs of victims and survivors by putting them in control without having to compromise their privacy," said Sophie Mortimer, manager for the Revenge Porn Helpline. 

StopNCII.org builds on technology developed by Facebook and Instagram's NCII Pilot that was started in 2018 to help victims stop the proliferation of their intimate images. 


The tool, which launched last week, asks Facebook users to share potentially offending images with the company so that it can bar the images from being shared on any of its platforms. 

As part of the submission process, users must confirm that they are in the image. The video or photo is then converted into a unique digital fingerprint, which is given to participating companies, including Facebook and Instagram. 
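To illustrate the fingerprinting idea in the flow above: the system reportedly matches a compact fingerprint of the image rather than the image itself, so only the fingerprint needs to be shared with participating companies. The sketch below is a minimal illustration using a simple cryptographic hash; the real tool uses more sophisticated hashing that can match visually similar images, and the function name here is hypothetical, not part of StopNCII.org.

```python
import hashlib
from pathlib import Path

def fingerprint(image_path: str) -> str:
    """Return a hex digest that serves as the image's unique fingerprint.

    The raw image bytes stay on the user's machine; only this short
    digest would be submitted for matching against uploads.
    """
    data = Path(image_path).read_bytes()
    return hashlib.sha256(data).hexdigest()
```

A platform holding only the fingerprint can check whether a newly uploaded file produces the same digest and block it, without ever possessing or viewing the original image, which is the privacy property the new program is built around.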

Protecting Social Media Privacy

In the previous version of Facebook's NCII program, human moderators were in charge of logging the digital images, leading to privacy concerns.

Under the new program, people who submit material can also track their cases and withdraw participation at any point.