Non-consensual sharing of intimate images (NCII) has been a years-long battle for survivors and technology companies alike, with the latter pushed to build better tools to curb the spread of such images.

Today, Meta, previously known as Facebook, announced its support of StopNCII.org, an initiative by the UK Revenge Porn Helpline to prevent the sharing of NCII, often referred to as revenge porn, on its platforms.

Meta, along with Facebook Ireland, is among 50 non-governmental organisation partners globally joining the efforts of StopNCII.org. According to Meta’s press release, the platform is “the first global initiative of its kind” to help people whose intimate images have been shared without their consent.

The tool works like this: through StopNCII.org, a person concerned about revenge porn threatening their safety and privacy can create a case. The tool uses hash-generating technology to assign a unique hash value – a numerical fingerprint – to a specific image. Participating tech companies, such as Meta, can then receive the hash and use it to detect whether that image has been shared on their platforms.
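To make the mechanism concrete, here is a minimal sketch of hash-based matching in Python. It is illustrative only: the function and variable names are hypothetical, and for simplicity it uses a cryptographic hash (SHA-256), which only matches byte-identical files, whereas systems like StopNCII.org rely on perceptual hashing that also tolerates resizing and re-encoding. The key property shown is that only the hash, never the image itself, needs to be shared with a platform.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest for an image's raw bytes.

    Simplified stand-in for the real hashing step: a production system
    would use a perceptual hash so near-duplicates still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def is_known_ncii(image_bytes: bytes, hash_database: set[str]) -> bool:
    """Check an uploaded image's hash against hashes submitted by survivors."""
    return image_hash(image_bytes) in hash_database


# The survivor's device hashes the image locally; only the hash is shared.
reported_hash = image_hash(b"<raw image bytes>")
hash_database = {reported_hash}

# A platform later screens new uploads against the shared hash database.
assert is_known_ncii(b"<raw image bytes>", hash_database)       # match
assert not is_known_ncii(b"some other image", hash_database)    # no match
```

The design choice this illustrates is one-way sharing: the hash reveals nothing usable about the image's content, so platforms can screen uploads without ever receiving the image.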

The program differs markedly from the initial pilot Facebook launched in Australia in 2017. At the time, Facebook asked users to send in their nude images, which were then reviewed by human moderators. This drew significant privacy concerns and criticism, despite Facebook saying the images would only be stored for a short period.

The effort to develop adequate and empowering technology for this purpose continued in 2019, when Facebook announced the launch of an AI tool to help detect nude photographs, allowing for swift removal.

The version announced today builds on those earlier programs but takes a markedly different approach to combating revenge porn.

The updated tool and StopNCII.org are designed to put control in the survivor's hands, drawing on the input and vital voices of survivors, experts, advocates, and other tech partners. Images shared with StopNCII.org never leave the person's device; only the hash is sent to participating platforms so matches can be detected.

“At the heart of the work developing this tool have been the needs of victims and survivors by putting them in control without having to compromise their privacy,” says Sophie Mortimer, Manager of the UK Revenge Porn Helpline. “StopNCII.org has been designed to be simple and easy to use, with detailed signposting to additional support and advice, giving back agency at a time when someone may feel utterly powerless.”

While revenge porn has been outlawed in countries like the UK (since 2015) and in states like New York (since 2019), its spread persists.

The prevalence of revenge porn, and the failure of some tech companies to dismantle the systems that enable it, has been a point of concern for years. Telegram, for example, has refused to take steps against people posting revenge porn, leading to long battles for justice.

The widely used term “revenge porn” has also been criticised by survivors and activists, who deem it a misnomer. The primary concern is that the phrase invites victim-blaming and carries sexist undertones: “revenge” implies the victim committed some wrong against the person posting the images, who is then merely retaliating.

Meta’s tool, which can be deemed a step in the right direction, hopes to “strengthen” the company’s efforts in tackling the spread of NCII.

If you are a victim of non-consensual pornography and you reside in the United States, please call the CCRI Crisis Helpline at 844-878-CCRI (2274). In the UK, visit the Revenge Porn Helpline. For more information on Meta and NCII, visit StopNCII.org.
