Microsoft Bing has announced a significant expansion of its ability to remove AI-generated and deepfake images from its search engine, aimed at combating nonconsensual intimate image (NCII) abuse. The initiative comes through a new partnership with a nonprofit dedicated to victim advocacy.
In collaboration with the victim support initiative StopNCII, Microsoft is adopting a more “victim-centered” approach to detection. StopNCII lets individuals create digital fingerprints, or “hashes,” of intimate images; participating platforms can then use those hashes to find and remove matching copies.
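To make the fingerprinting idea concrete, here is a toy sketch of perceptual hashing in plain Python. This is purely illustrative: StopNCII's production system uses industry-grade perceptual hashes, not this simplified "average hash," and the pixel grids below are made-up stand-ins for real images.

```python
# Toy perceptual hash: a compact fingerprint that stays stable under
# small image changes (recompression, minor edits). Illustrative only;
# not StopNCII's actual algorithm.

def average_hash(pixels):
    """Hash a grid of grayscale pixel values (0-255) into a bit string."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the image average.
    return ''.join('1' if p > avg else '0' for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

original        = [[200, 30], [40, 220]]   # hypothetical image
slightly_edited = [[198, 33], [41, 219]]   # e.g. a recompressed copy
different_image = [[10, 240], [250, 5]]

h_orig = average_hash(original)
h_edit = average_hash(slightly_edited)
h_diff = average_hash(different_image)

# A platform stores and compares only hashes, never the images themselves.
print(hamming_distance(h_orig, h_edit))  # 0  -> flagged as a match
print(hamming_distance(h_orig, h_diff))  # 4  -> not a match
```

The design point is that only the hash leaves the user's device, so platforms can recognize a reported image wherever it reappears without ever holding the image itself.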
During a pilot program conducted through August, Microsoft utilized StopNCII’s database to instantly identify and prevent the display of intimate images in Bing search results. The tech giant reports that it has already acted on 268,000 explicit images.
StopNCII’s hashing system is already employed by various social media platforms and dating services, making Bing the first search engine to join this collaborative effort against NCII abuse.
In a parallel effort, another major tech company is taking steps to address the prevalence of nonconsensual deepfake content. This company has been overhauling its search ranking algorithms to reduce the visibility of explicit synthetic content, opting to prioritize high-quality, non-explicit materials, such as reputable news articles. Additionally, it has improved its reporting and review systems to facilitate quicker removal of harmful content, although it has yet to adopt StopNCII’s technology.
“Search engines are critical in the access and dissemination of images, making Bing’s proactive measures particularly significant for the welfare of those impacted,” noted an advocate from a relevant organization.
Microsoft also maintains similar processes for addressing NCII abuse related to real images and enforces stringent policies against intimate extortion, commonly referred to as sextortion. Earlier this year, Microsoft equipped StopNCII with its proprietary PhotoDNA technology, enhancing their collective efforts to identify and eliminate child sexual abuse material.
How to Report Intimate Images with StopNCII
If you’re concerned that an explicit or non-explicit image may be at risk of unauthorized sharing or manipulation, you can register a hash of it with StopNCII for potential future detection. The hash is generated on your own device, and your personal images are never stored on any external platform.
- Go to StopNCII.org.
- Select “Create your case” from the top right corner.
- Follow the prompts to provide details about the image or video.
- Select the photos or videos on your device; StopNCII creates hashes from them and sends only those hashes to participating platforms, so your images themselves are not shared.
- Keep your case number safe so you can monitor whether your content has been detected online.
If you have experienced the nonconsensual sharing of intimate images, seek help by calling the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The initiative’s website also provides valuable information and a list of international resources.