In a significant move aimed at tackling the growing issue of non-consensual explicit images, Microsoft has introduced a tool that lets victims of deepfake porn remove their images from Bing search results. The tool marks a major step forward in protecting individuals from the devastating effects of revenge porn and AI-generated synthetic nudes, helping victims regain control over their online privacy.
The rise of generative AI has led to an alarming increase in deepfake porn, in which synthetic nude images are created to resemble real individuals without their consent. The proliferation of these images on search engines has become a critical issue, and Microsoft's new tool is aimed squarely at this growing problem.
Partnership with StopNCII for Digital Fingerprinting
To provide a robust solution, Microsoft has partnered with StopNCII, a leading organization dedicated to helping victims of revenge porn. StopNCII enables victims to create a unique digital fingerprint, known as a “hash,” for explicit images. Platforms can then use this fingerprint to identify and remove matching images from their services; through this collaboration, Bing can scrub flagged images from its search results using StopNCII's hash technology.
StopNCII already partners with other major platforms, including Facebook, Instagram, TikTok, Reddit, and Snapchat. By joining forces with StopNCII, Microsoft brings Bing into a broader, coordinated effort to remove these harmful images across the web.
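Conceptually, hash-based matching lets a platform check indexed images against a list of victim-reported fingerprints without ever storing the reported images themselves. The sketch below is a deliberately simplified illustration using a cryptographic SHA-256 digest, which only matches byte-identical copies; StopNCII's actual system reportedly generates a perceptual hash on the victim's own device (the image itself is never uploaded), allowing it to also match resized or lightly altered copies.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as the image's digital fingerprint.

    Simplification: a real system would use a perceptual hash so that
    near-duplicates also match; SHA-256 matches exact copies only.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# The platform stores only the hashes reported by victims, never the images.
reported_hashes = {fingerprint(b"reported-image-bytes")}

def should_remove(image_bytes: bytes) -> bool:
    """Check an indexed image against the reported-hash list."""
    return fingerprint(image_bytes) in reported_hashes

print(should_remove(b"reported-image-bytes"))   # exact copy: matched
print(should_remove(b"unrelated-image-bytes"))  # no match
```

Because only the hash leaves the victim's device, this design avoids re-sharing the very content the victim is trying to remove.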
Microsoft’s Actions to Tackle Revenge Porn
Microsoft’s decision to introduce the tool follows a pilot program in which 268,000 explicit images were removed from Bing’s image search by the end of August 2024. The company concluded that relying on user reports alone was not enough to address the issue, and the new tool lets victims take immediate action rather than waiting for reports to be processed.
According to Microsoft, the tool responds to concerns raised by victims, experts, and stakeholders who felt that traditional user reporting methods were insufficient, providing a quicker and more reliable way to remove harmful content.
Google’s Approach and Criticisms
While Microsoft has embraced StopNCII, Google has faced criticism for not partnering with the organization. Although Google offers its own tools for reporting explicit images, many argue that the lack of collaboration with StopNCII has hindered its ability to tackle the deepfake porn problem. According to a report from Wired, Google users in South Korea alone have reported over 170,000 search and YouTube links for unwanted sexual content since 2020.
The contrast between Microsoft’s proactive stance and Google’s approach has been noted by many in the tech industry, with Microsoft’s tool setting an example for other companies to follow on this sensitive issue.
The Growing Threat of AI Deepfake Porn
The use of AI to create deepfake porn is a growing concern, particularly for younger individuals. Although StopNCII’s tools are designed for people over 18, there has been a surge in cases involving minors: high school students across the country are increasingly falling victim to deepfake technologies, highlighting the need for stronger regulations and tools. The absence of federal laws specifically targeting AI deepfake porn in the United States leaves many vulnerable.
The U.S. currently lacks a comprehensive law addressing AI-generated deepfake porn, relying instead on a patchwork of state laws. As of August 2024, 23 states had passed laws against non-consensual deepfakes, while nine had rejected similar proposals. Tools like Microsoft’s provide a much-needed line of defense in this largely unregulated space.
Legal Action and Future Prospects
In response to the growing threat posed by deepfake porn, prosecutors in San Francisco filed a lawsuit in August 2024 to take down 16 websites known for hosting deepfake content. These “undressing” sites use AI to generate synthetic nude images of real people, causing irreparable harm to their victims. Microsoft’s move underscores that more tech companies need to follow suit and implement similar protections.
The hope is that other platforms, including Google, will now enhance their own efforts to combat the spread of non-consensual explicit content. Such proactive measures are crucial to ensuring that victims of deepfake porn are not left defenseless against this invasive technology.
Featured Image Credits: Getty Images