Google makes it easier to remove explicit deepfakes from its search results

Google has recently introduced updates to Search aimed at making explicit deepfake content significantly harder to find. As part of its ongoing effort against realistic but manipulated images, the company now offers an easier process for individuals to have non-consensual fake images of themselves removed from Search.

Previously, users could already request the removal of such images under Google’s policies. Now, when Google approves a removal request, it will also filter out explicit results from similar searches related to that person. The system will additionally scan for and remove duplicates of the offending image, which can help victims feel more secure, knowing the same image won’t easily resurface on other sites.

Additionally, Google has refined its ranking systems to ensure that if someone searches specifically for explicit deepfakes using a person’s name, the search results will highlight “high-quality, non-explicit content” instead. This means that news articles or other relevant non-explicit content about the person will appear in the results. Furthermore, Google plans to educate users searching for deepfakes by showing them content that discusses the societal impact of such images.

Google is careful not to remove legitimate content, such as an actor’s nude scene, while targeting deepfakes, and it acknowledges that distinguishing between real and fake explicit images remains a challenge. In the meantime, one strategy it has adopted is demoting websites that frequently receive removal requests for manipulated images. A high volume of such requests signals that a site is low quality, an approach that has proven effective against other types of harmful content in the past.

Source

Control F5 Team
Blog Editor