Google makes it easier to remove explicit deepfakes from its search results
It's also updating its algorithms to make them harder to find.
Google has rolled out updates to Search intended to make explicit deepfakes as hard to find as possible. As part of its long-standing and ongoing fight against realistic-looking manipulated images, the company is making it easier for people to have non-consensual fake imagery that features them removed from Search.
It has long been possible for users to request the removal of those kinds of images under Google's policies. Now, whenever it grants someone's removal request, Google will also filter all explicit results on similar searches about them. The company's systems will scan for any duplicates of the offending image and remove them as well. This update could help alleviate victims' fears that the same image will pop up again on other websites.
In addition, Google has updated its ranking systems so that if a user specifically searches for explicit deepfakes with a person's name, the results will surface "high-quality, non-explicit content" instead. If there are news articles about that person, for instance, then the results will feature those. Based on Google's announcement, it also appears to have plans to educate users looking for deepfakes by showing them results that discuss the impact of such imagery on society.
Google doesn't want to wipe out results for legitimate content, like an actor's nude scene, in its bid to banish deepfakes from its results page, though. It admits it still has a lot of work to do when it comes to separating real explicit images from fake ones. While that remains a work in progress, one of the solutions it has implemented is to demote sites in Search that have received a high volume of removal requests for manipulated images. That's "a pretty strong signal that it's not a high-quality site," Google explains, adding that the approach has worked well for other types of harmful content in the past.