Meta takes legal action against app that can 'nudify' images

The lawsuit, filed in Hong Kong, targets a company called CrushAI.

One of the more troubling developments in artificial intelligence in recent years has been the creation of increasingly realistic fake images known as deepfakes. These can now be produced in seconds using apps — often without a person’s consent.

Meta is taking legal action against one of the most widely used apps that can digitally remove clothing from photos, a disturbing trend that has become a harsh reality for some victims.

The lawsuit was filed in Hong Kong against Joy Timeline HK Limited, the parent company for the CrushAI app. Meta claims the app previously circumvented its ad review process in order to place advertisements on its platforms.

“This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it,” Meta said in a statement.

The company added that it is building new technology to detect and more quickly remove ads promoting nudify-type apps.

RELATED STORY | A 15-year-old’s prom picture was altered into AI-created nudes

Ben Colman, co-founder and CEO of the deepfake detection company Reality Defender, said Meta’s move is a step in the right direction — but not nearly enough.

“They're giving people a false sense of security. You know, it's not like bad actors or hackers are gonna say, ‘Hey, look at me. Here's my company name, here's my address. I'm actually in the U.S., where the laws and the courts can get me.’ They're not in the U.S. Or they're nowhere that can be actually found.”

Colman added that it would not be difficult for a company like Meta to police this kind of content on its own platforms.

Colman said some estimates suggest that 89% of all online media is or will be touched by AI, but he believes the real number is even higher.