Meta has filed a lawsuit against the developers of Crush AI, a “nudify” tool that uses AI to generate fake unclothed images of real people without their consent.
The company behind the app, Joy Timeline HK, is accused of running more than 8,000 ads across Facebook and Instagram, despite repeated removals.
The lawsuit was filed in Hong Kong, where Joy Timeline HK is based, after Meta said the company repeatedly attempted to bypass its ad review systems.
The app reportedly received around 90% of its website traffic from Meta’s platforms.
Tactics Used to Evade Detection
Meta states that the app’s developers used deceptive methods. These included creating dozens of advertiser accounts and regularly switching domain names.
Crush AI also sometimes advertised under misleading names like “Eraser Annyone’s Clothes” followed by a string of numbers. The tactics worked: the app’s ads kept slipping under the radar even after Meta removed earlier ones.
According to Alexios Mantzarlis, author of the Faked Up newsletter, the company behind Crush AI continued its ad campaigns despite repeated warnings.
One Facebook page even openly promoted the app’s services.
The Big Problem
Crush AI is not the only app of its kind. Platforms such as X, Reddit, and YouTube have also struggled with a surge in ads for AI nudify services.
In 2024, links to these apps spread rapidly, exposing millions of users, including minors.
In response, Meta and TikTok blocked common search terms related to these apps.
However, stopping these services entirely remains difficult. Many of them simply rebrand and reappear under new names.
New Safety Measures
Meta says it has developed better tools to detect harmful ads. These systems can now identify and remove AI nudify ads even when the content itself does not include nudity.
Here are the updates:
- Smarter Detection Systems: Meta now uses matching technology to find similar or duplicate ads more quickly.
- Expanded Flagging Terms: The company added new words, phrases, and emojis to its detection filters.
- Targeted Disruption: Meta is using advanced strategies to detect and take down networks of bad actors.
Since the beginning of 2025, Meta reports it has dismantled four ad networks connected to AI nudify services.
Tech Partners and Policymakers
Meta also shares information with other companies. Through the Tech Coalition’s Lantern program, a joint effort by Meta, Google, Snap, and others, the company has provided over 3,800 URLs connected to AI nudify services since March.
Outside the tech industry, Meta continues to support stronger digital safety laws.
The company backs the Take It Down Act, a federal law that criminalizes sharing nonconsensual intimate images, including AI-generated ones, and requires platforms to remove them. Meta also supports legislation that would give parents greater control over the apps their teens can download.
Digital Abuse
Crush AI and similar tools pose a serious risk to privacy and safety. They allow users to create fake explicit content of real people.
Often, the victims have no idea their image is being used. This is not just a violation of privacy but a form of digital abuse.
The damage can be long-lasting, especially when the content spreads quickly online.