Google, Apple hosted dozens of nudify apps, report reveals

Apple’s and Google’s app stores host dozens of so-called “nudify” AI apps, despite both companies’ rules prohibiting them, a new investigation claims.

The Tech Transparency Project (TTP), a research initiative of the nonprofit watchdog Campaign for Accountability, found dozens of apps in both stores that digitally remove clothing, leaving people naked or nearly naked. It found 55 such apps in the Google Play Store and 47 in Apple’s App Store.

Citing data from app analytics firm AppMagic, TTP wrote: “The apps identified by the TTP have collectively been downloaded more than 705 million times worldwide and generated $117 million in revenue. Because Google and Apple get a cut of that revenue, they are directly profiting from the activity of these apps.”

Both tech giants have responded to the TTP report. Apple told CNBC it had removed 28 apps identified in the report, while Google told the outlet it had “suspended a number of apps” while its investigation continued.

However, the TTP concludes that both app stores need to do more to prevent non-consensual deepfakes.

“The TTP’s findings show that Google and Apple are failing to keep pace with the proliferation of AI deepfake apps that can ‘undress’ people without their permission,” the report said. “Both companies say they’re dedicated to protecting users, but they host a collection of apps that can transform an innocuous photo of a woman into a humiliating, sexualized image.”

The TTP report follows the controversy surrounding Elon Musk and xAI’s Grok, which is under investigation in several countries for producing sexual, non-consensual images. Mashable’s investigation found that Grok lacks basic safety guardrails to prevent deepfakes. Additionally, researchers say Grok created more than 3 million sexually explicit images over an 11-day period between December 29 and January 8 — more than 20,000 of which depicted children.

With the dawn of the AI era, sexual deepfakes will continue to be a major issue for tech companies moving forward.

Updated: January 28, 2026, 12:43 PM EST This story was updated to correct the name of The Tech Transparency Project and provide the correct link to the organization.
