A new report has revealed a startling surge in AI “nudify” apps across Google and Apple’s app stores. These apps, which digitally remove clothing from images, are raising serious safety and ethical concerns as millions of users download them worldwide. While high-profile platforms like xAI’s Grok have drawn public scrutiny, the problem is far broader than any one service.
According to the Tech Transparency Project (TTP), dozens of apps capable of generating nonconsensual sexualized images remain available, highlighting gaps in app store monitoring. These findings are driving renewed calls for stricter oversight and urgent action from both Apple and Google.
TTP identified 55 apps on the Google Play Store and 48 on Apple’s App Store that can “digitally remove the clothes from women and render them partially or fully naked.” Collectively, these apps have been downloaded over 705 million times and have generated approximately $117 million in revenue.
Despite previous reports and partial removals, many of these apps continue to operate, often returning to the stores after brief suspensions. This pattern shows that current safeguards are not enough to prevent the spread of nonconsensual AI-generated content.
Google has suspended several of the apps highlighted in the TTP report, while Apple removed 28, two of which were later restored. However, xAI’s Grok and similar platforms remain accessible, raising questions about inconsistent enforcement.
Critics point out that companies react quickly to certain controversial apps but have been slower to address AI nudify platforms that generate degrading images of real people. This inconsistency has fueled outrage among digital rights advocates, who warn that delayed action allows harmful content to proliferate.
AI nudify apps are now under legal and regulatory pressure in multiple regions. Grok and other AI platforms are facing investigations in the European Union and the United Kingdom, alongside at least one lawsuit from victims of nonconsensual AI-generated content.
The surge of these apps coincides with growing public concern over AI misuse. Experts warn that without stronger oversight, these tools could normalize harmful behavior, putting vulnerable populations—particularly women and minors—at serious risk.
The recurring appearance of nudify apps highlights limitations in Google and Apple’s moderation policies. Even when apps are removed, new versions or similar apps often emerge quickly. Experts argue that this cycle demonstrates the need for more proactive content screening, better AI detection tools, and stricter developer accountability.
Consumers, meanwhile, are left navigating app stores where harmful technology can appear disguised as harmless entertainment. Advocates are calling for urgent reforms that prioritize user safety over revenue, noting that millions of downloads show the vast reach and impact of these apps.
The TTP report underscores the urgent need for regulatory action and robust app store governance. Developers and platforms must adopt more rigorous ethical standards, and policymakers must consider rules that prevent AI-generated abuse.
For users, the rise of AI nudify apps is a wake-up call: technology that seems entertaining can have serious real-world consequences. Awareness, responsible use, and stronger oversight remain key to stopping the spread of nonconsensual sexualized AI content.