705 Million Downloads: AI Nudify Apps Are on Your Phone's App Store

10 min read · By Viallo Team

Quick take: Researchers found over 100 AI "nudify" apps on the Apple App Store and Google Play Store, with a combined 705 million downloads and $117 million in revenue. These apps use AI to generate nude images from real people's clothed photos - often without the subject's knowledge or consent. Apple and Google removed some after being contacted, but dozens remain. Your publicly shared photos are the raw material these apps run on.


What researchers found

In early 2026, the Tech Transparency Project published findings from a systematic search of both major app stores. They found 55 AI nudify apps on Google Play and 47 on Apple's App Store. These aren't obscure apps hiding in the corners of the internet - they're available right next to your banking app and your weather widget.

The numbers are staggering. Combined, these apps have been downloaded 705 million times and have generated approximately $117 million in revenue. Some of the most popular ones rank in the top charts of their categories. They use euphemistic names and descriptions to skirt platform policies - terms like "AI undress", "clothes remover", or vague references to "body editing."

After the Tech Transparency Project contacted Apple and Google with their findings, Apple removed 28 apps and Google removed 31. But dozens remained available, and new ones continue to appear. The app review process at both companies clearly isn't catching them consistently.

How these apps actually work

The technology behind nudify apps is straightforward. They use generative AI models - specifically diffusion models similar to those powering Stable Diffusion and DALL-E - that have been fine-tuned on datasets of nude images. You upload a clothed photo of someone, and the AI generates a synthetic nude image that matches the person's body proportions, skin tone, and pose.

The results aren't perfect, but they don't need to be. Even a roughly convincing fake nude can cause devastating harm when shared among classmates, coworkers, or on revenge porn sites. And the technology is getting better rapidly - the gap between obviously fake and convincingly real is shrinking with every model update.

Most of these apps operate on a freemium model. They let users generate a few images for free, then charge for higher quality outputs, batch processing, or watermark removal. The $117 million in revenue proves there's a large and willing market for this.


Who gets targeted

The Grok deepfake scandal in March 2026 put a human face on this problem. Three teenage girls from Tennessee filed a class-action lawsuit against Elon Musk's xAI after Grok generated sexualized deepfakes from their social media photos. A New York Times investigation found that Grok generated 4.4 million images in just nine days, with researchers estimating 23,000 of those were sexualized images of children.

But the problem extends far beyond Grok. Studies consistently show that the primary targets of AI-generated non-consensual intimate images are women and girls. A 2023 report from the Internet Watch Foundation found a 417% increase in AI-generated child sexual abuse material in just one year. The availability of nudify apps on mainstream app stores has only accelerated this.

The source material for these apps is almost always publicly available photos. Social media profiles, school photos, sports team pictures, vacation snapshots - any photo where someone is clearly visible can become input for a nudify app. The victims don't even need to have shared the photo themselves.

Why app stores keep failing to stop this

Both Apple and Google have policies that explicitly prohibit apps generating non-consensual intimate imagery. Apple's App Store Review Guidelines ban "content that is defamatory, discriminatory, or mean-spirited." Google Play's policy prohibits apps that "facilitate the generation or distribution of content that depicts, facilitates, or promotes child sexual abuse."

Despite these policies, 102 nudify apps made it through review. There are a few reasons for this recurring failure:

  • Deceptive listings. Apps use innocent-sounding names and descriptions during the review process, then reveal their actual functionality after approval. Some even maintain a "clean" mode that activates during Apple and Google's automated testing.
  • Scale overwhelms review. Apple's App Store receives roughly 1,000 new app submissions per day. Google Play receives even more. Thorough manual review of every app's actual functionality is practically impossible.
  • Financial incentives are misaligned. Both Apple and Google take a 15-30% cut of in-app purchases. A nudify app generating millions in revenue also generates significant commission for the platform.

Where the law stands

Laws targeting AI-generated intimate images are being written and passed at an accelerating pace, but enforcement still lags far behind the technology.

The US federal TAKE IT DOWN Act, signed in May 2025, criminalizes the non-consensual distribution of intimate images, including AI-generated ones. Several states have passed their own laws targeting deepfake pornography specifically. In the EU, the AI Act classifies certain deepfake applications as high-risk, imposing transparency and compliance obligations.

But prosecuting app developers who operate from jurisdictions with weak enforcement is difficult. Many nudify apps are developed by anonymous teams registered in countries that have little interest in pursuing these cases. Even when apps are removed from one store, they often reappear under different names within days.

Try Viallo Free

Share your photo albums with a single link. No account needed for viewers.

Start Sharing Free

How to protect your photos

There's no perfect defense against someone downloading your photo and running it through an AI tool. But you can significantly reduce your exposure.

  • Lock down your social media profiles. Set Instagram, Facebook, and other accounts to private. Public profiles are the primary source of photos used in nudify apps.
  • Be selective about what you post publicly. Full-body photos and clear facial images provide the most usable input for these tools. Consider which photos genuinely need to be public and which can be shared privately with people you trust.
  • Use private photo sharing instead of social media. When you want to share photos with family and friends, use a platform that doesn't make photos publicly accessible. Sharing an album through a private link means your photos aren't indexed, searchable, or scrapable - a sketch of how such links work follows this list.
  • Talk to your kids about photo privacy. Teenagers are disproportionately targeted. Help them understand that photos shared publicly can be misused in ways they might not anticipate.
  • Report nudify apps when you find them. Both Apple and Google allow you to report apps that violate policies. The more reports an app receives, the more likely it is to be reviewed and removed.
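What makes a private link "private" is that the URL itself carries a long random secret. Here's a minimal sketch of the idea in Python; the domain, path layout, and function name are illustrative assumptions, not any particular service's implementation:

```python
import secrets

# Minimal sketch of minting an unguessable, link-based album URL.
# The domain and path format are hypothetical, for illustration only.

def make_share_link(album_id: str) -> str:
    # 32 random bytes ~= 256 bits of entropy: a crawler or attacker
    # can't guess or enumerate this, unlike a public profile URL.
    token = secrets.token_urlsafe(32)
    return f"https://photos.example/a/{album_id}/{token}"

print(make_share_link("summer-trip"))
```

The protection rests entirely on the token's entropy and on keeping the URL out of public, crawlable places - which is also why revoking access means invalidating the token server-side.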

Why where you share matters more than ever

The existence of 705-million-download nudify apps on mainstream app stores isn't just a policy failure - it's a fundamental shift in what it means to share a photo online. Every publicly visible photo is now potential input for AI tools that can generate intimate images without consent.

This doesn't mean you should stop sharing photos. But it does mean the distinction between public and private sharing matters more than it ever has. Photos shared on a public social media profile are accessible to anyone, including people running nudify apps. Photos shared through a private, link-based sharing service are only visible to the people you choose.

The photo you share today could be processed by an AI tool that doesn't exist yet. Choosing where and how you share is the most important privacy decision you can make.

Frequently Asked Questions

Are nudify apps illegal?

It depends on the jurisdiction. In the US, the TAKE IT DOWN Act criminalizes non-consensual distribution of intimate images, including AI-generated ones. Several states have additional laws targeting deepfake pornography. In the EU, the AI Act regulates certain deepfake applications. But the apps themselves often operate from jurisdictions where enforcement is weak.

Can I tell if my photos have been used in a nudify app?

Unfortunately, there's no reliable way to know. These apps process photos locally or on the developer's servers, and the generated images are delivered privately to the person who uploaded your photo. Unless someone shares the result publicly, you may never find out.

Why don't Apple and Google just ban all AI image editing apps?

Not all AI image editing is harmful. Legitimate tools for photo enhancement, background removal, and creative editing use similar technology. The challenge is distinguishing between harmful and legitimate uses during app review, especially when apps deliberately disguise their true functionality.

How does Viallo protect my photos from being scraped?

Viallo shares photos through private links that require the specific URL to access. Photos aren't indexed by search engines, aren't visible on public profiles, and can't be scraped by bots crawling social media. You control exactly who sees your photos and can revoke access at any time.
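For the curious, here's a rough sketch of how a private-sharing service can signal "don't index this" at the HTTP level. This is a generic illustration using Flask, not Viallo's actual code; the route and response headers are assumptions about how such a service might work:

```python
from flask import Flask, make_response

app = Flask(__name__)

# Hypothetical private album route that opts out of search
# indexing and referrer leakage at the HTTP level.
@app.route("/a/<album_id>/<token>")
def album(album_id: str, token: str):
    # Real code would validate the token against the album here.
    resp = make_response(f"Album {album_id}")
    # Ask crawlers not to index this page or follow its links.
    resp.headers["X-Robots-Tag"] = "noindex, nofollow"
    # Don't leak the secret URL via the Referer header when a
    # viewer clicks an outbound link.
    resp.headers["Referrer-Policy"] = "no-referrer"
    return resp

if __name__ == "__main__":
    app.run()
```

Combined with an unguessable link token, headers like these are what keep a shared album out of search results and bot crawls.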
