A Photographer Tried to Stop Adobe From Training AI on His Photos. He Lost.
Quick take: The founder of Diversity Photos, a stock archive built to represent underrepresented communities, partnered with Adobe Stock in 2018 to distribute his work. He later discovered Adobe used his entire library to train Firefly AI - despite Adobe publicly claiming Firefly was "ethically sourced." He sued. He lost. The ruling sets a dangerous precedent: if you upload photos to a platform, the terms of service may give that platform the right to train AI on your work. Your photos aren't just stored. They're raw material.

What happened between Adobe and Diversity Photos
In 2018, a photographer created Diversity Photos - a curated stock photography archive specifically designed to provide authentic representation of underrepresented communities. He partnered with Adobe Stock to distribute the collection, signing what he understood as a standard distribution agreement.
Years later, Adobe launched Firefly, its generative AI image tool. Adobe marketed Firefly as trained exclusively on "ethically sourced" content - Adobe Stock images, openly licensed content, and public domain work. The photographer realized his entire Diversity Photos library had been fed into the training pipeline. No separate consent was requested. No additional compensation was offered.
He filed a legal challenge to stop Adobe from using his photos for AI training. In March 2026, he lost.
How stock photo deals turn into AI training sets
The core issue isn't that Adobe secretly stole photos. It's that the terms of service already granted Adobe broad usage rights. When you upload to most stock photo platforms, you agree to terms that allow the platform to use your content for "platform improvement," "service development," or "related purposes." In 2018, nobody read those clauses and thought "AI training." In 2026, that's exactly what they mean.
This isn't unique to Adobe. The same pattern plays out across the industry:
- Getty Images: Licensed its entire archive to multiple AI companies while simultaneously suing Stability AI for using unlicensed Getty images. The message: we'll sell your photos to AI companies, but only if we get paid.
- Shutterstock: Partnered directly with OpenAI to provide training data for DALL-E. Contributors who uploaded before the AI partnership had no practical way to opt out.
- Google Photos: Terms allow Google to use uploaded content to "improve services." Google's AI research papers cite training on "user-contributed image datasets."
The legal reality is uncomfortable: the moment you accept a platform's terms and upload your photos, you've likely granted them rights that extend far beyond simple storage.

The legal argument - and why it failed
The photographer's legal team argued that AI training constitutes a fundamentally different use than stock photo distribution. Uploading a photo for licensing to editorial clients is not the same as feeding it into a machine learning model that generates competing images. The argument makes intuitive sense.
The court disagreed. The ruling leaned on the broad language in Adobe's contributor agreement, which granted rights to use uploaded content for "any purpose related to the operation and improvement of the service." Adobe's position was that Firefly is a service improvement - a new creative tool built on top of their existing platform. The court accepted this interpretation.
What makes this ruling significant isn't the specific case. It's the precedent. If broad terms of service can retroactively authorize AI training on content uploaded years before generative AI existed, then every photo you've ever uploaded to any platform is potentially fair game.
What this means for every photo you upload
You don't have to be a professional photographer for this to matter. The same legal framework applies to consumer photo platforms. When you upload family photos to Google Photos, vacation snapshots to iCloud, or event pictures to Facebook, you're accepting terms that grant the platform broad usage rights.
Most people think of photo storage as a utility. You put photos in, you get photos out. But the business model behind "free" photo storage has always been more complicated than that. Your photos have value beyond storage fees:
- Training data for AI models: Every photo teaches AI systems about composition, lighting, faces, objects, and scenes.
- Advertising intelligence: Photos reveal where you go, what you buy, who you spend time with, and what you care about.
- Product development: User photos help platforms build features like facial recognition, scene detection, and auto-tagging.
The Diversity Photos case shows that legal protections for photographers are weak when platform terms are broad. And consumer terms are typically even broader than professional contributor agreements.
Try Viallo Free
Share your photo albums with a single link. No account needed for viewers.
Start Sharing Free
How to actually protect your photos from AI training
There's no perfect solution, but there are practical steps that reduce your exposure:
- Read the AI-specific terms: Most major platforms have added AI training clauses to their terms of service since 2023. Look specifically for language about "machine learning," "model training," or "service improvement."
- Use opt-out mechanisms when available: Some platforms offer AI training opt-outs, but they're often buried in settings. Google, Meta, and Adobe all have opt-out toggles. Whether opting out retroactively removes your data from existing models is unclear.
- Choose platforms with explicit no-training policies: A growing number of platforms explicitly state they will never use your content for AI training. This is a business decision, not a legal requirement - but it's the strongest protection available.
- Keep originals on local storage: If a photo never touches a cloud platform, it can't be used for training. This is the most secure approach but the least convenient.
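The first step above can be sketched as a quick keyword scan over a saved copy of a platform's terms. This is an illustrative sketch, not legal advice: the phrase list and sample text below are assumptions, not any real platform's actual terms, and a human should always read the flagged passages in context.

```python
import re

# Phrases that commonly signal AI-training rights in terms of service.
# This list is illustrative, not exhaustive.
AI_CLAUSES = [
    "machine learning",
    "model training",
    "artificial intelligence",
    "service improvement",
    "improve our services",
]

def flag_ai_clauses(terms_text: str) -> list[str]:
    """Return the sentences of a ToS document that mention AI-related phrases."""
    # Naive sentence split on punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", terms_text)
    return [
        s for s in sentences
        if any(phrase in s.lower() for phrase in AI_CLAUSES)
    ]

# Hypothetical sample clause, for demonstration only.
sample = (
    "You grant us a license to host your content. "
    "We may use your content for service improvement, "
    "including machine learning and model training."
)
for hit in flag_ai_clauses(sample):
    print(hit)
```

A scan like this only surfaces candidate passages; broad language such as "improve our services" can cover AI training without ever using the words, as the Adobe ruling shows.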

Choosing platforms that respect your photos
The Diversity Photos ruling makes one thing clear: trusting that a platform "probably won't" use your photos for AI training isn't enough. You need explicit, contractual guarantees.
When evaluating a photo platform, ask these questions:
- Do the terms of service explicitly exclude AI/ML training from permitted uses?
- Is the platform's revenue model based on subscriptions, not data monetization?
- Where are the servers located? EU-hosted platforms are subject to GDPR, which requires a lawful basis, such as explicit consent, for processing your personal data.
- Does the platform use any image recognition, facial scanning, or content analysis? If it does, your photos are being processed by AI systems - even if not for "training."
Viallo's terms explicitly state that uploaded photos are never used for AI training, model development, or any purpose beyond storage and delivery to the people you choose to share with. The platform runs on subscription revenue, not data monetization. Photos are stored on EU-based servers under GDPR jurisdiction, and no image recognition or content analysis is performed on your uploads.
Your photos are your creative work. They're also your memories. Both deserve better protection than a terms-of-service clause written by a platform's legal team.
Frequently Asked Questions
Can Adobe legally train AI on photos I uploaded to Adobe Stock?
Based on the March 2026 ruling, yes. The court found that Adobe's contributor agreement grants broad enough rights to cover AI training. This interpretation applies to photos uploaded before Firefly existed. If you're an Adobe Stock contributor, check your agreement's "permitted uses" section.
Does Google Photos use my photos for AI training?
Google's terms allow using uploaded content to "improve services." Google has not explicitly confirmed or denied using Google Photos uploads for training specific AI models. Their privacy page offers an opt-out for some AI features, but the scope is limited.
How can I tell if a photo platform uses my photos for AI?
Check the terms of service for language about "machine learning," "automated processing," "model training," or "service improvement." If the platform uses any form of image recognition, facial detection, or auto-tagging, it's running AI on your photos. The question is whether the results are also fed back into training pipelines.
Does Viallo use uploaded photos for AI training?
No. Viallo's terms explicitly prohibit using uploaded photos for AI training, model development, or any automated processing beyond basic file handling. Photos are organized using GPS metadata and timestamps - no image content analysis is performed.
Can I opt out of AI training on stock photo platforms?
Some platforms offer opt-out mechanisms, but effectiveness varies. Adobe added an AI training opt-out for new uploads, but it's unclear whether opting out removes your existing photos from models already trained. The safest approach is choosing platforms that don't train AI at all.