OkCupid Gave 3 Million User Photos to an AI Company - The FTC Just Caught Them
Quick take: The FTC just settled with OkCupid and Match Group after discovering they handed nearly 3 million user photos - plus demographic and location data - to an AI startup called Clarifai. There was no data-sharing agreement, no user consent, no payment. OkCupid's founders had personally invested in Clarifai and did it as a favor. The settlement bans them from misrepresenting their privacy practices for 20 years, but there's no fine. Your dating profile photos may have been used to train AI, and there's nothing you can do to undo it.

What actually happened
In September 2014, OkCupid quietly transferred nearly 3 million user photos - along with demographic information and location data - to Clarifai, a visual recognition AI startup. The transfer wasn't part of any business deal. No contract was signed. No money changed hands. No restrictions were placed on how the data could be used.
Why? OkCupid's co-founders had personally invested in Clarifai and handed over the data as a favor to help the startup build its AI models. The FTC described it as sharing done 'not as part of any business arrangement' - just founders helping out a company they had a financial stake in.
The FTC's investigation found that OkCupid's privacy policy at the time told users the app 'doesn't share personal information with others except as indicated in its Privacy Policy or when it informs and gives consumers the chance to opt out.' None of that happened. Users were never told. No opt-out was offered. The AI company wasn't a service provider, business partner, or affiliate - it was an outside startup the founders wanted to help.
What Clarifai does with photos
Clarifai is a visual recognition AI company. Its core product analyzes images to identify faces, objects, scenes, and concepts. The technology is used by everyone from retail brands to government agencies. What it needed in 2014 was training data - lots of it. And OkCupid's database of real human faces with demographic labels was exactly the kind of dataset that AI companies pay millions for.
Here's what makes this particularly troubling: Clarifai received not just photos but also demographic data and location information. That means the AI wasn't just learning to recognize faces - it was learning to associate faces with age, gender, ethnicity, and geography. These are exactly the features that make facial recognition systems both powerful and dangerous.
OkCupid placed no restrictions on what Clarifai could do with the data. Once the photos left OkCupid's servers, OkCupid had no oversight. That data could have been used to train models that are still running today - and probably was.

What the FTC settlement actually does
In March 2026, the FTC announced a settlement with OkCupid and Match Group Americas (the parent company). The terms sound strict on paper: both companies are permanently prohibited from misrepresenting how they collect, use, share, or protect personal data.
But there's a catch. There's no fine. No financial penalty at all. The settlement doesn't require OkCupid or Match Group to pay anything. It doesn't require Clarifai to delete the data. And it certainly doesn't help the users whose nearly 3 million photos were already handed over.
The FTC's enforcement power is limited. It can prohibit deceptive practices going forward, but it can't retroactively undo AI training. Once your photos are baked into a model, they're there. Deleting the original files wouldn't remove what the model learned from them.
Why dating apps are a uniquely bad place for photo privacy
Dating apps collect some of the most sensitive photo data of any consumer platform. Think about what a dating profile contains: multiple clear photos of your face from different angles, often with body shots, sometimes in identifiable locations. Combined with the demographic data you voluntarily provide - age, height, ethnicity, education, location - it's a facial recognition training goldmine.
And unlike social media, where you might curate what you share publicly, dating profiles are created with the assumption of semi-privacy. You're showing yourself to potential matches, not the entire internet. When that assumption breaks down, it feels like a fundamental violation. Meanwhile, the industry is leaning further into AI processing of profile photos:
- Tinder is testing a feature that scans your device's photo gallery using AI to find patterns that could help you find matches
- Bumble uses machine learning on profile photos for verification and moderation
- Match Group (which owns OkCupid, Tinder, Hinge, and others) processes millions of user photos daily across its apps
The OkCupid case is probably the tip of the iceberg. It only came to light because the FTC investigated. How many similar data transfers happened at other companies and never got caught?
This fits a larger pattern
The OkCupid case isn't an isolated incident. It's part of a growing pattern where companies treat user photos as a raw material to be extracted and repurposed.
- Clearview AI scraped over 50 billion photos from the public web to build a facial recognition database sold to law enforcement
- Meta used public Instagram posts to train AI models and is now rolling out features that give AI access to your entire camera roll
- Adobe faced a lawsuit from photographers who discovered their images were used to train generative AI without consent
- AI photo editors were caught leaking 1.5 million user photos earlier this year
In every case, the same story plays out: users upload photos assuming a certain level of privacy, and the platform finds a way to monetize those images that nobody agreed to.
What you can actually do about this
The damage from OkCupid's data transfer is already done. Those 3 million photos are out there and probably baked into AI models that are still in production. But you can limit future exposure.
Audit your dating profiles. Consider what photos you've uploaded and whether you're comfortable with them being used for purposes beyond dating. Remove any photos you wouldn't want in a public-facing AI dataset.
Read privacy policies critically. Look for language about 'service providers' and 'third-party sharing.' If a company says it shares data with 'partners' or 'affiliates,' that's often code for 'whoever we want.'
Separate where you share. Don't use the same platform for casual photo sharing and sensitive personal photos. Dating profiles, family photos, travel memories - these should live in different places with different trust levels.

Why private photo sharing matters more than ever
The OkCupid case is a reminder that any photo you upload to a platform is only as safe as that company's internal policies - and those policies can be violated without you ever knowing about it.
Private photo sharing platforms that don't process your images with AI, don't sell access to third parties, and store data in privacy-compliant jurisdictions offer a fundamentally different trust model. Your photos aren't inventory. They're memories.
At Viallo, photos are stored on EU servers with end-to-end privacy. We don't train AI on your images, we don't share them with third parties, and we strip EXIF metadata from shared photos. Recipients don't even need an account to view albums you share with them.
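To make the EXIF-stripping idea concrete: EXIF is metadata embedded in a JPEG's APP1 segment, and it can include GPS coordinates, timestamps, and camera details. The sketch below is a simplified illustration of the concept (not Viallo's actual implementation) - it walks a JPEG's marker segments and drops the APP1 segment where EXIF lives, assuming a well-formed file with no padding between segments:

```python
def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Return a copy of a JPEG byte stream with APP1 (EXIF/XMP) segments removed.

    Simplified sketch: assumes a well-formed JPEG with no fill bytes
    between marker segments.
    """
    if jpeg_bytes[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG file")
    out = bytearray(b"\xff\xd8")  # keep the Start-of-Image marker
    i = 2
    while i < len(jpeg_bytes):
        marker = jpeg_bytes[i:i + 2]
        if marker == b"\xff\xda":
            # Start-of-Scan: compressed pixel data follows, copy the rest verbatim
            out += jpeg_bytes[i:]
            break
        # Segment length is big-endian and includes the two length bytes
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if marker != b"\xff\xe1":  # drop APP1, where EXIF/GPS tags are stored
            out += jpeg_bytes[i:i + 2 + length]
        i += 2 + length
    return bytes(out)
```

The pixel data is untouched, so the image looks identical - only the hidden metadata is gone. Real-world strippers also handle APP0 thumbnails, XMP, and malformed files, which is why using a maintained library is usually the better choice.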
Frequently asked questions
Were my OkCupid photos used to train AI?
If you had an OkCupid account before September 2014, your photos may have been among the 3 million transferred to Clarifai. The FTC's complaint doesn't name specific users, and OkCupid has not offered a way for individuals to check. There's no mechanism to remove your data from models that have already been trained.
Why is there no fine in the FTC settlement?
The FTC's authority under Section 5 of the FTC Act allows it to prohibit deceptive practices but doesn't always include monetary penalties for first-time violations. The 20-year consent order is the enforcement mechanism - if either company violates the terms, it would face substantial penalties then.
Does this affect other Match Group apps like Tinder and Hinge?
The FTC's action specifically covers OkCupid and Match Group Americas. The consent order applies to both entities, meaning the privacy requirements extend across the corporate group. However, the specific data transfer to Clarifai only involved OkCupid's user data.
How can I protect my photos on dating apps?
Use unique photos that aren't posted elsewhere online, avoid including identifiable backgrounds, and review the app's privacy policy for third-party sharing clauses. For personal photos you want to keep private - family memories, travel photos - use a dedicated private sharing platform rather than a social or dating app.
Can Clarifai be forced to delete the photos?
The FTC settlement doesn't require Clarifai to delete anything. EU residents could potentially use GDPR rights to request deletion, but the company is US-based. Even if original photos were deleted, the AI models trained on them would retain what they learned - you can't untrain a model by deleting its input data.