The White House Wants to Override State Photo Privacy Laws - What You Need to Know

9 min read · By Viallo Team

Quick take: The White House released an AI legislative framework on March 20 that asks Congress to preempt state AI and privacy laws. That includes laws like Illinois' BIPA, which forced Meta to pay $650 million for using facial recognition without consent. If federal law overrides state protections, the strongest photo privacy rules in the country could disappear - and AI companies would face fewer restrictions on how they use your images.


What the White House framework actually says

On March 20, 2026, the White House published its national AI legislative framework - a blueprint for how it wants Congress to regulate artificial intelligence. The document covers six areas: protecting children, safeguarding communities, respecting intellectual property, preventing censorship, enabling innovation, and educating Americans.

The headline, though, is preemption. The framework explicitly asks Congress to override state AI laws that the administration considers too burdensome for AI developers. The argument is that a "patchwork of conflicting state laws" hurts American innovation and global competitiveness.

In practice, this means states "should not be permitted to regulate AI development" and shouldn't "penalize AI developers for a third party's unlawful conduct using their product." The framework would shift regulatory power from states to the federal government, where the approach is explicitly lighter-touch.

Which state privacy laws are at risk

The strongest photo and biometric privacy protections in the US exist at the state level. Here's what could be affected:

  • Illinois BIPA (Biometric Information Privacy Act). The most powerful biometric privacy law in the country. It requires companies to get written consent before collecting facial recognition data, fingerprints, or iris scans. BIPA generated the $650 million Meta settlement and hundreds of other lawsuits. It's the reason Meta briefly abandoned facial recognition entirely.
  • Texas CUBI Act. Requires consent before capturing biometric identifiers and allows the attorney general to bring enforcement actions. Texas sued Meta in 2022 over facial recognition used in Facebook photos.
  • Washington State biometric privacy law. Similar to BIPA but without the private right of action. Still provides meaningful protections for facial recognition data collected from photos.
  • Colorado AI Act. Effective in 2026, it requires developers of high-risk AI systems to disclose what data they use and allow consumers to opt out. Photo analysis systems would likely qualify as high-risk.
  • California CCPA/CPRA. Gives consumers the right to know what personal information is collected and to delete it. This includes photos and biometric data derived from photos.

If Congress passes a federal law that preempts these state laws, the replacement would need to be at least as strong to maintain current protections. The framework's emphasis on "light touch" regulation suggests that's unlikely.


What this means for your photos

Right now, if you live in Illinois and a company scans your photos with facial recognition without your consent, you can sue them. BIPA gives individuals a private right of action - you don't need the government to act on your behalf. That's why it's been so effective.

Federal preemption could replace that with a system where only federal agencies can enforce violations. Individual lawsuits would disappear. Companies that scan your photos without consent would answer to regulators instead of courts - and regulators can be slow, underfunded, or politically influenced.

This matters because AI companies are increasingly interested in photos. Facial recognition is used for everything from unlocking phones to identifying people in surveillance footage. AI training datasets are built from photos scraped from the internet. Dating apps are scanning camera rolls. Every one of these uses involves someone's photos being processed by AI - and state laws are currently the strongest check on how that processing happens.

The child safety question

The framework says it wants to protect children, but it shifts the burden of online safety from companies to parents. Instead of requiring platforms to implement age verification or content filtering, the framework emphasizes parental controls and education.

For photos, this is particularly relevant. Kids' photos end up on social media, in school systems, on messaging apps, and in cloud storage. If AI companies can process these images with fewer restrictions, the risk of children's photos being used for facial recognition training, deepfake generation, or behavioral profiling increases.

The 61 data protection authorities who signed the Joint Statement on AI-Generated Imagery in February 2026 specifically flagged risks to children. Federal preemption would remove some of the state-level protections that currently exist for minors' biometric data.

How EU protections compare

While the US debates weakening state protections, the EU is going the opposite direction. The EU AI Act, which started enforcement in 2026, classifies biometric identification as high-risk and bans real-time facial recognition in public spaces (with limited exceptions for law enforcement).

Under GDPR, any AI processing of personal photos requires explicit consent, a legitimate purpose, and clear disclosure. You have the right to object to automated processing, request deletion, and know exactly what your data is being used for. These aren't state-level rules that can be preempted - they're continent-wide regulations with serious enforcement teeth.

The practical difference: if you store photos on an EU-hosted platform, your images are protected by GDPR regardless of what happens in US federal legislation. If you store them on a US platform, your protections depend on which state you live in - and those protections might be about to shrink.

Try Viallo Free

Share your photo albums with a single link. No account needed for viewers.

Start Sharing Free

What you can do about it

Legislative battles take time, and this framework is a proposal, not a law. But you can take steps now to protect your photos regardless of what Congress decides:

  • Choose EU-hosted storage. GDPR protections apply based on where data is stored and processed, not where you live. EU-hosted photo platforms give you stronger baseline protections than US-hosted alternatives.
  • Use platforms that don't run AI on your photos. If a platform doesn't scan your photos with facial recognition or image analysis, the question of whether that scanning is legal doesn't apply to you.
  • Limit public photo exposure. The less your face appears in publicly accessible photos, the less useful it is for AI training datasets. Use private sharing instead of social media when possible.
  • Support state-level privacy legislation. If you live in a state with strong biometric or privacy laws, pay attention to preemption efforts. These laws exist because your state legislature decided your data deserves protection.

Viallo stores photos on EU servers with no AI processing, no facial recognition, and no biometric data collection. Your photos are protected by GDPR, not dependent on whichever direction US federal law goes.


The bottom line

The White House AI framework is one proposal in what will be a long legislative process. It might pass as-is, get modified significantly, or stall entirely. But the direction is clear: the federal government wants to reduce regulatory friction for AI companies, and state privacy laws are seen as friction.

If you're counting on your state's privacy laws to protect your photos from AI scanning, facial recognition, or training data collection, that protection isn't guaranteed to last. The safest approach is to store photos on platforms where the protections are structural - EU hosting, no AI processing, full user control - rather than dependent on laws that might change.


Frequently Asked Questions

Is the White House AI framework a law?

No. It's a legislative framework - a set of recommendations for how the White House wants Congress to write AI legislation. Congress still needs to draft, debate, and pass actual bills. The process could take months or years.

Would federal preemption eliminate BIPA?

It depends on the scope of the federal law. If Congress passes legislation that broadly preempts state biometric privacy laws, BIPA's private right of action could be replaced by weaker federal enforcement. The framework suggests states "should not be permitted to regulate AI development," which could extend to biometric data collection.

Does GDPR protect my photos even if I live in the US?

GDPR applies based on where your data is stored and processed, not where you live. If your photos are stored and processed by an EU-based service, GDPR protections apply to that data. This is why choosing EU-hosted photo storage gives you stronger baseline protections.

Can AI companies legally use my photos for training right now?

It depends on the platform's terms of service and your state's laws. Many platforms include broad licensing terms that may cover AI training. In states with biometric privacy laws, companies need explicit consent for facial recognition. Without those state laws, the restrictions would be weaker.

Does Viallo collect biometric data from my photos?

No. Viallo doesn't run facial recognition, image analysis, or any AI processing on your photos. It uses GPS metadata and timestamps for organization. Photos are stored at full resolution on EU servers under GDPR protection.
