Police Are Searching 150 Million Passport Photos With Facial Recognition
Quick take: UK police have been secretly searching over 150 million passport and immigration photos using facial recognition for six years. Searches jumped from 2 in 2020 to over 400 in 2023, all without public knowledge or debate. A legal challenge is expected in 2026. This is a reminder that any photo tied to your identity - passport photos, ID selfies, even social media profile pictures - can end up in a facial recognition database. The fewer photos of your face available publicly, the harder it is to track you. Private photo sharing keeps your images out of these systems.

What happened in the UK
For six years, UK police forces have been running facial recognition searches against a database of over 150 million passport and immigration photos. The program was never publicly announced. There was no parliamentary debate. Citizens whose photos were searched were never notified.
The scale grew fast. In 2020, police ran just 2 facial recognition searches against the database. By 2023, that number had jumped to over 400. The searches were conducted by multiple police forces across the country, with no standardized oversight or accountability framework.
The revelation came through freedom of information requests and investigative reporting, not voluntary disclosure. A formal legal challenge is expected in 2026, with civil liberties groups arguing the program violates privacy rights under UK and European human rights law.
How facial recognition databases work
Facial recognition systems convert photos of faces into mathematical representations called faceprints: numerical vectors that encode facial features such as the distance between your eyes, the shape of your jaw, and the proportions of your nose. Modern systems derive these vectors with neural networks. Once your face is converted into a faceprint, it can be compared against millions of other faceprints in seconds.
- Passport and ID photos: Government databases are the largest sources of facial data. Your passport photo, driving license photo, and any government-issued ID photo can be included.
- Social media photos: Companies like Clearview AI have scraped billions of photos from social media platforms to build private facial recognition databases used by law enforcement.
- CCTV and surveillance: Real-time facial recognition cameras can match faces in a crowd against watch lists. London's Metropolitan Police has used live facial recognition at public events.
- Photo sharing platforms: Any photo you upload publicly can potentially be scraped and added to a facial recognition database. You won't know it happened.
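The matching step described above can be sketched in a few lines. This is a generic illustration, not any vendor's actual algorithm: the four-dimensional "faceprints", record IDs, and similarity threshold below are all made up (real systems use learned vectors with hundreds of dimensions).

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two faceprint vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_matches(probe, database, threshold=0.8):
    """Return IDs whose stored faceprint is similar enough to the probe photo."""
    return [pid for pid, emb in database.items()
            if cosine_similarity(probe, emb) >= threshold]

# Toy database of two enrolled faceprints (hypothetical IDs and values)
database = {
    "passport_001": [0.9, 0.1, 0.3, 0.4],
    "passport_002": [0.1, 0.9, 0.2, 0.1],
}
# A probe faceprint extracted from, say, a CCTV frame
probe = [0.88, 0.12, 0.29, 0.41]
print(find_matches(probe, database))  # → ['passport_001']
```

Because each comparison is just vector arithmetic, scanning 150 million stored faceprints is cheap - which is why a single probe photo can be searched against an entire national database in seconds.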

It's not just the UK
Government facial recognition programs exist worldwide. US Immigration and Customs Enforcement (ICE) has used facial recognition to search driver's license databases. The FBI has access to over 640 million photos through its Next Generation Identification system. Australia's identity-matching services connect passport photos with law enforcement databases.
In the EU, the AI Act bans real-time biometric identification in public spaces, with narrow law enforcement exceptions. But the ban doesn't cover retrospective searches of existing photo databases - which is exactly what the UK program does.
Clearview AI changed everything
In 2020, Clearview AI was exposed as having scraped over 3 billion photos from social media to build a facial recognition tool used by police. By 2026, that database has grown to over 50 billion images. Despite being fined in multiple countries (Australia, the UK, France, Italy, Greece), the company continues operating. Some law enforcement agencies still use it.
The lesson is stark: if your photo is publicly accessible online, it may already be in a facial recognition database. You likely have no way to know, and limited legal options to remove it.
Why this matters for your photo sharing
Every photo of a face that's publicly accessible is a potential entry in a facial recognition database. This includes:
- Photos posted on social media (even with "friends only" settings)
- Profile pictures on any platform
- Photos shared via public links without access controls
- Event photos posted on public websites
- Group photos shared in public forums or chat groups
The connection between photo sharing and facial recognition surveillance is direct. The more photos of your face exist publicly, the easier it is to identify and track you. This isn't a hypothetical - it's how these systems work today.
What you can do about it
You can't remove your passport photo from government databases. But you can control how many other photos of your face are publicly accessible.
- Share privately, not publicly: Use password-protected or private links instead of posting photos on social media. Private shares can't be scraped by facial recognition companies.
- Audit your social media: Review what photos are public on your profiles. Consider removing or restricting access to clear face photos, especially of children.
- Use platforms that don't scan faces: Some photo platforms run facial recognition for features like "people" albums. Others, like Viallo, don't process faces at all.
- Be cautious with ID selfies: Think twice before uploading ID photos or selfies for account verification. Check what the company does with that data and how long they keep it.
- Check for your face in databases: Services like Have I Been Pwned show whether your email appears in known data breaches, though they don't cover faces. For facial recognition specifically, some EU data protection authorities can help you submit deletion requests to companies like Clearview AI.

How private photo sharing helps
The fundamental difference between posting a photo on Instagram and sharing it via a private link is accessibility. A public social media post can be scraped by anyone - including facial recognition companies. A password-protected album link is only accessible to people who have the link and password.
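That access-control difference can be sketched abstractly. The code below is a generic illustration of password-gated sharing, not Viallo's actual implementation; the album contents and password are invented for the example.

```python
import hashlib
import hmac

def make_album(photos, password=None):
    """Create an album record; store only a hash of the password, never the password itself."""
    record = {"photos": photos, "password_hash": None}
    if password is not None:
        record["password_hash"] = hashlib.sha256(password.encode()).hexdigest()
    return record

def fetch_album(album, password=None):
    """Serve photos only if the album is public or the correct password is supplied."""
    if album["password_hash"] is None:
        return album["photos"]   # public: any client, including a scraper, can read it
    if password is None:
        return None              # protected: anonymous scrapers are refused
    supplied = hashlib.sha256(password.encode()).hexdigest()
    if hmac.compare_digest(supplied, album["password_hash"]):
        return album["photos"]
    return None

public = make_album(["beach.jpg"])
private = make_album(["kids.jpg"], password="family-2026")
print(fetch_album(public))                  # a scraper can read this
print(fetch_album(private))                 # but not this: None
print(fetch_album(private, "family-2026"))  # only people you gave the password to
```

The point of the sketch: a facial recognition scraper crawls URLs anonymously, so anything behind the password check is simply invisible to it.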
Viallo doesn't run facial recognition on your photos. There's no face scanning, no faceprint generation, no people tagging. Your photos are stored in EU data centers (Germany), protected by GDPR, and only accessible to you and the people you explicitly share with.
This matters especially for family photos, children's photos, and group photos where multiple people's faces appear. Every face in a publicly shared group photo is a potential entry point into a facial recognition system.
The legal landscape in 2026
The law is catching up, slowly. The EU AI Act bans real-time public facial recognition with exceptions. GDPR gives EU residents the right to object to biometric processing. Several US states (Illinois, Texas, Washington) have biometric privacy laws. But enforcement is uneven, and most government surveillance programs operate in legal gray areas.
The UK legal challenge expected in 2026 could set an important precedent. If courts rule that searching passport databases without consent violates privacy rights, it could force similar programs worldwide to either get explicit authorization or shut down.
Until the legal framework catches up, the most effective protection is practical: limit how many photos of your face are publicly accessible. That's something you can control right now.
Frequently Asked Questions
Can police use my passport photo for facial recognition?
In the UK, yes - police have been doing this since at least 2018. In the US, the FBI and ICE have access to driver's license and passport databases. In the EU, the AI Act restricts but doesn't fully ban database searches by law enforcement.
Are my social media photos in facial recognition databases?
Possibly. Clearview AI scraped over 50 billion photos from social media and the open web. If your photos are publicly accessible, they may be in such databases. You can submit deletion requests under GDPR in Europe or under state laws like Illinois BIPA.
Does private photo sharing protect against facial recognition?
Yes. Photos shared via private, password-protected links are not accessible to web scrapers or facial recognition companies. Only people with the link and password can view them.
Does Viallo use facial recognition?
No. Viallo does not scan faces, generate faceprints, or run any facial recognition on your photos. Organization is based on GPS metadata and manual album creation, not biometric analysis.
How do I remove my face from Clearview AI?
EU residents can use GDPR rights to request deletion. Clearview AI has a form on their website for opt-out requests. In the US, residents of Illinois, Texas, and Washington have stronger rights under state biometric privacy laws.