ICE’s surveillance app is a techno-authoritarian nightmare

For The Guardian, I wrote about Mobile Fortify, an app that lets agents obtain vast amounts of information on anyone by scanning their face.

The lethal force Immigration and Customs Enforcement (ICE) is meting out on American streets is rightly drawing loud condemnation from politicians and editorial boards across the nation and around the world. Now is the time to start paying attention to another highly damaging part of ICE’s arsenal: the agency’s deployment of mass surveillance.

I’m referring specifically to Mobile Fortify, a specialized app ICE has been using since at least May 2025. (Usage of the app was first reported last June by 404Media.) What is Mobile Fortify? It’s a facial recognition app that can also capture “contactless fingerprints” simply by photographing a person’s fingers. The app has been used more than 100,000 times, including on children, as alleged in a lawsuit filed by the State of Illinois and the City of Chicago. And it’s dangerous.

After taking someone’s picture, an ICE agent can now scan for that person’s face or fingerprints in a host of government databases that reportedly include more than 200 million images. The agent will immediately obtain vast amounts of information on that person, including name and date of birth, possible citizenship status, names of family members, markers like alien registration numbers and much more.

ICE is reportedly using the app on people it suspects of being in the country without authorization, but that presumption brings problems of its own. (ICE is also believed to be scanning random people of color on the streets to determine citizenship.) Representative Bennie G Thompson, the ranking member of the House homeland security committee, told 404Media that ICE considers “an apparent biometric match by Mobile Fortify [to be] a ‘definitive’ determination of a person’s status and that an ICE officer may ignore evidence of American citizenship – including a birth certificate – if the app says the person is an alien”.

It gets worse. In a document obtained by 404Media, the government admits that “it is conceivable that a photo taken by an agent using the Mobile Fortify mobile application could be that of someone other than an alien, including US citizens or lawful permanent residents”. No one, citizen or non-citizen, is allowed to opt out, either. And, as the document states, “[e]very new photograph or fingerprint, regardless of match, is an encounter and stored and retained in ATS [Automated Targeting System] for 15 years”.

Fifteen years is an absurdly long time to retain such data. By comparison, the TSA’s use of facial recognition is optional, and the agency says it deletes photographs once verification is complete. Then again, testimony during a 21 January hearing revealed that the TSA has been assisting ICE by checking passenger information for immigration enforcement operations.

This kind of technology is clearly not limited to the US. In Gaza, the Israeli military has also widely employed facial recognition to conduct mass surveillance, using it to identify and detain Palestinians, as the New York Times has reported. The Times also reported that the “technology struggled” in its mission, so the military began supplementing its search results with Google Photos. Is there a connection between the extremely intrusive mass surveillance of Palestinians in Gaza and the mass surveillance happening on our streets? Put another way, are we too being transformed into overly surveilled subjects, like Palestinians in the occupied territories?

Today’s facial recognition tools are often roundly criticized for their inaccuracy, as they should be; misidentifications are legion. Facial recognition has always been better at identifying white men than other people. One 2018 study led by a researcher at MIT found that the maximum error rate in facial recognition software for light-skinned men was 0.8%, while the error rate for darker-skinned women reached 34.7%.

And the consequences of such biases are real. In New Jersey…

Read the rest here.
