Clare Garvie, an attorney at the Georgetown Center for Privacy and Technology, testified before Congress in 2019 about the misuse of face recognition technology in the US. It wasn’t pretty.
Garvie told the House Committee on Oversight and Reform, “Face recognition presents a unique threat to the civil rights and liberties protected by the US Constitution.”
After her testimony, I asked Garvie: why unique? She told me face recognition chills First Amendment free speech and, as used by many police departments, outright violates Fourth Amendment protections against unreasonable search and 14th Amendment rights to due process.
She … found the majority of US police departments that had face recognition tech were not forthcoming about how it was used, or even whether they had it.
The NYPD, for example, claimed it had no documents on face recognition. It later turned over 2,000 pages on it when forced to. The NYPD was found to have fabricated evidence repeatedly. In one case, it tried to catch a beer thief whom face recognition software couldn't ID by using a photo of Cheers actor Woody Harrelson. The detectives thought the thief looked like him, so they fed Harrelson's photo into the software to force a match. Police routinely used falsified "probe photos" like this when they couldn't get a match from the software.
A wanted poster for a beer thief, whom the NYPD arrested after running an image of Woody Harrelson through facial recognition software.
Privacy laws and civil rights lawsuits, if they ever catch up to surveillance overreach, take years to get through the courts, at tremendous expense. Clearview AI, one of the biggest vendors of face recognition data and services to police, private companies, and even rich individuals, faces multiple class action lawsuits for a litany of legal and regulatory violations. Clearview's clients include 2,200 domestic and foreign law enforcement agencies, the DOJ, DHS, universities, Walmart, Best Buy, Macy's, and on and on. Did Clearview AI ask your permission to use photos you posted to Facebook or Twitter or Instagram in a criminal lineup? No? Clearview AI lets police, or a random corporate cog at Walmart, do it anyway, even in localities where that has been specifically ruled illegal.
To make matters worse, face recognition tech can do a horrible job in identifying non-white suspects, putting non-Caucasians at higher risk for “false positives,” e.g., being arrested for a crime they did not commit because of a bad match. …
Amazon also sells a face recognition service, Rekognition. Rekognition has been excoriated for providing inadequate, incorrect, or non-existent training and documentation to clients, which inevitably leads to misuse. In a Washington Post article on Rekognition, the police department that helped develop it, the Washington County Sheriff's Office in Oregon, stated it had never had a complaint from defendants or defense attorneys on the use of face recognition software. But detective Robert Rookhuÿzen also said: "Just like any of our investigative techniques, we don't tell people how we catch them. We want them to keep guessing."
Isn't that obstruction of justice? US police are required to share their evidence with defense attorneys, particularly exculpatory evidence, such as when a face recognition search for a suspect generates multiple possible matches. As Georgetown's Garvie pointed out, refusing to share those matches is a breach of the 14th Amendment.
Clearview AI and Rekognition are but two of dozens of developers of face recognition technology in the US. While much of it has been improving, it is still a mixed bag. Some versions of it can achieve a 99 percent match rate under ideal conditions. Under less-than-ideal conditions, that number can fall to 50 percent. And the rate of accurate matches from composite sketches is unbelievably low, with as few as 20 percent valid matches.
None of these shortfalls are highlighted to the public, defendants, and quite frequently, even the police using the face recognition systems. Instead, cops are told by vendors like Amazon that it works like magic. …
Some police agencies do require face recognition matches to be reviewed by trained human experts, as the FBI does. But as Steven Talley can attest, human face recognition experts are not infallible either. Talley was falsely arrested twice for bank robbery even after police had video footage showing Talley at his job while the robbery occurred. The second arrest, disregarding both fingerprint evidence and a rock-solid alibi, was based on the opinion of an FBI face recognition expert. The arrests cost Talley his job, home, marriage, and access to his children. Talley also suffered a broken sternum, broken teeth, ruptured discs, blood clots, nerve damage, and a fractured penis during the first brutal false arrest.
Surveillance provides safety?
Face recognition is a shinier, sexier iteration of the implicit promise justifying mass installation of CCTVs years ago. It ran, “If only we unleash unblinking and ever-present surveillance technology, we’ll reap a harvest of safety.”
The facts show otherwise. The largest-ever meta-analysis of surveillance cameras, covering 40 years of data, found that they reduce car theft and other property crime by 16 percent, and violent crime not at all.
They can help catch violent criminals and terrorists after the fact, but surveillance cameras do not prevent violent crime. …
When surveillance becomes too much:
Journalist Jon Fasman … perfectly encapsulated the challenge surveillance technologies pose to civil society:
I want to make a point about efficacy as justification. There are a whole lot of things that would help police solve more crimes that are incompatible with living in a free society. Suspension of habeas corpus would probably help police solve more crimes. Keeping everyone under observation all the time would help police solve more crimes. Allowing detention without trial might help the police solve more crimes.
All of these things are incompatible with living in a free, open, liberal democracy.
The article is mainly about Jetson, a new and secretive technology that can apparently ID people from over 200 meters away by their unique heartbeat signature, using invisible infrared lasers. You simply have no idea that it is being deployed.