In April 2019, we blogged about the lack of rules governing the collection and use of people's images by facial recognition software. Privacy advocates were raising concerns about the high potential for misuse of this powerful technology. Criminal defense lawyers saw the potential for more wrongful arrests.
Providers of facial recognition services, like Clearview AI, "scrape" images from the internet and from social media websites like Facebook, YouTube, Twitter and LinkedIn, without the permission of the sites or their users. They then sell access to their image databases and search technology to law enforcement agencies, from the FBI to highway patrols to local police departments.
Your Face is in a Database
Most citizens would be surprised to learn that more than 117 million American adults – more than half the adult population – have had their images catalogued in law enforcement facial recognition networks, according to a 2016 study by Georgetown Law.
In 2016, Texas police had the ability to search 24 million mugshots in the FBI's database, as well as 24 million drivers' license photos of people, the vast majority of whom had never committed a crime. At that time, the Dallas transit police were planning to acquire facial recognition software.
Fast forward to March 2020: CBS News Dallas-Fort Worth reported that facial recognition software from Clearview AI was being used by the Fort Worth, Irving and Plano police departments in what they described as a trial period.
And the technology has taken off all across the country as police have found it to be a useful tool for certain kinds of cases – shoplifting, check forgery, and ID fraud. The problem is that it's being used in every kind of criminal case, there are real problems with how the technology works, and, worse yet, there are few safeguards.
CBS News spoke with Brian Holland, an expert in technology law at Texas A&M School of Law, who warned about the risk of hacking. Clearview AI was in fact hacked that same month. (The company said its database server wasn't breached.)
For criminal defense lawyers, our more immediate concern is inaccuracy – identifying the wrong person.
Misidentification Leading to Wrongful Arrest
Government testing in 2019 found evidence of racial bias in almost 200 of the algorithms used by these programs, making them far more likely to misidentify people with darker skin tones, especially Black women.
Misidentification is what happened to Nijeer Parks, a New Jersey man who faced the possibility of six years in prison after he was identified by a "state-of-the-art" facial recognition program. The problem is, it wasn't him. He was 30 miles away from the crime scene, conducting business.
That didn't stop the police from arresting him without any other evidence and holding him in jail for 10 days. He's now suing the police department.
Wrongful arrest is no small matter. You could lose your job. Without two weeks' income, you could be unable to pay your rent or mortgage that month. And you will now have an arrest on your criminal record.
While many police departments say that they don't rely exclusively on facial recognition software to make an arrest, the truth is, we don't actually know. The police don't have to explain why someone became a suspect, and they don't have to disclose that they used facial recognition technology. Researchers studying this new use of technology have found cases (most notably in Florida) where there was no other reason for a person to have become a suspect.
If you've been charged with a crime in Collin County, Texas, or you feel you were wrongly arrested, the criminal defense team at Maddox Law is here to help. We thoroughly investigate every charge against you to provide a strong and swift defense. Call 972-546-2496 or send us a message today.