DIMOND: The Double-Edged Sword of Facial Recognition Technology


On the television crime drama “FBI,” Special Agent Jubal Valentine brusquely orders an underling to run a photograph through facial recognition to identify a suspect. Boom! After a rocket-speed search, the computer spits out a name and address. Field agents get to work, and in no time, the bad guy is under arrest.

I’m here to tell you it is not that easy and it’s not that accurate.

Facial recognition programs are notoriously error-prone, often misidentifying an innocent person as potentially guilty. At the same time, these programs have proven to be wildly successful in catching criminals, both minor offenders and more violent ones, like child rapists.

Like most controversies these days, there is a wide chasm of opinion. Facial recognition is either a great gizmo in law enforcement’s tool belt or another means to perpetuate racial inequity in the name of public safety.

Here are some facts.

A recent federal study of 189 different facial recognition algorithms confirmed previous research showing that facial recognition systems produce shockingly wrong identifications when searches involve people of color (especially women of color), the very young and the elderly. Native Americans had the highest rate of misidentification. Asian and African American faces were up to 100 times more likely to be incorrectly identified than those of white male suspects. Pacific Islanders are also often misidentified.

Case in point: Last January, Robert Williams, a gainfully employed, married father in Michigan, was shocked when he was handcuffed on his front lawn by Detroit Police. Surveillance video showing a heavyset Black man shoplifting expensive watches had been run through a facial recognition program, and Williams’ face came up as a match. Williams had no police record and repeatedly proclaimed his innocence. If the arresting officers had asked, he could have proven he was at work that day. Instead, the humiliated Williams was arraigned on charges of first-degree theft and held for 30 hours. Ultimately, the charges were dropped, but at last report, his record had still not been expunged.

One big complaint is that police are not required to reveal that they zeroed in on a suspect via a facial recognition program. As Aimee Wyant, a senior public defender in Pinellas County, Florida, put it, “Once the cops find a suspect, they’re like a dog with a bone: That’s their suspect. So, we’ve got to figure out where they got that name to start.”

It’s reported that 1 in 4 U.S. law enforcement departments have used facial recognition in the search for suspects, but no definitive statistics are kept on the rate of error. We know minorities and women are prone to misidentification, but just how frequently does that happen?

The FBI, for example, runs more than 4,000 checks per month using a nationwide hodgepodge of photographs of nearly 120 million Americans. These photos come from states’ driver’s licenses, mug shots, juvenile records and other databases. Cooperating states, in turn, get access to the FBI’s system.

Since half of all American adults are in the FBI system, chances are high that your photo is in that database. Could you become another Robert Williams?

There is little oversight of the nation’s facial recognition systems, even though more and more organizations are using them — from surveillance at airports and border crossings to corporate and community security. Their use is more widespread than you can imagine.

A couple of years ago, the American Civil Liberties Union ran a test on Amazon’s Rekognition facial recognition program. Photos of every member of the U.S. Congress were scanned for possible matches with a vast array of mugshots. Astonishingly, 28 members were falsely identified as matching someone in the database.

Still, cop shops across the country can list all sorts of closed cases that began with a trip through facial recognition software. Convictions have been secured for child sex abuse, property crimes, credit card fraud, burglaries, robberies and car theft. Suspects have been identified in cold case shootings and incidents of road rage.

Good detectives know that a facial recognition photo match is only the beginning. Further investigation of alibis, witness statements and forensic evidence analysis is always required before arrest. Has that always happened in the past? No. Do cops learn from their mistakes? Let’s hope so.

Rockland County resident Diane Dimond is a journalist, author, and a regular contributing correspondent for the Investigation Discovery channel. To find out more about Dimond, visit her website at www.dianedimond.com
