I Was Wrongfully Arrested Because of Facial Recognition Technology. It Shouldn’t Happen to Anyone Else

On January 9, 2020, Detroit Police Department (DPD) officers arrested me on my front lawn in Farmington Hills, Michigan, in front of my wife and two young daughters, for a crime I had nothing to do with. They refused to tell me why, and I had to spend the night sleeping on a cold concrete bench in an overcrowded, filthy jail cell before finally finding out that I was being falsely accused of stealing designer watches from a Detroit boutique.

While interrogating me, a pair of detectives let it slip that I had been arrested based on an incorrect facial recognition identification. The technology has been proven to be both racist and faulty, especially when used in real-world conditions, such as with blurry security footage.

This week we finally reached a settlement in my wrongful arrest lawsuit against the City of Detroit that ensures what happened to me won’t happen again.

Facial recognition technology has access to massive databases with millions of photos — including, at the time I was arrested, a database of 49 million photographs comprising every Michigan driver’s license photo going back years. Anyone who has a driver’s license can be included in these databases. The technology scans them all for similar-looking faces and spits out some possible suspects. Police would tell you they only use this information as a “lead” and then conduct meaningful investigations, but my own personal experience, and that of other wrongfully arrested people around the country, refutes that assertion.

Case in point: the system somehow returned my expired driver’s license photo as an “investigative lead” that might match the thief. Rather than investigate the accuracy of this purported match, police accepted the “lead,” putting my photo in a lineup along with five other photos of Black men, each of whom looked even less like the thief, since no computer algorithm had judged their photos similar enough to the thief to be a possible match. The witness (who hadn’t even seen the crime happen, but merely reviewed the security footage) chose my photo out of this rigged lineup. And that is all the evidence DPD relied upon to arrest me.


When I was finally released after 30 hours, I learned that my oldest daughter had lost her first tooth while I was in jail—a precious childhood memory now warped by trauma for our whole family. She also turned around a photograph of our family because she couldn’t bear to see my face after watching the police haul me away. The girls even started playing cops-and-robbers games and telling me I was the robber. There have been many moments over the last four years where I’ve had to try to explain to two little girls that a computer wrongfully sent their father to jail.

The bogus charges were ultimately dropped, but not before I had to go to court to defend myself against something I didn’t do. Once they were dropped, I demanded that police officials apologize and urged them to stop using this dangerous technology. They ignored me.

Since my story became public in 2020, we’ve learned of two other Black people in Detroit, Porcha Woodruff and Michael Oliver, who were also wrongfully arrested for crimes they didn’t commit based on police reliance on faulty facial recognition technology searches. Similar stories continue to pop up around the nation.

In a more just world, the cops would be banned from using this technology altogether. While this settlement couldn’t go that far, the DPD’s use of this dangerous and racist technology will now be much more tightly controlled. They will not be able to conduct a photo lineup based solely on a lead derived from facial recognition. Instead, they can only conduct a lineup after using facial recognition if they first uncover independent evidence linking the person identified by facial recognition to a crime. In other words, DPD can no longer substitute facial recognition for basic investigative police work.

Their obligations don’t end there. Whenever DPD uses facial recognition in an investigation, they must inform courts and prosecutors about any flaws and weaknesses of the facial recognition search they conducted, such as the poor quality of the photo searched, as with the grainy security footage in my case. DPD will also, for the first time, have to train its officers about the limitations and inaccuracies of facial recognition technology, including how it falsely identifies Black people at much higher rates than white people.

What the Detroit Police Department made me endure changed my life forever. When I was being hauled off to jail, I felt like I was in a bad movie I couldn’t leave. For the past several years since my wrongful arrest, my family and I have traveled around Michigan and the country urging policymakers to protect their constituents from the horror I went through by stopping law enforcement from misusing this technology. I’ve repeatedly explained that I don’t want anyone to live with the fear and trauma that facial recognition technology inflicted on my family. With the settlement of my case, we take a big step toward that goal.


Robert Williams