
The new lawsuit that shows facial recognition is officially a civil rights issue


Williams’ wrongful arrest, first reported by the New York Times in August 2020, was based on a bad match from the Detroit Police Department’s facial recognition system. Two more false arrests have since been made public. Both of those men are also black, and both have taken legal action.

Now Williams is following their lead and going further – not just by suing the department over his wrongful arrest, but by trying to get the technology banned.

On Tuesday, the ACLU and the University of Michigan Law School’s Civil Rights Litigation Initiative filed a lawsuit on behalf of Williams, alleging that the arrest violated his Fourth Amendment rights and Michigan’s civil rights law.

The lawsuit calls for compensation, more transparency about the use of facial recognition and an end to the direct or indirect use of facial recognition technology by the Detroit Police Department.

What the lawsuit says

The documents filed on Tuesday lay out the case. In March 2019, the DPD ran a grainy image of a black man in a red cap, taken from Shinola’s surveillance video, through the facial recognition system of a company called DataWorks Plus. The system returned a match with an old driver’s license photo of Williams. Investigators then included Williams’ license photo in a photo lineup, and a Shinola security guard (who was not present at the time of the theft) identified Williams as the thief. Officers obtained an arrest warrant, which required sign-offs from multiple department officials, and Williams was arrested.

The complaint argues that Williams’ false arrest was a direct result of the facial recognition system, and that “this unlawful arrest and detention case illustrates the grave harm caused by the misuse of, and reliance on, facial recognition technology”.

The complaint makes four claims, three of which focus on the lack of probable cause for the arrest, while one focuses on the racial disparities in the impact of facial recognition. “By employing technology that has been empirically proven to incorrectly identify black people at far higher rates than other groups of people,” it states, “the DPD denied Mr. Williams the full and equal use of the services, privileges, and advantages of the Detroit Police Department because of his race or color.”

The difficulties facial recognition technology has in identifying darker-skinned individuals are well documented. After George Floyd was murdered in Minneapolis in 2020, some cities and states announced bans and moratoriums on police use of facial recognition. But many others, including Detroit, continued to use it despite growing concerns.

“Relying on sub-par images”

When MIT Technology Review spoke with Williams’ ACLU attorney, Phil Mayor, last year, he noted that issues of racism within American law enforcement make the use of facial recognition even more worrisome.

“This is not a bad-actor situation,” Mayor said. “This is a situation where we have a criminal justice system that is extremely quick to prosecute and very slow to protect people’s rights, especially when it comes to people of color.”

Eric Williams, a senior attorney at Detroit’s Economic Equity Practice, says cameras have many technological limitations, not least of which is that they are hard-coded with color ranges for identifying skin tone and often simply cannot process darker skin.

“I think every black person in the country has had the experience of being in a photo and the picture coming out either a lot lighter or a lot darker,” says Williams, who is a member of the ACLU of Michigan’s legal committee but is not working on Robert Williams’ case. “Lighting is one of the main factors affecting image quality. The fact that law enforcement is relying, to some degree, on really sub-par images is problematic.”

There is precedent for legal challenges to racially biased algorithms and technologies. Facebook, for example, underwent a massive civil rights audit after its targeted advertising algorithms were found to serve ads on the basis of race, gender, and religion. YouTube was sued in a class action by black creators who alleged that its AI systems profile users and censor or discriminate against content based on race. YouTube was also sued by LGBTQ+ creators who said its content moderation systems flagged the words “gay” and “lesbian”.

Some experts say it was only a matter of time before the use of biased technology by a major institution like the police was met with legal challenges.


Steven Gregory