The Detroit Police Department mistakenly arrested a mother of three based on a false match generated by facial recognition software, according to a lawsuit against the department. Civil rights advocates say the case highlights the dangers of bias in using such technologies in police work, which studies have shown do a poor job of matching images of non-white people.

Early on the morning of 16 February, six Detroit police officers arrived outside the door of Porcha Woodruff, 32, who was eight months pregnant at the time and helping her two children get ready for school. Police informed the mother, who is Black, that she was under arrest for a January carjacking and robbery.

‘Are you kidding, carjacking? Do you see that I am eight months pregnant?’ she told officers, according to a federal lawsuit filed last week.

She later learned police identified her as a suspect after running security footage through the department’s facial recognition software, then putting a 2015 mugshot from a past arrest into a photo lineup, where the carjacking victim singled out Ms Woodruff as her assailant.

Though officials later dropped the case, Ms Woodruff argues in the suit that the incident, which allegedly caused her stress, dehydration, and stress-induced contractions, illustrates the dangers of biased facial-recognition software.

‘Despite its potential, law enforcement’s reliance on facial recognition has led to wrongful arrests, causing humiliation, embarrassment, and physical injury, as evident in this particular incident,’ the complaint alleges.

Detroit Police said they are investigating her claims.

‘We are taking this matter very seriously, but we cannot comment further at this time due to the need for additional investigation,’ Chief James White told CNN in a statement.
‘We will provide further information once additional facts are obtained and we have a better understanding of the circumstances.’

The mistaken arrest is believed to be at least the sixth such incident in the US, according to the American Civil Liberties Union. Three of those have occurred at the DPD, which began using facial recognition technology in 2018, according to the organisation, and all six of those wrongly arrested around the US have been Black.

‘It’s deeply concerning that the Detroit Police Department knows the devastating consequences of using flawed facial recognition technology as the basis for someone’s arrest and continues to rely on it anyway,’ Phil Mayor, senior staff attorney at the ACLU of Michigan, wrote in a statement.

‘As Ms. Woodruff’s horrifying experience illustrates, the Department’s use of this technology must end. Furthermore, the DPD continues to hide its abuses of this technology, forcing people whose rights have been violated to expose its wrongdoing case by case. DPD should not be permitted to avoid transparency and hide its own misconduct from public view at the same time it continues to subject Detroiters to dragnet surveillance.’

In 2019, a 25-year-old Black man named Michael Oliver was wrongly charged with a felony for allegedly stealing and breaking a teacher’s mobile phone. A facial recognition match led police to put his photo in a lineup, even though he had visible tattoos and a different body type, skin tone, and hairstyle from the individual involved in the alleged theft.

The following year, Robert Julian-Borchak Williams was arrested in front of his family for an alleged theft from a high-end Detroit Shinola boutique. His name was given to police after a security contractor sent surveillance video to the DPD, which forwarded the footage to the Michigan State Police, who matched Mr Williams’s name using a facial recognition tool. The case was later dismissed.
‘This is not me,’ Robert Julian-Borchak Williams told police during the investigation, according to The New York Times. ‘You think all Black men look alike?’

Facial-recognition tools can suffer from what’s known as algorithmic bias: preferences built into ostensibly neutral technologies because they were created by biased human beings. Facial ID tools, for example, are often trained on large data sets of images, and using a data set that isn’t diverse can produce technologies that fail to accurately identify people of colour.
Black mother sues Detroit claiming wrongful arrest while pregnant due to face ID tech
Source: independent.co.uk