Facial Recognition Tech Used by Police Can't Tell Black People Apart, and That Led to the First Wrongful Arrest in the US
Police knock on the door. The wife of Robert Williams, a Black American from Michigan, answers and is asked where her husband is. She tells the officers that he is not at home, but they refuse to believe her and ask whether he has just entered the house. She says no; the person who entered was her mother...
A few hours later, Robert Williams parks his car in front of the house and is arrested by police officers right in front of his two daughters. He is taken to the police station, where a detective hands him a picture of a Black man and asks whether it is him. Robert politely replies no, at which point the detective hands him another picture and says, "Tell me this isn't you either." Williams holds the picture next to his face, says "it's not me," and adds that he hopes the detective doesn't think all people of color look the same.
The detective's response is that the computer says it is him... This abuse by the police stems from the fact that Michigan police had begun relying on facial recognition technology that apparently cannot reliably distinguish people of color. The system not only misled the detective handling the case, leading to Robert Williams being wrongfully arrested, but is now the subject of demands that it be taken out of service on the grounds that it simply does not work.
According to NBC News, Michigan police were investigating the theft of five watches worth a total of $3,800, taken from a Shinola store in October 2018. The suspect their facial recognition system pointed them to was Mr. Williams, who spent 30 hours in custody in January 2020 before detectives concluded that they had arrested the wrong man.
Robert Williams turned to the American Civil Liberties Union of Michigan (ACLU) to file a complaint over his wrongful arrest in January 2020, after the search algorithm of Rank One Computing, the service used by Michigan police to identify potential criminals from facial recognition footage, incorrectly matched the photo on Mr. Williams' driver's license to a suspected shoplifter.
"The computer must have gotten it wrong" was the parting remark the wrongfully arrested man received on his way out of the police station. In the complaint filed on Robert Williams' behalf, however, the ACLU demands that facial recognition technology be taken out of use for identifying criminal suspects, Robert's case being clear proof that it is not reliable.
"The facts of Mr. Williams' case prove both that the technology is flawed and that investigators are not competent in making use of such technology. Even if Rank One performs well, that didn't help Mr. Williams here and Rank One should take Responsibility. "
Jacob Snow, an attorney at ACLU of Northern California
It appears, however, that Microsoft Corp. and Amazon.com Inc. have stopped selling facial recognition technology to police following the recent protests against arrests that unfairly target African Americans and other minorities. The story of facial recognition does not stop there, though: here are some details about an American company that has been working with American police for several years, flying under the radar for a while, and about how such companies can strip away the last inch of privacy we once had, turning our nations into nations of suspects.
Clearview is a company that, with the help of AI and facial recognition technology, works as a search engine for faces: give it a person's picture and it searches the immense depths of the internet, finding that face on social media profiles or tagged in other people's photos, and helping investigators build a profile of a potential suspect whom no human eye witnessed but a facial recognition camera did. So far it has offered American police a database of some 3 billion faces, including that of Jacob Ward of CNBC, who volunteered to test the technology.
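To make the mechanics a little more concrete, here is a minimal sketch of how a face-search engine of this general kind works conceptually. This is not Clearview's or Rank One's actual code: it assumes faces have already been turned into numeric "embedding" vectors by some recognition model, the vectors here are random stand-ins, and the similarity threshold is an arbitrary assumption.

```python
# Illustrative sketch of a face-search engine's core step (NOT any vendor's real code).
# Assumes a face-recognition model has already converted each face image into a
# fixed-length embedding vector; here the vectors are random placeholders.

import numpy as np

def cosine_similarity(a, b):
    """Similarity between two embedding vectors, roughly in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_faces(query, database, threshold=0.6, top_k=5):
    """Return the top_k database identities most similar to the query face.

    Every hit above `threshold` is only a statistical guess: two different
    people can still produce similar vectors, which is exactly how a
    wrongful match like Mr. Williams' can happen.
    """
    scores = [(name, cosine_similarity(query, emb)) for name, emb in database.items()]
    scores = [(name, s) for name, s in scores if s >= threshold]
    return sorted(scores, key=lambda item: item[1], reverse=True)[:top_k]

# Hypothetical usage: 128-dimensional embeddings, randomly generated for illustration.
rng = np.random.default_rng(0)
database = {f"profile_{i}": rng.normal(size=128) for i in range(1000)}
query = rng.normal(size=128)
print(search_faces(query, database))
```

The key point of the sketch is the threshold: set it low and the system returns more "matches," including people who merely resemble the suspect; set it high and it returns fewer, possibly missing the real one. Either way, the output is a ranked guess, not an identification.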
All these "faces" that are part of Clearview's gigantic database are taken from social networks such as facebook, twitter, or even google. However, it appears that they(facebook, twitter, google) do not agree with the work that Clearview is doing in the service of the police, and that they seek to ban its access to the collection of such data.
The use of facial recognition technology practically eliminates the possibility of anonymity, considering that such cameras are installed in almost every corner of the world, cannot be fooled by a mustache or a hat, and can be found even in gas stations, where they are used to pay for a burger... gradually transforming our world into a Big Brother play.
No lawmaker, however, has asked citizens whether they agree to be subjected to such technologies, and it appears that both the coronavirus pandemic and violent conflicts around the world are being used as catalysts to justify deploying facial recognition in more countries and on more street corners, giving the authorities access to more and more of the personal data that social media users have voluntarily handed to the internet, not knowing how it could one day be used against them and their privacy.
Hoan Ton-That, CEO of Clearview, believes that his company and its business do not violate the fundamental rights and freedoms of the people who end up in its huge database, and argues that Google has been doing much the same thing for a very long time. I personally believe that the use of such technologies will only destroy whatever piece of privacy we, as human beings, still have, and that we will gradually become, as I mentioned in a previous post, mere commercial products: bar codes, faces and numbers on a screen, and nothing more. Slowly, human society is turning into a nation of suspects, fulfilling the prophecies George Orwell handed down through his literary work.
Thanks for your attention,
Adrian