Eric Borglund

Discrimination in Facial Recognition

A facial recognition system is a technology capable of matching a human face from a digital image or a video frame against a database of faces. Facial recognition is everywhere: it unlocks our phones and appears in security cameras, in airports, and even in law enforcement. But there is growing evidence that this technology, which so many of us use and benefit from, discriminates.


Inequality in facial recognition algorithms:


Several studies have shown that facial recognition makes far more errors on people with darker skin. Commercial systems from three major vendors (Microsoft, Face++, and IBM) all performed worse on darker skin, and worst of all on darker-skinned women. This shows that the technology is biased: it works well on light-skinned people and was not built for a diverse range of faces. The likely reason is that the systems were trained on many white male faces but far fewer women and people of colour, which made them less accurate for the latter groups. Lower accuracy means more misidentifications, and potentially more people being wrongly stopped and questioned by the law enforcement agencies that use this technology to track down criminals.
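To see how an accuracy gap compounds at scale, here is a minimal back-of-the-envelope sketch. The error rates mirror the pattern the Gender Shades audit reported (lowest for lighter-skinned men, highest for darker-skinned women), but the search volume is a made-up assumption for illustration only:

```python
# Illustrative calculation: how per-group error rates translate into
# expected misidentifications at scale. The error rates below follow
# the pattern reported by the Gender Shades audit; the search volume
# is a hypothetical assumption, not real data.

def expected_misidentifications(error_rate: float, searches: int) -> float:
    """Expected number of false matches for a given error rate and search volume."""
    return error_rate * searches

SEARCHES = 10_000  # hypothetical number of searches run per group

error_rates = {
    "lighter-skinned men": 0.008,   # ~0.8% error
    "darker-skinned women": 0.347,  # ~34.7% error
}

for group, rate in error_rates.items():
    misses = expected_misidentifications(rate, SEARCHES)
    print(f"{group}: ~{misses:.0f} expected misidentifications "
          f"per {SEARCHES:,} searches")
```

Under these assumptions, the same number of searches produces roughly forty times as many misidentifications for the worst-served group, which is why even a seemingly small accuracy gap matters once the technology is used at police scale.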




Figure 1 (from https://sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology/): The Gender Shades project revealed that these algorithms consistently demonstrated the poorest accuracy for darker-skinned females and the highest for lighter-skinned males.


This leads to the big problem: law enforcement in the United States. Police use facial recognition mainly for four things: to find missing children and disoriented adults, to identify and find exploited children, to identify and track criminals, and to support and accelerate investigations. The last part is the main problem, because the technology is inaccurate for people with darker skin, which makes them more likely than white people to be falsely arrested. Black people are consequently overrepresented in mugshot data (photographs of people's faces taken on arrest), which facial recognition systems use to make predictions. The technology's inaccuracy, combined with racist policing strategies, leads to disproportionate arrests of Black people, who are then subject to further surveillance. For example, the New York Police Department maintains a database of 42,000 “gang affiliates” who are 99% Black and Latinx, with no requirement to prove the suspected gang affiliation. Some police departments even use gang member identification as a productivity measure, which motivates false reports and leads to false arrests. There are no federal or state regulations that specifically govern facial recognition, which allows police to abuse the tool, and people of colour are the main targets.

Because of how badly the technology was being used by law enforcement, on June 10th 2020 Amazon banned police use of its facial recognition technology for one year. This came in the wake of the nationwide protests over police brutality that followed the killing of George Floyd by a Minneapolis police officer. Amazon had told Congress that new laws were needed for the technology, but Congress kept stalling; the one-year ban was meant to give it time to implement appropriate rules.
In a public statement, Amazon said: “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.” IBM and Microsoft, which both sell to law enforcement, have made similar moves.


Another example of abuse of facial recognition came in 2016, when the Detroit Police Department launched the model surveillance program Project Green Light (PGL), installing high-definition cameras throughout the city. The data is streamed directly to the Detroit Police Department and can be tested for face recognition against criminal databases, driver’s licenses, and state ID photos; almost every Michigan resident is in this system. But PGL stations were not distributed equally: surveillance was concentrated in majority-Black neighbourhoods while avoiding White and Asian enclaves (Figure 2).



Figure 2 (from https://sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology/): Camera locations of Project Green Light Detroit partners (left) overlap with primarily Black communities in U.S. census data (right). In this city-wide program, most of the surveillance falls on Detroit’s Black residents.


Facial recognition is a powerful technology with significant implications for both criminal justice and everyday life. Overall, I think that until proper laws and improvements to the technology are in place, law enforcement should be banned from using facial recognition for minor crimes such as jaywalking, since people could be wrongfully arrested over a minor offence.
