Unmasking Facial Recognition

Unmasking Facial Recognition is a project exploring the implications of police use of live facial recognition technology for minoritised communities. In particular, the project will focus on the potential consequences for people of colour and Muslims in the UK. Facial recognition technology has faced criticism across the world for alleged racial and gender bias in the data these products have been trained on, in their designs, and in their deployments.

Through a series of workshops, interviews, roundtables, and desk-based research, our project will aim to shed light on these allegations, explore the potential implications for minority groups, and propose recommendations for policymakers and the public. The findings of this work will be published in a report later this year. For any queries, please contact Areeq Chowdhury at areeq@webrootsdemocracy.org.

This project has been supported by the Joseph Rowntree Reform Trust.

What is live facial recognition technology?

Live facial recognition technology is a system that analyses an individual’s face in order to identify them in real time. The technology works by examining facial patterns (e.g. distance between the eyes, length of the nose) to create a template of a face, which is then compared with a template held on record. If the comparison produces a match, the system may provide a confidence score, e.g. 90% for a strong match. The threshold for a strong or weak match is set by the entity deploying the system.
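As a rough illustration of that comparison step, the sketch below assumes each face has already been reduced to a fixed-length numerical template (an embedding) and shows how a confidence score and a deployer-chosen threshold could fit together. The function names and the 0.90 threshold are illustrative only, not taken from any real system.

import math

def confidence_score(template_a, template_b):
    # Cosine similarity between two face templates, rescaled to the 0-1 range
    # so it reads like a percentage (e.g. 0.90 for a strong match).
    dot = sum(a * b for a, b in zip(template_a, template_b))
    norm_a = math.sqrt(sum(a * a for a in template_a))
    norm_b = math.sqrt(sum(b * b for b in template_b))
    return (dot / (norm_a * norm_b) + 1) / 2

def is_match(template_a, template_b, threshold=0.90):
    # The threshold for declaring a match is set by the deploying entity.
    return confidence_score(template_a, template_b) >= threshold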

There are two types of facial recognition. The first is known as ‘one-to-one’ matching. In this scenario, the system verifies that an image matches a stored image of the same person; this type of system is used for unlocking smartphones or checking passports at an airport. The second is known as ‘one-to-many’ matching. These systems are deployed to determine whether the face in an image matches any record in a database, and are used to identify a person of interest as part of a surveillance strategy. It is this ‘one-to-many’ system that our project will focus on.
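To sketch the difference in code: a ‘one-to-one’ check compares a probe face against a single enrolled template, while a ‘one-to-many’ search compares it against every entry in a watchlist. This reuses the hypothetical confidence_score function and threshold from the sketch above, and is not a description of any particular vendor’s system.

def one_to_one(probe_template, enrolled_template, threshold=0.90):
    # Verification: does this face match the single record it claims to be,
    # e.g. unlocking a phone or checking a passport?
    return confidence_score(probe_template, enrolled_template) >= threshold

def one_to_many(probe_template, watchlist, threshold=0.90):
    # Identification: does this face match anyone in the database?
    # watchlist maps a person identifier to that person's stored template.
    candidates = []
    for person_id, stored_template in watchlist.items():
        score = confidence_score(probe_template, stored_template)
        if score >= threshold:
            candidates.append((person_id, score))
    # Strongest candidate matches first, for an operator to review.
    return sorted(candidates, key=lambda pair: pair[1], reverse=True)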

Bias in data, design, and deployment

Facial recognition systems have often been criticised for having a ‘gender and racial bias’. This refers to claims that these systems are less accurate for women and people of colour. A higher level of inaccuracy for these groups would mean that they are at greater risk of misidentification by facial recognition systems. In a surveillance context, this could mean an individual being misidentified as a person of interest, or a genuine person of interest passing by unnoticed. The bias our project is concerned with is that faced by minoritised communities, in particular people of colour and Muslims in the UK. This bias can arise in three areas: data, design, and deployment.

Bias can arise in data for a number of reasons. One example is that systems may not have been trained on a sufficient number of dark-skinned faces, which can lead to systems struggling when presented with a dark-skinned face. In the design process, bias may arise because developers are unaware of their system’s deficiencies. During deployment, bias can arise from the way the system is rolled out. For example, if the persons of interest fed into the system are primarily from one ethnic group, this could contribute to a greater number of arrests from that group, potentially exacerbating inequalities in society.
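One way such disparities could be made visible is to measure misidentification rates separately for each group in a labelled evaluation set. The sketch below uses hypothetical field names and is meant only to illustrate the kind of audit that can surface this sort of bias in data or design.

from collections import defaultdict

def misidentification_rates(results):
    # results: one record per test image, e.g. {'group': 'A', 'misidentified': True},
    # where 'group' is a demographic label from the evaluation set.
    totals = defaultdict(int)
    errors = defaultdict(int)
    for record in results:
        totals[record['group']] += 1
        if record['misidentified']:
            errors[record['group']] += 1
    # A large gap between groups' rates is the kind of disparity described above.
    return {group: errors[group] / totals[group] for group in totals}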

Upcoming events

On Wednesday 24th June, from 6.30pm to 8.30pm, we will be hosting an online policy workshop for people of colour to come together and discuss the potential challenges and solutions involved with police use of facial recognition technology. These will be facilitated conversations, held on Zoom, which aim to enable free and open discussion on the subject. To find out more and to register for the workshop, click here.


Share your thoughts

Do you have thoughts or research on this topic which you would like to share with us for inclusion in our report? If so, please complete the short form below. You will be contacted for approval before any submission is included in our final report.