Facial recognition technology will ‘exacerbate racist outcomes’ in policing and should be banned ‘for a generation’, according to our new report, Unmasking Facial Recognition. The report also reveals that the Metropolitan Police failed to undertake an Equality Impact Assessment prior to their trials, and warns that the technology may lead to a ‘face veil ban’ in future.
Facial recognition surveillance, also referred to as live facial recognition (LFR) or automated facial recognition, involves the use of technology which analyses an individual’s face in order to determine a positive identification in real time. The technology works by examining facial patterns (e.g. distance between eyes, length of nose) to create a template of a person’s face, which is then compared with templates held on a watchlist. The technology has attracted controversy in recent years and has been criticised as vulnerable to racial bias, with several studies demonstrating that it performs less accurately on people of colour.
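The watchlist-matching step described above can be sketched in a few lines of code. This is an illustrative simplification only: real systems derive high-dimensional templates from images using trained models, and the vectors, names, and threshold below are invented for the example, not drawn from any deployed police system.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors (templates)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.9):
    """Compare a probe template against every watchlist template.

    Returns the identifier of the best match scoring at or above the
    threshold, or None if no template is similar enough.
    """
    best_id, best_score = None, threshold
    for person_id, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# Hypothetical watchlist of two face templates (toy 3-dimensional vectors).
watchlist = {"person_a": [1.0, 0.0, 0.0], "person_b": [0.0, 1.0, 0.0]}

print(match_against_watchlist([0.98, 0.05, 0.0], watchlist))  # close to person_a
print(match_against_watchlist([0.0, 0.0, 1.0], watchlist))    # matches nobody
```

The choice of threshold in such a system is critical: set it too low and the system produces false positive ‘matches’; and if the underlying templates are less reliable for some demographic groups, those false positives fall disproportionately on them.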
We have been researching this challenge of racial bias, and our analysis finds that even if the technology works accurately, it is likely to be used disproportionately against people of colour. The report describes the technology as the “next generation of finger-printing”. Freedom of information requests we submitted revealed that the Metropolitan Police failed to undertake an Equality Impact Assessment (EIA) before their trials (which included using LFR at the Notting Hill Carnival), and that South Wales Police had ‘no concerns’ about the racial impact of the technology.
We also identified a particular risk of ‘anti-Blackness’ in facial recognition technology following a test in which the faces of 300 UK Members of Parliament (including all BAME MPs) were put through an online facial recognition system.
The report calls for a ‘generational ban’ on the police’s use of the technology, in order to allow time for the police to overcome the challenges of institutional racism which we argue will affect how LFR is deployed.
Areeq Chowdhury, Director of WebRoots Democracy, said:
“Facial recognition is not the next generation of CCTV, it is the next generation of finger-printing. Whilst CCTV takes pictures, facial recognition takes measurements. It is a highly intrusive form of surveillance which we should hesitate to adopt.
Our report finds that the police have been asleep at the wheel when it comes to the impact the technology will have on communities of colour. The fact that the Metropolitan Police did not even bother to undertake an Equality Impact Assessment before trialling the technology is staggering. It is very likely that facial recognition will be used disproportionately against people of colour and will exacerbate racial tensions in future.
The technology should be banned for at least thirty or forty years – a generation – until we are able to overcome the challenge of institutional racism in policing.”
Download and read Unmasking Facial Recognition here.