Facial recognition technology branded “dangerously irresponsible”

Police failures to properly test facial recognition technology could lead to black and minority ethnic people being falsely identified, campaigners say.

The BBC reports that three opportunities to properly assess how well facial recognition AI systems cope with ethnicity have been missed in the past five years.

Campaigners have come out in force against the technology over the problems it has experienced to date, with privacy rights group Big Brother Watch saying that the new techniques “must be dropped immediately.”

A number of police forces in Britain have been trialling automated systems designed to identify individuals whose faces are caught on camera as they go about their daily lives.

While official documents show police are alert to the problems that ethnicity can cause for the technology, no efforts have been made to improve the recognition systems.

While acknowledging that facial recognition represents an “invaluable tool” in the fight against crime, the Home Office said:

“The technology continues to evolve, and the Home Office continues to keep its effectiveness under constant review.”

The concerns come as the software faces criticism for being geared mainly towards white faces. At a police working group, Durham Police Chief Constable Mike Barton said that “ethnicity can have an impact on search accuracy.”

Requests were made to the Canadian company behind the police’s facial image systems to look into the problems, but there is no evidence that any follow-up took place.

Facial recognition was introduced to the Police National Database (PND), which currently holds around 13 million faces, in 2014. Privacy groups have been alarmed by the database because it holds images of individuals who have not committed an offence – a practice that a court ruled unlawful in 2012.

Ethnicity testing could have been carried out last year, when Cardiff University conducted an assessment of police use of the technology. However, the study stated that the tests were not performed due to “limited funds for the trial.”

Big Brother Watch director, Silkie Carlo, said:

“The police’s failure to do basic accuracy testing for race speaks volumes.

“Their wilful blindness to the risk of racism, and the risk to Brits’ rights as a whole, reflects the dangerously irresponsible way in which facial recognition has crept on to our streets. It must be dropped immediately.”

