Facial recognition at King’s Cross is a key issue in GDPR era


Pressure has been increasing on officials at King’s Cross in London to disclose more details about a developer’s deployment of facial recognition cameras, after the UK regulator called the technology’s use “alarming”.

Argent, the developer that has set up the AI system around King’s Cross, has insisted that the technology is needed to shore up public safety. No further information has been given, however, regarding its terms of use.

The topic is controversial because mass surveillance is being carried out in a space used by the public, in a GDPR era characterised by fervent debate around privacy and public consent for data use.

The biometrics commissioner in Britain has since urged Downing Street to look closely at the laws governing the use of facial recognition technology, with a view to tightening standards.

The land around King’s Cross is privately owned, though heavily used by the public owing to a high number of retail outlets, dining establishments and office spaces. Google and Central Saint Martins College are among the tenants in the surrounding buildings.

Public awareness of the surveillance was minimal until facial tech use was revealed in a recent Financial Times report.

Despite defending its use, Argent has not been willing to explain the legal basis for the technology, nor how long it has been employed.

The UK’s biometrics commissioner, Professor Paul Wiles, told the BBC he has “no idea” about the work being developed at King’s Cross.

“There’s no point in having facial-matching tech unless you are matching it against some kind of database – now what is that database? It’s alarming whether they have constructed their own database or got it from somewhere else.

“There is a police database which I very much hope they don’t have access to. Historically an area like that would have been public space governed by public control and legislation,” Prof Wiles said.

“Now a lot of this space is defined as private but to which the public has access,” he added.

The Information Commissioner’s Office is the body tasked with regulating the use of facial recognition technology because of its duty to monitor police data privacy. The watchdog has said that companies must show they have a “legal basis” for employing facial recognition, under the terms of the GDPR.

Independent researcher Stephanie Hare said:

“We need to have laws about all biometrics including ones we haven’t even thought about yet.

“We need to future-proof it. We need to discuss hugely its role in the private sector. The police and the government is one thing, we need to know if the private sector is allowed to do this and if so, under what conditions?”
