#privacy: ICO to investigate facial recognition at King’s Cross


The Information Commissioner’s Office (ICO) is to launch an official investigation into the facial recognition processes in operation at King’s Cross in north London.

When news of the technology’s use broke earlier in August, a spokesperson for the developer, Argent, said that monitoring was being carried out on the 67-acre site to “ensure public safety”, and that the cameras were just one of a “number of detection and tracking methods” that the developer employs.

“These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public,” the spokesperson told the Financial Times.

Public exposure of the surveillance has prompted a reaction from the ICO, with the regulator expressing deep concern about the increasing use of facial recognition software in society.

In a statement, Information Commissioner Elizabeth Denham said:

“Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all. That is especially the case if it is done without people’s knowledge or understanding.

“I remain deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector. My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.

“Facial recognition technology is a priority area for the ICO and when necessary, we will not hesitate to use our investigative and enforcement powers to protect people’s legal rights.

“We have launched an investigation following concerns reported in the media regarding the use of live facial recognition in the King’s Cross area of central London, which thousands of people pass through every day.

“As well as requiring detailed information from the relevant organisations about how the technology is used, we will also inspect the system and its operation on-site to assess whether or not it complies with data protection law.

“Put simply, any organisations wanting to use facial recognition technology must comply with the law – and they must do so in a fair, transparent and accountable way. They must have documented how and why they believe their use of the technology is legal, proportionate and justified.

“We support keeping people safe but new technologies and new uses of sensitive personal data must always be balanced against people’s legal rights.”

Argent has not yet given further details on the facial recognition systems.

London’s mayor, Sadiq Khan, has written to the developer to express the “serious and widespread concern” over the legality of the surveillance.

