Speaking at the Royal United Services Institute (RUSI) annual security lecture, the head of London’s Metropolitan Police defended the force’s use of live facial recognition (LFR).
Met Police Commissioner Dame Cressida Dick said that most of the audience would have used mapping tools, Google Translate, voice recognition, and perhaps facial recognition on their way to RUSI, and argued that the police should therefore have access to the same modern tools the public use in order to do their job.
“Criminals make powerful use of the digital world, obviously the police should use cutting edge tech too as a force for good. The challenge for a 2020 Police Chief in the data age is to make tech and data more of an advantage to us than it is to the criminal,” the commissioner said.
When discussing LFR, the commissioner stressed that the technology does not store biometric data, nor does it make the final decision on whether a police officer should intervene or not.
Dick added that the technology has been proven not to have an ethnic bias, with the Met’s trials resulting in the arrest of eight wanted individuals “whom we would otherwise have been very unlikely to identify.”
“Right now the loudest voices in the debate seem to be the critics. Sometimes highly inaccurate or highly ill informed,” the commissioner added, asking critics to justify to victims of crime why the force should not use technology lawfully and proportionately to catch their perpetrators.
“It is not for me and the police to decide where the boundary lies between security and privacy, but it is right for the police to contribute to the debate. Speaking as a member of the public, I will be frank.
“In an age of Twitter and Instagram and Facebook, concern about my image and that of my fellow law-abiding citizens passing through LFR and not being stored, feels much, much smaller than my and the public’s vital expectation to be kept safe from a knife through the chest.”
Ironically, RUSI itself published a government-commissioned report last September warning that machine learning algorithms, like those used in LFR, could amplify racial and other biases.
Just last week, Big Brother Watch took to Twitter saying: “Facial recognition surveillance means our identities are checked en masse. That’s a reversal of the presumption of innocence that’s so at the heart of British civil liberties. It needs to be banned NOW.”