Can contact tracing be effective if users don’t trust authorities with their privacy?

The UK government expects to roll out its contact tracing app at the end of May. Privacy advocates warn that contact tracing could take us one step closer to an Orwellian society, but there is another, related issue: users have to be persuaded that they can trust authorities with their privacy. If they are not, contact tracing may not work anyway.

Singapore is a case in point. Researchers from Oxford University say that contact tracing apps are not effective unless they are downloaded by at least 60 per cent of the population. Yet in Singapore, the TraceTogether app has been downloaded by only 25 per cent of the population.

Such a low take-up could make the data created by the app useless.

So why such a low take-up? The FT quoted a Singapore lawyer who cited “distrust” of the government’s handling of personal data as one reason they had not downloaded the app.

The challenge is worldwide.

Take, as an example, Alerta Guate, a contact tracing app developed for the Guatemalan government by an American company funded by the Tenlot Group. The app was said to be available for free. Alas, according to Global Witness, location data from the app was being sent to the American company every few minutes.

In the UK, the government says its contact tracing app will respect privacy. But although the data collected is anonymised, the app will use a centralised system, making it possible to de-anonymise the data.

As Abigail Dubiniecki from Myinhouselawyer told PrivSec Report, “Although the NHS has said it will not use the centralised information to identify individuals but rather to leverage the analytics potential in privacy-protective ways, they haven’t shared their DPIA so we don’t know what mitigations they’ll put in place to protect against privacy risks. There are serious risks to centralisation like inadvertent or deliberate re-identification and lack of trust by users which could undermine its objectives and effectiveness.”

Ms Dubiniecki added: “UK innovators have been at the forefront of decentralised / edge AI and analytics. That the NHS would turn away from that innovative strength and leadership and adopt a riskier 1.0 approach to health data analytics is extremely disappointing and concerning.

“It does beg the questions, ‘Why do they need to hold onto all of it? How will they protect it against re-purposing? How will they get stakeholder buy-in when Brits are already among the most heavily surveilled populations (in terms of ubiquitous CCTV) in the world?’ It’s counter-productive and an unwise move in my view.”

Recently, Kate Allen, Amnesty International UK Director, said: “We all want to do everything we can to beat this virus, but as we’ve said before, our privacy must not become another casualty of the crisis – a Government decision-making approach to this app that incorporates key human rights principles can ensure that does not happen.

“We are certainly not trying to discourage people from using the app, but rather encouraging the UK Government to answer important questions about its approach at this crucial moment.

“Very important concerns around the current choice to adopt a centralised model, and issues like accessibility and transparency, still remain unanswered.

“As the UK Government prepares to roll the app out across England, it is critical that these are addressed so that people can properly understand what is being proposed and why.”


We have been awarded the number 1 GDPR Blog in 2019 by Feedspot.
