Singapore is to become the first government in the world to grant citizens access to services online using facial verification and UK-based provider iProov has won the contract. FinCrime Report speaks to the firm’s founder Andrew Bud about the technology and issues of privacy, anti-money laundering, security and trust.

One of the most notable industries to experience a boom in business following COVID-19 has been biometrics. Though hardly new, technologies like biometrics, artificial intelligence and machine learning have played a key role in responding to the pandemic and will likely continue to be adapted for new uses across the board at great speed. Facial verification and authentication software, in particular, is likely to become the “new normal” as lockdown measures create a strong incentive to engineer convenience and digital access.

Straddling both the security and epidemiological sides of technology is the biometric company iProov, which recently announced its partnership with SingPass, the Singapore government’s national ID scheme, and its plans to tackle money laundering with its software. The company’s other big clients include Eurostar, the US Department of Homeland Security, NHS login, and Evernym and Acuant, both of which have integrated iProov’s liveness detection technology into their own facial recognition platforms.

When the scheme was announced last month, Bud hailed the project as a “tipping point for facial verification”, providing the “highest levels of reliable security”. But how did we get here?

In June 2018, the Singapore government published an open tender requesting facial verification services. iProov responded and secured the prize, beating multibillion-dollar companies in the process. But Andrew Bud’s research into biometric solutions began much earlier, in 2008, when his business at the time, the world’s largest purchaser of mobile payments, was hit by a cyberattack that used its network to steal money from millions of people.

In 2010, Bud began commissioning consultants to work out how to prevent attacks on authenticated consent mechanisms, a problem for which there was no comprehensive solution.

“And I spent a year thinking about this,” he says. “And because I’m an engineer, I ended up inventing the concept of FlashMark. And we then received quite a lot of money from the UK innovation organisation, InnovateUK, and the business started up in 2013. Since then, the team of people we’ve assembled have been working on developing this technology to the stage where it works with extraordinarily good reliability on a very large scale with national security levels of security.”


The software package contains two key components, Genuine Presence Assurance and Liveness Assurance, which a firm or institution can deploy flexibly and select on a transaction-by-transaction basis. Genuine Presence Assurance (GPA) enables organisations to confirm that the person they are interacting with online is the right person, is a real person, and is authenticating at that moment in time. Liveness Assurance (LA) ensures the person is the right person and is a real person, but, according to the company, it does not protect against scalable digital attacks such as deepfakes or replays, in which synthetic images are injected into device sensors.

iProov states that no liveness assurance technology can claim to protect against deepfakes and replay attacks (in which cybercriminals intercept a secure network and misdirect a legitimate transmission), only against presentation attacks.

The company explains: “That is why it is essential to have GPA for high-security transactions where fraud is more likely to occur, like at onboarding, or for high-value transactions in banking. Liveness is suitable where the risk is lower, but it cannot protect against all types of attacks.”

GPA presents the user with a ceremony proportionate to the riskiness of the transaction, whereas LA has a much quicker, more limited ceremony that suits lower-risk transactions such as checking a current bank balance. The concept of ceremony is important, according to Bud.

He explains, “If you do something that [users] know is serious, and you just do it with a flick of a finger, tests show that it makes [users] very nervous. On the other hand, if they’re doing something that isn’t very important, and you ask them to go through the same ceremony, then they get frustrated, because it gets in the way of doing something. So there is a link between how much risk is involved and the amount of ceremony people want.”
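The risk-proportionate choice between GPA and LA that Bud describes can be sketched as a simple policy function. This is a hypothetical illustration of the idea, not iProov’s actual API; the action names, threshold, and function are invented for the example:

```python
from enum import Enum

class Assurance(Enum):
    GENUINE_PRESENCE = "GPA"  # fuller ceremony; defends against injected/synthetic imagery
    LIVENESS = "LA"           # quicker ceremony; defends against presentation attacks only

# Transactions the article flags as high risk: onboarding and high-value payments.
HIGH_RISK_ACTIONS = {"onboarding", "high_value_transfer"}

def select_assurance(action: str, amount: float = 0.0,
                     high_value_threshold: float = 10_000.0) -> Assurance:
    """Pick an assurance level proportionate to transaction risk."""
    if action in HIGH_RISK_ACTIONS or amount >= high_value_threshold:
        return Assurance.GENUINE_PRESENCE
    return Assurance.LIVENESS

# A balance check is low risk, so it gets the quicker liveness ceremony;
# onboarding and large transfers get the fuller GPA ceremony.
print(select_assurance("check_balance").value)       # LA
print(select_assurance("onboarding").value)          # GPA
print(select_assurance("transfer", 25_000.0).value)  # GPA
```

The point of the sketch is that the assurance decision is made per transaction, matching Bud’s observation that users want ceremony proportionate to perceived risk.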

Is it proportional? 

Though iProov’s software did not emerge from the current crisis, and concerns around the use of facial recognition and the like are hardly new, there is fresh reason to question whether the technologies now being rolled out at speed are proportionate to the issues at hand.

Bud says he has heard this criticism many times. “People sometimes used to ask us,” he says, “surely liveness assurance is fine for everything. Why do you need the capability of genuine presence assurance?” The answer lies in GPA’s potential for anti-money laundering solutions, says Bud.

“We are up against criminal organisations and buildings full of PhDs motivated by compelling criminal business models. In some cases, we are up against the national security organisations of rogue countries for whom money laundering is not a way to make a bit of money on the side, these are fast, vast operations with big investments behind them. And so, we have to be secure against the kinds of threats that will emerge from these sorts of organisations, this isn’t a joke. It has to be taken seriously.”

In regard to the financial sector, Bud says, “the world is moving away from in-person, branch-based identity checks towards remote onboarding. Remote onboarding requires access to an identity document, which, unfortunately, is now rather straightforward, and access to the face of the person depicted in the identity document, which deepfake technology also makes easy. The sole defence you have is to make sure that when the person onboards, you have a real, genuinely present image of the person setting up the account. That is fundamentally forensic evidence. And fundamentally it prevents the compromise of genuine or stolen identity materials.”

The availability of deepfake technology to cybercriminals, or to anyone tech-savvy enough, is extremely concerning, says Bud, who reveals that a couple of weeks earlier, iProov’s technical team jokingly used deepfake technology to appear as him on a company video call.

GPA is a “crucial cornerstone” of the defence against large-scale abuse of digital financial systems for money laundering, says Bud. “Something that passes a US Federal Government Red Team attack is so important.”

The potential of facial verification software for identifying money-laundering activity is clear. In September this year, Companies House underwent significant reforms, including the introduction of a compulsory identity verification scheme to identify fraudulent directors opening shell accounts. The changes were part of the UK government’s response to the Corporate Transparency and Register Reform consultation published in 2019.

Can it be trusted?

When asked if genuine presence assurance poses similar risks to facial recognition technology in regard to its capacity to reproduce bias, Bud made it clear that his team are proactively looking for bias in age, ethnicity and gender. 

“We are actively looking for evidence of that problem. We take that problem very seriously. We’ve designed our systems specifically to not suffer from those kinds of biases. But we recognise that we probably won’t be fully successful. So, we have put in place measures that monitor for that kind of bias, so that if and when we detect it, we can address the problem.”

Privacy campaigners including Privacy International have criticised the legal basis of consent as an appropriate ground for Singapore’s national verification scheme, claiming that it presents an imbalance of power. 

Ioannis Kouvakas, legal officer with London-based Privacy International, said, “Consent does not work when there is an imbalance of power between controllers and data subjects, such as the one observed in citizen-state relationships.”

‘We are up against criminal organisations and buildings full of PhDs motivated by compelling criminal business models’

    Andrew Bud

When asked if Singaporeans could lose access to services if they do not consent to having their face scanned, Bud explained that this was not the case. 

He adds: “Absolutely not. The government of Singapore made it very clear that face verification is just one of the options for doing what they want to do. In Europe, it will be illegal to do that. The legal basis of consent under GDPR is very clear. You have to have an option. Under no circumstances will the user be compelled to do this.”

The UK is unlikely to have a national ID scheme like Singapore’s any time soon, but iProov has also supplied its software to Estonia’s state-certified digital identity programme. Bud says he is in talks with other governments too.

On the topic of privacy, Bud explains that under GDPR, “we are only permitted to do with the data what the data controller allows us and agrees we should do.

“They do not permit us to share that and we do not ask them for permission to share it because we have no interest in sharing it. Illicitly sharing it would be a serious breach of the law and we would never do that.”

Furthermore, he explains: “The second thing is that the biometric data we collect and process is subject to a privacy firewall that ensures that we never see or learn or know anything about the real identity of the user.” In the case of their client Eurostar, he says, “they know about their passengers, but they don’t want to deal with biometric data. On one hand you have Eurostar who knows about the passenger but does not want to see their face, on the other, you have iProov who sees the face but does not know the passenger. They are sealed off from each other by this privacy firewall.”

All in all, Andrew Bud is clear about his company’s role: “Our job is to ensure that the person who is using any kind of digital identity solution that includes an immunity passport, is who they claim to be.” 

Going back to 2008, when he first found himself looking for a solution, Bud reflects in hindsight: “The problem that we actually are addressing turns out not to be anything to do with mobile payments. It turns out to be the establishment of trust in the online ecosystem. That’s where it comes from.”