By Tim Mackey, principal security strategist at the Synopsys CyRC (Cybersecurity Research Center)
Every so often I’m asked for a prediction. Some glimpse of a possible future from my magic crystal ball. As a cybersecurity person, I tend to look for possible outcomes – and unfortunately, those outcomes aren’t always that positive.
When speaking in public, I hold back my worst concerns because they could easily be viewed as, well, a bit out there. Just this once, I’d like everyone to indulge me as I present a view from my crystal ball.
In case you’re not familiar with the power of the magic crystal ball, it requires you to start with a question. In this case, I’m curious what it would mean if cybersecurity professionals couldn’t convince companies to actually secure their data.
After all, I can’t remember the last time that a week went by without some company being in the news for operating an unsecured database, having an unsecured S3 bucket, or leaking sensitive customer data. It’s almost as if these companies don’t care about user privacy. That, of course, opens the door for well-intentioned government officials to create laws designed to curb such privacy indiscretions.
That then gets us an alphabet soup of regulations – GDPR, PIPEDA, CCPA, NDB, LGPD, to name but a few of the current crop. These laws all have one important component in common: organizations need to protect user data at all costs.
That means, in part, that users need to consent in some capacity to their data being used, and that whatever data the organization holds must be protected. Of course, one of the easiest ways to protect data is to ensure it is “encrypted”. While a lay user – and, for that matter, a politician – might not know what encryption really means, they do know that it’s magic pixie dust that prevents bad guys from accessing their personal information. As such, if a little encryption is good, then a lot must be better.
This is a happy world until someone recognizes that the proverbial bad guys can also use the encryption pixie dust. In the commercial computing world, this recognition occurred in the mid-1990s, and to this day US companies are restricted in how they can use encryption in software they export. Of course, if the bad guys are outside of the US then US regulations don’t really apply, but the thought is there.
This model obviously presumed that the US was the center of the technology universe, but we all know that today software is a global commodity and that for practical purposes software authored anywhere is available to users everywhere. So, if bad guys are everywhere, and software is everywhere, and regulations require user data to be protected, where does that leave us?
Well, encryption pixie dust was so cool that we decided to solve the problem of bad guys accessing data with the power of full end-to-end encryption. We saw the beginnings of this when the venerable HTTP website became HTTPS, and all data between the browser and the website became encrypted – a boon for e-commerce. Obviously, sprinkling that pixie dust on social media platforms is also good, and so too is adding it to our communication apps. Facebook CEO Mark Zuckerberg even went so far as to announce that the entire Facebook platform would include full end-to-end encryption.
But wait – if everything is encrypted, and encryption is designed to be hard for bad guys to break, thereby protecting our sensitive data, what happens if bad guys discover that they too can use encryption? This is where we get to the crystal ball’s possible future.
If the bad guys use encryption, then law enforcement won’t be effective. This is the same argument levelled against Apple following the San Bernardino terrorist attack. While there are multiple ways to make encryption less effective, most center on giving law enforcement a way to monitor traffic, or a backdoor that removes the encryption.
What every workaround to these problems fails to recognize is that if encryption is weakened for one party, say law enforcement, then another party, say the bad guys, could exploit the same weaknesses and access the same data. The argument then becomes one of “trust us”, where the public are asked to trust that governments won’t abuse their ability to monitor digital traffic, and to trust that organizations will implement appropriate safeguards to ensure that their applications and infrastructure remain secure even with weakened encryption.
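The core of that argument can be made concrete with a deliberately simplified sketch. The toy cipher below is NOT real cryptography (never use XOR-of-a-hash in practice), and the key-escrow scheme and all names in it are hypothetical; it exists only to show that a mandated second “escrow” key is just a key – it decrypts equally well for law enforcement and for anyone who steals it.

```python
# Toy sketch, NOT real cryptography: illustrates why a backdoor/escrow key
# weakens encryption for every holder of that key, legitimate or not.
import hashlib
from itertools import count

def keystream(key: bytes):
    # Derive an endless byte stream from the key (toy construction).
    for i in count():
        yield from hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR is symmetric: the same call encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

escrow_key = b"escrow-master-key"  # hypothetical mandated second key
message = b"meet at noon"

# Under a key-escrow mandate, messages are also encrypted to the escrow key.
ciphertext = xor_cipher(escrow_key, message)

# Law enforcement can decrypt with the escrow key...
assert xor_cipher(escrow_key, ciphertext) == message

# ...but so can ANY party that leaks, steals, or copies that same key.
attacker_copy_of_key = escrow_key
assert xor_cipher(attacker_copy_of_key, ciphertext) == message
```

Nothing in the mathematics distinguishes a “good” key holder from a “bad” one – which is exactly why a weakness built in for one party is a weakness available to all.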
Were such a paradigm enacted, it would render all existing applications obsolete, since they were designed around stronger encryption that would now be illegal. It would mean that existing data housed by companies would need to be decrypted and then re-encrypted under the new rules.
While those processes were under way, they would present an appealing target for malicious actors to insert their software into the migration or the applications themselves and mine the available data. And this is all before we apply international laws to certain data types, or recognize that hitting the reset button on security means we should expect years of learning before the new models are no longer error-prone.
Unfortunately, this possible future is brought to you by the EARN IT bill currently working its way through the US Senate. While everyone can agree criminals use technologies to evade capture, weakening the security of public communications and business operations in an effort to distinguish suspected criminals from the general population only puts the public at greater risk of identity theft or worse.
Law enforcement has never been an easy job, and new technologies are often a double-edged sword, but there is no magic bullet. Reducing or eliminating data security assurances in the name of law enforcement shouldn’t be the solution if we trust our citizenry and want a future where privacy and digital security matter.