Public and regulatory interest in data protection issues has increased significantly over the last few years, in no small part due to the introduction of the General Data Protection Regulation (GDPR).
While 25 May 2018 was ingrained in boards’ minds as the date the GDPR took effect, it was by no means the end of the road for data protection compliance. As new technologies and business models find increasingly innovative ways to exploit personal data, privacy issues continue to hit the headlines daily.
Over the last year there have been some significant developments in the data protection sphere, including the Information Commissioner’s Office’s (ICO’s) first notices of intention to fine under the GDPR, demonstrating that the regulator will not shy away from proposing significant fines for serious security breaches. On 2020’s Data Protection Day, we look at our predictions for the top five hottest topics and areas of regulatory interest over the coming year.
Adtech

Adtech rose significantly up the regulatory agenda in 2019, with the ICO launching an investigation into adtech and real-time bidding in the first half of the year. After issuing a fairly damning report in June, the regulator spent the second half of the year working with stakeholders to consider possible solutions to the many and varied data protection compliance issues it had identified in the adtech ecosystem.
There have also been some surprising moves by key players in the market, with Google announcing on 15 January that it will phase out third-party advertising cookies over the next two years. The ICO’s investigation continues into 2020 and we expect to see significant regulatory involvement in improving data protection practices across the adtech landscape throughout the year, with a real possibility of strict enforcement action against businesses that do not take steps to remedy the issues identified.
Biometrics

2019 was the year of facial recognition, with considerable press attention around its use in policing as well as by private developers. The Information Commissioner, Elizabeth Denham, issued her first formal opinion under the Data Protection Act 2018 (DPA) in October, focusing on the use of live facial recognition technology by law enforcement agencies. At the same time, she announced her intention to issue a further opinion on private sector use of the technology, which we anticipate will be published in the first part of this year.
There are significant advantages to biometrics (such as security and uniqueness) and there is scope for biometric data to form a key part of regulatory compliance; payment service providers, for example, can use biometrics as an “inherence” factor to meet strong customer authentication requirements under PSD2.
But there is also a plethora of privacy challenges associated with use of the technology and interest in this area shows no signs of going away. The Ada Lovelace Institute announced just last week that it had commissioned Matthew Ryder QC to lead an independent review of the governance of biometric data. The Ryder Review will run alongside the Citizens’ Biometric Council, which considers the ethical and social issues of biometric technology, and the final report and recommendations will be published in October. We therefore expect 2020 to see continued public and regulatory engagement on biometrics, with recommendations from the Ryder Review late in the year informing potential regulatory reform in the area.
Protection of children online
We saw many news stories throughout 2019 debating the risks to children online and tech platforms’ responsibilities to protect against these risks. Earlier this month, the ICO added its voice to the mix by releasing the final version of its Age Appropriate Design Code, something that it was legally required to prepare under the DPA.
The Code contains 15 standards that businesses must adhere to when providing online services to children, to ensure that the highest levels of privacy protection are built in and children are given age-appropriate privacy information.
The Code will be laid before Parliament and is expected to be fully in force by autumn 2021. But even before the Code is fully operational, online service providers of all sizes are likely to start giving more consideration to how they design their child-facing services. In a world where parents and carers are increasingly concerned about children’s online safety, compliance with the principles in the Code is not only likely to reduce risk of enforcement action, but also become a competitive differentiator and an indicator of public trust for online service providers.
International data transfers
Transfers of personal data overseas continue to generate much head-scratching among data protection officers. Two important recent developments give us an indication of what we might see in this arena over the coming twelve months. Firstly, the Advocate General (AG) of the Court of Justice of the European Union (CJEU) issued his opinion before Christmas in the ongoing ‘Schrems II’ case, which challenges the validity of standard contractual clauses (SCCs) to legitimise transfers of personal data outside the EEA.
If the AG’s opinion is followed (which it usually is), SCCs will remain valid, but controllers wishing to rely on SCCs will be expected to assess the recipient country’s laws to ensure there is no conflict with the SCCs. This could be a significant and costly burden for businesses and it will be interesting to see what guidance is released on the extent of these obligations.
Secondly, last week’s signing of the Withdrawal Agreement Act into law has given UK businesses a little more certainty on how Brexit will affect data flows with EEA organisations. During the transition period, things will be “business as usual”, and the status quo is likely to be maintained post-transition for UK to EEA transfers.
The European Commission (EC) intends to issue an adequacy decision for the UK by the end of the transition period (31 December 2020) which would allow data flows from the EEA to the UK to continue uninhibited too, but this is a fairly ambitious deadline. We expect both the UK and the EC to work hard throughout 2020 to meet this time frame, and whether this is realistic will become clearer throughout the year. UK businesses would be well-advised to ensure they know where they are relying on data from EEA countries, so that gaps can quickly be plugged if required.
Data ethics

Data ethics is by no means a new topic, but the Centre for Data Ethics and Innovation (CDEI), set up in 2019, has brought data ethics firmly into the realms of government policy. The CDEI is an independent advisory body, tasked with advising the government on maximising the benefits of data-driven technologies whilst maintaining public trust in the use and commercialisation of personal data.
Among its plans for 2020 are concluding large-scale reviews into online targeting and bias in algorithmic decision-making, with final reports due later in the year, as well as carrying out shorter, thematic projects on priority topics that respond to live issues and public concerns.
Consumer and privacy groups have been quick to highlight some of the ethical dilemmas presented by the use of new technologies and data architecture projects: to what extent should innocent civilians be expected to accept that they may be incorrectly identified by police using facial recognition technology? Why should fully automated algorithms be allowed to make decisions that affect the services people can access? And do we trust major technology companies to handle our data lawfully, securely and ethically, especially in the light of numerous breaches in recent years?
Trust is becoming an ever more powerful concept in this environment. It can be a key differentiator for organisations engaging with new and emerging technology and recent moves such as Google’s stance on cookies, mentioned above, show that consumer expectations and concerns are really starting to inform organisations’ approaches to their use of data.
We expect the CDEI to play an increasingly important role throughout 2020 in shaping policy and guidance on maintaining high levels of trust whilst enabling responsible innovation and the safe exploitation of data.
Written by Emma Erskine-Fox, associate at UK law firm TLT
About the author
Emma is an associate in the Technology and Intellectual Property team, specialising in technology contracts and data protection.
Emma advises on a wide range of data protection, cyber security and e-privacy matters. She has run several large-scale GDPR compliance and contract remediation projects and has extensive experience of advising on data protection compliance issues.