Anju Khurana to speak at PrivSec New York

We are delighted to announce that Anju Khurana will address audiences at PrivSec’s New York conference, which is taking place on November 5th and 6th at Columbia University, NYC.

As Head of Privacy Americas at BNY Mellon, Anju has extensive experience in designing and establishing global privacy management programs.

In previous roles within this field, Anju has led the development and updating of privacy policies and procedures; the design of processes for completing personal information inventories; privacy impact assessments; incident management and breach response; monitoring, tracking and driving regulatory change; and ongoing training, monitoring, auditing, reporting and evaluation of the privacy management program.

In her current role, Anju is focused on privacy strategy and policy development, governance, centralised operational support, regulatory change and business transformation.

She also deals with risk and compliance activities to ensure compliance with all applicable privacy and data protection laws while enabling business and technology innovation and valuable insights for clients and customers.

Ahead of her talk at PrivSec New York, Anju and her colleague Gary Barr, Chief Data Officer at BNY Mellon, teamed up to answer questions on the evolving relationship between cyber-security and data privacy.

1. Are consumers waking up to the value of personal data and the importance of keeping it secure?

Consumers do have a heightened awareness of the value of their personal data, particularly in the wake of the Facebook Cambridge Analytica scandal and the proliferation of data breaches reported in the media.

A new IBM study conducted by Harris Poll recently revealed that consumers are dissatisfied with the way many businesses are handling their personal data. How well a company protects its customers’ data is viewed as more important than the quality of the company’s products and services and most consumers will refuse to work with organizations that won’t keep their data secure.

The survey, conducted from 8-15 August 2019, included responses from 1,000 adults over the age of 18. More than 50 percent of respondents indicated that they had either been the victim of a data compromise or knew someone whose data had been compromised. 94 percent of consumers agreed that businesses should be doing more to actively protect consumers against cybersecurity threats, 84 percent stated that they had lost “all control” over how their personal information is processed, and 64 percent had chosen not to work with a business because of concerns over whether it would keep their data secure. No matter how great a firm’s products or services, 83 percent of consumers said they would stop working with a firm if they discovered their information had been shared without their consent.

Consumers are also more focused than ever on taking back control of their personal data, with 76 percent of respondents agreeing that they would be more willing to share personal data if there were a way for them to take it back or retrieve it, and 77 percent noting that the chance to opt out of having their personal data shared with third parties was “important” to them.

With increased media attention and new privacy laws redefining the data landscape, consumers are growing even more confident in demanding that their data be sufficiently protected and treated with respect, that its uses be kept visible and transparent, and that it be used only as agreed.

2. What are the priority areas for privacy professionals to address as they bid to develop privacy and security within an organisation?

Privacy professionals are facing a fast-changing regulatory environment and a post-GDPR ripple effect, which includes the California Consumer Privacy Act (CCPA), the Cayman Islands Data Protection Law, Brazil’s General Data Protection Law and others on the horizon (e.g., India’s draft bill).

Privacy professionals should work on future-proofing their privacy programs, moving away from one-time compliance projects towards a more sustainable global privacy program. Where feasible, developing global baselines, principles and frameworks to avoid a patchwork response to individual laws, and automating key privacy processes to make them more scalable, is essential.

Privacy professionals should evaluate how the privacy program furthers the company’s mission and goals and focus on how the organization plans to use personal data to not only ensure privacy and security compliance, but to extract value from the data collected to develop new products, automate and digitize core activities, improve business performance and the overall client experience.

Privacy and security professionals should explore the use of privacy-enhancing technologies to enable the sharing and use of data in a privacy-preserving manner. Privacy professionals should focus on breaking down silos and lead cross-functional groups with key stakeholders in data security, data governance and records management, as well as with the business units, technology, legal, risk and compliance.

Professionals should also focus on educating and empowering employees with the right tools to address privacy risks. This will also drive more accountability within the business.

They should focus on communication, training and awareness so that privacy expectations are embedded into processes, applications and product development, and embed Privacy by Design into the culture and DNA of their organizations.

Privacy has synergies with several other areas and, where the expertise and resources are available, can sit in various departments within an organization, such as Legal, Compliance, IT, Risk, or Data Strategy / Governance. Privacy should be a board-level agenda item. Thus, when developing an operating model, privacy professionals should determine where the function will be a strategic asset to the company and receive the most executive support and resources to most effectively enable the company’s mission.

Finally, privacy professionals and companies should think of privacy as a competitive differentiator and use privacy regulatory initiatives as a catalyst for launching business-driven initiatives. I was recently taking an Uber in midtown Manhattan and took a picture of one of Apple’s new privacy billboards, which stated,

“Your iPhone knows a lot about you, but we don’t. Privacy. That’s iPhone.”

I understand that this is part of a larger privacy ad campaign in the US and Canada and wouldn’t be surprised if we saw more companies following suit.

3. What impact are data privacy laws having on technology advancements, for example in the field of AI and machine learning?

The use of Big Data has implications for privacy, data protection and the rights of individuals. In a business context, Artificial Intelligence is used to solve problems, promote efficiencies, and identify threats. In order to develop an effective AI tool, companies rely on data. Specifically, Artificial Intelligence involves algorithms that improve themselves by consuming data. The more data that algorithms consume, the better they get at identifying patterns, e.g. in evaluating candidate resumes, identifying insider threats, maximizing marketing opportunities or in identifying fraud. Meanwhile, the algorithms continue to get better and the data gets combined with other data in new and different ways.

Data privacy laws such as the GDPR require organizations to tell individuals what their personal data will be used for at the time of collection, minimize the information they collect and limit its use to what is necessary. They must limit how long they hold data, tell individuals what data they hold on them and what it is being used for, and alter or delete an individual’s data if requested. If personal data is used to make automated decisions about individuals, organizations must explain the logic behind the decision-making processes and, in some instances, provide data subjects with an opportunity to request a human review of the automated-decision.

Big Data challenges data minimization, purpose limitation, data retention, transparency and consent. This is further complicated when organizations use data to make inferences about people using sensitive personal information (e.g., health, biometric data, sexuality, political beliefs, religion), which requires even stronger protections under privacy laws. Companies attempt to anonymize or de-identify data, but data can only be deemed truly anonymous, and placed outside the scope of privacy laws, when robust technical controls have been established to prevent re-identification of data subjects.
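To illustrate the distinction between pseudonymization and true anonymization, here is a minimal sketch (not any specific firm’s technique) that replaces a direct identifier with a keyed hash. The key name and record fields are hypothetical; the point is that anyone holding the key can re-link the records, so the output remains personal data under laws such as the GDPR.

```python
import hmac
import hashlib

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    Note: this is pseudonymization, not anonymization. Anyone holding
    secret_key can re-link records, so the output is still personal
    data in the eyes of laws such as the GDPR.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical example: in practice the key would live in a KMS,
# stored separately from the data set.
key = b"example-key-stored-separately"
record = {"email": "jane.doe@example.com", "purchase": "book"}
safe_record = {
    "email": pseudonymize(record["email"], key),
    "purchase": record["purchase"],
}
```

Deterministic hashing like this preserves the ability to join data sets on the pseudonym, which is exactly why regulators treat it as reversible unless the key is destroyed and other re-identification paths are closed off.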

As technology becomes increasingly complicated, so does the law. The UK’s Information Commissioner’s Office (ICO) is currently working on an AI audit framework and has published guidance and blogs on Big Data, artificial intelligence, machine learning, data minimization, and automated decision making.

Privacy and security professionals should provide guidance to their data scientists, digital and data leaders regarding best practices and privacy expectations to be embedded in machine learning, artificial intelligence and analytics activities.

Some best practices include:

  • Be able to provide a basic explanation of the AI tools/systems and types of data inputs and outputs and provide meaningful information about the logic involved and potential consequences for individuals.
  • Apply privacy by design principles and conduct either Data Protection Impact Assessments (GDPR) or security and privacy risk assessments of new AI products, services and functionality.
  • Consider other privacy risks and ethical issues and evaluate how the AI may affect individuals and consider potential bias and discriminatory impacts and how to reduce them.
  • Consider data minimization; consider how and what they are collecting, how long it’s being kept and assess collection processes.
  • Consult with Privacy and Legal to confirm they have obtained rights to use AI data sets for AI purposes in contracts and privacy disclosures.
  • Evaluate de-identification and other anonymization techniques to limit use of data in a personally identifiable manner.
  • Develop a process for handing individual rights requests under CCPA, GDPR and other privacy laws.
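To make the last point concrete, here is an illustrative sketch (not the authors’ process) of a rights-request intake that tracks statutory response deadlines. It assumes the GDPR’s one-month window is approximated as 30 days and uses the CCPA’s 45-day window; both laws allow extensions in some circumstances, and the class and field names are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Baseline response windows; the GDPR's "one month" is approximated
# as 30 days here, and both laws permit extensions in some cases.
RESPONSE_DAYS = {"GDPR": 30, "CCPA": 45}

@dataclass
class RightsRequest:
    requester: str
    request_type: str  # e.g. "access", "deletion", "opt-out"
    law: str           # "GDPR" or "CCPA"
    received: date
    due: date = field(init=False)

    def __post_init__(self):
        # Compute the response deadline from the applicable law.
        self.due = self.received + timedelta(days=RESPONSE_DAYS[self.law])

req = RightsRequest("jane.doe@example.com", "access", "GDPR", date(2019, 11, 5))
```

In practice, this intake would feed a workflow that verifies the requester’s identity, routes the request to data owners, and logs completion for audit purposes.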

In the end, it shouldn’t be a case of Big Data versus data protection, but rather of viewing privacy as an instrument to enable the ethical use of AI.

4. To what extent are privacy and security intrinsically linked?

Privacy relates to the rights you have to control your personal data and how it’s used. Security refers to how your personal data is being protected. You can’t achieve data privacy without data security. Because of overlapping privacy and security regulations and the current data breach landscape, privacy officers have had to become more fluent in IT and security skills. It is important for privacy, security and technology teams to work closely together to ensure organizations protect personal data.

Recognizing the link between privacy and security, the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) released ISO/IEC 27701 on August 6, 2019. ISO/IEC 27701 is a privacy extension to ISO/IEC 27001; compliance with ISO 27701 first requires compliance with the requirements of ISO 27001.

These security standards are intended to complement each other, and ISO 27701 is designed to help organizations protect and control the personal information they handle. As part of the development of ISO 27701, privacy laws from the EU, Australia, Canada, Brazil, Hong Kong, Singapore, South Korea, Turkey and the USA (California) were reviewed and the standard uncharacteristically includes mappings to the EU General Data Protection Regulation (GDPR). Mappings of the ISO 27701 requirements to other privacy laws, such as the California Consumer Privacy Act of 2018 (CCPA), GLBA and HIPAA, are expected and will help organizations by providing a common standard and baseline for demonstrating compliance with these regulatory regimes.

On 9th September 2019, the National Institute of Standards and Technology (NIST) also released the preliminary draft of the NIST Privacy Framework: A Tool for Improving Privacy through Enterprise Risk Management. This document aims to help organizations maximize beneficial uses of data while minimizing privacy problems for individuals. It has been released for public comment; comments must be received by October 24, 2019.

In the release, NIST stated:

“Privacy is a concept distinct from security, but the two are intimately connected in our digital world. A security breach that cracks a company’s database might reveal private information about thousands of individuals. For that reason, many industry stakeholders over the past year requested that NIST align the Privacy Framework with the Cybersecurity Framework, one of NIST’s flagship publications. The Privacy Framework is therefore aligned with the Cybersecurity Framework both structurally and conceptually, and they are designed to be used together.”

5. What are your predictions regarding data privacy developments for 2020?

If only I had a crystal ball! We can certainly expect more legislation surrounding data privacy. There are now over 100 privacy laws in the world, and the GDPR is driving other countries to adopt similar regulations. Following the lead of the California Consumer Privacy Act (CCPA), we will see more US states developing comprehensive privacy laws, probably at a faster rate than any potential federal law.

We will continue to see a proliferation of data breaches, which is not surprising given today’s cyber threat landscape. As a result, I think we will also see more prescriptive security requirements in these new privacy laws and more state regulation on mandatory data breach disclosures.

Historically, privacy laws have required appropriate technical and organizational measures to ensure a level of security appropriate to the risk and sensitivity of the data. But that may be changing. The FTC has settled or litigated more than 60 law enforcement actions against businesses that allegedly failed to take reasonable precautions to protect consumers’ data. The FTC has also proposed adding more specific security requirements to its Safeguards Rule under the Gramm-Leach-Bliley Act (GLBA), consistent with the NYDFS cybersecurity regulations and the NAIC model law for insurance data security.

In 2019, we saw the privacy tech industry start to mature and attract massive funding. This will continue and we will see more privacy management and privacy enhancing technologies emerge in 2020. We will also see companies start embedding automated privacy into their digital transformations.

Finally, I believe there will be even greater consumer awareness of rights around personal data, and we will see more companies, like Apple, announcing that they are putting privacy at the heart of their strategy, making it a competitive differentiator.

Hear Anju Khurana live at PrivSec New York

Hear Anju Khurana impart her views live at PrivSec New York this November, where she will be discussing ways to design an effective privacy and data protection team alongside a panel of industry specialists.

Addressing audiences at PrivSec New York, Anju joins a high-profile list of guest speakers and representatives from global names, including Uber, the New York Times, Bank of England, Raytheon and many more.

As the march towards stronger US data privacy laws continues, PrivSec New York comes at a critical time in the debate between law makers and enterprise. With the CCPA deadline looming closer, the need for ideas, debate and innovation in data privacy has never been greater.

To find out more about PrivSec New York, click here.


PrivSec Conferences will bring together leading speakers and experts from privacy and security to deliver compelling content via solo presentations, panel discussions, debates, roundtables and workshops.

For more information on upcoming events, visit the website.

We have been awarded the number 1 GDPR Blog in 2019 by Feedspot.
