Barry Cook on Privacy, AI and GDPR Enforcement

Thursday, December 6, 2018

Barry Cook, Group Data Protection Officer at VFS GLOBAL and a recognized privacy expert, explores the practical enforcement of the GDPR in the era of AI.


Barry Cook, Privacy & Group Data Protection Officer – SVP, CIPP/E, CIPM, CIPT, VFS GLOBAL

Today, we are honored to welcome Barry Cook, Privacy & Group Data Protection Officer at VFS GLOBAL, to provide our readers with valuable insights on privacy and cybersecurity.

1) Why are many European companies still not prepared for GDPR?

Recent surveys indicate that, several months after the GDPR came into force, only 35% of companies say they are compliant. Why is this? Companies with an established, externally facing digital footprint have had a much easier journey to compliance, as many of the technical and organisational measures required by the GDPR would already be in place. However, companies that operate in a “Closed, we own the data” type of environment typically find the journey to compliance a rocky road. In my opinion, this is a direct indication of how some companies have disregarded the principles of privacy and data protection in the past. Data privacy and data protection were not seen as important to their business model, and therefore did not receive the appropriate attention at higher management levels.

2) When do you think we will see a first exemplary punishment under GDPR?

We are starting to see the lead-up to the big fines now. The UK ICO has recently issued an enforcement notice with a fine of £17 million attached if the notice is not complied with. Somewhat ironically, this is the company that worked on behalf of pro-Brexit groups. There have been plenty of smaller fines issued by DPAs across Europe, but to be fair, most of these fines were based on actions that commenced before the GDPR came into force. I think it will be the middle of 2019 at the very earliest before we see any big fines levied, and those will be for the “simple” cases. For the more complex cases, don’t hold your breath; it’s going to be some time before those big fines make the headlines.

3) How can users protect their privacy when using social networks that have de facto become an integral part of everyone's life?

This is a complex topic. We first have to define what is commonly understood by the term privacy when it comes to personal data. Surveys have shown that when the public was asked “What does privacy mean?”, the reply usually contained the word “control”. For example, “Control of how my personal data is used”, “Control of who can see or access my data”, “To be able to have control over my data”. So, the public perception of what privacy means is somewhat different to the actions the public takes when using social media. With most social media platforms, “control” is built into the Settings or Options pages; however, in many cases those settings are not utilised. Therefore, it falls to the social network providers to act on behalf of their users and pre-set the defaults so that they protect the privacy of the user. This is more commonly known as “Privacy by Default & Design”. This is where we run into problems. The goal of the social network provider is to socialise the information given to it by the user, in other words to disperse it as widely as possible. The information posted by a social network user can be monetised. Yes, even those cute pet videos! The greater the dispersion of that fluffy cat video, the greater its monetary value. If users really want to protect their privacy, they need to use the controls provided by the social media platforms; however, this may seriously impact the number of “followers” or “likes” they receive!

4) Where is the subtle border between justified online advertising and intrusive monitoring of user activity?

Receiving adverts when browsing can be annoying and arguably intrusive. In my opinion, marketing of products and services based on prior purchasing is commonly acceptable and may even be agreeable to the individual if they have an interest in that topic. Where this activity starts to cross the boundary into the realm of intrusive monitoring is where data sets are combined. The American company Target had its marketing techniques exposed several years ago. One Target employee provided a hypothetical example: take a fictional female Target shopper, who is 23, lives in Florida and in March bought unscented cocoa-butter lotion, a purse large enough to double as a diaper bag, zinc and magnesium supplements and a bright blue rug. There’s, say, an 87 percent chance that she’s pregnant and that her delivery date is sometime in late August. And perhaps that it’s a boy, based on the colour of that rug? This kind of market research has been around for a long time. Where we are seeing differences now is with Artificial Intelligence and the collection of data from very diverse sources. Many years ago companies would buy in data on consumers and combine it with their own databases. Now this is no longer necessary, as a person’s digital footprint can be “scraped” to gather that information. When this is combined with AI to create individual predictions of future behaviour, and marketing is then targeted at an individual based on those predictions, I believe we have to be careful.
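
To make the mechanics of such predictions concrete, here is a minimal, purely illustrative sketch of how a handful of individually weak purchase signals might be combined into one behavioural score. The item names, weights and base rate are hypothetical and do not describe Target’s actual model:

```python
# Illustrative sketch only: combining weak purchase signals into a single
# behavioural prediction with a toy logistic model. All feature names,
# weights and the base rate are hypothetical.
import math

# Hypothetical learned weights for a handful of purchase signals.
WEIGHTS = {
    "unscented_lotion": 1.4,
    "oversized_purse": 0.9,
    "zinc_magnesium_supplements": 1.1,
    "bright_blue_rug": 0.3,
}
BIAS = -1.8  # base rate before any signal is observed

def predicted_probability(basket: set[str]) -> float:
    """Logistic combination of the observed purchase signals."""
    score = BIAS + sum(w for item, w in WEIGHTS.items() if item in basket)
    return 1.0 / (1.0 + math.exp(-score))

basket = {"unscented_lotion", "oversized_purse",
          "zinc_magnesium_supplements", "bright_blue_rug"}
print(f"Predicted probability: {predicted_probability(basket):.0%}")  # ~87%
```

In practice such weights are learned from very large purchase histories rather than set by hand, which is precisely why combining many individually innocuous data points becomes sensitive.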

5) How can SMEs ensure their compliance with endlessly growing regulations, such as GDPR?

SMEs should view data protection as a way of enhancing their brand rather than just a compliance requirement. This means that marketing teams need to interact more with legal and compliance teams. This is where the skills of the privacy professionals come in. That way, SMEs will build privacy, and therefore data protection, into their business model. When this is achieved, maintaining compliance becomes a recognisable way of influencing the bottom line.

6) Do you see any practical applicability of Machine Learning and AI in data privacy solutions?

Yes, AI and machine learning can be of great benefit here. One area that comes to mind is vendor management. All contracts with vendors that handle personal data should have a data protection addendum that clearly defines the data to be processed and the responsibilities of each party. More often than not, the data privacy elements are embedded in the master service agreement rather than called out explicitly. This means that procurement and legal teams need to identify the data privacy clauses, assess whether they are sufficient, and modify them if not. As this is not their primary skill set, the contracts are often passed to the privacy team, which in turn consumes its resources. AI and machine learning would be ideally suited to perform this task.
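
As a rough illustration of the kind of tooling described here (not a description of any product in use at VFS Global), a simple text classifier could pre-screen contract clauses and surface likely data-protection language for the privacy team to review. The training clauses, labels and choice of scikit-learn below are illustrative assumptions:

```python
# Minimal sketch: flag candidate data-protection clauses in a contract so the
# privacy team only reviews a shortlist. The training data is a toy example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hypothetical training set: 1 = data-protection clause, 0 = other.
clauses = [
    "The processor shall process personal data only on documented instructions.",
    "Personal data shall be deleted or returned at the end of the engagement.",
    "The supplier shall notify the controller of any personal data breach without undue delay.",
    "Invoices are payable within thirty days of receipt.",
    "This agreement is governed by the laws of Switzerland.",
    "Either party may terminate with ninety days' written notice.",
]
labels = [1, 1, 1, 0, 0, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(clauses, labels)

# Score each clause of a new master service agreement and rank for review.
new_contract = [
    "The vendor will implement appropriate technical and organisational measures to protect personal data.",
    "Prices may be revised annually in line with inflation.",
]
for clause, p in zip(new_contract, model.predict_proba(new_contract)[:, 1]):
    print(f"{p:.2f}  {clause}")
```

In a real deployment the model would be trained on a much larger labelled clause library, and its output would feed a human review queue rather than an automated decision.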

Another area where AI can prove useful for privacy professionals is the assessment of privacy and data protection by design and default, ensuring that privacy and data protection “best practice” is embedded into the product or service from the beginning. AI is in its infancy now, and as it evolves we will see greater integration in the management of privacy.

7) How should organizations select and evaluate their Data Protection Officers (DPO)?

The International Association of Privacy Professionals (IAPP) has put together a DPO certification pack, which I would strongly recommend any DPO take. This ensures that an organisation knows what it is getting. We are also seeing guidance documents from data protection authorities, such as France’s CNIL, on the skills and training requirements for DPOs. However, certifications aren’t the only factor that organisations need to consider. The modern privacy professional needs a cross-silo understanding of how the business operates, as well as good communication and negotiation skills. A privacy professional’s role is not just about understanding the data protection legislation; it’s about acquiring and maintaining business relationships, as this is vital to the successful implementation of a privacy strategy.

8) Which data protection laws should be taken into consideration by a Swiss company that operates globally?

Where do we begin? The GDPR has started an avalanche of new data protection laws. In the last year there have been almost 100 new data protection laws, and for many countries these are their first. With approximately 850 laws that impact the use of data, any company operating globally has to seriously consider its privacy strategy when operating outside the country where it is based. At VFS Global we operate in 141 countries. It would be impossible to have 141 different operational models for data protection, so what I have done is align the corporate standard with the GDPR, as this is currently the baseline for data protection legislation. This means that VFS Global will be de facto compliant in the countries of operation, even if these countries are beyond the scope of the GDPR. That doesn’t mean I ignore local legislation; it still has to be reviewed for any country-specific requirements, such as data localisation. Having a robust privacy governance framework that is not focused on the fine details enables me to use it as a basis for the technical and organisational measures that need to be operationalised.

9) What are the biggest pitfalls when implementing your corporate data privacy strategy?

With a geographically and culturally diverse workforce, the biggest challenge is raising awareness of the concepts of privacy and data protection. Organisations often implement technical controls to manage data protection, and this is frequently the way in which “privacy teams” operate, having evolved from the IT security function. However, this is just part of the implementation of the privacy framework. The human element is the component that will make the biggest impact on the success of a corporate data privacy strategy. One of the biggest pitfalls I see with companies implementing a privacy strategy is that they adopt a compliance-driven approach, tying data privacy to data breaches. Adopting this approach means that the privacy strategy becomes a “tick-box” project that is begrudgingly implemented by the enabling and operational functions. While the need for compliance mustn’t be ignored, and as we can see the penalties for non-compliance are high, it shouldn’t be the sole driver for implementing a privacy strategy. As mentioned before, the expectations of the public (and employees) as to how their personal data is handled are increasing. These expectations are being recognised by the more mature organisations, and meeting them has become an integral part of the organisation’s strategy.

10) Would you recommend that your friends' kids pursue a career in data privacy?

Absolutely! But I’m biased! What we are seeing now is just the start of an evolving profession as data privacy becomes embedded into the business landscape. As we move forward, we will see data privacy specialisations starting to develop. Designing privacy and ethics into AI and machine learning algorithms, ensuring anonymity in big data processing, and privacy public relations roles are just a few of the areas we can expect to see appearing in the future.


High-Tech Bridge's series of Cybersecurity Leaders Interviews compiles the thoughts of cybersecurity executives, thought leaders, visionaries and eminent technology experts.
