GDPR: A year on
In this briefing we’ll look at what has happened since the GDPR came into force, including:
- A brief GDPR recap
- GDPR myths in 2018
- Many businesses were not ready
- 2019 the year of enforcement?
- Is adtech a target?
- AI and the future of data processing
- Consent management platforms
- What’s next? The ePrivacy Regulation
- The global knock-on effect of the GDPR
A brief GDPR recap
The General Data Protection Regulation (GDPR) was designed to harmonise European data law. It replaced the 1995 Data Protection Directive and came into force on May 25th 2018, applying to ‘controllers’ and ‘processors’ of personal data (or personally identifiable information) and bolstering the rights of data subjects in the EU.
The rights for individuals provided by the GDPR are as follows:
- The right to be informed
- The right of access
- The right to rectification
- The right to erasure
- The right to restrict processing
- The right to data portability
- The right to object
- Rights in relation to automated decision making and profiling.
The regulation has a potentially wide-ranging impact stemming from the enshrinement of these rights, such as the need to provide data subjects with information about the purposes of processing, retention periods and who data will be shared with – and all this in clear and plain language.
Subject access requests should now be dealt with within a month. And individuals have the right to move data easily from one service to another.
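Article 20’s portability right requires that data be supplied in a structured, commonly used and machine-readable format. As a rough illustration, assuming a hypothetical `export_user_data` helper and invented field names (nothing here comes from any real system), an export might be sketched as:

```python
import json
from datetime import date

def export_user_data(profile):
    """Serialise a data subject's records as JSON -- one structured,
    commonly used, machine-readable format that satisfies the
    portability requirement. Illustrative sketch only."""
    export = {
        "exported_on": date.today().isoformat(),
        "format_version": "1.0",
        "data": profile,
    }
    return json.dumps(export, indent=2, sort_keys=True)

# Example: a minimal profile a user might take to another service
record = {"name": "A. Subject", "email": "a.subject@example.com",
          "preferences": {"newsletter": True}}
print(export_user_data(record))
```

JSON is used here purely as an example; CSV or XML would equally meet the “machine-readable” bar, and the right choice depends on what the receiving service can ingest.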
The GDPR’s seven key principles (see below) are not dissimilar to those of the 1998 Data Protection Act in the UK. However, there are notable differences, particularly the principle of accountability.
Businesses must take responsibility for compliance, keeping records and creating processes accordingly.
|Principles of the GDPR|
|Lawfulness, fairness and transparency|
|Purpose limitation|
|Data minimisation|
|Accuracy|
|Storage limitation|
|Integrity and confidentiality (security)|
|Accountability|
There was lots to consider in the GDPR for marketers – for example, what constitutes personal data, what unambiguous consent means, and what legal basis to use for processing (e.g. beefed up consent vs. legitimate interests).
The ICO’s Guide to the General Data Protection Regulation is a good place to start for any reader who wants a thorough recap of what the regulation includes.
GDPR myths in 2018
Writing in the Financial Times, journalist Malcolm Moore described 2018 as ‘the year of the needy inbox’. Strictly speaking, the neediness was exhibited prior to the May 25th deadline, by which time consumers were increasingly aware of the GDPR.
Compliance efforts by businesses were hard to miss, including lots of re-permissioning campaigns by email (some of which were likely unnecessary, the result of too much focus on consent, which is not the only legal basis for processing personal data under the GDPR).
Mishcon de Reya’s Data Protection Advisor Jon Baines highlights this trend for ill-thought-through re-permissioning efforts in his blog post titled “The GDPR consent email I’d like to receive”.
Baines points out that while consent is defined more strictly by the GDPR, the Privacy and Electronic Communications Regulations (PECR) have, since 2003, prevented marketers from sending unsolicited emails. Therefore, for those marketers already doing their job properly, no action was required to re-gain consent for email communications.
And, of course, those that emailed seeking consent from people who had opted out of marketing communications may have broken existing law in their efforts to comply with the forthcoming GDPR.
These myths were a common theme in 2018. The ICO ran a series of myth-busting blog posts to help businesses make the right decisions, including a light-hearted Christmas edition where Steve Wood, Deputy Commissioner for Policy, felt the need to clarify that:
“No, GDPR doesn’t ban Christmas cards, even in corporate context. If you are sending Christmas cards to friends, family, neighbours etc you don’t need their consent. If you’re sending corporate Christmas cards, you need to be more careful and consider whether it contains direct marketing – especially if it addressed to an individual. In particular, if sending a corporate Christmas greeting electronically, eg by email, then be sure to comply with the Privacy and Electronic Communication Regulation (PECR) rules on electronic marketing.”
Perhaps the biggest confusion was around opting in – with many marketers getting the wrong end of the stick and assuming opt-in was always required to send electronic marketing under the GDPR. But, to turn to the helpful Jon Baines of Mishcon again, “when you buy, or enter into negotiations to buy, a product or service from someone, the seller only has to offer an “opt out” option for subsequent electronic marketing. Nothing in GDPR changes this.”
The GDPR provides a right to opt out of direct marketing and suggests direct marketing may be regarded as a legitimate interest. But it’s the PECR which still governs when opt in is needed for electronic marketing.
Many businesses were plainly not ready
Given the flurry of action and the publicity around the regulation, which even led to GDPR memes, some commentators have been surprised that not everybody was ready come the deadline.
Compliance, even now – nearly a year after the GDPR came into force – is an ongoing affair. Getting privacy policies and tick boxes in order ahead of deadline was one thing, but ‘data protection by design and by default’ – now a legal requirement and part of adherence to the GDPR’s principles – is quite another.
In the UK, the regulator’s own journey to compliance was ironically highlighted by a recent report from The Register. The article highlighted the fact that the ICO’s privacy notice for its own staff (including information about the use of personal data) is, by its own admission, still ‘under construction’.
Despite the lack of this notice, the ICO told The Register that its staff had been “made aware” of its policies but that after recruiting more staff the policies had to change and the notice was to be “updated”.
Amongst the small business community, a Hiscox survey of SMEs revealed that 39% of those surveyed don’t know who GDPR affects. One in 10 don’t think GDPR gives consumers any new rights. And nine in 10 don’t know the main new rights GDPR affords consumers.
This ill-preparedness is not as surprising when you consider what Security Scorecard’s Fouad Khalil, writing for Silicon Republic, describes as ‘the lack of more direct guidance from regulators’.
Indeed, taking SMEs as an example, the ICO launched its self-assessment checklist ‘to help sole traders and self-employed individuals to assess their compliance’ in October 2018, five months after the GDPR came into force.
Whilst the ICO has offered excellent guidance, it often arrived close to the deadline. February 2018 saw the end of the consultation period for the Article 29 Working Party guidelines on consent. Comments were reviewed and detailed guidance on consent was published by the ICO in May 2018.
The crunch month of May 2018 was also when the ICO published detailed guidance on children and the GDPR, on determining what is personal data, on automated decision-making and profiling, on the right to be informed, and on Data Protection Impact Assessments (DPIAs).
As Khalil writes in his Silicon Republic article, ‘The GDPR is notably light on prescriptive commands compared to previous regulations. This can be a good thing, as it encourages companies to consider the spirit of the law rather than just making it a tick-box exercise. However, it has also made the job of compliance much more difficult.’
Considering the spirit of the law, though, sounds like a strangely subjective activity. Can we trust marketers and businesses to embrace this spirit?
Raegan MacDonald, Head of EU Public Policy at Mozilla, told TNW late in 2018 that “Many companies appear to be interpreting GDPR as narrowly as possible. I’m concerned that privacy is still by default put at risk without users understanding or having meaningful control.”
Will 2019 change the focus for businesses as regulators begin enforcement? That’s certainly the opinion of MacDonald, and in the UK the Information Commissioner has talked of focusing only on compliance in the first year – a period which some commentators have called a ‘phoney war’ of data protection compliance. Is the phoney war now over?
2019 the year of enforcement?
On March 18th 2019, Jon Baines wrote on the Mishcon de Reya website that “An FOI request by Mishcon de Reya reveals that the ICO have issued no “notices of intent” to serve GDPR fines, nearly ten months on from it coming into effect.”
Baines argues that given notices of intent must be delivered before any fine is levied, it may be some time before the first fine appears (from the UK regulator).
However, notable ongoing investigations by the ICO include British Airways and Marriott; the latter announced on January 4th 2019 that a data breach had exposed up to 383 million guest records.
Writing for Forbes, Yiannis Mouratidis speculates that losses for Marriott could exceed the $915 million (or 4% of global revenue) that represents the strictest fine that could be meted out – losses could also include clients claiming damages as a result of hacked payment cards and passports.
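To put that maximum in context, the GDPR’s upper tier of administrative fines (Article 83(5)) is the greater of €20 million or 4% of total worldwide annual turnover for the preceding financial year. A minimal sketch of that calculation (the function name and the example figures are illustrative):

```python
def max_gdpr_fine(annual_turnover):
    """Upper tier of GDPR administrative fines (Article 83(5)):
    the HIGHER of EUR 20 million or 4% of total worldwide annual
    turnover of the preceding financial year."""
    return max(20_000_000.0, 0.04 * annual_turnover)

# A company turning over EUR 1bn faces a cap of EUR 40m ...
print(max_gdpr_fine(1_000_000_000))   # 40000000.0
# ... while a smaller firm still faces the EUR 20m ceiling.
print(max_gdpr_fine(50_000_000))      # 20000000.0
```

Note the lower tier of infringements (Article 83(4)) works the same way but with €10 million and 2% as the figures.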
As for fines issued already across the EU, there is one that stands out – France’s Commission nationale de l’informatique et des libertés (CNIL) imposed a fine of 50 million euros on Google on January 21st 2019.
The fine was the result of complaints lodged on 25th and 28th May 2018 by two groups – None Of Your Business and La Quadrature du Net. These two associations claimed that Google did not have a valid legal basis to process the personal data of the users of its services, particularly with respect to the personalisation of advertising.
The CNIL investigations into Google account creation found that “Essential information, such as the data processing purposes, the data storage periods or the categories of personal data used for the ads personalization, are excessively disseminated across several documents… The relevant information is accessible after several steps only, implying sometimes up to 5 or 6 actions. …Moreover, the restricted committee observes that some information is not always clear nor comprehensive.”
Furthermore, “the purposes of processing are described in a too generic and vague manner, and so are the categories of data processed for these various purposes. Similarly, the information communicated is not clear enough so that the user can understand that the legal basis of processing operations for the ads personalization is the consent, and not the legitimate interest of the company.”
Consent obtained for ads personalisation was found by the CNIL not to be valid because it was not sufficiently informed, was not specific (relating instead to many Google services at once) and was not ‘unambiguous’.
Further rationale for action was that this was not a time-limited infringement, the Android OS is widespread in France, and that ads personalisation is a major part of Google’s business model.
For now, this fine is an outlier. A blog post from law firm Osborne Clarke states that “Where there have been other fines (in Germany and Portugal), the amount of those fines has been considerably lower.” The author also references a report by the Handelsblatt published in January 2019, which detailed 41 fines under the GDPR by German data protection authorities.
Of course, one of the issues for business is that even a year later, the journey to compliance may still be a way from completion, with changes in business practice representing a barrier. Ryan Chiavetta, associate editor for the IAPP, writes that at an IAPP meeting in March 2019, Mark Schreiber, CIPP/US, Co-Chair of Privacy and Security at McDermott Will & Emery, said: “We still have too little time and it’s a year later. We expect 50 percent of covered companies are still in the process of GDPR compliance and it will likely go on for another couple of years.”
This is where accountability comes into play. Businesses must show the steps they are taking towards compliance, and the ‘phoney war’ may come to an end this year. After all, complaints are flooding in to European data protection authorities.
In December 2018, UK Information Commissioner Elizabeth Denham said in a speech that the ICO had received 8,000 breach notifications since GDPR came into force.
Osborne Clarke reported a big uptick in access and erasure requests from data subjects post May 2018. The firm also details how problematic these requests can be, saying that access requests which ask “lots of complicated questions about data processing, some of which fall within scope of Article 15 of the GDPR and others which don’t” are “manageable but often require experienced data protection lawyers to avoid pitfalls…”
Another category of requests, by employees and former employees for data found within emails, is described as “time-consuming and expensive exercises that businesses find a chore and which can often lead to litigation if not handled well.”
Is adtech a target?
We’ve already heard how Google is the target of action from the CNIL. The same body has also issued warnings against geolocation-focused adtech in France, specifically the companies Teemo, Fidzup and SingleSpot, which AdExchanger reports have made changes to their systems and been cleared.
In the UK, adtech is also in the spotlight. Simon McDougall, Executive Director for Technology Policy and Innovation at the ICO, writes that “The GDPR has clear requirements around notice and transparency. We are interested in how people are told, and what people are told, about the use of their personal data for online advertising purposes when they visit websites or access apps, as well as how accurate this information is.”
The ICO has most notably been firing warning shots in relation to Facebook’s use of data analytics for political purposes. The tech company was fined £500,000 for breaching data protection laws, with the investigation finding that “between 2007 and 2014, Facebook processed the personal information of users unfairly by allowing application developers access to their information without sufficiently clear and informed consent, and allowing access even if users had not downloaded the app, but were simply ‘friends’ with people who had.”
Commenting on the fine, Information Commissioner Elizabeth Denham was bullish, noting “We considered these contraventions to be so serious we imposed the maximum penalty under the previous legislation. The fine would inevitably have been significantly higher under the GDPR. One of our main motivations for taking enforcement action is to drive meaningful change in how organisations handle people’s personal data.”
AI and the future of data processing
One of the most interesting implications of the GDPR is its use in ensuring fairness during automated decision making and profiling by AI.
To recap, the GDPR has provisions on this type of data processing, notably in Article 22, which states that “The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”
The ICO summarises:
“You can only carry out solely automated decision-making with legal or similarly significant effects if the decision is:
- necessary for entering into or performance of a contract between an organisation and the individual;
- authorised by law (for example, for the purposes of fraud or tax evasion prevention); or
- based on the individual’s explicit consent.”
Where special category data is concerned, explicit consent is required from the individual. Alternatively, the processing may be necessary for reasons of substantial public interest.
A data protection impact assessment is required to assess the risks of such automated decision making.
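The three lawful bases above amount to a simple gate: if none applies, a solely automated decision with legal or similarly significant effects cannot go ahead without human involvement. A hypothetical sketch of that check (class and function names are invented for illustration, not taken from the ICO or any real compliance system):

```python
from dataclasses import dataclass

@dataclass
class AutomatedDecision:
    """Context for a solely automated decision with legal or
    similarly significant effects (Article 22). Illustrative only."""
    necessary_for_contract: bool = False
    authorised_by_law: bool = False
    explicit_consent: bool = False

def may_proceed_without_human(decision):
    """Return True only if one of the three Article 22 exceptions
    applies; otherwise the decision needs human involvement."""
    return (decision.necessary_for_contract
            or decision.authorised_by_law
            or decision.explicit_consent)

# A decision taken with explicit consent may proceed; the same
# decision with no lawful basis must go to human review.
print(may_proceed_without_human(AutomatedDecision(explicit_consent=True)))  # True
print(may_proceed_without_human(AutomatedDecision()))                       # False
```

In a real system, the “fallback to human review” branch would also need to honour the safeguards Article 22(3) requires, such as the right to contest the decision.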
Since the GDPR came into force, this area has emerged as more of a focus for the ICO. In November 2018, the office announced the appointment of Dr Reuben Binns as its first Postdoctoral Research Fellow in Artificial Intelligence (AI).
Dr Reuben Binns’ two-year fellowship will see him “research and investigate a framework for auditing algorithms and conduct further in-depth research activities in AI and machine learning.”
You can read an overview of the aforementioned framework, published in March 2019 by Dr Binns and Technology Policy Adviser Valeria Gallo.
The authors identify eight AI-specific risk areas the framework will cover. These are:
- Fairness and transparency in profiling
- Accuracy
- Fully automated decision making models
- Security and cyber
- Trade-offs – covering challenges of balancing different constraints when optimising AI models (e.g. accuracy vs. privacy).
- Data minimisation and purpose limitation
- Exercising of rights
- Impact on broader public interests and rights
As machine learning and deep learning enter the parlance of marketers, this work will be helpful in ensuring any processing does not fall on the wrong side of any of these eight areas of focus for the UK regulator.
Perhaps the biggest change in Europe, certainly in the UK, since May 2018 is the proximity of Brexit, which at time of writing has a deadline of October 31st 2019.
Although the Data Protection Act 2018 is designed to ensure personal data will be easily exchanged between UK and EU entities post-Brexit, there is a change to note.
Once the UK is no longer part of the EU, it will need to show an adequate level of data protection for transfers to continue from the EU to the UK.
Elizabeth Denham writes that “In a ‘no deal’ situation the UK Government has already made clear its intention to enable data to flow from the UK to EEA countries without any additional measures. But transfers of personal data from the EEA to the UK will be affected.”
Such transfers may involve the use of standard contract clauses (SCCs), and the ICO provides an interactive tool for small and medium sized businesses to help select and complete the right SCC.
Denham’s myth-busting blog post on Brexit and data protection is certainly worth a read.
Consent management platforms
One of the most visible results of the GDPR has been additional popups on websites seeking to gain the visitor’s consent for tailored ads and tracking, and to be transparent about which third parties will use this data and for what purposes.
These are often managed by consent management platforms, of which there are a number on the market, the most prominent being Quantcast Choice, which is based on the IAB Europe’s Transparency and Consent Framework.
Quantcast states that three months after launch (on May 15th 2018), the platform was present on more than 10,000 websites worldwide and had enabled more than one billion consent choices.
Somer Simpson, Product Lead at Quantcast, told the Econsultancy blog that “In the early days, Quantcast feared a fragmented ecosystem, with every vendor capturing consent in their own way – which results in a horrible user experience.”
The aim of the CMP is to simplify the process of giving consent to a whole swathe of adtech companies. In practice, these new popups have annoyed many users, but that is perhaps the nature of transparency: it is in some part interruptive, with the user presented with information before they use a website for the first time.
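Under the hood, a CMP essentially maintains a per-user record of which processing purposes and which vendors the visitor has agreed to. A much-simplified sketch with invented names follows; the real IAB Europe framework encodes these choices far more compactly as a consent string, which this does not attempt to reproduce:

```python
from datetime import datetime, timezone

class ConsentRecord:
    """Minimal per-user consent store: which purposes and vendors a
    visitor has agreed to, and when. Illustrative only -- real CMPs
    built on the IAB TCF are considerably more involved."""
    def __init__(self):
        self.purposes = {}    # purpose name -> bool
        self.vendors = {}     # vendor name -> bool
        self.updated_at = None

    def set_choice(self, purpose, vendor, granted):
        self.purposes[purpose] = granted
        self.vendors[vendor] = granted
        self.updated_at = datetime.now(timezone.utc)

    def has_consent(self, purpose, vendor):
        # Serve tailored ads only if BOTH the purpose and the vendor
        # were affirmatively consented to; no record means "no".
        return self.purposes.get(purpose, False) and self.vendors.get(vendor, False)

record = ConsentRecord()
record.set_choice("ad_personalisation", "example-adtech", True)
print(record.has_consent("ad_personalisation", "example-adtech"))  # True
print(record.has_consent("analytics", "example-adtech"))           # False
```

Defaulting absent entries to `False` mirrors the GDPR’s stance that silence or inactivity does not constitute consent.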
For more on CMPs, read “What is a consent management platform and are they needed?” on the Econsultancy blog.
What’s next? The ePrivacy Regulation
The ePrivacy Regulation will replace the ePrivacy Directive or ‘cookie law’. It is currently in draft and, as Davinia Brennan of A&L Goodbody writes on Lexology.com, the latest draft “published by the European Council, provides that it will apply 24 months from the date it is adopted, with the result that even if it is adopted imminently, it may not come into effect until 2021.”
The aim of the regulation is to require users to provide consent before a business can access their device, for example to read or write cookies. It is likely to align with the GDPR, requiring this consent to be similarly informed, specific and unambiguous.
Further work on the regulation is needed, as highlighted in a progress report by the Council of the EU, including, as Brennan writes, “with regard to the burden for browsers and apps, the competition aspect, the impact on end-users, and the ability of this provision to address the issue of consent fatigue.”
The ePrivacy Regulation, then, is one to watch over the next year for marketers and ecommerce professionals.
The global knock-on effect of the GDPR
In the wake of the GDPR a number of new laws have been enacted which tighten up data protection in other parts of the world.
These include the California Consumer Privacy Act, often termed ‘GDPR Lite’, which dictates how businesses operating in California should collect, share and protect personal information. As with the GDPR, individuals have the right to greater oversight of what information is collected and who it is shared with. The act will come into force in 2020.
Elsewhere, on November 1st 2018 Canada updated its Personal Information Protection and Electronic Documents Act (PIPEDA), which has been law since 2000 and applies to the collection, use or disclosure of personal information in the course of a commercial activity. The update has added requirements to notify the regulators of data breaches that pose a risk of harm, to also notify the individuals affected and to keep records for two years.
Danny O’Brien, International Director at the Electronic Frontier Foundation, highlights that “Brazil passed its own GDPR-style law this year; Chile amended its constitution to include data protection rights; and India’s lawmakers introduced a draft of a wide-ranging new legal privacy framework.”
In April 2019, US senators introduced a bill that would forbid large platform companies like Facebook from using so-called dark patterns to effectively trick users into giving up information or agreeing to certain terms.
Senator Mark R. Warner, Democrat of Virginia, explained the rationale behind the Deceptive Experiences To Online Users Reduction (DETOUR) Act:
“For years, social media platforms have been relying on all sorts of tricks and tools to convince users to hand over their personal data without really understanding what they are consenting to. Some of the most nefarious strategies rely on ‘dark patterns’ – deceptive interfaces and default settings, drawing on tricks of behavioral psychology, designed to undermine user autonomy and push consumers into doing things they wouldn’t otherwise do, like hand over all of their personal data to be exploited for commercial purposes.
“Our goal is simple: to instill a little transparency in what remains a very opaque market and ensure that consumers are able to make more informed choices about how and when to share their personal information.”
The proposed law calls for the creation of a professional standards body associated with the Federal Trade Commission (FTC) that would “act as a self-regulatory body, providing updated guidance to platforms on design practices that impair user autonomy, decision-making, or choice, positioning the FTC to act as a regulatory backstop.”
Note that this briefing is not intended to constitute legal advice or offer comprehensive guidance.
Source: Customer Experience