If “data is the new oil”, then “PII is the new petrol”.
Since the introduction of GDPR, PII – Personally Identifiable Information – is regarded as highly flammable, as GDPR infractions and data scandals burn up corporate coffers. Regulators had long warned businesses of steep fines of up to 4 percent of a company's annual turnover or €20 million, whichever is higher. This massive leap in penalties has understandably created a general scare around GDPR, given that earlier data protection fines topped out at around €500,000.
However, not all businesses have been proactively working towards GDPR compliance, mainly due to rudimentary knowledge of the regulation or a lack of resources to understand and implement it. Statistics reveal that nearly one-third of European businesses have yet to achieve GDPR compliance. The recent round of GDPR fines has taught us that a lackadaisical approach will land companies in deep financial trouble.
Let’s review some cases of GDPR penalties and understand how businesses are becoming GDPR-compliant – in the context of analytics, of course. Before that, a brief introduction to GDPR.
What is GDPR?
GDPR – the General Data Protection Regulation – is an EU-wide law that governs how organizations collect, store, and process the personal data of EU residents. It replaced the 1995 Data Protection Directive and became enforceable on 25th May 2018. If you’re already well-versed with GDPR, we suggest you skip to the next section.
What’s the Damage?
Since its roll-out on 25th May 2018, GDPR has raked up millions in fines. One of the first significant cases was British Airways, where users were redirected to a malicious website from the official BA site and the personal information of 500,000 customers was stolen. The regulatory body slapped a whopping £183 million fine on the airline, which translated to roughly 1.5 percent of its 2017 revenue.
In early 2019, Google came under the scanner of the French data protection authority for a lack of transparency and the use of passive opt-ins for user consent to personalized ads, attracting a fine of €50 million. The running total of major GDPR fines stands at €428,545,407 as of December 2019.
The tight regulation and hefty penalties of GDPR have surely tempted business owners to reconsider doing business in Europe. However, for most businesses, this is hardly an option. Let’s acknowledge that the objective of GDPR has not been to dissuade businesses from operating in the EU, but to give users more control over their personal data and to enforce more accountability from businesses.
GDPR in the Context of Analytics
Since GDPR rolled out, there has been a noticeable shift in accountability for user data privacy from end users to businesses. Fines and penalties apply to parties that collect, store, process and/or control data – addressing each stage of the data lifecycle to protect user interests and privacy.
1: Greater Vigilance
Today, businesses need to take proactive responsibility for ensuring end users understand how their personal data is used. This means businesses can no longer hide behind complex privacy policies to tap into valuable personal data. An uphill battle lies ahead for businesses that collect and/or process data: greater vigilance is required from both the Data Controller and the Data Processor, who jointly own the primary responsibility of ensuring GDPR compliance. This involves drafting contractual agreements that outline the privacy responsibilities and mandates of third-party processors and sub-processors.
Another issue pertains to historical data and analytics stacks that were in play prior to GDPR implementation. These might have been subject to specific, broad, or no user consent. Companies are expected to have furnished a written record of why data is being collected and to inform users of the same before collecting it. This renders historical data non-compliant unless users have formally consented to its use.
2: Ensuring User Consent
GDPR has brought in greater transparency and given users near-complete control in approving or disapproving collection of their personal data as well as exercising consent on how the data will be used. For businesses, this means that the clauses on data collection and usage need to be explicitly and clearly laid out to enable users’ opt-in/out. This is a far cry from the complex legalese that websites previously provided about using personal data. GDPR emphasizes consent as “freely given, specific, informed and an unambiguous indication of the data subject’s agreement to the processing of personal data relating to him or her.”
3: Reorienting data for data analytics
The very nature of analytics makes interpretation of data collection and processing a difficult task – more so in the context of iterative analytics and artificial intelligence operations. This poses a big challenge for businesses specializing in or utilizing analytics. Some wonder if this means dwindling data sets, but there is some consolation: data science and engineering techniques such as the following allow data to be processed while keeping it protected:
Anonymization – where identifying attributes are irreversibly removed so the data subject can no longer be identified. There remains a remote chance of re-identification through aggregation, so this works best when very few parameters are being processed.
Pseudonymization – a more robust data management technique recommended by GDPR authorities to protect user privacy. It enables businesses to access the specific data elements required for the job rather than the entire data set. By restricting access to individual data elements rather than the subject level, user privacy remains uncompromised, as those elements cannot be traced back to the actual user without the separately held re-identification key.
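One common way to implement pseudonymization (a general technique, not one prescribed by GDPR itself) is to replace direct identifiers with keyed tokens. The sketch below uses Python's standard-library HMAC; the secret key name and record fields are hypothetical.

```python
import hmac
import hashlib

# Hypothetical secret held by the data controller; in practice it would
# live in a key vault, never alongside the data it protects.
SECRET_KEY = b"controller-held-secret"

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a keyed token (HMAC-SHA256).

    Unlike a plain hash, the token cannot be re-derived without the
    controller's key, yet it is deterministic: the same user maps to the
    same token across data sets, so joins and aggregations still work.
    """
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

# Analysts receive the tokenized record, never the raw identifier.
record = {"user_id": "alice@example.com", "spend": 42.50, "region": "EU-West"}
safe_record = {**record, "user_id": pseudonymize(record["user_id"])}
```

Because the key is held separately by the controller, the analytics team can work on stable tokens while re-identification stays out of its reach.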
At TheMathCompany, steps are taken to ensure that any PII received is minimal and relevant to the analytics engagement. Fortunately, most of our engagements relate to identifying trends and behavior at a demographic level, hence anonymized data forms are usually acceptable. In the odd case where we have received more PII than required, a review meeting is initiated with the customer in order to resolve any liability.
4: Impact on Customer Profiling
Customer profiling processes personal data to automate decisions, direct marketing, and so on, by identifying individual interests, behavior, and preferences; this is only allowed under lawful bases. In summary, Article 22 states that the data subject has the right not to be subject to a decision based solely on automated processing, including profiling.
Furthermore, data scientists will need to ensure predictive models are devoid of indirect or hidden bias and discriminatory training data, as GDPR warns against the use of race, ethnicity, and similar attributes, and against discriminatory outcomes in automated decision-making.
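A first, minimal step in that direction is auditing the feature list before training. The sketch below checks feature names against an illustrative (non-exhaustive) subset of the special categories named in GDPR Article 9; real column names and the full category list would differ per data set.

```python
# Illustrative, non-exhaustive subset of GDPR Article 9 special categories.
SPECIAL_CATEGORIES = {
    "race", "ethnicity", "religion", "political_opinion",
    "health", "biometrics", "sexual_orientation",
}

def audit_features(feature_names):
    """Return feature names that directly match a special category.

    This catches only direct use; hidden proxies (e.g. a postcode that
    correlates with ethnicity) still require statistical bias testing.
    """
    return sorted(name for name in feature_names if name in SPECIAL_CATEGORIES)

flagged = audit_features(["age", "region", "avg_spend", "religion"])  # ["religion"]
```

Direct matching is only the easy half of the problem; the harder half, detecting proxy variables, needs fairness metrics computed on model outputs.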
End users can withdraw consent at any given point in time under GDPR. This implies:
- Businesses need analytics stacks that dynamically clarify user consent before each run to avoid a breach.
- Data collected for one purpose cannot be re-used for a different purpose, without user consent. For example, a retailer that collects user addresses for shipping purpose, cannot use it to profile users for re-targeting or marketing purposes.
It helps for businesses to set up a system that can easily track user consent and readily give users access to their personal data on request – after verifying the requester’s identity, of course, lest the response itself become a data breach.
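The two bullet points above boil down to one rule: consent is bound to a (user, purpose) pair and must be checked at processing time. Here is a minimal in-memory sketch of such a registry; the class and purpose names are hypothetical, and a real system would persist grants with timestamps for auditability.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRegistry:
    """Minimal in-memory consent store keyed by (user, purpose).

    A production system would persist every grant and withdrawal with a
    timestamp so each processing decision is auditable; this sketch only
    shows the check itself.
    """
    _grants: dict = field(default_factory=dict)

    def grant(self, user_id: str, purpose: str) -> None:
        self._grants[(user_id, purpose)] = True

    def withdraw(self, user_id: str, purpose: str) -> None:
        self._grants[(user_id, purpose)] = False

    def allowed(self, user_id: str, purpose: str) -> bool:
        # Default to False: no recorded consent means no processing.
        return self._grants.get((user_id, purpose), False)

registry = ConsentRegistry()
registry.grant("u42", "shipping")

# Consent is purpose-bound: shipping consent does not cover marketing.
can_ship = registry.allowed("u42", "shipping")      # True
can_market = registry.allowed("u42", "marketing")   # False
```

An analytics pipeline would call `allowed()` for the relevant purpose before each run, so a withdrawal made yesterday is honored today.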
5: Security Measures
Encryption is one of the lowest-hanging fruits businesses can harness to strengthen security. Given its ease and cost-efficiency of implementation, encryption has been strongly recommended by data regulation bodies time and again. Pseudonymization’s fine-grained data access also contributes towards preserving user privacy.
Conducting periodic data protection audits and assigning a DPO (data protection officer) to oversee compliance should be part of the data security strategy; under Article 37 of GDPR, a DPO is mandatory for organizations whose core activities involve large-scale processing or monitoring of personal data.
Security can no longer be regarded as an afterthought – something bolted on retrospectively after collecting data. It needs to be designed in from the start. Where sensitive data such as health records, biometric data, political stance, religion, or race is in question, GDPR enforcers rightly advise businesses to take a data-protection-by-design-and-default approach. Businesses will also need to put in place a system to monitor for breaches and notify the relevant parties within 72 hours of becoming aware of one.
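The 72-hour window is a hard deadline from the moment of awareness (GDPR Article 33), so incident tooling should compute it explicitly rather than leave it to memory. A trivial sketch, with an illustrative detection timestamp:

```python
from datetime import datetime, timedelta, timezone

# GDPR Article 33: notify the supervisory authority within 72 hours of
# becoming aware of a personal data breach.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(aware_at: datetime) -> datetime:
    """Latest moment at which the breach notification must be sent."""
    return aware_at + NOTIFICATION_WINDOW

aware = datetime(2019, 12, 1, 9, 0, tzinfo=timezone.utc)  # illustrative
deadline = notification_deadline(aware)  # 2019-12-04 09:00 UTC
```

Pinning the clock to UTC avoids ambiguity when the incident team and the supervisory authority sit in different time zones.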
What Does the Future Hold?
A study by McDermott-Ponemon covering Europe and the US reveals that businesses spend an average of $13 million annually on GDPR compliance to deflect heftier potential fines from non-compliance. Tech giants like Amazon.com Inc. have grown cognizant of recent regulatory developments and privacy concerns – so much so that end users can now ask Alexa for food recommendations as well as for previous voice commands to be wiped, without having to trawl complex privacy options on the website. Social media giant Facebook, which earned a whopping $16.6 billion from targeted ad sales in the post-GDPR era (i.e. Q4 2018), released a lengthy statement on its vision of privacy-first social networking, hoping to calm its data-scandal notoriety.
It’s a different story for mid- and small-sized businesses, which do not always have adequate resources or know-how to observe compliance. Many of these businesses outsource the technical aspects of compliance to third-party experts with the right skills. Meanwhile, business insurance is cashing in on the fear of potential GDPR fines and walking away with a good chunk of the pie.
It is said that “regulation is the mother of invention!” The question is what else GDPR will give rise to in the future.