RISKY BUSINESS: Technologies Requiring a Data Protection Impact Assessment (DPIA) under the GDPR

Under the European Union’s General Data Protection Regulation (GDPR), a Data Protection Impact Assessment (DPIA) is mandatory for data processing “likely to result in a high risk to the rights and freedoms of data subjects.” Failing to conduct one is itself a breach of the GDPR, punishable under Article 83(4) by fines of up to €10 million or 2% of worldwide annual turnover, whichever is higher. Whether an organization is required to comply with the GDPR at all is beyond the scope of this article, but if your organization processes any of the “risky” categories of Personal Data listed in the table below about individuals in the EU or UK, now is the time to find out.

Personal Data is broadly defined (Article 4(1)) as any information relating to an identified or identifiable natural person, that is, one who can be identified, directly or indirectly, by reference to an identifier such as a name, an identification number, location data, or an online identifier, or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural, or social identity of that person.

How can an organization determine whether to incur the expense of conducting a DPIA? Under Article 35(3) of the GDPR, three types of processing always require one: 1) systematic and extensive profiling that produces legal or similarly significant effects; 2) large-scale processing of special categories of (sensitive) data; and 3) systematic, large-scale monitoring of publicly accessible areas. Beyond those bare-bones requirements, national supervisory authorities within the EU (and the UK’s Information Commissioner’s Office under the UK GDPR) have published guidelines for identifying other “high risk” processing; the ten categories in the table below are drawn from that guidance, and a simple screening check is sketched after the table.

Processing Requiring a DPIA (with Examples)

1. Innovative Tech
   - Machine learning, deep learning
   - Artificial intelligence (AI)
   - Autonomous vehicles
   - Intelligent transportation systems
   - Wearables and other smart technologies
   - Some IoT applications (e.g., chatbots)

2. Denial of Service
   - Credit checks
   - Mortgage and insurance applications
   - Pre-check processes for purchases (e.g., smartphones)

3. Large-Scale Profiling
   - Smart meters and other IoT applications
   - Hardware/software for fitness or lifestyle monitoring
   - Social media networks
   - Adding AI to existing processes
   - COVID contact tracing

4. Biometric Data
   - Facial recognition
   - Workplace access systems
   - Access control for hardware/applications (e.g., voiceprint, fingerprint, facial recognition)

5. Genetic Data
   - Medical diagnosis
   - DNA testing
   - Medical research

6. Data Matching
   - Fraud prevention
   - Direct marketing
   - Monitoring the personal use of benefits

7. Invisible Processing
   - List brokering
   - Direct marketing
   - Online tracking by third parties
   - Online advertising
   - Data aggregation

8. Tracking
   - Social networks
   - Software applications (e.g., Strava, Google Maps)
   - Hardware/software for fitness/lifestyle monitoring
   - IoT devices, applications, and platforms
   - Online advertising
   - Web and cross-device tracking
   - Data aggregation
   - Eye tracking (e.g., online exam proctoring)
   - Work-from-home tracking
   - Workplace keylogging software
   - Loyalty schemes
   - Wealth profiling

9. Targeting Children or Vulnerable Individuals
   - Internet-connected toys
   - Social networks
   - Mobile applications

10. Risk of Physical Harm
    - Whistleblowing/complaint procedures
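Taken together, the Article 35(3) triggers and the high-risk categories above amount to a screening checklist that can be applied during an internal data-mapping exercise. The following is a minimal sketch in Python; the flag names are hypothetical labels invented for illustration, not an official GDPR or supervisory-authority taxonomy, and a negative result is not a substitute for legal advice.

    # Hypothetical DPIA screening checklist. The flag names are illustrative
    # labels assigned during data mapping; they are not an official taxonomy.

    # Article 35(3): processing that always requires a DPIA.
    ARTICLE_35_3_TRIGGERS = {
        "systematic_profiling_significant_effects",
        "large_scale_special_category_data",
        "large_scale_public_monitoring",
    }

    # High-risk categories adapted from the table above.
    HIGH_RISK_INDICATORS = {
        "innovative_technology", "denial_of_service", "large_scale_profiling",
        "biometric_data", "genetic_data", "data_matching",
        "invisible_processing", "tracking", "targets_children_or_vulnerable",
        "risk_of_physical_harm",
    }

    def dpia_required(flags: set[str]) -> bool:
        """Conservatively flag a DPIA if any trigger or indicator applies."""
        return bool(flags & (ARTICLE_35_3_TRIGGERS | HIGH_RISK_INDICATORS))

    # Example: a wearable that profiles fitness data at scale.
    activity = {"innovative_technology", "large_scale_profiling", "tracking"}
    print(dpia_required(activity))  # True -> conduct a DPIA before processing

In practice, such a check would feed a documented DPIA process rather than replace one; its value lies in forcing every processing activity to be inventoried and labeled.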

The purpose of highlighting so many “high risk” activities is to demonstrate that technologies we accept as important, modern, and even necessary all depend on Personal Data processing. A 2010 discussion on MetaFilter may have coined the phrase “if you’re not paying for something, you’re not the customer; you’re the product being sold.” That is still true of Google and Facebook, but as a corollary, we also pay for our technology by ceding rights to our personal information.

The enactment of data protection and privacy compliance obligations, ranging from Massachusetts’ 2009 regulation (201 CMR 17.00) to the GDPR to the California Consumer Privacy Act (CCPA, effective January 1, 2020), signals a palpable concern that self-regulation of personal information is ineffective. The highly intrusive SolarWinds hack of some of the most secure IT infrastructure in the world reminds us that security will never be 100% effective. If a careless password policy at a single vendor can open tens of thousands of secure systems to malicious activity, it is all the more important to encrypt, segregate, or otherwise harden Personal Data. The general intent of the privacy compliance obligations is clear enough, but the added DPIA requirement for high-risk Personal Data, and the fines for non-compliance, should help prioritize risk-management decisions. Categories of Personal Data that trigger a DPIA warrant a greater level of protection, especially inside IT systems that might be part of the next SolarWinds-scale attack.
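As one concrete hardening measure, high-risk fields can be encrypted at rest so that a compromised database or backup yields only ciphertext. Below is a minimal sketch using the widely available Python cryptography package; the voiceprint field is a hypothetical example, and a real deployment would keep the key in a hardware security module or managed key store rather than generating it in application code.

    # Minimal sketch: encrypting one high-risk Personal Data field at rest
    # with symmetric (Fernet) encryption from the "cryptography" package.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # illustration only; store real keys in a KMS/HSM
    fernet = Fernet(key)

    # Hypothetical biometric identifier, one of the DPIA-triggering categories.
    voiceprint_id = b"subject-7f3a:voiceprint-template-v2"

    token = fernet.encrypt(voiceprint_id)  # ciphertext safe to store or back up
    restored = fernet.decrypt(token)       # decrypt only at the point of use

    assert restored == voiceprint_id

Segregating such fields into a separately keyed store further limits the blast radius if any one system is breached, which is precisely the lesson of SolarWinds.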