For the past few years, a “perfect storm” of data privacy challenges has been brewing. To start, data protection laws such as the GDPR and CCPA are forcing organisations to safeguard consumer data. Complicating matters is the growing prevalence of technologies that let individuals connect and communicate (e-mail, instant messaging, chat, social networks and so on), generating an ever-increasing amount of personal information by and about individuals.
Between companies looking to exploit this data for financial gain and governments that may use it to illegally surveil their citizens, a whole host of thorny ethical questions is emerging. Furthermore, attacks from hackers and criminals seeking personally identifiable information (PII) have surged in recent years: in 2020 alone there were 3,950 confirmed data breaches, and the real number is probably much higher.
Privacy-enhancing technologies (PETs) refer to a broad range of hardware and software solutions that embody fundamental data protection principles by minimising personal data use and maximising data security. At the same time, these tools are designed to enable value extraction from data, unleashing its full commercial, scientific and social potential, while preserving privacy and ensuring compliance with regulatory standards.
PETs can work in a variety of ways but, generally speaking, they use encryption and data obfuscation techniques to provide anonymity, pseudonymity, unlinkability and unobservability of data subjects. Technologies encompassed under this umbrella term include:
- Homomorphic Encryption – an encryption scheme that allows computations to be performed directly on ciphertext, producing an encrypted result that, once decrypted, matches the result of the same operations performed on the plaintext.
- Secure Multi-Party Computation – a cryptographic protocol in which a number of distinct, yet connected, computing devices (or parties) carry out a joint computation of some function while preserving certain security properties in the face of adversarial behaviour.
- Federated Learning – a decentralised form of machine learning (ML) in which models are trained locally on millions of edge devices, such as mobile phones; only model updates, not the raw personal data, are sent back for aggregation, thereby preserving privacy.
- Zero-Knowledge Proofs (ZKPs) – a cryptographic protocol that allows one party to prove to another that a statement is true without revealing any information beyond the validity of the statement itself.
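To make the homomorphic idea concrete, here is a minimal sketch using unpadded (“textbook”) RSA, which happens to be multiplicatively homomorphic. This is for intuition only: the tiny parameters and the scheme itself are our own illustrative choices, not a production HE scheme (real systems use constructions such as Paillier, BFV or CKKS), and unpadded RSA is insecure in practice.

```python
# Toy demonstration of the homomorphic property via textbook RSA.
# Tiny illustrative parameters -- NOT secure, NOT a real HE scheme.
p, q = 61, 53          # small primes for illustration
n = p * q              # RSA modulus (3233)
e, d = 17, 2753        # public/private exponents: e*d = 1 mod phi(n)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

# Multiplying two ciphertexts yields a ciphertext of the product:
# decrypting the result gives 6 * 7 without ever decrypting the inputs.
c = (encrypt(6) * encrypt(7)) % n
print(decrypt(c))  # 42
```

The key point is the last two lines: the computation (here, multiplication) happens entirely on encrypted values.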
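A common building block of secure multi-party computation is additive secret sharing. The sketch below (names and parameters are our own, not from any library) splits a secret into random shares that individually reveal nothing, yet allow parties to jointly compute a sum:

```python
import secrets

# Toy additive secret sharing over a prime field -- an illustrative
# MPC building block, not a complete protocol.
P = 2**61 - 1  # Mersenne prime used as the field modulus

def share(secret, n_parties=3):
    """Split `secret` into n additive shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Each party holds one share of each input; adding shares pairwise
# lets the parties compute x + y without anyone seeing x or y.
x_shares = share(123)
y_shares = share(456)
sum_shares = [(a + b) % P for a, b in zip(x_shares, y_shares)]
print(reconstruct(sum_shares))  # 579
</imports-placeholder>
```

No single party ever holds enough information to recover either input; only the combined shares of the result reveal the answer.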
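The federated averaging idea can also be sketched in a few lines. In this toy (our own construction, loosely in the spirit of FedAvg), each client fits a one-parameter linear model on its private data and sends only the fitted weight, never the raw data, to a server that averages the weights; real frameworks add secure aggregation, client sampling and many training rounds.

```python
# Toy federated averaging: clients train locally, the server only
# ever sees model weights, not the underlying data.

def local_fit(data):
    """Least-squares slope through the origin for one client's data."""
    num = sum(x * y for x, y in data)
    den = sum(x * x for x, _ in data)
    return num / den

clients = [
    [(1, 2.1), (2, 3.9)],   # client A's private data (roughly y = 2x)
    [(1, 1.9), (3, 6.3)],   # client B's private data
]
local_weights = [local_fit(d) for d in clients]     # computed on-device
global_w = sum(local_weights) / len(local_weights)  # server-side average
print(round(global_w, 2))  # 2.03
```

The server learns a useful global model (a slope near 2) while each client's raw (x, y) pairs stay on the device.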
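Finally, the zero-knowledge idea can be illustrated with an interactive Schnorr-style proof of knowledge of a discrete logarithm. The parameters below are deliberately tiny and the whole exchange is simulated in one process; this is a sketch of the protocol's shape, not a usable implementation.

```python
import secrets

# Toy Schnorr protocol: the prover convinces the verifier that it
# knows x with y = g^x (mod p) without revealing x itself.
p, q, g = 23, 11, 2   # tiny group: g = 2 has prime order q = 11 mod 23

x = 7                 # prover's secret
y = pow(g, x, p)      # public value both sides know

r = secrets.randbelow(q)
t = pow(g, r, p)            # 1. prover sends commitment t
c = secrets.randbelow(q)    # 2. verifier sends random challenge c
s = (r + c * x) % q         # 3. prover sends response s

# Verifier accepts iff g^s == t * y^c (mod p); the transcript (t, c, s)
# reveals nothing about x beyond the fact that the prover knows it.
print(pow(g, s, p) == (t * pow(y, c, p)) % p)  # True
```

Real ZKP systems build on the same commit-challenge-response pattern, often made non-interactive via the Fiat-Shamir transform.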
What are the downsides of PETs?
Though PETs are growing in popularity, they are not without their downsides. To start, they can be incredibly complex to implement and manage. Because many of them demand vast computational capacity, PETs can also be expensive to run and energy-intensive. Encryption, for example, can substantially inflate data size (ciphertext expansion), which can cause major bandwidth problems.
Last but not least, given the complexity of these tools, they can be difficult to govern and to build regulatory standards around. As a result, measuring their effectiveness is difficult if not impossible, leading some to question the value of PETs in general. In fact, a handful of watchdog groups have speculated that PETs may even have the opposite effect. As Elizabeth Renieris of the Ada Lovelace Institute explained in a recent article, Why PETs (privacy-enhancing technologies) may not always be our friends:
“In practice, PETs are complex, expensive and resource intensive, making them hard to implement and prone to user error. And despite their benefits, the use of PETs can further consolidate power and control in the hands of those who already have too much of both, typically those with the resources to exploit them. PETs can also create a false sense of safety and security, and thereby incentivise and legitimate activities or practices that we might otherwise find objectionable.”