The benefits of pet ownership are often understated. Our furry (and not so furry) friends are understood to lower stress levels and provide comfort. Whilst not as tactile (and far less slobbery), the same might be said of software and hardware solutions that (*deep breath*) achieve specific privacy or data protection functionality or protect against risks to the privacy of an individual or a group of natural persons. This is how the EU Agency for Cybersecurity (ENISA) defines privacy-enhancing technologies, a.k.a. PETs.

PETs have become a hot topic in recent years. That’s because they enable adopters to satisfy the insatiable appetite for access to, and valuable insights derived from, personal data, but without compromising the privacy of the individuals to whom those data relate. Their adoption is, however, not without risk.

ENISA has been championing PET adoption for the better part of a decade, and has made various resources available. The push for data-driven innovation on these shores has led to the development of other resources, such as the Centre for Data Ethics and Innovation’s useful PET Adoption Guide. And now, as part of its updated draft guidance on anonymisation, pseudonymisation and privacy enhancing technologies – which has been drip-fed chapter by chapter for over a year – the ICO is having its say.

The latest instalment of that tome, Chapter 5: Privacy-enhancing technologies, has been published and aims to provide readers (the target audience being DPOs and those charged with data protection in larger organisations) with practical detail to help them apply PETs.

The chapter is divided into 2 parts. The first addresses how PETs can help achieve data protection compliance. It does so by breaking the topic down into a number of questions. We’ve listed these below, along with our own answers which summarise the guidance:

How do PETs relate to data protection law?

They help demonstrate a data protection by design and by default approach.

What are the benefits of PETs?

They help reduce risk to data subjects by enabling analysis of personal data without necessarily sharing it.

What are the risks of using PETs?

Some lack maturity (in terms of scalability, standards and security); implementation can require significant expertise; lack of appropriate controls can increase risk.

What are the different types of PETs?

There are 3 broad categories. Those that:

  1. Reduce the identifiability of individuals by weakening or breaking the connection between the original personal data and the derived data. Examples include differential privacy and synthetic data.
  2. Focus on hiding or shielding data. Examples include homomorphic encryption and zero-knowledge proofs.
  3. Split datasets or control access to certain parts of the data. Examples include trusted execution environments, secure multi-party computation and federated learning.

Are PETs anonymisation techniques?

Not all PETs result in effective anonymisation, but they can play a role in achieving it.

When should we consider using PETs?

Consider them at the design stage. PETs are particularly suitable where there is large-scale collection and analysis of personal data (e.g. in AI, IoT and cloud computing contexts).

How should we decide whether or not to use PETs?

Perform a DPIA. Pay particular attention to the state of the art (i.e. is the PET sufficiently mature for your purposes?).

How do we determine the maturity of a PET?

Use Technology Readiness Levels or other models such as ENISA’s PETs maturity assessment. Consider industry standards – the ICO has created a useful list of those standards along with their known weaknesses.

The second part outlines some PETs and summarises their benefits, risks and implementation considerations. The PETs covered, along with the summary description of each (which we’ve cut and pasted from the guidance), are:

Homomorphic encryption

Provides strong security and confidentiality by enabling computations on encrypted data without first decrypting it.
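For the technically curious, the core trick can be sketched with a toy, textbook Paillier-style additively homomorphic scheme. This is our own illustration, not something the guidance prescribes, and the tiny primes make it wildly insecure – real deployments use dedicated libraries and large keys:

```python
import math
import random

def keygen(p, q):
    # Toy key generation from two small primes (real keys use very large primes).
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)  # with generator g = n + 1, L(g^lam mod n^2) reduces to lam
    return n, lam, mu

def encrypt(n, m):
    # Paillier encryption: c = (1 + m*n) * r^n mod n^2 for random r coprime to n.
    n2 = n * n
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (1 + m * n) * pow(r, n, n2) % n2

def decrypt(n, lam, mu, c):
    # L(c^lam mod n^2) * mu mod n recovers the plaintext.
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

n, lam, mu = keygen(17, 19)
c1, c2 = encrypt(n, 5), encrypt(n, 7)
c_sum = c1 * c2 % (n * n)          # multiplying ciphertexts...
dec = decrypt(n, lam, mu, c_sum)   # ...adds the plaintexts: recovers 12 == 5 + 7
```

The point to take away is that the party doing the arithmetic never sees 5, 7 or 12 in the clear.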

Secure multiparty computation

Provides data minimisation and security by allowing different parties to jointly perform processing on their combined data, without any party needing to share all of its data with each of the other parties.
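One of the simplest building blocks here is additive secret sharing, which we can sketch in a few lines (again, an illustrative toy of ours rather than anything from the guidance): three parties learn the sum of their private values without any party revealing its own.

```python
import random

P = 2**61 - 1  # a prime modulus; all arithmetic is mod P

def share(secret, n_parties):
    # Split a secret into n random-looking shares that sum to it mod P.
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

values = [12, 7, 30]            # each party's private input
n = len(values)
# Each party splits its value into n shares, sending one share to each party.
all_shares = [share(v, n) for v in values]
# Party i locally sums the shares it received (column i); each partial sum
# reveals nothing about any individual input.
partial = [sum(all_shares[j][i] for j in range(n)) % P for i in range(n)]
# Combining the partial sums reveals only the total, not the inputs.
total = sum(partial) % P        # 49
```

Real MPC protocols handle multiplication, malicious parties and dropouts, but the data-minimisation intuition is the same.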

Federated learning

Trains machine learning models in distributed settings while minimising the amount of personal data shared with each party.

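A minimal sketch of the idea, assuming the classic federated averaging approach (our own toy example with a one-parameter model, not the ICO's): each client trains locally on its own data, and only the updated model parameter – never the data – is sent back and averaged.

```python
import random

def local_train(w, data, lr=0.01, steps=50):
    # Gradient descent on this client's local data for the model y = w * x.
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_avg(w, client_datasets, rounds=20):
    for _ in range(rounds):
        # Each client trains locally; only the updated parameter leaves the client.
        local = [local_train(w, d) for d in client_datasets]
        w = sum(local) / len(local)
    return w

random.seed(0)
TRUE_W = 3.0
clients = [
    [(x, TRUE_W * x + random.gauss(0, 0.1))
     for x in (random.uniform(-1, 1) for _ in range(20))]
    for _ in range(3)
]
w = federated_avg(0.0, clients)  # converges close to TRUE_W
```

In practice model updates can still leak information, which is why federated learning is often combined with other PETs such as differential privacy or secure aggregation.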

Trusted execution environments

Provide enhanced security by enabling processing by a secure part of a computer processor, which is isolated from the main operating system and other applications.

Zero-knowledge proofs

Provide data minimisation by enabling an individual to prove private information about themselves without revealing what it actually is.
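The flavour of this can be shown with a toy Schnorr-style proof of knowledge of a discrete logarithm – our illustration, with deliberately tiny (insecure) parameters: the prover convinces the verifier it knows the secret x behind a public value y, without revealing x.

```python
import random

# Toy parameters: g generates the order-q subgroup mod p (insecure sizes).
p, q, g = 23, 11, 2
x = 7                        # prover's secret
y = pow(g, x, p)             # public value: y = g^x mod p

random.seed(0)
k = random.randrange(q)      # prover's one-time nonce
t = pow(g, k, p)             # commitment sent to the verifier
c = random.randrange(q)      # verifier's random challenge
s = (k + c * x) % q          # prover's response

# Verifier checks g^s == t * y^c (mod p) without ever learning x.
valid = pow(g, s, p) == t * pow(y, c, p) % p
```

Production zero-knowledge systems are vastly more sophisticated, but the "prove it without showing it" principle is exactly this.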

Differential privacy

Generates anonymous statistics by adding noise to individual records.
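The standard mechanism here is easy to sketch (our illustrative toy, not the ICO's): a counting query has sensitivity 1, so adding Laplace noise with scale 1/ε to the true count gives an ε-differentially-private answer.

```python
import random

def dp_count(records, predicate, epsilon):
    # Laplace mechanism: a counting query has sensitivity 1,
    # so the noise is drawn from Laplace(0, 1/epsilon).
    true_count = sum(1 for r in records if predicate(r))
    scale = 1.0 / epsilon
    # The difference of two i.i.d. exponentials is Laplace-distributed.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

random.seed(42)
ages = [23, 35, 41, 29, 52, 61, 33, 47]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=1.0)  # true count is 4
```

Lower ε means more noise and stronger privacy; the art is in budgeting ε across all the queries you intend to run.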

Synthetic data

Provides realistic datasets in environments where access to large real datasets is not possible.
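At its crudest, synthetic data generation can be sketched as fitting a distribution to the real data and sampling fresh records from it. The toy below (ours, with made-up example figures) fits independent Gaussians per column – note it deliberately throws away cross-column correlations, which serious generators work hard to preserve:

```python
import random
import statistics

def synthesize(real_rows, n):
    # Toy approach: fit an independent Gaussian to each column, then sample.
    # Preserves per-column means and spreads but NOT correlations between columns.
    cols = list(zip(*real_rows))
    params = [(statistics.mean(c), statistics.stdev(c)) for c in cols]
    return [tuple(random.gauss(m, s) for m, s in params) for _ in range(n)]

random.seed(1)
# Hypothetical (height, weight) records, purely for illustration.
real = [(170.2, 68.1), (158.7, 54.3), (181.4, 80.9), (165.0, 61.2)]
synth = synthesize(real, 100)
mean_height = sum(r[0] for r in synth) / len(synth)  # close to the real mean (~168.8)
```

Even good synthetic data is not automatically anonymous – outliers in the real data can still leave fingerprints – which is one reason the ICO treats it as a PET rather than a silver bullet.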

The ICO is at great pains to emphasise that this isn’t a comprehensive list and that the plan is to keep it updated as new PETs come online.

Meanwhile, the proliferation of PETs coupled with the need to balance innovation with privacy (after all, it shouldn’t be a zero-sum game) means that this new guidance will doubtless be well-received by organisations looking to decide whether the time is right for them to provide a PET with a loving home.  

The ICO’s consultation on the guidance closes on 16 September 2022.