EXPOSING INDIVIDUALS TO HARM AND DISCRIMINATION

1.4 Legitimate Interests

The Government propose to create a limited, exhaustive list of legitimate interests for which organisations can use personal data without applying the balancing test.

The balancing test is the thin veil that divides responsible uses from abuses. Removing it would subvert the nature of “legitimate interest” and effectively undermine the essential principles of legality. Furthermore, it would normalise uses of data that result in harm and discrimination to individuals.

Impact on Migrants

The legitimate interests of “reporting criminal acts or other safeguarding concerns” as well as “ensuring that records of individuals are accurate and up to date” would allow unprecedented freedom to collect personal data about migrants for reasons connected with immigration control.

It is also worth noting that the Government envisages updating this list via regulation-making powers: in essence, this would give them the power to write the rules they need to legitimise the legal shortcomings of the Home Office (see also Section 1.3: Further Processing).

Impact on Workers

The legitimate interests of “reporting criminal acts or other safeguarding concerns” would make it easier to legitimise workers’ surveillance in a manner that is disproportionate to the fulfilment of the employment relationship.

Furthermore, the legitimate interests of “ensuring that records of individuals are accurate and up to date”, “internal research and development purposes, or business innovation purposes aimed at improving services for customers”, and “monitoring, detecting or correcting bias in relation to developing AI systems” would allow unprecedented freedom to gig employers to use workers’ data for the management of their platforms (see also Section 1.3: Further Processing).

Impact on BAME, LGBTQIA+, other vulnerable groups

The legitimate interests of “internal research and development purposes, or business innovation purposes aimed at improving services for customers”, and “monitoring, detecting or correcting bias in relation to developing AI systems” would allow unprecedented freedom to sell data to third parties, as well as to reuse data referring to sensitive characteristics under the guise of “de-biasing AI”.

Impact on NHS patients and medical data

The legitimate interest of “monitoring, detecting or correcting bias in relation to developing AI systems” would open the floodgates for the re-use of medical data to train and develop commercial AI applications under the guise of “de-biasing AI”. For instance:

Google DeepMind is a company that develops commercial AI applications. In 2016, they collected health records of 1.6 million NHS patients to train their AI models. The ICO found that the parties did not use this data legally. Under Government proposals, this data could be used under the guise of “de-biasing AI”.

Our answer to section 1.4: Legitimate Interests

In our answer to Q1.4.1, we explain how:

  • Scrapping the balancing test would subvert its very nature, and undermine the essential principles of legality.
  • Scrapping the balancing test would expose individuals to harms and data rights abuses.
  • It is false that over-reliance on consent is a product of regulatory uncertainty around legitimate interest. Consent fatigue is a widespread malpractice employed intentionally by malicious organisations.

In our answer to Q1.4.2, we explain how:

  • The Government list of activities that would be exempted from the balancing test is exceptionally broad.
  • The Government proposal undermines data rights in practice.
  • The Government would be given the power to unilaterally write legal bases that fit their own purposes.

Q1.4.1. To what extent do you agree with the proposal to create a limited, exhaustive list of legitimate interests for which organisations can use personal data without applying the balancing test?

We strongly disagree “with the proposal to create a limited, exhaustive list of legitimate interests for which organisations can use personal data without applying the balancing test” (Q1.4.1), and we urge the Government to withdraw this proposal.

The balancing test is the thin veil that divides responsible uses from abuses: scrapping it fundamentally subverts the nature of legitimate interest by legalising abuses and data uses that harm individuals. We also stress that:

  • Necessity and proportionality tests do not provide sufficient protection to individuals without the balancing test.
  • This proposal undermines the principle of legality.
  • This proposal is based on unsubstantiated, false premises.

Finally, we stress that legitimate interest is among the most abused lawful grounds for processing, as organisations routinely refuse to carry out the balancing test or downplay the risks for individuals. Rather than expanding its use, the Government should consider imposing a duty on organisations to make their Legitimate Interest Assessments publicly available.

Q1.4.1a. Please explain your answer, and provide supporting evidence where possible.

Scrapping the balancing test is an extraordinarily unsound proposal. It fundamentally subverts the nature of legitimate interest by providing a lawful basis for data uses that harm individuals or otherwise violate their rights.

Legitimate interest differs from the other lawful bases because, in principle, it allows any use of personal data without obtaining consent from the individuals concerned or meeting any other safeguarding requirement. However, organisations relying on this ground need to check that their envisaged use of data does not, in practice, harm individuals or override their rights.

The balancing test is the thin veil that divides responsible uses from abuses, and must be retained. Doing otherwise would, in principle, allow every data use to be lawful, regardless of the harmful or discriminatory impact it may have on individuals.

Necessity and proportionality are no replacement for the balancing test

The fact that, under these proposals, “The processing would still have to be necessary for the stated purposes and proportionate” in no way provides an adequate safeguard against abuses. First of all, where the balancing test asks whether a given activity is permissible at all, necessity and proportionality ask in what manner an activity may be carried out. In other words, the question shifts from “if” to “how”. Further, necessity and proportionality assessments inherently present grey areas that are ill-suited to the open-ended nature of legitimate interest.

This proposal undermines the principle of legality

Lawful grounds are meant to provide suitable safeguards against data-driven abuses or discrimination. For instance,

  • The lawful ground of “consent” (Article 6(1)a of the UK GDPR) ensures that individuals have been informed and have freely chosen to consent to the use of data. It also ensures that they can withdraw their consent at any time.
  • The lawful ground of contract (Article 6(1)b of the UK GDPR) prevents organisations that are in a contractual relationship with the individual from using his or her data for reasons that are not necessary for the fulfilment of the contract. In other words, it protects individuals from unexpected and abusive uses of their data.

Legitimate interest is an exceptionally flexible ground in that it can be used for any reasonable purpose. It also requires that a balancing test be conducted, to check that the rights and freedoms of individuals are not overridden.

Normally, this requires organisations relying on legitimate interest to implement transparency and accountability measures that allow individuals to know about the data use and exercise their rights. It also requires organisations to check that the use of data does not result in harmful or discriminatory outcomes.

On the other hand, scrapping the balancing test would allow any data use that can be linked to the activities listed by the Government to take place, regardless of the existence of suitable safeguards and the potential harms this may cause to individuals.

This proposal is based on unsubstantiated, false premises

This proposal is even more concerning because of the radically flawed reasoning that underpins it. Indeed, the Government starts from the premise, expressed at §57, that regulatory uncertainty pushes businesses to over-rely on consent, thus subjecting individuals to consent fatigue. This premise is demonstrably false.

Firstly, legitimate interest is one of the most abused lawful grounds for processing.1 This suggests that the Government should require organisations to make their Legitimate Interest Assessments publicly available, rather than remove the requirement to conduct them.

Secondly, the assumption that industry over-relies on consent because of uncertainty over which legal basis to adopt is untenable and clearly at odds with the available evidence. Rather, irresponsible and malicious businesses deliberately foster consent fatigue to extort consent from individuals who would otherwise refuse it.

For instance, in the adtech field:

  • Consumers have consistently expressed their preference not to consent to, or to opt out of, online tracking, direct marketing, and other privacy-invasive practices.2
  • The data-driven industry relies on various manipulative and deceptive techniques, also known as dark patterns, which have been thoroughly documented and exposed over the years.3
  • Corporate lobbying has long opposed the introduction of legally binding signals that would allow Internet users to set their privacy preferences once, persistently, and in a user-friendly manner, thus resolving consent fatigue and restoring consumers’ agency.4
  • Another example is Facebook’s unseemly reaction to Apple’s implementation of a feature that, similarly to legally binding signals, allows iOS users to set their preferences via software by answering a clear yes or no question.5

It is clear that irresponsible and malicious organisations fabricate consent fatigue, and that they are interested not in legal certainty but in loopholes to leverage and exploit. The Government proposals around legitimate interest would likely provide them with the perfect regulatory loophole for their malicious purposes.

Thirdly, an appropriate Government reaction in the face of deliberate and assiduous attempts to dodge legal requirements and violate individuals’ privacy and data protection rights would be to crack down on offenders and re-establish legal order. Contrary to this expectation, the proposal to remove the balancing test and subvert the nature of legitimate interest effectively legalises these abuses and gives offenders the privilege of impunity.

Q1.4.2. To what extent do you agree with the suggested list of activities where the legitimate interests balancing test would not be required?

We strongly disagree with “the suggested list of activities where the legitimate interests balancing test would not be required” (Q1.4.2), as it is unacceptable not to require the balancing test for any activity conducted under the lawful basis of legitimate interest (see answer to Q1.4.1).

Furthermore, we provide evidence of how the Government proposed list of legitimate interests would:

  • lead to egregious harms and privacy violations inflicted on individuals
  • normalise other harms associated with the use of data
  • enable the Government to unilaterally write rules that accommodate their own data uses

Q1.4.2a. Please explain your answer, indicating whether and why you would remove any activities listed above or add further activities to this list.

The list of suggested “interests” that the Government provides shows just how easy it is for anyone to claim a legitimate interest and, on this basis, disregard harms and violations of privacy rights.

We provide evidence of the potential to harm individuals for each activity the Government proposed.

Egregious harms and privacy violations enabled by the Government proposal

“Using personal data for internal research and development purposes, or business innovation purposes aimed at improving services for customers”.

This interest is particularly representative of the arbitrary nature of this proposal, as “to improve our services” is the single most abused buzzword that irresponsible and malicious organisations rely on.

For instance, Bounty UK was fined £400,000 by the ICO for selling 14 million records of mothers and children to advertisers and data brokers.6 In particular, the ICO found that Bounty UK did not obtain valid consent from the data subjects, nor could it claim a legitimate interest, because of the lack of transparency and the impact on the rights of the individuals concerned.7

Scrapping the balancing test would have allowed Bounty UK to sell mothers’ and children’s records under the legitimate interest of “improving services for customers”. Lack of transparency, unfairness, and harm to the individuals involved would no longer be assessed, and selling mothers’ and children’s data would become lawful.

“Managing or maintaining a database to ensure that records of individuals are accurate and up to date, and to avoid unnecessary duplication”.

As it reads, this means that anyone who has, for any reason, stored a record regarding a given individual would be allowed to collect further information about that individual with no strings attached. No justification for persistently tracking someone’s activities would be needed, insofar as doing so serves the function of keeping records “accurate and up to date”.

In other words, this ground would provide a licence to spy on individuals with impunity, in particular to organisations such as:

  • Data brokers, whose business is to collect and sell personal data to insurers, employers, or essential service providers for reference checks.
  • Digital advertisers and adtech intermediaries, whose business is to leverage individuals’ vulnerabilities, needs and desires to sell them products.
  • eCommerce stores, whose business would benefit from the ability to know how individuals’ habits and needs evolve over time.

In turn, this has the potential to exacerbate risks for individuals’ rights in virtually any setting where their data is used to make decisions about them.

“Reporting of criminal acts or safeguarding concerns to appropriate authorities”.

This interest is incredibly problematic because it enables surveillance and privacy breaches in any instance where someone can claim to have a “safeguarding interest”.

For instance, in a recent judicial case in the UK, some individuals installed cameras and other surveillance devices to harass their neighbours. The judge held that this was done in contravention of UK data protection law.8 The Government, however, would give “malicious neighbours” a lawful ground to “report criminal acts and other safeguarding concerns”, regardless of whether this overrides their neighbours’ rights to privacy and quiet enjoyment.

Furthermore, public and private bodies would have an almost unconditional right to share the personal data they store with authorities or other individuals who claim to have a “safeguarding interest”. This would have an unprecedented impact on UK residents, particularly minorities, migrants, political activists, and other vulnerable individuals who may rightly fear persecution or abuse from law enforcement.

“Monitoring, detecting or correcting bias in relation to developing AI systems”.

Regardless of the purported intentions, this interest is another clause that will likely open the floodgates to abuses.

For instance, in 2016 Google DeepMind hit the headlines for signing an illegal data-sharing deal with the Royal Free NHS Trust and collecting over 1.6 million medical records without NHS patients’ knowledge or consent.9 The issue was investigated by the Information Commissioner’s Office, which found that Google DeepMind “failed to comply with data protection law”.10 Recently, a UK law firm announced that it is bringing a collective legal action against Google.11 Within this context, it has been noted how:

“We do not know—and have no power to find out—what Google and DeepMind are really doing with NHS patient data, […] Once our data makes its way onto Google-controlled servers, our ability to track that data—to understand how and why decisions are made about us—is at an end”.12

However, this clause would allow companies like Google to collect personal and even sensitive data under the pretext of “detecting and correcting bias”. Once collected, it would be exceptionally difficult for the individuals concerned to hold these corporations to account.

The Government proposal normalises further harms

Other legitimate interest grounds provided by the Government are even more difficult to justify, as they turn otherwise uncontroversial activities into a threat to the rights and freedoms of individuals. For instance:

  • “Delivering statutory public communications and public health and safety messages by non-public bodies” is easily justified “for the performance of a task carried out in the public interest” under Article 6(1)e of the UK GDPR. However, allowing any private body to deliver “public communications and public health and safety messages” in the absence of a public interest provides a ground that is unnecessary and open to abuse.
  • “Improving or reviewing an organisation’s system or network security”, as well as “improving the safety of a product or service that the organisation provides or delivers”, are also activities that can easily pass the balancing test under the lawful ground of legitimate interest (Article 6(1)f of the UK GDPR), or be carried out under the legal duty to implement appropriate security measures (Articles 6(1)c and 32 of the UK GDPR). However, removing the balancing test for these activities legitimises insecure, unreliable and disproportionate approaches to achieving this goal.
  • “De-identifying personal data through pseudonymisation or anonymisation to improve data security” is already an obligation under the UK GDPR. Thus it can easily be carried out in compliance “with a legal obligation”, for instance under Article 6(1)c of the UK GDPR.
  • “Using audience measurement cookies or similar technologies to improve web pages that are frequently visited by service users” could be enabled by aligning ICO regulatory guidance with the EU’s, which already allows audience measurement that is not carried out with third-party services or other means that pose high risks for individuals.13 However, scrapping the balancing test would legitimise the use of privacy-invasive audience measurement solutions that track individuals across different websites, such as Google Analytics.

The Government would gain the power to unilaterally write rules that accommodate their needs

Another reason for concern is that the Government envisages that this list “could be updated via a regulation-making power”. For instance, the legitimate interest of “Reporting of criminal acts or safeguarding concerns” appears to be designed specifically to enable controversial Government programmes such as:

  • the deployment of Live Facial Recognition14
  • the expansion of the National Fraud Initiative15
  • invasive biometric surveillance of migrants.16

In other words, the Government would effectively be able to design the laws that are supposed to keep their power in check. The fact that public and private bodies alike would be able to piggyback on the vast discretionary areas carved out by the Government’s regulation-making power is but an aggravating factor.

1Bits of Freedom, A Loophole in Data Processing. Available at: https://www.bitsoffreedom.nl/wp-content/uploads/20121211_onderzoek_legitimate-interests-def.pdf

2Norwegian Consumer Council, Out of Control. Available at: https://www.forbrukerradet.no/undersokelse/no-undersokelsekategori/report-out-of-control/

3Norwegian Consumer Council, Dark Patterns. Available at: https://www.forbrukerradet.no/dark-patterns/

4Corporate Europe Observatory, Shutting down ePrivacy: lobby bandwagon targets Council. Available at: https://corporateeurope.org/en/power-lobbies/2018/06/shutting-down-eprivacy-lobby-bandwagon-targets-council?hash=keZ3nidpfZbeVQAaYUYoSTDAvjXndSjmZurmIQLAQeY

5The Wall Street Journal, Facebook Meets Apple in Clash of the Tech Titans—‘We Need to Inflict Pain’. Available at: https://www.wsj.com/articles/facebook-meets-apple-in-clash-of-the-tech-titanswe-need-to-inflict-pain-11613192406

6ICO, Bounty UK fined £400,000 for sharing personal data unlawfully. Available at: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2019/04/bounty-uk-fined-400-000-for-sharing-personal-data-unlawfully/

7ICO, Monetary Penalty Notice to Bounty (UK) Limited. Available at: https://ico.org.uk/media/action-weve-taken/mpns/2614757/bounty-mpn-20190412.pdf

8BBC, Neighbour wins privacy row over smart doorbell and cameras. Available at: https://www.bbc.co.uk/news/technology-58911296

9The Register, Google AI gains access to 1.2m confidential NHS patient records. Available at: https://www.theregister.com/2016/04/29/google_given_access_to_reams_of_confidential_patient_information/

10The Register, Google DeepMind trial failed to comply with data protection – ICO. Available at: https://www.theregister.com/2017/07/03/google_deepmind_trial_failed_to_comply_with_data_protection_law/

11The Register, Brit law firm files suit against Google and Deepmind over use of hospital patients’ data. Available at: https://www.theregister.com/2021/09/30/royal_free_deepmind_representative_action_uk/

12National Library of Medicine, Google DeepMind and healthcare in an age of algorithms. Available at: https://pubmed.ncbi.nlm.nih.gov/29308344/

13Working Party Article 29, Opinion 04/2012 on Cookie Consent Exemption. Available at: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2012/wp194_en.pdf

14Digital Freedom Fund, A Landmark Victory Against Police Use of Facial Recognition. Available at: https://digitalfreedomfund.org/a-landmark-victory-against-police-use-of-facial-recognition/

15Open Rights Group, NFI proposals seek to give police more powers. Available at: https://www.openrightsgroup.org/blog/nfi-proposals-seek-to-give-police-more-powers/

16See https://stopthescan.co.uk/