“Data: a new direction” Consultation Guidance

We are asking organisations that promote equality and combat discrimination to respond to the Government’s consultation.

This page explains the vital relationship between data protection and the prevention of unfair and discriminatory outcomes, so that you can respond to the consultation more easily.

Within each topic below, we link to more detailed information and model answers for the questions in the consultation. In the coming days, we will also provide model short responses for different sectors.

The advice on these pages focuses on the topics we believe are most important to organisations working on equality and anti-discrimination issues.

What the Government are proposing

Government plans to remove the GDPR are an unprecedented assault on the rights and freedoms of UK residents.

The GDPR protects individuals from harmful or discriminatory uses of their data. In the digital age, as more public and private services are delivered by profiling personal data through automation and “Artificial Intelligence”, data is central to whether outcomes are fair.

As a result, the GDPR is rapidly becoming a key tool for organisations seeking equal outcomes and the prevention of discrimination, at precisely the moment when technology is fuelling opportunities to exclude people and reproduce prejudice at scale.

What the Government are proposing, instead, is to give organisations the freedom to use personal data even when it harms our mental health, fuels our addictions, violates our employment rights, or discriminates against minorities, because they want unfettered digital “innovation”.

Simultaneously, our rights to know about, choose or challenge harmful or discriminatory uses of data are treated as an annoyance to get rid of. Victims’ access to remedies will be reduced, and offenders will benefit from grey areas and easy-to-game rules.

In the following, we expand on how the Government’s proposals would:

  1. undermine the right to data protection, enabling harm and discrimination in: the use of health data for commercial and exclusionary purposes; Government fraud detection and immigration control; credit scoring and access to essential services; digital marketing; and surveillance.
  2. remove remedies against abuses, disempowering: workers from challenging abuses at work; students from challenging unfair algorithmic grading; and civil society and journalists from investigating and exposing abuses.
  3. shield offenders from enforcement, protecting: offenders from outside scrutiny; toxic, exclusionary and discriminatory business practices from enforcement; and the Government from independent oversight.

Undermining data protection leads to discrimination

The GDPR protects individuals from harmful or discriminatory uses of their data. It does so by imposing a duty on organisations to use our data in a lawful, transparent and fair way.

The DCMS proposal, instead, frames human dignity as an obstacle in the way of innovation. Organisations are given leeway to harm and discriminate under the guise of “unleashing the power of data across the economy”.

Allowing personal data to be reused for commercial purposes under the guise of research

Section 1.2 of the DCMS consultation: Research Purposes. Our draft response is here.

We would expect scientific research to be trustworthy, ethical, and mindful of the concerns of the individuals being researched. The DCMS proposal goes in the opposite direction: it would twist the meaning of the research provisions to allow personal data to be reused for commercial purposes.

For instance, “scientific research” could be used as a disguise for feeding medical data to AI models and developing health tech products. In the United States, this led to rampant discrimination against Black individuals: algorithms designed to reduce hospitalisation costs penalised historically disadvantaged groups.
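To illustrate the mechanism, here is a minimal, hypothetical sketch (the groups, numbers and scoring rule are invented; this is not the actual US system): a programme ranks patients by expected cost as a proxy for health need, and because one group’s care has historically been underfunded, that group is systematically under-prioritised even when illness is identical.

```python
# Hypothetical sketch: using cost as a proxy for health need penalises a group
# whose care was historically underfunded. All numbers are invented.
import random

random.seed(0)

def make_patient(group):
    illness = random.uniform(0, 1)              # true health need, identical across groups
    spend_rate = 1.0 if group == "A" else 0.6   # group B historically receives less care
    cost = illness * spend_rate + random.gauss(0, 0.05)
    return {"group": group, "illness": illness, "cost": cost}

patients = [make_patient(g) for g in ("A", "B") for _ in range(5000)]

# The "algorithm": rank by cost and enrol the top 20% in a high-risk care programme.
patients.sort(key=lambda p: p["cost"], reverse=True)
enrolled = patients[: len(patients) // 5]

for g in ("A", "B"):
    members = [p for p in enrolled if p["group"] == g]
    share = len(members) / len(enrolled)
    avg_illness = sum(p["illness"] for p in members) / len(members)
    print(f"group {g}: {share:.0%} of places, average illness of enrolled {avg_illness:.2f}")
```

Because cost stands in for need, group B patients must be markedly sicker than group A patients to reach the same score, so they receive far fewer programme places despite identical rates of illness.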

EXAMPLE – new legal ground for research and commercialisation of health data

Google DeepMind is a company that develops commercial AI applications. In 2016, it collected the health records of 1.6 million NHS patients to train its AI models. The ICO found that the parties did not use this data lawfully. A UK-based law firm is now bringing a representative action on behalf of NHS patients over the breach of UK data protection law.

However, the Government would establish a new legal basis for research that risks allowing NHS medical data to be shared with private corporations without patients’ knowledge or consent.

Enabling data to be reused to harm and discriminate

Section 1.3 of the DCMS consultation: Further processing. Our draft response is here.

Organisations and other third parties would be allowed to sell and reuse personal data more freely. Individuals would be exposed to harmful or exclusionary practices in commercial offers, the provision of services, and other necessities of life.

Furthermore, the Government would be allowed to pass new laws and reuse data freely “in the substantial public interest”, without suitable safeguards. The existing list of “substantial public interests” ranges from “statutory and Government purposes” to “standards of behaviour in sport”, and it could be expanded indefinitely.

This widens the scope for abuse: Government systems could exclude or discriminate unfairly, without any need to apply safeguards against abuses or to ensure that processing is “fair”.

EXAMPLE 1 – easing the compatibility test and data brokers’ reuse of data

The ICO found that data collected by Credit Reference Agencies (CRAs) for “statutory credit referencing” was later resold to advertisers for direct marketing. Advertisers used this information to exclude individuals from commercial offers based on their creditworthiness or other “undesired” characteristics revealed by credit referencing data. Data brokers and CRAs stopped these practices following the ICO investigation, and an enforcement notice was issued against Experian for its failure to comply.

Government plans could end up legitimising this underhanded reuse of data by easing the compatibility test. In turn, this would affect one’s ability to enter into, or be offered, a variety of contracts, in particular:

  • tenancy contracts;
  • gas, electricity, internet or mobile phone contracts;
  • loans, mortgages and credit cards.

Credit reference data has been known to perpetuate race- and class-based discrimination. It can also lead to the exclusion of individuals who have suffered hardship or been victims of fraud.

EXAMPLE 2 – reuse of data for a substantial public interest and Government powers

Under the National Fraud Initiative, the Cabinet Office has been matching datasets from public records (death records, benefit claimant lists, employment and pension lists, credit reference data, immigration datasets) to detect fraud.

Liberalising the further use of data for reasons of “substantial public interest” would give the Government unprecedented power to seize records and data stored by public or private organisations and trawl them for “suspect” activities (see the sketch after the list below). Migrants are likely to be disproportionately affected by such a regime, as they routinely hand over bank statements, utility bills and other documents to prove their right to work, to reside, or to rent in the UK.

This data would then be used to subject individuals to:

  • police investigations, arrests and prosecutions;
  • evictions;
  • employment dismissals;
  • suspension of benefits payments;
  • fines.

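As a rough sketch of how this kind of bulk data matching works (the datasets, field names and matching rule below are invented for illustration; this is not the actual National Fraud Initiative system), records from unrelated sources are joined on a shared identifier, and any overlap is flagged as “suspect” regardless of whether anything is actually wrong:

```python
# Hypothetical sketch of bulk data matching across unrelated datasets.
# Identifiers and field names are invented for illustration.
benefit_claims = [
    {"nino": "QQ123456A", "name": "A. Example", "status": "claiming"},
    {"nino": "QQ654321B", "name": "B. Example", "status": "claiming"},
]
payroll_records = [
    {"nino": "QQ123456A", "employer": "Acme Ltd"},
]

# Join on National Insurance number: a claimant who also appears on a payroll
# becomes a "data match" to be investigated, whether or not any fraud occurred.
payroll_by_nino = {record["nino"]: record for record in payroll_records}
for claim in benefit_claims:
    match = payroll_by_nino.get(claim["nino"])
    if match:
        print(f"FLAG: {claim['nino']} claims benefits and is on {match['employer']}'s payroll")
```

The point is that the flag is generated mechanically from the join alone: the more datasets the Government is allowed to requisition and cross-reference, the more individuals, and especially migrants who must hand over extensive documentation, are exposed to such flags.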
Giving organisations a licence to harm and discriminate

Section 1.4 of the DCMS consultation: Legitimate Interests. Our draft response is here.

A new legal basis would provide a loophole for using personal data for a broad range of activities, spanning from “Reporting criminal acts and safeguarding concerns” to “improving customer services”. Under this new clause, organisations could claim a “legitimate interest” even where doing so causes discrimination, violations of individuals’ rights, or other adverse consequences.

EXAMPLE 1 – improving customer services

Bounty UK was fined £400,000 by the ICO for selling 14 million records of mothers and children to advertisers and data brokers. Central to the ICO decision was the fact that Bounty could not claim a legitimate interest, because selling this data without the individuals’ knowledge or consent violated their privacy and harmed them.

Under this new legal ground, Bounty UK could sell mothers’ and children’s records in the legitimate interest of “improving services for customers”. In other words, they would be given a licence to profile mothers and children and exploit this data for commercial purposes without the knowledge or consent of those affected.

EXAMPLE 2 – reporting criminal acts or safeguarding concerns

In a recent court case in the UK, individuals had installed cameras and other surveillance devices to harass their neighbours. The judge held that this was done in contravention of UK data protection law. The Government, however, would introduce a lawful ground to “report criminal acts and other safeguarding concerns”, regardless of whether this overrides someone’s right to privacy.

Beyond empowering “malicious neighbours” to disturb the quiet enjoyment of those around them, data recorded in this way could be shared with the police to identify suspects, or participants in a strike or street rally, legalising a practice that was used in the United States to persecute political protesters from the Black Lives Matter movement.

Removing remedies against abuses and discrimination

The GDPR empowers individuals to know whether and how an organisation is using their data and for what reason, as well as to delete, correct, or object to the use of this data.

The DCMS proposal, instead, treats our rights to know, choose and complain about how our data is used as an annoyance to get rid of.

Removing the right to human review of life-changing automated decisions

Section 1.5 of the DCMS consultation: AI and Machine Learning. Our draft response is here.

Scrapping Article 22 of the UK GDPR would shift the burden of actively scrutinising automated life-changing decisions from organisations to individuals, who have no access to or control over these systems. Automated decisions tend to adversely impact those who are already discriminated against, by amplifying discrimination already present in society.

EXAMPLE – how Article 22 has been used in practice

Imposing fees to exercise the right of access

Section 2.3 of the DCMS consultation: Subject Access Requests. Our draft response is here.

Subject access requests have been an invaluable tool for promoting accountability and enabling individuals to challenge decisions or data uses that discriminated against or harmed them. Imposing fees for subject access requests would disempower individuals, and allow irresponsible or malicious organisations to profit by collecting fees from their own victims.

EXAMPLE – paying fees benefits crooks

Restricting the right to lodge a complaint

Section 5.6 of the DCMS consultation: Complaints. Our draft response is here.

Victims would be required to try to resolve violations with offenders before complaining to the ICO. This would shift scrutiny from organisations to complainants and how they lodged their complaints.

EXAMPLE – restrictions on the right to lodge a complaint

The complexity and opacity of digital ecosystems, such as adtech, do not always make it possible to identify an organisation to complain against. Imposing a duty to contact the offender before lodging a complaint would effectively deny recourse against dodgy organisations.

Furthermore, an irresponsible or malicious organisation may exploit the individual’s duty to contact them first to:

  • sway and manipulate complainants;
  • bully, intimidate or threaten complainants;
  • shred evidence of malpractice.

This would disproportionately affect vulnerable groups and individuals from disadvantaged backgrounds.

Shielding data offenders from enforcement

Accountability and independent oversight are essential prerequisites of the right to data protection, as they make it possible to hold offenders accountable and to promote compliance with the law.

The DCMS proposal, however, would compromise both accountability and the effectiveness and independence of the ICO. This would make it easier for organisations to discriminate, and more difficult for victims or the ICO to audit these practices. The ICO would also be controlled by the Government, undermining its ability to hold the public sector to account for its failures.

Scrapping the accountability framework, and replacing it with easy-to-game privacy management programmes

Section 2.2 of the DCMS consultation: Reform of the accountability framework. Our draft response is here.

Under the UK GDPR, organisations must perform a range of flexible, risk-based duties to demonstrate compliance, such as conducting Data Protection Impact Assessments. With privacy management programmes, organisations would be free to decide how to demonstrate compliance. This would deprive individuals of useful grounds on which to expose and challenge harmful or discriminatory practices.

EXAMPLE – privacy management programmes

In the examples above, offenders would be able to claim that they implemented a “privacy management programme which includes the appropriate policies and processes for the protection of personal information”, or that they relied on unsubstantiated “risk assessment tools for the identification, assessment and mitigation of privacy risks across the organisation”. Rather than asking organisations to demonstrate that they conducted “due diligence”, the burden of proving that these token measures are inadequate would fall on the individuals being discriminated against.

Tasking the ICO with a new duty to have regard to growth and innovation

Section 5.2 of the DCMS consultation: Strategy, Objectives, and Duties. Our draft response is here.

The relationship between toxic business practices and harmful uses of data is well established: data-driven businesses are amassing an ever-growing amount of information about our habits and daily interactions to draw profiles of our desires and characteristics. This information is then leveraged to treat us differently, enabling discrimination.

It follows that the ICO cannot be expected to prioritise the profits of offenders over the rights of victims. This would amount to an incitement to abuse, and it would entrench harmful business practices instead of promoting better ones.

EXAMPLE – adtech resisting enforcement and change with economic arguments

Claiming that law enforcement or regulation will hamper economic growth is the single most abused scarecrow of businesses that do not wish to abide by the law.

It is worth remembering that data-driven advertising is routinely used to discriminate against and harm individuals based on their race, gender, sexual orientation, health conditions, political opinions and religion. Facebook, for instance, allowed advertisers to exclude women or Black individuals from housing and job advertisements. Unsurprisingly, both the United States and the European Union are discussing a general ban on these illegal and harmful practices.

Under the Government’s plans, however, the ICO would likely have to tolerate the ensuing discrimination and harm instead.

Empowering the Government to dictate the priorities of the ICO and amend the Commissioner’s salary without Parliamentary approval

Sections 5.2 and 5.3 of the DCMS consultation: Strategy, Objectives, and Duties, and Governance Model and Leadership. Our draft response is here.

The role of a watchdog is incompatible with taking orders from the Government it must supervise. Further, empowering the Government to amend the Commissioner’s salary at will would create an obvious potential for retaliation and a chilling effect.

EXAMPLE – the Government is a repeat offender

The UK Government was the offender in many of the examples mentioned above (the racist visa-streaming algorithm, A-level grading, dystopian Live Facial Recognition, and the Royal Free NHS Trust and Google DeepMind).

This move clearly follows a pattern in which the Government tries to undermine judicial review or rewrite the judgments it does not agree with. Needless to say, none of these proposals is compatible with a democratic country governed by the rule of law, and such a standard is no more acceptable when it comes to Government abuses in the use of UK residents’ data.