Automated Hiring and Firing: How the Data Act will harm gig workers

The Data Use and Access (DUA) Act, which was finally passed last week, could particularly impact the estimated 4.4 million people who work for ‘gig’ economy platforms in the UK. Classed as self-employed, they are not protected by employment law and instead rely on data protection law to defend their basic rights at work. The DUA Act brings in changes to subject access requests and automated decision-making that favour businesses but further undermine the rights of gig workers. They are just one of the groups that will be harmed by the weakening of our data protection rights under this new law.

Waiting for your Data
The Impact of Extended Waiting Times for SARs

Workers for companies like Uber or Deliveroo have a contract, also known as a supplier agreement, which can be terminated at any time without notice or reason, and with no right to appeal. This lack of information leaves little room for workers to challenge unfair dismissals. Workers have therefore relied on Subject Access Requests (SARs) to obtain information from companies to help them understand why they have been fired.

Increasing the barriers to obtaining your personal data and allowing companies to adopt obstructive or delaying tactics significantly impacts these low-wage, precarious workers, leaving them to fight on their own to access their data and understand why their supplier agreement might have been terminated.

So the Labour government is throwing precarious workers under the bus to ‘ease burdens on businesses’, but is it really the right thing to do? And how is the economy supposed to grow if workers are denied job security and rights in the workplace?

Companies may take longer to respond to Subject Access Requests, and they may not respond in full:

We have the right to submit Subject Access Requests (SARs) to government bodies, organisations and companies to find out what data they hold about us. Under the UK GDPR, they had to respond within a month. Once the DUA Act takes effect, organisations will be able to ‘stop the clock’ and delay these responses, for instance if they struggle to identify the individual making the request. The DUA Act will also restrict the scope of access requests to ‘a reasonable and proportionate search for the personal data’.

These changes, which are supposed to ease the administrative burden on organisations, will effectively disempower individuals: if an organisation holds too much data about you, or struggles to find that data because of poor record management, it is your right of access that is restricted and their malpractice that is rewarded.

Fired with a Click
The Potential Impact of Automated Decision-Making on ‘gig’ economy workers

The new law makes it easier for companies to use automated decision-making (ADM), and this is likely to have a negative effect on workers in the ‘gig’ economy. App-based couriers and private hire drivers working for platform companies are subject to algorithmic management: their work is assigned, regulated, penalised and disciplined by algorithms with little direct human interaction. The use of ADM has caused numerous disputes, especially in relation to the deactivation of app-based workers’ accounts. In 2021, an Amsterdam court ruled that while the termination of four Uber drivers’ accounts had involved some human supervision, the earlier suspension of their accounts during an investigation period had been fully automated, in violation of the right not to be subject to ADM under Article 22 of the GDPR. Uber had to compensate the workers. A further Amsterdam court ruling in 2023 found that Uber did not provide enough explanation about the human involvement in its decision-making, nor did it contact the workers whose accounts had been terminated or otherwise give them the opportunity to express their point of view.

The use of ADM to deactivate accounts has also caused grievances in the UK. Trade unions have organised protests and publicly demanded greater transparency around deactivation processes. In 2021, the Independent Workers’ Union of Great Britain (IWGB) filed a claim against Uber at the Central London Employment Tribunal on behalf of drivers and couriers, alleging that the company’s automated facial-verification software could not reliably identify people with darker skin. Drivers were dismissed on the basis of failed verification attempts, without any human review of the data. At the time, this campaign was supported by a cross-party group of 60 MPs who signed an Early Day Motion calling for a fair termination process for app-based couriers and private hire drivers.

The DUA Act states that ADM can be used if the data subject is informed about the decision, can make representations, and can obtain human review of the outcome or contest it. Will this mean more or fewer rights for ‘gig’ workers?

The examples above give reason to believe that exposing workers to ADM against their will, and without their agency, will worsen workers’ rights, and that companies will continue to make decisions like these. The burden of proof falls on individuals rather than on the company. A simple glitch in the app, or a phone with no battery, might lead to an automated decision with serious consequences, such as deactivation or dismissal. In such cases, workers will be left to challenge these opaque decisions on their own, and might find themselves without a job or income while contesting a wrongful automated decision.

It’s easier for companies to use automated decision-making:

The DUA Act will also remove the right not to be subject to automated decision-making in most circumstances. ADM is defined as a decision taken with no meaningful human involvement. It will now be permitted by default, provided special category data is not processed, i.e. data about health, sexual orientation, political or religious views, trade union membership, etc. However, the risks associated with automated systems do not necessarily depend on the use of this data, but on the purpose for which the systems are used. An ADM system meant to measure workers’ performance will inherently expose workers to the risk of being fired, regardless of whether special category data was used to make the decision to fire them.

The DUA Act, fortunately, will not remove a number of ‘safeguards’ currently provided by the UK GDPR. For instance, organisations will still be obliged to provide data subjects with information about decisions taken in relation to them; to enable them to make representations about such decisions; to enable them to obtain human intervention on the part of the controller; and to enable them to contest such decisions.

However, these safeguards will no longer be underpinned by our right to choose not to be subject to ADM in the first place, and the DUA Act will give the government powers to arbitrarily change or restrict any of these safeguards via Statutory Instruments. This not only reduces the agency and defences workers have against ADM, but also paves the way for their rights to be further watered down without meaningful parliamentary scrutiny or public debate.

Automated Hiring
Efficiency vs. Fairness

ADM has been hailed as a game changer in recruitment for its cost-cutting, time-saving and resource efficiency. For example, AI can be used to screen CVs and application letters to assess candidates, saving organisations time. But this raises questions about fairness and transparency: biased data used to train AI recruitment systems can harm individuals, particularly those from marginalised groups, who may be disproportionately affected by such algorithms.

This simple dynamic reinforces the need for a strong, rights-based framework to favour the responsible development and use of these new technologies: when workers are empowered to choose or refuse to be subject to ADM and similar tools, it falls to employers to earn their trust and demonstrate that these systems are secure and in the best interests of both parties. However, as the Labour government proceeds to gut the right not to be subject to ADM, workers are disempowered against employers’ decisions to use these tools irresponsibly.

Will the ICO step up for workers?

Passing the responsibility for fair working conditions from multi-billion pound companies to workers is clearly unfair. The ICO recently adopted a new ‘AI and biometrics strategy’, announcing that it will work on a new statutory code of practice to, among other things, ‘ensure that automated decision-making (ADM) systems are governed and used in a way that is fair to people, focusing on how they are used in recruitment’.

Unfortunately, the ICO has a poor record when it comes to standing up to government and businesses on behalf of the public. And under the new law, its mission has shifted from focusing on privacy and data protection to also including ‘promoting innovation and competition’. Indeed, at a recent APPG event, Information Commissioner John Edwards defended the government’s proposals in the DUA Act that disempower workers and leave them without a choice against the use of ADM by their employer.

The ICO does, however, still claim to be, and operate as, an independent authority. It will also be required by law to consult relevant stakeholders before issuing a code of practice. It will be up to workers, their trade unions and their representatives to stand up for gig workers, challenge the regressive stance of the Labour government and the ICO, and demand that the ICO addresses the imbalance of power the DUA Act has created.

Hands Off Our Data