Digital Privacy
14 Jan 2026 Jim Killock
Techno-Permacrisis
Musk’s latest venture, image generation in Grok, until Wednesday lacked sufficient guardrails to prevent the easy production of non-consensual sexual images and even child abuse images. In less than a week it provoked an Ofcom investigation, further EU Commission action, and the promise of UK emergency legislation against apps that produce such images.
Failure to Regulate AI
It is hard to explain Musk’s motivations or what he believed he was achieving. Nevertheless, the harms of the concentration of power that Musk represents could be tempered by AI regulation. The European AI Act, which is unfortunately not yet fully in force, will require service providers to understand and mitigate risks. Labour have so far not gone ahead with an AI Bill, under pressure from US AI and cloud providers, who sold them a story that light-touch regulation would encourage UK economic growth. Billions in investment were promised. It is a decision they should now revisit.
The government is now, understandably, advancing specific emergency legislation. Given the difficulty of applying guardrails to AI (for instance, the ease with which users can bypass them by adjusting the language of their prompts), next week’s legislation will be difficult to frame. If badly drafted, it could also hit open-weight AI models.
Emergency legislation may now be unavoidable given such serious problems with Grok, but proactive, risk-based regulation might have prevented this situation from arising in the first place, or at least given regulators the levers to act promptly. A larger, comprehensive bill would give time and space to think these risks through across other issues as well, such as AI’s use in criminal justice, benefits, finance and employment.
Without an AI Bill, AI’s most visible failures in the UK, both now and in the future, should be understood as the predictable outcome of concentrated platform power combined with a decision not to regulate AI risk at a systemic level. Meanwhile, Labour’s push to deliver AI into policing, justice, migration policy, health and benefits is running its own serious risks, without any public discussion of what this means.
A Dependency Problem
There is a wider, uncomfortable question which is not yet central to the political debate. Musk, X and Grok are not some accidental rogue element. They are the extreme edge of an alignment between far right politics, billionaire controlled platform infrastructure, and the US government.
Labour needs to develop a strategy that tackles the effects of that relationship in the UK; but beneath the immediate reaction to Grok, the truth is that the Government is building the power of the tech giants, rather than tackling it.
Palantir have advanced in the NHS, and are now so deeply embedded in the Ministry of Defence that major contracts proceed without competitive bids. Tony Blair’s global promotion of Larry Ellison’s Oracle raises questions about whether the company might be involved in Digital ID. The Government has weakened data protection to favour private sector “innovation” rather than personal rights, and appointed Doug Gurr from Amazon to run the Competition and Markets Authority.
Last week’s Cybersecurity Bill debate raised stinging questions about the risks of Digital ID, and about the lack of any visible plan to ensure that the UK can even operate its own systems without the threat of off switches from foreign providers. It is unclear that the UK has any meaningful plan to encourage domestic provision of IT systems that the government can control, even given the economic benefits of encouraging the UK tech sector.
Whether we look at social media or the government’s relationship with its own technology, the UK is dependent on a small number of mostly US companies that are tied to a destructive form of politics and discourse. In both cases, the core problem is lock-in: users and institutions are trapped, and regulation is forced to operate through the very platforms causing harm. Government should seek to disrupt this abusive relationship by building the alternatives, especially open source alternatives which are not controlled by a single company.
Until we start to tackle this relationship of dependence, democratic erosion and economic extraction, the UK will be doomed to a cycle of techno-permacrisis. Illegal content, online abuse or democratic manipulation will raise calls for regulation. Content regulation will bleed into surveillance and the rollback of encryption technologies. Government will depend on the bad actors – Meta, X, TikTok – to implement content regulation while their business models continue to rely on low-cost, high-response clickbait. Regulation on these lines will not deliver a healthy online environment.
How to Stop the Permacrisis
The missing middle path is structural: creating real protection for customers through regulation and competition measures that let people escape from malign social media, rather than solely attempting to police behaviour within monopolised systems.
- Digital Sovereignty
We set out the first steps towards Digital Sovereignty in our briefing to Parliament. In short: start building up the use of open source software, so government isn’t tied to a single company to maintain and modify the systems it depends on; and commission software services on the principle of ‘Public Money, Public Code’.
- An AI Bill
The government should commit to a comprehensive AI Bill, to mitigate risks of harm to people and their rights across the public and private sectors. This should include a ban on the most harmful AI technologies, such as live facial recognition and technologies that criminalise populations, as well as protection from discrimination in sectors such as employment and finance.
- Social media that is accountable to its users
We set out the steps the UK should take in our recent report, Making Platforms Accountable and Creating Safety. In short: create a competitive market where people can leave any platform without losing who they follow and their followers; move government announcements to Mastodon and Bluesky; and stop funding X and Meta with time and ad money.
The answers exist. We need MPs with political courage to put them into practice. The alternative is techno-permacrisis.