Joint Committee advice cannot fix flawed Online Safety Bill

On 14 December 2021, the Joint Parliamentary Committee charged with scrutinising the draft Online Safety Bill published its report.

In a blog post published that day, ORG director Jim Killock highlighted the fundamental flaw in the government’s approach to tackling online harms. The draft Online Safety Bill attempts to address the symptoms of a dysfunctional system, without addressing its underlying cause: the “attention market”.

Killock frames the ‘attention market’ as essentially a problem of loss of control over personal data – a data protection issue – combined with a lack of choice about how we interact with particular audiences – a competition issue.

Until control and choice are restored to users, the causes of harm online will remain.

The report of the Joint Committee acknowledges that the business model of the most successful social media companies, and the design of their platforms, contribute to the problem that the Bill hopes to solve (paragraphs 36-41). One interesting (and central) recommendation of the committee is that the new law should refer to “regulated activity” rather than “harmful content” (para 68). This would ensure that Ofcom could scrutinise, and issue Codes of Practice for, the algorithmic amplification and frictionless content technology (for example, auto-play on YouTube) that sends unwitting users towards extremist content and disinformation. The committee even goes so far as to itemise certain design features that should be named on the face of the Bill.
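
To make the distinction concrete, here is a deliberately toy Python sketch of the kind of design feature the committee has in mind: a feed ranked purely by predicted engagement, with the top result queued automatically. Every name and weight here is invented for illustration; it is not any platform’s actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    predicted_watch_time: float  # a model's guess, in minutes (invented)
    outrage_score: float         # proxy for strong reactions, 0..1 (invented)

def rank_feed(items: list[Item]) -> list[Item]:
    # The "regulated activity": ordering items purely to maximise
    # engagement, with no weighting for accuracy, harm, or user intent.
    return sorted(
        items,
        key=lambda i: i.predicted_watch_time * (1 + i.outrage_score),
        reverse=True,
    )

def autoplay_next(feed: list[Item]) -> Item:
    # The "frictionless" feature: the next item plays without the user
    # making any deliberate choice.
    return feed[0]

feed = rank_feed([
    Item("measured explainer", 3.0, 0.1),
    Item("incendiary conspiracy clip", 4.0, 0.9),
])
print(autoplay_next(feed).title)  # the incendiary clip wins the ranking
```

Under the committee’s framing, Ofcom would scrutinise design choices like `rank_feed` and `autoplay_next`, rather than adjudicating each individual item of content they happen to surface.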

Unfortunately, the committee does not place a similar emphasis on restoring control of personal data to the user. It does suggest that Ofcom liaise with the Information Commissioner on whether ‘data profiling’ is legal (para 112), but its other recommendations on user data appear more concerned with enabling law enforcement to unmask anonymous users (para 94) than with privacy rights more widely.

The Committee also appears content with the protections offered by the UK GDPR, and says nothing about the fact that the government is currently manoeuvring to water down those protections. This misplaced faith is further undermined by the recently published review of the Human Rights Act, which reveals the government’s antagonism to the Article 8 (privacy) rights of its citizens. The committee recommends a dedicated route through the courts for users to enforce their data rights (paragraph 460), but this will be meaningless if users have no actual rights to enforce.

International co-operation

In a two-paragraph section on ‘international co-operation’, the committee recommends that “Ofcom should have the power on the face of the Bill to share information and to co-operate with international regulators at its discretion.” This is an important recommendation, but one that is perhaps already in effect, given that the Trade and Co-operation Agreement between the EU and the UK includes the establishment of a Committee on Services, Investment and Digital Trade. The first meeting of that committee took place on 11 October 2021, where the parties “exchanged updates on their respective online safety policies” [PDF].

Such co-operation is crucial. The European Union is currently developing its own Digital Services Act, which covers much of the same territory as the Online Safety Bill. If the two regulatory regimes diverge, then the big tech companies will have to choose which set of rules to follow, and they will almost certainly prioritise the EU rules over the UK’s. This could lead to an embarrassing situation where Ofcom is simply ignored, or where certain features, functions and content on social media are simply switched off for UK users. If the British government wants to create a “world leading” regulatory system then it must do the diplomacy to ensure that where the UK leads, other countries follow. Otherwise the “ground breaking” Online Safety Bill will simply break the Internet for British users.

Curtailing the Secretary of State’s Powers

As explained in an earlier post, the draft Bill hands the Secretary of State unprecedented powers to interfere in the work of the regulator. The tone of the Joint Committee report suggests that it considers these powers too wide. However, many of its recommendations would do little to curtail them. For example, the committee recommends (at paragraph 148) that the Secretary of State’s power to designate “priority illegal content” (currently set out at clauses 41(4) and 44 of the Bill) should be used “only in exceptional circumstances.” However, “exceptional circumstances” is in the eye of the beholder and depends on who makes that judgement. It could be argued that the clause as it stands is designed to cater for “exceptional circumstances” already, in which case the recommendation would do nothing to narrow the scope of the ministerial power.

A similar problem arises with the powers to designate otherwise legal content as harmful. Paragraph 180 of the report recommends that the new parliamentary committee scrutinise any new designations before the Secretary of State can proceed to make a statutory instrument. That committee would need to have special regard to freedom of expression. Such an amendment to the clause would be better than nothing… but it would be no substitute for a full parliamentary debate. The DCMS and the Joint Committee both acknowledge that “designation” of any kind of content will result in new curbs on free speech. Such interference with our fundamental rights should be the subject of primary legislation, not a statutory instrument. The Joint Committee could have been much bolder in its recommendations to limit ministerial power.

By contrast, the Joint Committee takes a strong and unequivocal stance with regard to the Secretary of State’s power to modify the Codes of Practice and issue ad hoc guidance to Ofcom. It makes a simple recommendation: that those powers (currently found at clauses 109 and 113) be removed from the draft Bill. This would be a welcome change to the legislation, and would restore some of the independence that a regulator should enjoy.

End-to-End Encryption

The Joint Committee’s report cites a warning from the Information Commissioner’s Office that “E2EE and online safety should not be seen as a false dichotomy” (para 254). However, the report does at times slip into precisely that false dichotomy, suggesting that the benefits of E2EE in protecting individuals’ freedom and privacy require “balancing” against online safety requirements. Such an approach overlooks the fact that in many situations, such as protection from online scams and identity thieves, E2EE is precisely what delivers online safety.

The report also notes that there are no easy solutions to tackling illegal content shared on E2EE channels, because “mature technical solutions are not widely available” (para 254). It recommends designating E2EE as a “risk factor” that technology companies should mitigate. This approach could unnecessarily stigmatise a fundamental technology on which a great deal of commerce and innovation depends.

The Committee also asks the government to provide more clarity on how it wants providers to deal with E2EE (paragraph 257). Unfortunately, this clarity is unlikely to come before the Bill is officially introduced to Parliament next year. The Government’s stated policy is a tautological demand that end-to-end encryption be readable by those who intercept it.
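
The tautology is easy to demonstrate. The sketch below uses the PyNaCl library (`pip install pynacl`) for public-key authenticated encryption; the scenario and variable names are invented for illustration. An intermediary who intercepts the ciphertext but does not hold the recipient’s private key simply cannot read it.

```python
from nacl.public import PrivateKey, Box

# Each party generates a keypair; private keys never leave their devices.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and his public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"my bank details, kept from scammers")

# An interceptor on the network sees only ciphertext bytes. Only Bob,
# holding his private key, can reverse the encryption.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'my bank details, kept from scammers'
```

Any mechanism that let a third party read `ciphertext` without `bob_key` would, by definition, no longer be end-to-end encryption: hence the tautology.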

Anonymity

Political demands to end online anonymity have become a feature of public life. ORG has been a consistent defender of the right to anonymity and pseudonymity, and the arguments are well rehearsed: for many people, it is the only way they can properly exercise their right to freedom of expression, whether that is exploring aspects of their own identity or expressing dissent at the behaviour of people in power. For activists living under dictatorships, anonymity can mean the difference between liberty and imprisonment, life and death.

Thankfully, the government is not proposing to end online anonymity. When the Secretary of State for Digital, Culture, Media and Sport, Nadine Dorries, gave evidence to the Joint Committee, it was clear that she had been persuaded that any attempt to end anonymous comments would be unworkable and illiberal. The report of the Joint Committee also gives short shrift to the idea that people can or should be forced to post content only in their own names.

Instead, it recommends that the Ofcom Code of Practice include a requirement that any process that reveals users’ identities must also uphold a minimum standard of privacy, to protect (for example) LGBTQ+ users and those living in repressive regimes (para 94). While, again, this is welcome, it is not the last word on the matter. Big tech companies are not above abandoning their users’ human rights if it means securing continued access to large markets; and what happens when the law enforcement agency seeking to unmask a whistle-blower is operating closer to home?

Once more, the approach from government (abetted by the Joint Committee) is to propose measures that will significantly encroach on our rights to free speech, privacy and data security… and then bolt on some words of mitigation afterwards. Elsewhere in the report, the committee recommends “safety by design.” It would be better if those drafting the legislation embedded “human rights by design” too.

Clause 11

Perhaps the most significant recommendation of the committee is the modification of clause 11 of the Bill (para 176). This is the provision that imposes a duty on tech companies to protect adult users from harm arising from otherwise legal content. During the committee evidence sessions, the clause was criticised for being too wide: the only way the social media companies could possibly abide by its terms would be to throttle a significant amount of legal speech.

The committee recommends a slightly less onerous duty: to identify and mitigate reasonably foreseeable risks of harm arising from regulated activities. “Reasonably foreseeable” is a phrase that must itself be defined, but it is, at least, a concept that appears elsewhere in the common law. Moderating harmful content will still be a difficult and perhaps insurmountable endeavour, but the committee’s recommendation will at least give that task some boundaries.

Of course, any provision that threatens social media companies with fines (and criminal liability for their executives) if they do not heavily moderate certain types of legal content will necessarily violate the principle that “what is legal offline should be legal online.” A better approach would be to remove the “legal but harmful to adults” duty from the Bill altogether, allowing Ofcom and the social media companies to focus on the task of tackling illegal content and protecting children.

The proposed reforms to clause 11 are emblematic of the Joint Committee’s response to the Bill. It has rejected the demands that the Bill should be even tougher on online speech (for example, banning anonymity) and it has not endorsed the unusually broad powers that the Government wants to put in the hands of the Secretary of State. This is to be applauded.

Nevertheless, the Committee has remained within the conceptual box that the Government has drawn around this issue. It has proposed tweaks and incremental changes, but it has not challenged the wisdom of leaping straight to a system of full state regulation.

As such, it is a tacit endorsement of a system that is likely to lock the state and the technology companies into a hellish marriage of control. When the social media platforms find that they are unable to meet the impossible moderation and technological demands placed on them, the political pressure for the Government (through Ofcom) to “do more” will only grow… and the framework set out in the draft Bill will enable even greater interference into online speech. Government policy priorities will gradually displace the law as the primary means to determine what we can say and do online.
