
Online Safety Act Entrenches Big Tech Market Dominance, New Report Finds
A new report by digital campaigners the Open Rights Group outlines how the Online Safety Act favours Big Tech and harms small websites and the general public.
How to fix the Online Safety Act: A Rights First Approach sets out key recommendations to avoid the potentially harmful outcomes of the sprawling law.
The report’s key findings are:
The OSA disproportionately impacts small sites
Provisions to prevent children accessing certain content will disproportionately impact small and medium-sized businesses, which may be required to conduct a risk assessment even if they are sole traders or run a site on a voluntary basis. These sites will have to check whether they have UK users, perform a risk assessment, and put themselves at risk of fines or personal liability, including prison sentences, should they fail to comply with Ofcom’s directives. Millions of sites could be affected: there are, for example, over 450 million WordPress blogs, many of which allow user-to-user comments. Small sites and services are already closing, including a hamster forum, a local residents’ group, and a video game that had been running for 19 years.
Children’s right to access information could be impacted
The OSA’s provisions impact the not-for-profit Wikipedia, which is taking legal action and seeking a judicial review of the Act’s proposals. Wikipedia meets the threshold of a Category 1 user-to-user service and will therefore be subject to onerous provisions, including an annual risk assessment and potentially having to introduce age verification to prevent children encountering certain content. As the report notes, the worst-case scenario is that Wikipedia is blocked in the UK and the Wikimedia Foundation’s representatives are made legally liable for non-compliance. This would impact the British public’s right to access information, and would harm children’s as well as adults’ Article 13 rights to freedom of expression and access to information.
Big Tech dominance will be furthered by the Act
The OSA and Ofcom’s guidance do not tackle the centralisation of market power in the hands of a few dominant platforms, nor do they attempt to empower users systemically; the online harms approach therefore risks failure on its own terms. The Act’s interventions are acting to further concentrate market power, as it is Big Tech that has the means to comply with its provisions. This will exacerbate the underlying reasons that unwanted content circulates.
The policies to tackle the most extreme kinds of content are unlikely to succeed, and there will be pressure for further action in the field of “legal but harmful” content. ORG calls on the government to concentrate instead on measures that introduce market forces, allowing users to choose platform experiences that favour trust and positive interactions over misdirection and provocation.
Age verification proposals create a new and insufficiently regulated market for handling sensitive data
Ofcom’s consultation discusses a number of methods of age verification that might be “highly effective”, noting that whatever method is adopted must be technically accurate, robust, reliable, and fair. Ofcom leaves the choice of solution to service providers, who seem likely to subcontract third-party vendors to provide age assurance. There are serious privacy consequences in creating a large new market of largely untested services that will handle sensitive data. Open Rights Group believes that service providers should allow users to choose the method and identity provider they use; this would enable users to pick a provider they already trust with their personal data, and create a market driver for companies to compete on security standards.
Jim Killock, Executive Director of Open Rights Group, said:
“The Online Safety Act is a sprawling and broken piece of legislation. This complicated legislation favours Big Tech who have the means and capacity to comply with the Act’s onerous provisions. Meanwhile, small sites and not-for-profits such as Wikipedia are put in an impossible position.
“In trying to tackle the worst of the web we are harming the best of it. Instead of tackling online harms, the Act concentrates the market power of Big Tech companies, exacerbating the underlying reasons that harmful content circulates.”
Report author Dia J Kayyali said:
“As the UK follows in Trump’s footsteps by becoming ever more hostile to the rights of transgender people and less welcoming to immigrants, free expression online – including for LGBTQIA+ kids – is more important than ever. This report lays out the ways the Online Safety Act could harm expression, at a time when it is more crucial than ever.”
Report author Dr Bernard Keenan said:
“The Online Safety Act imposes extensive duties on online services regarding content moderation, amplification, and age verification, while putting a raft of new administrative powers and criminal offences on the statute books. While recognising that the current market in social media and search can and does cause harm to users, we are concerned that the Act’s approach to solving them creates other risks to freedom of expression, privacy, and market structure.
“If the Act encourages platforms to over-moderate legal content, it may result in chilling effects, disproportionate impact on marginalised or persecuted groups, and disproportionate risks on smaller services. The technical risks to privacy created by age verification tools are also highlighted. Successful implementation will depend on ensuring alignment with human rights standards.”