Policy paper

Overview of expected impact of changes to the Online Safety Bill

Updated 18 January 2023

Introduction

1. The Online Safety Bill delivers the government’s manifesto commitment to make the UK the safest place to be online, while defending free expression. The government has continued to consult with stakeholders since introducing the Bill into Parliament and, as a result, a number of policy changes have been made.

2. The Bill as introduced was widely welcomed. However, some aspects caused significant concern amongst parliamentarians, free speech groups and members of the public, who argued that changes were necessary to ensure that free speech was not curtailed. The Digital Secretary announced changes to address these concerns while strengthening protections for children.

3. This document outlines the rationale driving the changes made to the Online Safety Bill during House of Commons passage and provides an initial, qualitative assessment of where changes to costs are expected, and whether those changes are likely to be increases or decreases relative to the estimates in the accompanying final impact assessment (IA),[footnote 1] which was published when the Bill was introduced. As required by the Better Regulation Framework, we will set out our detailed assessment of amendments made to the Bill during its passage through Parliament in an enactment impact assessment, which will be subject to scrutiny by the Regulatory Policy Committee (RPC) and published following Royal Assent.

Overview of changes

Illegal content duties

Rationale

4. Some stakeholders, including the draft Online Safety Bill Joint Committee, expressed concerns that the Bill was overly focussed on content removal rather than safety by design. Others raised concerns that, as previously drafted, the illegal content moderation duties created uncertainty about how providers should determine whether content is illegal, which could result in providers either under-removing illegal content or over-removing legal content.

5. Others raised concerns that the Bill would not adequately tackle activities such as breadcrumbing, where child sexual exploitation and abuse (CSEA) offenders post links or hold conversations on a particular site in preparation for a CSEA offence, which might then occur on a different platform or even offline.

Requirements

6. To ensure that the Bill adequately addresses illegal activity and cross-platform harm, the government has amended the illegal safety duties to require service providers to assess the risk of their services being used for the commission or facilitation of a priority offence.

7. To clarify the ‘illegal content duties’, we have also added provisions establishing how providers should determine whether content amounts to illegal content. These amendments give service providers greater clarity about how to make judgements about content on their service, including whether or not it amounts to illegal content and must be removed.

8. We have also amended the Bill to give Ofcom the power to require a company to use best endeavours to develop and/or source technology to prevent, identify and remove CSEA content on its platform. These amendments will ensure that Ofcom has the explicit power to require platforms to source solutions and/or innovate to tackle online CSEA where necessary and proportionate.

Expected impact

9. Changes to the illegal safety duties give greater clarity to both providers and Ofcom in discharging the existing duties in the Bill, but do not change the government’s original policy intention and are therefore not substantially different from previous requirements in the Bill. Amendments to Ofcom’s CSEA notices are also in line with existing policy, although they have been met with concerns from some tech companies about the privacy and security implications of developing technology to scan for CSEA on private communications. We have included strong safeguards that will help mitigate concerns around this power. A more in-depth assessment will be produced as part of work for the Bill’s enactment impact assessment.

Child safety duties

Rationale

10. There was concern about whether the child safety duties made it explicit enough that providers may need to use measures such as age assurance, and to enforce any existing age restrictions, in order to deliver those duties.

Requirements

11. Platforms in scope of the child safety duties must ensure children are protected from accessing harmful content and activity, and they are required to set out how they will do this in their terms of service. As a result of the changes to strengthen the Bill, it is now an explicit requirement for services to set out in their terms of service any age restrictions for use of the service, and how these will be applied consistently.

12. It has also been clarified that providers may need to use measures such as age assurance to protect child users from harmful content and activity, and to mitigate the wider risks child users face on their service, as part of meeting the child safety duties.

Expected impact

13. Under the previous version of the Bill, the government already expected providers to include details of how they enforce any minimum age restrictions for use of the service, as part of meeting the child safety duties. Many platforms also already specify any age restrictions in their terms of service. As a result, there are no significant changes to the requirements set out previously.

14. The final IA provided an indicative estimate of the likely scale of the cost to business of age assurance of between £17.9 million and £89.6 million (central estimate = £35.8 million) in present value terms across the ten-year appraisal period. The above clarifications are not expected to have significant implications for businesses beyond those already estimated. This indicative assessment will be developed further as part of the Bill’s enactment impact assessment.

Transparency, accountability and freedom of expression measures

Rationale

15. The Bill as introduced caused significant concern amongst parliamentarians, free speech groups and members of the public, who argued that changes were necessary to ensure that legal free speech was not curtailed. Concerns were particularly focused on the government labelling legal content as harmful, and thereby indirectly incentivising companies to remove it.

16. Concerns have also been raised about companies arbitrarily removing content that does not breach their terms of service. Where companies do have terms of service in place, concerns have been raised that these policies are not always properly enforced, for example policies prohibiting abuse and other harmful content. The gap between companies’ terms of service and what they do in practice creates uncertainty for users and reduces trust in those companies. Users should be able to trust companies to keep their promises about how they will treat them and their content.

17. Many adults do not want to see certain types of content. Polling by Ipsos shows that over four in five (84 per cent) adults in the UK are concerned about seeing harmful content, such as racism, misogyny, homophobia and content that encourages self-harm.[footnote 2]

18. Concerns have been raised by press stakeholders, including the Joint Committee, that further protections are required in the Bill to help ensure users’ access to news content is not compromised.

Requirements

19. The adult safety duties (‘legal but harmful’ provisions) have been removed from the Bill and replaced with new transparency, accountability and freedom of expression duties. Category 1 organisations will no longer be required to assess risks from and set terms of service in relation to legal but harmful content and activity accessed by adults. Category 1 organisations will instead be required to set clear terms of service in relation to the restriction or removal of user-generated content, and the suspension or banning of users on grounds related to user-generated content. These terms of service must provide sufficient detail for users to understand what content is and is not permitted on the platform.

20. Category 1 platforms will be required to provide optional user empowerment tools to give users greater control over the content they see. Following the removal from the Bill of the ‘legal but harmful’ adult safety duties and of the Secretary of State’s ability to designate categories of ‘priority harmful content to adults’, companies will instead need to provide these tools for a new list of content categories set out on the face of the Bill. This list covers content that encourages, promotes or provides instructions for suicide, self-harm or eating disorders, and content that is abusive or incites hate on the basis of race, religion, sex, sexual orientation, disability or gender reassignment.

21. The government has made a number of amendments to the Online Safety Bill in relation to content published by a recognised news publisher. This includes introducing a requirement on Category 1 services to notify a recognised news publisher and offer a right of appeal before taking action against its account or content, unless it constitutes a relevant offence under the Bill or the platform would incur criminal or civil liability by hosting said content. Category 1 services and Ofcom are also required to assess the impact the regime has on the availability and treatment of news publisher content and journalistic content.

Expected impact

22. Most platforms will already have some form of terms of service outlining acceptable use, so assessing and updating these terms is potentially a business-as-usual activity. Nevertheless, the final IA provided an indicative estimate of the likely scale of cost to businesses (including Category 1 services) from assessing and updating terms of service to comply with the duties, of between £17.8m and £33.6m (central estimate = £23.1m) in present value terms across the ten-year appraisal period. Category 1 platforms already have terms of service about the content they remove or restrict access to, and the circumstances in which they ban or suspend users. However, the duty to ensure that the terms of service provide sufficient detail for users to understand what content is and is not permitted on the platform is likely to create some additional cost, as in some cases platforms will be required to make their terms of service clearer and more detailed.

23. The final IA also estimated indicative costs from potential additional content moderation required to comply with the duties of between £1,319.1m and £2,486.2m (central estimate = £1,902.6m). Given that the new duties apply to all content which services remove or restrict access to, rather than a defined list of priority harmful content to adults, there are likely to be additional costs to businesses from enforcing these terms of service.

24. It is possible, given that no businesses will be expected to assess the risks of legal but harmful content accessed by adults, that costs could be lower than previously estimated. However, as stated in previous impact assessments,[footnote 3] these differences are expected to be minimal.

25. The user empowerment duties are not substantially different from the previous requirements in the Bill; however, we will set out our detailed assessment of the expected impact in the enactment impact assessment.

26. Given the uncertainties on requirements, the final IA outlined a range of expected costs associated with assessing the impact on freedom of expression and privacy of between £1.1m and £11.5m (central estimate = £2.7m). The cost of the additional requirement to include a section on the availability and treatment of news publisher content and journalistic content is likely to be captured within the upper bound of this range.

27. Some changes to the Bill will have implications for Ofcom’s operating costs, which will be met through an annual industry fee. The final IA provided an estimate of a 10-year profile of Ofcom’s operating costs, and DCMS will work with Ofcom to update this estimate to reflect changes made to the Bill since introduction, which will be included in the updated enactment IA.

Communications offences

Rationale

28. The harmful communications offence was one of three new communications offences included in the Bill, alongside the false and threatening communications offences. Since the Bill’s introduction, stakeholders and members of the public have expressed concerns that the harmful communications offence could have unintended consequences for freedom of expression and could criminalise legal and legitimate speech on the basis that it has caused someone offence.

29. In addition to these offences, the Law Commission recommended that the government introduce a specific offence for sending flashing images to people with epilepsy with the intention of inducing seizures. Although some instances of sending flashing images would be captured by the Offences Against the Person Act 1861, a new offence was needed to address this behaviour specifically. This recommendation followed a public consultation, which included evidence from the Epilepsy Society about the severity of psychological harm caused by epilepsy trolling. The Law Commission concluded that the seriousness of this harm warrants a more serious response than the standalone communications offences provide.

Requirements

30. The harmful communications offence has been removed from the Bill. To retain protections for victims of abuse, the government will no longer repeal elements of the Malicious Communications Act and the Section 127 offences in the Communications Act, which means the criminal law will continue to protect people from harmful communications, including racist, sexist and misogynistic abuse.

31. In response to the Law Commission’s recommendations, the Bill will now create new offences to address the sending or showing of flashing images electronically to people with epilepsy with the intention of causing them harm. These offences carry a potential prison sentence of up to five years.

Expected impact

32. A justice impact test was previously conducted which concluded that the introduction of the harmful communications offence would have a de minimis impact, and therefore the removal of this offence is not likely to have a significant impact on the criminal justice system.

33. New offences relating to flashing images are expected to result in additional costs to law enforcement and the criminal justice system. A justice impact test will be conducted to fully assess these impacts, and the analysis will be incorporated into an updated Online Safety Bill impact assessment. These offences will, however, provide greater protection to victims and the community.