Guidance

Online Safety Bill: government amendments at Lords report stage

Published 30 June 2023

The government tabled a number of amendments to the Online Safety Bill on 29 June, ahead of its House of Lords report stage. These changes will improve the legislative framework established by the Bill and ensure that technology companies are held accountable for keeping internet users, particularly children, safe. The government will table further amendments to achieve these aims later during parliamentary passage; the details of all these amendments are set out below.

These amendments (tabled on or before 29 June) follow extensive debate and engagement with parliamentarians and stakeholders during the Bill’s passage thus far.

Overarching introductory statement

The government is introducing a clause at the start of the Bill in the form of an introductory statement. This is to provide clarity on the general purpose and overall approach of this important and novel legislation.

Age assurance and age verification

The government has proposed a significant package of amendments on age assurance. These amendments will strengthen the protections for children from online pornography by requiring providers who publish or allow this content on their service to use age verification or age estimation to protect children from it.

The Bill will also now set a clear, objective and high bar for effectiveness that these measures need to meet when determining the age of a user. Age verification or age estimation will need to be highly effective in correctly determining whether or not a user is a child.

These government amendments will:

  • require providers which publish pornographic content to use either age verification or age estimation which meets the high bar to prevent children accessing it
  • require tech companies that allow users to share pornographic content on their site to use age verification or age estimation which meets the high bar to prevent children accessing pornographic content, where they identify such content on their service
  • require Ofcom to set out guidance on the kinds of measures that are effective in meeting the new standard
  • require Ofcom to publish a report on in-scope providers’ use of age assurance technology
  • set clear principles for the use of age assurance on the face of the Bill, which Ofcom will be required to consider when producing codes of practice for services which allow user interaction, and guidance for sites which publish pornography. There will also be robust definitions of age assurance, age verification and age estimation
  • require dedicated pornography sites (regulated under Part 5 of the Bill) to publish details of how they are protecting children from pornography
  • clarify that the duties in Part 5 of the Bill apply to all pornographic content that is generated or displayed on Part 5 services. This includes content from provider-controlled AI-powered bots (i.e. generative AI), to address concerns in this area

Finally, in recognition of the vital protections that this regime will bring for children online, the government is setting a clear timetable for the implementation of the child safety duties, illegal content duties and Part 5 duties by obliging Ofcom to produce draft codes of practice for Part 3 and relevant guidance for Part 5 within 18 months of the Bill receiving Royal Assent.

Put priority harms to children on the face of the Bill

The objective of the Online Safety Bill is, above all, to protect children from dangerous content online. This includes material such as pornography, content encouraging or promoting suicide, self-harm or eating disorders, content depicting or encouraging serious violence, or cyberbullying. A government amendment will put the categories of ‘primary priority’ and ‘priority’ content that is harmful to children on the face of the Bill. These categories are set out below.

‘Primary priority’ content that is harmful to children

  • pornographic content
  • content that encourages, promotes or provides instructions for suicide, self-harm, or eating disorders. These categories will capture content that romanticises, glamorises or portrays suicide, self-harm or eating disorders as desirable. They will also capture content that provides methods or advice on practising these harmful activities

‘Priority’ content that is harmful to children

  • content that depicts real or realistic serious violence or injury. This includes depictions of physical fighting where there is serious physical harm to the individuals depicted, serious violence to an animal, and graphic depictions of a serious injury such as an open wound
  • content that encourages, promotes or provides instructions for an act of serious violence. For example, the glorification or glamorisation of violence
  • content which encourages, promotes or provides instructions for a challenge or stunt highly likely to result in serious injury. For example, content encouraging asphyxiation challenges
  • content that encourages the ingestion of, inhalation of or exposure to harmful substances. For example, the promotion of dangerous DIY abortion methods
  • content that is abusive or incites hate on the basis of race, religion, disability, sex, sexual orientation, or gender reassignment. This will include legal forms of racist, homophobic, and sexist abuse
  • bullying, including threats, and degrading or humiliating content

This change makes clear which types of content platforms will need to prevent under-18s from accessing, and which content they will need to protect children from encountering, reducing the risk of age-inappropriate material being accessed. It will also allow the implementation of the Bill to progress more quickly.

Bereaved parents’ / coroners’ access to data

New government amendments will enable Ofcom to require information relating to a child’s social media use if requested to do so by a coroner, and to produce expert reports for a coroner on request. The changes will make clear that Ofcom can share information with coroners without the consent of the businesses concerned. Furthermore, the amendments will require Category 1, 2A and 2B service providers to have clear policies for disclosing data regarding a deceased child, and mechanisms for responding to disclosure requests from parents or guardians.

Ensuring government accountability over statutory codes

These amendments will replace the ‘public policy’ wording in clause 39(1)(a) with a more clearly defined list of reasons for which the Secretary of State can require Ofcom to modify a code of practice. This list will comprise national security, public safety, public health, and the UK’s international relations and obligations, aligning directly with existing powers the Secretary of State has over Ofcom in section 5 of the Communications Act 2003. The amendments will also clarify that the Secretary of State can only intervene for exceptional reasons, and will increase transparency of the use of this process by requiring details to be published at the time it is used.

Adult safety

The government has proposed an amendment that will strengthen the user empowerment content features by requiring Category 1 providers (the largest user-to-user services) to proactively ask their registered adult users whether they would like the content tools to be applied. This will address concerns about the accessibility of these tools, particularly for adult users with disabilities and for young adults who will no longer have the child safety protections. Importantly, however, this amendment retains the concept of choice for adult users over the legal content they view.

Our changes will also provide users and Ofcom with more information about how content is accessed and transmitted. Amendments to the Bill will allow Ofcom to require companies to provide transparency reports about content that is subject to the user empowerment duties, providers’ terms of service, and about the algorithmic design which may affect the likelihood of encountering this content. Further amendments will require Category 1 services to carry out a much more detailed assessment of the likelihood of their service hosting the different kinds of user empowerment content, and publish a summary of their findings. This will help ensure that platforms offer appropriate tools to users based on this assessment, and will increase platforms’ accountability to Ofcom and their users where they fail to do so.

Intimate image abuse

The government has also tabled amendments to the Online Safety Bill on the sending and sharing of intimate images, fulfilling the commitments made during passage through the Commons. These amendments are based on recommendations from the Law Commission in their report “Taking, Making and Sharing Intimate Images”, published in July 2022, and mean the law will be better equipped to tackle the harm posed by intimate image abuse. The new offences, which extend to England and Wales only, will ensure victims have the additional protection they deserve and confidence in the law when coming forward to report such abuse.

The government amendments will repeal the offence of disclosing private sexual photographs and films with intent to cause distress (commonly known as the “revenge porn” offence). It will be replaced with four new offences: a new “base offence” of sharing an intimate image without consent; two further offences of sharing an intimate image with intent to cause alarm, humiliation or distress, or for the purpose of obtaining sexual gratification (with the possibility of the offender being subject to notification requirements, commonly referred to as being on the “sex offenders register”); and a new offence of threatening to share an intimate image. Across all these offences, the government is adopting the Law Commission’s approach to defining an intimate image to cover images that are nude, partially nude, sexual or of a toileting nature. Most importantly, the government will extend the definition of a photograph/film to include manufactured or manipulated images (including so-called ‘deepfakes’), and make provision for victims to qualify for special measures and anonymity.

The government also aims to bring forward a broader package of offences that will address both taking and sharing intimate images, as originally recommended by the Law Commission, when Parliamentary time allows.

The amendments detailed in this statement will ensure that proportionate and future-proof duties are established by the Online Safety Bill. These changes will strengthen protections for both children and adults, in order to achieve the Bill’s objective of making the UK the safest place to be online.

Tackling violence against women and girls

Tackling violence against women and girls is a priority for the government – at home, on the streets, and online. To strengthen the protections for women and girls in the Bill, the government is announcing that the Bill will include a requirement for Ofcom to produce guidance which summarises, in one clear place, measures that can be taken to reduce the risk of harm to women and girls, and which demonstrates best practice. Ofcom will be required to publish this guidance. The amendment will require Ofcom to consult when producing the guidance, ensuring it reflects the views of experts, women and girls.

Broadcast amendments

These amendments to the exemption for recognised news publisher content in the Bill will clarify that content produced by recognised news publishers is exempt from providers’ duties, whether it is distributed by the publishers themselves or by users. However, as soon as a third-party user edits or modifies it in any way, it is no longer exempt. In addition, minor amendments have been tabled to adjust the meaning of the word ‘broadcast’ to cater for the BBC’s internet-enabled transmissions.

Child Sexual Exploitation and Abuse (CSEA) reporting / data retention

This amendment will enable regulations under clause 59 to require companies to retain the Child Sexual Exploitation and Abuse (CSEA) data included in reports made to the NCA, along with associated account data. This change brings the CSEA reporting requirement in line with international standards.

List of additional government amendments to be tabled later during parliamentary passage

1. Fees / funding changes - provider definition

This amendment will clarify the fee regime clauses to enable Ofcom to take into account the revenue of any group undertaking related to the provider that accrues revenue from a regulated service, rather than only the revenue of the provider of the regulated service itself.

2. Media Literacy

The Bill will now update Ofcom’s statutory media literacy duty under the Communications Act to introduce new objectives relating specifically to regulated services, including building public resilience to disinformation. It will also require Ofcom to publish a media literacy strategy every 3 years, with annual reports on progress towards the strategy. The changes will additionally require Ofcom to carry out, or commission others to carry out, activities to raise the public’s media literacy regarding regulated services, and to publish a statement of recommendations on media literacy for stakeholders, including tech companies and other organisations delivering media literacy initiatives.

3. Senior Management Liability

The government previously introduced a new criminal offence ensuring that tech executives who fail to comply with Ofcom’s requirements in relation to the child safety duty can be held to account. This amendment will extend the offence so that tech executives can also be held to account if they fail to effectively tackle Child Sexual Exploitation and Abuse (CSEA) content. This will ensure that the offence captures the most harmful content and actions related to children, while remaining proportionate and directed towards keeping children safe online.

4. App stores

This amendment will establish a duty on Ofcom to undertake research into the role of app stores in children accessing harmful content.

5. Algorithmic assessments

These amendments will add an express power to the existing Ofcom information gathering power (clause 91) and amend Ofcom’s powers of entry, inspection and audit (Schedule 12) to allow Ofcom to access services remotely in order to observe demonstrations or tests. These changes are particularly important for understanding how algorithmic systems function and therefore assessing providers’ compliance with the Bill. This would align Ofcom’s powers with those of the Competition and Markets Authority under the current proposals of the Digital Markets, Competition and Consumers Bill.

6. Researcher access report and guidance

These amendments will require Ofcom to publish its report into researcher access to information within 18 months, and will require Ofcom to publish guidance on this issue, including how to improve access for researchers in a safe and secure way.

7. Dispute resolution and user redress

This amendment will place a requirement on Ofcom to undertake a user redress review, assessing the sufficiency of the Bill’s provisions on content reporting and complaints functions, and to provide the Secretary of State with a report setting out its findings. Following Ofcom’s report, the Secretary of State will have a discretionary deferred power to commence a new requirement on Category 1 services to offer impartial out-of-court dispute resolution systems for the resolution of specified complaints, akin to similar provisions in the video-sharing platform regime.

8. Changes to the level of scrutiny

In response to the recommendations from the Delegated Powers and Regulatory Reform Committee, the government will table amendments to provide further parliamentary scrutiny of the qualifying worldwide revenue and revenue threshold definitions. The government will also make the first regulations specifying the Category 1 threshold subject to the affirmative procedure. Subsequent changes to the threshold will remain under the negative procedure.

9. Interpretation of ‘harm’ and ‘functionalities’

These amendments will clarify on the face of the Bill that harm can be caused by the cumulative volume of harmful content, as well as the nature of the content, and that a combination of functionalities can drive up the risk of harm. Together with the overarching statement and the categories of harmful content to children, these amendments will underline that the regulatory framework is focused on systems and processes.