Guidance

Fact sheet on changes to the illegal content duties within the Online Safety Bill

Published 23 August 2022

Summary

In response to engagement with a broad range of stakeholders, the government has made a number of amendments affecting the illegal content duties at clauses 8 and 9 of the Online Safety Bill. These amendments, added to the Bill in the House of Commons on 12 July 2022, will clarify:

  • that companies must take preventative measures, including design measures, to mitigate a broad spectrum of factors that enable illegal activity, both on and offline.
  • how service providers should make judgements about content on their service, including whether or not it amounts to illegal content and must be removed.

Background

1. The Online Safety Bill delivers the government’s manifesto commitment to make the UK the safest place to be online, while defending free expression. It will introduce statutory duties on user-to-user and search services, including duties to tackle illegal content. These duties will be overseen by an independent regulator, Ofcom.

2. The illegal content duties will require providers to proactively mitigate the risk that their services are used for illegal activity or to share illegal content (‘preventative duties’). Services will also be required to address illegal content once it appears on their service (‘content moderation duties’). Schedules 5, 6, and 7 of the Bill set out a list of ‘priority offences’, to which preventative duties apply. The content moderation duties apply to both priority offences and other relevant offences where there is an individual victim.

3. The government has continued to consult with stakeholders since introducing the Bill into Parliament. Some stakeholders, including the Joint Select Committee on the draft Bill, expressed concerns that the Bill was overly focussed on content removal, rather than safety by design. Others raised concerns that, as previously drafted, the illegal content moderation duties created uncertainty about how providers should determine whether content is illegal. This could result in providers either under-removing illegal content or over-removing legal content. The government carefully considered stakeholders’ representations on these matters and has made a number of amendments to the Bill in response.

Amendments relating to the ‘commission or facilitation of an offence’ and service design (amendments 58, 59, 60, 61 and others[footnote 1])

4. These amendments require providers to assess the risk of their services being used to commit or facilitate the commission of a priority offence, and to design their services to mitigate the risk of this occurring. This will ensure providers implement preventative measures to mitigate a broad spectrum of factors that enable illegal activity on their platforms, including where this might not result in ‘content’ on a platform which is clearly identifiable as being illegal.

5. The amendments introduce an explicit duty on companies relating to the design of their services. Companies must take a safety-by-design approach, managing the risk of illegal content and activity appearing on their services, rather than focusing primarily on content moderation.

6. In addition, these amendments make clear that platforms have duties to mitigate the risk of their service “facilitating” an offence, including where that offence may actually occur on another site - such as in cross-platform child sexual exploitation and abuse (CSEA) offending - or even offline. This addresses concerns raised by stakeholders that the Bill would not adequately tackle activities such as breadcrumbing, where CSEA offenders post links or hold conversations on one site that are preparatory to a CSEA offence which then takes place on a different platform, or even offline.

Amendments relating to how companies should make judgements about content (New Clauses 14 and 15[footnote 2])

7. To strengthen the ‘illegal content moderation duties’, we have also added provisions establishing how providers should determine whether content amounts to illegal content. These amendments provide greater clarity about how service providers should make judgements about content on their service.

8. New clause 14 (NC14) establishes that providers’ systems and processes should consider all reasonably available contextual information when making judgements about whether content is content of a particular kind. This includes content judgements relating to other duties in the Bill (content of democratic importance, journalistic content, and content harmful to children or to adults), as well as content judgements relating to the illegal content duties. This is important as it will often be difficult for providers to make judgements about content without considering context.

9. With regard to illegal content and fraudulent adverts specifically, NC14 also requires providers to put in place systems that enable them to ascertain whether, on the basis of all the reasonably available information, there are reasonable grounds to infer that a) all the relevant elements of the offence, including the mental elements, are present; and b) no defence is available. This will provide greater certainty about the standard to be applied by providers when assessing content, including judgements about whether content is illegal content or a fraudulent advertisement. These provisions give greater clarity to both providers and Ofcom in discharging the existing duties in the Bill, but do not change the government’s original policy intention.
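To illustrate the shape of this test, the sketch below (which is not drawn from the Bill or from any Ofcom guidance, and uses hypothetical names throughout) shows how the two-limb inference in NC14(6) might be encoded in a provider’s systems: content is treated as illegal only where there are reasonable grounds to infer the elements of the offence and no reasonable grounds to infer an available defence.

from dataclasses import dataclass

@dataclass
class ContentJudgement:
    # Hypothetical record of a provider's assessment of a piece of content,
    # based on all relevant information reasonably available to it.
    elements_present: bool  # limb (a): reasonable grounds to infer all elements, including mental elements
    defence_likely: bool    # reasonable grounds to infer a defence may be successfully relied upon

def treat_as_illegal(judgement: ContentJudgement) -> bool:
    # Under the approach in NC14(5) and (6), content is treated as illegal
    # content only where limb (a) is satisfied and limb (b) is not.
    return judgement.elements_present and not judgement.defence_likely

# Elements appear present, but the context gives reasonable grounds to infer
# a defence, so the content is not treated as illegal content.
print(treat_as_illegal(ContentJudgement(elements_present=True, defence_likely=True)))   # False
print(treat_as_illegal(ContentJudgement(elements_present=True, defence_likely=False)))  # True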

10. To support these new provisions further, we have also added new clause 15 (NC15), which requires Ofcom to issue guidance on illegal content judgements. We expect this to include examples of the kind of contextual and other information that is likely to be relevant when drawing inferences about mental elements and defences, and how far providers should go in looking for that information. This will provide greater certainty to companies about when they must take action against illegal content, particularly in relation to offences that rely on mental elements or the absence of defences.

11. The amendments do not affect the Bill’s focus on providers’ systems and processes, and the duties will be proportionate to providers’ size and capacity and to the risk of harm posed to users. Providers will have to ensure their systems for making judgements about content (including any automated systems) are designed to take into account all the relevant information that is reasonably available to them. This could include, for example, appropriate training and guidance for moderators. When supervising providers’ compliance with their duties, Ofcom will not penalise providers for individual content moderation decisions. Rather, what will be relevant is the aggregate performance of providers’ systems and processes, the way they are designed, and the overall approach taken to assessing content.
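As a purely illustrative sketch of this systems-level focus (not a description of how Ofcom will in fact assess compliance; the data structures and metric names are hypothetical), the example below summarises the aggregate performance of a moderation system across many judgements, rather than scoring any single decision.

from dataclasses import dataclass
from typing import List

@dataclass
class ModerationDecision:
    # One content judgement made by a provider's system, together with the
    # outcome of a later review (for example an appeal or an internal audit).
    judged_illegal: bool
    reviewed_illegal: bool

def aggregate_error_rates(decisions: List[ModerationDecision]) -> dict:
    # Individual mistakes are expected; what the sketch surfaces is how the
    # system performs overall: how often legal content is over-removed and
    # how often illegal content is under-removed.
    over = sum(1 for d in decisions if d.judged_illegal and not d.reviewed_illegal)
    under = sum(1 for d in decisions if not d.judged_illegal and d.reviewed_illegal)
    total = len(decisions) or 1
    return {
        "over_removal_rate": over / total,
        "under_removal_rate": under / total,
        "decisions_reviewed": len(decisions),
    }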

Annex - Text of New Clauses and Lists of Amendment Numbers

The text of the amendments that the numbers below refer to can be found in the July 2022 Notices of Amendments papers.

1) Amendments relating to the ‘commission or facilitation of an offence’ and service design

  • These policies were implemented through amendments 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 75, 76, 77, 79, 80, 81, 85, 86, 87, 88, 89, 92, 93, 95, 102, 103, 104, 106, 107, 108, 109, 110, 111, 112, 130, 131, 132, 133, 134, 141, 142, 150.
  • These include amendments to the safety duties and risk assessment duties themselves, as well as to other clauses for the purposes of consistency (consequential amendments).

2) New clauses and amendments relating to how companies should make judgements about content

“NC14

Providers’ judgements about the status of content

(1) This section sets out the approach to be taken where—

  • (a) a system or process operated or used by a provider of a Part 3 service for the purpose of compliance with relevant requirements, or

  • (b) a risk assessment required to be carried out by Part 3, involves a judgement by a provider about whether content is content of a particular kind.

(2) Such judgements are to be made on the basis of all relevant information that is reasonably available to a provider.

(3) In construing the reference to information that is reasonably available to a provider, the following factors, in particular, are relevant—

  • (a) the size and capacity of the provider, and

  • (b) whether a judgement is made by human moderators, by means of automated systems or processes or by means of automated systems or processes together with human moderators.

(4) Subsections (5) to (7) apply (as well as subsection (2)) in relation to judgements by providers about whether content is— (a) illegal content, or illegal content of a particular kind, or (b) a fraudulent advertisement.

(5) In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is content of the kind in question (and a provider must treat content as content of the kind in question if reasonable grounds for that inference exist).

(6) Reasonable grounds for that inference exist in relation to content and an offence if, following the approach in subsection (2), a provider—

  • (a) has reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied, and

  • (b) does not have reasonable grounds to infer that a defence to the offence may be successfully relied upon.

(7) In the case of content generated by a bot or other automated tool, the tests mentioned in subsection (6)(a) and (b) are to be applied in relation to the conduct or mental state of a person who may be assumed to control the bot or tool (or, depending what a provider knows in a particular case, the actual person who controls the bot or tool).

(8) In considering a provider’s compliance with relevant requirements to which this section is relevant, OFCOM may take into account whether providers’ judgements follow the approaches set out in this section (including judgements made by means of automated systems or processes, alone or together with human moderators).

(9) In this section— “fraudulent advertisement” has the meaning given by section 34 or 35 (depending on the kind of service in question); “illegal content” has the same meaning as in Part 3 (see section 52); “relevant requirements” means—

  • (a) duties and requirements under this Act, and

  • (b) requirements of a notice given by OFCOM under this Act.”

Member’s explanatory statement: This new clause clarifies how providers are to approach judgements (human or automated) about whether content is content of a particular kind, and in particular, makes provision about how questions of mental state and defences are to be approached when considering whether content is illegal content or a fraudulent advertisement.

“NC15

Guidance about illegal content judgements

(1) OFCOM must produce guidance for providers of Part 3 services about the matters dealt with in section (Providers’ judgements about the status of content) so far as relating to illegal content judgements.

(2) “Illegal content judgements” means judgements of a kind mentioned in subsection (4) of that section.

(3) Before producing the guidance (including revised or replacement guidance), OFCOM must consult such persons as they consider appropriate.

(4) OFCOM must publish the guidance (and any revised or replacement guidance).”

The policy was also implemented through amendments 57, 69, 72, 73, 74, 78, 82, 83, 84, 90, 91, 98, 100, 101 and 105.

  1. A full list of the amendments which implement these changes is in the annex to this fact sheet.

  2. Implemented by NC14, NC15 and other consequential amendments, which are listed in the annex to this fact sheet.