Guidance

Understanding and reporting online harms on your online platform

Find out what online harms are, how to manage them, and where to report them if they occur.

What are online harms?

The government’s draft Online Safety Bill defines online harms as user-generated content or behaviour that is illegal or could cause significant physical or psychological harm to a person.

Online harms can be illegal, or they can be harmful but legal. Examples of online harms include (but are not restricted to):

  • child sexual exploitation and abuse

  • terrorist use of the internet

  • hate crime and hate speech

  • harassment, cyberbullying and online abuse

Online harms in this context refer to harm experienced by users. They do not include harm to organisations or businesses, or harm to people as a result of a data protection or cyber security breach.

Learn more about preventing child sexual exploitation and abuse. Learn more about preventing terrorist content and activity online.

New online safety legislation is coming, which will aim to reduce online harms. If you own or manage an online platform in scope of the forthcoming legislation, you will have a legal duty to protect users against illegal content. If children are likely to access your services, you will also have to put in place measures to protect them.

How online harms develop

Online harms start out as risks on your platform. If left unmanaged, these risks allow harmful behaviour, such as the posting of harmful content, to happen.

Example

A 13-year-old child uses a social network to chat with their friends. The social network has privacy settings, but these are not set to high by default. This creates a risk that the child can be contacted by unknown adults. Unless this risk is managed, the child may be vulnerable to child sexual exploitation and abuse.
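To make the example concrete, here is a minimal sketch of age-based safe defaults, written in TypeScript. All names (PrivacySettings, defaultPrivacyFor, the individual settings) are illustrative assumptions, not part of the guidance or of any particular platform:

    // Sketch of privacy-by-default at account creation.
    // All names here are illustrative, not from the guidance.
    interface PrivacySettings {
      profileVisibility: "public" | "friendsOnly";
      allowMessagesFromStrangers: boolean;
      discoverableInSearch: boolean;
    }

    const HIGH_PRIVACY: PrivacySettings = {
      profileVisibility: "friendsOnly",
      allowMessagesFromStrangers: false,
      discoverableInSearch: false,
    };

    const STANDARD_PRIVACY: PrivacySettings = {
      profileVisibility: "public",
      allowMessagesFromStrangers: true,
      discoverableInSearch: true,
    };

    function defaultPrivacyFor(age: number): PrivacySettings {
      // Start child accounts at the highest privacy level; any
      // relaxation is then a deliberate choice, not the default.
      return age < 18 ? HIGH_PRIVACY : STANDARD_PRIVACY;
    }

Under a design like this, the child in the example above would start with the strictest settings, reducing the risk of contact from unknown adults.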

Are all online harms illegal?

Online harms are not always illegal, but those that are legal may still be harmful to an individual. For example:

  • the promotion or encouragement of self-harm or eating disorders

  • content or behaviour that is inappropriate or harmful for children in particular, such as posting pornographic material or graphic violence

  • abuse, harassment or bullying that does not pass the threshold of a criminal offence

Where the law is not clear, having clear terms of service for your platform can help your users to understand what they can and cannot do.

Who is responsible for tackling online harms on a platform?

Online platforms may include websites, apps or software that you either own or manage, whether for yourself or someone else, including licensed third-party software. Everyone who owns or manages an online platform already has a legal requirement to remove illegal online content.

If someone uses your platform to store or share illegal content, you have a legal requirement to remove it as soon as you become aware of it. If you do not address illegal content on your service, you may be legally liable.

Where harms are not illegal, you should still take reasonable steps to keep your users safe from harm. The steps you take will depend on your resources, the type of platform you manage, and the features and functions you offer your users. If you do not take reasonable steps to ensure harms do not occur, your platform’s reputation for safety may be at risk.

When the new online safety legislation comes into force, businesses and organisations in scope will be expected to take further action to tackle the risks of harm on their platforms. This includes putting in place appropriate systems and processes to reduce the risk of harm. Learn about your current and future responsibilities if you own or manage an online platform.

How to manage online harms

The best way to manage online harms on your platform is through safer platform design. This means making safety a fundamental feature of the design of your website, app or software. It also means being able to respond to harms if they occur.

Take care not to limit your users’ rights. Good platform design should:

  • protect users’ right to freedom of expression and not censor or remove content unnecessarily

  • strengthen users’ privacy by providing them with greater control over their personal information and not infringing upon their privacy unnecessarily

Having clear and accessible terms of service will help users to understand what is and is not allowed on your platform.
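As one hedged illustration of responding to harms without removing content unnecessarily, the sketch below (TypeScript, with illustrative names such as ContentReport and submitReport) routes user reports into a queue for human review rather than deleting content automatically:

    // Sketch: user reports go to human review rather than automatic
    // removal, so lawful content is not censored unnecessarily.
    // All names are illustrative assumptions, not a prescribed API.
    type ReportReason = "illegal" | "harmful" | "terms_violation";

    interface ContentReport {
      contentId: string;
      reportedBy: string;
      reason: ReportReason;
      reportedAt: Date;
    }

    const reviewQueue: ContentReport[] = [];

    function submitReport(report: ContentReport): void {
      // Queue every report for a moderator; content is removed only
      // once confirmed as illegal or against the terms of service.
      reviewQueue.push(report);
    }

Keeping a human in the loop, with your terms of service as the yardstick, supports the design principles above: content is removed only once someone has confirmed it is illegal or breaks your rules.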

How to report illegal harms

You should put in place processes to let your users tell you about illegal harms occurring on your platform. Illegal harms should be reported to law enforcement as soon as you become aware of them. When making a report, you may be asked for details such as when the harm took place, who was involved and what actions you took after it happened.
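To illustrate the kind of details worth recording, here is a minimal sketch of an internal incident record in TypeScript. The field names are illustrative assumptions, not a format required by law enforcement:

    // Sketch of an internal incident record capturing the details a
    // report may need. Field names are illustrative only.
    interface IncidentRecord {
      occurredAt: Date;           // when the harm took place
      becameAwareAt: Date;        // when you became aware of it
      usersInvolved: string[];    // who was involved (account IDs)
      description: string;        // what happened
      actionsTaken: string[];     // e.g. "content removed", "account suspended"
      reportedToPolice: boolean;  // whether law enforcement has been told
    }

Recording these details as soon as you become aware of a harm makes it easier to answer the questions above when you file a report.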

Reporting online harms

If you have seen an image or video online containing sexual abuse of a child, you can report it to the Internet Watch Foundation (IWF).

Report online content which is legal but harmful to Report Harmful Content.

NSPCC and Stop It Now offer advice and support to those who think they have seen child sexual abuse or exploitation online.

Report hate crime and discrimination to Stop Hate Crime.



Published 29 June 2021