Guidance

Safer platform checklist: practical steps to protect your users from online harms

If you manage an online platform that allows user-generated content, these 7 steps will help you keep your business and your users safe.

This guidance is distinct from the forthcoming regulatory requirements that will be introduced through the Online Safety Bill. The legislation is currently being drafted and is not yet law, which means you do not have to take action yet. But you may want to take steps now to prepare and keep users safe on your platform.

Overview

An online platform is any type of website, app or software where users can interact or share user-generated content.

Online harms can happen when a platform’s features or functions create a risk to users’ safety or to the safety of others. These harms may be caused by illegal content or activity. They may also be caused by content or activity that is legal but still harmful.

New online safety legislation is being introduced which aims to improve people’s safety in the UK. The laws will apply to businesses or organisations that operate online platforms that:

  • host user-generated content such as images, videos and comments

  • allow UK users to talk with other people online through messaging, comments and forums

  • allow UK users to search a range of websites and databases – search engines that only allow searches of a single website will not be in scope

If you own or manage an online platform which enables any of these, you will have a legal duty to protect users against illegal content. If children are likely to access your service, you will also have to put in place measures to protect them.

How to prevent online harms

The most effective way to address online harms is to take a preventative approach, using best practice safer platform design to manage risks before harm occurs. This is safer for your users, easier for you and more cost effective than responding to harms after they happen.

This guidance sets out practical advice that will help you to prepare to improve the safety of your platform.

How to improve the safety of your platform

The following actions will make it easier for you to manage a broad range of online harms. This guidance is not mandatory, but may help you to improve the safety of your website, app or software.

1. Review your platform design for risks and harms

Different design choices lead to different kinds of risks and harms. Learning how the design of features and functions can lead to harms on your platform will help you improve safety for your users. Consider making a record of:

  • what harmful content or activity might or does exist on your platform

  • what action you are already taking to prevent and manage risks

  • who is responsible for carrying out the action

  • how your platform’s features interact with each other
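
One way to keep such a record is as a simple structured register. The sketch below is an illustration only, assuming a Python codebase; the field names are hypothetical and this guidance does not prescribe any particular format.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class RiskRecord:
        """One entry in a hypothetical platform risk register."""
        feature: str          # the feature or function being assessed
        potential_harm: str   # harmful content or activity that might or does exist
        current_action: str   # what you are already doing to prevent and manage the risk
        owner: str            # who is responsible for carrying out the action
        interacts_with: list = field(default_factory=list)  # features this one interacts with
        last_reviewed: date = field(default_factory=date.today)

    # Example entry: a private messaging feature reviewed for contact risk.
    register = [
        RiskRecord(
            feature="private messaging",
            potential_harm="adults contacting children they do not know",
            current_action="restricted direct messaging for child accounts; user reporting",
            owner="named trust and safety lead",
            interacts_with=["image sharing", "friend suggestions"],
        ),
    ]

Keeping entries like these together makes it easier to see where features interact, and to review each risk and its owner on a regular cycle.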

2. Identify and protect users who may be vulnerable

Some of your users may be at increased risk from online harm because of their age or circumstances – for example, people under the age of 18 or users with a disability.

You could use age assurance technology to establish the age of your users. This can help you take steps to:

  • provide younger users with greater protections

  • prevent younger users from accessing your service

You should find a solution that is best suited to your platform and your users. If your platform presents a high level of risk to younger users, you should use a solution that will provide you with a high level of confidence in the age of your users, such as age verification.

Find out more about safety technology providers

Users can be vulnerable for reasons other than age. Not all users who may be vulnerable can be easily identified, so your platform should be designed to be inclusive and improve safety for everyone. This includes making sure your reporting and complaints processes are accessible, clear and easy to use.

Where you can identify children and other users who may be vulnerable, you can put specific protections in place to manage their safety.

Example of safer design for vulnerable users

A video gaming app enables users to interact with one another through private messaging. They can also share images and short videos with other users.

The app features robust age assurance measures for establishing which users are children. Child accounts have limited access to direct messaging functionality. This means children cannot be messaged privately by adults, making it less likely they will be exploited by offenders.
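
A minimal sketch of the rule described in this example, assuming hypothetical Account and Assurance types that record the result of your age assurance checks. It illustrates the design principle of failing safe when age is unknown; it is not a complete safety system.

    from dataclasses import dataclass, field
    from enum import Enum
    from typing import Optional

    class Assurance(Enum):
        UNKNOWN = "unknown"      # no age assurance obtained
        ESTIMATED = "estimated"  # lower-confidence age estimation
        VERIFIED = "verified"    # high-confidence age verification

    @dataclass
    class Account:
        user_id: str
        age: Optional[int] = None            # None until age assurance establishes an age
        assurance: Assurance = Assurance.UNKNOWN
        approved_contacts: set = field(default_factory=set)

    def is_child(account: Account) -> bool:
        # Fail safe: treat accounts with no established age as child accounts.
        return account.age is None or account.age < 18

    def can_direct_message(sender: Account, recipient: Account) -> bool:
        """Block unsolicited private messages from adult accounts to child accounts."""
        if is_child(recipient) and not is_child(sender):
            # An adult can only message a child who has already approved them.
            return sender.user_id in recipient.approved_contacts
        return True

Here, can_direct_message(adult, child) returns False unless the child account has already approved the adult as a contact, so an unknown adult cannot open a private conversation with a child.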

3. Assess how your users make reports or complaints

You should create clear terms of service explaining what is acceptable on your platform. These should be prominent and accessible to users of all ages and abilities. You should make it easy for anyone to report content or behaviour that breaks those rules.

This means your users and employees (if you run a business) should know:

  • where and how to make a report or complaint

  • what will happen afterwards

  • how long it will take before someone responds

  • how a user can appeal a decision if they disagree with the outcome
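
To make those expectations concrete, each report can carry its own status and response deadline. The sketch below shows one hypothetical shape for such a record; the 48-hour target and the status names are illustrative assumptions, not requirements.

    import uuid
    from dataclasses import dataclass, field
    from datetime import datetime, timedelta
    from enum import Enum

    class ReportStatus(Enum):
        RECEIVED = "received"
        UNDER_REVIEW = "under review"
        RESOLVED = "resolved"
        APPEALED = "appealed"

    # Illustrative target for a first response to any report.
    FIRST_RESPONSE_TARGET = timedelta(hours=48)

    @dataclass
    class Report:
        reporter_id: str
        content_id: str
        reason: str          # which rule in the terms of service was broken
        report_id: str = field(default_factory=lambda: uuid.uuid4().hex)
        filed_at: datetime = field(default_factory=datetime.utcnow)
        status: ReportStatus = ReportStatus.RECEIVED

        def respond_by(self) -> datetime:
            """When the reporter should expect a first response."""
            return self.filed_at + FIRST_RESPONSE_TARGET

        def appeal(self) -> None:
            """A user who disagrees with the outcome can appeal the decision."""
            if self.status is ReportStatus.RESOLVED:
                self.status = ReportStatus.APPEALED

Recording the deadline and status against each report means you can tell users what will happen next, and check later whether you met your own targets.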

4. Review and test your safety measures

Regularly review and test whether your safety measures are working and, where appropriate, whether all your users understand them. Your reviews may cover:

  • how you make design choices and if your choices are creating risk

  • how you identify harmful content or activity

  • the steps you take to prevent or remove harmful content and behaviour

  • how users report harms, and how you manage and respond to user reports

  • how you currently document harms that happen on your platform

You should also check that your safety measures do not infringe on your users’ rights to privacy and freedom of expression.
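
Parts of this review can be automated. Continuing the hypothetical Report sketch from step 3, a scheduled check like the one below could flag reports that have gone unanswered past the target; the logic and thresholds are assumptions for illustration.

    from datetime import datetime

    def overdue_reports(reports: list) -> list:
        """Open reports that have passed their first-response deadline."""
        now = datetime.utcnow()
        return [
            r for r in reports
            if r.status is not ReportStatus.RESOLVED and now > r.respond_by()
        ]

    def first_response_rate(reports: list, responded_at: dict) -> float:
        """Share of reports answered within the target – a figure to review regularly.

        responded_at maps report_id to the time of the first moderator response.
        """
        answered = [r for r in reports if r.report_id in responded_at]
        if not answered:
            return 1.0
        on_time = [r for r in answered if responded_at[r.report_id] <= r.respond_by()]
        return len(on_time) / len(answered)

Running checks like these on a schedule, and reviewing the results, gives you evidence of whether your reporting process is working rather than assuming it is.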

5. Keep up to date with information about designing safer online platforms

Having the latest information about safer platform design will help you prepare to keep your users safe. This may include keeping up to date with the latest online harms and safety technology information, as well as changes to safety legislation.


If you run a business

6. Appoint a responsible person

Your business should have someone who understands the risks your service poses to users and who is responsible for deciding how you manage your users’ online safety. This could be you or one of your employees.

Your responsible person should:

  • make sure they know about online safety best practice and how it should be implemented

  • know what to do if something goes wrong – for example, if one of your users shares inappropriate or illegal content

  • have access to sufficient support and resources

7. Make sure your employees know what to do

Make sure your employees are aware of what they should be doing to help keep users safe on your platform.

You could do this by:

  • providing additional training

  • embedding safer design principles into your work culture

  • making sure your employees understand their individual responsibilities

This also applies if you commission work to be carried out on your behalf, for example by contractors or agency workers.


Part of: Online safety guidance if you own or manage an online platform

Published 29 June 2021