Press release

Landmark laws to keep children safe, stop racial hate and protect democracy online published

The Online Safety Bill will help protect young people and clamp down on racist abuse online, while safeguarding freedom of expression.

New Online Harms laws
  • Milestone Online Safety Bill will help safeguard young people and clamp down on racist abuse online
  • Bill to be published today includes new measures to uphold democratic debate online
  • Financial fraud on social media and dating apps included to protect people from romance scams and fake investment opportunities

New internet laws will be published today in the draft Online Safety Bill to protect children online and tackle some of the worst abuses on social media, including racist hate crimes.

Ministers have added landmark new measures to the Bill to safeguard freedom of expression and democracy, ensuring necessary online protections do not lead to unnecessary censorship.

The draft Bill marks a milestone in the Government’s fight to make the internet safe. Although we are now using the internet more than ever, over three quarters of UK adults are concerned about going online, and fewer parents feel the benefits outweigh the risks of their children being online – falling from 65 per cent in 2015 to 55 per cent in 2019.

The draft Bill includes changes to put an end to harmful practices, while ushering in a new era of accountability and protections for democratic debate, including:

  • New additions to strengthen people’s rights to express themselves freely online, while protecting journalism and democratic political debate in the UK.

  • Further provisions to tackle prolific online scams such as romance fraud, which have seen people manipulated into sending money to fake identities on dating apps.

  • Social media sites, websites, apps and other services hosting user-generated content or allowing people to talk to others online must remove and limit the spread of illegal and harmful content such as child sexual abuse, terrorist material and suicide content.

  • Ofcom will be given the power to fine companies failing in a new duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher, and will have the power to block access to sites.

  • A new criminal offence for senior managers has been included as a deferred power. This could be introduced at a later date if tech firms don’t step up their efforts to improve safety.

Digital Secretary Oliver Dowden said:

Today the UK shows global leadership with our groundbreaking laws to usher in a new age of accountability for tech and bring fairness and accountability to the online world.

We will protect children on the internet, crack down on racist abuse on social media and, through new measures to safeguard our liberties, create a truly democratic digital age.

Home Secretary Priti Patel said:

This new legislation will force tech companies to report online child abuse on their platforms, giving our law enforcement agencies the evidence they need to bring these offenders to justice.

Ruthless criminals who defraud millions of people and sick individuals who exploit the most vulnerable in our society cannot be allowed to operate unimpeded, and we are unapologetic in going after them.

It’s time for tech companies to be held to account and to protect the British people from harm. If they fail to do so, they will face penalties.

The draft Bill will be scrutinised by a joint committee of MPs before a final version is formally introduced to Parliament.

The following elements of the Bill aim to create the most progressive, fair and accountable system in the world. This comes only weeks after a boycott of social media by sports professionals and governing bodies in protest at the abuse of footballers online, and amid continuing concerns about social media platforms arbitrarily removing content and blocking users.

Duty of care

In line with the government’s response to the Online Harms White Paper, all companies in scope will have a duty of care towards their users so that what is unacceptable offline will also be unacceptable online.

They will need to consider the risks their sites may pose to the youngest and most vulnerable people and act to protect children from inappropriate content and harmful activity.

They will need to take robust action to tackle illegal abuse, including swift and effective action against hate crimes, harassment and threats directed at individuals, and keep their promises to users about their standards.

The largest and most popular social media sites (Category 1 services) will need to act on content that is lawful but still harmful such as abuse that falls below the threshold of a criminal offence, encouragement of self-harm and mis/disinformation. Category 1 platforms will need to state explicitly in their terms and conditions how they will address these legal harms and Ofcom will hold them to account.

The draft Bill contains reserved powers for Ofcom to pursue criminal action against named senior managers whose companies do not comply with Ofcom’s requests for information. These will be introduced if tech companies fail to live up to their new responsibilities. A review will take place at least two years after the new regulatory regime is fully operational.

The final legislation, when introduced to Parliament, will contain provisions that require companies to report child sexual exploitation and abuse (CSEA) content identified on their services. This will ensure companies provide law enforcement with the high-quality information they need to safeguard victims and investigate offenders.

Freedom of expression

The Bill will ensure people in the UK can express themselves freely online and participate in pluralistic and robust debate.

All in-scope companies will need to consider and put in place safeguards for freedom of expression when fulfilling their duties. These safeguards will be set out by Ofcom in codes of practice but, for example, might include having human moderators take decisions in complex cases where context is important.

People using their services will need to have access to effective routes of appeal for content removed without good reason, and companies must reinstate that content if it has been removed unfairly. Users will also be able to appeal to Ofcom and these complaints will form an essential part of Ofcom’s horizon-scanning, research and enforcement activity.

Category 1 services will have additional duties. They will need to conduct and publish up-to-date assessments of their impact on freedom of expression and demonstrate they have taken steps to mitigate any adverse effects.

These safeguards reduce the risk of online companies adopting overly restrictive approaches or over-removing content in their efforts to meet their new online safety duties. An example of this could be AI moderation technologies falsely flagging innocuous content as harmful, such as satire.

Democratic content

Ministers have added new and specific duties to the Bill for Category 1 services to protect content defined as ‘democratically important’. This will include content promoting or opposing government policy or a political party ahead of a vote in Parliament, election or referendum, or campaigning on a live political issue.

Companies will also be forbidden from discriminating against particular political viewpoints and will need to apply protections equally to a range of political opinions, no matter their affiliation. Policies to protect such content will need to be set out in clear and accessible terms and conditions and firms will need to stick to them or face enforcement action from Ofcom.

When moderating content, companies will need to take into account the political context around why the content is being shared and give it a high level of protection if it is democratically important.

For example, a major social media company may choose to prohibit all deadly or graphic violence. A campaign group could release violent footage to raise awareness about violence against a specific group. Given its importance to democratic debate, the company might choose to keep that content up, subject to warnings, but it would need to be upfront about the policy and ensure it is applied consistently.

Journalistic content

Content on news publishers’ websites is not in scope. This includes both their own articles and user comments on these articles.

Articles by recognised news publishers shared on in-scope services will be exempted and Category 1 companies will now have a statutory duty to safeguard UK users’ access to journalistic content shared on their platforms.

This means they will have to consider the importance of journalism when undertaking content moderation, have a fast-track appeals process for journalists’ removed content, and will be held to account by Ofcom for the arbitrary removal of journalistic content. Citizen journalists’ content will have the same protections as professional journalists’ content.

Online fraud

Measures to tackle user-generated fraud will be included in the Bill. Online companies will, for the first time, have to take responsibility for tackling fraudulent user-generated content on their platforms, such as posts on social media. This includes romance scams and fake investment opportunities posted by users on Facebook groups or sent via Snapchat.

Romance fraud occurs when a victim is tricked into thinking that they are striking up a relationship with someone, often through an online dating website or app, when in fact this is a fraudster who will seek money or personal information.

Analysis by the National Fraud Intelligence Bureau found in 2019/20 there were 5,727 instances of romance fraud in the UK (up 18 per cent year on year). Losses totalled more than £60 million.

Fraud via advertising, emails or cloned websites will not be in scope because the Bill focuses on harm committed through user-generated content.

The Government is working closely with industry, regulators and consumer groups to consider additional legislative and non-legislative solutions. The Home Office will publish a Fraud Action Plan after the 2021 spending review and the Department for Digital, Culture, Media and Sport will consult on online advertising, including the role it can play in enabling online fraud, later this year.

Ian Russell, Molly Rose Foundation, said:

The Molly Rose Foundation and Molly’s family say government internet regulation can’t come soon enough and welcome this important step towards a safer internet for all.

It is vital to focus the minds of the tech platforms, to change their corporate culture and to reduce online harms, especially for the young and the vulnerable. Now is the time for the platforms to prioritise safety rather than profit; it is time for countries to change the internet for good.

Dr Alex George, The UK Government’s Youth Mental Health Ambassador, said:

This is a landmark moment here in the UK. The problem of online abuse has escalated into a real epidemic which is affecting people physically as well as psychologically and it is time that something is done.

That’s why I welcome today’s announcement about the Online Safety Bill and the protection it will provide people. Social media companies must play their part in protecting those who consume and engage with their content.

Dame Melanie Dawes, Ofcom Chief Executive, said:

Today’s Bill takes us a step closer to a world where the benefits of being online, for children and adults, are no longer undermined by harmful content.

We’ll support Parliament’s scrutiny of the draft Bill, and soon say more about how we think this new regime could work in practice – including the approach we’ll take to secure greater accountability from tech platforms.

Yesterday the Digital Secretary visited Charlton Athletic FC to hear about the club’s work on diversity and inclusion and met players from the first, women’s and academy teams. He also spoke to representatives from UK safety tech firm Crisp.

Charlton Athletic academy player Wassim Aouachria said:

I am very pleased to hear that action is being taken to stamp out discriminatory abuse on social media. I was on the receiving end of abuse on social media a few months ago and it was difficult for myself and my family to understand.

I was grateful for the support I got from the club and more needs to be done so people are held accountable for their actions. Hopefully the upcoming Online Safety Bill can help us create a safer, more welcoming and inclusive environment for players, managers, staff, fans and everyone associated with football.

Adam Hildreth, CEO of UK safety tech start-up Crisp, said:

We set up Crisp in 2005 with a vision of helping to create a digital world that is safe for everyone. We’ve been working alongside the UK government during that time to make sure legislation keeps up with changes in the online environment.

We’re proud to have been contributors to the groundbreaking Online Safety Bill and we’re pleased to play a part in the successful UK safety tech story.

ENDS

Notes to Editors:

  • The Online Safety Bill follows the publication of the Online Harms White Paper in April 2019. An initial Government response to the consultation was published in February 2020, and a full Government response in December 2020. The full Government response set out the regulatory framework in detail, which will be taken forward through this Bill.
  • The legislation will be published later today in draft, and will be subject to pre-legislative scrutiny by a joint committee of MPs in this session. The make-up of the committee will be confirmed in due course.
Published 12 May 2021
Last updated 14 May 2021
  1. Edited the paragraph on the boycott of social media by sports professionals and governing bodies in protest at the racist abuse of footballers online, so that it refers to all abuse.

  2. Statistic correction: "Over three quarters of UK adults are concerned about going online, and fewer parents feel the benefits outweigh the risks of their children being online – falling from 65 per cent in 2015 to 55 per cent in 2019." It originally read '50 per cent' but the correct figure is '55 per cent'.

  3. First published.