Online Safety Bill: European Convention on Human Rights Memorandum
Updated 18 January 2023
Introduction
1. This memorandum addresses issues arising under the European Convention on Human Rights (“ECHR”) in relation to the draft Online Safety Bill. It has been prepared by the Department for Digital, Culture, Media and Sport and the Home Office. The government considers that clauses of and schedules to this Bill which are not mentioned in this Memorandum do not give rise to any ECHR issues.
2. The Bill imposes obligations on providers of internet services which allow users to upload or share user-generated content or otherwise to interact (‘user-to-user services’) and on providers of services which allow users to search all or some parts of the internet (‘search services’). It also imposes obligations on providers of internet services which publish or otherwise make accessible pornographic content. These obligations apply to providers of in-scope services (“regulated services”), including those based outside the UK.
3. The Bill also includes two new communications offences (replacing former offences) recommended by the Law Commission, and creates a new “cyberflashing” offence.
4. The Bill completed proceedings in the Public Bill Committee on re-committal on Thursday 15 December 2022. The remaining Commons proceedings will take place on 17 January. This updated ECHR memorandum takes account of amendments made by the government since the Bill was introduced to Parliament.
5. Section 19 of the Human Rights Act 1998 requires the Minister in charge of a Bill in either House of Parliament to make a statement before Second Reading about the compatibility of the provisions of the Bill with the Convention rights (as defined by section 1 of that Act). Upon introduction in the House of Commons, the Secretary of State for Digital, Culture, Media and Sport made a statement under section 19(1)(a) of the Human Rights Act 1998 that, in her view, the provisions of the Bill were compatible with the ECHR rights.
Part 3: Providers of regulated services: Duties of care
Part 4: Other duties of providers of regulated user-to-user services and regulated search services
Background:
6. Illegal content: All providers of in-scope user-to-user services will be subject to safety duties (see clause 9). The safety duties include requirements to prevent individuals from encountering “priority” illegal content as well as to mitigate and effectively manage the risk of harm to individuals as identified in the service provider’s illegal content risk assessment. All providers of in-scope search services will be subject to similar safety duties (see clause 23).
7. Clause 170 sets out the approach which providers must take when making a judgement for compliance purposes about whether content is illegal content (or a fraudulent advertisement). The clause makes clear that providers must treat content as illegal content if (and only if) they have reasonable grounds to infer that all elements of a relevant offence are made out. They must make that inference on the basis of all relevant information reasonably available to them.
8. Harmful to children: All providers of in-scope user-to-user services which are ‘likely to be accessed by children’ will be subject to safety duties (see clause 11). The safety duties include requirements to put in place systems and processes to ensure that children are prevented or protected from encountering content which is harmful to children on their services (clause 11). All providers of in-scope search services which are ‘likely to be accessed by children’ will be subject to similar safety duties, including the requirement to minimise the risk of children encountering harmful content in search results (clause 25).
9. Risk assessments: In order to be in a position to comply with these safety duties, service providers will have to carry out detailed risk assessments of the risks posed by the relevant types of content on their services (clauses 8, 10, 22, 24).
10. User empowerment: Providers of regulated user-to-user services which meet thresholds set by the Secretary of State in regulations (Category 1 services) will be subject to user empowerment duties (clause 12). These duties require providers to offer features which adult users may apply to increase their control over particular kinds of regulated user-generated content. They also require providers to offer features which adult users may apply if they wish to filter out non-verified users. (Category 1 services must also offer all their adult users the option to verify their identity: see clause 57.)
11. Freedom of expression/privacy: All providers of regulated user-to-user services and search services will be required to have particular regard to the importance of protecting users’ rights to freedom of expression and protecting users from breaches of privacy when deciding on, and implementing, safety policies and procedures to discharge their safety duties (clauses 18 and 28).
12. Content of democratic importance: Providers of Category 1 services must operate their services using systems and processes to take account of the importance of the free expression of content of democratic importance (clause 13). The terms of service of Category 1 services must specify how content of democratic importance will be treated, particularly in relation to decisions about whether to take down content or restrict access to it, or decisions to take action against a user. These terms of service must be clear and accessible and must be applied consistently.
13. News publishers/journalism: Content published by ‘recognised news publishers’ (established newspapers and broadcasters) will be exempt from the scope of the safety duties. Providers of Category 1 services will be required to use systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about how to treat such content, particularly in relation to decisions about whether to take down content or restrict access to it, or decisions to take action against a user (clause 15).
14. Notification requirements in relation to recognised news publisher content: Clause 14 imposes a requirement on Category 1 services to notify a recognised news publisher and provide a right to appeal before removing or moderating its content hosted on the service, or taking action against the publisher’s account. Such content must remain available on the service until the appeal process is complete. The requirement may be disapplied where the news publisher reasonably considers that it would incur criminal or civil liability if it did not remove the content swiftly, and/or where the news publisher content amounts to a relevant offence as defined by the Bill.
15. Fraudulent advertisements: Part 3, Chapter 5 includes provisions which will place a duty on providers of a subset of in-scope services to put in place systems and processes to prevent individuals from encountering paid-for fraudulent advertisements displayed on their service. Clauses 33 and 34 place the duty on providers of Category 1 and 2A services, which are the largest user-to-user and search services.
16. Complaints: Providers of user-to-user services and search services will be under a duty to have systems in place which allow individuals to make a complaint if the individual considers that the provider is not complying with certain duties imposed by the Bill (see clauses 17 and 27). The complaints procedure must be easy to access, easy to use and transparent. Providers must take appropriate action in response to complaints.
17. Codes of practice: Following consultation with the Secretary of State and other relevant persons specified in the legislation, OFCOM must prepare codes of practice containing steps which a provider may follow in order to comply with the safety duties (see clause 36). A provider does not have to follow the steps in the code of practice in order to comply with the safety duties - providers can choose to comply in other ways. Where appropriate, OFCOM may recommend in the codes of practice the use of technology, including proactive technology (defined in clause 202), for compliance with the illegal content duties, children’s online safety duties and fraudulent advertising duties (see Schedule 4, paragraph 12). In doing so, OFCOM must be satisfied that the use of technology by a service would be proportionate to the risk of harm that the technology is designed to safeguard against. OFCOM must also have regard to the accuracy and effectiveness of, and the absence of bias in, the technology when deciding whether to include a proactive technology measure in a code of practice.
18. In preparing a code of practice containing recommendations on compliance with their safety duties, OFCOM must consult persons whom OFCOM considers have relevant expertise in equality and human rights, in particular the right to freedom of expression, and the right to privacy (clause 36(6)(f)). All codes of practice that OFCOM prepares must be designed to reflect the importance of protecting rights to freedom of expression and protecting from breaches of privacy (Schedule 4, paragraph 10). When complying with the safety duties, service providers must have regard to the importance of protecting rights to freedom of expression and protecting users from breaches of privacy (see clauses 18 and 28).
19. Terms of service: transparency, accountability and freedom of expression: Part 4, Chapter 3 will impose duties on providers of Category 1 services in relation to the removal or restriction of access to regulated user-generated content and the banning or suspension of users. These duties will require providers to put in place proportionate systems and processes designed to ensure (i) that such action is not taken against content or users except in accordance with the terms of service or another legal obligation (clause 64), and (ii) that any relevant term of service is clear and accessible and consistently applied (clause 65).
Article 10 (freedom of expression)
20. Article 10 provides that everyone has the right to freedom of expression. The right includes freedom to hold opinions and to receive and impart information and ideas without interference. Interference with this right is permitted by Article 10(2) where it is prescribed by law and necessary in a democratic society.
21. The safety duties in the Bill engage Article 10 ECHR to the extent that the duties will affect the ability of users to receive and impart certain types of information online.
22. An appreciable amount of the content affected by the application of the safety duties will be of a nature which does not attract the protection of Article 10. The case law of the European Court of Human Rights indicates that, under Article 17 of the Convention (prohibition of abuse of rights), applicants are prevented from relying on Article 10 in respect of acts characterised by factors such as hatred, violence, xenophobia and racial discrimination (see, for example, Norwood v United Kingdom, case no. 23131/03). Thus the Court has held that content expressing support for terrorism does not, by virtue of Article 17, attract the protection afforded by Article 10 (Roj TV A/S v Denmark, case no. 24683/14).
23. The Bill does not impose any restrictions on content produced and published by service providers on their own services. Subject to the requirement in respect of “recognised news publisher” content referred to at paragraph 25 below, it does not require service providers to carry any specific pieces of user-generated content on their services in breach of their terms of service.
24. Services which give rise to a low risk of harm to individuals in the UK are exempted (Schedule 1).
25. The Bill will not apply to news publishers’ websites. Full articles and recordings published by recognised news publishers are also exempted from the scope of the safety duties which apply to user-to-user services and search services (see clause 49(2) and (8)-(10), and clause 50). The Bill also includes, at clause 14, additional protections for recognised news publisher content, requiring Category 1 providers to notify recognised news publishers if they intend to remove their content and providing a specific avenue for the publisher to make representations, which must be heard before that content can be removed. This can be justified as a proportionate means of achieving the legitimate aim of protecting the freedom of the press.
Justification for interference with Article 10 rights of users
26. Article 10 is a qualified right. Any interference with the Article 10 rights of internet service providers and users by Parts 3 or 4 can be justified. The interference is prescribed by law, pursues legitimate aims recognised by Article 10(2) (including the protection of health and morals), and is necessary in a democratic society.
27. Prescribed by law: The Bill establishes an overarching regulatory framework governing the treatment of content online. In addition to the detailed provisions contained in the primary legislation, its application in specific cases will be given a higher degree of legal certainty through the exercise of delegated powers by the Secretary of State and the issuing of codes of practice by OFCOM.
28. Pursuit of a legitimate aim: The purpose of the obligations imposed on service providers in relation to the safety duties is to reduce the risk of illegal and harmful content online causing harm to individuals, particularly children. This is a legitimate aim for the interference with the Article 10 rights of users.
29. To the extent that they relate to terrorism content, the obligations imposed on service providers under the illegal content safety duties can be justified as being necessary in the interests of national security (see, for example, Zana v Turkey, case no. 18954/91).
30. The effect of the safety duties in reducing the prevalence of child sexual exploitation and abuse (CSEA) content and other illegal content online can also be justified as being necessary for the protection of health and morals as well as for the prevention of crime (Mouvement Raëlien Suisse v. Switzerland, case no. 16354/06).
31. In addition, the safety duties for content which is harmful to children will protect the health and morals of children by requiring service providers to reduce the exposure of children to content such as pornography and material encouraging suicide and self-harm. In its case law the European Court of Human Rights has emphasised the margin of appreciation Contracting States enjoy in relation to measures designed to protect children from potentially harmful material (see, for example, Handyside v United Kingdom, case no. 5493/72).
32. Necessary in a democratic society: The European Court of Human Rights (ECtHR) has recognised the importance of the internet for freedom of expression but also its potential for abuse (Delfi AS v Estonia, case no. 64659/09, at paragraph 110). As indicated above, the government is concerned about the prevalence of illegal and harmful content online and the harm it can cause to individuals, particularly children.
33. Additional mitigating factors: The safety duties require providers to take proportionate steps to mitigate risks to users and to operate their services with proportionate systems and processes (see, for example, clause 9(2) and (3)). Clause 9(9)(a) specifically requires providers to consider harm resulting from illegal content in determining what is proportionate, ensuring that the extent to which providers will need to take action in respect of illegal content is limited to what is proportionate in view of the Bill’s aim of preventing harm to users online.
34. Under clause 18, all in-scope service providers are required to have regard to the importance of protecting freedom of expression when deciding on and implementing their safety policies and procedures. This will include providers’ assessments as to whether content is illegal or of a certain type and how to fulfil their duties in relation to such content. Clause 170 makes clear that providers are not required to treat content as illegal content (i.e. to remove it from their service) unless they have reasonable grounds to infer that all elements of a relevant offence are made out. They must make that inference on the basis of all relevant information reasonably available to them.
35. Providers’ compliance with the safety duties will also be informed by the codes of practice issued by OFCOM (and approved by the Secretary of State). Paragraph 10 of Schedule 4 requires OFCOM to ensure that the measures recommended in the codes of practice for the purpose of compliance with the duties are designed in the light of the importance of protecting users’ right to freedom of expression, and incorporate safeguards for freedom of expression where appropriate.
36. In the exercise of their powers and functions OFCOM and the Secretary of State are required, under section 6 of the Human Rights Act 1998, to act in a way which is compatible with ECHR rights.
37. Enhanced user complaints and redress mechanisms will also enable users to challenge decisions made by service providers about the treatment of content. Clause 17 imposes duties on providers of user-to-user services to put in place complaint procedures which allow users to complain to the provider in relation to a decision to take down or restrict access to content on the basis that it is illegal content or (where appropriate) harmful to children, and which provide for effective redress if a complaint is upheld. Equivalent obligations on user reporting and complaints are imposed by clause 65 in relation to content covered by the terms of service duties. Clause 65 also imposes a requirement on providers of all user-to-user services to include clear and accessible provisions in their terms of service informing users of their existing right to bring an action in the courts for breach of contract if their content is taken down, or access to it restricted, or their accounts are suspended or banned, in breach of the provider’s terms of service.
38. The Bill imposes a duty on Category 1 service providers to take measures to protect journalistic content (clause 15). The definition of “journalistic content” includes content “generated for the purposes of journalism” and will therefore capture content produced by citizen journalists as well as content published by recognised news publishers. These measures will afford all creators of UK-linked journalistic content (including citizen journalists) substantive additional safeguards and protections against the removal of journalistic content, including notification and appeal requirements and an expedited complaints procedure, on user-to-user services with the largest numbers of users and highest-risk features.
39. The Bill is expected to have a positive effect on the freedom of expression of some users, by reducing the prevalence of bullying and abuse online and so creating a safer environment in which users (particularly those in vulnerable groups) feel more able to express their views. That will be bolstered by the user-empowerment and terms of service duties, which will give users of Category 1 services greater control over the content they encounter online (including the ability to filter content from unverified users), and prevent the arbitrary restriction of content and user accounts.
40. For these reasons any interference with users’ rights to free expression resulting from the imposition of safety duties in relation to illegal and harmful content is compatible with Article 10.
Recognised news publisher exemption: Combined effect of Article 10 and Article 14
41. The “recognised news publisher” content exemption applies only to content published by established national, local and online newspapers and broadcasters, not to journalistic content produced by citizen journalists.
42. Article 14 of the ECHR provides that the rights and freedoms set out in the Convention shall be secured without discrimination on any ground. It therefore needs to be read in conjunction with the other substantive Convention rights.
43. Article 14 ECHR (read with Article 10) may be engaged because a citizen journalist could seek to argue that they are being discriminated against in the enjoyment of their Article 10 rights because their content does not benefit from the additional protections offered by the Bill to content published by a recognised news publisher.
44. Article 14 applies to discrimination on grounds such as specific personal characteristics or ‘other status’. To the extent that being a citizen journalist would amount to an appropriate ‘other status’ on which to base an Article 14 discrimination claim, there are strong grounds for treating the two groups differently, since the conditions which apply to recognised news publishers (e.g. editorial control, standards code, complaints procedure - see clause 50(2)) mean that content they produce gives rise to a lower risk of harm. In addition, citizen journalists will benefit from strong protections under clause 15 of the Bill.
45. In relation to journalistic content, the Bill is therefore compliant with Article 14 (read with Article 10).
Part 5 - Duties of providers of regulated services: certain pornographic content:
Background
46. Part 3 of the Digital Economy Act 2017 (“DEA 2017”) contains provisions which would require persons who make pornography available on the internet to prevent children from accessing that content. This duty would not apply to internet services which will be user-to-user services under this Bill. In October 2019, DCMS announced that Part 3 DEA 2017 would not be brought into force. Clause 188 will repeal Part 3 DEA 2017.
47. In place of Part 3 DEA 2017, Part 5 of this Bill will introduce a requirement for providers of internet services which publish pornography on their services to ensure that children are not able to encounter that pornography. The intention is that Part 5 will dovetail with the duty in Part 3 of the Bill to prevent children from encountering primary priority content that is harmful to children (clause 11). Part 3 will apply to pornography which is uploaded by users, while Part 5 will apply to pornography which is uploaded by the provider of the service. Part 5 will apply to all internet services, unless they are otherwise exempt or excepted, including those which are neither regulated user-to-user services nor regulated search services.
48. Clause 72 also includes duties intended to protect the privacy of users of services in scope of Part 5, and a duty to maintain a written record of how the service has done this. OFCOM will be required by clause 73 to produce guidance to assist services in complying with the duties described in this paragraph and the one above.
Article 10 ECHR
49. Part 5 engages Article 10 ECHR to the extent that it will affect the ability of providers of internet services to impart information and users to receive information. Part 5 imposes restrictions on internet service providers publishing pornographic content on their own services. It also restricts children’s access to that content. As a consequence of that restriction, it will also be more difficult for adult users to access online pornography as they will need to demonstrate they are over 18.
50. Article 10 is a qualified right. Any interference with the Article 10 rights of internet service providers and users by Part 5 can be justified. The interference is prescribed by law, pursues the legitimate aim of the protection of health and morals recognised by Article 10(2), and is necessary in a democratic society.
51. Prescribed by law: The interference with the Article 10 rights of providers of internet services and users will be set out in the primary legislation. The duties which internet services must comply with if they publish regulated provider pornographic content on their service are set out in clause 72. The duties are clear and precise enough to enable internet service providers to foresee the consequences of their decisions on which content to publish on their services and the steps that they must take to prevent children from accessing any regulated provider pornographic content. The definition of what is pornographic is clear. The definition is based on existing definitions in primary legislation which have been used in the context of trying pornography-related criminal offences for more than a decade. OFCOM will be required by clause 73 to produce guidance in relation to these duties to assist internet service providers with complying with their duties and to provide transparency on how OFCOM will use the enforcement powers granted to them in Part 7.
52. Pursuit of a legitimate aim: The purpose of the obligations imposed on internet service providers in Part 5 is to protect children from suffering harm resulting from exposure to pornography during childhood. This is a legitimate aim.
53. Necessary in a democratic society: The ECtHR has recognised that contracting states have a wide margin of appreciation when assessing the necessity of restrictions on freedom of expression for the sake of protecting health and morality (Mouvement Raëlien Suisse). The ECtHR has held that a wide margin of appreciation applies in the case of restrictions on the publication of sexually explicit content in publicly accessible places (Müller v Switzerland, case no. 10737/84).
54. Part 5 has been drafted to achieve its legitimate aim in a manner which interferes with the Article 10 rights of users no more than is necessary. The restriction on users accessing regulated provider pornographic content only prevents children from receiving the content. While adults will need to complete an age verification step to be able to access the content, their ability to receive the content is otherwise not interfered with. As Part 5 does not prescribe how the age verification step must be done, internet services will have flexibility to introduce age verification in a way which is least burdensome for their adult users (as long as it still restricts children’s access).
55. The interference with the Article 10 rights of internet service providers is similarly limited to no more than is necessary to achieve Part 5’s legitimate aim. Internet service providers will not be banned from publishing regulated provider pornographic content on their service. They will be free to continue to do so as long as it is done in a manner which ensures that children are unable to access that pornographic content.
56. In order to minimise the burden on internet service providers, Schedule 1 and Schedule 9 contain a number of exemptions and exceptions to the duty for circumstances where it is felt that the risk of harm to children is low and that applying the duties would therefore be disproportionate.
57. When assessing the proportionality of measures intended to protect health and morality, the ECtHR has held that the audience of the speech and the impact on that audience has to be taken into consideration (Ponson v. France, case no. 26935/05). In Ponson, the ECtHR held that content being capable of inciting young people to act in unhealthy ways was a relevant and sufficient reason to justify the interference with the publisher’s Article 10 rights.
Notices in relation to terrorism content and child sexual exploitation and abuse content - Part 7, Chapter 5
Background
58. A notice issued under clause 110(2)(a) would require a service provider (of user-to-user services and/or search services) to use accredited technology to identify and remove illegal CSEA content or terrorism content on their services, and/or to prevent users from encountering such content. A notice under clause 110(2)(b) may also require the provider of a user-to-user service to use its best endeavours to develop or source technology that works on its particular platform design and that would identify and remove CSEA content or prevent users from encountering such content, for example where existing technologies are not compatible with their service design.
59. The Bill lists in Schedule 5 the offences that constitute “terrorism content”; these offences are capable of being committed online. The offences which constitute “CSEA content” are listed in Schedule 6 to the Bill; again these offences are capable of being committed online.
60. OFCOM will be able to issue a notice under clause 110(2)(a) irrespective of whether a service provider has complied with the safety duty. This is because it is possible for a service provider to comply with all of the steps in the codes of practice but nonetheless still have significant amounts of CSEA and/or terrorism content on their platform. A notice is therefore a means by which OFCOM can require the provider to use specific technology to identify and remove the material. A notice may be issued in relation to CSEA content present on any part of the service (public or private) but only in relation to public terrorism content.
61. Before issuing a notice under clause 110(2)(a) requiring a provider to use accredited technology, OFCOM must issue a warning notice (see clause 111(2)) containing details including the technology that OFCOM are considering requiring, whether the technology is to be required in relation to terrorism or CSEA content (or both), and explaining that the provider may make representations during a specified period. OFCOM may only issue a notice under clause 110(2)(a) after the period for making representations has expired, and only if satisfied that it is necessary and proportionate in the particular case, taking into account a non-exhaustive list of factors (see clause 112). These factors include the prevalence of the content, the risk of harm to individuals in the UK and the severity of that harm, any interference with freedom of expression, risks in relation to privacy, and whether any less intrusive step is available.
62. The list of factors for warning notices relating to notices to use best endeavours to develop or source technology is the same as it is for notices requiring providers to use accredited technology, except that some factors which do not apply to the development of technology have been omitted. Once a company has developed technology, OFCOM must issue a separate notice to use accredited technology before the company can be required to use the technology that it has developed. This is in order to ensure that all the safeguards that apply to the use of accredited technology are applied to the development of any new technology. In addition, the new technology must be accredited before its use can be required by a notice. Therefore, new technology will be subjected to the safeguards set out above (see clauses 111(2) and 112) before OFCOM can require its use.
63. In addition to the safeguards in clauses 111 and 112, the Bill imposes a duty on OFCOM to produce guidance as to how it proposes to exercise its functions under this Chapter (clause 115).
64. Technology may be accredited by OFCOM or by a person appointed by OFCOM; technology may only be accredited if it meets minimum standards of accuracy approved and published by the Secretary of State (see clause 113(12) and (13)).
65. The Bill will confer a right on service providers to appeal a notice to the Upper Tribunal (see clause 149).
Article 8 (right to privacy), Article 10 (freedom of expression)
66. Content analysis occurring pursuant to a notice under clause 110(2)(a) engages the right to privacy protected by Article 8(1) ECHR. OFCOM will have the power to require the analysis or scanning of private communications for the purposes of identifying and removing CSEA content.
67. The provisions are also likely to engage Article 10 ECHR because they allow OFCOM to require service providers to analyse or scan private and public content for the purposes of identifying and removing CSEA content, or to try to develop technology that does this, and public content for the purposes of identifying and removing terrorism content. While an individual’s right to freedom of expression will not be engaged in relation to terrorism content or CSEA content, requiring a user-to-user service to use such automated tools may constitute a restriction on the Article 10 rights of user-to-user service users, because it will involve the removal of content, and there is a very small possibility that it could lead to the inadvertent removal of legal content.
Justification for interference with Article 8 and Article 10 rights
68. Prescribed by law: The interference is ‘in accordance with the law’ because it is sufficiently foreseeable in its terms to give individuals an adequate indication as to the circumstances in which their rights under Article 8 and 10 may be affected: in addition to the safeguards in clause 112(2), guidance from OFCOM will make clear to individuals the circumstances in which OFCOM may issue a use of technology notice.
69. Legitimate aim: The measures pursue a legitimate aim. To the extent that the measures relate to CSEA content, the measures can be justified as necessary for the protection of health and morals as well as for the prevention of crime (Mouvement Raëlien Suisse).
70. To the extent that they relate to terrorism content, the obligations imposed on service providers under the illegal content safety duties can be justified as being necessary in the interests of national security (see, for example, Zana).
71. Necessary in a democratic society: The issue of a notice under clause 110(2)(a) by OFCOM is a proportionate means of achieving the legitimate aim of identifying and removing illegal CSEA and terrorist content. The tools that OFCOM will be able to require a company to use will be accredited based on standards published by the Secretary of State to ensure that only the most accurate tools are required. OFCOM must consider a number of matters before it issues each notice, including whether the use of less intrusive means would be likely to achieve a significant reduction in the amount of relevant content, the severity of the harm and the widespread nature of the problem. These factors will ensure that OFCOM only issues such a notice where it is proportionate to do so. For these reasons, the provisions relating to notices under clause 110(2)(a) are compatible with Article 8 and Article 10 rights.
Interference with Article 1 of Protocol 1
72. The provisions requiring businesses to use or make best endeavours to develop technologies in particular circumstances will constitute some interference with the way in which affected providers run their businesses. As such, these provisions may engage the protections conferred by Article 1 of Protocol 1 against deprivation of possessions, save in the public interest and subject to the conditions provided for by law and by the general principles of international law.
Justification for interference with Article 1, Protocol 1 rights
73. The measures will not amount to deprivation of property but a restriction of its use or enjoyment. Any restriction can be justified as part of the measures necessary to enforce laws in the general interest as set out in the Bill. OFCOM may only impose the requirements where it is proportionate and necessary, and we therefore expect that the measures will only be used in scenarios where the risk of harm to users is sufficiently high in order to justify the intrusion and where such steps are the only effective measure to protect from harm the users of online services, especially more vulnerable users such as children.
Enforcement powers - Part 7, Chapter 6:
74. Background: A range of enforcement powers will be conferred on OFCOM, enabling it to use a spectrum of measures in order to tackle infringements in a proportionate manner. Clause 118 enables OFCOM to issue a provisional notice of contravention where it considers there are reasonable grounds for believing that the provider has failed, or is failing, to comply with the Bill’s enforceable requirements. This notice must specify a period during which the provider can make representations, after which OFCOM may then issue a confirmation decision (see clause 120) if it considers, following the representations, that the person is failing or has failed to comply with the notified requirement. A confirmation decision will either set out steps the provider must take to comply, or impose a penalty, or both. There are also provisions allowing penalties to be imposed for the non-payment of fees (see clause 128) and for non-compliance with a notice to deal with terrorism or CSEA content (see clause 127). Maximum penalties of £18 million or 10% of qualifying worldwide revenue, whichever is greater, are available to provide a suitable deterrent. Confirmation decisions and penalty notices can be appealed to the Upper Tribunal (see clause 149).
75. More stringent enforcement tools known as business disruption measures (BDMs) are also available to OFCOM. BDMs can be either service restriction orders (SROs) or access restriction orders (AROs) (see clauses 131 - 135). BDMs are intended to have a deterrent or persuasive effect, encouraging the service provider to bring their processes into alignment with the requirements of the Bill. Following an application to an appropriate court, OFCOM can, through the use of SROs, require third party providers to withdraw access to certain services (such as the processing of payments). Through the use of AROs, OFCOM can restrict access to the non-compliant service for UK users. Any application by OFCOM for an SRO or ARO must be supported by detailed grounds and evidence. The court must take into account the rights and obligations of all relevant parties. These safeguards also apply to interim SROs and AROs.
Article 6 (right to a fair trial)
76. Article 6 ECHR provides that everyone is entitled to a fair trial, that is, a fair and public hearing within a reasonable time by an independent and impartial tribunal established by law.
77. In relation to the general enforcement powers (clauses 118-129), the power to fine arguably engages the protections of Article 6(1) as it involves the imposition of an obligation to pay the relevant penalty. Given the potential severity of the penalties, it is possible that a court would find that the penalties are criminal in nature and so impose stricter requirements in order to comply with Article 6. The protections of Article 6(1) may also be engaged by OFCOM’s powers to impose BDMs. These decisions arguably involve the determination of a ‘civil right or obligation’ - primarily they affect the rights of persons to carry out a business (Tre Traktorer AB v Sweden, case no. 10873/84, para 43). The Bill provides that such action may, in all cases, only be taken following the issue of a court order made on an application filed by OFCOM.
Justification for interference with Article 6 rights
78. Penalties: Whether the penalties are criminal in nature or not, we consider that any potential interference with this right can be justified as there are relevant safeguards built into the Bill, including an explicit statutory requirement for OFCOM to give reasons for issuing a penalty (see clause 129(2)(a)), a right to make representations and provide evidence to the regulator (see clause 129(2)(f)), and a right of appeal to the Upper Tribunal where the decision to issue the penalty notice and the decision to impose a penalty of a particular amount may be challenged (see clause 149).
79. Business disruption measures: Whilst part of the enforcement framework, BDMs are not primarily designed to be punitive (there is no formal finding of guilt or handing down of penalties); instead they will act to lower the risks and reduce the potential harm of the breach by reducing the availability of the infringing online service.
80. In light of the above we do not consider BDMs to amount to criminal penalties. However, in the event that they do, their exercise would not give rise to undue interference with Article 6 rights, given the procedural safeguards built into the BDM framework within the Bill.
81. The initial application to the court for any BDM will be open to challenge by both the proposed recipient of the order as well as the non-compliant provider, save where a ‘without notice’ application has been filed (which itself will have to be justified to the court’s satisfaction). Any order made by the court will be susceptible to an appeal through the usual channels.
82. Taken together, this will provide ‘sufficiency of review’ for the purposes of Article 6(1) (Bryan v United Kingdom, case no. 19178/91). Finally, all decisions made in relation to BDMs - whether to apply for an order or to issue one - will be based on a clear and predictable legal basis as set out on the face of the Bill, with clear and exacting requirements for both the application to the court and any order made, as well as in relation to the factors for the court to consider and the thresholds that must be met before an application can succeed.
83. We therefore consider that the enforcement provisions in the Bill are compatible with Article 6.
Article 1 of Protocol 1 (Protection of property)
84. Article 1 of Protocol 1 entitles everyone to the peaceful enjoyment of their possessions. No one is to be deprived of their possessions except in the public interest and subject to the conditions provided for by law and by the general principles of international law.
Interference with Article 1 of Protocol 1
85. The business disruption enforcement provisions (clauses 131 - 135) engage the protections conferred by Article 1 of Protocol 1 against deprivation of possessions, save in the public interest and subject to the conditions provided for by law and by the general principles of international law.
Justification for interference with Article 1, Protocol 1 rights
86. The measures will not amount to deprivation of property but a restriction of its use or enjoyment. Any restriction can be justified as part of the measures necessary to enforce laws in the general interest as set out in the Bill. Moreover, they are proportionate and necessary measures that will normally only be used as the culmination of the enforcement process and/or when addressing the most egregious breaches of the duties set out in the Bill where such steps will be required to protect from harm the users of online services, especially more vulnerable users such as children.
87. BDMs will only be imposed after a detailed application including grounds and evidence has been considered by a court. The court can only make an order if it is satisfied that the making of the order is appropriate for the purposes of preventing harm to individuals, and is proportionate to the risk of such harm (see e.g. clause 131(6)(c)). The court is expressly required to consider the rights and obligations of all relevant parties, including the persons on whom the court is considering imposing the requirements (see clause 131(7)). The court will therefore be able to ensure that any interference with the Article 1 Protocol 1 rights of third parties is in the public interest and strikes a fair balance between the general interest and the protection of the rights of third parties to whom an order is directed.
88. These provisions could have retrospective effect in that they may affect contracts entered into before the relevant provisions in the Bill come into force. There is no presumption that legislation is not intended to interfere with existing rights, nor does the ECHR create an absolute prohibition on retrospective application of legislation (cf. A v United Kingdom, case no. 8531/79).
89. We therefore consider that the measures in Part 7, Chapter 6 are compatible with Article 1, Protocol 1.
Part 10 - Communications Offences
90. The Bill will repeal offences relating to false and threatening communications in the Malicious Communications Act 1988, and section 127(2)(a) and (b) of the Communications Act 2003, so far as they apply to England and Wales.
91. Part 10 contains two new offences which are intended to replace the repealed offences: (i) a false communications offence (clause 160); (ii) a threatening communications offence (clause 162), (together the “communications offences”).
Article 10 (freedom of expression) and Article 8 (respect for private life and correspondence)
92. Any offence criminalising speech will constitute an interference with the qualified right to freedom of expression, and could, depending on circumstance, constitute an interference with the qualified right to respect for private life and correspondence.
93. We consider that the communications offences within the Bill are adequately prescribed by law, pursue a legitimate aim within Articles 8(2) and 10(2), and are necessary in a democratic society.
Prescribed by law
94. We consider that the offences are formulated in such a way as to enable people to foresee with a reasonable degree of certainty the consequences of their conduct.
Legitimate aim
95. The offences in various ways focus on the potential harm to those likely to see the communication. The rationale for interfering with the defendant’s Article 10 (and, where relevant, Article 8) rights is thus primarily to protect the health and rights of others (i.e. those likely to, or intended to, encounter the communication).
Necessary in a democratic society
96. It is clear from the range of harms that attend abusive communications (especially though not uniquely online) that there is a pressing social need for some form of interference in order to prevent such harms. The criminal law is not the least restrictive form of interference, and that is why the offences are targeted at particularly harmful and culpable forms of behaviour.
97. For these reasons, we consider that any interference with a person’s Article 8 or Article 10 rights that might result from the communications offences in this Bill would be compatible with the ECHR.