Crime and Policing Bill: ECHR supplementary memorandum: 20 March 2026
Updated 25 March 2026
Introduction
This supplementary memorandum addresses the issues under the European Convention on Human Rights (“ECHR”) that arise in relation to Government amendments tabled on 20 March for Lords Third Reading. It has been prepared by the Department for Science, Innovation and Technology (“DSIT”).
ECHR Analysis
New clause: “Taking down intimate image content”
1. New clause “Taking down intimate image content” amends sections 10 and 27 of the Online Safety Act 2023 (“the OSA”) to impose new duties on service providers of regulated user-to-user services and search services.
2. In relation to user-to-user services, it imposes a new duty to have proportionate systems and processes designed to take down non-consensual intimate images (“NCII”) (defined as content which amounts to an offence under section 66B(1)-(3) of the Sexual Offences Act 2003) as soon as reasonably practicable and no later than 48 hours after receipt of an ‘intimate image content report’. Such a report must meet the following requirements: it includes a declaration that intimate image content is present on the service and that the report is made by the subject of the content or someone acting on their behalf, and it provides both sufficient information about the content for it to be identified and contact details for the person making the report. It is a matter for the individual whether they choose to make an intimate image content report.
3. In relation to regulated search services, it imposes a corresponding new duty to have proportionate systems and processes designed to ensure that individuals are no longer able to encounter search content in relation to which an intimate image content report has been made, as soon as reasonably practicable and no later than 48 hours after receipt of the report.
4. These duties extend to the removal of copies of the reported content and of content which the provider identifies as being substantially the same as the reported intimate image content.
5. The duties are triggered where an intimate image content report meets the requirements in subsection (2) of new sections 20A and 31A respectively, unless the provider considers that the content is not intimate image content, or that the person making the report is not the subject of the content or a person acting on their behalf.
6. The new clause also provides powers for the Secretary of State to specify additional requirements for an intimate image content report, or to make further provision about how the existing requirements are to be met.
7. Existing duties in section 10 of the OSA protect users from illegal content on regulated user-to-user services and include requirements to prevent individuals from encountering “priority” illegal content such as NCII, as well as to mitigate and effectively manage the risk of harm to individuals as identified in the service provider’s illegal content risk assessment. Further, services are required to have proportionate systems and processes designed to remove illegal content swiftly once they are aware of this content. All providers of regulated search services will be subject to similar safety duties (see section 27). The new duties build on these existing duties. Cross-cutting duties in the OSA will apply to the new duties, such as duties for providers in relation to user reporting and complaints, as well as a duty to consider the importance of freedom of expression and right to privacy.
Article 10: freedom of expression
8. Article 10 provides that everyone has the right to freedom of expression. The right includes the freedom to hold opinions and to receive and impart information and ideas without interference. Interference with this right is permitted by Article 10(2) where the interference is prescribed by law, pursues a legitimate aim and is necessary in a democratic society.
9. Article 10 is engaged to the extent that any duties imposed on service providers could impact the ability of users to receive or impart certain types of information online.
10. The Government considers that any interference would be justified as it is in accordance with the law, necessary in a democratic society both to protect the rights of individuals (whose intimate images are circulated on the internet without their consent) and for the prevention of crime (sharing NCII is a crime under UK law, see section 66B of the Sexual Offences Act 2003). Given the prevalence of sharing of NCII (otherwise known as “revenge porn”) on the internet, the measures in question meet a pressing social need.
11. The duty to remove content does not apply where the provider considers that the content is not intimate image content or that the person making the report is not the subject of the content or acting on their behalf. This is intended to safeguard the rights to access and impart information by mitigating the risk that content which is not NCII is removed, for example as a result of a malicious report.
12. Further, the duty itself is subject to proportionality safeguards and to the cross-cutting duty on providers to consider, in relation to their safety measures, the importance of the right to freedom of expression (sections 22 and 33 of the OSA). Measures described by Ofcom in its codes of practice (for the purpose of compliance with duties in the OSA) must also be designed in light of the importance of protecting the rights of users and interested persons to freedom of expression (paragraph 10 of Schedule 4 to the OSA). In exercising its functions, Ofcom is required, under section 6 of the Human Rights Act 1998, to act in a way which is compatible with ECHR rights.
13. Existing duties on providers to operate a complaints procedure (sections 21 and 32) also apply to the new duty. This includes complaints by users whose content has been taken down by a service provider on the basis that it is illegal content and, in the case of a search service, complaints by an interested person whose content is no longer appearing in search results or is being given a lower priority. Users may also complain if they consider that a service provider is not complying with its duties in relation to the importance of freedom of expression when complying with the illegal content safety duties.
14. The safeguards listed above ensure that the measures are the least restrictive means of effectively addressing the problem of NCII, or “revenge porn”, being shared online.
Department for Science, Innovation and Technology
20 March 2026