Government response to draft Statement of Strategic Priorities for online safety
Published 8 May 2025
Introduction
The government’s consultation on its draft Statement of Strategic Priorities for online safety (SSP) sought views from respondents on two questions:
1. From the document you have read, are there any government strategic priorities for online safety that you disagree with or would like to see amended? Please be clear which priority you are referring to in your answer; and
2. Are there any sections in the document where Ofcom’s role in contributing to the strategic priorities could be clearer? Please be clear which section of the document you are referring to in your answer.
The consultation ran from 20 November 2024 to 10 January 2025, in accordance with the requirements of section 173 of the Online Safety Act 2023 (“the Act”). This was a targeted stakeholder consultation.
We received 21 responses, having invited a total of 38 organisations to respond. These responses came from civil society organisations and industry.
For each section of the draft SSP, we set out a summary of responses followed by the government response including, where relevant, how the government will change the draft SSP having considered respondents’ views. We also indicate the number of responses and the strength of feeling in support of any particular position. However, like all consultations, this is not a representative survey and the number or strength of particular positions should not be taken to indicate the view of the general population.
Respondents occasionally provided views on subject matter not directly relevant to the consultation, or only tangentially relevant. Nonetheless, these views are also reflected.
General observations on the SSP
Overall, most respondents were broadly supportive of the priorities set out within the government’s SSP. In particular, they welcomed the focus on ambitious, outcomes-focused priorities. Respondents also highlighted specific areas of focus they were pleased to see featured, including violence against women and girls, innovation and material reductions in illegal content.
Fewer respondents directly addressed the second consultation question. As a result, throughout this document the responses to both questions are considered in tandem.
Some respondents highlighted general observations on the SSP, which are set out below.
Framing of the priorities
Some respondents requested greater clarity in the drafting of the priorities within the SSP. In some instances, this referred to the framing of a whole priority, in others, respondents suggested clarifying how key issues, such as tackling violence against women and girls or child sexual abuse material, had been considered within the priorities.
Beyond this, a few stakeholders indicated a desire for the SSP to not only reinforce measures in the Online Safety Act, but to go beyond these.
Delivery of the priorities
Further to this, some stakeholders asked for greater clarity on the government’s expectations for how Ofcom will need to fulfil its duty to have regard to the priorities within the SSP. This included asking for further detail on the requirements for the annual reviews Ofcom will need to conduct setting out what action it has taken in consequence of the SSP, as well as on delivery milestones and metrics. A few respondents also highlighted a desire for Ofcom to clarify in its early reports where it does not believe it has the levers to deliver on specific priorities.
Some respondents requested that government explicitly set out the expectation that the SSP should not lead to Ofcom deprioritising delivery across any areas of online safety regulation which do not contain a specific reference within the SSP.
Additionally, a few stakeholders indicated that it might be helpful for the government to set out its expectations for the role that in-scope services have in delivering the priorities within the SSP.
One organisation proposed that government should also be held accountable, alongside Ofcom, for delivering against the priorities throughout the time the SSP is in place.
Finally, some organisations noted that the statement references the role of research in supporting the delivery of some of the priorities. Clarity over responsibility for any such research was requested.
Government response: framing of the priorities
In considering the points raised by respondents across each of the priorities, the government has sought to provide the further clarity requested.
The SSP has been drafted to establish a set of forward-looking and ambitious priorities. However, the government does not agree that the SSP can be used as a vehicle to request that Ofcom consider changes beyond what is set out within the framework established by the Act. The Act provides a clear scope for ‘online safety matters’, under section 235, which requires priorities within the SSP to be related to Ofcom’s existing online safety functions.
Government response: delivery of the priorities
Once the SSP has been formally designated by the Secretary of State, the Act requires Ofcom to publish an initial statement within 40 days setting out how it plans to have regard to the priorities, as well as subsequent annual reports on what action was taken. The Act does not, however, stipulate what these outputs should look like. Out of respect for Ofcom’s independence as the online safety regulator, it would therefore not be appropriate for the government to mandate how Ofcom should choose to deliver these.
Further to this, the government is committed to working with Ofcom on the delivery of these priorities once the SSP is designated. The government is establishing a shared monitoring and evaluation framework with Ofcom to evaluate the online safety regime.
The government agrees with respondents that its intent in establishing a set of clear priorities within this statement is not to lead to the deprioritisation of wider areas of Ofcom’s online safety regulatory remit. The government has included language to clarify this.
The Act only requires that Ofcom has regard to these priorities. However, the government expects services to note the clear positions it has set out on the online safety issues considered within this statement. As Ofcom has regard to these priorities in its approach to regulation, we anticipate this will further shape expectations for platforms.
On clarity for research responsibility, the government recognises the importance of a robust evidence base. Within the SSP, the government has highlighted areas it thinks could be considered by Ofcom to further develop the evidence base. However, it does not agree that it would be appropriate for government to be more prescriptive on where responsibility for delivering specific pieces of research should lie.
Priority 1: Embed safety by design to deliver safe online experiences for all users but especially children, tackle violence against women and girls, and work towards ensuring that there are no safe havens for illegal content and activity, including fraud, child sexual exploitation and abuse, and illegal disinformation
Summary of responses
Overall, most respondents welcomed the inclusion of the safety by design priority within the statement. This was particularly the case amongst civil society organisations. Beyond this, some respondents also highlighted support for specific areas of focus within this priority including violence against women and girls, and age-appropriate design.
One industry stakeholder highlighted the work that technology companies have already undertaken to deliver these aims. In some areas they felt this went beyond what the Act requires with regard to embedding safety by design.
Several civil society organisations requested greater clarity from the government on its definition of safety by design. A few respondents specifically requested that the reference to ‘proportionate’ safety by design principles be clarified. One organisation suggested that any such definition should consider intersectionality. Two respondents pointed to a set of principles published in guidance by the Department for Digital, Culture, Media and Sport in 2021. Two others suggested that attaching metrics to safety by design expectations could be beneficial. One organisation suggested clarifying, even at a high level, what safety by design means for enforcement and user redress.
In seeking clarity on the definition of safety by design, some respondents put forward key areas of consideration which they feel are integral to effective safety by design measures. Of these responses, most referred to the importance of a precautionary approach to deploying and developing features and services. One organisation suggested that an explicit link between the approach to safety by design and the risk assessment and safety duties under the Act would be helpful in forging clearer links between this concept and the legislation.
In response to this priority, one organisation set out the view that the safety by design priority should capture content that is not only illegal but also content which is legal but harmful to users.
Some respondents also made references to government, Ofcom and platform engagement. This included highlighting the need to engage with expert organisations, those with lived experiences and a range of children and young people as part of the process to develop and deploy safety solutions and policies. One raised concern over expectations that civil society organisations are largely expected to share evidence and expertise for free.
Government response
Within the introductory text of this priority, the government sets out its position on what safety by design means. The government has provided further detail under this priority to ensure the objective regarding safety by design is sufficiently clear. It has also removed the reference to proportionality.
The government agrees that safety by design measures should deliver safer online experiences for all users, including children. The government has added clarification on this point into the SSP. However, it otherwise feels that the expectation that safety by design should respond to all in scope harms is clear.
The government recognises the valuable work of civil society organisations, and the expertise they can offer government, Ofcom and others to support progress on the online safety agenda. The government remains committed to engaging with a wide range of stakeholders to deliver evidence-based policy solutions.
1.1 Developing a strong evidence-base to support children to have safe, age-appropriate experiences online
While some organisations welcomed prioritisation of continuing to develop the evidence-base, they also noted that government should recognise the existing evidence that is available, suggesting language such as ‘build upon’ may be more appropriate. Some expressed the view that work towards delivering age-appropriate experiences should not be delayed to wait for new evidence, urging steps to be taken to act on existing evidence and best practice. One respondent requested the government use the SSP to set out an expectation that Ofcom should share its findings in relation to child online safety and promote best practice.
One civil society organisation noted the importance of the role of communicating the evidence-base to relevant stakeholders and the public, especially to keep parents and carers informed.
Building on the reference to expanding the evidence-base on age-appropriate design, one respondent put forward specific areas of research it felt important to pursue to support the delivery of overall safety by design outcomes. This included research into demographic and platform specific types and volume of harms. One organisation suggested the government note the difference between proactively embedding safety by design and developing the evidence-base to inform iteration of online safety measures.
Government response
The government welcomes feedback on this priority. It has amended the SSP text in light of consultation responses to use language such as ‘build upon’ in place of ‘developing’. However, the government remains satisfied that the wider drafting of this priority does generally reflect this intent. The government notes that Ofcom regularly publishes its own independent research.
The government welcomes the delivery of a wide range of research to bolster the existing evidence-base. However, the government does not agree that it is appropriate for it to direct Ofcom or other bodies to conduct specific research projects.
1.2 Ensuring companies are effectively deploying age assurance technology to protect children from harm online and investing in technological developments
One issue raised by a few respondents was the enforcement of minimum age requirements, with a request for an explicit expectation to be included that the enforcement of minimum age requirements is addressed by Ofcom. One stakeholder also suggested that this priority may benefit from the inclusion of a definition of highly effective age assurance.
It was also suggested by a stakeholder that this priority should be bolstered with a reference to the link between age assurance and mitigations against grooming.
Government response
The government notes that the Act requires platforms to enforce their chosen age limits consistently.
The government welcomes the point raised about age assurance technology as a mitigation against grooming. We anticipate that the illegal content duties will also make sure providers take action against grooming. For example, Ofcom’s first codes of practice for the illegal content duties include strong recommendations about child sexual exploitation and abuse (CSEA), including measures which will make it materially harder for strangers to contact children online in order to protect children from grooming and other predation.
1.3 Deploying effective and accessible additional protections for adult users, particularly vulnerable users
A few stakeholders cautioned against conflating embedding safety by design with the use of user empowerment tools. Further to this, a few organisations similarly noted that content moderation is distinct from inherent safety by design.
One respondent requested the government assert a stronger position and request that Ofcom promote, where possible, the standardisation of user empowerment tools.
One respondent highlighted the importance of tackling misinformation and disinformation content even where this does not meet the bar for illegality, noting it should face increased friction.
Government response
The government acknowledges the distinction between ensuring platforms are safe by design and the additional protections that user empowerment tools can provide for adults. These tools seek to provide adult users of Category 1 services with choices to bolster the protections provided by platforms, minimising their exposure to content which, while legal, an individual may still find harmful. The SSP has been amended to provide clarity on this point.
As part of the government’s priority on creating an inclusive and resilient society, we highlight the importance of users being aware of and resilient to misinformation and disinformation. As well as the duties in the Act to address illegal disinformation, under the Act’s terms of service and accountability duties, if certain types of legal misinformation and disinformation are prohibited in the largest platforms’ terms of service, those platforms will have to remove such content and enforce their terms consistently.
1.4 Using risk and evidence-based approaches to work towards ensuring there are no safe havens online for illegal content and activity
A few respondents noted concern that language in the priority to ‘work towards’ ensuring there are no safe havens is not specific or ambitious enough.
A few organisations highlighted the challenges posed by private messaging and end-to-end encrypted environments, particularly in relation to tackling child sexual exploitation and abuse. Further to this, one organisation suggested amending the language of this priority to emphasise that section 121 of the Act should be used to tackle child sexual abuse material and terrorism content.
Two civil society stakeholders noted that in places the statement read as if it assumed victims of online harm do not know their perpetrators. In connection with this, one respondent suggested including doxing in the reference to fraud to acknowledge this.
One respondent suggested that the principle of reduction should be broadened within the statement to apply to the Act as a whole, not just illegal content. This respondent also suggested that the government should make specific directions on Ofcom’s approach to illegal content more broadly.
Government response
It is not appropriate for government to seek to direct Ofcom, through the SSP, to use specific powers to implement the regime. However, the government has updated the text which highlights the importance of tackling terrorism content in online environments to also include a reference to child sexual abuse material.
The government has also amended the SSP to recognise that perpetrators of online harm can be both known and unknown to a victim.
Priority 2: Ensure industry transparency and accountability for delivering online safety outcomes, driving increased trust in services and expanding the evidence-base to provide safer experiences for users
Summary of responses
2.1 Improve transparency to increase understanding of the harms occurring on platforms, why they are occurring and the best way to tackle them
One organisation welcomed the emphasis on the use of the transparency regime to develop an understanding of the harms landscape.
Several civil society organisations expressed concerns about the language used in this priority, specifically the government’s assertion that for the reports to be helpful to the wider public, they should be clear, easy to use and accessible. These organisations raised concerns that the trade-off in prioritising accessibility to the public could be services oversimplifying the reports, reducing their utility for groups such as civil society organisations and researchers. It was suggested that if the government was keen to see this type of reporting, it should be a secondary product that services publish alongside the main transparency report. One organisation recommended that Ofcom consider creating and collating all reports in one place for ease of access. A few organisations suggested that, while recognising it is not required by the legislation, the government should also promote the publication of transparency reports amongst smaller services, in particular small but risky services. Specific areas to be included within the reports were also suggested, such as data that highlights user experiences and outcomes of user redress processes.
The reference to algorithmic transparency was welcomed by two respondents. One of these suggested Ofcom invests in understanding how algorithms operate.
One respondent highlighted the challenges faced by researchers seeking access to data on social media platforms and noted the importance of data access for researchers. They also requested that the SSP assert that Ofcom’s report on researchers’ access to data be brought forward.
One organisation highlighted suggested outputs for the Advisory Committee, including expectations around areas of focus and the role it could play in delivering and commissioning research in this policy area.
Government response
The government acknowledges the concerns expressed by civil society organisations about potential oversimplification of transparency reports if they are drafted in a way that ensures they are accessible to the public. The SSP is clear that, as well as being clear, easy to use and accessible for the public, transparency reports must also have sufficient detail and depth to ensure that they can be used by researchers. The SSP has been amended to be even more explicit that the transparency reports will need to serve more than one purpose.
The government has also provided more detail on the Data (Use and Access) Bill, which contains provisions to provide the Secretary of State with the power to make a new framework to allow researchers access to online safety related data. Subject to completing parliamentary passage, this new regime would complement the transparency reporting regime and ensure researchers can access the vital, in-depth data they need to undertake analysis of online safety risks to UK users.
On small services and transparency reports, the government expects all services, regardless of size, to be transparent about the way in which they operate. However, as respondents acknowledge, transparency reporting under the Act is only a requirement for categorised services.
2.2 Parents are treated with respect when requesting information from services following the death of a child, and, through Ofcom, coroners have access to data to understand how online activity may have contributed to the death of a child
A few respondents directly welcomed the inclusion of this priority within the statement. One organisation suggested these provisions should be extended to deceased adults and also noted they felt there was an omission within the statement in relation to information sharing with law enforcement. One further respondent suggested this could be extended to femicide and suicide. One organisation suggested the government clarify that carers should also be treated with respect when requesting information following the death of a child.
Government response
The government understands the importance of treating those who are seeking information following the death of an individual with respect. The government agrees that this priority should apply to both parents and carers and has amended the SSP to reflect this. However, the Act does not provide for some of the wider circumstances to be considered and so we have therefore not expanded the text within the SSP to reflect these suggestions.
2.3 Users are clear what is allowed on services through providers’ Terms of Service, and these are applied consistently
A few respondents put forward their agreement with the importance of clear Terms of Service for users. Some of these put forward suggestions on how this should be delivered, including developing summary or video versions of Terms of Service and child friendly versions. One organisation suggested transparency around risks identified by services on their platforms would be welcome. One suggested a more ambitious framing of this priority than ensuring consistent application of Terms of Service.
Further to this, a few organisations highlighted concerns that the Terms of Service duties within the Act do not preclude platforms from removing provisions from their Terms of Service in order to avoid having to remove legal content which could be harmful for some adult users.
Government response
The government welcomes feedback from respondents on the approach to Terms of Service. The need for Terms of Service to be clear, accessible and applied consistently is established by the Act, and the government has therefore maintained this framing in line with the legislation. The government has clarified that all users, including children, should be aware of and understand Terms of Service.
Ofcom will produce and consult on guidance for the Terms of Service duties under the Act.
2.4 Platforms are accountable to users, increasing the incentives on providers to keep their users safe and benefit wider society by fostering a greater level of trust, safety and transparency
The government’s recommendation that Ofcom analyse user complaints to identify trends that may need to be addressed was welcomed by one respondent. One other respondent suggested that the overall priority should include a reference to children, and parents acting on behalf of children in relation to complaints made to services.
A few organisations noted that the statement does not reference alternative dispute resolution. One provided more detail on the role they felt this type of mechanism could play in user trust and accountability for services, noting the impartiality and expertise that could be established within such a mechanism. One organisation noted that this priority does not make reference to the super-complaints process.
Government response
User redress mechanisms will need to work for all users on a platform, including children. The government is therefore content that the position set out in the current drafting adequately reflects how these mechanisms should apply to all users.
The government welcomes the comments on alternative dispute resolution mechanisms. The SSP has been amended to reference the report that Ofcom is required to produce on the reporting and complaints procedures. Following the publication of this report, the Secretary of State will be able to consider if an alternative dispute resolution system is deemed necessary.
Priority 3: Deliver an agile approach to regulation, ensuring the framework is robust in monitoring and tackling emerging harms - such as AI-generated content - and increases friction for technologies which enable online harm
Summary of responses
A few respondents directly welcomed the focus on agile regulation as a priority. One industry stakeholder was particularly welcoming of this priority whilst also calling for regulation that recognises the role the regulator could play in driving growth.
Some organisations also suggested that further clarity on how government defines agile regulation would be welcome.
Government response
The government acknowledges the requests to define agile regulation. The government is content that the overall meaning of agility within the SSP is sufficiently clear. The government believes it is for Ofcom to decide how it will consider this principle across its regulatory approach.
3.1 Changes in the use of technology that enable online harm are monitored, risk assessed and where appropriate mitigated against
One civil society organisation noted that agile regulation should consider the risks posed by algorithms as well as the role algorithms could play in harm reduction. Further to this, they suggested that there could be value in providing users the option to stop the algorithm from recommending harmful content.
Another suggested this priority may be strengthened by clarifying links between agile regulation and the duties for services to assess and mitigate risks. The same organisation also recommended an outcomes focus to tackling these risks.
Government response
The government has asserted its view that embedding safety by design includes considerations of the role of algorithms. It is therefore content that its expectations in relation to algorithms are clear within the SSP.
3.2 Threats from AI generated content and activity are effectively mitigated
One respondent suggested that Ofcom use its media literacy research to consider the intersection of mis- and disinformation risks with those posed by AI. The same respondent also highlighted the value of disclosing when content is AI-generated to maintain trust in the information environment.
One respondent highlighted the importance of considering lived experiences in relation to the role of AI in online harm.
A few organisations raised risks posed by AI and suggested this priority should focus on AI risks most likely to harm children. Some stakeholders noted their concerns with the rise in harms related to generative AI and AI chatbots, including concerns about the ability of the framework to continue to respond to these new and emerging technologies.
Government response
The Act was designed to provide safer online experiences for all users, but especially for children. The government has added additional clarity into the SSP to confirm that this also applies with regard to AI risks in-scope of the Act.
As set out in the SSP, the government recognises the benefit of continuous engagement with Ofcom. Officials meet with Ofcom regularly and discuss a wide range of issues, including emerging technologies and their possible effects on online safety.
3.3 International cooperation should enable new ideas to tackle online safety to be shared, building a global consensus on online safety
One respondent expressed concern about the uncertainty that platforms’ policy changes may have on successful delivery of online safety regulation that protects UK users. Further to this, they noted the important role the UK has to play in encouraging the deployment of safety technology across jurisdictions.
Government response
International collaboration is crucial in tackling the global threat of online harms. The government agrees that we must build consensus around approaches that uphold our democratic values.
The government believes regulatory approaches should focus on the systems and processes companies put in place to mitigate and manage risks to users.
Tech company accountability and transparency in relation to the development, design and operation of online platforms is fundamental to this approach, as well as the protection and promotion of human rights online.
3.4 Small but risky services are regulated effectively
In response to this section, several organisations highlighted concerns about small but risky services and the fact that they are not likely to be Category 1 services.
A few respondents highlighted concern about the exposure of adults to legal but harmful content.
Government response
This priority sets out clear expectations for small but risky services, which includes the requirement to comply with the illegal content duties and, where relevant, the child safety duties. The government has been clear that where these services do not comply with the duties that apply to them under the Act, the government expects to see enforcement action.
Priority 4: Create an inclusive, informed and vibrant digital society resilient to potential harms, including disinformation
Summary of responses
A few respondents welcomed the focus across this section on the importance of media literacy.
A few organisations also welcomed the emphasis on action on misinformation and disinformation.
One organisation welcomed the reference to the Advisory Committee under Priority 2 but suggested given its link to misinformation and disinformation, an inclusion within this priority may also be beneficial.
The reference to violence against women and girls in this section was welcomed by some stakeholders. One organisation suggested this priority include a reference to online violence against women and girls affecting offline behaviours. Another made a more specific reference to the impact of misogynistic content on men and boys’ behaviour. One group suggested that inclusivity within this priority could be broadened to capture other marginalised groups.
Government response
The government agrees that violence against women and girls online can often manifest as violence offline. It has accepted the suggestion to clarify this within the SSP.
4.1 Users who are aware of and resilient to mis- and disinformation
While generally welcoming the intention of creating an informed and resilient population, some organisations cautioned against reliance on media literacy levers to achieve this. In particular, some organisations welcomed the references to a cross-sectoral approach to deliver these outcomes, while one requested clarity on the government’s expectations for cross-sectoral coordination. This organisation also suggested that a reference to the Department for Science, Innovation and Technology’s (DSIT) media literacy strategy would be welcome.
Government response
The government notes the concerns raised on focusing on media literacy within this priority. Making the internet safer requires a broad toolkit, using the Online Safety Act to ensure platforms limit harmful content and equipping both children and adults with the knowledge and skills to navigate the online world.
Media literacy is complementary to the clear expectations set out earlier in the statement on safety by design, including as it relates to illegal misinformation and disinformation content. Media literacy not only helps prevent harm but also empowers individuals to embrace opportunities for social engagement, learning, and participation in the digital world.
Media literacy is also a broad subject area that cuts across a huge number of issues. DSIT works closely with Ofcom to ensure our approaches are complementary and address a range of media literacy issues. We have updated the SSP to make clearer DSIT’s role in media literacy, referencing the Online Media Literacy Strategy and explaining how DSIT can help Ofcom engage cross-government, where needed.
4.2 Widespread adoption of best practice principles for literacy by design provides users with the tools to navigate their online environments
While one respondent highlighted the positive results achieved through Ofcom’s media literacy programme, a few organisations suggested they would welcome clarity on metrics and an increase in funding for delivery of Ofcom’s media literacy strategy. One respondent suggested there was value in ensuring Ofcom’s media literacy reports were publicly available. Another suggested that an opportunity for users to give feedback on the strategy would be welcome.
A few responses highlighted the role of government funding and cross-department collaboration on media literacy.
Government response
Alongside publishing its research, Ofcom publishes its media literacy reports. Ofcom consulted on its current three-year media literacy strategy between April and June 2024, welcoming views on its proposals.
The government also acknowledges its own role in supporting media literacy initiatives to build resilience to a variety of online harms. The SSP has been amended to reflect the role of government more clearly.
4.3 Parents, carers and children understand risks and are supported to stay safe against online harm
A number of organisations welcomed the prioritisation of supporting parents, carers and children to safely navigate their online environments. Some respondents suggested platforms could play a role in educating users in how to safely navigate their sites. One organisation suggested both government and Ofcom go further in raising awareness of the Act to support delivery of this aim.
Two organisations welcomed the reference to the curriculum review.
Government response
The government will work with Ofcom to ensure that the right information is available to people about the protections the Act introduces, including through our Online Safety Act explainer on gov.uk.
4.4 Young people feel included in the policy making process shaping their digital experiences online
Some civil society organisations welcomed the focus on engagement with children and young people set out in this priority. One organisation suggested providing a variety of engagement mechanisms may support delivery of this aim.
Government response
The government welcomes the views expressed on this priority and the work already being undertaken by Ofcom to engage children and young people.
4.5 Risks to trust in online information due to AI-generated content are effectively mitigated
One respondent welcomed the mention of labelling AI-generated content. Another suggested that the language on increasing platform transparency should be strengthened to call for full transparency.
Government response
The government has maintained its position on requiring increased transparency. It believes it has struck an appropriate balance between ambitious proposals for furthering platform transparency on online safety, whilst recognising that there will be some information which should not be shared more widely, such as personal or commercially sensitive data.
Priority 5: Foster the innovation of online safety technologies to improve the safety of users and drive growth
Summary of responses
A few respondents explicitly welcomed the inclusion of innovation as a priority within the statement, including welcoming its overall ambition. In particular, those organisations were supportive of the government’s intention that innovation should seek to raise not only the floor but also the ceiling of what is possible.
One industry stakeholder requested clarity on the role of the regulator in supporting economic growth and innovation.
Government response
The government has provided further clarity on Ofcom’s role in supporting economic growth and innovation by making clearer that Ofcom should target its regulatory efforts towards high-risk services, whilst giving low-risk services the freedom to innovate without disproportionate regulatory burdens.
5.1 Encouraging innovation in safety technology to improve experience of all users online
In relation to encouraging innovation, some respondents made references to issues related to funding and barriers to investment. One respondent suggested government should look to commit funding to drive innovation in the safety technology sector.
Others raised wider risks to innovation within the safety technology sector. One industry respondent highlighted that a focus on developing safety solutions in-house poses a risk to innovation in the wider safety technology ecosystem. Further to this, respondents highlighted the importance of access to data on harms for developing detection capabilities, and the difficulty of securing such access.
One respondent suggested that a reference to government and Ofcom accrediting safety technologies should be included within this priority.
Another respondent suggested that the Department for Science, Innovation and Technology should explore the role algorithms could play in detecting harmful browsing patterns. A third organisation suggested that transparency from Ofcom on gaps in best practice, and steps to stimulate action to address these, would be welcome.
Government response
Ofcom published a consultation in December 2024 on accredited technologies and the use of proactive technology. The government looks forward to receiving its advice on this topic.
The government recognises the importance of investment to drive innovation. It supports and encourages investment to develop and improve new and existing safety technologies in support of the aims of this statement, and the online safety regime more broadly.
In setting out the government’s expectations for safety by design within this statement, it has also clearly referenced a need for services to consider the role their algorithms play, including a more detailed discussion of the role algorithms play in the dissemination of harmful content. The government has therefore not amended this priority to further reflect this issue.
5.2 Driving adoption of safety technologies by online services to improve experience and support compliance
One respondent made reference to the role that deployment of device-level safety technology could play in delivering safety by design across all applications.
One organisation particularly welcomed the references to the use of hash matching technology to detect child sexual abuse material. They suggested the application of this technology should be broadened to other types of harm, including non-consensual intimate images and terrorism content. Further to this, they also described the value of smaller platforms deploying such measures.
Government response
The government has set out clear expectations in the SSP that it wants to see material reductions in UK users encountering illegal content through online platforms, and of online platforms being used to facilitate priority offences. We have also been clear that we expect Ofcom to use its powers to oversee this, alongside platforms taking proactive steps to reduce the risks their services can be used to carry out the most harmful illegal activity. The government is therefore content that the expectations around tackling these types of content are clear within the SSP.
5.3 Supporting the development of more effective age assurance technologies
One respondent recognised the role of Ofcom in recommending age assurance technologies but noted that there should also be an expectation that Ofcom ensure services are held to account for successfully implementing these measures.
One stakeholder expressed concern that the framing of this priority could suggest that the government does not perceive there to be existing age assurance technologies which operate to sufficient degrees of accuracy. One respondent noted they would welcome the inclusion of government’s desired outcomes in relation to this priority. They also noted the role that development of effective age assurance technologies could play in supporting the provision of age-appropriate experiences for children on services.
Government response
Through this priority, the government highlights the development of age assurance technologies, recognising a range of methods to assess the age of a user, including, but not limited to, age verification or estimation technology. The government has further clarified this point in the SSP and has made clear that effective methods exist.
The government has also further amended the SSP to highlight links between implementing age assurance and delivering age-appropriate experiences.
Other comments
Some organisations highlighted some wider areas for consideration. We address these below.
Implementation of the Online Safety Act
One respondent highlighted the scale and complexity of the challenge for Ofcom in implementing this new regime and commended its efforts and expertise. However, a small number of respondents expressed concern that, while the ambition of the statement was welcomed, it does not align with the early implementation approach for the Act, primarily in relation to the illegal content codes, identifying risks and the approach to safety by design.
One organisation highlighted the need for transparency about the success of the online safety regime.
Existing legislative framework(s)
In response to the consultation, some civil society organisations highlighted pieces of legislation they would like to see amended or introduced to support delivery of the priorities within the SSP.
Online Safety Act
Among the respondents who suggested areas for amendment of existing legislation, most discussed amendments to the Online Safety Act. These responses made reference to updating areas that are already captured by the Act as well as new areas they would like to see captured by the framework.
Some stakeholders also discussed areas where they felt the duties within this framework were more limited, for instance with regard to how the framework addresses misinformation and disinformation. Some civil society organisations proposed amending the Act to remove “safe harbour” and to loosen the language in Schedule 4 on the evidence threshold for recommending safety measures.
One respondent set out its view that the ambition of the SSP does not align with the design of the Act’s framework. A few respondents set out their expectations that future legislation would be required to capture new and emerging technologies within online safety regulation.
Wider Legislation
A small number of respondents highlighted measures they welcomed or would want to see within the Data (Use and Access) Bill. This included suggestions to clarify that researchers conducting research on online safety matters in the UK will be able to access information regardless of where they are located and setting out protections for researchers from litigation.
Two organisations provided views that the government should consider future legislation on the enforcement of minimum age requirements if services do not implement these measures under the Act effectively. One respondent also highlighted its views on what should be captured within any future AI legislation, including in relation to the use of AI to generate illegal content such as child sexual abuse material and responding to emerging and evolving AI technologies.
Advertising
A few organisations suggested that the SSP could include references to advertising, with one organisation providing greater detail on this. This respondent suggested the SSP should highlight links between advertising and revenue generated by services; set out expectations for greater transparency in the advertising supply chain; and consider the regulation of influencers and their content, in particular in relation to the information environment.
Government response
Implementation of the Act
The government continues to work closely with Ofcom to ensure the Act is implemented as quickly and effectively as possible. Ofcom has already set out its plans to consult on the next iteration of the illegal content codes this year.
The Act requires that a post-implementation review is conducted to evaluate the success of the regime. The government and Ofcom are establishing a joint monitoring and evaluation framework to inform this work. Once completed, this review will be laid in Parliament.
Existing legislative framework
The priorities within the SSP must adhere to the scope established by section 235 of the Act, which defines online safety matters, requiring that priorities are linked to Ofcom’s existing online safety functions. As such, the SSP cannot consider issues which are not captured within the framework established by the Act. The government states in its draft SSP that, where new legislation is clearly required to respond to problems, it will consider this, but that it aims for the SSP to enable innovation to deliver the best outcomes within the existing framework.
Advertising
Given the scope of the SSP, the wider issues raised about the advertising environment have not been included. The government is continuing to work with industry through the Online Advertising Taskforce, which has a remit to help build the evidence base and address concerns about trust, accountability and transparency in the online advertising ecosystem.
Detail of feedback received
From the targeted stakeholder consultation, we received 21 responses.
38 organisations were invited to respond. These responses came from civil society organisations and industry. They included:
- 5Rights Foundation
- Center for Countering Digital Hate
- Centre for Young Lives
- Children’s Commissioner for England
- Conscious Advertising Network
- End Violence Against Women Coalition
- Full Fact
- Institute for Strategic Dialogue
- Internet Matters
- Internet Watch Foundation
- Molly Rose Foundation
- NSPCC
- Online Safety Act Network
- Online Safety Technology Industry Association
- Parent Zone
- Refuge
- Reset Tech
- Samaritans
- Tech UK
- UK Safer Internet Centre
- UsForThem