Policy paper

Online safety and competition in digital markets: a joint statement between the CMA and Ofcom

Published 14 July 2022


Foreword

Online services have become central to how most of us live our lives. This has generated significant benefits for people but has also created new concerns about the potential for consumer harm. The government is responding to these challenges by proposing new forms of digital regulation, building on the existing competition and consumer protection powers of the Competition and Markets Authority (CMA) and Ofcom. The CMA has created the Digital Markets Unit to expand its capacity, capability and expertise in digital markets and to oversee a proposed pro-competition regime. Ofcom will implement the new online safety regime, alongside its existing online safety responsibilities for video-sharing platforms.

While competition and online safety objectives are clearly distinct, they can be complementary. Interventions that promote competition can empower consumers and advertisers to leave if a service does not offer sufficient protection from harmful content, encouraging these services to invest in safety. Similarly, online safety regulation can give people more confidence to engage with new entrants and can ensure that firms do not restrict competition unnecessarily as they seek to protect their users. But there may be occasions where there is a need to balance these objectives, and a careful approach can ensure this is done in a proportionate way.

This statement explains how these interactions create an opportunity for the CMA and Ofcom to implement their competition and safety duties in ways that create the greatest possible benefit for UK consumers and citizens. It is also a first step in building a clear understanding of the way our roles interact in digital markets and how we need to respond. We will continue to work together on the questions raised in this statement, to ensure we understand the case-by-case nature of these interactions and the ways in which they may evolve.

This statement is part of a work programme to support wider cooperation among digital regulators and to ensure a coherent approach to tackling concerns about online activities. It has been developed by the Digital Regulation Cooperation Forum (DRCF) in collaboration with the CMA and Ofcom’s co-members, the Financial Conduct Authority (FCA) and Information Commissioner’s Office (ICO).

Andrea Coscelli and Melanie Dawes

Summary

Consumers benefit significantly from online services that offer an unprecedented choice of content and enable new and varied ways to connect more easily with people and businesses. But these services have the potential to create harm where the competition they face is weak, or they do not invest enough in keeping users safe from harmful content.

This statement explores the interactions between competition and safety online. Our aim is to ensure that the CMA and Ofcom take a coherent approach when dealing with these concerns using their existing powers and under new duties.

How algorithms, data and user interfaces shape consumer engagement with online services is key to understanding some of the competition and user safety issues that are seen today. This creates significant scope for regulators to share knowledge and ensure a joined-up approach where interventions target these service features. But the interactions between safety and competition can be even more direct. This is particularly the case for those platforms that govern the rules by which content reaches users and people connect with others, and which have gained unassailable positions through economies of scale in data or network effects in their markets.

This can create opportunities for interventions that target these platforms to improve outcomes across both regimes. Competition interventions can strengthen incentives for these platforms to improve safety by making it easier for consumers and advertisers to switch in search of better protections. Where online safety interventions clarify the requirements for firms, this can give consumers greater confidence to switch from larger incumbents to new entrants or smaller competitors. Considering these synergies will allow us to shape interventions in both areas to maximise the benefits to UK consumers.

But the effects of interventions across these policy areas are not always complementary. As online safety and competition regimes have distinct policy aims, there can be occasions where interventions for one objective may impact adversely on outcomes for the other. It is important that such impacts are identified and mitigated. For example, interventions to improve online safety may make it harder for new firms to enter and compete with large and established services. Where such unintended effects are unavoidable, regulators will need to be transparent about trade-offs and ensure that the impact is proportionate to the problems they are targeting.

Finally, where platforms act as gateways for other businesses to reach customers, they can take on a ‘quasi-regulatory’ role. In some cases, these gateway platforms may impose safety standards on other businesses that restrict competition more than is needed. Clear online safety standards and collaboration among regulators can ensure that firms meet safety requirements in a way which does not distort competition unnecessarily.

Introduction

Online services are fundamental to most people’s personal and working lives, generating significant benefits. Consumers now have access to an unprecedented choice of content online and digitalisation has enabled new and varied ways to engage more easily with others and with businesses. But these services also have the potential to harm individuals and society. These risks can be accentuated where the competition they face is weak, or they do not invest sufficiently in keeping their users safe.

To tackle these challenges, the UK government has developed legislation to improve online safety for users (the Online Safety Bill, which is currently being considered by Parliament) and is developing a pro-competition regime for digital markets.[footnote 1] These will work alongside the existing competition and consumer duties of the CMA and Ofcom, as well as Ofcom’s duties to keep users safe on UK established video-sharing platforms and its responsibilities for media plurality.[footnote 2]

These regimes provide distinct tools to ensure digital markets deliver good outcomes for UK consumers. Competition interventions seek to ensure that online services keep innovating and offer quality products at competitive prices to attract consumers, while online safety regulation ensures that market players have the right incentives to keep people safe.

However, the nature of digital markets means that online safety and competition interventions can interact. This statement explores these interactions and how the CMA and Ofcom can capitalise on them to maximise benefits for UK consumers.

While not the focus of this statement, competition and online safety can also interact with other policy areas, such as privacy, media plurality and the broad range of issues where consumer protection is needed. Some of these interactions are explored further in a separate statement on interactions between competition and privacy and a forthcoming statement on interactions between privacy and safety.[footnote 3] The CMA and Ofcom will also consider how best to take a coherent approach between online safety and tackling economically harmful and illegal activities (including fraud), as government plans to update consumer rights and strengthen the enforcement of consumer law.

The remainder of this statement is structured as follows:

  • Features of online services that benefit users can also generate online safety and competition concerns
  • How competition and online safety policies interact in digital markets
  • How the CMA and Ofcom expect to work together

Features of online services that benefit users can also generate online safety and competition concerns

Online services have created significant benefits for users

Digital technologies have lowered the costs of producing and disseminating content such as photos, video, messages, and livestream broadcasts, while online services have made it easier to find and interact with this content and other people online. This has created significant benefits for people and businesses by facilitating interactions between a huge number of users.

It has also enabled users to access an unprecedented choice of content, often created by users communicating with friends and family or sharing their views and experiences with others. For example, people upload more than 500 hours of video to YouTube every minute[footnote 4] and WhatsApp carries around 100 billion messages every day.[footnote 5]

However, the features of online services, when combined with other factors, can give rise to competition and safety harms

Platform intermediaries have emerged to help users navigate the vast amount of content available and connections that are possible online, creating scope for significant network effects. For example, users may prefer to use a platform where other people are present with whom they want to interact, while content creators want to be on a platform that attracts the users they value.

As their user numbers grow, platforms curate content by automating the recommendations they make to users. In many cases this curation relies on search and ranking algorithms to identify content in which users may be interested, or to match them to other users with whom they may engage. Platforms’ algorithms can be more effective at matching users’ preferences where they have access to more data about user behaviour, meaning platforms with many users benefit from scale in collecting such data.[footnote 6]

In many cases platforms monetise users’ attention and data by selling advertising. The scale of content and interactions they host has created a wide range of opportunities for advertisers to place their ads in front of relevant audiences. Platforms have also innovated in how they place advertising, both to automate the process and to target ads. Economies of scale and scope in data allow platforms to target ads more effectively.[footnote 7] Network effects also occur, where advertisers are attracted to platforms that offer them access to a large base of users that advertisers wish to target.

In some digital markets, these features contribute to online platforms achieving unassailable positions that afford them significant market power. This can result in consumers losing out through lower quality services, reduced innovation, and higher prices. For example, the CMA has found that network effects and economies of scale play an important role in driving Google and Meta’s market power in search and social display advertising respectively, alongside other features of those markets such as these firms’ vertical integration. This can lead to higher prices for advertisers, which feed through to increased prices for goods and services across the economy.[footnote 8]

The features of online services described above can also contribute to some online safety risks, to individuals and to society. The near-zero cost of creating and disseminating content has facilitated the distribution of harmful content, including misleading or fraudulent content (which often causes financial harm) and abusive content (which poses risks of physical or psychological harm). It can also create new ways for harmful interactions to occur between online users, such as child grooming. Concerns have also been raised that the dissemination of disinformation or misinformation risks distorting the political debate, or that algorithmic content selection may increase polarisation between different groups of users.

These concerns are particularly acute if the network effects that drive scale in online platforms can allow harmful content to reach a wide audience quickly. Risk of harm can also be greater where services create friction for users to move following a harmful experience, or if algorithms serve engaging harmful content to the most vulnerable.[footnote 9]

Mitigating these safety and competition harms requires distinct policy tools

Even in competitive markets, online services may not have sufficiently strong incentives to protect their users from harmful content and harmful interactions. This can be the case if users lack information to compare their potential risk of a harmful experience on one service against another, or if many users wish to engage with content that may be harmful to them.

For this reason, regulators need different toolkits to address market power and online safety risks.

The features of online services set out above are essential in shaping how users engage with them and relevant to understanding some of the competition and user safety issues that can be seen today.

For example, the performance of algorithms that curate content can be a key factor in how services compete for users, while also playing an essential role in whether and how these users are protected from harmful content or harmful interactions. Likewise, user interfaces that are designed to increase the cost of switching or unduly influence consumer choices can dampen competition and make it harder for users to leave when faced with unsafe experiences.

This creates scope for regulators to share knowledge and work together where they have overlapping interests in these service features. Accordingly, the CMA and Ofcom have worked alongside other DRCF members to publish a discussion paper on algorithmic harms and how to audit them.[footnote 10] The CMA and Ofcom are also sharing knowledge on how choice architecture can affect consumer engagement with online services and the implications for safety and competition.[footnote 11]

But the interactions between safety and competition can be even more direct. In particular, they interact closely where platforms that govern the rules by which content reaches users and people connect with others have gained unassailable positions through economies of scale in data or network effects in their markets. The next section explores in greater detail how these interactions can come about, and the implications this can have for designing online safety or competition interventions.

How competition and online safety policies interact in digital markets

This section explores the scope for interactions between competition and online safety under three broad categories:

  • Policy synergies: sometimes interventions in support of one policy objective can have a positive impact on outcomes for another. In these cases, the CMA and Ofcom can take that into account when creating policy to maximise the consumer benefits of an intervention.
  • Policy tensions: in some circumstances, interventions in support of one policy objective risk creating unintended negative effects for another. As regulators assess the impact of potential interventions they should identify such effects and seek to mitigate them. Where such effects are unavoidable, regulators will need to be transparent about trade-offs and ensure impacts are proportionate to the concerns they are targeting.
  • Unnecessary constraints: gateway platforms may design online safety measures in a way which has an unnecessarily adverse impact on competition. In such situations, there is a role for regulators to challenge companies and consider whether more competition-friendly options can achieve equivalent safety outcomes. In some cases, clear online safety standards can also limit companies’ ability to weaken competition by imposing overly restrictive safety requirements on others.

This statement sets out some emerging views of where interactions between competition and online safety can occur in practice. As the CMA and Ofcom develop their regulatory approaches in these areas they will continue to talk to each other, and to other stakeholders, to explore whether and how these interactions should affect their interventions. In doing so, the CMA and Ofcom will also take into account the provisions of forthcoming legislation to implement the online safety and pro-competitive regimes.

Policy synergies: some interventions may promote both competition and online safety

Competition interventions can sometimes strengthen online safety

Competition interventions can sometimes improve online safety outcomes. By creating more choice and enabling users to switch more easily, competition interventions can allow consumers or advertisers to choose to engage most with those platforms that are best at keeping them safe or safeguarding their commercial interests. For example, they allow consumers and advertisers to ‘vote with their feet’ if they think that a platform is not providing a sufficient level of protection.[footnote 12] This can in turn create stronger incentives for providers to meet users’ safety demands.

Such competition measures may interact in a beneficial way with online safety interventions that increase transparency with respect to companies’ efforts to protect users. Informing users about the risk of encountering harmful content on some platforms could help drive competition to offer greater safety, if it empowers users to leave in search of better protection. This may be particularly relevant when parents consider which services their children use.

Where competition interventions can assist online safety outcomes, there can be value in identifying the potential for such wider effects. In turn this can ensure that regulators shape interventions to achieve the greatest benefits for consumers and avoid duplicative interventions. For example, Box 1 describes how greater transparency and control in digital advertising markets may benefit both competition and online safety. Given the potential synergies between online competition and safety in this sector, the CMA and Ofcom will continue to collaborate in this area in the future.

Box 1: Online platforms and digital advertising market study

The CMA’s online platforms and digital advertising market study found that a lack of transparency in the market was undermining competition in the sector.[footnote 13] It recommended that Google and Facebook should provide greater access to underlying data and be more transparent about their targeting algorithms to allow advertisers to understand more about the effectiveness of their adverts, including where their adverts are placed.

Measures that give advertisers more transparency and control over their advertising may not only help address the market power of Google and Facebook but may also help improve online safety. Some advertisers are concerned about ‘brand safety’, i.e. the need to protect their brand image from the negative consequences of advertising next to some types of content. Having greater transparency and control would make it easier for advertisers to avoid advertising next to such content. Where content that is not brand safe is also harmful for users, this would reduce the risk that advertisers inadvertently fund such harmful content. In turn this may help improve online safety, as it would mean platforms have less incentive to promote this content.

Competitive markets can also promote positive safety outcomes in other ways. There are likely to be opportunities for entrants to develop innovative solutions that can foster the emergence of a competitive market for the third-party provision of online safety technologies, like content moderation and age verification. Third-party safety tech markets can enable the availability and take-up of safety solutions, particularly by companies that may not otherwise have the means to develop equally effective solutions themselves.

However, if competition to supply safety tech services is weak, this could dampen innovation or raise costs for smaller companies. Such weakening may occur, for example, through the large platforms purchasing safety tech start-ups. It could also arise where a safety tech market is subject to economies of scale in data or network effects, which in turn may result in the market tipping in favour of an incumbent that faces limited competition. Competition tools could be used to ensure these markets remain contestable and businesses of all sizes have access to safety measures at reasonable prices.

Safety interventions can sometimes improve competition

Online safety interventions may sometimes strengthen competition. For example, online safety interventions can clarify for users the requirements that apply to all online services in scope of regulation. This can give users confidence that these firms will provide a reasonable level of online safety. Users may then be more inclined to switch from larger incumbents to new entrants or smaller competitors, without being concerned about whether those companies will provide sufficient online safety.

Policy tensions: interventions in one policy area sometimes create a risk of negative unintended effects in another

As online safety and competition regimes have distinct policy aims, sometimes interventions in support of one objective may adversely impact outcomes for the other. The CMA and Ofcom consider it important to identify and mitigate such impacts wherever they can, and that the rules set for online services do not impose conflicting requirements. Where such unintended effects are unavoidable, regulators will need to be transparent about trade-offs and ensure impacts are proportionate to the concerns targeted.

It is possible that interventions which seek to enhance online safety may increase the cost of entry to a market, reducing the ability of start-up firms to enter or compete with existing services. Recognising this risk, the online safety regime is being designed to take a proportionate approach to safety duties, with some duties to be imposed only on those companies with the widest reach. Separately, Ofcom will take a proportionate approach when designing the codes of practice for firms in relation to online safety. Ofcom appreciates that the largest services have capabilities and resources that vastly outstrip those of most in-scope services, and it will want to set clear expectations for the biggest firms without imposing a disproportionate burden on smaller or lower-risk services.

Similarly, some competition interventions may risk worsening online safety if they prevent companies from taking actions that can help to protect users. For example, some platforms have voiced concerns about the potential risk that interoperability requirements could undermine their ability to maintain online safety commitments to users. In such a case, the CMA and Ofcom would need to examine the validity of such claims. If the risk is deemed material, they would look at what measures could be taken to mitigate any negative effects for online safety.

Box 2 illustrates how the CMA and Ofcom have collaborated to avoid unintended effects when developing advice to government on how a code of conduct could help govern the relationships between online platforms and content providers. This case study exemplifies how careful design of competition interventions can limit the risk of unintended consequences for online safety.

Box 2: Advice to government on a code of conduct for platforms and publishers

Major digital platforms such as Facebook and Google are an ever more important link between consumers and content providers, whether through search facilities, the posting of content or links, or the hosting of third-party content.

Publishers can benefit from posting on these platforms through the promotion of their content, allowing them to create relationships with new audiences. Platforms benefit from direct and indirect advertising revenues, improvements in the services they offer, increased consumer loyalty, and a greater understanding of their consumers through the way they engage with content.

However, where publishers rely on a platform for access to a significant share of customers, this can give the platform significant bargaining power. In April 2021, the Secretary of State for Digital, Culture, Media and Sport asked the CMA’s non-statutory Digital Markets Unit to work with Ofcom to ‘look at how a code would govern the relationships between platforms and content providers such as news publishers, including to ensure they are as fair and reasonable as possible’.[footnote 14]

The CMA and Ofcom concluded that, where platforms have significant bargaining power, conduct requirements could ensure that terms imposed on publishers are fair and reasonable. These could relate both to the remuneration publishers receive for their content and to other, non-monetary contract terms: for example, publishers’ access to data on users’ interaction with their content when it is hosted on a platform.

As part of this work, the CMA and Ofcom considered how best to shape this competition intervention so that platforms would remain able to de-monetise harmful content where appropriate for safety, and advised that the guidance should not oblige firms to display or pay for harmful content. The CMA and Ofcom suggested in their advice that the wording of the conduct requirement should reflect this.[footnote 15]

While not the focus of this statement, there are also opportunities to take a consistent and joined-up approach between the application of consumer protection law to economically harmful illegal content and online safety regulation. The CMA and Ofcom are already sharing expertise about the framework that the CMA has developed to establish platforms’ legal responsibilities on issues like hidden advertising and trading in fake reviews.[footnote 16] This engagement will inform ongoing work to ensure the CMA and Ofcom take a coherent approach to designing remedies using online safety regulation powers and applying consumer protection law.

Unnecessary constraints: gateway platforms must not restrict competition unnecessarily when keeping users safe

Platforms that act as important gateways for businesses to reach consumers are increasingly setting and applying their own standards for users’ security, privacy, and safety. This sometimes has the effect of requiring other firms to use these standards if they interact with the platform, so that the gateway platforms are increasingly playing a quasi-regulatory role. There may be good reasons for some of the restrictions these platforms put in place, for example in the interests of ensuring privacy, security and online safety, and the setting of standards in itself may not pose any regulatory concerns.

However, the standards that gateway platforms set can have a significant impact on wider market participants and hence raise concerns about decision transparency, consistency of consumer experience and competition. Furthermore, when weighing up the costs and benefits of the standards, gateway platforms may at times have conflicts of interest. This can arise if these platforms have an incentive to impose restrictions that favour their own commercial operations, notwithstanding the interests of consumers.

As such, there is a risk that platforms adopt online safety measures that go further than necessary and so restrict competition unduly. In such cases the CMA and Ofcom would need to consider whether alternative safety solutions would distort competition less. Alternatively, there may be scope to clarify online safety standards to ensure that gateway platforms are unable to use safety as a pretext for restricting competition.

Box 3 illustrates an area where platforms may unnecessarily restrict competition through how they take account of online safety considerations, by reference to the CMA’s mobile ecosystems market study. Here the CMA found that Apple’s actions regarding cloud gaming app restrictions, which it has justified on online safety and privacy grounds (amongst other reasons), have a disproportionate negative impact on competition with a range of Apple products. This illustrates how it can be important for regulators to assess whether restrictions on choice and competition imposed by platforms are genuinely in consumers’ interests. This is an area where the CMA and Ofcom anticipate engaging further once Ofcom takes on its new role as the online safety regulator.

Box 3: Mobile ecosystems market study

The final report of the CMA’s mobile ecosystems market study[footnote 17] provides several examples of Apple citing online safety as a justification for imposing restrictions that diminish competition or user choice.

One example of this relates to Apple’s ban on cloud gaming apps on its App Store. Cloud gaming allows users to play games on mobile devices by streaming content from high-powered computers in the cloud. Users typically subscribe to a cloud gaming service, which offers access to a large catalogue of games. However, Apple requires that iOS users download games as standalone apps from the App Store. As a result, cloud gaming can only be accessed through a web app on Apple devices, which the CMA found provides a sub-optimal experience when compared to the cloud gaming apps available on Android devices.

Apple has justified the restrictions on cloud gaming on the grounds of online safety, security, privacy and user experience and expectations. In terms of online safety, Apple stated that games need to be standalone so that each game can have an App Store product page which displays relevant information about it, such as the age rating and privacy information. Thus cloud gaming service providers may only create a catalogue app insofar as it links to the individual App Store product page for each game (as otherwise, Apple argued, its current safety protections fall away). In addition, Apple stated that it is unable to apply its parental controls to cloud-streamed games, such as setting time limit controls or parental approvals for purchases.

However, the CMA considered that some parental controls, such as the screen time limit, could be applied to a cloud gaming app as a whole and noted that some cloud gaming services have similar parental control systems. Additionally, cloud gaming providers are well-placed to (and in most cases do) offer game-specific information on their own apps equivalent to the information found on the App Store product page (such as age ratings), further mitigating safety concerns.

How the CMA and Ofcom expect to work together

The Online Safety Bill requires Ofcom to consider the impact of proposed online safety interventions on firms of different sizes and capacities, and specifically the impact on small businesses. This approach will tend to identify and mitigate the potential adverse impacts of online safety regulation on competition. Ofcom is well placed to consider these interactions given its experience as a competition and consumer enforcement authority in communications sectors.[footnote 18]

In addition, as the CMA discharges its duties as a cross-sectoral competition and consumer authority, it may at times need to consider the impact of its interventions on consumer safety outcomes. This will also be true when, subject to forthcoming legislation, it implements the pro-competitive regime for digital markets. The CMA and Ofcom will therefore work together to identify cases where such interactions may arise and close collaboration can help deliver better outcomes for UK consumers.

The CMA and Ofcom already work together in the context of their concurrent powers as competition and consumer enforcement authorities in communications sectors. For example, the CMA and Ofcom have collaborated to:

  • Consider proposed telecoms and media mergers, including where they have worked together to examine the potential impact of mergers on media plurality and competition.[footnote 19]
  • Conduct market studies and generate recommendations for sectoral policy (e.g. on digital comparison tools and on the mobile ecosystems market study).
  • Coordinate the use of ex-post competition powers.
  • Provide advice to government (for example through the Digital Markets Taskforce, where the CMA, with input from Ofcom and the ICO, provided advice to government on the potential design and implementation of pro-competitive measures for unlocking competition in digital markets).

The CMA and Ofcom’s Memorandums of Understanding set out in more detail how they work together.[footnote 20]

The CMA and Ofcom have also coordinated with other regulators to tackle cross-cutting issues, both through the UK Regulators Network and, since 2020, through the DRCF. The DRCF ensures a greater level of cooperation in response to the unique challenges posed by the regulation of online platforms. By working together through the DRCF, the CMA and Ofcom will build capability across all digital regulators and collaborate on projects to ensure coherence between regulatory regimes. The CMA and Ofcom will also continue to work closely with other regulators on specific issues, for example, with the FCA and ICO on online fraud.

This joint statement has been developed under the umbrella of the DRCF. It reflects the CMA and Ofcom’s expectation that the forthcoming online safety regime will require deeper collaboration and coordination between them. This will be necessary to better understand and manage the potential synergies, complementarities, or tensions between competition and online safety issues, both in relation to issues that cut across the work of both organisations and in Ofcom’s internal consideration of online safety and competition interactions.

The CMA and Ofcom will consider how their engagement will need to evolve in response to these interactions, and whether this should be reflected in new working arrangements. The CMA and Ofcom anticipate that the way in which they collaborate will depend on the nature of the interactions which are encountered and the ultimate shape of the forthcoming legislation.

Depending on the circumstances, the CMA and Ofcom will work together to share knowledge and experience and coordinate as they implement policy.

Sharing knowledge and experience

The CMA and Ofcom will continue to share knowledge and insights about technological and commercial developments and their implications for regulation, both bilaterally and with DRCF partners. There are many areas relating to digital markets where the CMA and Ofcom share a common interest, including algorithms,[footnote 21] ad tech, cloud and the metaverse.[footnote 22]

There is also scope for joint learning on how service design can shape users’ choices[footnote 23] in ways that can promote competition, media literacy, online safety and users’ control over their personal data. In addition, the CMA and Ofcom, with their DRCF partners, share the objective of encouraging greater dialogue between industry and regulators about the evolution of digital markets, to ensure that digital regulation is effective and coherent.

Coordination

The CMA and Ofcom will work together to identify opportunities where competition interventions may have implications for online safety, and vice versa. They will collaborate to understand the implications of these interactions, with the objective of creating a coherent approach to digital regulation that realises the best outcomes for UK consumers. The nature and extent of this coordination is likely to vary according to the nature of the interactions identified, and to specific sectors and harms.

First, the CMA and Ofcom will look for opportunities to realise synergies between the two regimes, where interventions in one policy area will promote better outcomes in the other. The CMA and Ofcom will consider how to design competition interventions where the shape or choice of intervention may offer specific benefits with respect to user safety, for example by requiring greater transparency in the digital advertising market.

Second, the CMA and Ofcom will also need to coordinate where interventions in one policy area can have negative implications for the other, which may require a mitigating response. In particular, it will be important for the CMA and Ofcom to coordinate to ensure that interventions to drive competition in digital markets do not undermine the safety of users on the largest platforms.

Third, the CMA and Ofcom anticipate that there will be times where Ofcom’s experience in matters of online safety can help inform the CMA’s assessment of safety actions taken by gateway platforms that may pose risks to effective competition. In particular, the CMA and Ofcom will work together to consider interactions between online safety and competition as the CMA addresses the issues identified in its mobile ecosystems market study (the new pro-competition regime will be best suited to addressing many of these issues).

Finally, the CMA and Ofcom will also coordinate with the ICO and FCA to take account of the complementary roles of online safety regulation, the enforcement of consumer protection law and data privacy regulation. This is particularly relevant in relation to work that seeks to address economically harmful and illegal content, such as fraud. As set out above, the CMA and Ofcom are collaborating with respect to their enforcement approaches in this area. Ofcom and the FCA are also working together in relation to illegal financial promotions, for example by optimising information sharing and their approach to platform engagement.

  1. The government published its response to the consultation on a new pro-competition regime for digital markets in May 2022: A new pro-competition regime for digital markets - government response to consultation

  2. In July 2022, Ofcom published its current thinking on how to implement regulation based on the Online Safety Bill as introduced in the UK Parliament on 17 March 2022: Online safety: Ofcom’s roadmap to regulation - Ofcom

  3. In 2021, the CMA and the Information Commissioner’s Office (ICO) published a joint statement on the interactions between competition and data protection in digital markets: Competition and data protection in digital markets joint statement (PDF, 460KB). The ICO and Ofcom will publish a joint statement on the interactions between the data protection regime and the online safety regime in autumn 2022. 

  4. See YouTube for Press

  5. See WhatsApp is now delivering roughly 100 billion messages a day, TechCrunch

  6. Algorithmic content recommendation systems may process personal data to target content. This use of personal data is regulated under data protection law. The ICO and Ofcom will publish a joint statement on the interactions between the data protection regime and the online safety regime in autumn 2022. 

  7. In March 2022 the government published a consultation on reviewing the regulatory framework for paid-for online advertising to tackle the evident lack of transparency and accountability across the supply chain: Online Advertising Programme consultation

  8. The CMA’s online platforms and digital advertising market study found competition problems in search advertising (where Google is dominant), social display advertising (where Meta, including Facebook and Instagram, has significant market power), and the provision of ad tech services to third party publishers and advertisers (where Google has a very strong position). The market study also found that a lack of transparency in ad tech, with respect to how much publishers pay in fees and to whom, results in higher fees. See, Online platforms and digital advertising market study, Final report (PDF, 4.89MB)

  9. Not all competition problems and online safety issues stem from the features of online services outlined above. Ofcom explored the wider drivers of online safety and competition risks in its 2019 publication: Online market failures and harms: an economic perspective on the challenges and opportunities in regulating online services

  10. Digital Regulation Cooperation Forum, Findings from the DRCF Algorithmic Processing workstream - Spring 2022

  11. Digital Regulation Cooperation Forum, Plan of work for 2022 to 2023 (PDF, 526KB)

  12. Choice of platform and the ability to switch is not likely to improve the safety of users who are, for example, seeking access to illegal content or who may not immediately perceive harm that they experience online. 

  13. CMA, Online platforms and digital advertising market study, Final report (PDF, 4.89MB), pages 16 to 18. 

  14. See, CMA and Ofcom, Advice to DCMS on how a code of conduct could apply to platforms and content providers

  15. CMA and Ofcom, Platforms publishers Advice (PDF, 1.23MB), paragraph 5.33. 

  16. In response to a CMA investigation, Meta has committed to doing more to prevent hidden advertising from being posted on its Instagram platform, while the CMA has published a compliance summary for influencers to ensure that they are aware of consumer protection law. The CMA also has a programme of work to tackle fake and misleading online reviews, including ongoing investigations into Google and Amazon. Meta and eBay have already given commitments to address the sale of fake and misleading reviews on their platforms. 

  17. CMA, Mobile ecosystems market study, Final report (PDF, 3.74MB)

  18. Ofcom is required to consider competition as part of its primary duties. Interactions between policy objectives are not unique to the regulation of online platforms. Similar links have arisen at times in markets such as telecoms and payments, where interventions to promote competition operate alongside protections to keep communications and/or payments secure. 

  19. See for example, CMA, 21st Century Fox / Sky merger inquiry case page

  20. See, Memorandum of Understanding between the CMA and Ofcom on the use of concurrent powers under consumer protection legislation (PDF, 339KB) and Memorandum of understanding between the CMA and Ofcom (PDF, 339KB)

  21. Digital Regulation Cooperation Forum, Findings from the DRCF Algorithmic Processing workstream - Spring 2022

  22. Digital Regulation Cooperation Forum, The Metaverse and immersive technologies – A regulatory perspective blog

  23. See for example, CMA, Online Choice Architecture - How digital design can harm competition and consumers - discussion paper (PDF, 338KB) and Ofcom’s Online market failures and harms: an economic perspective on the challenges and opportunities in regulating online services (PDF, 1.35MB) which explores how information asymmetry can impact user choice.