In July 2022, the UK’s Competition and Markets Authority (CMA) and Ofcom published a joint statement on online safety and competition in digital markets. The statement follows the introduction of the Online Safety Bill to the UK parliament in March 2022. The current draft of the Online Safety Bill requires firms in scope – including services that host user-generated content and search engines – to take steps to protect their users from harmful online content and activity.
Ofcom is charged with implementing the new online safety regime, while the CMA is tasked with promoting more competition in digital markets. The joint statement from the two organisations identifies areas where these objectives may intersect. In some cases, those objectives are complementary: interventions designed to promote one objective may also promote the other. In other cases, these objectives may be in tension.
Synergies between competition and online safety objectives
Introducing greater competition into digital services markets could help improve online safety. Competitive pressure should encourage firms to enhance users’ experience, including by reducing users’ exposure to harmful content. The existence of alternative platforms also enables users and advertisers to ‘vote with their feet’ and move to a competing service offering greater levels of safety.
Whether users will actually switch in this way is unclear. Many aspects of digital services act as deterrents to effective switching (for example: network effects, the power of defaults, the logged-in environment). The argument also supposes that users can access reliable and impartial information about which services are safest.
Another area of synergy is the growth of the market for third-party online safety services, such as content moderation. These services – usually powered by AI – can sift out offensive content before most users see it. The market for online safety services is particularly important for smaller online service providers, which might lack the means to develop in-house safety solutions.
Tensions between competition and online safety objectives
At the same time, there are also tensions between the two policy objectives. Regulating for online safety could increase the cost of entry into the online services market, as entrants must ensure they are compliant. For this reason, the online safety regime is designed to be proportionate, with more obligations on the larger companies. However, this differentiated regime would appear to make it less likely that users would switch services on safety grounds, as the largest platforms would also have the greatest safety obligations.
The statement also notes that in some cases, pro-competitive interventions might worsen online safety. The example given is that of interoperability requirements, which some platforms claim might undermine their ability to ensure users’ online safety. The CMA and Ofcom note that, in such cases, they would need to examine the validity of such claims and consider mitigating measures to limit the impact of pro-competitive interventions on online safety.
The role of ‘gateway’ platforms
A further tension between the two policy aims can arise in the case of platforms which are ‘important gateways’ for businesses to reach consumers. The statement notes that platforms in this position play a ‘quasi-regulatory’ role in terms of setting standards for users’ privacy and safety. Platforms may then require other firms to use those standards if they interact with the platform. This can generate conflicts of interest: platforms will set standards primarily to their own advantage, rather than in the interest of the broader market.
A concern raised in the statement is that platforms may adopt online safety measures that are onerous for third parties and that restrict competition. The example given is the CMA’s mobile ecosystems market study, which found that Apple had cited online safety as justification for restricting cloud gaming apps on its App Store.
In such cases, the CMA and Ofcom note that they would need to consider alternative safety solutions which lead to less distortion of competition. They also note that there may be scope to clarify online safety standards – though it is hard to see how this could cover every use case or situation.
The CMA and Ofcom have recognised that, in the case of digital markets, policy aims are not always aligned. In particular, tensions exist between ensuring users’ online safety and promoting competition in digital markets. Here, regulators must weigh the long-term benefits of facilitating greater market competition against the more immediate benefits of alleviating online harms, which are significant (this InterMedia article contains discussion of the potential impact of online harms in the UK).
Perhaps the most significant element of the statement is the growing level of collaboration between the CMA and Ofcom, including via the Digital Regulation Cooperation Forum (which also includes the Information Commissioner’s Office and the Financial Conduct Authority). The regulators are working collectively to tackle a variety of issues, such as online fraud and data privacy.
As people’s lives and interactions become increasingly digitised, it will become ever more important for digital regulators to pool expertise. This will particularly be the case with new and emerging phenomena, such as the Metaverse or synthetic media (deepfakes). It is likely that there will be increased dialogue between different regulators, both in the UK and in other markets, as they seek to tackle cross-cutting issues.
 Available at: https://www.gov.uk/government/publications/cma-ofcom-joint-statement-on-online-safety-and-competition/online-safety-and-competition-in-digital-markets-a-joint-statement-between-the-cma-and-ofcom
 Services which have significant numbers of UK users or which are targeted at the UK market will be in scope of the new law.
 Specifically through the establishment of a Digital Markets Unit (DMU) within the CMA.