On 13 May 2024, the European Commission (Commission), which is responsible for the enforcement of the Digital Services Act (DSA), signed an administrative arrangement (Arrangement) with Ofcom, the UK's communications regulator. The Arrangement will support the Commission's and Ofcom's supervisory work in relation to online platforms and safety, with the relevant legislation being the EU's DSA and the UK's Online Safety Act (OSA). 

The DSA and OSA share the key objective of ensuring better online safety for EU and UK users respectively. Whilst they address some common areas, they differ notably in others (see here for a comparison overview of the two).

Areas of common interest in both the DSA and OSA include the protection of minors online, age-appropriate design technologies, the transparency of online platforms, risk assessments and the impact of algorithms on systemic risks for society. 

The co-operation between the Commission and Ofcom will be carried out through technical expert dialogues, joint training of technical staff, sharing of best practice, joint studies and coordinated research projects. The Commission notes that effective and active co-operation with international partners is crucial to shape a secure and trusted online environment. 

Many digital businesses will be subject to both the DSA and the OSA, given the wide extra-territorial scope of each regime. Businesses facing compliance with these similar, albeit distinct, regimes will welcome this Arrangement and will hope to see further guidance on how the EU and UK regulators will co-operate in this field.

Under the DSA, the Commission is responsible for the supervision and enforcement of provisions applying to designated Very Large Online Platforms and Search Engines (VLOP/SEs). Since the DSA became generally applicable on 17 February 2024, the Commission has not been slow to take action to ensure that the DSA creates a safer online environment, including by:

  • bringing transparency (including publishing over 14.5 billion statements of reasons for content moderation decisions in a public database – for more information see our article here)
  • offering due process for platforms' content moderation decisions
  • offering researchers access to platforms' public data
  • issuing guidelines
  • making requests for information and/or opening proceedings where it suspects that in-scope platforms may be infringing their DSA obligations.

For more information see our articles about TikTok Lite, election integrity guidance and Amazon's online advertising archive.

Closer to home, Ofcom and the ICO, working under the banner of the Digital Regulatory Cooperation Forum (DRCF), also recently published a Joint Statement on Collaboration on the Regulation of Online Services. This sets out: 

  • Collaboration themes – areas where the regulators will work together
  • Companies of mutual interest – how they will identify when they are looking at the same issues or services
  • New ways of working – information sharing and collaboration.

The collaboration themes highlighted in the joint statement will be of particular interest to online businesses whose safety duties under the OSA involve the processing of children's personal data and whose services are also caught by the ICO's Children's Code. The themes will initially cover:

  • age assurance
  • recommender systems
  • proactive tech and relevant AI tools
  • default settings and geolocation settings for child users
  • online safety privacy duties
  • upholding terms, policies and community standards.

Again, it is encouraging to see UK regulators taking a collaborative approach to areas of common interest, which should help provide more clarity to impacted online businesses and ensure consistency in supervision and enforcement.