Ofcom continues its work to bring the Online Safety Act 2023 (the “Act”) into force and has launched two new consultations: one on transparency reporting and one on information gathering.
While the Act became law in October 2023, it is coming into force in stages, with the majority of obligations due to take effect either later this year or in early 2025, following the publication of applicable codes and guidance by Ofcom. These consultations are therefore an important opportunity for industries likely to fall within the scope of the Act to have their say on the proposed scope of the obligations.
Transparency
Ofcom is consulting on draft statutory transparency reporting guidance, which covers the process it will use to decide what providers must include in their transparency reports, and how information from those reports will be used to inform Ofcom’s own transparency report.
As our regular readers know, the Act makes platforms (including social media, search, social gaming and pornography services) legally responsible for keeping people, especially children, safe online. Certain duties in the Act apply to all regulated services, and there are additional duties on certain larger, higher-risk services. The duty to publish transparency reports applies only to providers of certain regulated services, specifically those that appear on a public register of “categorised services” prepared by Ofcom.
Categorised services will have to publish transparency reports according to requirements that are set out by Ofcom in transparency notices. Ofcom’s draft guidance sets out its proposed approach to deciding what information relevant services are required to publish in their reports, as well as information about how it will engage with services throughout the reporting process.
The information Ofcom requires will differ from platform to platform, taking account of the type of service, its number of users and the proportion of those users who are children, along with certain other factors. Ofcom might require data such as how prevalent illegal content is on a service, how many users have encountered such content, and how effective the features a platform uses to protect children are.
Ofcom is also required to produce its own transparency report, drawing conclusions based on the substance of the reports produced by providers. The draft guidance sets out its proposed approach to using information from service providers’ transparency reports in that report.
It is important to note that the categorisation of platforms is also still to be confirmed: Ofcom has provided its advice to the Government on the thresholds that would determine whether a service is categorised. Taking this advice into account, the Secretary of State will need to set the threshold conditions in secondary legislation. Once that legislation is passed, Ofcom will gather information from regulated services and publish a register of categorised services.
Information gathering
Ofcom’s proposed guidance explains when and how it might exercise its information gathering powers. The guidance is intended to be flexible, allowing Ofcom to consider the individual circumstances in which it might use its powers, and covers the factors it might take into account when deciding whether to exercise them. Ofcom might, for example:
- carry out an audit of a tech firm’s safety measures or features;
- remotely inspect the workings of a firm’s algorithms in real time;
- obtain information to allow it to respond to a coroner’s request following the death of a child; and
- in exceptional cases, enter UK premises of tech companies to access information and examine equipment.
The guidance also explains the legal duties imposed on regulated services and other third parties in relation to information gathering, and sets out Ofcom’s expectations for how those services and third parties should respond when it exercises its information gathering powers.
Both consultations close on 4 October, and Ofcom expects to publish the final guidance in 2025. Failure to comply with either a transparency notice or an information notice from Ofcom could result in fines of up to £18m or 10% of a company’s worldwide revenue (whichever is greater), so interested platforms should take this opportunity to review the proposed guidance carefully and respond to the consultation(s) before the October deadline.
As Ofcom puts it: “Our comprehensive transparency powers under the Act will be transformational in shining a light on the best and worst safety practices across the industry - encouraging safety by design - and empowering users.”