The emergence of new regulatory measures for platforms in the UK, and potential future directions in this realm.

Explore the policy brief "The birth of neo-regulation: Where next for the UK's approach to platform regulation?"

The emergence of new forms of regulatory measures for platforms in the UK. What will be the future trajectory of these regulatory strategies?

The United Kingdom is embarking on a new era of tech regulation, with the enforcement phase of the Online Safety Act (OSA) now in effect as of July 25, 2025. This landmark legislation, which became law in October 2023, is primarily focused on protecting children online through robust age assurance and content moderation requirements.

The OSA places legal duties on a wide range of online service providers, including social media platforms, messaging services, search engines, and video-sharing sites. These providers must prevent the spread of illegal content and, in particular, protect minors from exposure to pornography and other harmful material.

Key elements of OSA enforcement include "highly effective" age checks for services hosting pornography and other adult content, to prevent underage access. Platforms must also proactively assess and mitigate content risks, enforce age limits, and promptly remove illegal or harmful user-generated content.

Enforcement by Ofcom, the UK's online harms regulator, includes substantial fines (up to £18 million or 10% of qualifying worldwide revenue, whichever is greater), court orders to block non-compliant services, audits, and the power to restrict a service's access to payment and advertising providers. Senior managers may face criminal liability for failing to ensure company compliance with child safety duties and with Ofcom's information requests.
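The fine cap described above is the greater of the two figures, not a choice between them. A minimal sketch of that calculation (illustrative only: in practice "qualifying worldwide revenue" is determined by Ofcom under the Act, not by a one-line formula):

```python
def max_osa_fine(global_turnover_gbp: float) -> float:
    """Statutory cap on an OSA fine: the greater of GBP 18 million
    or 10% of qualifying worldwide revenue (simplified sketch)."""
    return max(18_000_000.0, 0.10 * global_turnover_gbp)

# A firm with GBP 1bn in turnover faces a cap of GBP 100m,
# while a small firm's cap is still the GBP 18m floor.
```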

Regarding digital competition, while the OSA's focus is on safety and content regulation rather than competition, its extensive rules apply to over 100,000 companies. This regulatory burden could reshape the UK's digital ecosystem, as platforms must invest heavily in compliance technologies such as AI-powered age estimation and large-scale content moderation, potentially raising barriers to entry for smaller players and influencing market dynamics.
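In practice, the age-assurance duties described above are implemented as a gating layer in front of age-restricted content. The sketch below is a hypothetical illustration (every name, type, and threshold here is invented for illustration, not taken from the OSA or any real vendor) of how a service might fail closed when an age check is inconclusive:

```python
from dataclasses import dataclass

# Hypothetical confidence threshold for treating a check as
# "highly effective"; the OSA does not prescribe a numeric value.
AGE_CONFIDENCE_THRESHOLD = 0.95

@dataclass
class AgeEstimate:
    """Result of an age-assurance check (e.g. facial age estimation)."""
    estimated_age: float
    confidence: float

def may_serve_restricted(estimate: AgeEstimate, minimum_age: int = 18) -> bool:
    """Gate access to age-restricted content.

    Denies access unless the check is both confident and indicates
    the user meets the minimum age -- failing closed, as child-safety
    duties would require.
    """
    if estimate.confidence < AGE_CONFIDENCE_THRESHOLD:
        return False  # inconclusive check: fail closed
    return estimate.estimated_age >= minimum_age
```

The design choice worth noting is the fail-closed default: an uncertain estimate is treated the same as an underage one, which is the conservative reading of a child-safety duty.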

Meanwhile, a policy brief titled "The birth of neo-regulation: Where next for the UK's approach to platform regulation?", authored by Professors Martin Kretschmer and Philip Schlesinger of the University of Glasgow for the AHRC Creative Industries Policy & Evidence Centre (PEC), examines where the UK's emerging approach to regulating online platforms is headed.

Two associated working papers are available: "The interpretation of a 'Strategic Market Status'" by M. Eben and "The neo-regulation of internet platforms in the UK" by P. Schlesinger. The government is also moving to promote competition through new rules limiting the market power of the largest online platforms, the regime from which the "Strategic Market Status" designation derives.

In conclusion, the Online Safety Act represents a significant regulatory shift in the UK, focusing primarily on online user safety, especially for children, through stringent age assurance and content controls enforced by Ofcom. The Act's impact on digital competition is less articulated in these sources, but the scale and depth of compliance requirements are likely to influence the competitive landscape within the UK’s digital market.


  1. The Online Safety Act in the United Kingdom focuses primarily on protecting children online through age assurance and content moderation requirements.
  2. The Act places legal obligations on a wide range of online service providers, such as social media platforms, search engines, and video-sharing sites.
  3. Key elements of OSA enforcement include "highly effective" age checks, proactive risk assessments and mitigation, enforced age limits, and the prompt removal of illegal content.
  4. Enforcement by Ofcom includes substantial fines, court orders, audits, and the ability to restrict payment and advertising.
  5. The OSA's extensive rules apply to over 100,000 companies, potentially reshaping the digital ecosystem in the UK.
  6. A policy brief titled "The birth of neo-regulation" asks where the UK's approach to platform regulation is headed next.
  7. The brief is authored by Professors Martin Kretschmer and Philip Schlesinger of the University of Glasgow for the AHRC Creative Industries Policy & Evidence Centre (PEC).
  8. Two associated working papers are available: "The interpretation of a 'Strategic Market Status'" and "The neo-regulation of internet platforms in the UK".
  9. The government is planning to promote competition by bringing in new rules to limit market powers of online platforms.
  10. The impact of the Online Safety Act on digital competition is less articulated in the sources, but the scale and depth of compliance requirements are likely to influence the competitive landscape within the UK's digital market.
