Technology regulation in the UK is accelerating on multiple fronts simultaneously. The Online Safety Act became law in October 2023, giving Ofcom sweeping new enforcement powers over digital platforms. The Data (Use and Access) Act 2025 reshapes the UK's data protection framework post-Brexit. The Digital Markets, Competition and Consumers Act gave the CMA a new pro-competition regime for the most powerful digital firms. And the UK's AI governance framework, deliberately light-touch compared to the EU's AI Act, is being operationalized through sector regulators. For technology companies, the UK regulatory environment in 2026 is fundamentally different from even two years ago.
Key Regulatory Bodies
Information Commissioner's Office (ICO) — the UK's data protection authority, enforcing the UK GDPR, the Data Protection Act 2018, and the reforms introduced by the Data (Use and Access) Act 2025. The ICO issued over £40 million in fines in 2024 and has expanded its focus to include AI training data, adtech, and children's data. Its regulatory sandbox and innovation hub provide guidance to tech companies developing new products.
Ofcom — the UK's communications regulator, now also the principal online safety regulator under the Online Safety Act 2023. Ofcom is developing and enforcing codes of practice covering illegal content, child safety, and adult user empowerment across search engines, social media platforms, messaging services, and online marketplaces. Its enforcement powers include fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
Competition and Markets Authority (CMA) — the UK's competition regulator, with new powers under the Digital Markets, Competition and Consumers Act 2024 to designate firms with "strategic market status" and impose conduct requirements. The CMA's Digital Markets Unit is actively investigating the competitive dynamics of AI foundation models, cloud computing, and app ecosystems.
Department for Science, Innovation and Technology (DSIT) — the government department setting technology policy including AI governance, digital infrastructure, and R&D strategy. DSIT published the UK's pro-innovation AI regulation framework and coordinates cross-regulator AI governance through the AI Security Institute (formerly the AI Safety Institute).
National Cyber Security Centre (NCSC) — part of GCHQ, the NCSC publishes cybersecurity guidance, incident alerts, and compliance standards that affect technology companies' security obligations. Its Cyber Assessment Framework is increasingly referenced by sector regulators as a baseline for cybersecurity requirements.
Critical Regulations
- Online Safety Act 2023 — requires platforms to conduct risk assessments for illegal content and content harmful to children, implement safety measures, and comply with Ofcom's codes of practice. Category 1 services (the largest platforms) face additional duties around user empowerment and transparency. Ofcom is publishing codes of practice in phases through 2025-2026, each creating new compliance obligations.
- Data (Use and Access) Act 2025 — amends the UK GDPR and the Data Protection Act 2018 to create a more business-friendly data protection framework. Changes include a new "recognised legitimate interests" lawful basis, relaxed rules on automated decision-making, and a restructuring of the ICO into the Information Commission. Tech companies must update their data processing practices and privacy documentation to reflect the new provisions.
- Digital Markets, Competition and Consumers Act 2024 — creates a new regulatory regime for firms designated as having "strategic market status" in digital activities. Designated firms face binding conduct requirements set by the CMA, merger reporting obligations, and potential fines of up to 10% of global turnover for non-compliance.
- Network and Information Systems Regulations 2018 (NIS, as amended) — the UK's cybersecurity requirements for operators of essential services and relevant digital service providers. The government's planned Cyber Security and Resilience Bill is set to update this regime, expanding its scope in line with the evolving threat landscape and the EU's NIS2 Directive.
What You're Missing
Ofcom's Online Safety codes are arriving in phases. The Online Safety Act is the framework — Ofcom's codes of practice are the actual compliance requirements. These are being published throughout 2025 and 2026, each applying to different categories of services. Companies that read the Act and stopped monitoring may not realize their specific obligations until a code arrives with a short implementation deadline.
AI governance is being distributed across existing regulators. Unlike the EU's centralized AI Act approach, the UK is pushing AI oversight to sector regulators — the FCA for financial AI, the MHRA for medical AI, Ofcom for AI in content moderation. Each regulator is developing its own approach at its own pace, making cross-sector AI governance a multi-regulator monitoring challenge.
How RegPulse Helps
RegPulse tracks the ICO, Ofcom, CMA, DSIT, and NCSC for technology-relevant regulatory publications. When Ofcom issues a new Online Safety code of practice, when the ICO publishes enforcement guidance on AI training data, when the CMA opens a digital markets investigation — you receive same-day alerts. Technology companies can set up monitoring profiles that span all relevant regulators, capturing the cross-cutting regulatory picture that no single agency provides.
Start monitoring technology regulations in the United Kingdom
Track Online Safety Act codes, ICO enforcement, CMA digital markets decisions, and AI governance developments in one feed.
Start free trial — no credit card