The UK Online Safety Act is now in enforcement phase
The UK's April 30 deadline for social platforms to prove they're protecting children lands this week.
Under the Online Safety Act, now in its enforcement phase, Ofcom requires Facebook, TikTok, YouTube, and others to set out specifically how they'll keep children safe.
Digital rights organisations spent the better part of a decade pushing for exactly this kind of regulatory pressure on platforms.
The reaction from most civil society groups isn't celebration. It's cautious watchfulness, and for good reason.
The deadline covers four things:
Age controls that actually work,
Grooming protections,
Safer recommendation algorithms, and
No AI product testing on children without prior risk assessment and Ofcom notification.
That last requirement is genuinely new. No major regulator has inserted itself so directly into a platform’s AI deployment cycle before. It matters.
But April 30 is a plan deadline, not an action deadline. Platforms file explanations; Ofcom publishes a summary in May. Real behavioural change, and any enforcement, comes later.
The organisations that pushed hardest for this law are now doing the less visible work: reading those May reports carefully, and telling Ofcom publicly whether the platforms’ promises hold up.
