Social media giants have been sent a final list of measures they must put in place to protect kids online by July - or risk being fined 10% of their global turnover.
A legal responsibility for online services to make their sites safe for children will come into force in the summer, after years of dithering by media regulator Ofcom. Technology Secretary Peter Kyle described today as a “watershed moment” in turning the tide on “toxic experiences” on social media.
Under the Online Safety Act, social media firms will be ordered to ensure they tame toxic algorithms, take faster action on removing harmful content and introduce proper age checks on their platforms.
Tech companies will now be expected to begin assessing the risk of harm to children on their platforms after Ofcom’s final children’s safety codes were published today. From July these protections will be fully enforceable and services that don’t comply could face serious enforcement action from Ofcom.
This could include fines of £18million or up to 10% of their global revenue. Ofcom could also impose other business disruption measures, such as requiring payment providers or advertising services to withdraw from an online site.
Under the legally-binding guidance, firms must protect children from content that promotes, encourages or provides instructions for suicide, self-harm or eating disorders, as well as pornography, bullying and violent material. This includes instructing platforms on how they can reduce toxic algorithms which are known to recommend harmful content to children without them seeking it out.
Steps platforms can take will vary depending on the risk of harmful content, but include introducing age checks such as photo ID matching, facial age estimation or credit card checks, and filtering harmful content out of algorithms. Social media sites will also need to ensure they have more robust content moderation systems so they can take swift action against harmful content when they become aware of it.
But it comes amid fears Donald Trump’s return to the White House marks a new era in the age of tech giants. The US President - who is close pals with X owner Elon Musk - has voiced his fierce support for Silicon Valley, especially in defence of free speech and less moderation. Meta, which owns Facebook and Instagram, abandoned its use of independent fact checkers in the US earlier this year after Mr Trump criticised them.
Concerns have been raised that the Government, which is scrambling to maintain good relations with the Trump administration, could pander to the US on tech issues.
While the Online Safety Act became law in October 2023, Ofcom has not started using its powers yet as it has been undertaking painstakingly long consultations on its new guidance. In January, Mr Kyle admitted the laws are “very uneven [and] unsatisfactory”. He said MPs need to get into a better cycle of “updating” current laws because of the extremely fast pace at which technology develops.

The dad of Molly Russell, who took her own life at 14 after being bombarded with harmful material online, said Ofcom's measures "will fail to prevent more young deaths like my daughter Molly's". Ian Russell, chair of suicide prevention charity the Molly Rose Foundation, said: "I am dismayed by the lack of ambition in today's codes. Instead of moving fast to fix things, the painful reality is that Ofcom’s measures will fail to prevent more young deaths like my daughter Molly's.
"Ofcom’s risk-averse approach is a bitter pill for bereaved parents to swallow. Their overly cautious codes put the bottom line of reckless tech companies ahead of tackling preventable harm. We lose at least one young life to tech-related suicide every single week in the UK, which is why today’s sticking plaster approach cannot be allowed to stand."
On the publication of Ofcom’s codes this morning, Mr Kyle said: “Growing up in the digital age should mean children can reap the immense benefits of the online world safely, but in recent years too many young people have been exposed to lawless, poisonous environments online which we know can lead to real and sometimes fatal consequences. This cannot continue.
“The Children’s Safety codes should be a watershed moment – turning the tide on toxic experiences on these platforms - with the largest social media companies now having to prioritise children’s safety by law. This means age checks to stop children being exposed to the most extreme harmful content, as well as changes to platform design including algorithms to stop young users being served up harmful content they often aren’t even seeking.
“Like parents across the country I expect to see these laws help create a safer online world, so we set every child up for the best start in life. But we won’t hesitate to go further to protect our children; these laws are the foundation, not the limit, when it comes to children’s safety online.”
Dame Melanie Dawes, Ofcom's chief executive, said: “These changes are a reset for children online. They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content. Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”