A SUICIDE prevention charity said it was “astonished and disappointed” at Ofcom’s new online safety rules today.
The regulator published its first set of codes of practice as part of the Online Safety Act.
Platforms will now have three months to assess the risk of their users encountering illegal content and implement safety measures to mitigate those risks or face enforcement action.
The Act will empower Ofcom to fine firms up to £18 million or 10 per cent of their qualifying global turnover.
In some cases, sites can even be blocked altogether.
But the Molly Rose Foundation, named after a 14-year-old who ended her life after viewing suicide content online, said the rules will mean that “preventable illegal harm can continue to flourish.”
The charity’s chief executive Andy Burrows said: “While we will analyse the codes in full, we are astonished and disappointed there is not one single targeted measure for social media platforms to tackle suicide and self-harm material that meets the criminal threshold.
“Robust regulation remains the best way to tackle illegal content, but it simply isn’t acceptable for the regulator to take a gradualist approach to immediate threats to life.
“The government must commit to fixing and strengthening the regime without delay.”
Technology Secretary Peter Kyle said: “Ofcom’s illegal content codes are a material step-change in online safety meaning that from March, platforms will have to proactively take down terrorist material, child and intimate image abuse, and a host of other illegal content, bridging the gap between the laws which protect us in the offline and the online world.”