Online Safety Bill to tackle ‘absurd situation’ of age limit verification
Changes to the Online Safety Bill will tackle the “absurd situation” surrounding the enforcement of age limits on social media platforms, the Culture Secretary said.
The Government has announced it is making amendments to the proposed internet safety laws in order to boost child online safety.
The updates will require tech firms to show how they enforce user age limits, publish summaries of risk assessments regarding potential harm to children on their sites, and declare details of enforcement action taken against them by Ofcom, the new regulator for the tech sector.
Writing for The Telegraph, Culture Secretary Michelle Donelan said the new version of the Bill better reflects its “original purpose” to “protect young people”.
“Protecting children is the fundamental reason why the Online Safety Bill was created, and so the changes I have made strengthen the child protection elements of the bill significantly,” she said.
“Some platforms claim they don’t allow anyone under 13 – any parent will tell you that is nonsense. Some platforms claim not to allow children, but simultaneously have adverts targeting children. The legislation now compels companies to be much clearer about how they enforce their own age limits.”
Along with the changes to boost child safety online, controversial measures that would have forced social media sites to take down material designated “legal but harmful” are to be removed.
Under the original Bill’s plans, the biggest platforms would have been compelled to not only remove illegal content, but also any material which had been named in the legislation as legal but potentially harmful.
These measures drew criticism from free speech campaigners, who claimed that governments or tech platforms could use the Bill to censor certain content.
The amended Bill would require platforms to remove illegal content, as well as take down any material that breaches their own terms of service.
Instead of the legal but harmful duties, there will now be a greater requirement for firms to provide adults with tools to hide certain content they do not wish to see – including material that does not meet the criminal threshold but could still be harmful, such as the glorification of eating disorders, misogyny and some other forms of abuse.
The Government is calling this approach a “triple shield” of online protection which also allows for freedom of speech.
Under the Bill, social media companies could also be fined by Ofcom up to 10% of annual turnover if they fail to fulfil their policies to tackle racist, homophobic or other content harmful to children on their platforms.
The updated rules will also prohibit a platform from removing a user or account unless they have clearly broken the site’s terms of service or the law.
“I have carefully amended the Online Safety Bill to ensure it reflects the values of our way of life – protecting children, safeguarding the vulnerable, protecting legal free speech and defending consumer choice,” Ms Donelan wrote.
“These common sense solutions combine to form the basis of a bill that will genuinely change lives for the better, while also protecting the rights and values we hold dear.”
However, Julie Bentley, chief executive of Samaritans, described dropping the requirement to remove “legal but harmful” content as “a hugely backward step”.
“Of course children should have the strongest protection but the damaging impact that this type of content has doesn’t end on your 18th birthday,” she said.
“Increasing the controls that people have is no replacement for holding sites to account through the law and this feels very much like the Government snatching defeat from the jaws of victory.”
Shadow culture secretary Lucy Powell said it was a “major weakening” of the Bill, adding: “Replacing the prevention of harm with an emphasis on free speech undermines the very purpose of this Bill, and will embolden abusers, Covid deniers, hoaxers, who will feel encouraged to thrive online.”
The latest changes come in the wake of other updates to the Bill, including criminalising the encouragement of self-harm, as well as “downblousing” and the sharing of pornographic deepfakes.
The Government also confirmed further amendments will be tabled shortly aimed at boosting protections for women and girls online.
In addition, the Victims’ Commissioner, Domestic Abuse Commissioner and Children’s Commissioner will be added as statutory consultees to the Bill, meaning that Ofcom must consult them when drafting the new codes of conduct that tech firms must follow in order to comply with the Bill.
The Online Safety Bill is due to return to Parliament next week after being repeatedly delayed.
Published by Radio NewsHub