The NSPCC says the watchdog should be created to function as an early warning system to alert Ofcom.
Upcoming online safety laws should include provisions to create a new watchdog to advocate for children and protect them online, a charity has said.
The children’s charity said the current Bill – which is making its way through Parliament – would still leave children at risk of sexual abuse online.
It says the watchdog should function as an early warning system to alert Ofcom, the new regulator for the sector, to threats to children as they emerge and technology evolves.
The watchdog could also be used to offer a counterbalance to any lobbying done by tech giants in an attempt to influence the regulator, the NSPCC said.
“The low priority tech firms place on reacting rapidly to protect children relative to other business imperatives won’t end with regulation, which is why it is so important to have a watchdog to stand up for children at risk of abuse on their platforms,” NSPCC chief executive Sir Peter Wanless said.
“Access to dangers faced by children in real-time will equip industry and the regulator with the information they need to respond quickly.
“Other regulated sectors have bodies that promote the interests of users and ministers have the opportunity to ensure that children are given that voice in the online space.
“The landmark Online Safety Bill will compel companies to finally address the way their sites put children in harm’s way and its effectiveness can be bolstered by a watchdog that ensures children are at the heart of regulation for generations to come.”
The charity’s proposals have been backed by a number of fellow online safety campaigners, including Ian Russell, whose daughter Molly took her own life after viewing harmful content on social media.
Mr Russell, who has since set up an online safety foundation in his daughter’s name, said: “I know how isolating it can feel when speaking up about harmful online content; a small voice crying into a violent storm.
“From my own experience, I understand how beneficial it would be to formalise a system to amplify calls for change.
“Co-ordinated user advocacy would ease the burden placed on those directly affected by harmful content and help to bring about the prompt change required to create a safer online world.
“The creation of a statutory watchdog would also help raise an early alarm when future harms are first encountered.
“So, I support the NSPCC’s call for the Online Safety Bill to create a statutory watchdog to advocate for children and help ensure their safety is at the heart of the new regulation.”
The NSPCC said it had also seen public support for the idea of a watchdog, citing new research it has published which found that 88% of people think such a body is necessary.
A Department for Digital, Culture, Media and Sport spokesperson said: “Our comprehensive online safety laws are built to protect young people and force social media platforms to take tough action against harmful content. We’re making sure organisations can raise serious concerns about platforms and giving Ofcom robust powers to hold tech firms to account.”