April 8th 2019
The Department for Digital, Culture, Media and Sport (DCMS) has proposed a code of practice for technology companies, to be enforced by an independent watchdog. It is designed to tackle a range of online issues that threaten children and other vulnerable groups, including cyberbullying, young people accessing inappropriate material, revenge pornography, incitement to violence, violent content, the encouragement of suicide and the sale of illegal goods. Even the spread of fake news and disinformation is covered.
Stringent requirements would be put in place for companies such as Facebook, Twitter, Google, Snapchat and cloud storage services to take firm action on terrorist content and child sexual abuse and exploitation. Additional powers could include forcing internet service providers (ISPs) to block websites that fail to comply with the rules. Tech firms’ senior managers could be held liable for breaches.
The Online Harms White Paper – which has been jointly produced by the DCMS and the Home Office – also includes a possible industry levy to fund the regulator. It is subject to a 12-week public consultation.
Culture Secretary Jeremy Wright, speaking on today’s BBC Breakfast, said: “The era of self-regulation for online companies is over. Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough.
Mr Wright added: “If you look at the fines available to the Information Commissioner around the GDPR rules, that could be up to 4% of a company’s turnover… we think we should be looking at something comparable here.”
Home Secretary Sajid Javid commented that tech giants and social media companies had a moral duty “to protect the young people they profit from. Despite our repeated calls to action, harmful and illegal content – including child abuse and terrorism – is still too readily available online.”
Peter Wanless, Chief Executive of the children’s charity NSPCC, said: “This is a hugely significant commitment by the government that, once enacted, can make the UK a world pioneer in protecting children online.
“For too long social networks have failed to prioritise children’s safety and left them exposed to grooming, abuse, and harmful content. So it’s high time they were forced to act through this legally binding duty to protect children, backed up with hefty punishments if they fail to do so.”
It has not yet been decided whether a new regulator would be set up, or whether the task would be assigned to an existing body such as Ofcom.