Websites to face fines over 'online harms'
The Government is proposing new laws to fine websites over “harmful online content”, such as terrorist propaganda or images of child abuse, in a bid to make the UK “the safest place in the world to go online”.
The proposals were announced earlier this week in the Government’s Online Harms White Paper, a joint proposal from the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office. They include the creation of an independent watchdog that will draw up a code of practice for online companies.
The plans cover a range of issues that are clearly defined in law, such as spreading terrorist content, child sex abuse, so-called revenge pornography, hate crimes, harassment and the sale of illegal goods. They also cover harmful behaviour with a less clear legal definition, such as cyber-bullying, trolling and the spread of fake news and disinformation.
“Given the prevalence of illegal and harmful content online, and the level of public concern about online harms, not just in the UK but worldwide, we believe that the digital economy urgently needs a new regulatory framework to improve our citizens’ safety online,” the White Paper states.
Outlining the proposals, Culture Secretary Jeremy Wright said: “The era of self-regulation for online companies is over. Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough.
“If you look at the fines available to the Information Commissioner around the GDPR (General Data Protection Regulation) rules, that could be up to four per cent of a company’s turnover… we think we should be looking at something comparable here.”
An independent regulator proposed in the White Paper would be “funded by industry in the medium term”, although “the Government is exploring options such as fees, charges or a levy to put it on a sustainable footing”.
It adds that any new laws introduced following the White Paper would be compatible with the EU’s e-Commerce Directive.
The proposals are particularly aimed at material that advocates self-harm and suicide, which became a prominent issue after 14-year-old Molly Russell took her own life in 2017. Following her death, her family found distressing material about depression and suicide on her Instagram account, and has expressed the view that the social media company, which is owned by Facebook, was partly responsible for her death.
The changes will also outlaw the use of online platforms by criminal gangs “to promote gang culture and incite violence”, as well as “the illegal sale of weapons to young people online”.
They will also make it harder for terrorist groups to use the internet “to spread propaganda designed to radicalise vulnerable people” or to “distribute material designed to aid or abet terrorist attacks” – a reference to the recent terrorist attack in Christchurch, where the attacker broadcast live using social media.
Home Secretary Sajid Javid said internet companies had a moral duty “to protect the young people they profit from”.
“Despite our repeated calls to action, harmful and illegal content – including child abuse and terrorism – is still too readily available online”, he added.
A public consultation on the plans will run for the next 12 weeks.