Facebook accused of ‘not caring’ about police child abuse cases
The National Police Chiefs’ Council lead for Child Protection says social media giant Facebook has failed in its duty of care to children and does not care about helping police solve child abuse cases.
Simon Bailey, chief constable of Norfolk Constabulary, said the company’s plan to encrypt its one-billion-user Messenger service showed it was “not listening” to police concerns.
Speaking in a BBC Radio 4 documentary broadcast last week, Mr Bailey also warned that the public was not aware of the full “horror” and sheer scale of online abuse and said technology companies were paying “lip service to the threat” to children.
The programme, Boy in the Video, follows a reporter’s attempts to identify a boy in a distressing child sex abuse video that was shared with a ‘school mums’ WhatsApp group, of which she was a member.
Mr Bailey said: “When you look at Facebook’s latest decision to end-to-end encrypt Facebook Messenger, they are simply turning around and saying, ‘okay we’re not listening to you. We don’t care – privacy’s more important’. How can any reasonable person think that that is all right?”
He added: “If I have one really significant regret around my leadership and our response to this, it’s the fact that we have struggled to land with the public the true scale of what we are dealing with [and] the horrors of what we are dealing with.
“I’d like to have thought that actually the Facebooks, the Microsofts, the Apples, the Googles of this world would have done something more than what they’ve currently done because, candidly, they are paying, in my opinion, lip service to the threat.
“They hold the key to so much of this. Their duty of care, I think, to children – they have completely absolved themselves of that.”
Mr Bailey said that a 1990s Home Office study on the proliferation of images of child abuse found there were around 10,000 in circulation. There are now more than 14 million images on the UK child abuse image database.
Last year, the US National Center for Missing and Exploited Children said 12 million of the more than 18 million reports of child abuse images it received in 2018 came from Messenger.
Mr Bailey’s comments were made shortly before today’s (December 11) announcement that Facebook had turned down requests from global law enforcement agencies to provide backdoor access to the company’s encrypted messaging products, including WhatsApp and Messenger.
In a letter, WhatsApp and Messenger heads Will Cathcart and Stan Chudnovsky said that providing such access into Facebook’s messaging products would be a gift to “hackers, criminals and repressive regimes”.
“People’s private messages would be less secure and the real winners would be anyone seeking to take advantage of that weakened security. That is not something we are prepared to do,” they added.
In October, senior government officials from the UK, US and Australia sent an open letter to Facebook, requesting that the company halt its plans to introduce end-to-end encryption in its messaging products. They also urged the social network platform to enable government agencies “to obtain lawful access to content in a readable and usable format”.
They argued that in the absence of backdoor access, investigative agencies would be unable to obtain critical evidence, including reports about child sexual exploitation.
“Companies should not deliberately design their systems to preclude any form of access to content, even for preventing or investigating the most serious crimes,” the letter read.
“This puts our citizens and societies at risk by severely eroding a company’s ability to detect and respond to illegal content and activity, such as child sexual exploitation and abuse, terrorism, and foreign adversaries’ attempts to undermine democratic values and institutions, preventing the prosecution of offenders and safeguarding of victims.”
In July, ‘Five Eyes’ member countries also called for technology firms to help government agencies by providing them with special access to WhatsApp and other encrypted communications. The group warned that failing to do so would put the lives of thousands of people at risk.
The UK’s intelligence and security organisation, GCHQ, has further suggested that technology firms should develop systems to “quietly” add an intelligence agent to conversations or group chats.
In a statement, Facebook said: “Keeping young people safe on our platforms is our priority and our systems remove 99 per cent of child abuse content before it’s reported to us.”