Grooming crimes reach record high as cases increase by 70% in three years
The NSPCC says stronger online safety legislation is needed to tackle offenders exploiting ‘risky’ design features on apps and platforms popular with children.
25/08/21
Online grooming crimes recorded by police jumped by around 70% in the last three years, reaching an all-time high in 2021.
The figures, obtained through Freedom of Information (FOI) requests to 42 police forces in England and Wales, show that 5,441 Sexual Communication with a Child offences were recorded between April 2020 and March 2021, an increase of around 70% on the crimes recorded in 2017/18.
Data provided by the same 42 police forces for 2019/20 also shows an annual increase of 9%, making the number of crimes recorded last year a record high.
Almost half of the offences in which the app was known took place on Facebook or Facebook-owned apps such as Instagram, WhatsApp and Messenger. Instagram was the most common platform, flagged by police in 32% of instances last year where the platform was known. Snapchat was used in over a quarter of offences.
The NSPCC says the true scale of grooming is likely to be even higher, as technical failures at Facebook led to a drop in removals of abuse material during the pandemic.
“We believe last year’s figures don’t give a full understanding of the impact of the pandemic on children’s safety online,” the charity said in a statement.
In March this year, it was reported that Facebook removed less than half as much child abuse content in the last six months of 2020 as it had previously. From July to September 2020, the social media giant removed 12.4 million pieces of child abuse content, but this dropped to just 5.4 million in the following quarter. The company blamed the fall on a technical issue with its 'media-matching' technology, which identifies illegal uploads.
“With around half of recorded offences happening on Facebook’s platforms, we’re urging the company to invest in technology to ensure plans for end-to-end encryption will not stop them from identifying and protecting against abuse,” the NSPCC said.
“Facebook should proceed only when they can prove child protection tools won’t be compromised by encryption. The Online Safety Bill must hold named managers personally liable for design choices that put children at risk.”
However, Facebook said it was dealing with the “abhorrent behaviour” and works closely with the relevant authorities to find and report abusive and grooming content.
“We also block adults from messaging under 18s they're not connected with and have introduced technology that makes it harder for potentially suspicious accounts to find young people,” the tech giant said in a statement.
"With tens of millions of people in the UK using our apps every day, we are determined to continue developing new ways to prevent, detect and respond to abuse."
However, campaigners say that tech firms, including the so-called ‘big four’, failed to reduce the risks children faced on their platforms and apps during pandemic lockdowns. They say that despite the recent flurry of safety announcements from companies such as Instagram, Apple and TikTok, tech firms are ‘playing catch-up’ in responding to sites that have historically been poorly designed and fail to protect their young users.
The Draft Online Safety Bill published in May sets out measures to tackle this, but campaigners say it does not go far enough.
Andy Burrows, Head of Child Safety Online Policy at the NSPCC, said the Government must put child protection “front and centre” of the legislation.
“Year after year tech firms’ failings result in more children being groomed and record levels of sexual abuse.”
“Safety must be the yardstick against which the legislation is judged and Ministers’ welcome ambition will only be realised if it achieves robust measures to keep children truly safe now and in the future.”