Online child grooming crimes recorded by police in the UK have reached record levels over the last three years, figures obtained by the NSPCC reveal.
The child protection charity has warned that offenders are exploiting “risky design features” on popular apps for children, with Instagram being the most common platform used by groomers in the past year.
The NSPCC has called on the Government to respond by introducing tougher measures in the Online Safety Bill, which is set to be scrutinised this month, after recorded offences increased by almost 70% to an all-time high.
The charity obtained figures from 42 police forces, which recorded 5,441 Sexual Communication with a Child offences between April 2020 and March 2021, an increase of 69% on the 3,217 recorded in 2017/18.
In London, the Metropolitan Police and City of London Police recorded 551 offences in 2020/21, compared with 145 in 2017/18, a rise of 280% in just three years.
Hannah Ruschen, Policy and Public Affairs Officer in the child safety online team at the NSPCC, said: “This shows that children in London are at risk of being groomed online, and this follows trends in line with the rest of the country.”
Sexual communication with a child is an offence under Section 67 of the Serious Crime Act 2015.
It came into force in April 2017 following the NSPCC’s Flaw in the Law campaign, which called on the Government to make online grooming a crime.
Ruschen said the increase was partly due to greater internet use over the past year as a result of lockdown.
She said: “It’s incredibly concerning.
“Especially during the course of the pandemic, we’ve seen a perfect storm for child abuse online, where we know that children are spending more time online, whether that be for socialising, or for taking part in school.”
Social Media Platforms
The charity found that Facebook-owned apps were used in almost half of 2020/21 offences where the means of communication was known.
These platforms include popular spaces like Instagram, WhatsApp and Messenger.
The charity believes the true scale of offending is likely to be higher still, as technical failures at Facebook led to a drop in the removal of abuse material during the pandemic.
Ruschen said: “Sites up until now have not taken full responsibility and have not thought about child safety from the very beginning of the design process for new features of that app or website.
“We really need that transparency for sites like Facebook to be clear about why they might have technical difficulties with moderation capacity and what the true picture and the true scale of this is, because we know that these figures, as stark as they are, are just the tip of the iceberg.”
An Ofcom report earlier this year revealed that the majority of children use social media sites or apps – just over half of five to 15-year-olds, rising to 87% of 12 to 15-year-olds.
Around a third of five to 15-year-olds used Instagram, Snapchat and Facebook.
The same survey reported that over half of 12 to 15-year-olds had had some form of negative online experience, the most common being contact from a stranger who wanted to be their friend, which happened to almost a third of children in this age group.
Legislation
The NSPCC has called for the Online Safety Bill to hold named managers personally liable for design choices that put children at risk, and to tackle abuse across different platforms, as grooming cases often move from site to site.
Ruschen said: “It can be really, really upsetting for a child to experience abuse like this online, and it can be life changing, if they can’t come forward to receive support.
“That’s why we need this legislation to be preventative because once harm has happened, you can remove the images, you can remove the accounts, but ultimately, there is a child that has been harmed. That’s why we need to think about this from a preventative approach.
“Grooming really could happen on any manner of online sites and can also move between them. So it’s really important legislation is far reaching and covers not only the biggest sites, but also smaller, newer sites that might pop up that can pose a grooming risk for children and young people.”
She added: “It’s really important for parents to be having those open and honest conversations with their children about the risks they might be facing online.
“So what are their favourite games to play? What apps do they use? How do they speak to their friends online? Have they seen anything concerning online that they want to talk to someone about?
“It’s really important that children know they have someone to come to if they are concerned about anything that they see in their online lives.”
A spokesperson for Facebook said: “This is abhorrent behaviour and we work quickly to find it, remove it and report it to the relevant authorities.
“With tens of millions of people in the UK using our apps every day, we are determined to continue developing new ways to prevent, detect and respond to abuse.”
A Government spokesperson said: “Keeping children safe is one of our highest priorities and the strongest measures contained in the Online Safety Bill are designed to protect children.
“If social media companies do not properly assess or take action against the risks their sites pose to children, they will face heavy fines or have their sites blocked. The Bill will further make tech companies accountable to an independent regulator.
“We are clear that companies must continue to take responsibility for stopping the intolerable level of harmful material on their platforms and embed public safety in their system designs, which is why the Bill will also compel them to consider the risks associated with all elements of their services and take robust action to keep their users safe.”
Any child concerned about anything they’ve seen online can contact Childline on 0800 1111.