Online grooming crimes have reached record levels in the UK, with police recording more than 7,000 offences in a single year for the first time, the NSPCC has said.
The children’s charity said the figures, provided by 45 UK police forces, showed that 7,062 offences of sexual communication with a child were recorded in 2023-24, an 89% rise since 2017-18, when the offence came into force.
In the 1,824 cases where the means of communication was disclosed, social media platforms were frequently used, with Snapchat named in 48% of them.
Meta-owned platforms were also popular with offenders: WhatsApp was named in 12% of those cases, Facebook and Messenger in 12%, and Instagram in 6%.
In response to the figures, the NSPCC has urged online regulator Ofcom to strengthen the Online Safety Act.
It said there is currently too much focus on acting after harm has taken place, rather than being proactive to ensure the design of social media platforms does not contribute to abuse.
The charity has also called on the Government to do more to disrupt child sexual abuse in private messages.
Sir Peter Wanless, NSPCC chief executive, said: “One year since the Online Safety Act became law and we are still waiting for tech companies to make their platforms safe for children.
“We need ambitious regulation by Ofcom who must significantly strengthen their current approach to make companies address how their products are being exploited by offenders.
“It is clear that much of this abuse is taking place in private messaging, which is why we also need the Government to strengthen the Online Safety Act to give Ofcom more legal certainty to tackle child sexual abuse on the likes of Snapchat and WhatsApp.”
Minister for safeguarding and violence against women and girls, Jess Phillips, said: “Child sexual abuse is a vile crime that inflicts long-lasting trauma on victims and the law is clear – the creation, possession and distribution of child sexual abuse images, and grooming a child is illegal.
“I met with law enforcement leads and the NCA (National Crime Agency) only last week to hear about the tremendous work they do to bring these offenders to justice.
“Social media companies have a responsibility to stop this vile abuse from happening on their platforms.
“Under the Online Safety Act they will have to stop this kind of illegal content being shared on their sites, including on private and encrypted messaging services, or face significant fines.
“The shocking case involving Alexander McCartney, who alone groomed over 3,500 children, demonstrates more clearly than ever that they should act now and not wait for enforcement by the regulator.”
A Snapchat spokesperson said: “Any sexual exploitation of young people is horrific and illegal and we have zero tolerance for it on Snapchat.
“If we identify such activity, or it is reported to us, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts, and report them to the authorities.
“We have extra protections including in-app warnings to make it difficult for teens to be contacted by strangers, and our in-app Family Centre lets parents see who their teens are talking to, and who their friends are.”
An Ofcom spokesperson said: “From December, tech firms will be legally required to start taking action under the Online Safety Act, and they’ll have to do far more to protect children.
“Our draft codes of practice include robust measures that will help prevent grooming by making it harder for perpetrators to contact children.
“We’re prepared to use the full extent of our enforcement powers against any companies that come up short when the time comes.”