Snapchat most-used app for grooming, says NSPCC
The messaging app Snapchat is the most widely used platform for online grooming, according to police figures supplied to the children's charity the NSPCC.
More than 7,000 Sexual Communication with a Child offences were recorded across the UK in the year to March 2024, the highest number since the offence was created.
Snapchat accounted for nearly half of the cases in which police recorded the platform used for the grooming.
The NSPCC said it showed society was "still waiting for tech companies to make their platforms safe for children."
Snapchat told the BBC it had "zero tolerance" of the sexual exploitation of young people, and had extra safety measures in place for teens and their parents.
Becky Riggs, the National Police Chiefs' Council lead for child protection, described the data as "shocking".
"It is imperative that the responsibility of safeguarding children online is placed with the companies who create spaces for them, and the regulator strengthens rules that social media platforms must follow," she added.
Groomed at the age of 8
The gender of the victims of grooming offences was not always recorded by police, but of the cases where it was known, four in five victims were girls.
Nicki, whose real name the BBC is not using, was eight when she was messaged on a gaming app by a groomer who encouraged her to go on to Snapchat for a conversation.
"I don't need to explain details, but anything that you can imagine happening happened in those conversations: videos, pictures, requests of certain material from Nicki, etcetera," her mother, whom the BBC is calling Sarah, explained.
Sarah then created a fake Snapchat profile pretending to be her daughter, and the man messaged it, at which point she contacted the police.
She now checks her daughter's devices and messages on a weekly basis, despite her daughter objecting.
"It's my responsibility as mum to ensure she is safe," she told the BBC.
She said parents "cannot rely" on apps and games to do that job for them.
'Problems with the design of Snapchat'
Snapchat is one of the smaller social media platforms in the UK, but it is very popular with children and teenagers.
That is "something that adults are likely to exploit when they're looking to groom children," says Rani Govender, child safety online policy manager at the NSPCC.
But Ms Govender says there are also "problems with the design of Snapchat which are also putting children at risk."
Messages and images on Snapchat disappear after 24 hours, making incriminating behaviour harder to track, and senders also know if the recipient has screengrabbed a message.
Ms Govender says the NSPCC hears directly from children who single out Snapchat as a concern.
"When they make a report [on Snapchat], this isn't listened to, and that they're able to see extreme and violent content on the app as well," she told the BBC.
A Snapchat spokesperson told the BBC the sexual exploitation of young people was "horrific."
"If we identify such activity, or it is reported to us, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts, and report them to the authorities," they added.
Record offending
The number of recorded grooming offences has risen each year since the offence of Sexual Communication with a Child came into force in 2017, reaching a new record high of 7,062 this year.
Of the 1,824 cases where the platform was known last year, 48% were recorded on Snapchat.
The number of grooming offences recorded on Snapchat has risen each year since 2018/19.
Reported grooming offences on WhatsApp also rose slightly in the past year. On Instagram and Facebook, known cases have fallen over recent years, according to the figures. All three platforms are owned by Meta.
WhatsApp told the BBC it has "robust safety measures" in place to protect people on its app.
Jess Phillips, minister for safeguarding and violence against women and girls, said social media companies "have a responsibility to stop this vile abuse from happening on their platforms".
In a statement, she added: "Under the Online Safety Act they will have to stop this kind of illegal content being shared on their sites, including on private and encrypted messaging services, or face significant fines."
The Online Safety Act includes a legal requirement for tech platforms to keep children safe.
From December, big tech firms will have to publish their risk assessments on illegal harms on their platforms.
Media regulator Ofcom, which will enforce those rules, said: "Our draft codes of practice include robust measures that will help prevent grooming by making it harder for perpetrators to contact children.
"We're prepared to use the full extent of our enforcement powers against any companies that come up short when the time comes."