Half of online child sex crimes committed on Facebook-owned apps


More than half of online child sex crimes are committed on Facebook-owned apps, figures reveal.

The data, obtained by the NSPCC under Freedom of Information Laws, show more than 9,470 instances where the means of communication was known in reports of child sex abuse images and online child sex offences.

Between October 2019 and October 2020, police forces in the North-east and Cumbria recorded 1,010 online child sex crimes where the method of communication was known – 57% via Facebook-owned apps.

On Teesside, figures show Cleveland Police recorded 273 online child sex crimes – 149 via Facebook-owned apps.

Instagram was used more than any other Facebook-owned platform, in over a third of all instances.

Meanwhile, Facebook and Messenger were used in a further 13%, according to the data obtained from 35 police forces in England, Wales and the Channel Islands.

The NSPCC fears many of these crimes could go unreported if Facebook proceeds with end-to-end encryption across all its messaging platforms without the necessary safeguards in place.

This would mean messages – including text and imagery – being digitally scrambled so that only the sender and receiver can make sense of them.

The charity urges the Government to give Ofcom the power to take “early and meaningful action” against firms whose dangerous design choices put children at risk.

The NSPCC has repeatedly demanded that encryption plans should only be rolled out if and when platforms can demonstrate it won’t compromise children’s safety.

According to the new data, WhatsApp accounts for one in 10 instances recorded by police where Facebook’s apps were involved in online child sexual abuse.

And last month, the Office for National Statistics revealed children are contacted via direct message in nearly three quarters of cases when they are approached by someone they don’t know online.

Andy Burrows, NSPCC head of Child Safety Online Policy, said: “Facebook is willingly turning back the clock on children’s safety by pushing ahead with end-to-end encryption despite repeated warnings that their apps will facilitate more serious abuse more often.

“This underlines exactly why Oliver Dowden must introduce a truly landmark Online Safety Bill that makes sure child protection is no longer a choice for tech firms and resets industry standards in favour of children.

“If legislation is going to deliver meaningful change it needs to be strengthened to decisively tackle abuse in private messaging, one of the biggest threats to children online.”

A Cleveland Police spokesperson said: “We take action on all reports of online child abuse and we are continually raising awareness among parents and children of how to stay safe online.

“We have a Paedophile Online Investigation Team (POLIT) actively working to bring perpetrators to justice. Sadly we do still need to arm parents with how they can protect their children online whilst we investigate criminals.

“In preparation for the Easter Holidays, we are publishing an animated video which contains tips and advice on how to monitor children’s devices and keep them secure.

“It is important as a Force that we continue to highlight safety measures to keep children safe, as the internet is accessible from many devices and apps.”

A Facebook spokesperson added: “Child exploitation has no place on our platforms and we will continue to lead the industry in developing new ways to prevent, detect and respond to abuse.

“For example, last week we announced new safety features on Instagram including preventing adults from messaging under 18s who don’t follow them.

“End-to-end encryption is already the leading security technology used by many services to keep people, including children, safe from having their private information hacked and stolen.

“Its full rollout on our messaging services is a long-term project and we are building strong safety measures into our plans.”
