Online child abuse images are increasingly being made by young girls


Child abuse images online are increasingly being made by young girls, the UK’s watchdog has warned, as the Home Secretary and Culture Secretary told tech giants they are still failing to protect children.

The Internet Watch Foundation (IWF) said that four in five “self-generated” abuse images it found last year were created by or involved girls aged between 11 and 13.

The images are often the result of paedophiles coercing or tricking children into taking pictures of themselves or going onto live stream videos where screenshots can be surreptitiously taken of them.

Earlier this year, one of Britain’s most prolific paedophiles, David Wilson, was sentenced to 25 years in jail for abusing over 50 children and approaching more than 5,000 over social media sites such as Facebook.

In response to the dramatic rise in self-generated abuse, the Government is this week launching a new hard-hitting campaign warning how children can be groomed and abused in their own bedrooms.

The IWF figures also prompted calls from the two cabinet ministers for tech giants to clamp down on the “abhorrent” child abuse happening on their platforms.

Home Secretary Priti Patel said the scale of abuse happening online was “shocking” and tech companies would have a greater responsibility placed on them to tackle it under coming Duty of Care legislation, which The Telegraph has campaigned for since 2018.

She said: “These companies should not wait for legislation to be in place before they take action to address these abhorrent crimes. The Government has already set out vital steps the tech giants can take to stop sick predators operating on their platforms.”

Culture Secretary Oliver Dowden, who is co-authoring the Duty of Care bill with the Home Secretary, added: “These latest figures make it clearer than ever that internet companies need to do more to protect their youngest users.”

The IWF, which hunts down and erases child abuse material online, said it has seen a massive increase in self-generated images of children, which now account for 44 per cent of the total 132,000 images it deleted in 2020.

Young girls aged 11 to 13 are now overwhelmingly the victims of such abuse, featuring in 80 per cent of the self-generated images the IWF discovered.

The IWF said the figure represents an 84 per cent rise in the number of girls that age appearing in such abuse material compared with 2019.

The IWF has previously warned that girls that age are at particular risk of grooming, as it is a time when many are given their first smartphone and other camera-connected devices.

Susie Hargreaves OBE, Chief Executive of the IWF, said: “The scale of the problem is appalling, and our fear is without intervention it will get worse, and more and more girls will fall victim to this pernicious and manipulative form of abuse.

“This is a pivotal time. With more people spending more time online, predators are finding new ways to contact and manipulate children who are, in many cases, a captive audience at home with their devices. Lockdown has made this worse.”

The Government campaign is launching this week with an unsettling TV advert showing a young girl being called to dinner by her mother while a stream of men on phones queue to get into her room.

The advert will ask parents: “How many child abusers have been in your home?” It will also urge parents to put limits and checks on their children’s devices so they know who is trying to contact them online.

The campaign has been backed by senior police officers who also called on social media giants to drive abuse off their platforms.

Rob Jones, NCA Director of Threat Leadership, said: “Social media companies have a key role to play in keeping their users safe online. It is vital they prioritise driving out all abuse from their platforms and continue to work with law enforcement to support the prosecution of offenders.”
