Facebook is OK with child bullying, but threatening Donald Trump is a no-go

THE SOCIAL NETWORK Facebook has come under fire after a mega internal document leak has revealed that while it will delete threats made against Donald Trump, it doesn’t care so much about ordinary folk.

The Guardian has got its mitts on more than 100 of Facebook’s internal training manuals, which give an insight into how the social network moderates issues such as violence, hate speech and pornography, and they – perhaps unsurprisingly – are a grim read.

The leaked dossier reveals that “credible violence”, such as posting the phrase “someone shoot Trump”, will be quickly removed by moderators because he is a head of state. However, Facebook has no problem with someone saying “to snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat”, or “let’s beat up fat kids.”

Other threats regarded as “not credible” by Facebook include “little girl needs to keep to herself before daddy breaks her face,” and “I hope someone kills you”, with the social network labelling such abusive remarks as “generic”.

“We should say that violent language is most often not credible until specificity of language gives us a reasonable ground to accept that there is no longer simply an expression of emotion but a transition to a plot or design,” the internal documents read, somewhat astoundingly.

“From this perspective language such as ‘I’m going to kill you’ or ‘F*** off and die’ is not credible and is a violent expression of dislike and frustration.”

Things don’t get any better, as the documents go on to reveal that Facebook is quite happy for videos of violent deaths, child bullying and self-harm to be posted on its website, as long as they don’t – heaven forbid – have a “sadistic or celebratory element”.

It’s also quite OK with photos of animal abuse, and it’s fine with people sharing videos of abortions, as long as they don’t show any nudity.

Facebook has commented on the Guardian’s report, and laughably argued that it has a lot of users and that there’s “always going to be some grey area” regarding what type of content is OK.

“We have a really diverse global community and people are going to have very different ideas about what is OK to share. No matter where you draw the line there are always going to be some grey areas. For instance, the line between satire and humour and inappropriate content is sometimes very grey. It is very difficult to decide whether some things belong on the site or not,” the company said.

“We feel responsible to our community to keep them safe and we feel very accountable. It’s absolutely our responsibility to keep on top of it. It’s a company commitment.

“We will continue to invest in proactively keeping the site safe, but we also want to empower people to report to us any content that breaches our standards.”

We don’t like what Facebook has to say, but thankfully the Open Rights Group has spoken up and talked some sense.

“With almost 2 billion users each month, Facebook’s decisions about what is and isn’t acceptable have huge implications for free speech. These leaks show that making these decisions is complex and fraught with difficulty,” the group said.

“Facebook will probably never get it right but at the very least there should be more transparency about their processes.

“This is why the Conservative manifesto’s pledge to compel private companies to regulate content on the Internet is problematic and bound to chill free speech in the UK.”