TikTok is a convenient hunting ground for pedophiles and child abusers, according to leaked copies of the platform’s policies and former employees who worked as moderators for the fast-growing social media app. Concerns have been growing that the video-sharing app is being used by predators to target underage children.
TikTok, officially launched in 2017 by Beijing-based tech company ByteDance, now boasts over 800 million active users, making it more popular than Twitter and closing in on Facebook and Instagram.
Among these users are children. Data from 2019 showed that 27% of TikTok users were aged between 13 and 17, but the app has users as young as eight years old. According to a report from The Telegraph, more than half of children in the UK use TikTok despite the growing safety concerns.
It is the most popular social media app among children under the age of 18.
Recently leaked documents showed that users found to be sexually grooming children are suspended for only a week. Those caught a second time are suspended for a month. Only after being caught a third time are such predators permanently banned.
Former moderators of the app backed the revelations in the leaked documents. These former employees said the policy let violators return with the same accounts, meaning they could continue sexually harassing the very children who had reported them.
These moderators warned that the app is dangerous as it is filled with sexual predators targeting children.
“These people should be completely off; you don’t need to wait twice or for a third time,” one former moderator said.

“We saw pedophiles banned for a week and then back on there talking to children again.”
The former employees also described TikTok’s inadequate moderation practices. The company required them to prioritize reviewing videos over reported messages, so a reported message could sit unreviewed for days.
The ex-employees estimated that adults sending inappropriate messages to children accounted for 10% of flagged messages. Additionally, some kids’ accounts could be receiving inappropriate messages from as many as ten adults.
When approached for comment, TikTok claimed that it implements a “zero-tolerance policy” on child abuse.
“Flagged and suspected grooming behavior is escalated to our internal Child Safety Team (CST) to investigate,” the statement said. “In line with international standards, we report all necessary information to NCMEC [National Center for Missing & Exploited Children] which works directly with law enforcement, and ban the offending user.”
TikTok also claimed that most of the claims made by the former employees refer to policies it has since abandoned.