These TikTok Accounts Are Hiding Child Sexual Abuse Material In Plain Sight


Many accounts on TikTok have become portals to some of the most dangerous and disturbing content on the internet. As private as they are, nearly anyone can join.


The following article contains descriptions and discussions of graphic social media content, including child sexual abuse material and adult pornography.


Don’t be shy, girl.

Come and join my post in private.

LET’S HAVE SOME FUN.

The posts are easy to find on TikTok. They typically read like advertisements and come from seemingly innocuous accounts.

But often, they’re portals to illegal child sexual abuse material quite literally hidden in plain sight—posted in private accounts using a setting that makes it visible only to the person logged in. From the outside, there’s nothing to see; on the inside, there are graphic videos of minors stripping naked, masturbating, and engaging in other exploitative acts. Getting in is as simple as asking a stranger on TikTok for the password.

TikTok’s security policies explicitly prohibit users from sharing their login credentials with others. But a Forbes investigation found that’s precisely what’s happening. The reporting, which followed guidance from a legal expert, uncovered how seamlessly underage victims of sexual exploitation and predators can meet and share illegal images on one of the biggest social media platforms on the planet. The sheer volume of post-in-private accounts that Forbes identified—and the frequency with which new ones pop up as quickly as old ones are banned—highlight a major blind spot where moderation is falling short and TikTok is struggling to enforce its own guidelines, despite a “zero tolerance” policy for child sexual abuse material.

The problem of closed social media spaces becoming breeding grounds for illegal or violative activity is not unique to TikTok; groups enabling child predation have also been found on Facebook, for example. (Its parent, Meta, declined to comment.) But TikTok’s soaring popularity with young Americans—more than half of U.S. minors now use the app at least once a day—has made the pervasiveness of the issue alarming enough to pique the interest of state and federal authorities.

“There’s quite literally accounts that are full of child abuse and exploitation material on their platform, and it’s slipping through their AI,” said creator Seara Adair, a child sexual abuse survivor who has built a following on TikTok by drawing attention over the past year to exploitation of kids happening on the app. “Not only does it happen on their platform, but quite often it leads to other platforms—where it becomes even more dangerous.”

Adair first discovered the “posting-in-private” issue in March, when someone who was logged into the private TikTok account @My.Privvs.R.Open made public a video of a pre-teen “completely naked and doing inappropriate things” and tagged Adair. Adair immediately used TikTok’s reporting tools to flag the video for “pornography and nudity.” Later that day, she received an in-app alert saying “we didn’t find any violations.”

The next day, Adair posted the first of several TikTok videos calling attention to illicit private accounts like the one she’d encountered. That video went so viral that it landed in the feed of a sibling of an Assistant U.S. Attorney for the Southern District of Texas. After catching wind of it, the prosecutor reached out to Adair to pursue the matter further. (The attorney told Adair they could not comment for this story.)

Adair also tipped off the Department of Homeland Security. The department did not respond to a Forbes inquiry about whether a formal TikTok probe is underway, but Special Agent Waylon Hinkle reached out to Adair to collect more information and told her via email on March 31 that “we are working on it.” (TikTok would not say whether it has engaged specifically with Homeland Security or state prosecutors.)

TikTok has “zero tolerance for child sexual abuse material and this abhorrent behavior which is strictly prohibited on our platform,” spokesperson Mahsau Cullinane said in an email. “When we become aware of any content, we immediately remove it, ban accounts, and make reports to [the National Center for Missing & Exploited Children].” The company also said that all videos posted to the platform—both public and private, including those viewable only to the person inside the account—are subject to TikTok’s AI moderation and in some cases, additional human review. Direct messages may also be monitored. Accounts found to be attempting to obtain or distribute child sexual abuse material are removed, according to TikTok.

The app offers tools that can be used to flag accounts, posts and direct messages containing violative material. Forbes used those tools to report a number of videos and accounts promoting and recruiting to post-in-private groups; all came back “no violation.” When Forbes then flagged several of these apparent oversights to TikTok over email, the company confirmed the content was violative and removed it immediately.

Peril hidden in plain sight

This “posting-in-private” phenomenon—which some refer to as posting in “Only Me” mode—isn’t hard to find on TikTok. While a straightforward search for “post in private” returns a message saying “this phrase may be associated with behavior or content that violates our guidelines,” the warning is easily evaded by algospeak. Deliberate typos like “prvt,” slang like “priv,” jumbled phrases like “postprivt” and hashtags like #postinprvts are just some of the search terms that returned hundreds of seemingly violative accounts and invitations to join. Some posts also include #viral or #fyp (short for “For You Page,” the feed TikTok’s more than a billion users see when they open the app) to attract more eyeballs. TikTok told Forbes it prohibits accounts and content mentioning “post to private” or variations of that phrase. Only after Forbes flagged examples of problematic algospeak did TikTok block some hashtags and searches that now pull up a warning: “This content may be associated with sexualized content of minors. Creating, viewing, or sharing this content is illegal and can lead to severe consequences.”

Within days of an active TikTok user following a small number of these private accounts, the app’s algorithm began recommending dozens more bearing similar bios like “pos.t.i.n.privs” and “logintoseeprivatevids.” The suggestions began popping up frequently in the user’s “For You” feed accompanied by jazzy elevator music and an option to “Follow” at the bottom of the screen. TikTok did not answer a query on whether accounts with sexual material are prioritized.

With little effort, the user was sent login information for several post-in-private handles. The vetting process, when there was one, focused mainly on gender and pledges to contribute images. One person who was recruiting girls to post in his newly created private account messaged that he was looking for girls over 18, but that 15- to 17-year-olds would suffice. (“I give the email and pass[word] to people I feel can be trusted,” he said. “Doesn’t work every time.”) Other posts recruited girls ages “13+” and “14-18.”

Accessing a post-in-private account is a simple matter and does not require two-step verification. TikTok users can turn on this extra layer of security, but it is kept off by default.

One account contained more than a dozen concealed videos, several featuring young girls who appeared to be underage. In one post, a young girl could be seen slowly removing her school uniform and undergarments until she was naked, despite TikTok not allowing “content depicting a minor undressing.” In another, a young girl could be seen humping a pillow in a dimly lit room, despite TikTok prohibiting “content that depicts or implies minor sexual activities.” Two others showed young girls in bathrooms taking off their shirts or bras and fondling their breasts.

TikTok users purporting to be minors also participate in these secret groups. On one recent invitation to join a private account, girls claiming to be 13, 14 and 15 years old asked to be let in. Their ages and genders could not be independently verified.

Other users’ bios and comments asked people to move the private posting and trading off TikTok to other social platforms including Snap and Discord, though TikTok explicitly forbids content that “directs users off platform to obtain or distribute CSAM.” In one such case, a commenter named Lucy, who claimed to be 14, had a link to a Discord channel in her TikTok bio. “PO$TING IN PRVET / Join Priv Discord,” the bio said. That link led to a Discord channel of about two dozen people sharing pornography of people of all ages, mostly female. Several of the Discord posts had a TikTok watermark—suggesting they had originated or been shared there—and featured what appeared to be underage, nude girls masturbating or performing oral sex. The Discord server owner threatened to kick people out of the group if they didn’t contribute fresh material. Discord did not immediately respond to a request for comment.

These activities are unsettlingly common across major social media apps supporting closed environments, according to Haley McNamara, director of the International Centre on Sexual Exploitation. “There is this trend of either closed spaces or semi-closed spaces that become easy avenues for networking of child abusers, people wanting to trade child sexual abuse materials,” she told Forbes. “Those kinds of spaces have also historically been used for grooming and even selling or advertising people for sex trafficking.” She said that in addition to Snap and Discord, the organization has seen comparable behavior on Instagram, either with closed groups or the close friends feature.

Instagram’s parent, Meta, declined to comment. Snap told Forbes it prohibits the sexual exploitation or abuse of its users and that it has various protections in place to make it harder for predators and strangers to find teens on the platform.

On paper, TikTok has strong safety policies protecting minors, but “what happens in practice is the real test,” said McNamara. When it comes to proactively policing the sexualization of kids or trading of child sexual abuse material, she added, “TikTok is behind.”

“These tech companies are creating new tools or functions and rolling them out without seriously considering the online safety element, especially for children,” she added, calling for safety mechanisms to be built in proportion to privacy settings. “This ‘Only Me’ function is the latest example of tech companies not prioritizing child safety or building out proactive ways to combat these problems on the front end.”

Dr. Jennifer King, the privacy and data policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence, said she does see legitimate use cases for this type of privacy setting. (TikTok said creators may use the feature while testing or scheduling their content.) But King questioned TikTok’s decision not to have default two-factor authentication, an industry standard, and why TikTok is not detecting multiple logins that run afoul of platform policy.

“That’s a red flag, [and] you can absolutely know this is happening,” said King, who previously built a tool for Yahoo to scan for child sexual abuse material.

“It’s often a race against time: You create an account [and] you either post a ton of CSAM or consume a bunch of CSAM as quickly as possible, before the account gets detected, shut down, reported… it’s about distribution as quickly as possible,” she explained. People in this space expect to have these accounts for just a couple hours or days, she said, so spotting and blocking unusual or frequent logins—which is not technically difficult to do—could “harden those targets or close those loopholes” people are taking advantage of.

“You can absolutely know this is happening.”

Dr. Jennifer King, Stanford Institute for Human-Centered Artificial Intelligence

Despite its policy prohibiting the sharing of login credentials, TikTok told Forbes there are reasons for allowing multiple people access to the same account—like managers, publicists or social media strategists who help run creators’ handles. The company also noted that two-factor authentication is required for some creators with big followings.

While popular, public accounts with large audiences tend to draw more scrutiny, “a single account that doesn’t seem to have a lot of activity, posting a couple of videos” may go overlooked, King said. But TikTok maintains that all users, regardless of follower count, are subject to the same community guidelines and that the platform tries to enforce those rules consistently.

Adair, the creator and children’s safety advocate, has complained that she is doing TikTok’s content moderation work for the company—keeping abreast of the ever-changing ways people on the app are exploiting the technology or using it for things other than its intended purpose. But her efforts to contact TikTok have been unsuccessful.

“Almost every single minor that has reached out to me has not told their parents what has happened.”

Seara Adair, TikTok creator and child sexual abuse survivor

Adair said she’s gone on “a spree on LinkedIn,” sending messages to employees in trust, security and safety to escalate the problem.

“I apologize if this is crossing a boundary however I am desperate to get this the attention it needs,” she wrote to one TikTok employee, describing the “private posting” and the way she believes users are gaming the AI “by posting a black screen for the first few seconds” of these videos.

“I personally saw one of the videos that had been unprivated and it was a child completely naked and doing indecent things. I reported the video and it came back no violation,” she continued. “Since posting my video concerning this I’ve had two children come forward and share how they were groomed by one of these accounts and were later made aware that it was an adult behind the accounts. Please. Is there anything you can do to help?”

Adair “never heard back from anybody,” she told Forbes. “Not a single person.”

But she continues to hear from TikTok users—including many young girls—who’ve had run-ins with post-in-private. “Almost every single minor that has reached out to me has not told their parents what has happened,” Adair said. “It’s the fear and the unknown that they experience, and the exposure that they end up getting in this situation, that just breaks my heart.”
