Almost a quarter of the videos TikTok took down in 2019’s second half involved inappropriate behavior by minors, from illegal drug use to sexual activity.
The Chinese-owned social video service said 24.8% of the clips removed were “depicting harmful, dangerous, or illegal behavior by minors, like alcohol or drug use, as well as more serious content we take immediate action to remove.” Another 15.6% “violated our suicide, self-harm, and dangerous acts policy,” TikTok said in its second transparency report.
TikTok — which has insisted it operates independently of Beijing despite its Chinese ownership — has come under fire in the U.S. and India for the way it polices content on a platform used by more than a billion people. Parent ByteDance Ltd. has been accused of censoring content that may anger the Chinese government, even as scrutiny grows about its control over the personal information of youths.
The report made no mention of requests related to China, where ByteDance is based but TikTok doesn’t operate. A company spokesperson said it also didn’t receive a single data request in the second half from Hong Kong, a market it has since abandoned after Beijing passed a controversial law granting police sweeping powers over online content. This week, U.S. internet giants from Facebook to Google said they will stop processing data requests from the city’s government, signaling their opposition to the legislation. TikTok was no longer available on Apple’s and Google’s Hong Kong app stores as of Thursday.
The video-sharing app said it removed more than 49 million clips overall, according to its report on enforcement of content policy and government takedown requests. Of those removed videos, more than 16 million originated in India, a small portion of which were taken down at the government’s request. TikTok said that of the total videos removed, its systems proactively caught and removed 98.2% before a user reported them, while 89.4% were taken down before they received any views.
The disclosure from TikTok comes in the same week as reports that the Federal Trade Commission and the U.S. Department of Justice have begun inquiring about the company’s data practices — specifically accusations that the app collected data on users under the age of 13. A prior iteration of the app paid $5.7 million in 2019 to settle similar claims by the FTC.
“TikTok takes the issue of safety seriously for all our users,” a spokesperson said this week, “and we continue to further strengthen our safeguards and introduce new measures to protect young people on the app.” He declined to comment on whether the FTC or DOJ had approached TikTok about an investigation.