Multiple sources from inside TikTok are saying the same thing — the fastest-growing social media platform in the world has a child pornography problem.
TikTok and its Chinese parent company, ByteDance, came under intense scrutiny after an Aug. 4 report from Forbes detailed the testimony of two former TikTok moderators.
According to the moderators, the platform saves uncensored, sexually explicit images and videos of children, distributing them to employees for training purposes. Essentially, the videos and images are used to show moderators what illegal content looks like.
Former TikTok moderator Whitney Turner and others said employees of TikTok and of Teleperformance, a third-party firm that moderates the platform’s content, were given access to a spreadsheet containing “hundreds” of images of children who were naked or being abused.
“I was moderating and thinking: This is someone’s son. This is someone’s daughter. And these parents don’t know that we have this picture, this video, this trauma, this crime saved,” Turner told Forbes.
“If parents knew that, I’m pretty sure they would burn TikTok down,” she said.
Forbes reported that a Teleperformance spokesman “said the company does not use videos featuring explicit content of child abuse in training, and said it does not store such material in its ‘calibration tools,’ but would not clarify what those tools are or what they do. He declined to answer a detailed list of other questions regarding how many people have access to child sexual abuse material through the [daily required reading] and how Teleperformance safeguards this imagery.”
TikTok’s child porn and abuse problem doesn’t stop there, however.
In 2021, the platform filed 154,618 reports of child sexual abuse material, child sex trafficking and online enticement with the National Center for Missing and Exploited Children.
“That would be about [the number] I’d expect but it still should be more,” Eliza Bleu told The Western Journal. “Each report can contain any number of images or videos. For instance, 1 report can contain 15k images. That information is not public.”
Few do more to cover the human trafficking crisis than Bleu, an advocate for trafficking survivors and a survivor herself. She believes the accusations against TikTok are credible and should be investigated “immediately.”
“Under no circumstances would any type of child sexual abuse material or child sexual exploitation material need to be saved for a tech company to do training,” Bleu said in a series of messages to The Western Journal. “A platform shouldn’t re-exploit the exploited for ‘training’ purposes or any purpose for that matter. This imagery haunts these survivors. The worst moments of their lives are not a training manual.”
TikTok is far from the only platform to be accused of allowing the sexual exploitation of women and children.
Other social media sites have been criticized for not doing enough to remove illegal sexual content from their platforms. Twitter, for example, has been sued multiple times for allegedly failing to remove child porn posted by its users.
According to Bleu, this is the result of a misalignment of priorities. Social media sites such as Twitter are more concerned with “controlling the narrative” and “censorship of words and ideas” than with removing illegal sexual content, she said.
Until that accountability arrives, however, Bleu urged parents to “discuss internet safety with their children including the risks of sextortion.”
“That way perhaps we could stop some of these crimes from happening in the first place,” she wrote.
When it comes to the survivors of TikTok’s alleged malfeasance, Bleu offered some advice as well.
“If these allegations are true I’d encourage the survivors to reach out to the National Center for Missing and Exploited Children for aftercare support. I’d highly recommend that if and when they are ready to consider taking legal action. I’d personally consider pursuing a class action lawsuit,” she said.
Companies that allow child porn and sexual abuse content to remain on their platforms risk serious consequences, including the kind of class action litigation Bleu suggested. A recent ruling in a lawsuit against Visa shows as much.
After a woman alleged that the notorious pornography site Pornhub had distributed a video of her engaging in sexual acts at age 13, a judge ruled last month that Visa, which processed payments for the site, must face her claim that it was complicit in monetizing child porn.
“The Court can comfortably infer that Visa intended to help MindGeek monetize child porn from the very fact that Visa continued to provide MindGeek the means to do so and knew MindGeek was indeed doing so,” U.S. District Judge Cormac Carney of the Central District of California wrote in the ruling.
It doesn’t seem too far a stretch to suggest that TikTok, Twitter and others could be found similarly liable for materials allowed to remain on their own platforms. Perhaps accountability is on the way.