TikTok Under Investigation Over Child Abuse Material
The US Department of Homeland Security is investigating TikTok’s handling of child sexual abuse material, with fears the short-form video app isn’t adequately protecting its young user base.
In addition, the Department of Justice is looking into a specific privacy feature on TikTok that may be exploited by predators.
“It is a perfect place for predators to meet, groom and engage children,” said Erin Burke, unit chief of the child exploitation investigations unit at Homeland Security’s cyber crime division.
The Chinese-owned company employs more than 10,000 human moderators, who struggle to police more than one billion users. TikTok reported 155,000 problematic videos on its platform last year, while Instagram, which has a similarly sized user base, reported 3.4 million.
Investigators see the disparity as evidence of inadequate policing on TikTok's part, rather than a cleaner platform.
“We want [social media companies] to proactively make sure children are not being exploited and abused on your sites — and I can’t say that they are doing that, and I can say that a lot of US companies are,” Burke said.
“TikTok has zero tolerance for child sexual abuse material,” the company said in a statement.
“When we find any attempt to post, obtain or distribute [child sexual abuse material], we remove content, ban accounts and devices, immediately report to NCMEC, and engage with law enforcement as necessary.
“We are deeply committed to the safety and wellbeing of minors, which is why we build youth safety into our policies, enable privacy and safety settings by default on teen accounts, and limit features by age.”