
TikTok fined nearly $370 million for weak safeguards protecting underage users


TikTok was fined around $370 million after an inquiry by a European supervisory authority found that the social media app lacked adequate safeguards to protect underage users on the platform. 

The Irish Data Protection Commission, the national independent authority that protects the personal data of individuals in the European Union, serves as TikTok's lead regulator in the bloc because the company's European headquarters is located in Ireland. 

The commission announced in a press release on Friday that, in addition to the fine, it had issued TikTok a reprimand and ordered the social media giant, which is owned by the Chinese company ByteDance, to bring itself into compliance within three months.


The regulator's inquiry examined how the platform handled users' data between July 31, 2020, and Dec. 31, 2020; the commission reached its final decision regarding TikTok earlier this month. 

Following the inquiry, the commission found that the social media site failed to comply with multiple provisions of the General Data Protection Regulation (GDPR), the EU's privacy and security law, by setting child users' accounts to public by default, meaning anyone using the platform could view a child's posts. The regulator noted that TikTok failed to take into consideration the risks posed to users aged 13 and younger by having their accounts set to public.

TikTok’s “Family Pairing” feature also violated the GDPR, according to the commission, because it allowed adult users to pair their accounts with a child’s without first verifying that the adult was the minor’s parent or guardian. 

“This allowed the non-child user to enable Direct Messages for child users above the age of 16, which posed severe possible risks to child users,” the commission stated. 

TikTok did not respond to The Christian Post’s request for comment for this article. 

This is not the first time that a report has raised concerns about abusers exploiting loopholes on TikTok and exposing children to harmful content. 

As CP reported last November, TikTok influencer and child sex abuse survivor Seara Adair told Forbes in an interview at the time that users can post illegal child sex abuse material on the platform by uploading it to private, “post-in-private” accounts whose login credentials are shared among users. Adair discovered the loophole last March after someone exploited it to post a video of a naked preteen doing “inappropriate things” and tagged Adair in it. 

"There's quite literally accounts that are full of child abuse and exploitation material on their platform, and it's slipping through their AI," Adair told Forbes. "Not only does it happen on their platform, but quite often it leads to other platforms — where it becomes even more dangerous."

A TikTok spokesperson told CP at the time that the platform has "zero tolerance" for child sex abuse material, saying that such "abhorrent behavior" is prohibited. The spokesperson explained that the platform removes such content when made aware of it, bans the accounts responsible and reports them to the National Center for Missing & Exploited Children, a child protection organization. 

Forbes noted that the sheer number of “post-in-private” accounts it identified in its reporting, and how quickly new ones appear after previous ones are banned, highlight what it described as a “blind spot” in TikTok’s moderation. Despite the platform’s “zero tolerance” policy for child sex abuse material, Forbes reported that TikTok appeared to be struggling to enforce its own guidelines. 

Lina Nealon, director of corporate and strategic initiatives for the National Center on Sexual Exploitation, told CP at the time that the situation is not unique to TikTok. 

"No matter where children are on the internet, predators can and will find a way to reach them. What is happening on TikTok is not exclusive to that platform. Instagram, Snap, Discord, and others have vulnerabilities and parents must be aware and diligent," she said. 

Nealon argued that legislation such as the "Kids Online Safety Act" and the "Children and Teens' Online Privacy Protection Act" is necessary to hold tech platforms accountable. She also urged parents to protect their children by using parental controls and having “age-appropriate” conversations with them.

Samantha Kamman is a reporter for The Christian Post. She can be reached at: [email protected]. Follow her on Twitter: @Samantha_Kamman
