TikTok could be fined £27 million ($29 million) for failing to protect children’s privacy, the U.K.’s Information Commissioner’s Office (ICO), a public body reporting to Parliament, announced Sept. 26. TikTok told Reuters in an email that it disagreed with the ICO’s findings and would respond soon. TikTok did not respond to the Observer’s request for comment.
Social media companies like TikTok are under scrutiny for violating children’s privacy laws and failing to enforce their self-imposed age limits, which bar children under 13 from the sites in the U.S. and U.K. (age limits differ in other countries). These rules shelter kids from harmful content and also protect them from surveillance, exploitation, harassment and data collection. Parents have spoken out about the impact social media can have on children’s mental health, self-confidence and sleep, even when accounts are monitored by a parent.
Setting the age limit at 13 allowed social media companies to comply with the Children’s Online Privacy Protection Act, a 1998 U.S. law that governs when companies can begin collecting data from kids without parental consent (in recent years, apps like TikTok and Instagram have launched alternate versions of their apps for kids that require parental consent). Children continue to develop for years after turning 13, so even users old enough to hold accounts remain vulnerable to serious harm.
But social media apps still appeal to users under 13, and some are tech-savvy enough to bypass the restrictions. Companies say they try to remove these users, even though doing so cuts into their profits. For now, fines and regulatory pressure are the primary tactics keeping social media companies policing the ages of their users.
After the U.S. fined TikTok $5.7 million in 2019, TikTok removed accounts of users under 13 based on the birthdays users entered when they first signed up. It then asked users who claimed they were wrongly removed to verify their birthdays with a photo of government identification. Users under 13 were directed to an alternate version of the app with restricted content, where they couldn’t post or comment. Meta (META), which owns Facebook and Instagram, said it had developed artificial intelligence to detect underage users, including by monitoring “happy birthday” messages from friends. The technology isn’t a catch-all, though. Earlier this month, Ireland fined Meta-owned Instagram 405 million euros ($401 million) for publishing email addresses and phone numbers of underage users.
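For illustration only: a birthday-based age gate of the kind described above reduces to computing an age from the self-reported birthdate and comparing it to a threshold. This minimal Python sketch uses hypothetical function names and is not TikTok’s actual code.

```python
from datetime import date

MINIMUM_AGE = 13  # the U.S./U.K. sign-up floor discussed above

def age_on(birthdate: date, today: date) -> int:
    """Whole years elapsed between birthdate and today."""
    years = today.year - birthdate.year
    # Subtract a year if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def flag_underage(signup_birthdate: date, today: date) -> bool:
    """True if the self-reported birthdate implies an age under 13."""
    return age_on(signup_birthdate, today) < MINIMUM_AGE

# A user who entered a May 2012 birthdate would be flagged in September 2022.
print(flag_underage(date(2012, 5, 1), date(2022, 9, 26)))  # True
```

The obvious weakness, as the article notes, is that the check depends entirely on the birthdate the user chooses to enter.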
It is difficult to determine exactly how many TikTok users are younger than 13, but a New York Times review of TikTok’s internal documents found that 18 million U.S. users, or a third of the app’s daily U.S. users, are 14 and younger. Some of that group are 13 or 14 and therefore old enough for accounts, but some likely aren’t. A former TikTok employee said coworkers would flag users who appeared to be younger than the threshold, but the accounts wouldn’t be taken down for weeks, according to the Times.
While requiring a government-issued ID to sign up would strengthen these age restrictions, no major social media company has ever adopted the practice.
Are fines an effective tool for regulating social media?
Social media companies also construct age profiles of users based on the behaviors and interests they express on the app, said Cobun Zweifel-Keegan, a managing director at the International Association of Privacy Professionals, a nonprofit privacy-focused organization. Based on these profiles, companies can infer a user’s age and remove suspected children. They also rely on users flagging content that appears to come from an underage account.
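To give a rough sense of how such behavioral profiling could work, here is a toy Python heuristic. Every signal name, weight and threshold below is invented for illustration; real platforms rely on machine-learned models rather than hand-tuned rules like these.

```python
# Hypothetical behavioral age-profiling heuristic. The signal names,
# weights and threshold are invented for illustration; no platform's
# real system is represented here.
UNDERAGE_SIGNAL_WEIGHTS = {
    "follows_mostly_school_age_accounts": 0.4,
    "birthday_comments_mention_age_under_13": 0.9,
    "interests_skew_to_childrens_content": 0.5,
}

def underage_score(signals: dict) -> float:
    """Sum the weights of whichever signals fired for an account."""
    return sum(
        weight
        for name, weight in UNDERAGE_SIGNAL_WEIGHTS.items()
        if signals.get(name, False)
    )

def should_review(signals: dict, threshold: float = 1.0) -> bool:
    """Queue the account for human review when the score crosses the threshold."""
    return underage_score(signals) >= threshold

# An account with two strong signals exceeds the review threshold.
print(should_review({
    "birthday_comments_mention_age_under_13": True,
    "interests_skew_to_childrens_content": True,
}))  # True (0.9 + 0.5 = 1.4 >= 1.0)
```

Routing flagged accounts to human review, rather than removing them automatically, reflects the uncertainty inherent in inferring age from behavior.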
Since this time last year, TikTok has purged 70 million suspected underage accounts globally, or half of the total accounts it removed for violating its rules. Some of those accounts may have been created by the same person. Neither Meta nor Twitter publishes how many accounts it removes for this reason.
Fines are used to enforce other forms of social media compliance, such as privacy protections, Zweifel-Keegan said. “It’s undeniable that it’s moved the dial, and a big part is that there are big fines attached.”
However, he doesn’t necessarily see age regulation as the only answer to the dangers of social media. Other at-risk communities use these platforms, he said, including users with mental health issues, addictions and eating disorders, as well as senior citizens and marginalized groups, who can fall prey to the same kinds of exploitation and inappropriate content as children. Government regulation can encourage companies to think about the unique harms kids face, but also to extend those protections to broader communities, he said.
The ICO found that TikTok illegally processed data of children under 13 without parental consent, processed special category data (including sexual orientation, religious and political beliefs, race and health data) and failed to disclose that it was doing so in a clear and concise way. The ICO says TikTok violated privacy laws between May 2018 and July 2020, though it did not explain why it chose those bookend dates. The organization issued TikTok a notice of intent, a warning that precedes any formal penalty, meaning TikTok has not yet been fined. If levied, the £27 million penalty would be the largest in the ICO’s history, surpassing the £20 million fine issued to British Airways in 2020.