The European Commission (EC) is concerned that TikTok is not doing enough to protect children, claiming that the short-video app potentially sends children down rabbit holes of harmful content while making it easy for minors to pose as adults and evade protective content filters.
The allegations emerged on Monday when the European Commission announced a formal investigation into whether TikTok may be violating the Digital Services Act (DSA) “in areas related to the protection of minors, advertising transparency, data access for researchers, as well as the risk management of addictive design and harmful content.”
“We must spare no effort to protect our children,” said Thierry Breton, EU Commissioner for the Internal Market, in the press release, reiterating that “the protection of minors is a top priority for the enforcement of the DSA.”
This makes TikTok the second platform to be investigated for possible DSA violations after X (formerly known as Twitter) came under fire last December. Both are under scrutiny after submitting transparency reports in September that the European Commission said failed to meet the DSA's strict standards on matters such as insufficient advertising transparency and inadequate data access for researchers.
But while X is also under investigation for alleged dark patterns and disinformation, following allegations made last October, the TikTok probe appears to center largely on risks to minors.
“As a platform that reaches millions of children and young people, TikTok must fully comply with the DSA and has a special role in protecting minors online,” Breton said. “Today we are launching this formal infringement procedure to ensure that appropriate measures are taken to protect the physical and emotional well-being of young Europeans.”
The European Commission is likely to request more information from TikTok in the coming months while dissecting its DSA transparency report. The investigation may require interviews with TikTok employees or inspections of TikTok offices.
Once the investigation is complete, the European Commission could require TikTok to take interim measures to address any reported issues. The commission could also make a decision on non-compliance, potentially fining TikTok up to 6 percent of its global revenue.
A press spokesperson for the European Commission, Thomas Regnier, told Ars that the commission suspects TikTok “did not carry out careful risk assessments” to properly maintain mitigation efforts and “protect the physical and mental well-being of its users as well as the rights of the child.”
In particular, its algorithm could risk “stimulating addictive behavior,” and its recommendation systems “could pull its users, particularly minors and vulnerable users, down a so-called 'rabbit hole' of repetitive harmful content,” Regnier told Ars. Furthermore, TikTok's age verification system may be deficient, as the EU suspects TikTok may have “not carefully assessed the risk of 13- to 17-year-olds impersonating adults when accessing TikTok,” Regnier said.
To better protect TikTok's young users, the EU investigation could force TikTok to update its age verification system and revise its default privacy, security and protection settings for minors.
“In particular, the Commission suspects that the default settings of TikTok’s recommendation systems do not ensure a high level of privacy, security and protection of minors,” Regnier said. “The Commission also suspects that the default privacy settings that TikTok has for 16-17 year olds are not the highest by default, which would be inconsistent with the DSA, and that push notifications are not turned off by default for minors, which could negatively impact the safety of children.”
TikTok could avoid hefty fines by committing to the remedies recommended by the European Commission at the end of its investigation.
Regnier told Ars that the European Commission would not comment on the ongoing investigation, but noted that its investigation into X has so far lasted three months. Since the DSA does not set any deadline for concluding such enforcement proceedings, the duration of both investigations will ultimately depend on how fully “the company concerned cooperates,” the EU press release states.
A TikTok spokesperson told Ars that TikTok will “continue to work with experts and the industry to ensure the safety of young people on its platform” and confirmed that the company “looks forward to explaining this work in detail to the European Commission.”
“TikTok has pioneered features and settings to protect young people and keep those under 13 off the platform, issues that the entire industry is struggling with,” a TikTok spokesperson said.
All online platforms are now required to comply with the DSA, but enforcement on TikTok began in late July 2023. A TikTok press release last August promised that the platform would “adopt” the DSA. But in its transparency report submitted the following month, TikTok acknowledged that the report only covered “one month of metrics” and may not have met DSA standards.
“We still have work to do,” TikTok’s report said, promising that “we are working hard to address these issues before our next DSA transparency report.”