TikTok and Twitter/X Draw EU Scrutiny Under New Rules
By Loc Le
EU Investigation Targets TikTok in Potential DSA Violations
The EU officially opened formal proceedings on February 19, 2024, to investigate TikTok, the giant social media platform owned by the Chinese internet technology company ByteDance. As announced in a press release, the European Commission will examine TikTok for potentially breaching the Digital Services Act (DSA), on the basis of preliminary investigations that included the company’s September 2023 risk assessment report and its replies to the Commission’s formal requests for information regarding illegal content, the protection of minors, and data access.
This examination of TikTok marks the second formal DSA proceeding, after X, the social media platform formerly known as Twitter, was investigated in December 2023 over the distribution of illegal content and ineffective measures for combating disinformation. The DSA, which took effect last year, comprises comprehensive regulations focused on ensuring the safety of internet users, including mandates to facilitate the identification of malicious content such as hate speech, to provide users with non-algorithmic recommendations, and to prohibit advertisements targeted at children.
Alleged Violations and Regulatory Concerns
In TikTok’s case specifically, the company is accused of violating DSA provisions relating to the “protection of minors, advertising transparency, data access for researchers, as well as the risk management of addictive design and harmful content.” According to Thierry Breton, the European Commissioner for the Internal Market, TikTok has a crucial role in protecting minors online and complying with the DSA because the platform reaches millions of children and teenagers, and the European Commission is launching the formal infringement proceeding “to ensure that proportionate action is taken to protect the physical and emotional wellbeing of young Europeans.” Breton also announced the investigation in a post on X that listed further suspected DSA breaches, including “addictive design and screen time limits, rabbit hole effect, age verification, and default privacy settings.”
The issues highlighted by Breton will be the main areas of focus for the European Commission’s proceedings. The investigation will scrutinize TikTok’s compliance with DSA requirements on assessing and mitigating systemic risks, particularly potential negative effects on users’ well-being and the rights of minors, such as behavioral addiction and exposure to inappropriate content. It will also assess TikTok’s privacy, safety, and security measures for minors, including the adequacy of default privacy settings and recommender systems, and will examine the platform’s adherence to DSA rules on advertising transparency and researchers’ access to platform data. If the investigation finds TikTok in violation, the company could face fines of up to 6% of its global annual revenue.
TikTok’s Response and Industry Context
In response to the European Commission’s investigation, a TikTok spokesperson said the company “has pioneered features and settings to protect teens and keep under 13s off the platform,” issues that “the whole industry is grappling with.” The company also stated that it will “continue to work with experts and industry to keep young people on TikTok safe and look forward to now having the opportunity to explain this work in detail to the Commission.” TikTok added that it had already responded to all of the Commission’s requests for information but had not yet received any feedback, and that it had previously offered to have its internal child safety staff meet with Commission officials, an offer that has yet to be taken up.
Past Controversies and Ongoing Scrutiny
While TikTok may appear to be actively improving the platform’s safety measures for children, the company may not be doing enough, as this is not the first time it has faced such controversy. In April 2023, TikTok was fined £12.7m by the UK Information Commissioner’s Office for illegally processing and using the data of 1.4 million UK children under 13 who were using the platform without consent from their parents or legal guardians. More recently, in September 2023, TikTok was fined €345m (£296m) by the Irish data watchdog for breaching EU data laws in its handling of children’s accounts, including failing to shield underage users’ content from public view.
Now, with the European Commission’s DSA investigations into social media platforms underway, many will wonder just how many warnings TikTok needs before it starts taking these issues seriously. As one of the largest and fastest-growing social media platforms, TikTok must do better in its efforts to protect online users, especially minors.