European Commission Rules TikTok's Addictive Design Violates Digital Services Act
Author: Tatyana Hurynovich
On February 6, 2026, the European Commission released its preliminary findings regarding TikTok, asserting that the platform's "addictive design" constitutes a breach of the European Union's Digital Services Act (DSA). The investigation, opened in 2024, suggests that features such as infinite scrolling and the automatic playback of videos effectively put users on "autopilot." This psychological shift significantly diminishes self-control, posing a particularly high risk to minors and vulnerable adults.
The Commission specifically identified several core mechanisms that foster dependency, including the relentless delivery of new content, intrusive push notifications, and a highly sophisticated, personalized recommendation engine. Henna Virkkunen, the Executive Vice-President for Tech Sovereignty, Security, and Democracy, highlighted the detrimental effects of social media addiction on the developing minds of young people. She emphasized that under the DSA, digital platforms are legally responsible for the impact their services have on their user base.
Furthermore, the regulatory body pointed out that TikTok failed to conduct a thorough risk assessment, largely ignoring clear indicators of compulsive usage. These indicators include the frequency with which the app is opened and the duration of nighttime usage among underage users. The Commission argues that by overlooking these patterns, the platform has failed in its duty to protect its most susceptible users from the harms of digital overconsumption.
Current mitigation strategies, such as parental controls and screen time management tools, were deemed ineffective by the Commission because they are too easily bypassed or disabled. Thomas Regnier, a spokesperson for the Commission, confirmed that these existing measures are "simply insufficient" to address the systemic risks inherent in the app's design. To rectify these violations, the regulator suggests that TikTok must implement fundamental changes to its architecture, such as disabling addictive features, introducing mandatory usage breaks—especially during the night—and overhauling the recommendation algorithms.
Should these findings be finalized, TikTok's parent company, ByteDance, could face financial penalties of up to 6% of its total annual global turnover. This is a significant threat considering ByteDance reported 2024 revenue of $155 billion, with TikTok's European revenue alone surging by 38% to reach $6.3 billion that same year. ByteDance has strongly contested the Commission's conclusions, calling them a "categorically false and completely unfounded description" of its platform and vowing to fight the ruling through all available legal channels.
This case represents a landmark moment in the enforcement of the Digital Services Act, which establishes a unified regulatory framework for major internet platforms. The DSA requires these entities to proactively evaluate and mitigate "systemic risks," particularly those affecting mental health and the safety of minors. As the European Commission intensifies its scrutiny of other youth-centric platforms, this precedent signals a broader regulatory shift toward prioritizing user well-being in the digital economy.