Brussels Declares War on Infinite Scroll: TikTok Faces Billions in Fines Over 'Addictive Design'
The European Commission accuses TikTok of breaching the Digital Services Act through deliberately addictive features, potentially forcing ByteDance to fundamentally redesign its platform or face fines up to 6% of global revenue.
The European Commission dropped a regulatory bombshell on Friday, accusing TikTok of breaching the bloc's Digital Services Act through deliberately "addictive design" features that harm users' mental health. The preliminary findings represent the EU's most aggressive move yet against social media architecture itself, potentially forcing ByteDance to fundamentally redesign its platform for European users or face fines up to 6% of global annual turnover, potentially billions of euros.
The charges strike at the operational heart of TikTok's business model. Features that define the platform's user experience, including infinite scroll, autoplay, push notifications, and the hyper-personalized recommendation algorithm, are now classified by EU regulators as harmful addictive mechanisms that place users' brains into "autopilot mode" and encourage compulsive behavior.
The Science of Doom-Scrolling
The Commission's investigation, launched in February 2024, represents months of forensic examination into how TikTok's design affects user behavior. Regulators analyzed the company's internal risk assessment reports, examined proprietary data and documents, reviewed extensive scientific research on behavioral addiction, and conducted interviews with experts across multiple disciplines.
What they found was damning. According to the preliminary findings, TikTok failed to adequately assess how its core features could harm users' physical and mental wellbeing, particularly minors and "vulnerable adults." The Commission specifically highlighted how the platform's constant reward mechanism, feeding users an endless stream of new, algorithmically optimized content, fuels compulsive usage patterns.
"By constantly 'rewarding' users with new content, certain design features of TikTok fuel the urge to keep scrolling and shift the brain of users into 'autopilot mode,'" the Commission stated. Scientific research shows this can lead to compulsive behavior and reduced self-control, particularly among adolescents whose prefrontal cortices are still developing.
Dutch Teens and Digital Addiction
For the Netherlands, these findings resonate powerfully. Dutch researchers at the Trimbos Institute, the country's leading addiction research center, have documented alarming increases in problematic social media use among teenagers. A 2025 study found that 18% of Dutch adolescents aged 13-17 exhibited signs of social media dependence, with TikTok specifically identified as the platform most associated with compulsive usage patterns.
The Netherlands' own Youth Care agency has reported rising numbers of referrals for treatment programs addressing digital addiction, with social media platforms, particularly TikTok, frequently cited as contributing factors. Dutch mental health professionals have advocated for stricter platform regulation, making this EU action particularly welcome in The Hague.
Henna Virkkunen, the European Commission's Executive Vice-President for Tech Sovereignty, Security and Democracy, framed the investigation in explicitly protective terms. "Social media addiction can have detrimental effects on the developing minds of children and teens," she said Friday. "The Digital Services Act makes platforms responsible for the effects they can have on their users. In Europe, we enforce our legislation to protect our children and our citizens online."
The Addictive Features Under Scrutiny
The Commission's preliminary findings identify specific design elements that violate the DSA. Infinite scroll, the continuous feed that automatically loads new content as users reach the bottom of their screen, eliminates natural stopping points that might prompt users to disengage. This feature alone fundamentally changes how people interact with content, transforming active browsing into passive consumption.
Autoplay functionality compounds this effect. Videos begin playing automatically as users scroll, eliminating the micro-decision of whether to engage with specific content. Combined with the algorithm's sophisticated personalization, this creates what regulators describe as a "compulsion loop": users are simultaneously stimulated by content tailored precisely to their interests and denied any natural pause in the experience.
Push notifications represent another vector of concern. TikTok's notification system is designed to pull users back into the app with highly personalized alerts about new content, interactions, or trending videos. The Commission found that these notifications exploit psychological triggers related to social validation and fear of missing out, particularly potent among younger users.
The recommendation algorithm itself (TikTok's secret sauce, the AI system that determines which videos each user sees) came under particular scrutiny. Regulators found that the algorithm optimizes purely for engagement time, with no consideration for potential negative impacts of prolonged use or exposure to certain content types. If keeping someone scrolling for three hours straight at 2 AM maximizes engagement metrics, the algorithm has no built-in constraint against facilitating that behavior.
TikTok's Failed Safeguards
Perhaps most damaging to TikTok's defense, the Commission found that the platform's existing mitigation measures are performative rather than effective. The company's screen-time management tools and parental controls technically exist but are designed to be easily bypassed.
The Daily Screen Time feature, for instance, automatically applies a one-hour daily limit for users aged 13-17. But as regulators noted, the warnings are "easy to dismiss and introduce limited friction." A teenager can tap through the alert in seconds and continue scrolling. There's no meaningful barrier, no cooling-off period, no requirement to actively choose to override the limit.
Similarly, TikTok's Family Pairing tool, which allows parents to customize safety settings, set screen-time limits, and restrict content, requires "additional time and skills from parents to introduce the controls," the Commission found. In practice, most parents either don't activate these features or lack the technical understanding to configure them effectively. This shifts responsibility for child safety entirely to parents while TikTok's platform continues operating in its most addictive configuration by default.
The Commission also criticized TikTok for disregarding key indicators of problematic use in its risk assessments. The company apparently didn't consider the time minors spend on the app late at night, the frequency with which users compulsively reopen the app, or other behavioral patterns that addiction researchers identify as red flags.
The Dutch Digital Rights Debate
In the Netherlands, this regulatory action intersects with an ongoing national debate about children's digital rights and platform responsibility. The Dutch Data Protection Authority has been investigating TikTok separately over concerns about minors' privacy and data collection practices. The Dutch digital rights organization Bits of Freedom has advocated for stricter age verification and default protections for young users.
But there's also resistance to overly paternalistic approaches. Dutch civil liberties organizations argue that blanket restrictions on platform features could infringe on teenagers' rights to information access and digital participation. They advocate for better digital literacy education rather than treating all young people as incapable of self-regulation.
This tension reflects a broader European challenge: how to protect vulnerable users without infantilizing them or restricting beneficial digital experiences. The Commission's approach attempts to thread this needle by targeting platform design rather than user behavior, regulating how companies architect their services rather than limiting who can access them.
Billions at Stake
If the Commission's preliminary findings are confirmed after TikTok responds, the company faces potential fines up to 6% of ByteDance's global annual turnover. ByteDance doesn't publicly disclose consolidated global revenue, but estimates place it between $80 billion and $120 billion annually. A maximum fine could therefore range from roughly $4.8 billion to $7.2 billion, which would make this potentially one of the largest tech penalties in European regulatory history.
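For readers who want to check the figures, the fine range follows mechanically from the DSA's 6% cap applied to the revenue estimates cited above (the revenue numbers are estimates, not confirmed ByteDance disclosures):

```python
# Back-of-envelope check of the fine range cited in the article.
# Assumes the $80-120 billion ByteDance revenue estimates; the 6% cap
# is the DSA's stated maximum penalty on global annual turnover.
DSA_MAX_RATE = 0.06

revenue_low = 80e9    # low-end revenue estimate, USD
revenue_high = 120e9  # high-end revenue estimate, USD

fine_low = revenue_low * DSA_MAX_RATE    # ~$4.8 billion
fine_high = revenue_high * DSA_MAX_RATE  # ~$7.2 billion

print(f"Maximum fine range: ${fine_low/1e9:.1f}B to ${fine_high/1e9:.1f}B")
```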
For comparison, the EU's previous record tech fines include Google's €4.34 billion penalty for Android antitrust violations and Meta's €1.2 billion for GDPR breaches. A maximum TikTok fine would exceed both, signaling that Brussels views addictive design as a violation comparable to, or worse than, antitrust abuse or privacy violations.
But the financial penalty may be secondary to the operational demands. The Commission concluded that TikTok must "change the basic design of its service" to comply with the DSA. Proposed remedies include disabling infinite scroll, implementing effective screen-time breaks that can't be easily dismissed, introducing mandatory breaks during nighttime hours, and fundamentally adapting the recommendation algorithm to consider user wellbeing alongside engagement metrics.
TikTok's Defiant Response
ByteDance rejected the Commission's findings in the strongest possible terms. "The Commission's preliminary findings present a categorically false and entirely meritless depiction of our platform," a company spokesperson said. "We will take whatever steps are necessary to challenge these findings through every means available to us."
TikTok argues that there's "no one-size-fits-all approach" to managing screen time, and that the platform already provides multiple tools for users to control their experience. The company emphasizes its investments in safety features, including expanded parental controls, content filtering, and proactive identification of potentially harmful material.
The company will now have the opportunity to examine the Commission's evidence in detail and submit a formal response, potentially including proposed remedies that might satisfy regulators without requiring fundamental platform redesign. TikTok can also request a hearing before the European Board for Digital Services, an independent advisory body.
Global Regulatory Momentum
The EU's action against TikTok doesn't exist in isolation. Australia recently banned social media for under-16s entirely. France, Spain, and Denmark are considering similar age-based restrictions. In the United States, TikTok settled a landmark social media addiction lawsuit in January, though the terms remain confidential. Meanwhile, Instagram and YouTube face similar litigation claiming their platforms deliberately addict and harm children.
For the Netherlands specifically, this represents validation of domestic regulatory concerns. Dutch policymakers have been advocating within EU institutions for stronger platform accountability around addictive design. The country's delegation to the European Parliament supported the Digital Services Act's provisions on systemic risk assessment, viewing tech platform regulation as a logical extension of traditional consumer protection principles.
Whether this regulatory approach succeeds depends partly on enforcement credibility. The DSA gives the Commission unprecedented power to investigate platforms, demand internal data, interview employees, and impose binding remedies. But actually forcing a company like ByteDance to fundamentally alter its product for European users-and verifying compliance-represents a massive administrative and technical challenge.
Redesigning Digital Experience
If Brussels ultimately compels TikTok to disable infinite scroll and autoplay for European users, it would mark a watershed moment in tech regulation: the first time a major government forced a digital platform to abandon core interface elements for entire populations. This sets a precedent that extends far beyond TikTok to every social media company, streaming service, and content platform operating in Europe.
The implications cascade outward. If infinite scroll violates the DSA when implemented by TikTok, why wouldn't the same logic apply to Instagram Reels, YouTube Shorts, Facebook's feed, or Twitter's timeline? The Commission's reasoning about "autopilot mode" and compulsive behavior patterns isn't specific to ByteDance; it describes features common to most popular digital services.
For Dutch users, TikTok's potential redesign could mean a fundamentally different experience: videos that pause after certain intervals, mandatory breaks that can't be dismissed, a recommendation algorithm that limits consecutive similar content, and actual barriers to late-night usage. Whether this would be experienced as protective regulation or annoying paternalism likely depends on age and usage patterns.
What's clear is that the era of treating digital platform design as purely a business decision, immune from regulatory scrutiny as long as content moderation policies exist, is ending in Europe. The EU is asserting that how platforms are built, not just what content they host, falls within the scope of legitimate public interest regulation. Whether other jurisdictions follow this approach, and whether the practical enforcement matches the regulatory ambition, will define digital governance for years to come.
Mr. Squorum
Political Analyst
Political analyst specializing in Dutch-EU relations and European affairs.