The EU calls TikTok's design illegal, but defining what legal looks like may prove impossible
Brussels' preliminary ruling against addictive design faces the same problem as regulating junk food: everyone agrees something should change, nobody agrees on what
When K.G.M. was ten years old, her mother tried to block her from social media. She failed. By the time K.G.M. turned nineteen, she had spent nearly a decade on TikTok, Instagram, and Snapchat, developing what her lawyers describe as a compulsion that led to depression, anxiety, and body dysmorphia. On 27 January 2026, the eve of the first jury trial in American history to test whether social media platforms deliberately engineered their products to addict children, TikTok quietly settled. The terms were not disclosed. The company did not admit fault. But it chose to write a cheque rather than let a jury hear how its recommendation engine works.
Ten days later, the European Commission announced that TikTok's design is likely illegal.
What Brussels actually said
The Commission's preliminary finding, issued on 6 February 2026, concluded that TikTok had breached the Digital Services Act through features including infinite scroll, autoplay, push notifications, and its personalised recommendation algorithm. These features, the Commission stated, 'constantly reward users with new content, fuel urge to keep scrolling, shift brain into autopilot mode.' The required remedies are sweeping, demanding that TikTok disable infinite scroll over time, implement effective screen time breaks including at night, and overhaul the recommender system. In the Commission's own words, 'TikTok needs to change the basic design of its service.'
The potential fine is substantial, reaching up to 6% of ByteDance's global annual revenue. TikTok responded by calling the findings 'categorically false and entirely meritless' and promising to challenge them through every available channel.
This is not an isolated action. Meta faces a parallel DSA investigation into whether Instagram and Facebook algorithms stimulate behavioural addictions in children. X, formerly Twitter, was fined roughly €120 million at the end of 2025 for transparency violations. The Commission appears to be building a doctrine: that certain design patterns, when deployed by 'very large online platforms' (the DSA's designation for services with more than 45 million monthly users in the EU), can constitute a breach of the duty to protect user wellbeing.
The ambition is remarkable. So is the problem it creates.
Where engagement ends and exploitation begins
Strip away the regulatory language and the Commission's case rests on a deceptively simple claim: that TikTok's design features are so effective at retaining attention that they cross from serving users into harming them. The scientific research cited in the investigation describes how rapid content delivery, personalised recommendations, and frictionless interfaces can trigger compulsive behaviour and reduce self-control, particularly in minors whose prefrontal cortices are still developing.
This is not controversial as neuroscience. The difficulty is legal. Every feature the Commission identifies as problematic is also a feature that users actively prefer. Infinite scroll removes the friction of clicking 'next page.' Autoplay eliminates the effort of choosing what to watch next. Personalised recommendations surface content users are statistically likely to enjoy. Push notifications alert users to activity they have expressed interest in. TikTok's recommendation engine, built on Apache Flink for real-time feature computation, updates within one second of a user interaction. This responsiveness is precisely what makes the platform feel intuitive rather than clunky. Making the algorithm slower or less accurate does not obviously protect users. It makes their experience worse.
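How tight that loop is can be easier to see in code than in prose. The sketch below is illustrative only: it is not TikTok's code, it does not use Flink's actual API, and every name in it is invented. It shows the general streaming pattern the paragraph describes, in which each interaction immediately rewrites the per-user features that the next ranking request will read.

```ts
// Illustrative sketch only: invented names, not TikTok's code or Flink's API.
// Pattern: every interaction updates per-user features in memory, so the
// next ranking request sees a profile that is at most one event stale.

type Interaction = { userId: string; videoId: string; watchMs: number; liked: boolean };
type Features = { avgWatchMs: number; likeRate: number; events: number };

const userFeatures = new Map<string, Features>();

function onInteraction(ev: Interaction): void {
  const prev = userFeatures.get(ev.userId) ?? { avgWatchMs: 0, likeRate: 0, events: 0 };
  const n = prev.events + 1;
  userFeatures.set(ev.userId, {
    // Incremental running means: no nightly batch job, just a constant-time
    // update applied the moment the event arrives.
    avgWatchMs: prev.avgWatchMs + (ev.watchMs - prev.avgWatchMs) / n,
    likeRate: prev.likeRate + ((ev.liked ? 1 : 0) - prev.likeRate) / n,
    events: n,
  });
}

// A single skipped video is enough to shift what the ranker sees next.
onInteraction({ userId: 'u1', videoId: 'v42', watchMs: 900, liked: false });
```

A production system distributes this state across a cluster and feeds it to a learned ranking model, but the property at issue is already visible in the toy: the loop between behaviour and feed closes in moments, which is exactly what users experience as responsiveness and the Commission describes as autopilot.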
The Commission's language hints at this difficulty without resolving it. Terms like 'autopilot mode' and 'compulsive behaviour' appear throughout the preliminary findings, but the decision does not specify what a legally compliant recommendation algorithm would look like, nor how much less effective the system must become before Brussels is satisfied. There is a reason for this silence. Any metric that could quantify 'addictive design' would also capture what most people simply call 'good design.' A search engine that reliably returns what you want is engaging. A music service that learns your taste is compelling. A video platform that surfaces content you enjoy is, by the Commission's apparent logic, a machine for manufacturing compulsion.
This is not to say the Commission is wrong that a problem exists. It is to say that the problem resists the kind of bright-line regulation that enforcement requires. You cannot fine a company into making its product precisely unenjoyable enough to be legal, yet not so unenjoyable that users abandon it.
The food on your plate
The closest structural analogy is not, as many commentators suggest, tobacco regulation. It is the long, messy, and largely unresolved fight over ultra-processed food.
Roughly 62% of the American food supply meets scientific criteria for hyperpalatability: specific combinations of fat, sugar, sodium, and carbohydrates engineered to trigger the brain's reward system and encourage excessive consumption. Research published in the journal Addiction in 2023 found that tobacco companies which pivoted to food production after cigarette regulation tightened were significantly more likely to produce hyperpalatable products than other food manufacturers. The techniques migrated. The bliss point, the precise level of sweetness or saltiness that maximises enjoyment, became as carefully engineered as nicotine delivery.
The science is increasingly clear. An estimated 14% of people worldwide display symptoms consistent with food addiction. Dr Robert H. Lustig, a neuroendocrinologist at the University of California, San Francisco, has argued that ultra-processed foods meet all four classical criteria for regulation: abuse, toxicity, ubiquity, and externalities. Added sugar, specifically the fructose moiety, activates reward circuitry, and chronic exposure downregulates dopamine receptors, requiring ever greater stimulation to achieve the same effect. That pattern, tolerance, is the neurological signature of addiction.
Yet decades after this science became available, regulatory responses remain fragmentary. Brazil introduced dietary guidelines referencing ultra-processed foods in 2014. France followed in 2018. Chile implemented the most aggressive approach: mandatory warning labels, restricted marketing, and additional taxes on products high in sugar, saturated fat, salt, and calories. Recent studies show that Chilean consumers measurably reduced their intake of warned products. But Chile is the exception. Most countries remain stuck in the definitional stage, debating what counts as 'ultra-processed' and whether government should regulate food composition at all.
The trajectory is instructive. If platform regulation follows the same path, what lies ahead is not swift reform but a multi-decade negotiation characterised by definitional argument, industry self-regulation pledges of varying sincerity, incremental labelling requirements, and very little structural change. The EU's ruling may be less the beginning of meaningful reform than the opening statement in a conversation that will outlast the current Commission.
Two TikToks, two forms of control
There is a detail that makes the EU's case uncomfortable for TikTok. ByteDance operates two separate platforms: Douyin for the Chinese domestic market and TikTok for the rest of the world. They share a codebase and a parent company. They do not share rules.
Douyin imposes a mandatory 40-minute daily limit on users under 14, blocks their access entirely between 10pm and 6am, curates content for minors toward educational material including science experiments and museum exhibits, inserts a five-second pause between videos to disrupt compulsive scrolling, and requires identity verification to prevent age falsification. Tristan Harris, co-founder of the Center for Humane Technology and a former Google design ethicist, has described this as shipping the 'spinach version' of TikTok domestically while exporting the 'opium version' to the rest of the world.
The framing is memorable. It is also incomplete. Douyin's restrictions exist because the Chinese government mandated them, through a 2021 law requiring content 'conducive to healthy growth of minors' and 2022 regulations capping smartphone screen time for under-18s at two hours daily. ByteDance did not voluntarily protect Chinese children. It complied with authoritarian directives that also shape what adults may see and say. The content curated for Chinese youth is filtered through state ideology. Douyin's educational mode is wholesome in a way that serves Beijing's interests as much as children's welfare.
This complicates the standard narrative. The comparison is not between a responsible product and an irresponsible one. It is between two forms of control over attention: state mandate and market incentive. Neither is obviously preferable to a society that values both child safety and individual liberty. The EU is attempting something genuinely novel: using democratic regulation to achieve outcomes that, so far, only authoritarian mandate has delivered, while preserving the freedoms that make democratic societies worth living in. Whether this is possible remains an open question.
The courtroom may matter more than the Commission
While Brussels deliberates, American courtrooms are moving faster.
The K.G.M. trial, proceeding in Los Angeles Superior Court against Meta and YouTube after TikTok's settlement, is the first of a wave of cases that legal commentators have compared to the 1990s campaign against Big Tobacco. Over a thousand individual plaintiffs, hundreds of school districts, and dozens of state attorneys general have filed suits alleging that social media platforms engineered features to addict children. A second bellwether trial, brought on behalf of school districts, is scheduled for June 2026 in federal court in Oakland, California. The legal strategy is precise: by focusing on design choices rather than content, plaintiffs aim to circumvent both Section 230 protections, which shield platforms from liability for user-posted material, and First Amendment defences.
The LA trial will offer unprecedented transparency. Jurors will review thousands of pages of internal documents, including the companies' own research on children. Mark Zuckerberg and Adam Mosseri, the head of Instagram, are expected to testify. For companies that have spent years controlling their public narratives through carefully worded blog posts, the prospect of cross-examination is something new.
Meta has argued that the plaintiffs are 'cherry-picking' executive statements to 'oversimplify' the relationship between social media and mental health, noting that 'clinicians and researchers find that mental health is a deeply complex and multifaceted issue.' This is true. It is also the kind of argument that tobacco companies made for decades before juries stopped finding it persuasive.
The financial logic is clarifying. A regulatory fine, even one calculated at 6% of global revenue, is a predictable cost that can be budgeted, appealed, and amortised. A jury verdict establishing that specific design choices caused identifiable harm to a named child creates precedent. Precedent creates liability. Liability, multiplied across a thousand plaintiffs and hundreds of school districts, creates the kind of existential financial exposure that changes corporate behaviour. TikTok settled for a reason. It did not want twelve ordinary people hearing expert witnesses explain how its recommendation engine targets adolescent reward circuitry.
Governing attention in a democracy
The honest conclusion to draw from the EU's ruling is not that regulation has arrived but that the conversation about what regulation should look like has barely begun. The Commission has identified a genuine problem. Short-form video platforms, optimised by some of the most sophisticated engineering on earth to capture and retain human attention, are producing measurable harm in vulnerable populations. The scientific evidence is accumulating. The personal stories are harrowing. The societal costs are real.
But the tools available to democratic governments are blunt instruments applied to a subtle problem. Banning infinite scroll is easy to mandate and trivial to circumvent. Requiring 'effective screen time breaks' presupposes agreement on what 'effective' means. Demanding that TikTok 'adapt' its recommender system without specifying how leaves enforcement dependent on subjective assessment. The Commission knows this, which is why its language remains carefully imprecise.
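The circumvention point is concrete. In the hypothetical front-end sketch below (the element IDs and the /api/feed endpoint are invented for illustration), the banned pattern and an ostensibly compliant one share every line except the trigger.

```ts
// Hypothetical sketch: invented endpoint and element IDs, illustration only.
let cursor: string | null = null;

async function loadNextPage(): Promise<void> {
  const res = await fetch(`/api/feed?cursor=${cursor ?? ''}`);
  const page: { items: string[]; next: string | null } = await res.json();
  cursor = page.next;
  const feed = document.querySelector('#feed')!;
  for (const item of page.items) feed.insertAdjacentHTML('beforeend', item);
}

// Infinite scroll: the fetch fires automatically as a sentinel element
// at the bottom of the feed approaches the viewport.
new IntersectionObserver((entries) => {
  if (entries.some((e) => e.isIntersecting)) void loadNextPage();
}).observe(document.querySelector('#feed-sentinel')!);

// "Compliant" variant: the identical fetch, gated behind one explicit click.
document.querySelector('#load-more')?.addEventListener('click', () => void loadNextPage());
```

A rule banning the first trigger says nothing obvious about the second, or about a 'load more' button prominent enough that the distinction disappears in practice. That is the enforcement problem in miniature.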
What emerges from the full picture (the EU ruling, the US litigation, the Douyin comparison, the food regulation precedent) is not a story about a single platform or a single decision. It is the early chapter of a society-wide reckoning with a question that previous generations never had to answer: what obligations does a company have when its product is engineered to be irresistible, and its customers include children who cannot yet resist? Slot machine manufacturers faced this question and lost, but only after decades of regulatory battle and only for machines whose sole purpose is gambling. Food manufacturers face it now and are largely winning. Platform companies are somewhere on that continuum, and where they end up will depend less on any single Commission ruling than on whether courts, legislators, and the public can develop a shared vocabulary for the harm they are trying to prevent.
K.G.M. started using social media at ten. Her mother tried to stop her and could not. That failure is not a story about one family's struggle with parental controls. It is a design outcome, produced by systems built to override exactly the kind of friction a concerned parent represents. The question now is whether democratic societies can build friction of their own, friction that protects without controlling, that limits without prohibiting, that treats citizens as something more than attention to be harvested. Brussels has made its opening move. The jury in Los Angeles may make the one that matters.