Commission’s preliminary findings put infinite scroll, autoplay and notifications under the Digital Services Act spotlight—especially for minors.
In a short clip shared on YouTube — “TikTok’s Addictive Design Raises Risks for Children” — the message is blunt: Brussels believes some of TikTok’s most familiar engagement mechanics may be unlawful in the EU if they are found to drive compulsive use and expose children and vulnerable users to systemic harm.
On 6 February 2026, the European Commission published preliminary findings stating that TikTok’s design features could breach the EU’s Digital Services Act (DSA). The case targets a package of “sticky” features — infinite scroll, autoplay, push notifications, and a highly personalised recommender system — which, regulators argue, can reduce users’ ability to disengage and may worsen risks for minors.
What the Commission is alleging
The Commission’s preliminary view is not a final ruling, but it is a clear escalation in the EU’s approach to platform design. Under the DSA, “very large online platforms” must identify and mitigate systemic risks linked to their services — not only illegal content, but also broader harms to fundamental rights and wellbeing.
According to the Commission, TikTok has not shown that it adequately assessed or reduced the risks linked to “addictive design.” EU officials point to the way endless feeds and automated playback can push users into “autopilot” behaviour — a term repeated in several reports on the file — where self-control weakens as content continues to arrive without deliberate choice.
Why minors are central to the case
Children and teenagers are not the only users affected by compulsive design, but they are the focus of the Commission’s risk framing. The preliminary findings stress that platform safeguards should be robust enough for minors and other vulnerable users without relying on perfect user behaviour or constant parental intervention.
The Commission also casts doubt on the effectiveness of TikTok’s existing “screen-time management” features and parental tools, suggesting that optional prompts or easily bypassed controls do not meet the DSA’s standard of effective mitigation. The consumer group BEUC welcomed the Commission’s direction, arguing that real compliance may require altering TikTok’s “core design,” including disabling key addictive features and implementing meaningful breaks.
What TikTok says in response
TikTok denies wrongdoing and says it will contest the Commission’s assessment. Reuters quoted a TikTok spokesperson calling the preliminary depiction “categorically false” and “meritless,” signalling the company will challenge the case through the DSA’s procedural steps.
Those steps matter: TikTok will be able to examine the Commission’s file and submit a formal defence. A preliminary finding can still evolve if the platform provides new evidence, proposes credible remedies, or disputes the Commission’s interpretation of risks and mitigation duties.
What enforcement could mean: design changes, not just fines
If the Commission ultimately confirms non-compliance, the DSA allows for substantial penalties — widely reported as up to 6% of global annual turnover — and, crucially, it can impose binding orders requiring product changes.
In practical terms, this is the EU asking a platform to re-engineer the friction (or lack of it) that keeps people scrolling. Tech industry coverage notes that EU regulators are not simply demanding better disclosures; they are questioning whether engagement loops should be weakened by design, through stronger screen-time breaks, reduced notification pressure, and changes to how recommendations amplify repeated viewing.
A precedent for the wider platform economy
One reason the case is being watched closely is that TikTok is not alone in using infinite feeds, autoplay, and algorithmic recommendations. If the Commission’s legal theory holds, it could signal a broader EU standard: certain engagement mechanics are acceptable only if platforms can prove they have tested and implemented effective safeguards that reduce harm at scale.
Human-rights groups are also watching. Amnesty International urged robust enforcement of the DSA in response to the Commission’s action, framing “addictive design” as a rights and wellbeing issue, not just a consumer preference debate.
The core question now is how far the EU will go in regulating design decisions that sit at the centre of ad-driven business models — and whether mandated “friction” becomes Europe’s signature approach to child online safety.
Related background: The European Times has previously covered debates over how to protect young users online, including the limits of blanket age measures and the need to address underlying risks in platform design (see this explainer).
