Newsletter

Addictive platform design under DSA scrutiny: The European Commission's preliminary findings on TikTok

by Larissa Barhebréus


The European Commission has preliminarily found TikTok in breach of Regulation (EU) 2022/2065 (the Digital Services Act, 'DSA') due to design features that may promote addictive user behaviour, marking a novel focus on platform design rather than content issues. The Commission's investigation highlights regulatory attention on the impact of product design on user wellbeing, especially among minors and vulnerable adults. The Commission suggests that the platform's core features must be redesigned rather than merely supplemented by optional user controls. This investigation is among the first to directly challenge addictive design at a major platform, signalling a shift towards addressing systemic risks arising from design choices under the DSA, with complex implications for user autonomy and platform operations. Platform operators should assess the risks posed by features such as infinite scroll, autoplay, push notifications and highly personalised recommender systems, which collectively encourage compulsive use and reduce users' self-control.

Introduction

The European Commission announced on 6 February 2026 that it has preliminarily found TikTok in breach of Regulation (EU) 2022/2065 (the Digital Services Act, 'DSA') for deploying design features that may foster addictive user behaviour. The preliminary findings mark a new development in the Commission's enforcement of the DSA and represent one of the first cases in which platform design itself has been identified as the primary compliance concern. The case is particularly notable for its focus on the relationship between product design and user wellbeing, an area where regulatory intervention has historically been limited. If confirmed, the findings could set the first European precedent for how digital platforms are expected to assess and mitigate risks arising from features intentionally designed to maximise engagement.

Background and regulatory context

The Commission opened formal proceedings against TikTok on 19 February 2024 as part of its broader supervisory role over very large online platforms under the DSA. The investigation covers multiple aspects of TikTok's compliance with the regulation, including the 'rabbit hole effect' of its recommender systems, age verification failures, privacy and security protections for minors, access to public data for researchers, and advertising transparency. Whilst some elements of the investigation have been resolved—advertising transparency was closed through binding commitments in December 2025, and preliminary findings on researcher access were adopted in October 2025—the addictive design component has now emerged as the most substantive concern.

The DSA, which became fully applicable to designated very large online platforms in August 2023, imposes several key obligations under Section 5 relevant to this case. First, platforms must conduct comprehensive risk assessments identifying systemic risks stemming from the design and functioning of their services; two of the four categories of systemic risk listed in the DSA concern behavioural addictions of recipients of the service. Second, they must implement reasonable, proportionate and effective mitigation measures to address those risks. The Commission's preliminary findings suggest that TikTok has failed on both counts.

The Commission's findings on risk assessment

The Commission's investigation preliminarily indicates that TikTok did not adequately assess how certain core design features could harm the physical and mental wellbeing of its users, particularly minors and vulnerable adults.

The features in question include infinite scroll, autoplay, push notifications, and TikTok's highly personalised recommender system. According to the Commission, these features operate collectively to 'reward' users continuously with new content, fuelling the urge to keep scrolling and shifting users into what the Commission describes as 'autopilot mode'. The Commission references scientific research indicating that this design approach may lead to compulsive behaviour and reduce users' self-control.

The preliminary findings are particularly critical of TikTok's approach to its risk assessment obligations. The Commission found that TikTok disregarded important indicators of compulsive use, including the time minors spend on the platform at night, the frequency with which users open the app, and other potential behavioural signals. This suggests that TikTok's risk assessment process was not only incomplete; it also systematically excluded data points that would likely have revealed the extent of the risks posed by its design.

Inadequate mitigation measures

Even where TikTok has introduced measures ostensibly designed to address concerns about excessive use, the Commission preliminarily found these to be ineffective. The Commission focused on two categories of tools: screen time management features and parental controls. In particular, it found that parental controls may not be effective because they require additional time and skills from parents to implement. This raises a broader question about the design of mitigation measures: for a measure to be effective under the DSA, it must be usable in practice by the individuals it is intended to protect. If parental controls are too complex or time-consuming for parents to implement, they fail as risk mitigation tools regardless of their technical capabilities.

At this stage, the Commission considers that TikTok needs to change the basic design of its service. Specific examples cited by the Commission include disabling key addictive features such as infinite scroll over time, implementing effective screen time breaks (including during the night), and adapting its recommender system. Incremental adjustments or optional user controls will not be sufficient. Instead, the platform's core architecture, consisting of the features that drive user engagement, must be restructured to comply with the DSA.

Next steps and potential consequences

TikTok now has the opportunity to exercise its rights of defence. If the Commission's views are ultimately confirmed, it may issue a non-compliance decision. Under the DSA, fines for non-compliance can reach up to 6 per cent of the provider's total worldwide annual turnover, with the amount determined by the nature, gravity, recurrence and duration of the infringement.

The financial risk is substantial, but the operational consequences may be even more significant. If the Commission requires fundamental design changes (such as disabling infinite scroll or restructuring the recommender system), TikTok will face the challenge of maintaining user engagement and advertising revenue whilst complying with regulatory requirements. This tension between commercial imperatives and regulatory obligations is likely to become a defining issue for very large online platforms operating in the EU.

A test case for design regulation

The TikTok investigation represents one of the first instances where a competition or digital services regulator has directly challenged the addictive design of a major platform. Whilst concerns about platform design have been discussed for years by EU policy-makers, regulatory enforcement has largely focused on transparency, consent and data protection rather than on the underlying architecture of user engagement. If the Commission's preliminary findings are confirmed, the DSA framework positions certain design choices as potential sources of systemic risk that must be assessed, mitigated and, where necessary, restructured.

Defining what constitutes 'addictive design' and determining which features must be modified or eliminated involves complex judgements about user autonomy, behavioural science and the balance between engagement and harm. The Commission's findings suggest it is prepared to make those judgements, but the legal and practical challenges of implementing design-level interventions should not be underestimated. As enforcement continues, platforms across the EU should closely monitor the developments to understand where the boundaries of acceptable design lie.
