I saw the news that the EU has ordered TikTok to change its addictive design or face fines, and I wondered what exactly is meant by “addictive design” and how it can be challenged. Here is what I found:
Addictive design refers to interface and algorithmic features that deliberately or foreseeably keep users engaged in a compulsive, hard‑to‑stop way, often by exploiting cognitive biases and attention vulnerabilities. EU regulators are now treating TikTok’s use of infinite scroll, autoplay, push notifications, and a highly personalised recommender system as a particularly clear case of such “addictive design,” especially harmful for minors and vulnerable adults.

What “addictive design” means
In EU debates, addictive design is framed as a subset of “dark patterns”: UX and algorithmic practices that distort or impair users’ ability to make free, informed choices by nudging them towards behaviours they would not otherwise choose.
In TikTok’s case, regulators point to: infinite scroll (no natural stopping point), autoplay (the next video starts without a user decision), frequent push notifications, and a recommender system tuned to reward continuous scrolling and repeated reopening of the app.
The concern is that these features shift users into an “autopilot mode,” undermining self‑control and contributing to excessive screen time, with specific risks for children’s physical and mental wellbeing (sleep disruption, compulsive use, difficulty disengaging).
From a critical media/STS perspective, addictive design is not a single feature but an attention‑extraction regime combining interface, datafication, and business model: design choices are aligned with maximizing time‑on‑platform and ad impressions, not with user autonomy.
How the EU wants this changed
The Commission’s preliminary finding under the Digital Services Act (DSA) is that TikTok’s current design breaches its duty to mitigate systemic risks, including harms from excessive and compulsive use among minors. The Commission therefore signals concrete design changes:
Disable or limit infinite scroll: introducing natural stopping points, rather than an endless feed that constantly refreshes with new content.
Make screen‑time breaks real, not cosmetic: effective, harder‑to‑dismiss breaks, especially at night, with friction that actually supports disengagement (e.g., requiring a deliberate choice to continue, defaulting to stopping).
Adjust recommender systems: changing how the For You‑style algorithm works so that it does not systematically reward compulsive engagement patterns and is more protective of minors (e.g., dampening late‑night use, limiting highly stimulating content loops).
Re‑evaluate push notifications and engagement prompts: reducing frequency and salience, especially for children, and avoiding notification patterns that are designed primarily to re‑trigger use.
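None of this is a published spec; the Commission describes outcomes, not code. Still, a minimal sketch can make the requested mechanics concrete. Everything below is a hypothetical illustration — the class names, block size, caps, and quiet hours are invented for the example, not TikTok’s actual implementation or the DSA’s wording:

```python
from dataclasses import dataclass

@dataclass
class FeedSession:
    """Sketch of a feed with natural stopping points (hypothetical)."""
    block_size: int = 20     # invented threshold: pause every 20 videos
    since_break: int = 0
    paused: bool = False

    def next_item(self, feed, position):
        """Return the next video, or None at a stopping point."""
        if self.paused or position >= len(feed):
            return None                 # no autoplay past the break
        if self.since_break >= self.block_size:
            self.paused = True          # natural stopping point reached
            return None
        self.since_break += 1
        return feed[position]

    def confirm_continue(self):
        """Resuming requires a deliberate user action (friction)."""
        self.paused = False
        self.since_break = 0

def allow_notification(sent_today: int, hour: int, is_minor: bool,
                       daily_cap: int = 3, quiet_start: int = 22) -> bool:
    """Throttle engagement prompts: a daily cap plus night-time quiet
    hours for minors (numbers are illustrative, not regulatory values)."""
    if is_minor and (hour >= quiet_start or hour < 7):
        return False
    return sent_today < daily_cap
```

The point of the sketch is the shape of the defaults: the feed pauses by itself, resuming takes a deliberate action rather than a dismissable overlay, and notifications fail closed at night for minors.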
Under the DSA, if TikTok does not implement adequate changes, the Commission can impose fines of up to 6% of the company’s global annual turnover; in extreme cases, a temporary suspension of the service in the EU is also possible.
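For a sense of scale, the 6% ceiling is simple arithmetic; the turnover figure below is a placeholder, not TikTok’s reported number:

```python
def dsa_fine_cap(global_annual_turnover: float, rate: float = 0.06) -> float:
    """Maximum DSA fine: up to 6% of global annual turnover."""
    return global_annual_turnover * rate

# With a placeholder turnover of 100 billion EUR, the cap would be
# on the order of 6 billion EUR.
cap = dsa_fine_cap(100e9)
```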
Beyond TikTok: how addictive design could be regulated
The Weizenbaum paper on dark patterns and addictive designs suggests a broader regulatory trajectory that goes beyond one app.
Potential directions (many of which align with ongoing EU thinking):
Explicitly classify addictive designs as dark patterns in law, covering not only interface tricks but also algorithmic systems that systematically capture attention.
Recognize “attention” as a protected interest: treat attention and time as something that can be manipulated and thus needs legal safeguards similar to privacy or consumer protection.
Impose duties of proof and auditing: require platforms to assess and document how their design choices affect user wellbeing (especially minors), including metrics like night‑time use, session length, and difficulty disengaging.
Mandate alternative, less‑addictive defaults: e.g., chronological feeds, stricter default time limits for minors, opt‑in (not opt‑out) to personalized feeds, and friction for high‑risk features such as autoplay and endless scroll.
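That last point is essentially a question of shipped defaults: the protective configuration is on out of the box, and personalisation is opt‑in. A hypothetical configuration sketch of what “less‑addictive by default” could look like for a minor’s account (all field names and values are invented, not drawn from any regulation or product):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MinorAccountDefaults:
    """Hypothetical protective defaults for a minor's account."""
    chronological_feed: bool = True       # personalised feed is opt-in
    personalised_feed_opt_in: bool = False
    autoplay: bool = False                # friction for high-risk features
    infinite_scroll: bool = False
    daily_time_limit_minutes: int = 60    # stricter default for minors

defaults = MinorAccountDefaults()
```

Making the dataclass frozen mirrors the regulatory idea: the safe configuration is the baseline, and any change away from it has to be an explicit, recorded user decision rather than a silent platform setting.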
This EU–TikTok case can be read as an institutionalization of critiques from critical algorithmic studies: a shift from “user choice” rhetoric to recognising a structural attention economy, in which addictive design is treated as a regulable harm rather than a neutral UX innovation.
