Regulating The Dopamine Loop
Yesterday TechCrunch reported that the EU is requiring TikTok to disable addictive features such as infinite scroll. So, for today's Binary Response we are asking why this is only a requirement for TikTok and not other social media sites. Please visit Binary News, where regular opinion pieces cover business, politics, sports, technology, and more.
The European Commission is essentially saying TikTok is designed to be addictive and is ordering it to rip out some of the core mechanics that make the app what it is — infinite scroll, autoplay, certain push notifications, and elements of the recommendation engine itself. Their claim is that TikTok didn’t adequately assess how those design decisions could harm users, especially minors and vulnerable adults, and didn’t act on indicators of compulsive use.
So TikTok is now being told to change the basic design of its UI: disable infinite scroll, build in real screen-time breaks, and alter how the recommender works. That's not a nudge. That's a structural rewrite, and TikTok is already calling the findings "categorically false and entirely meritless" and promising to fight it every step of the way.
But the obvious question is if infinite scroll, autoplay and aggressive recommendations are so harmful that they need to be turned off by law… why are we only talking about TikTok?
Let's be honest, none of this is unique to one app. Infinite scroll is baked into Instagram, X, Facebook, YouTube, and pretty much every other feed-based product out there. Autoplay and "you might also like" queues might as well be the default settings for modern attention platforms. The entire growth playbook in social for the last decade has been: reduce friction, keep you in the loop, and let the algorithm drip dopamine until you forget what time it is.
The Commission's argument is actually very general: constantly rewarding users with new content fuels the urge to keep scrolling and shifts users' brains into "autopilot mode," which can lead to compulsive behavior and reduced self-control. That's not a TikTok-specific sentence. That's a description of the core loop on almost every major social platform.
So why frame it as a TikTok problem?
Part of the answer is legal and procedural. The EU’s Digital Services Act gives Brussels new teeth to go after what it calls very large online platforms. TikTok happens to be under an active DSA investigation, and these are just preliminary findings in that specific case. You investigate one platform at a time, you write up one set of findings at a time, and you threaten one company at a time with fines of up to 6% of global revenue. From a regulatory workflow standpoint, that makes sense.
But from a user's perspective, that nuance gets lost. What people see is that TikTok is being told to disable addictive features, while every other app gets to keep scrolling along as usual. That feels less like "we've identified a dark pattern that harms people" and more like "we've picked a villain of the month."
There’s also the geopolitical layer no one ever says out loud in the official documents. TikTok is owned by ByteDance, a Chinese company. Meta, YouTube, X, Snapchat and others are American. It’s a lot easier politically to come down hard on the one big Chinese-owned player and say you’re taking a stand on mental health, especially for minors, than it is to tell a whole ecosystem of U.S.-based giants to gut their engagement engines. The concerns about data, influence and foreign control of a dominant social platform have been simmering for years; now they’re blending with the newer narrative around addiction and youth harm.
That doesn't mean TikTok is innocent. The EU points out that TikTok's current screen-time tools exist but are easy to dismiss and introduce limited friction, and that its parental controls require time and tech literacy that a lot of parents simply don't have. That's fair, but those criticisms could just as easily be copied and pasted into a letter to half the industry.
Meanwhile, Australia is forcing platforms to deactivate accounts for users under 16, the U.K. and Spain are exploring similar measures, France, Denmark, Italy, and Norway are working on age restrictions, and 24 U.S. states are rolling out age-verification laws. That wave is not TikTok-specific. It's a broader panic about young people, screens, and who gets blamed when things go wrong.
Which circles back to the core inconsistency: if the underlying pattern is industry-wide, why is the design remedy not industry-wide?
A more coherent version of this policy would say: "Certain UX patterns — endless scroll without natural stopping points, autoplay that bypasses choice, recommender loops with no friction, overly aggressive push notifications — are considered high-risk by default when minors are involved. Any platform that uses them must do X, Y, and Z: default timeouts, clear 'you've been scrolling for X minutes' interstitials, easy opt-outs, and child-specific protections." Then you apply that across TikTok, Instagram, YouTube, X, you name it. Investigations can stay platform-specific, but the rules about dark patterns and addictive design would be technology-neutral.
Instead, what people are seeing is another TikTok story. TikTok must change the basic design of its app UI. TikTok is the one accused of fueling compulsive use. TikTok is the one facing headline-grabbing fines. That makes it too easy for the other platforms to say, "Whew, at least they're not talking about us — yet."
Here's the thing, folks: if the EU genuinely believes the science it's citing — that these patterns can shift brains into autopilot and undermine self-control — then it can't stop at one platform. Either this is a systemic harm that needs systemic design standards, or it's really just about TikTok.
With that… If infinite scroll and hyper-optimized recommendation engines are dangerous enough that TikTok has to rip them out, how long until the EU (and everyone else) is willing to say out loud that the problem isn’t just which app is doing it — it’s that these design patterns exist everywhere?
Using technology both to make a living and for leisure gives some of us more pointed opinions!



