Addiction has always sparked fear - at least when it takes a form we can see and touch. Alcohol, cigarettes, gambling - society recognizes their dangers and regulates them. But what happens when addiction hides behind a screen? What happens when the drug is content and the dealer is an algorithm? We panic over substances that harm the few, yet remain silent about digital traps that ensnare millions - especially the young.
This is not hypothetical. In Austria, a man was radicalized through TikTok in a matter of months - an algorithmic descent that ended in bloodshed [1]. Teenagers have plotted terrorist acts, driven by content curated by platforms that value engagement over ethics. And the reach is staggering: over 70% of 11- to 17-year-olds in Austria use TikTok, and more than 85% of them scroll daily [2].
We already regulate addictive substances because they destroy lives. So why do we exempt digital algorithms, which destroy minds, polarize societies, and, as recent events show, can kill? The only difference is their form - intangible, yes, but no less lethal. How did we let this happen?
The algorithms driving platforms like TikTok and Instagram are not passive participants in this crisis - they are its architects. Engineered to maximize engagement at any cost, they prey on human psychology, using reward loops and outrage to trap users in an endless scroll. Their fuel? Division, misinformation, and consumerism. Their currency? Controversy and clicks. Their victims? All of us - especially the young, whose identities and beliefs are still taking shape.
But these algorithms don't just steal attention - they distort reality. Misinformation spreads faster than truth, and conspiracy outcompetes reason. Political polarization deepens as users are funneled into echo chambers that reinforce their biases and harden their views. Debate collapses into hostility. Dialogue turns into division. This is not a malfunction - it is the model. Algorithms thrive on conflict because conflict drives engagement.
The consequences reach beyond politics. These same algorithms are engines of mass consumerism, pushing users into an endless loop of buying trends they never needed. They create cravings - must-have products, viral styles - fueling overconsumption in a time when humanity should consume less, not more. As the planet overheats and resources deplete, the digital economy urges us to scroll, shop, and waste. The cost is planetary, not just personal.
Austria and other countries have already seen the darkest side of algorithmic influence. Platforms don't just share content; they shape belief - faster and more effectively than any human recruiter ever could. Algorithms have no conscience. They chase clicks, not consequences. The result is a digital battlefield where extremism, hate, lies, and mass consumption triumph over truth, empathy, and sustainability.
This is not innovation - it is exploitation. And its victims are not only individuals - they are societies, democracies, and the future of our planet.
The damage of addictive algorithms extends far beyond radicalization - it seeps into the very fabric of our society, draining it of creativity, critical thought, and collective agency. We are witnessing a mass paralysis: millions chained to screens, passively consuming rather than questioning, scrolling instead of shaping the world around them.
Social platforms promise connection but breed isolation. They offer knowledge but reward conformity. The algorithms behind them do not foster curiosity - they suppress it, feeding users a steady stream of content tailored to their biases, narrowing their world instead of expanding it. Users stop exploring, stop debating, stop thinking. Instead, they echo the phrases and opinions served to them by their feeds.
This mental passivity is dangerous. In times of crisis, when society most needs critical minds and bold ideas, we are met with a collective shrug. Only small groups rise to challenge injustice, while the vast majority remain paralyzed - scrolling endlessly, outraged briefly, but ultimately inactive. The platforms thrive on this inaction. They do not want thinkers; they want consumers.
But the cost is catastrophic. We face existential threats - from climate collapse to social inequality - that demand collective creativity and urgent action. We cannot afford a society where people no longer question the world, no longer generate solutions, no longer create. The most critical problems of our time require fresh ideas from every mind capable of thinking beyond a curated feed.
And here lies the bitter irony: these platforms, designed to share ideas, instead produce inactivity. They transform millions into passive participants - consuming without contributing, following without questioning, reacting without reflecting. They suppress the very human instincts - curiosity, imagination, dissent - that drive progress and change.
This is more than an issue of addiction - it is a crisis of agency, of humanity's ability to shape its future. To allow these algorithms to continue unchecked is to accept a society that scrolls through collapse, too numb to act.
The crisis is clear. Addictive algorithms are not harmless technologies - they are engines of exploitation, radicalization, and societal decay. They manipulate human psychology to maximize profit, fostering division, isolating individuals, and paralyzing society's collective potential. They exploit the vulnerabilities of our youth, driving some to extremism and many more into a state of digital dependency. They steal time, creativity, and critical thought, replacing meaningful engagement with mindless consumption.
We cannot pretend that 'responsible use' is a solution. These systems are designed to addict and divide, making self-control a hollow concept. Austria's schools have already resorted to banning smartphones entirely [3] - an act that reveals the true nature of the threat. If we must remove devices to protect children from disruption, why not confront the source - the algorithms that fuel the disruption in the first place?
The solutions are within reach, but they demand courage. Transparency must replace secrecy - social platforms should disclose how their algorithms operate and what content they promote. Engagement loops must be broken, with time limits and design principles that prioritize well-being over retention. Above all, harmful algorithmic practices that exploit and manipulate must be banned outright. We regulate every other public health risk - why should digital addiction be an exception?
But policies alone will not solve this crisis. The greatest responsibility lies with those of us who create these technologies. As AI developers, engineers, and policymakers, we are the architects of this digital ecosystem. We built it, and therefore we must fix it. It is not enough to innovate; we must also safeguard. It is not enough to profit; we must also protect. We have the power to create algorithms that inspire, educate, and connect - technologies that serve humanity rather than exploit it.
Let us be clear: none of these measures are radical. They are the logical response to a clear and present danger. Ignoring this threat, continuing with business as usual - that is what is truly extreme. If we allow these algorithms to continue unchecked, the paralysis of thought, the erosion of creativity, and the radicalization of the vulnerable will only accelerate. The cost will be our society's future.
So, to my peers in technology and policy: the time for passivity has passed. The systems we have built are harming society. It is our duty to redesign them - to confront the crisis we created and rebuild a digital world that empowers rather than exploits. If we do not act, history will ask why those with the power to change course remained silent. The answer must not be that we were too late - let it be that we chose to lead.