TikTok as a drug. Brussels finally makes its move
The turning point may be historic for the regulation of digital platforms in Europe: the Commission has launched an investigation into TikTok, accused of being ‘addictive by design’.
The application, now used by one in three Europeans, with far higher rates among the young, came under scrutiny for design features which, according to the European authorities, exploit the cognitive vulnerabilities of users, especially minors, to maximise the time spent on the platform.
Among the features singled out are the customised recommendation algorithm, the endless scrolling of videos (the so-called ‘For You’ feed) and the push notifications that continuously solicit users’ attention.
If the investigation confirms the suspicions, TikTok would face a fine of up to 6% of its annual worldwide turnover and, above all, an injunction to give up the addictive tools that are the very essence of its success.
About time
The investigation comes after years of pressure from public officials, associations and scientists.
Among the most critical voices, European Green MEP Alexandra Geese stated that ‘platforms like TikTok are designed to be addictive, especially among young people’.
The French association ‘Collectif Citoyen pour l’Éthique Numérique’ also pointed out that ‘digital platforms cannot continue to exploit the cognitive vulnerabilities of minors without any regulation’.
They were echoed by the UK Children’s Commissioner, who in 2022 published a report entitled ‘Life in Likes’ pointing out how social platforms exploit instant-gratification mechanisms to retain younger users.
According to the report, ‘platforms such as TikTok use gambling-like techniques to hold the attention of minors’.
The European Commission’s investigation is a significant step towards stricter regulation of digital platforms, especially in light of the Digital Services Act, which came into force in 2023.
This regulation, which aims to protect the digital rights of European citizens, could become the first legal instrument capable of imposing substantial changes in the practices of large technology companies.
What ‘addictive by design’ means
The term ‘addictive by design’ refers to a design strategy that exploits the cognitive and psychological weaknesses of users to maximise the use of a platform.
In the case of TikTok, as we said, this accusation is based on three key elements: the customised recommendation algorithm, the infinite feed and push notifications.
TikTok’s algorithm is one of the most sophisticated in the world. It analyses user behaviour in detail, such as time spent on each video, interactions (‘likes’, comments, shares) and favourite topics.
This data is used to create a customised feed that presents highly engaging content, often in line with the user’s interests and emotions. This mechanism creates a gratification loop that drives users to spend more and more time on the platform.
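To make the mechanism concrete, here is a deliberately simplified sketch of how an engagement-weighted feed might rank videos. Everything here is hypothetical: the signals, weights and field names are invented for illustration, since TikTok’s actual algorithm is proprietary and undisclosed.

```python
# Illustrative sketch only: a toy engagement-weighted ranking,
# NOT TikTok's actual (proprietary, undisclosed) algorithm.

def engagement_score(video, user_interests):
    """Combine behavioural signals into a single ranking score."""
    # Hypothetical weights; a real system would learn these from data.
    score = (
        2.0 * video["watch_ratio"]   # fraction of the video watched
        + 1.5 * video["liked"]
        + 1.0 * video["commented"]
        + 2.5 * video["shared"]
    )
    # Boost content matching the user's inferred interests,
    # reinforcing the gratification loop described above.
    if video["topic"] in user_interests:
        score *= 1.3
    return score

def build_feed(candidates, user_interests, size=3):
    """Return the top-scoring videos: a personalised 'For You' queue."""
    ranked = sorted(
        candidates,
        key=lambda v: engagement_score(v, user_interests),
        reverse=True,
    )
    return [v["id"] for v in ranked[:size]]

candidates = [
    {"id": "a", "watch_ratio": 0.9, "liked": 1, "commented": 0, "shared": 0, "topic": "dance"},
    {"id": "b", "watch_ratio": 0.3, "liked": 0, "commented": 0, "shared": 0, "topic": "news"},
    {"id": "c", "watch_ratio": 0.8, "liked": 1, "commented": 1, "shared": 1, "topic": "dance"},
]
print(build_feed(candidates, user_interests={"dance"}))  # → ['c', 'a', 'b']
```

The point of the toy model is that every additional second of attention and every interaction feeds back into the ranking, so the feed systematically converges on whatever keeps the user watching.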
TikTok’s feed, known as ‘For You’, is designed to scroll endlessly. Once the app is open, the videos follow one another automatically, without requiring any user interaction.
This minimises the cognitive effort required to continue using the app, making it more difficult to ‘detach’ from the platform.
Finally, TikTok’s push notifications are continuous and targeted.
Every interaction, even the smallest, can generate a notification, keeping the user permanently engaged. This system, in short, exploits the principle of intermittent gratification, similar to that used in casinos, to create a sense of anticipation and urgency.
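The intermittent-gratification principle can be sketched in a few lines: rewards (notifications) arrive unpredictably rather than on a fixed schedule, the variable-ratio pattern that behavioural research associates with the strongest habit formation. The probability used below is an arbitrary illustration, not a figure from TikTok.

```python
import random

# Toy illustration of intermittent (variable-ratio) reinforcement,
# the casino-style schedule described above. The 30% reward
# probability is hypothetical.

random.seed(42)  # fixed seed so the run is reproducible

def notify(reward_probability=0.3):
    """Decide unpredictably whether an interaction triggers a notification."""
    return random.random() < reward_probability

# 100 user interactions; only some of them are 'rewarded',
# and the user cannot predict which.
sent = sum(notify() for _ in range(100))
print(f"{sent} notifications for 100 interactions, at unpredictable intervals")
```

Because the user never knows which interaction will pay off with a notification, each one carries a small sense of anticipation, which is precisely the mechanism the Commission’s investigation targets.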
DSA violations
And this is where the Digital Services Act comes in, which for the past two years has imposed a number of obligations on large digital platforms to protect their users.
In particular, Article 25 prohibits platforms from designing their online interfaces in ways that deceive or manipulate users, or that materially distort their ability to make free and informed decisions.
The use of opaque and highly engaging design patterns such as TikTok’s could violate this principle.
Further on, Article 27 requires platforms to ensure the transparency of their recommender systems, setting out the main parameters that determine which content is suggested.
TikTok, however, still maintains a high level of secrecy about how its algorithm works, making it impossible for users and regulators to fully understand how content is selected and proposed.
This opacity is one of the main reasons why the European Commission launched the investigation.
Finally, Articles 34 and 35 oblige very large platforms to conduct regular assessments of systemic risks, including those to minors’ mental health, and to mitigate them. Well, TikTok is accused of ignoring this obligation.
Overwhelming scientific evidence
In recent years, a broad scientific consensus has formed around the detrimental psychiatric, biological and social effects of long-term TikTok use.
In 2022, a team of Oxford researchers conducted a study on 5,000 teenagers, analysing the impact of TikTok on their mental health.
The results revealed that ‘prolonged use of TikTok is associated with a 30% increase in depressive symptoms among adolescents’. In addition, 25% of the respondents said they felt ‘unable to control the time spent on the app’, a clear sign of behavioural addiction.
In 2023, the Max Planck Institute examined the impact of the TikTok algorithm on the brain’s reward system.
Using neuroimaging techniques, the researchers found that ‘the TikTok algorithm can alter the brain’s reward system, making users more susceptible to immediate gratification’.
45% of the teenage users showed signs of behavioural addiction, including an inability to resist the urge to open the app, and a marked reduction in offline social activities.
Also in 2023, in France, a report by the Institut national de la santé et de la recherche médicale highlighted how intensive use of TikTok can have negative consequences on young people’s social skills. According to the study, ‘intensive use of TikTok correlates with a significant decrease in offline social interactions’. 35% of the adolescents surveyed reported that they ‘prefer virtual interaction to real interaction’, with negative consequences on their ability to communicate and relate in the real world.
In the same year, an investigation conducted by Karolinska Institutet in Stockholm examined the biological effects of TikTok use.
The researchers found that ‘intensive TikTok users have higher levels of the stress hormone cortisol’.
Furthermore, 20% of respondents reported sleep problems related to the use of the platform, including difficulty falling asleep and frequent nocturnal awakenings.
David versus Goliath?
The odds of success of the European Commission’s initiative are not low, especially in light of the fine recently threatened against TikTok for the opacity of its archives.
In 2023, in fact, the company was accused of not providing enough information about the functioning of its algorithm and the management of user data. The Commission threatened the usual fine of 6% of global turnover.
TikTok responded with a partial opening, promising to improve transparency and to cooperate with European authorities.
However, many observers consider these measures to be insufficient. On paper, the Commission would have the authority to impose structural changes, such as changing the recommendation algorithm or limiting push notifications. Political support from national governments, for once, would be broad.
These actions could force TikTok to revise its business model, weaning it off the creation of behavioural dependency.
The question, of course, is: can TikTok survive without that system?
The difference between us and them
A new nightmare is therefore looming for the Chinese company behind what someone has labelled ‘the bacteriological weapon of the 21st century’, just a few months after the Trump administration took drastic action against it.
At the end of January 2026, after years of hesitation and procrastination, TikTok USA was sold to a consortium led by Oracle, a US company (close to the President) specialising in cloud solutions and cybersecurity.
The handover came under pressure from the federal government, which considered the Chinese app a threat to national security.
The sale to Oracle, as expected, led to significant changes in the management of TikTok’s data and algorithms.
The new ownership has implemented security measures to ensure that US users’ data is no longer accessible to the Chinese government. However, it has also taken the opportunity to rejig the criteria for censoring content, at the expense of voices disliked by Trump. For instance, during the Minneapolis protests, many users reported that content related to the demonstrations was systematically removed or ‘shadowbanned’ (i.e. made less visible).
In short, Trump and his technocratic allies did not disarm the ‘bacteriological weapon of the century’, but rather exploited it for their own interests.
The European Union’s intervention, on the contrary, is not justified by reasons of national security but of health and the protection of children and adolescents.
The procedure may appear slower, but it is also more legitimate, because it belies any suspicion that the authorities in Brussels simply want to wrest a convenient instrument of social control from the hands of the authorities in Beijing.
The difference in style between the two sides of the Atlantic is clear at the moment. We shall see which of the two strategies will yield the best results overall.