Europe takes on Big Tech: what impact will new social media rules have?

The era of the digital Wild West is over. Europe’s massive fines and algorithmic crackdowns signal a decisive shift toward a new regime of strict oversight and corporate accountability for social media giants.

Lina Žemaitytė Kirkman

3/8/2026 · 4 min read

The €120 million fine on Platform X and the accusations that TikTok's algorithm is addictive mark a new stage in Europe's relationship with social media giants. It signals the end of the digital Wild West and the transition of tech companies operating in the European Union (EU) to a regime of stricter oversight and clearer accountability.

Out of step with today’s realities:
Until recently, social media operated under the EU’s Electronic Commerce Directive, adopted in 2000. Its rules were designed before smartphones, algorithmic news feeds, modern social platforms, or today’s influencer economy. In practice, they no longer reflected how the digital space had changed.

Over the past two decades, social platforms have emerged and become more than just a place to connect with friends. They have begun to shape what we see, what we believe, how much time we spend online, what goods we buy, and even what political attitudes we form.

Naturally, the question arose whether platforms with such influence can operate under outdated rules. European institutions have begun to examine the principles of platform algorithms and advertising, the protection of minors, and whether tech giants are “locking” users into their ecosystems.

The regulators’ deliberations resulted in two pieces of legislation adopted in 2022 and fully entering into force in 2024: the Digital Services Act (DSA) and the Digital Markets Act (DMA). The DSA aims to ensure that social networks and other large platforms take responsibility for various risks (disinformation, manipulative advertising, the protection of minors, etc.), operate transparently, and preserve users’ right to maintain control. The DMA, meanwhile, prevents tech giants from abusing their dominant market position.

At first, these were just rules “on paper.” But now Europe has started to put them into practice.

Hundred-million-euro fines:
One of the first precedents in the application of the DSA was the €120 million fine imposed on the platform X (formerly Twitter). The European Commission (EC) cited several violations. One concerns the possible misleading of consumers, specifically through the platform’s trust symbol. Previously, a white checkmark on a blue background meant that an account was verified and genuine, i.e. that it belonged to the person it claimed to and could be trusted. After businessman Elon Musk took over Twitter, the badge became a paid service that anyone could simply purchase. According to EU regulators, this misleads consumers: a symbol that once signalled reliability can now simply be bought.

Another DSA violation identified by the regulator concerns the transparency of advertising on X. Major social media platforms must maintain a public, easily accessible advertising archive that shows who ordered an advertisement, to whom it was shown, and according to what criteria. This requirement is intended to ensure that the public, journalists, and researchers can understand what influences users in the digital space and how, and to reduce the risk of manipulative advertising. X has been criticized for failing to meet these requirements.

The fine imposed on X is an important precedent, showing that the EU has moved from talk of a safer and more transparent digital environment to concrete enforcement.

Another case illustrating the shift in EU policy concerns TikTok. The social network’s algorithm, built on the display of highly personalized content, is considered one of the most effective tools for holding users’ attention. The EC has published preliminary findings that the platform may have violated the DSA by encouraging excessive use, especially among minors. The company must therefore assess the risks and take measures to mitigate them.

According to the EC, the measures already offered by TikTok, such as the ability to limit usage time or parental control functions, are insufficient to mitigate the risks of the app's addictive design.

If the violations are confirmed, TikTok could be fined up to 6 percent of its global annual revenue. It would also mean that, for the first time, regulators are targeting not only a platform’s content but also how its algorithms are designed and operate.

What benefits will it create for users?
The new EU regulation should provide more transparency and control to users.

The DSA stipulates that major platforms must offer at least one content-recommendation alternative that is not based on profiling. In practice, this usually means the option of a chronological feed, in which posts are displayed according to the time of their publication rather than according to the algorithm’s assessment of what the user may find interesting. LinkedIn and Instagram, for example, already offer a chronological feed in place of an algorithmically arranged one. The feature is also available on Facebook, although it is harder to find.

Platforms must also be more explicit about who ordered an ad, why a particular user sees it, and what data was used to target it. This does not mean we will no longer see ads, but there will be more transparency about who is behind them. This is especially relevant for political and other opinion-forming advertising.

The largest platforms will also be required to conduct risk assessments and demonstrate that they are taking steps to mitigate threats, from disinformation to risks to minors.

Will the world follow suit?
Europe has already become a global standard-setter for data protection with the General Data Protection Regulation (GDPR). The current decisions reflect its ambition to play a similar role in technology regulation. Social networks, it is recognized, have a significant impact on society and therefore cannot operate without clear accountability, transparent algorithms, and mechanisms to protect minors.

Whether this model becomes a global standard or remains a European one, time will tell. But one thing is clear: Big Tech’s activities in the EU will now be monitored more closely than ever before.

Of course, there is no shortage of debate about the EU’s position. Critics argue that platforms may moderate content too strictly in order to avoid fines, and that complex regulatory procedures may hinder competition, since smaller market participants find them harder to implement. Yet Europe is sending a clear signal: social networks are no longer seen as merely private technology companies. They are treated as digital infrastructure with a systemic impact on society. And infrastructure, from a European perspective, must be regulated.