By Julian Jaursch

**Updated on February 21, 2024, to reflect the Commission's opening of formal proceedings against TikTok.**

The EU’s major new platform regulation, the Digital Services Act (DSA), requires online platforms to explain their recommender systems, offer user-friendly reporting mechanisms for potentially illegal content, label advertising and report on how they try to ensure diligent and non-arbitrary content moderation. Bigger platforms such as TikTok have to fulfill additional obligations, such as conducting risk assessments.

While the DSA only took full effect in mid-February 2024, rules for these “very large online platforms” (VLOPs) – those with more than 45 million monthly users in the EU – have been in place since the summer of 2023. What have the first months of enforcement meant for TikTok?

Compliance (assurances)

TikTok was keen to show it was fulfilling the DSA’s obligations ahead of the legal deadline. In a blog post in early August 2023, the company said it had introduced an additional reporting mechanism for users and expanded its explanations of content moderation and recommender systems, among other things. It also stated it would not show personalized ads to minors. Research by my colleagues posted on this blog cast some doubt on that last claim.

The fact that TikTok tries to comply with the DSA and publicly announces product and policy changes related to the new rules is a promising sign that the DSA is working: It puts some pressure even on bigger platforms to make changes. The expansion of the researcher API from the US to Europe can also be viewed in this context. The API does not yet fully meet researchers’ expectations, as this blog documented early on. At least compared to other services, though, these are steps in the right direction.

All of this is not to say that the early months of DSA enforcement have been rosy for TikTok, as the European Commission’s formal proceedings against the company underline.

Commission proceedings against TikTok

The Commission oversees VLOPs’ compliance with the DSA and has kept an eye on TikTok and other companies since the summer of 2023, when the DSA started applying to VLOPs. Later that year, it sent requests for information to various VLOPs on different topics. For X and then TikTok, formal proceedings followed these initial requests (see table below).

[Embedded table (made with Flourish): overview of the Commission’s requests for information and formal proceedings against VLOPs]

The opening of proceedings against TikTok in February 2024 means that the Commission is now investigating whether the company is complying with several provisions of the DSA:

  • Does TikTok assess and mitigate risks related to behavioral addictions?
  • Are mitigation measures to protect minors good enough (in DSA language: “reasonable, proportionate and effective”)? In addition to risks of addiction, this concerns protection of privacy and age verification.
  • Is the mandatory ad archive up to the DSA’s standards by being reliable and searchable?
  • Does TikTok provide access to publicly available data, as required by the DSA?

The proceedings are based in large part on the internal risk assessment that TikTok provided to the Commission in August 2023. It is difficult to judge either the platforms’ work or the Commission’s evaluation of it because the risk assessments are not public. What did the platforms write about? What did the Commission think was missing? To answer such questions, researchers, journalists and users have to wait until summaries of the risk assessments are published, which will only happen after the VLOPs have been audited by external auditors.

Whatever the metrics might be, the opening of proceedings shows that the Commission was not satisfied with the quantitative and/or qualitative data it received in TikTok’s risk assessment. The suspected shortcomings reflect long-standing issues raised with TikTok but also with some other VLOPs: lacking youth protection, addictive design features, no meaningful transparency around ads and data access. It will now be up to the Commission to determine whether TikTok has actually been non-compliant on these issues, a process that is likely to take some time.

If TikTok did, in fact, fail to meet its DSA obligations, the ideal final outcome for platform users would be that the Commission can push for meaningful changes at TikTok to protect minors, improve data access and reduce addictive design. To get to that point, however, the Commission needs an airtight case proving non-compliance. Otherwise, there is a risk that a legal challenge by TikTok succeeds and consumers do not see any improvements after all. Legal challenges would not come as a surprise, as some companies, including TikTok, have already sued the Commission on DSA-related issues.

Lawsuit against the Commission

TikTok filed a lawsuit against the Commission, albeit not one concerning the DSA’s substantive due diligence obligations. The company argues that the fees tied to its designation as a VLOP are calculated unfairly. TikTok and Meta, which also sued the Commission over this, object to having to pay millions in supervisory fees per year while platforms with smaller or no profits contribute nothing. The fee is important because it helps finance the Commission’s platform oversight structure.

The Commission maintains that its methodology for calculating the fee is sound; it will now be up to the EU’s General Court to decide whether that is the case. It is not known how much TikTok is supposed to pay (Meta said its payment obligation for 2024 was 11 million euros; for comparison, the company had a net income of 14 billion US dollars in the last three months of 2023).

Taken together, the compliance successes and struggles, the information requests and the lawsuit offer a murky picture of early DSA enforcement: Key questions around the law remain open or will be fought out in court over the next couple of years. Apart from companies, regulators and courts, it will be up to the media, civil society and researchers to center users’ rights throughout these processes.