By Julian Jaursch

The EU’s major new platform law, the Digital Services Act (DSA), requires online platforms to explain their recommender systems, provide user-friendly mechanisms for reporting potentially illegal content, label advertising and report on how they try to ensure diligent and non-arbitrary content moderation. Bigger platforms such as TikTok face additional obligations, for example conducting risk assessments.

While the DSA only takes full effect in mid-February 2024, rules for such “very large online platforms” (VLOPs) – those with more than 45 million monthly users in the EU – have been in place since the summer of 2023. What have the first months of enforcement brought for TikTok?

Compliance (assurances)

TikTok was keen to show it was fulfilling the DSA’s obligations ahead of the legal deadline. In a blog post in early August 2023, the company said it had introduced an additional reporting mechanism for users and expanded its explanations of content moderation and recommender systems, among other things. It also stated it would not show personalized ads to minors. Research by my colleagues posted on this blog cast some doubt on that claim.

The fact that TikTok tries to comply with the DSA and publicly announces product and policy changes related to the new rules is a promising sign that the DSA is working: it puts pressure even on bigger platforms to make changes. The expansion of the researcher API from the US to Europe can also be viewed in this context. The API does not yet fully meet researchers’ expectations, as this blog documented early on. Compared to other services, though, these are at least steps in the right direction.

All of this is not to say that the early months of DSA enforcement have been rosy for TikTok.

Commission requests for information

The European Commission oversees VLOPs’ compliance with the DSA and has kept an eye on TikTok and other companies. In late 2023, it sent formal requests for information to various VLOPs on different topics (see table below). Such a request can be a first step towards a formal investigation under the DSA but does not have to be. The requests are mostly based on the internal risk assessments that VLOPs submitted to the Commission in August 2023: where the Commission saw open questions, it followed up with the companies.

[Table: formal requests for information sent by the Commission to VLOPs in late 2023]

It is difficult to judge how well the platforms or the Commission have done their jobs because the risk assessments are not public. What did the platforms write about? What did the Commission miss? To answer such questions, researchers, journalists and users have to wait until summaries of the risk assessments are published, which will happen only after the VLOPs have been audited by external auditors.

From TikTok, the Commission wanted to know how the company follows the rules on

  • Risk “mitigation measures against the spreading of illegal content, in particular the spreading of terrorist and violent content and hate speech, as well as the alleged spread of disinformation”
  • The protection of minors generally and the assessment of risks to minors in particular, with an emphasis on mental and physical health
  • Providing access to publicly available data

Whatever the specifics, the requests show the Commission was not satisfied with the quantitative or qualitative data it received in TikTok’s risk assessment. Yet at least the risk assessment was filed and the communication between regulator and company works. With the DSA in its early enforcement stages, this is the minimum of what needs to happen. It would have been surprising if everything at TikTok – from recommender system transparency to content moderation reporting to risk assessments – had fallen into place immediately and the Commission had not had any questions. It seems as if TikTok and other companies are already doing their DSA homework, but some improvements might still be necessary. With every iteration of the risk assessment cycle, the Commission and especially external observers should demand and check on these improvements.

Such outside checks on TikTok and other VLOPs are particularly important because companies will seek ways to get out of DSA obligations.

Lawsuit against the Commission

TikTok filed a lawsuit against the Commission, albeit not one about substantive DSA due diligence obligations. The company argues that the supervisory fees tied to its designation as a VLOP are calculated unfairly. TikTok and Meta, which also sued the Commission over this, are unhappy that they have to pay millions in supervisory fees per year while platforms with smaller or no profits contribute nothing. The fees help finance the Commission’s platform oversight structure.

The Commission maintains that its methodology for calculating the fee is sound; it will now be up to the EU’s General Court to decide whether that is the case. It is not known how much TikTok is supposed to pay (Meta said its payment obligation for 2024 was 11 million euros; the company had a net income of 14 billion US dollars in the last three months of 2023).
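To see what is at stake in the methodology dispute, a rough sketch helps: under Article 43 DSA, each provider’s annual fee is capped at 0.05 percent of its worldwide annual net income in the preceding financial year, and the Commission’s delegated rules apportion its estimated supervisory costs in proportion to average monthly active users. The Python sketch below illustrates this logic with entirely hypothetical platforms and figures; it simplifies the actual methodology (for instance, it ignores how amounts cut off by the cap are treated).

# Illustrative, simplified sketch of the DSA supervisory fee logic
# (Art. 43 DSA and its delegated regulation). All platforms and
# figures below are hypothetical, not real Commission data.

def apportion_fees(total_costs, providers):
    """Split the Commission's estimated supervisory costs across VLOPs
    in proportion to average monthly active users, capping each fee at
    0.05% of worldwide annual net income (zero for loss-making providers)."""
    total_users = sum(p["users"] for p in providers)
    fees = {}
    for p in providers:
        share = total_costs * p["users"] / total_users  # user-based share
        cap = max(p["net_income"], 0) * 0.0005          # 0.05% income cap
        fees[p["name"]] = min(share, cap)
    return fees

# Hypothetical example: three platforms sharing 45 million euros of costs
providers = [
    {"name": "Platform A", "users": 250e6, "net_income": 40e9},
    {"name": "Platform B", "users": 100e6, "net_income": 2e9},
    {"name": "Platform C", "users": 50e6, "net_income": -1e9},  # loss-making
]
print(apportion_fees(45e6, providers))
# Platform C pays nothing despite its user base; Platform B's fee is
# squeezed by the income cap - the asymmetry TikTok and Meta object to.

In this toy example, the loss-making platform contributes nothing no matter how many users it has – precisely the kind of outcome the plaintiffs consider unfair and the court will have to assess.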

Taken together, the compliance successes and struggles, the information requests and the lawsuit offer a murky picture of early DSA enforcement: key questions around the law remain open or will be challenged in court over the next couple of years. Apart from companies, regulators and courts, it will be up to media, civil society and researchers to center users’ rights throughout these processes.