about us
We are independent researchers (Kathy, Martin, Anna, Alex) and Greta, at interface (previously “Stiftung Neue Verantwortung”), a non-profit think tank on information technology and public policy. Learn more about us here.
about the project
Recommender systems play an increasingly important role in the lives of many people. Whether we use a search engine or a social media platform, algorithms, artificial intelligence and statistical models determine what, when and how content is presented to users. Despite the importance of these systems for our daily lives, and for how we inform ourselves and communicate with each other, their design is difficult for users, politicians, researchers and civil society to understand.
Whether certain content is systematically disadvantaged or favoured, whether recommender systems amplify hate and disinformation, and how user behaviour, algorithms and platform design intertwine are highly relevant questions. It has been shown that platforms can contribute to serious harm to individuals, groups and societies. Studies have suggested that these negative impacts range from worsening an individual’s mental health to driving society-wide polarisation capable of putting democracies at risk.
To better safeguard people from these harms, the European Union’s Digital Services Act (DSA) requires platforms, especially those with large numbers of users, to make their algorithmic systems more transparent and follow due diligence obligations. However, the DSA lacks concrete guidelines. To fill this gap, we propose a risk-scenario-based audit process (RSBA process).
about this website
Our project is dedicated to studying the impact of social media platforms by closely looking into their recommender systems. Within the framework of the project (the RSBA process), we have decided to concentrate on TikTok, given that it impacts billions of users around the world. Our approach considers the evolving nature of platforms and emphasises the observability of their recommender systems’ components. We have found and learned many interesting things along the way.
On this website, through our blog, we document our journey of exploring, testing and auditing TikTok’s recommendation algorithms. Instead of writing long policy papers, we use this blog as the main space to share our analyses as they emerge with interested stakeholders and the general public.
Make sure to check out our news section to keep up with what we have been up to! You can also follow updates on Mastodon or via RSS.