
The collapse of democratic information environments – insights from a research visit in Amsterdam

What do user-driven content moderation, transparency reporting and information environments have in common?

By Ulrikke Dybdal Sørensen, Industrial PhD student at the Centre for Social Media, Tech and Democracy, Ministry of Digital Affairs – uds@digmin.dk. 

What’s happening in Amsterdam?  

Amsterdam is a research hub with strong knowledge environments and established research groups. I visited three of them at the University of Amsterdam (UvA): the Digital Methods Initiative (DMI), the Department of Media Studies and the Department of Communication Science. I spent most of my time at the DMI summer school, while also engaging in discussions with leading researchers at the two departments. In this post, I summarise my findings.

The summer school was part of DMI, which develops methods for internet-related research and is one of Europe’s leading research groups in the field of internet studies. Both the initiative and the summer school are led by Richard Rogers, professor of media and digital culture at the University of Amsterdam. The summer school consisted of two weeks of intense data sprints in which empirical questions were explored in a variety of projects. This year, the focus was on social media, AI platforms and developments in content moderation and fact-checking.

The second week of the programme kicked off with a keynote from Tom Willaert, a researcher at Vrije Universiteit Brussel and part of the European Horizon project SoMe4Dem. He presented his work on social media and the public sphere, echoing the diagnosis that digital media can have negative impacts on democracy. Building on this, he presented an empirically informed framework for a more systematic approach to studying how social media affects the public sphere. The framework models the deliberative public sphere through four core functions: information, deliberation, collective action and collective actors. It links these functions to the online public sphere on social media through the concept of affordances, that is, the action possibilities and artefacts that encourage certain behaviour or enable certain usage, e.g. on a digital platform.

In addition to addressing the need for more systematic research in this area, the framework contributes more precisely defined concepts that can be explored empirically on social media. Future work lies in substantiating and strengthening the framework with empirical studies, which in turn requires methodological development and further operationalisation of its core concepts so that the impact of social media on democracy can be critically examined.

NoteNotNeeded  

Community Notes on X is a user-driven or crowdsourced moderation system where users can add notes to posts to provide context or point out misleading content. The notes are reviewed by other users and are only published and displayed under the post if the note is rated ‘helpful’ by enough users with different points of view. In other words, Community Notes is designed to achieve consensus based on the ‘Diversity of Perspectives’. This is calculated using a so-called bridge-building algorithm that assesses ‘different perspectives’ based on how people have previously rated notes. 
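To make the mechanism concrete, below is a minimal sketch of the matrix-factorisation idea behind bridging, fitted to toy polarised ratings. The one-dimensional factor, learning rate and toy data are illustrative assumptions; the 0.4 intercept threshold mirrors X’s published documentation, but this is a simplification of the open-sourced scorer, not the production system.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_notes = 50, 10

# ratings[u, n] = 1 (helpful), 0 (not helpful), NaN (not rated);
# toy data: two "camps" whose members rate notes along camp lines
ratings = np.full((n_users, n_notes), np.nan)
for u in range(n_users):
    camp = 1 if u < n_users // 2 else -1
    for n in rng.choice(n_notes, size=4, replace=False):
        ratings[u, n] = 1.0 if camp * (1 if n % 2 else -1) > 0 else 0.0

# model: rating ~ mu + user_bias + note_bias + user_factor * note_factor
mu = 0.0
user_b, note_b = np.zeros(n_users), np.zeros(n_notes)
user_f = rng.normal(0, 0.1, n_users)
note_f = rng.normal(0, 0.1, n_notes)

lr, reg = 0.05, 0.03
obs = np.argwhere(~np.isnan(ratings))
for _ in range(1000):  # plain SGD with L2 regularisation
    for u, n in obs:
        err = ratings[u, n] - (mu + user_b[u] + note_b[n] + user_f[u] * note_f[n])
        mu += lr * err
        user_b[u] += lr * (err - reg * user_b[u])
        note_b[n] += lr * (err - reg * note_b[n])
        user_f[u], note_f[n] = (user_f[u] + lr * (err * note_f[n] - reg * user_f[u]),
                                note_f[n] + lr * (err * user_f[u] - reg * note_f[n]))

# The factor term absorbs camp-aligned agreement, so the note intercept
# only stays high when users on *both* sides rate the note helpful.
helpful = [n for n in range(n_notes) if note_b[n] >= 0.4]
print("notes rated helpful across perspectives:", helpful)
```

With purely polarised toy ratings the printed list comes out empty: every note’s apparent helpfulness is explained by camp membership rather than cross-perspective agreement, which is precisely the ‘dissensus’ condition discussed below.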

Recently, there has been a shift in content moderation towards user-driven methods over fact-checking. This was evident, in particular, in Meta’s announcement in early 2025 that it would discontinue its fact-checking programme in favour of Community Notes. But how well does the Community Notes approach work? How effective is it as a tool for providing context to a post or pointing out misinformation? The questions are particularly pressing given the European Commission’s formal proceedings against X, which is being investigated for, among other things, the effectiveness of the Community Notes system.

Together with Richard Rogers and a group of international researchers, I explored these topics during the first week of the summer school. We looked at whether Community Notes build dissensus or consensus, and how effective the system is at doing that. We examined effectiveness from three angles: How much consensus (or dissensus) is created? How quickly is it created? And do a few authors do most of the work? 

We found that 80% of the notes are written by fewer than 27% of all note contributors, that it takes an average of 4 days for notes to be published, if they are published at all, and that over 88% of the notes are in what we defined as ‘dissensus’, a ‘state of non-agreement’. This means that the vast majority of notes created are never published – either because they are not considered helpful by users with different points of view, or because they are never reviewed.
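For readers who want to probe these measures themselves, a sketch of how they could be computed from X’s public Community Notes data downloads follows. The file and column names (‘notes.csv’, ‘noteAuthorId’, ‘createdAt’, ‘firstShownAt’, ‘status’) are illustrative stand-ins, not the exact export schema.

```python
import pandas as pd

notes = pd.read_csv("notes.csv")  # hypothetical export of all notes

# 1) Authorship concentration: what share of contributors writes 80% of notes?
counts = notes["noteAuthorId"].value_counts()          # sorted descending
top = (counts.cumsum() / counts.sum() <= 0.80).sum()   # authors covering 80%
print(f"{top / len(counts):.0%} of contributors write 80% of the notes")

# 2) Speed: days from creation to first display, for notes ever shown
shown = notes.dropna(subset=["firstShownAt"])
delay = (pd.to_datetime(shown["firstShownAt"])
         - pd.to_datetime(shown["createdAt"])).dt.days
print(f"average days to publication: {delay.mean():.1f}")

# 3) Dissensus: share of notes that never reach 'helpful' status
dissensus = (notes["status"] != "CURRENTLY_RATED_HELPFUL").mean()
print(f"{dissensus:.0%} of notes are in a state of non-agreement")
```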

The week after the data sprint, the Digital Democracy Institute of the Americas (DDIA) published a new study, A Deep Dive into X’s Community Notes. It shows that over 90% of notes are never published and emphasises that this is a worrying statistic for a system otherwise promoted as effective in the fight against manipulative information. When the system appears to create a state of non-agreement rather than consensus, it indicates that the system is not achieving its purpose.

We sought to understand what drives this by identifying the posts with the most notes and the most-rated notes. None of the notes with the most ratings had reached consensus, even though they had over 2,000 ratings each, well above the 5-rating threshold set by X. The majority of these ratings were categorised as ‘NoteNotNeeded’, a rating category introduced by X in 2021 to indicate that a note is not considered necessary. This introduces a form of non-algorithmic engagement into Community Notes, whereby the assessment becomes whether there is disagreement about a note rather than whether a post is misleading or lacks context.
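A sketch of that drill-down, assuming a ratings export with one row per rating and a binary tag column modelled on the public data (the exact names may differ):

```python
import pandas as pd

ratings = pd.read_csv("ratings.csv")  # hypothetical export, one row per rating

per_note = ratings.groupby("noteId").agg(
    n_ratings=("raterId", "size"),
    nnn_share=("notHelpfulNoteNotNeeded", "mean"),  # share tagged NNN
)
# rank by rating volume, then flag notes where 'NoteNotNeeded' dominates
most_rated = per_note.sort_values("n_ratings", ascending=False).head(20)
print(most_rated[most_rated["nnn_share"] > 0.5])
```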

NoteNotNeeded also appears in the summaries note authors write when creating a note. Here it appears as ‘NNN’ and is used not only to indicate that a note is unnecessary because the post is a clear opinion, statement, satire, joke or rumour; ‘NNN’ is also used to comment on other notes or to address other note authors. Both forms of ‘NNN’ indicate that, instead of assessing a note, users are introducing a form of debate into the system that is closer to a deliberative form of consensus than the more ‘manufactured’ consensus on which Community Notes is built.

Transparency Washing 

Almost in parallel with the European Commission launching the harmonisation of transparency reporting under the Digital Services Act (DSA), we explored transparency as a principle and as a practice during the second week of the summer school. Using TikTok as a case study – a designated ‘Very Large Online Platform’ (VLOP) – we examined the extent to which transparency reporting fulfils its democratic purpose.

For transparency as a principle, democratic control of very large online platforms requires that the public and individual users are given insight into the platforms’ machinery in a transparent and uniform manner. We therefore mapped out the regulatory framework the DSA establishes for transparency, which aims to hold actors accountable for their societal impact by requiring them to operate in ways that can be monitored and evaluated by public authorities and other entities, such as researchers and civil society.

However, it remains an open question whether current practices actually promote meaningful transparency. Several factors indicate that there are challenges with the way platforms choose to fulfil this task: the types of data that are published and how they align with different transparency objectives; significant differences in the level of detail, consistency and standardisation of information across platforms; and a lack of clear and precise descriptions of how the platforms moderate content.

For transparency as a practice, then, it’s interesting to examine how effective TikTok’s transparency reporting and content moderation are in relation to the transparency requirements in the DSA. In other words, is it possible to check whether TikTok is doing what they say they’re doing? 

We examined this from different angles: the rules themselves; how information about DSA implementation and enforcement is communicated by the European Commission, by TikTok and in public debate; and whether the information in TikTok’s transparency reports corresponds to the data and information available through its Transparency Centre.
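As a toy illustration of the last angle, the cross-check reduces to comparing figures a platform reports with figures it publishes elsewhere. Every metric name and number below is an invented placeholder; the point is the verification pattern, not the data.

```python
# figures claimed in a (hypothetical) DSA transparency report
report = {"content_removals": 1_000_000, "automated_share": 0.80}
# figures published in a (hypothetical) Transparency Centre
centre = {"content_removals": 850_000}

for metric, reported in report.items():
    published = centre.get(metric)
    if published is None:
        print(f"{metric}: not verifiable, no public counterpart")
    elif abs(published - reported) / reported > 0.05:  # 5% tolerance
        print(f"{metric}: mismatch (report {reported}, centre {published})")
    else:
        print(f"{metric}: consistent")
```

Run on these placeholders, one metric mismatches and the other cannot be checked at all, which is the gap between reporting and verifiability discussed next.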

Information about transparency, whether you are trying to learn more about the EU rules or about TikTok’s compliance, can be complicated and difficult to navigate, yet there is considerable overlap in the language and wording used. In effect, TikTok ‘mimics compliance’ by repeating regulatory language while failing to provide the technical details needed to assess whether it is doing what it says it is doing.

However, TikTok is not alone in being criticised for failing to comply with fundamental DSA provisions on transparency and accountability for the societal impact of the platforms. The issue of data access is not limited to an academic debate focused on, for example, tracking how data access is implemented (or not). It is also a political debate in which data access is highlighted as fundamental to ensuring democratic control over the platforms.

At the same time, our analysis of the public debate showed that it is not only the platforms that are criticised for unwillingness to comply with transparency requirements and principles, but also the DSA as a regulatory framework. Critics point to what they see as unreasonable requirements for content moderation, which could lead to over-moderation and censorship; the EU Commission’s lack of transparency in its own processes; and the disproportionate administrative burdens associated with the DSA.

As long as transparency reporting risks becoming a desk exercise, it remains difficult to ensure that the very large online platforms live up to the role they play in society and in key democratic processes. This raises the question of whether transparency as a practice is more a matter of ‘transparency washing’ than of living up to the democratic principle of transparency.

That said, with the harmonised guidelines for transparency reports, the EU has introduced common, comparable methods for assessing how platforms comply with transparency requirements, making it possible to compare compliance across platforms. At the beginning of July, the EU also introduced the delegated act on data access, which is intended to strengthen the position of researchers and clarify the procedure for requesting access to data that is not publicly available, thereby supporting the ability to verify whether platforms are doing what they claim to be doing.

Concerns for democracy  

In addition to exploring empirical questions about social media in relation to digital governance and democracy, I also had conversations during my research stay with two leading UvA scholars in the fields of platform studies and media and communication research. We discussed how the platformisation of information and communication has changed the conditions for democratic society and what the consequences are: how it affects the quality of information and democratic discourse, and whether we can even talk about quality given what our information flows are filled with and the increasing role of ‘infotainment’.

Keynote speaker Willaert’s report highlights a growing convergence between the political sphere and the entertainment world, especially on social media. As mentioned, this was a focal point of my conversations, which explored how the approach to facts and quality content now has a completely different tone from what we know from traditional media and journalistic principles. When people increasingly question what reliable sources and quality content are, the conditions change: fundamental values associated with being an ‘informed citizen’ are challenged as apathy towards being informed grows while greater focus and value are placed on being entertained. In line with Neil Postman’s idea, we are ‘amusing ourselves to death’. Or at least we are part of a development in which people are not committed to staying informed about important social issues, at least not in the way our media reality and democratic information society presuppose.

Beyond the breakdown of the informed democratic public, digital information culture and platform ownership raise concerns of their own: not simply in terms of how they affect participation and engagement in a democratic information society, but also in relation to the concentration of power and how it positions marginalised groups. Many of the challenges we face are not new, however; they have accelerated and reached heights that make them difficult to navigate and manage, especially when the very foundations of action, and thus democracy, are being called into question.