
Expert blog: Meta's farewell to fact-checking opens the door to more social media chaos

Dr Tine Munk, senior lecturer in Criminology, on why it is unrealistic to expect social media users to take on the critical and challenging task of monitoring misinformation.

By Helen Breese | Published on 15 January 2025

Categories: Press office; Research; School of Social Sciences

Meta has decided to abandon professional fact-checking and instead rely on users for moderation. While this might sound democratic, many experts call it a disaster. Meta promises "more speech and fewer errors," but the result could be the opposite: more disinformation and misinformation, greater division, and even less trust in social media. This decision comes at a time when social media plays a larger role than ever in shaping public opinion and influencing democratic processes.

Why this is a problem

Meta claims fact-checking is too complex and biased, and that user-driven systems like Community Notes are the solution. However, leaving the responsibility to online users creates confusion, as it is unclear what triggers a warning on social media. Most users lack the time, training, or tools to stop dis-/misinformation. Expecting ordinary people to decide whether a post is accurate, or to understand the process behind it, is unrealistic. For instance, does the absence of a note mean the information is correct? Another issue is the requirement for contributors from different political sides to agree that a post is problematic, which delays action. Misleading posts can spread widely before any intervention, and Community Notes only act as a warning, not a solution.

Echo chambers within social media groups amplify such content, confirming users' biases and filtering out opposing viewpoints. This snowball effect causes false stories to grow and appear credible.

This approach is akin to inviting fake news inside without anyone guarding the door. The result? Social media platforms where false stories spread rapidly, leaving many unable to distinguish fact from fiction. Critics fear that Meta's decision does the opposite of protecting people—it opens the door to more chaos. In 2023, the EU criticised Meta, X (formerly Twitter), and TikTok for failing to address dis-/misinformation during the Israel-Hamas conflict. This led the EU to remind tech giants of their responsibilities under the Digital Services Act.


Meta’s decision and political consequences

Meta’s decision plays into the hands of those who aim to spread false information for strategic gain. For instance, Russia uses such tactics to destabilise countries and create confusion about what is true or false. In war zones like Ukraine, Russian propaganda has already exploited weak moderation to sow division and undermine efforts. Elections are particularly vulnerable to disinformation. The risk is not theoretical.

During the 2024 U.S. presidential campaign, influencers were paid by Russia to promote Trump, while online campaigns sought to undermine Harris's candidacy and spread lies about the sitting government. Russia's Doppelgänger campaign infiltrated well-known media outlets, spreading pro-Russian messages through cloned websites, fake articles, and manipulated content. Generative AI and bots produced fake content and circulated it under hidden affiliations.

Sponsored posts increased the reach of dis-/misinformation while bypassing moderation. Meta’s shift to user-based moderation exacerbates this issue, risking the undermining of free elections and democratic stability.

Timing: Trump in the shadows

Meta’s decision to end fact-checking coincides with a new U.S. presidential administration taking office. This raises suspicions of a political move to avoid offending any political faction—or even to ingratiate themselves with a potential Trump administration. Whistleblower Frances Haugen previously revealed Meta’s willingness to yield to political pressure, especially after the platform's role in dis-/misinformation surrounding the January 6 attack on the U.S. Capitol. Meta’s approach mirrors Elon Musk’s chaotic management of X, where poor moderation and Community Notes have turned the platform into a haven for fake news. The lack of oversight could have dire consequences.

Trump and Greenland

Trump has expressed interest in the U.S. acquiring Canada, Greenland, and the Panama Canal. On social media, these statements have been distorted to suggest Denmark is imperialist, that the U.S. has a legitimate claim to Greenland, and that Greenlanders want to join the U.S. Trump Jr.'s brief visit to Greenland in January fuelled these claims. Although he declared that Greenlanders were favourable to Trump, his impressions do not necessarily reflect the population's sentiment. His comments were repeated, distorted, and used to fuel polarising debates.

Memes and manipulation

Memes and comments exacerbate confusion about truth. Social media humour or manipulated memes take Trump’s statements out of context or add false claims, making it hard for users to distinguish fact from fiction. This mix of easily consumable content and lack of fact-checking creates a perfect storm. Comment threads repeating or amplifying dis-/misinformation worsen the problem. Meta’s decision undermines years of efforts to combat dis-/misinformation. Online groups have responded with deliberately false memes to highlight the absurdity of unregulated content. Some of these memes mimic credible sources like BBC News and place Mark Zuckerberg at the centre of fabricated stories to illustrate the dangers of manipulated content and lack of control.

Time to act

Though the issue starts in the U.S., it affects the entire world because online information transcends borders. Meta’s decision marks a step back in the fight against dis-/misinformation and poses a direct threat to democracy. Politicians, legislators, and society must hold platforms like Meta accountable. Social media companies must take greater responsibility to stop false and absurd stories from spreading to massive online audiences. Stronger control, transparency, and collaboration are needed to protect democracy and public debate. If action is not taken now, we risk a world where truth no longer matters, and fake news prevails—a price we cannot afford to pay.

Written by Dr Tine Munk and originally published by Jyllands-Posten