Meta has announced that, starting March 18, its Community Notes program will begin operating in the U.S. on Facebook, Instagram, and Threads. It will launch in test mode: a select group of users will be able to propose notes, but these will not be publicly displayed for now.
As of that date, Meta’s independent fact-checking program in the U.S. will officially end. Under this program, Meta collaborated with independent fact-checkers who could label false content, or content that needed context, on Meta’s platforms, displaying warnings to users. Only fact-checkers accredited by the International Fact-Checking Network (IFCN) or the European Fact-Checking Standards Network (EFCSN) could participate. These organizations ensure that fact-checkers adhere to international standards of editorial quality, impartiality, and transparency.
According to Meta, its Community Notes will use the same visibility algorithm as X (formerly Twitter): a note only becomes visible when users who typically disagree with each other rate it as helpful. On X, this approach has led to situations like the floods in Valencia, where over 90% of the hoaxes debunked by Maldita.es had no visible note despite accumulating 50 million impressions. During the last European elections, that figure was 85%.
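The cross-ideological requirement described above can be sketched as a simple rule: a note is shown only if raters from opposing viewpoint clusters both find it helpful. The function below is an illustrative toy, not Meta’s or X’s actual implementation; the two-cluster split, the minimum-rater count, and the 0.7 helpfulness threshold are all assumptions made up for this sketch (X’s real system infers rater viewpoints with a matrix-factorization model rather than taking them as input).

```python
def note_is_visible(ratings, min_per_side=2, threshold=0.7):
    """Toy bridging rule: ratings is a list of (viewpoint, helpful) pairs,
    where viewpoint is "A" or "B" (two assumed opposing clusters) and
    helpful is a bool. The note is visible only if BOTH clusters supplied
    at least min_per_side ratings and each cluster's helpful rate meets
    the threshold."""
    sides = {"A": [], "B": []}
    for viewpoint, helpful in ratings:
        sides[viewpoint].append(helpful)
    for votes in sides.values():
        # Too few raters from one cluster, or too little agreement
        # within it, keeps the note hidden.
        if len(votes) < min_per_side:
            return False
        if sum(votes) / len(votes) < threshold:
            return False
    return True
```

This all-sides-must-agree design is what the article's statistics point at: on polarized topics, one cluster rarely rates a debunking note as helpful, so the note stays hidden even when it is accurate.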
Community Notes can work, but not like this
Two months ago, Maldita explained why we like the idea of community notes but also why we believe X's model has significant issues. We also provided recommendations for an effective community notes system. Now, we see that Meta’s new program repeats many of Elon Musk’s mistakes:
In its announcement, Meta does not clarify key aspects of how Community Notes will work. For example, it does not say whether there will be an appeal process for users who believe they received an unfair note, something that existed in the independent fact-checking program.
The end of independent fact-checking on Meta
In justifying its decision, Meta claims it expects Community Notes to be "less biased than the independent fact-checking program they replace." The reality, however, is that Meta never criticized its collaboration with fact-checkers, publicly or privately, until Donald Trump won the November 2024 election. Just months before, the company was still defending the program's effectiveness, as it had for years. In its own transparency reports, Meta stated that fact-checkers had an error rate of 3.15%, while for other content moderation decisions made by Meta itself the figure reached nearly 90%.
By announcing the end of its fact-checking program in the U.S. and the launch of Community Notes, Meta says its goal is to "eventually implement this new approach for users worldwide." However, for now, it will maintain its collaboration with fact-checkers outside the U.S., including in Spain, where Maldita.es is part of the independent fact-checking program.
In any case, a study by Fundación Maldita.es on X's Community Notes found that fact-checkers are the third most-used source in notes worldwide and that they generate greater trust among users.