A study by Fundación Maldita.es on more than 1,175,000 Community Notes proposed by X users throughout 2024 reveals that 1 in every 27 included a link to an article from a fact-checker accredited by the European Fact-Checking Standards Network (EFCSN) or the International Fact-Checking Network (IFCN). This makes independent fact-checkers the third most cited source in Community Notes, only behind X itself and Wikipedia.
However, as we previously reported on Maldita.es, the issue is not so much a lack of well-founded notes as X's policy for displaying them. The platform only makes visible those notes that achieve consensus among users of "different political ideologies," prioritizing this factor over the accuracy of the note. According to a Fundación Maldita.es study conducted during the last European elections, only 15% of tweets containing electoral disinformation had a visible note. Furthermore, of the 20 most viral false posts that received no action from major digital platforms, 18 were on X, each accumulating more than 1.5 million views.
Despite this, notes citing fact-checking organizations are more likely to be seen. While only 8.3% of all proposed notes on X become visible, the percentage rises to 12% when they include a link to a verification organization and to 15.2% if they come from European fact-checkers.
These notes also become visible faster: they are proposed within 4 hours and 25 minutes of the original tweet's publication (23 minutes earlier than the general average) and become visible 5 hours and 40 minutes after being rated as helpful (24 minutes earlier than usual). In total, notes with fact-checker evidence become visible 90 minutes earlier than other notes.
You can access the full report with complete evidence through this link.
How Community Notes could be more effective
While these findings highlight the importance of fact-checking in Community Notes, they also show that X's model deprives users of valuable information by equating "consensus" with "truth," leaving 85% of notes containing fact-checker evidence unseen. Now that platforms like Meta and YouTube are planning to implement similar programs, it is crucial that they do not repeat X's mistakes and instead collaborate with fact-checking organizations to:
- Prioritize notes with quality sources and expert knowledge over "consensus" among users who often disagree.
- Speed up the appearance of notes in cases of the most viral and dangerous disinformation.
- Prevent organized groups or users with multiple accounts from manipulating the system.
- Take action against users who repeatedly spread falsehoods and receive notes, such as removing the blue verification check or their ability to monetize.
- Ensure the independence of the process and prevent the removal of notes due to external pressure.