Misinformation, a problem that social media has only amplified, is likely to spread even further with Meta’s decision to end fact-checking across its platforms, including Instagram, Facebook, Threads, and WhatsApp.
Announced on Jan. 7, 2025, the change will take effect over the next few months. Meta CEO Mark Zuckerberg justified the decision by emphasizing the need for greater free expression. He argued that the company’s previous system led to excessive censorship and frequent errors, stating that it was time to “go back to our roots.”
The problem is that our “roots” have already been uprooted. According to the Pew Research Center, 64% of American adults believe fake news causes significant confusion about basic facts, and 23% admit to sharing such stories themselves.
As misinformation becomes more widespread, distinguishing fact from fiction is becoming increasingly difficult. A Statista report found that 52% of Americans are only somewhat confident in their ability to recognize fake news, and another 29% are not very confident or not confident at all.
This is especially concerning given that 54% of U.S. adults say they sometimes get their news from social media, according to the Pew Research Center. And one of the top platforms where Americans turn for news? Facebook.
This is where fact-checkers come in: ensuring that the information we see online is accurate. A study conducted by researchers at Penn State’s College of Information Sciences and Technology analyzed 24,000 articles fact-checked by Snopes and PolitiFact. Across all of those articles, the two sites disagreed on only one claim.
This study highlights the consistency and reliability of established fact-checking practices, countering Zuckerberg’s concerns about excessive errors and unfair content takedowns. While mistakes can happen, they are not frequent enough to justify eliminating fact-checking entirely, and they pose far less risk than replacing it with a Community Notes system.
With fact-checking on its way out, Meta needs a replacement, and it plans to implement a Community Notes system similar to the one used on X. Zuckerberg argues that this approach is more effective, citing concerns about bias in traditional fact-checking. However, the same issue exists within Community Notes, where user contributions can be influenced by personal biases.
A study conducted at Cornell University found that the majority of sources cited in X’s Community Notes system come from left-leaning, high-factuality news outlets, highlighting potential biases in community-based fact-checking.
Beyond bias, Community Notes can also amplify unverified claims, as passionate users may prioritize engagement over accuracy. The system risks becoming a battleground of competing opinions rather than a tool for ensuring factual information.
As each successive generation grows more reliant on social media for news, it’s crucial to recognize the risks this shift poses to our ability to stay accurately informed.
*This editorial reflects the views of the Editorial Board and was written by Jessica Li. The Editorial Board voted 7 in agreement, 4 somewhat in agreement, and 2 refrained from voting.