Mark Zuckerberg has recently expressed admiration for Elon Musk’s Community Notes system, acknowledging its effectiveness in tackling misinformation. During a Meta earnings call, Zuckerberg admitted that X’s crowdsourced approach to content moderation has outperformed Meta’s previous reliance on third-party fact-checkers.
Despite earlier disputes, Zuckerberg and Musk seem to share common ground when it comes to fact-checking. The Meta CEO justified his company's move towards a similar collaborative design, explaining that X's approach, where users add context to content, has proven more reliable than traditional fact-checking techniques.
Zuckerberg said that when a rival creates a superior system, Meta has a responsibility to adopt and improve on it. He believes integrating a model like this will enable better moderation across Meta's platforms.
Their previous conflicts, including a proposed cage fight in 2023, make the two tech moguls' agreement on this matter all the more unexpected.
This is the second time in recent weeks that Zuckerberg has publicly acknowledged X's success in content moderation. In a video published on 7 January, he announced that Meta would be phasing out traditional fact-checkers in favour of a Community Notes-style system, mirroring Musk's approach on X.
Musk said on X that he thought it was "cool" and that he was pleased with the decision.
Joel Kaplan, Meta's chief global affairs officer, echoed Zuckerberg's sentiments in a blog post, highlighting X's ability to empower its community to identify and contextualise potentially misleading content. Kaplan said that involving a wide range of users in selecting the most effective explanations lessens bias in moderation decisions.
Regarding the upcoming changes at Meta, Kaplan explained that the new system would require the participation of users with diverse backgrounds and viewpoints to ensure fairness, a practice that closely resembles X's model.
Why Did Mark Zuckerberg Change His Mind?
Mark Zuckerberg's stance on fact-checking has undergone a significant transformation, perhaps influenced by Donald Trump's political resurgence. In the past, Zuckerberg positioned Meta as a strong advocate for third-party fact-checking, partnering with external organisations to monitor misinformation on the platform. However, after Trump's electoral comeback, Zuckerberg appeared to shift his approach, recognising that traditional fact-checking had become a polarising issue, particularly in the US. Facing increasing criticism from conservatives who accused Meta of censorship and bias, Zuckerberg gradually distanced himself from fact-checkers, favouring a more decentralised, crowdsourced model akin to Elon Musk's Community Notes. This change reduces Meta's direct role in moderating political content while serving a broader strategic effort to appeal to a wider audience.
The Shifting Landscape of Fact-Checking
Over the past decade, fact-checking has become an increasingly controversial practice, evolving into a powerful industry. The backlash against Meta’s decision to abandon traditional fact-checking in favour of Community Notes has been swift, but not unexpected.
Angie Drobnic Holan, director of the International Fact-Checking Network (IFCN) at Poynter, warned that the change could harm Meta's users by allowing more misinformation to spread unchecked. She argued that conventional fact-checking has helped curb the spread of fabricated claims and conspiracy theories.
Neil Brown, president of the Poynter Institute, defended fact-checkers, insisting that they operate without bias. However, a headline in The New York Times, "Meta Says Fact-Checkers Were the Problem. Fact-Checkers Rule That False," ironically illustrated the very criticisms aimed at traditional fact-checking methods.
Who Oversees the Moderators?
In the United States, where critics claim that many mainstream media fact-checkers have leaned toward supporting established narratives, the debate has become particularly heated. The question remains: Who watches the watchdogs?
During the COVID-19 pandemic, mainstream fact-checkers dismissed the lab-leak theory as a baseless conspiracy, only for US government agencies like the Department of Energy and the FBI to later deem it a plausible explanation. Similar backtracking occurred on issues like the potential for vaccinated people to spread the virus, natural immunity, and mask efficacy.
The Future of Online Moderation
Traditional fact-checking was once intended to stop misinformation, but critics claim it eventually came to reflect mainstream media biases. This has resulted in intense scrutiny of Republicans in the US, while Democratic politicians have frequently been given more lenient treatment. For example, Vice President Kamala Harris's past relationship with politician Willie Brown was once 'fact-checked' as merely needing 'more context,' despite his role in appointing her to influential positions.
Ultimately, the responsibility of discerning truth will always rest with individuals. As Zuckerberg acknowledged: "Fact-checkers have been too politically biased and have eroded more trust than they have built, particularly in the US."
The ongoing debate over fact-checking highlights a wider trend: social media is shifting away from traditional media’s influence over narratives to a model where users can take a more active role in determining what is factual. While this shift will likely have its challenges, for now, Zuckerberg seems convinced that Musk’s Community Notes model is a step in the right direction.