Meta Ends Fact-Checking Program, Introduces Community-Based System Similar to X

Meta has decided to discontinue its U.S. fact-checking program, replacing it with a community-based system similar to X's Community Notes, while easing restrictions on sensitive topics like immigration and gender identity.

This shift represents a significant reversal for Meta: CEO Mark Zuckerberg had previously championed active content moderation, even as the company faced conservative criticism over alleged censorship on its platforms. The change follows recent appointments, including Joel Kaplan as global affairs head and Dana White, UFC CEO and a close ally of President-elect Donald Trump, to Meta's board.

Zuckerberg said in a video, “We’ve reached a point where there have been too many mistakes and too much censorship. It’s time to return to our roots of free expression.” He added that Meta would focus on reducing mistakes, simplifying its policies, and restoring free expression across its platforms, with content filters tuned to require greater confidence before posts are removed.

The termination of the fact-checking program, which began in 2016, surprised several partner organizations. Jesse Stiller, managing editor at Check Your Fact, commented, “We didn’t know this move was happening, and it comes as a shock. This will definitely affect us.”

Meta’s independent Oversight Board welcomed the change. The updates will impact Facebook, Instagram, and Threads, which collectively have over 3 billion users worldwide.

Recently, Zuckerberg has expressed regret over some past content moderation decisions, particularly those concerning COVID-19. Meta also donated $1 million to Trump’s inaugural fund, a departure from its past practice.

Critics, like Ross Burley from the Centre for Information Resilience, warned that this change could be a step back for content moderation, particularly as misinformation and harmful content continue to evolve rapidly. In his view, the shift appears driven more by political appeasement than by sound policy.

Meta’s new Community Notes system, set to roll out in the U.S. over the next few months, lets users flag and annotate misleading posts rather than relying on independent fact-checking organizations. Unlike the previous system, Meta itself will not decide which Community Notes appear on posts. Elon Musk’s X, which runs a similar system, is currently under investigation by the European Commission over its handling of illegal content, and the effectiveness of its “Community Notes” remains under scrutiny.
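Meta has not said how it will rank notes. X, however, has open-sourced its Community Notes ranker, which uses a bridging-based matrix factorization: each rating is modeled as a rater effect, a note effect, and a viewpoint interaction, and a note is shown only when its learned intercept (helpfulness after controlling for raters' viewpoints) clears a threshold. The sketch below is a minimal, hypothetical illustration of that idea, assuming Meta adopts something similar; the toy ratings, hyperparameters, and the 0.4 cutoff are illustrative, not Meta's or X's actual values.

```python
import numpy as np

# Hypothetical toy data: rows = raters, cols = notes;
# 1 = "helpful", 0 = "not helpful", np.nan = no rating.
R = np.array([
    [1.0, 0.0, 1.0, np.nan],
    [1.0, np.nan, 1.0, 0.0],
    [np.nan, 1.0, 1.0, 0.0],
    [0.0, 1.0, 1.0, np.nan],
])

n_raters, n_notes = R.shape
rng = np.random.default_rng(0)

# Model: rating ~ mu + rater_bias + note_bias + rater_factor * note_factor.
# The factor term absorbs viewpoint-driven agreement, so a note's bias
# ("intercept") reflects helpfulness across differing viewpoints.
mu = 0.0
rater_bias = np.zeros(n_raters)
note_bias = np.zeros(n_notes)
rater_f = rng.normal(0, 0.1, n_raters)
note_f = rng.normal(0, 0.1, n_notes)

lr, reg = 0.05, 0.03          # learning rate and regularization (illustrative)
mask = ~np.isnan(R)

# Plain stochastic gradient descent on the observed cells.
for _ in range(2000):
    for i in range(n_raters):
        for j in range(n_notes):
            if not mask[i, j]:
                continue
            pred = mu + rater_bias[i] + note_bias[j] + rater_f[i] * note_f[j]
            err = R[i, j] - pred
            mu += lr * err
            rater_bias[i] += lr * (err - reg * rater_bias[i])
            note_bias[j] += lr * (err - reg * note_bias[j])
            rater_f[i], note_f[j] = (
                rater_f[i] + lr * (err * note_f[j] - reg * rater_f[i]),
                note_f[j] + lr * (err * rater_f[i] - reg * note_f[j]),
            )

# A note is surfaced only if its intercept clears a threshold; the 0.4
# cutoff here is a placeholder, not a documented parameter.
for j, b in enumerate(note_bias):
    status = "show" if b > 0.4 else "needs more ratings"
    print(f"note {j}: intercept={b:+.2f} -> {status}")
```

In this toy example, a note rated helpful by raters on both sides of the viewpoint split ends up with a high intercept, while notes that only one "camp" endorses are explained by the interaction term and stay hidden, which is the bridging property such systems aim for.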

Meta will also move its trust and safety teams, responsible for overseeing content policies, out of California to Texas and other U.S. locations. The company said its automated systems will now focus on illegal and high-severity violations, such as terrorism and drug-related content.
