Meta announced Tuesday that it is abandoning its third-party fact-checking programs on Facebook, Instagram and Threads and replacing its army of paid moderators with a Community Notes model that mimics X’s much-maligned volunteer program, which allows users to publicly flag content they believe to be incorrect or misleading.
In a blog post announcing the news, Joel Kaplan, Meta’s new chief of global affairs, said the decision was made to allow more topics to be openly discussed on the company’s platforms. The change will first affect the company’s moderation in the United States.
“We will allow more expression by lifting restrictions on certain topics that are part of mainstream discourse and focusing our enforcement on illegal and high-severity violations,” Kaplan said, although he did not specify which topics the new rules would cover.
In a video accompanying the blog post, Meta CEO Mark Zuckerberg said the new policies would see more political content return to people’s feeds, as well as posts on other issues that have inflamed the culture wars in the United States in recent years.
“We will simplify our content policies and remove a number of restrictions on topics like immigration and gender that are simply disconnected from mainstream discourse,” Zuckerberg said.
Meta is drastically scaling back fact-checking and doing away with the content moderation policies it put in place following revelations in 2016 about influence operations carried out on its platforms, which were designed to sway elections and, in some cases, promote violence and even genocide.
Ahead of last year’s high-profile elections around the world, Meta was criticized for taking a hands-off approach to moderating content related to those votes.
Echoing comments made by Mark Zuckerberg last year, Kaplan said Meta’s content moderation policies were put in place not to protect users but “in part in response to societal and political pressures to moderate content.”
Kaplan also lambasted expert fact-checkers for their “biases and perspectives,” which he said led to excessive moderation: “Over time, we ended up with too much content being fact-checked that people would understand as legitimate political speech and debate,” Kaplan wrote.
However, WIRED reported last year that dangerous content such as medical misinformation had flourished on the platform, while groups like anti-government militias used Facebook to recruit new members.
Zuckerberg, for his part, accused “traditional media” of forcing Facebook to implement content moderation policies following the 2016 election. “After Trump’s election in 2016, traditional media wrote nonstop about how disinformation posed a threat to democracy,” Zuckerberg said. “We tried, in good faith, to address these concerns without becoming the arbiters of truth, but the fact-checkers have simply been too politically biased and have destroyed more trust than they have created.”
In what he framed as an attempt to eliminate bias, Zuckerberg said Meta’s internal trust and safety team would move from California to Texas, which is also now home to X’s headquarters. “As we work to promote free speech, I think this will help us build trust to do this work in places where there is less concern about the bias of our teams,” Zuckerberg said.