ECNETNews has reported that Meta is implementing significant changes to its content moderation policies, including the discontinuation of its third-party fact-checking program in the United States. The company is now transitioning to a Community Notes model, aiming to enhance free expression while ensuring transparency on its platforms, including Facebook, Instagram, and Threads.
Meta Introduces Community Notes Model
The third-party fact-checking program, which began in 2016, has faced ongoing criticism for perceived bias and overreach. Meta has acknowledged that the program often led to the unintended suppression of legitimate political discourse.
The new Community Notes system will allow users to add context to posts identified as potentially misleading. Contributors from a range of perspectives will write and rate these notes collaboratively. Meta has emphasized that it will not be involved in writing or selecting the notes that appear on its platforms.
“Once the program is up and running, Meta won’t write Community Notes or choose which ones appear,” said a Meta spokesperson. The rollout of this program is set to begin in the U.S. over the coming months.
Enhancing Freedom of Speech
In addition, Meta is lifting restrictions on several discussion topics, including immigration and gender identity, that are frequent subjects of political debate. The company admitted that its previous content moderation practices were excessively restrictive, resulting in the erroneous removal of content and frustration among users.
In December 2024 alone, Meta reported removing millions of pieces of content daily, with estimates suggesting that 10-20% of these removals were errors. To reduce such mistakes, Meta will focus its automated systems on high-severity violations, such as terrorism and fraud, while relying more on user reports for less critical issues.
“We are in the process of eliminating most content demotions, requiring greater assurance that the content in question violates our policies,” the spokesperson stated.
Improvements to Enforcement and Appeals Process
Meta is also revising its enforcement processes to minimize errors. Under the new protocols, multiple reviewers must agree before content is removed, and large language models (LLMs) will provide second opinions on enforcement decisions.
To enhance the account recovery experience, Meta is testing facial recognition technology and expanding its support teams to streamline the appeals process.
Personalized Political Content Delivery
Furthermore, Meta plans to reintroduce political and civic content into user feeds through a more personalized approach. The company acknowledged that its previous strategy, which reduced such content across the board in response to user feedback, was overly broad.
Meta will now rank political content from accounts that users follow using explicit signals, such as likes, alongside implicit signals, such as how long a post is viewed. Users will also gain more control over how much political content appears in their feeds.