Misinformation: Facebook strengthens moderation in its groups


The platform had already given more tools to group managers to better moderate content.

Ahead of the US midterm elections, Facebook is improving moderation to fight misinformation in groups.

Meta on Thursday added a tool that allows Facebook group administrators to automatically filter content flagged as fake news. The change comes as the US midterm elections approach, a period conducive to waves of misinformation on the social network.

"To ensure content is more trustworthy […], group admins can automatically hold messages that contain information determined to be false by third-party verification so that they can be reviewed," Facebook application director Tom Alison said in a statement.

The platform had already given group administrators more tools to moderate content. It nevertheless remains accused by many non-governmental organizations and by authorities of not doing enough to combat disinformation.

More than 1.8 billion people use Facebook Groups every month. Parents of students, fans of artists and residents of the same neighborhood meet there to exchange news and organize activities, but also to discuss politics.

Meta has been criticized for not sufficiently monitoring groups that have contributed to the political radicalization of certain people, especially during the 2020 US elections.

AFP participates in third-party fact-checking in around 30 countries, a program developed by Facebook in 2016. Around 60 general and specialized media outlets around the world are also part of the program.
If content is deemed false or misleading by one of these outlets, Facebook users are less likely to see it appear in their News Feed. And if they see it or try to share it, Facebook suggests that they read the fact-checking article.

With information from Agence France-Presse
