Meta Updates Moderation Policy

18.02.2025

Meta is preparing significant changes to its moderation policies. CEO Mark Zuckerberg announced that the changes are aimed at "fewer false account closures and content removals." However, they also raise concerns that hate speech and misleading content will become more prevalent on the platform.

Under the new approach, Meta plans to replace its third-party fact-checking program with a community-driven moderation model. While this model aims to provide more context to counter misinformation, shifting control to users raises concerns about increased harassment and manipulative behavior on the platform. Looser moderation also risks the spread of harmful content, especially hate speech.

The new model also raises serious questions about brand safety on the platform. Advertising that appears alongside hate speech or other harmful content can threaten a brand's reputation. When a similar model was introduced on X, for example, many brands withdrew their advertising and the company lost significant value.