In response to criticism over the use of Messenger in several countries, particularly Myanmar, Facebook has introduced new tools that allow users of the app to report conversations that violate its community standards.
A new tab inside the Messenger app lets users flag messages under a range of categories, including harassment, hate speech and suicide. The report is then escalated for review, Facebook said, after which it can be addressed.

Facebook has introduced new tools to its messaging app, Messenger, finally enabling users to report conversations from within the app itself.

Previously, anybody who wanted to report a conversation had to do so via the tools available on Facebook or the Messenger web portal.

“Providing more granular reporting options in Messenger makes it faster and easier to report things for our Community Operations team to review. They review reports in over 50 languages. This means our community will see issues addressed faster so they can continue to have positive experiences on Messenger,” said Hadi Michel, product manager for Messenger, announcing the new tools.

“We encourage people to use our reporting tools. The more you do, the more you assist us in keeping the Messenger community safe.”

To report a conversation, users tap the name of the person or group they are chatting with and scroll to ‘Something’s Wrong’. From there, they can choose from categories such as harassment, hate speech, or pretending to be someone else. Users can also opt to ignore or block the person they are reporting.
