Meta's Oversight Board May Now Apply Warning Screens to Content

Craig Cortez

2022-10-31


As Facebook and Instagram become home to ever more misinformation, abuse, and other unhealthy forms of communication, Meta keeps taking measures to contain the problem. The latest of these is a new function granted to the Oversight Board, a special body that reviews content decisions: it can now apply warning screens to posts it considers misleading or abusive.

The board itself is not new. Composed of academics, lawyers, and rights experts, it has already been reviewing users’ appeals to remove content deemed offensive, misleading, or otherwise in violation of the guidelines. It has also had a say in changes to site policies: for example, it can reinstate posts if it finds that moderators’ actions violated freedom of speech. Now, though, the board will also be able to mark such content with special warning screens.

The message on the screen will explain what exactly, in the board’s view, is wrong with the content. Posts can be marked as “sensitive” or “disturbing” (one can’t help wondering whether a post will be deemed disturbing merely for containing the words “lack of faith”). After seeing such a screen, it is up to you to view the post in question or skip it.

As an independent body, the Oversight Board is meant to satisfy the demand for unbiased third-party moderation that Facebook users have been so vocal about. It remains to be seen how the board will handle its new powers, but one thing is certain: while hate speech, misinformation, and abuse are impossible to eliminate entirely, they can be reduced with proper moderation, and while misfires are inevitable too, they can also be handled.

Do you hope the Oversight Board will use these new warning screens wisely? What content do you think they will be applied to? Will they help users live a better life in a world where disturbing and sensitive information will inevitably appear? Share your ideas and concerns with us in the comments section!
