Facebook wrongly removed posts, Oversight Board rules in its first cases

The board considered two posts that were removed on the grounds of hate speech. Facebook must restore one of them, in part because the Oversight Board used a different translation and therefore read the post differently. It concluded that the text was not derogatory or offensive.

The other post, which according to the board Facebook rightly removed as hate speech, concerned the use of a specific Russian word to refer to Azerbaijanis. Language experts consulted by the board confirmed Facebook's reading that the word is inherently offensive.

Criticism of automated removal

One case concerning nudity was exceptional: Facebook had already restored the post. It involved a post with eight photos, part of a campaign to raise awareness of breast cancer symptoms. Female nipples were visible in five of the photos, but not in the other three.

According to Facebook, the board did not need to rule on this case, since the platform had already restored the post. The board saw it differently, pointing out that the post had been removed by Facebook's automated system. "The incorrect removal of this post indicates a lack of proper human oversight, which raises human rights concerns."

The board also noted that Facebook's rules treat male and female nipples differently. "This use of flawed automation disproportionately affects women's freedom of expression."

The fourth case concerned a quote from Nazi propaganda minister Joseph Goebbels, cited by a user who wanted to compare Trump's presidency to the Nazi regime.

Criticism of government policy

The final case relates to a post in which a French user discussed the use of remedies against the coronavirus, including hydroxychloroquine. According to Facebook the post was misinformation; according to the board, it was criticism of government policy.

The board also advised Facebook on a number of points. For example, it recommends giving users more information about why a post was removed, criticizing the current lack of transparency in this area. It also advises Facebook to set clearer rules for handling medical misinformation. The platform is not obliged to follow this advice.
