(Business Insider) - Facebook's new Oversight Board has made its first rulings, overturning four of the five cases it reviewed and delivering policy recommendations to the social media giant.
The tech company's "supreme court," as some have labeled the independent group, was launched in October as an entity to review difficult content moderation decisions carried out by Facebook.
One of the removed posts that the board voted to reinstate was an October 2020 post in which a Myanmar user said, in Burmese, "[there is] something wrong with Muslims psychologically." It was originally removed over hate speech violations, but the Board overturned the removal because it believed the post was taken out of context and would be more accurately translated as "[t]hose male Muslims have something wrong in their mindset."
The post was referring to what the user perceived as a lack of action taken in response to the treatment of Uyghur Muslims in China.
Another case involved a post that Facebook removed in October 2020. A Brazilian user posted an Instagram photo showing breast cancer symptoms, which included nudity. Facebook's removal of the post was automated rather than carried out by a human, and the Board overturned it because the removal "indicates the lack of proper human oversight which raises human rights concerns."
In one case, the Board sided with Facebook: the company removed a November 2020 post that it deemed as containing a derogatory slur for Azerbaijanis, and the Board agreed that the post "was meant to dehumanize its target" and upheld its decision. You can read more about these cases, as well as the others, on the Board's website.
"The Board's first case decisions are another step towards building a strong institution capable of holding Facebook to account over the long-term," the group said on its website. It also said it will publish its decision on a recently removed post regarding India that Facebook said violated its violence and incitement policies. The Board also said it has accepted the case regarding the indefinite suspension of former President Donald Trump and will be "opening public comment" on the case shortly. It has 90 days to make its decision.
As part of the first rulings, the board also made nine policy recommendations to Facebook regarding content moderation stemming from its decisions.
Since launching in October, more than 150,000 cases have been brought to the Board, according to its website. But as Insider previously reported, experts are concerned that the board lets Facebook off the hook since the tech giant maintains control of how it implements its policies.
Facebook, and other tech platforms, have faced pushback recently over their content moderation decisions. Some have argued their decisions are politically motivated — conservatives, specifically, have lamented that internet platforms discriminate against them after Twitter and Facebook cracked down on former President Donald Trump's content. Some of those flagged posts included baseless claims of election fraud.
Others have said Facebook and its peers don't do enough to moderate hate speech and misinformation online. One grave example is a series of posts that helped drive genocide in Myanmar. In late 2018, Facebook admitted that it had not done enough to stop the spread of hate speech and violence in the country.