
How Facebook and Instagram moderate content

In 2020, a person in Brazil posts an image on Instagram showing a woman’s breasts as part of a breast cancer awareness campaign, but the social network removes it for “nudity.” Months later, Donald Trump posts a video during the storming of the Capitol in Washington containing the phrase “we love you,” apparently addressed to the rioters, and Facebook suspends his account on the platform indefinitely over the risk that his presence could encourage an “escalation of violence.”
These decisions, together with more than a million appeals, were reviewed by the Content Advisory Council, a body created by Meta (the corporation behind Facebook, Instagram and WhatsApp, among other companies) that operates independently of Mark Zuckerberg. This week, the Council released its first report, covering how 20 cases of content removed or restricted on Facebook and Instagram were resolved.
The organization was founded in 2018 around one idea: that “social network companies should not make decisions about content moderation by themselves.”
It also grew out of a proposal that Meta itself presented in November 2018 to “allow people to appeal content decisions to an independent body whose decisions would be binding.”
In carrying out its mandate, the Council not only reverses Meta’s decisions when it deems it necessary, but also recommends policy changes to “improve the treatment users receive and provide greater transparency.”
Protecting freedom of expression and other human rights, weighing context, and treating users impartially regardless of where they are located are among the principles on which it bases its decisions.

Council’s first report

Between October 2020 and December 2021, the Council received more than a million user requests for “independent review” of Meta’s content moderation decisions, an average of 2,649 cases per day.
In addition, Meta itself referred 47 cases to the Council. Roughly eight out of ten user appeals sought the restoration of content removed under those networks’ rules on bullying and harassment, hate speech, or violence and incitement.
In reaching its decisions, the Council took into account more than 10,000 comments from people and organizations around the world, and it issued 86 recommendations to Meta, on which the company has reported progress or implementation.
The report detailed the Council’s first 20 full decisions, in which it overturned Meta 14 times and upheld the company 6 times.
Beyond content on these networks, the Council is already planning its role in moderating the nascent metaverse that Meta is developing, in which the company has already invested 10 billion dollars.
With this work, it also seeks to provide a “reliable frame of reference for other social media companies, platform publishers, regulators and policy makers around the world who face the tensions and challenges inherent in content moderation.”

Meta made decisions, but the Council reversed them

One of the Council’s decisions, the report notes, “set an important precedent” by highlighting the inability of Meta’s automated enforcement systems to distinguish permissible images of exposed female nipples posted as part of a breast cancer awareness campaign.
Specifically, in October 2020, a user in Brazil posted an image on Instagram with a caption in Portuguese intended to raise awareness about breast cancer. But an automated system enforcing Facebook’s rule on adult nudity and sexual activity (which also applies to Instagram) removed the post.
In this case, the Council overturned Meta’s decision because the rule makes an exception for nudity displayed to “raise awareness about a cause or for educational or medical purposes,” and it specifically permits images of exposed female nipples intended to “increase awareness of breast cancer.”
In addition, it recommended giving users the ability to appeal decisions made by automated systems to a human reviewer when they are told their content violated Facebook’s Adult Nudity and Sexual Activity Community Standard.

Trump account suspension

Trump’s case took longer and involved several rounds of back-and-forth between the Council and Meta. It began on January 6, 2021, after a crowd forced its way into the Capitol building in Washington and the then president of the United States posted a video on Facebook and Instagram in which he said the following:
“I know your pain. I know you’re hurt. We had an election that was stolen from us. It was a landslide election, and everyone knows it, especially the other side, but you have to go home now. We have to have peace. We have to have law and order. We have to respect our great people in law and order. We don’t want anybody hurt. It’s a very tough period of time.”
“There’s never been a time like this where such a thing happened, where they could take it away from all of us: from me, from you, from our country. This was a fraudulent election, but we can’t play into the hands of these people. We have to have peace. So go home. We love you. You’re very special. You’ve seen what happens. You see the way others are treated that are so bad and so evil. I know how you feel. But go home, and go home in peace.”
An hour later, Meta removed the video for violating its Dangerous Individuals and Organizations rule. Trump then posted a statement on Facebook:
“These are the things and events that happen when a sacred landslide election victory is so unceremoniously and viciously stripped away from great patriots who have been badly and unfairly treated for so long. Go home with love and in peace. Remember this day forever!”
Eight minutes later, Meta removed this post as well for violating the Dangerous Individuals and Organizations Community Standard, and it blocked Trump’s account to prevent him from posting more content on Facebook and Instagram.
Two weeks later, Meta referred to the Council its decision to suspend Trump’s access to his accounts indefinitely. The company also asked the Council for observations or recommendations on account suspensions when the user is a political leader.
On its official blog, Meta had explained that it removed the posts for violating the policy that prohibits “praise, support or representation of designated violent events.” According to the company, the content also violated the part of the policy that prohibits “praise of individuals who have engaged in acts of organized violence.”
The company explained that the decision to suspend Trump’s ability to post indefinitely was based on international human rights standards, in particular Article 19 of the International Covenant on Civil and Political Rights and General Comment 34 of the UN Human Rights Committee, which allow necessary and proportionate restrictions on freedom of expression in situations of public emergency that threaten the life of the nation.
In this case, the Council upheld the initial decision to suspend the account but reversed the decision to keep the suspension indefinite.
It found that, given the seriousness of the violations and the continuing risk of an escalation of violence, suspending Trump’s accounts on January 6, 2021 and extending the suspension on January 7 was justified, and it determined that the two posts in question violated the community standards of Facebook and Instagram.
But after further exchanges with the company, the Council deemed “inappropriate” Meta’s decision to impose an indefinite suspension without any basis in its own rules.
In response to this decision, Meta announced a new set of enforcement protocols “for exceptional cases like this” and, applying those protocols, set the suspension of Trump’s account at two years from January 7, 2021.
As the sanction approaches its expiration date, the company has promised to “seek experts to assess whether there are no longer risks to public safety,” among other conditions.
The Content Advisory Council offers a means to appeal content decisions on Facebook or Instagram. To initiate an appeal, go here: https://oversightboard.com/appeals-process/
(Source: Telam)
