Small group monopolized political content in 2018 Brazilian elections

During the 2018 Brazilian elections, a small group of accounts and pages monopolized the production of political content on Facebook. According to internal documents from Mark Zuckerberg’s company obtained by Estadão, the country suffered from the activity of so-called “superproducers”, users who publish a high volume of content – in the files, Brazil is cited as “the most recent great democracy to face a frightening eruption of disinformation and hatred on Facebook”.

President Jair Bolsonaro in Brasilia 11/11/2021 REUTERS/Ueslei Marcelino

Photo: Reuters

The information appears in the “Facebook Papers”, a package of company documents leaked to an international consortium of news outlets, including Estadão, the New York Times, the Washington Post, the Guardian and Le Monde. The disclosure was made to the Securities and Exchange Commission (SEC), the regulator of listed companies in the United States. The files were also provided to the US Congress, in redacted form, by the legal adviser of Frances Haugen, a former Facebook employee who gathered internal research from the social network and resigned in May this year over disagreements with the company’s conduct.

One of the documents, a 28-page file, contains an internal post dated October 13, 2018, in which company researchers discuss election-related problems in different countries – the file includes a topic called “lessons learned in Brazil”. According to the document, on October 7 of that year, the day of the first round of the country’s presidential election, 18.4 million political posts were created by 6.7 million profiles or pages on the platform. However, 35% of this material was published by only 3% of accounts – that is, 6.4 million political posts were generated by just 201,000 accounts.

According to the study, 74 million different people viewed the materials, generating 2.74 billion views. Facebook researchers calculate that accounts that monopolized content production received 28% of those views, or 767.2 million. But the audience concentration may have been greater.
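The concentration figures above can be cross-checked with simple arithmetic (the totals come from the leaked document; the percentages are as cited in it):

```python
# Sanity-check the concentration figures cited in the leaked document.
total_posts = 18_400_000       # political posts created on Oct 7, 2018
total_accounts = 6_700_000     # profiles/pages that created them
total_views = 2_740_000_000    # total views of that material

posts_by_top = 0.35 * total_posts      # 35% of posts...
top_accounts = 0.03 * total_accounts   # ...came from 3% of accounts
views_of_top = 0.28 * total_views      # which drew 28% of all views

print(f"{posts_by_top:,.0f} posts")     # matches the ~6.4 million cited
print(f"{top_accounts:,.0f} accounts")  # matches the 201,000 cited
print(f"{views_of_top:,.0f} views")     # matches the 767.2 million cited
```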

“The good news is that 72% of political content views came from outside the material of the 3% of producers, right? Not quite – many of those accounts pushed the same content (but created different posts),” says a researcher in the document. “An analysis looking at patterns of content diffusion would likely reveal even more concentration.”

The report deals with the role of “superproducers” of political content, accounts that go beyond the limits of what would be “regular” intense engagement on the platform, gaining disproportionate influence in the political conversation.

The researchers expressed concern about the situation: “If Facebook is to be a true and democratic public square, everyone involved must be able to make a contribution to the political conversation. But not all political actors are equal in their ability to invest in the conversation,” says another part of the research.

The concentration of content production is not just a Facebook phenomenon – the practice has been observed on several social networks, such as Twitter. According to a study by the Pew Research Center published on Wednesday, the most active 25% of Twitter users in the United States produce 97% of the country’s tweets – the survey, however, does not break out political content specifically.

According to the leaked Facebook file, the problem is more serious when there is a monopoly on political content. “If you win (the production of) non-political content, you make a little more money. If you win (the production of) political content, you get to control the entity that holds the monopoly on violence (e.g., the State),” said one of the researchers in the document.


According to the document, the superproducers have different motivations: they can be people deeply committed to their ideology, mercenary “spammers” (people paid to exhaustively replicate the message), agents of foreign influence or “a little bit of everything”. Facebook speculates that the work could have two main origins: some would be paid by political agents or spammers to post the same political content thousands of times a day, while others would make the posts simply because they firmly believed in their ideologies.

In another file analyzed by Estadão, which discusses the role of conservatives in disinformation in Macedonia, an employee points out that the problem can also operate as a cross-border network. “I wouldn’t be surprised to see a network (of malicious actors) connecting operations in the Philippines, Myanmar, Brazil and Macedonia. It’s a globalized field,” the employee comments in the July 2018 report.

“We made several discoveries (in 2017) that have greatly improved our ability to address this problem, but there is still a lot of work to be done – some of the more difficult tasks ahead are not about data, but about having the courage to call things by the right names and take down disinformation operations,” says the employee.

Danger zone

In a global comparison, Brazil is among the countries with the highest levels of “superproducer” activity (see map below). In the document, the company highlights the strength of the mechanism in Brazil and in Latin American and African countries. “Established European democracies, as well as Australia, Canada and South Africa, appear relatively healthy according to this metric (on the map),” says an excerpt from the file.

Since testifying in the US Senate in October, Frances Haugen has repeated that the company chose growth over user safety – she says the effect of this acceleration has been even greater outside the US, where the company has not properly policed its service to keep up with local conditions in different countries. Other company files had already shown that Facebook picks which problems to address because it is unable to deal with parallel crises in different countries.

In the document on “super producers”, a Facebook employee also laments the path the company has taken. “Facebook’s story over the past 3 years could be: suspicious growth strategies meet politics and begin to destroy democracies,” he says. On Facebook’s corporate platform, other employees reacted to the comment with a sad face.


The files also show employees discussing ways to solve the problem – the post argues that there are ways to make the company’s algorithms work better.

One of the fixes would be a “sharing brake” applied to News Feed ranking – that is, acting on the ordering of the posts shown on the platform’s homepage. “It’s a drawdown-based approach (for posts) that seems to slightly undo the effect of over-engagement on feed rankings,” the employee says.
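Purely as an illustration – Facebook’s actual ranking code is not public, and the function name and weights below are hypothetical – a “sharing brake” can be imagined as a score demotion that compounds with each reshare hop, so heavily reshared posts count for less in the feed ordering:

```python
# Hypothetical sketch of a "sharing brake": demote a post's feed-ranking
# score geometrically as its reshare chain grows. Illustrative only; not
# Facebook's actual implementation.
def demoted_score(base_score: float, reshare_depth: int,
                  demotion_per_hop: float = 0.25) -> float:
    """Shrink the ranking score by a fixed fraction per reshare hop."""
    return base_score * (1.0 - demotion_per_hop) ** reshare_depth

# A post three reshare hops deep keeps ~42% of its original score,
# while an original (depth 0) post is unaffected.
print(demoted_score(100.0, 3))
print(demoted_score(100.0, 0))
```

The design intent described in the document is that engagement gained through long reshare chains should contribute less to ranking than direct engagement, dampening the advantage of superproducers.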

This appears to be a mechanism the company later rolled out. In response to the report, Facebook says: “Over the years, we’ve taken a number of steps to reduce potentially harmful content and reconsider the types of content we amplify or recommend. That’s why we stopped recommending civic and political groups.”

The company also says it applies penalties to those who repeatedly violate its policies. “If someone shares content rated false or altered by one of our fact-checking partners, we apply penalties to the content and, if they do so repeatedly, to their account, page or group.”

The documents show employees trying to ensure that ideas on the platform compete with each other on their quality and persuasiveness, rather than on the ability of a few actors to silence other users. “The exact details are less important than agreeing to address this fundamental flaw, which happens over and over again when political interactions that physically occur in our data centers have very real consequences in places near and far,” says an excerpt from the research.


On the 2018 elections in Brazil, Facebook says it did extensive work to improve the quality of what circulated on the platform. Among the measures were preventing the circulation of disinformation, changing advertising transparency standards, increasing page transparency, and working proactively with Brazil’s Superior Electoral Court (TSE) and with the Regional Electoral Courts.

On the leaks, Facebook says: “The central premise in these stories is false. Yes, we are a business and we make a profit. But the idea that we profit at the expense of people’s well-being and safety misunderstands where our own commercial interests lie.”
