Covid-19. How Facebook reportedly missed large amounts of antivax content

Facebook’s automatic detection algorithms are in the sights of US authorities after numerous internal documents were leaked by former employee Frances Haugen. The damaging revelations keep trickling out.

This Wednesday, Le Monde and several American outlets that had access to these documents published an investigation into the platform’s controversial choices in moderating disinformation about vaccines and Covid-19.

What exactly is at issue?

Hiccups

“All Covid vaccines are experimental.” This message was shared in English on Facebook more than 50,000 times and seen more than 3 million times in March 2021. The disinformation fell through the cracks because of a bug, according to the social network: the algorithm reportedly mistook the message for Romanian and misinterpreted it, Le Monde reports.

In July, Joe Biden suggested for his part that Facebook played a major role in Americans’ reluctance to get vaccinated.

A handful of clearly identified accounts

Another lesson: a very small handful of prolific anti-vaxxers exerts a disproportionate influence on the social network. According to Facebook’s own measurements, just a hundred or so accounts generate 50% of the entire antivax audience. The concentration is even starker: five of them (out of 3 million receptive users) account for 10% of that audience.

In September, several American outlets revealed that Facebook deliberately left these influencers’ content online so as not to deprive itself of a sizeable audience. Prosecutors in fourteen US states have taken up the matter and sent a letter to Facebook.

“Engagement is all that matters,” Imran Ahmed, CEO of the Center for Countering Digital Hate, an internet watchdog group, told the Associated Press on October 26. “It gets attention, and attention equals advertising revenue.”

Late responses

Even before the pandemic, a new tool for detecting anti-vaccination messages (not yet related to Covid-19, therefore) had been rolled out at Facebook at the end of 2019, Le Monde reports in its investigation. Once the pandemic took hold, the tool was used to classify posts about Covid-19 and to flag content that appeared problematic. It worked first in English only, then was extended to other languages, including French.

The problem: the tool only covers posts and does not scan comments, “the preferred distribution channel for anti-vaccination messages”, notes an internal company report from 2020. Moderators, moreover, are given very little guidance on how to handle such comments. Another document reports that, on average, 60% of the comments under vaccine-related posts are against vaccination.

In March, a Facebook researcher suggested turning off comments on vaccine posts until the platform could do a better job of tackling the anti-vaccine messages lurking there, but the proposal was ignored, notes the American news agency Associated Press.

After much pressure, Facebook announced its own policy of censoring disinformation in February 2021, more than a year after the start of the pandemic. The platform announced, among other things: “Facebook will now expand this ban, responding to false claims that Covid-19 is man-made, that vaccines are not effective in preventing the disease, or that it is safer to contract the disease than to get vaccinated.”

A response that came too late, and did not go far enough, according to Frances Haugen and many employees.