The debate over vaccination surged after the arrival of vaccines against COVID-19 and has generated endless opinions, among them anti-vaccine positions.
Although each person is free to decide whether or not to vaccinate, those who share public content in the media and on social networks have a responsibility to avoid spreading panic and influencing decisions that can affect the health of many.
In a new attempt to stop the flow of erroneous and false information about vaccines, YouTube said it will no longer allow videos claiming that vaccines approved by health authorities are dangerous or do not work.
The platform will also ban accounts that are overtly anti-vaccine, including the channels of Joseph Mercola, the Children’s Health Defense Fund (linked to Robert F. Kennedy Jr.), Erin Elizabeth, and Sherri Tenpenny; the channels of two other major anti-vaccine figures, Rashid Buttar and Ty and Charlene Bollinger, were canceled a few months ago.
Those anti-vaccine figures are part of the so-called “Disinformation Dozen,” a group identified by the Center for Countering Digital Hate as responsible for most of the misleading claims about vaccines on social media.
In 2019, YouTube removed its ads from anti-vaccination content, and in October 2020 it said it would remove videos spreading misinformation about COVID-19 vaccines.
Now, this policy is being expanded to block misinformation about other vaccines, including those for flu, HPV, measles, mumps, and rubella. Under the new policy, videos that falsely claim vaccines cause health problems without scientific backing, such as autism or infertility, will not be allowed.
These measures include some exceptions: YouTube will still allow videos in which people share their personal experiences with vaccination, such as the side effects they experienced.
Such content will only be removed if the channels hosting it “show a pattern of promoting misinformation about vaccines” — that is, if many of their videos speak negatively about vaccination without supporting evidence or urge people not to get vaccinated.
YouTube’s new guidelines say the platform will also allow videos discussing contraindications, side effects, or problems caused by vaccines, as long as they do so in a purely informative context (not with the aim of influencing people’s decisions) and with an objective approach, supported by the statements of experts and doctors.