YouTube commits to removing anti-vaccine content

YouTube has committed to removing content promoting false information about approved vaccines, building on its existing ban on Covid-19 vaccine misinformation.

Videos falsely discrediting approved vaccines will now be removed alongside anti-Covid-19 vaccine content, according to YouTube. False claims have included that vaccines cause autism, infertility and cancer, or carry a high mortality rate. The new policy will also include the termination of high-profile anti-vaccine accounts.

The ban on Covid-19 vaccine misinformation came into force last year and has since been expanded; YouTube has announced that 130,000 videos have been removed in that time. False claims surrounding Covid-19 vaccines have stretched to cover misinformation about vaccines as a whole, and YouTube's new policy will also target misinformation about long-standing vaccines, such as the measles and hepatitis B vaccines.

Large platforms such as YouTube have received criticism for not doing more to stop the spread of health misinformation on their sites. In July, US President Joe Biden implored these platforms to resolve the issue, stating that social media platforms were primarily responsible for people's mistrust and suspicion of vaccinations.

Regarding the new policy, YouTube released a statement on its official blog, saying: “Content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines will be removed.”

Personal stories about being vaccinated, information about vaccine policies and videos recounting the historical successes and failures of vaccine trials will not be removed, as they do not violate the new guidelines.

This new crackdown on misinformation follows the Fazze anti-vaccine scandal, in which an influencer marketing agency reached out to several influencers and offered to pay them to spread misinformation about the Covid-19 vaccine. These influencers included German YouTuber and journalist Mirko Drotschmann, and French science YouTuber Léo Grasset. Both pretended to be interested in the deal in order to learn more and blow the whistle on the organisation.

Influencers were asked to share what was claimed to be leaked information showing an inflated number of deaths among people who had received the Pfizer vaccine. They were told not to disclose that the videos were sponsored, a practice banned by many social media platforms and illegal in Germany and France. The agency also instructed influencers to share links from a list of articles purporting to corroborate the false information. At least four other influencers have come forward with similar accounts since Drotschmann and Grasset went public. Fazze is currently being investigated by both the German and French authorities.

Fazze is a branch of the digital marketing company AdNow, which is registered in both the UK and Russia. Accusations have been levied against the Russian government, with many believing the campaign to be a ploy to discredit existing Covid-19 vaccines in order to promote Russia's own vaccine, Sputnik V.

The Russian embassy in London denied these claims, stating: “We treat Covid-19 as a global threat and, thus, are not interested in undermining global efforts in the fight against it…”

Of the incident, Grasset stated: “If you want to manipulate public opinion, especially for young people, you don’t go to TV. Just spend the same money on TikTok creators, YouTube creators. The whole ecosystem is perfectly built for maximum efficiency of disinformation right now.” This further highlights the importance of YouTube’s new ban. Similar bans have been implemented by other social media platforms, such as Facebook and Twitter. 

Image credit: PA