YouTube Takes a Tougher Stance Against COVID Vaccine Misinformation with Updated Rules

It’s been a long time coming, but YouTube has today announced that it’s moving to take a stronger stance against COVID-19 misinformation, and in particular, misleading content related to COVID vaccines, which could be helping to fuel global anti-vax movements.

As explained by YouTube:

“Crafting policy around medical misinformation comes charged with inherent challenges and trade-offs. Scientific understanding evolves as new research emerges, and firsthand, personal experience regularly plays a powerful role in online discourse. Vaccines in particular have been a source of fierce debate over the years, despite consistent guidance from health authorities about their effectiveness. Today, we’re expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO.”

The updated policy is clearly stated:

“Don’t post content on YouTube if it includes harmful misinformation about currently approved and administered vaccines on any of the following:

  • Vaccine safety – Content alleging that vaccines cause chronic side effects, outside of rare side effects that are recognized by health authorities
  • Efficacy of vaccines – Content claiming that vaccines do not reduce transmission or contraction of disease
  • Ingredients in vaccines – Content misrepresenting the substances contained in vaccines”

Videos containing any of these claims will now be subject to removal from YouTube, with channels that publish such content first issued a warning, then strikes. Any channel that receives three strikes within a 90-day period will be terminated.

To be clear, YouTube’s policies already prohibit certain types of medical misinformation, including content that promotes harmful remedies, with YouTube further noting that it’s removed more than 130,000 videos for violating its COVID-19 vaccine policies over the past year.

The platform has worked to refine its approach to vaccine-related content over time, amid the ongoing pandemic, but it has also been identified as a key source of medical misinformation, which has helped to boost harmful movements, running counter to the goals of health authorities.

Last year, a conspiracy-fueled ‘Plandemic’ video was viewed more than 7 million times on YouTube before it was eventually removed. The video sought to amplify rumors that the National Institute of Allergy and Infectious Diseases had buried research about how vaccines can damage people’s immune systems.

Earlier this year, researchers from the universities of Oxford and Southampton found that people who look to social media for information – particularly YouTube – are less willing to be vaccinated against COVID-19, and urged both the government and social media firms to take urgent action over the findings.

As per the report:

“Trust in health institutions and experts and perceived personal threat are vital, with focus groups revealing that COVID-19 vaccine hesitancy is driven by a misunderstanding of herd immunity as providing protection, fear of rapid vaccine development and side effects, and beliefs that the virus is man-made and used for population control. In particular, those who obtain information from relatively unregulated social media sources – such as YouTube – that have recommendations tailored by watch history, and who hold general conspiratorial beliefs, are less willing to be vaccinated.”

Given this, as noted, YouTube’s tougher stance has been a long time coming, and it’s good to see the platform taking a clearer position against demonstrably incorrect medical information, which can have significant real-world impacts, both for individuals and for society at large.

But the move will not please everybody, and could set YouTube on a course for conflict with some regional authorities.

This week, the Russian government threatened to block YouTube unless the platform restored two German-language channels, managed by Russia’s state media company RT, which were deleted after publishing vaccine misinformation.

As per The Washington Post:

“Russia’s communications ministry, Roskomnadzor, said it had sent a letter to Google “demanding that all restrictions on YouTube channels RT DE and Der Fehlende Part, operated by the Russian media outlet Russia Today, be lifted as soon as possible,” the Interfax news agency reported. The ministry threatened to fully or partially restrict YouTube in Russia, or fine Google, if the channels were not restored.”

Indeed, many politicians and civic leaders support those who oppose vaccine mandates, and you can bet that a major platform like YouTube taking a tougher line on such content will once again stoke concerns that private companies are controlling the media narrative, and have an outsized level of control over what can, and cannot, be shared.

Which is a valid concern. But as YouTube notes, it’s working from official health advice, based on both local health authorities and the World Health Organization, so its stance is grounded in the same guidance that governs health procedures and treatments more broadly.

Many of the opposing viewpoints are grounded in misunderstanding, and letting them grow is dangerous, and will keep us in the COVID holding pattern even longer, which could have disastrous long-term impacts. As such, it’s good to see YouTube taking a stronger stance, and moving to enforce the findings of official health bodies within its moderation approach.
