Discord bans harmful Covid-19 misinformation on its platform

In a nutshell: Following other digital platforms like Reddit and YouTube, Discord has updated its community guidelines with new rules on dangerous medical misinformation. The instant messaging service will prohibit users from posting “misleading health information that is likely to cause physical or societal harm.”

The policy change was announced on Friday in a blog post by Discord’s senior platform policy specialist Alex Anderson. The rules are explicitly targeted at Covid-19 misinformation, which has become increasingly prevalent across the web.

“Ensuring the accuracy of online health information has never been as important as during the Covid-19 pandemic,” it reads.

In the blog, Discord cited specific types of content that would fall under the new rules, including anti-vaccination rhetoric, dangerous cures for diseases, and anything that “could hinder the resolution of a public health emergency.” The company stated that health information is misleading if it “directly and unequivocally contradicts the most recent consensus by the medical community.”

The company intends to crack down on Covid-19 conspiracy theories, with the guidelines restricting unsubstantiated rumors and widely-debunked health claims. However, the post also clarified that the rules are not intended to be “punitive of polarizing or controversial viewpoints,” and that non-misleading personal health experiences, commentary, and satire are excluded.

These guidelines apply to both individual accounts and organized servers. Discord specified that enforcement actions would be doled out based on the severity and potential harm of the misinformation, with punishments ranging from warnings to permanent suspensions of an account or server.

In an interview with The Verge, Discord’s chief legal officer Clint Smith explained that “if someone posts on Discord drink four ounces of bleach and your body will be rid of coronavirus, that’s actionable.” He also noted that low-risk misinformation will likely not be actionable.

“If someone posts about holding crystals against your chest for 5 minutes and your lung capacity will improve, that’s not something Discord is going to take action against,” Smith said.

The community messaging platform is the latest in a line of technology companies that have attempted to combat health-related misinformation. After massive public pressure, YouTube, Reddit, and Facebook have all made policy changes intended to dampen growing anti-vax rhetoric on their platforms. Conversely, streaming giant Spotify has refused to remove prominent creators such as Joe Rogan over allegations of misleading Covid-19 claims.

This isn’t the first time the company has made significant moves to curb the spread of harmful content, with operations in place that combat exploitative content, violent extremism, and illegal activity. In its transparency report covering the first half of 2021, Discord stated that it removed over 43,000 servers and banned 470,000 accounts for violating guidelines.

Only time will tell if Discord can effectively enforce this new policy across its over 150 million active users and nearly 6 million servers. In the meantime, the company has posted instructions on reporting harmful misinformation to its team.

Image credit: Alexander Shatov