After social media giant Facebook, Google-owned YouTube is now banning all vaccine misinformation from its platform. The company said on Wednesday that any content making misleading claims about vaccines currently approved by health authorities will be removed from the platform. Social media platforms, including Facebook and YouTube, have been criticized for spreading vaccine misinformation and accused of not doing enough to stop the spread of such content. The criticism has come from politicians, academics and even some health authorities since last year.
This includes content alleging that approved vaccines cause chronic side effects other than those recognized by health authorities, content claiming vaccines do not reduce transmission or contraction of COVID-19, and content misrepresenting the substances contained in the vaccines. As examples, YouTube said content claiming the vaccines can cause cancer, diabetes and other such conditions is “not allowed” on the platform. Claims that vaccines alter a person’s genetic makeup or cause autism will also be removed. You can see YouTube’s full guidelines here.
However, the company isn’t changing its usual content removal policy for vaccine misinformation. “If your content violates this policy, we’ll remove the content and send you an email to let you know. If this is your first time violating our Community Guidelines, you’ll likely get a warning with no penalty to your channel. If it’s not, we may issue a strike against your channel. If you get 3 strikes within 90 days, your channel will be terminated,” the company said.
To be sure, YouTube isn’t banning all vaccine-related content that violates its misinformation policies. The company said that content that “includes additional context in the video, audio, title or description” may be allowed even if it violates the policies. “This is not a free pass to promote misinformation. Additional context may include countervailing views from local health authorities or medical experts. We may also make exceptions if the purpose of the content is to condemn, dispute, or satirize misinformation that violates our policies,” the company said.
The “condemn, dispute, or satirize” exception should cover comedians and other channels that produce that specific kind of content. Content showing an “open public forum,” such as protests or public hearings, may also be allowed as long as it doesn’t aim to promote misinformation.