On September 29, 2021, YouTube announced that it would expand its vaccine misinformation policies with new guidelines stating that content that “falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of the disease, or contains misinformation on the substances contained in vaccines will be removed.” [1]

“YouTube doesn’t allow content that poses a serious risk of egregious harm by spreading medical misinformation about currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and by the World Health Organization (WHO).” What they neglected to clarify is that, globally, the vaccines have been “approved” under various emergency use authorisations based on limited clinical data.

Videos are removed if they violate YouTube’s policies, including those that report adverse reactions following an injection or quote scientific literature.