YouTube, the Google-owned video platform, has expanded its medical misinformation policies with new guidelines banning vaccine misinformation, the company announced in a blog post Wednesday. The ban covers misinformation about all vaccines confirmed to be safe and effective by local health authorities and the World Health Organization (WHO).

"Today, we're expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO," the post said. "Specifically, content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines will be removed," it added. "This would include content that falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them. Our policies not only cover specific routine immunizations like for measles or Hepatitis B, but also apply to general statements," it noted.

The platform previously introduced a policy during the pandemic prohibiting misinformation related to the coronavirus, including about its treatment and prevention. In an effort to strengthen its rules around vaccines, YouTube has also banned prominent anti-vaccine accounts.

Social media companies have said since the beginning of the pandemic that they are trying to stop the spread of coronavirus misinformation. But falsehoods have continued to run rampant as companies struggle to police the constant flood of posts and uploads to their platforms.
As part of the crackdown, a YouTube spokesperson confirmed the platform removed pages associated with high-profile misinformation spreaders including Joseph Mercola, Erin Elizabeth, Sherri Tenpenny and Children's Health Defense, the group associated with Robert F. Kennedy Jr.

Until now, YouTube banned videos that claimed the coronavirus vaccines were ineffective or dangerous. Under the new policy, it will also block videos that spread misinformation about all commonly used vaccines, such as the MMR vaccine, which protects against measles, mumps and rubella. "We've steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we're now at a point where it's more important than ever to expand the work we started with COVID-19 to other vaccines," the company said.

YouTube said there are exceptions to its new guidelines. The company will allow videos about vaccine policies, vaccine trials and historical vaccine successes or failures. It will also allow personal testimonials relating to vaccines, "so long as the video doesn't violate other Community Guidelines, or the channel doesn't show a pattern of promoting vaccine hesitancy." — Agencies