YouTube will remove videos containing misinformation about COVID-19 vaccines. Videos that contradict information about a vaccine from health authorities or the World Health Organization won’t be allowed.
“A COVID-19 vaccine may be imminent, therefore we’re ensuring we have the right policies in place to be able to remove misinformation related to a COVID-19 vaccine,” Farshad Shadloo, a YouTube spokesman, said in an email.
That could include false claims that vaccines implant microchips in people’s bodies, for example, or that they cause infertility. Both rumors are untrue.
The new guidelines are an expansion of YouTube’s existing COVID-19 medical misinformation policy, which doesn’t allow videos that falsely claim the coronavirus doesn’t exist, that discourage people from seeking mainstream medical care for the disease, or that claim the virus is not spreading.
The extremely contagious virus does exist, and alternative, unproven treatments can be dangerous.
YouTube demonetized videos that promoted anti-vaccination content in 2019.
Facebook announced its own crackdown on anti-vaccination content: it’s no longer allowing ads that discourage vaccination. “We don’t want these ads on our platform,” the company said.
Ads are as far as the policy goes, though, and organic posts from anti-vaccine groups will still be allowed.
The platforms’ policies come as clinical trials of COVID-19 vaccines inch closer to completion.
Public trust in those vaccines is low. President Donald Trump has repeatedly made public statements pushing for a vaccine by Election Day, and many people in the US believe the development process is political, not scientific.