Facebook Inc. said it will take stronger steps to eliminate false information about Covid-19 and vaccines on its social network, a move that could lead to the removal of major groups, accounts and Instagram pages that repeatedly spread misinformation.
The company is acting on advice from the World Health Organization and other groups to expand its list of false claims that are harmful, according to a blog post on Monday.
Facebook will ask administrators of user groups to moderate such misinformation. Facebook-owned Instagram will also make it harder to find accounts that discourage vaccination, and remove them if they continuously violate the rules.
The company this week will also include in its Covid-19 information center details from local health departments about when and where people can get vaccinated.
If Facebook’s systems come across content that says the coronavirus is man-made or manufactured, that it is safer to get the disease than to get the vaccine, or that the shots are toxic, dangerous or cause autism, that content will be removed.
“Claims about Covid-19 or vaccines that do not violate these policies will still be eligible for review by our third-party fact-checkers, and if they are rated false, they will be labeled and demoted,” Facebook said in its blog post.
Facebook, the world’s largest social network, had already banned false vaccine claims in ads under its rules.
The company is working to undo years of momentum gained by the anti-vaccination movement on its platforms, where emotional anecdotes and stories that provoke fear tend to spread more quickly than scientific facts.
The changes to the sites, which start this week, will roll out globally in more than 50 languages, but may take time to become effective, Facebook said.
In the meantime, the Menlo Park, California-based company will give $120 million in free advertising to health departments and non-governmental organizations that plan to spread accurate information about Covid-19 testing and vaccination.