Facebook Pushing COVID-19 Conspiracy Theorists To Read The WHO Is A Mistake

Facebook now plans to further crack down on coronavirus misinformation with the help of the World Health Organization (WHO). The only problem is that neither Facebook nor the WHO is among the most trusted of organizations to begin with.

The COVID-19 outbreak has resulted in the spread of not only a biological virus, but also a misinformation virus that has now dispersed far and wide across the internet. The misinformation includes not only placing the blame on 5G, but also pointing to a variety of fake products, services, and tricks that supposedly cure coronavirus. With misinformation reaching pandemic levels of its own, Apple, Google, Microsoft, Facebook, Instagram, WhatsApp, and many others have been acutely focused on releasing updates and new features to help combat coronavirus conspiracies.

While Facebook had already announced some of the measures it was employing to stem the flow of bad information, the company has since released an update detailing how it plans to ramp those measures up. One of the most notable ways Facebook plans to fight coronavirus-related misinformation is by targeting users the company says “have liked, reacted or commented on harmful misinformation about COVID-19.” Essentially, Facebook will now show those users a News Feed message explaining that the posts they interacted with were misleading, as well as linking out to what it considers good information, such as the WHO’s myth-busting page.

People Need To Trust Facebook & The WHO First

Though noble, it remains to be seen how much Facebook’s latest changes are going to help, considering that access to information is not actually Facebook’s problem; trust is. It is hardly news that Facebook has a misinformation issue, and it started long before coronavirus arrived on the scene. However, Facebook’s general approach has always been to limit how directly it intervenes in content. That choice has resulted in what is nothing short of a breeding ground for people and groups looking to spread misinformation. It just so happens that it has taken COVID-19 for the company to see the effects of its choices as clearly as it now does.

Further adding to the current problem is that Facebook is a company many already have a hard time trusting. The Cambridge Analytica scandal, and the company’s data collection practices in general, have slowly eroded trust in the service. Similarly, the WHO has also come under criticism lately, including from the U.S. government, with many on both sides of the political aisle, and across much of the Western world, raising questions over its allegiances. One of the major criticisms of the public health agency is the level of influence China is believed to have over it, with many citing the poor information provided during the first weeks and months of the outbreak. This, combined with the WHO’s refusal to allow Taiwan to become a member and President Donald Trump’s funding freeze, has cast even more doubt on the WHO’s legitimacy recently.

Whether rightly or wrongly, this mounting distrust only serves to further distance the general public from big, centralized organizations such as Facebook and the WHO. Therefore, having one of them turn around and direct a member of the public to the other is unlikely to suddenly instill trust in either. If people already do not trust large organizations, then an approach that leans on those same organizations to bridge the divide is only going to widen it further, and possibly even create breathing room for new conspiracies to develop. It would have been better had Facebook opted to share work from top-level journalists instead, as this is an industry Facebook has said it wants to support. Choosing the WHO as one of the “authoritative sources” will look like a political move to some, and at this point will only inflame conspiracy theorists further.

To be clear, this is not to say Facebook shouldn’t be making the moves it now is. It should. Every service is trying to point users towards legitimate and useful information and away from what’s deemed bad intel, and Facebook should be no different. The point is simply that these changes are unlikely to make much difference to the problem they are designed to solve, especially with the WHO chosen as a source. Facebook has spent a long time allowing conspiracies to grow on its platform, as has YouTube, and the current coronavirus misinformation problem is a direct result of those earlier choices.