
Sunday, May 10, 2020

Facebook announced Thursday that it will start warning users if they have liked, reacted to, or commented on harmful Covid-19 posts that the company has identified as misinformation and removed.

The feature will roll out in the coming weeks, Facebook said in a blog post.

“These messages will connect people to COVID-19 myths debunked by the World Health Organization including ones we’ve removed from our platform for leading to imminent physical harm,” Guy Rosen, Facebook’s vice president of integrity, said in a blog post.

After the WHO declared Covid-19 a global health emergency in January, Facebook started removing misinformation about the outbreak from its platforms. The company said Thursday that it has removed hundreds of thousands of pieces of misinformation that could lead to physical harm, such as inaccurate content claiming that physical distancing is ineffective or that drinking bleach cures the virus.

“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook,” Rosen said.

Facebook, which has been criticized for its handling of health issues, has made several coronavirus-related adjustments to its platform over the past few months.

For example, it has increased the number of partners working on fact-checking to limit the spread of false claims. It also started showing pop-ups on Facebook and Instagram that link to official health resources; these have directed more than 2 billion people to information from health authorities.
