Have you liked or commented on a Facebook post about the COVID-19 pandemic?
Facebook is about to begin letting you know if you've spread bad information.
The company will soon notify users who liked, reacted to, or commented on posts containing harmful misinformation about the virus that moderators later removed. It will direct those users to information about virus myths debunked by the World Health Organization.
Social media is awash in bad takes about the outbreak, and platforms have begun to combat that misinformation.
Facebook said Thursday that people will begin seeing warning messages in coming weeks.
Facebook and other platforms have already taken steps to curb the wave of dangerous misinformation that has spread along with the coronavirus.
Facebook has banned bogus ads promising coronavirus treatments or cures. No such cures exist, and there is no vaccine, though a global race to develop one is underway.
The tech giant is altering its algorithms and, through an information page, steering users toward facts about the virus from global health organizations as well as state and local health departments.
That hasn't stopped the spread of bad information.
Conspiracy theories about the origin of the virus and the vaccines being developed to prevent it still pop up daily. Posts and videos promoting unverified treatments and cures have racked up thousands of views.
Facebook users, for example, viewed a false claim that the virus is destroyed by chlorine dioxide nearly 200,000 times, according to estimates in a new study out today from Avaaz, a left-leaning advocacy group that tracks and researches online misinformation.
The group found more than 100 pieces of coronavirus misinformation on Facebook that were viewed millions of times even after fact-checkers had marked the claims false or misleading. Other false claims carried no misinformation label at all, despite fact-checkers having debunked them.
"Coronavirus misinformation content mutates and spreads faster than Facebook's current system can track it," Avaaz said in its report.
Fake information on social media has been deadly. Last month, Iranian media reported that more than 300 people in the country had died and 1,000 had been sickened after ingesting methanol, a toxic alcohol rumored on social media to be a remedy. An Arizona man also died after taking chloroquine phosphate, a product some mistake for the anti-malaria drug chloroquine, which President Donald Trump and conservative pundits have touted as a treatment for COVID-19. Health officials have warned the drug has not been proven safe or effective as a virus therapy.
The side effects of chloroquine can be dangerous.
Facebook partners with dozens of news organizations around the globe to provide fact checks of misleading content on its site; The Associated Press is part of that program. Facebook users already see a warning label over articles, posts, or videos in their feeds that those fact-checkers have marked false or misleading. On Thursday the company announced that it put warning labels on 40 million coronavirus-related posts in March alone, based on the roughly 4,000 articles those fact-checkers produced. According to Facebook, when users see the labels, they decline to view the flagged content about 95% of the time.