Facebook is taking its fight against COVID-19-related misinformation a step further.
According to a report by Fast Company, Facebook will soon start sending notifications to users who have shared, commented on, or liked posts which contain misinformation about the pandemic. The company will also provide these users with links to trustworthy sources on COVID-19.
These notifications, the report claims, will read like this: "We removed a post you liked that had false, potentially harmful information about COVID-19."
Clicking on the notification will take the user to a page showing the offending post, along with an explanation of why it was removed from Facebook. Follow-up actions, such as the option to unsubscribe from the group that originally shared the post, will also be offered.
Facebook has been warning users about COVID-19 misinformation since April, though it later admitted the strategy didn't work well enough. In December, the company decided to finally start removing false claims about COVID-19 vaccines from its platforms.
As Facebook product manager Valerio Magliulo explained to Fast Company, the company noticed that users often don't understand how a warning message about COVID-19 misinformation relates to a post they've interacted with. "There wasn’t a clear link between what they were reading on Facebook from that message and the content they interacted with," Magliulo told the outlet.
Facebook won't go into much detail on why a particular post was labeled as misinformation. According to the company, doing so risks re-exposing users to the false claims, and it also doesn't want to shame the person who originally posted them.
While this feels like a step in the right direction, the fact is that COVID-19 misinformation is already running rampant on Facebook, and a lot of damage has already been done.