In yet another move to combat misinformation on the platform, Facebook is now calling out pages that repeatedly spread fake news.
If you try to like such a page, you will see a pop-up saying that the page has "repeatedly shared false information," and that "independent fact-checkers said the information is false." You will then be presented with a choice of going back to the previous page or following the page anyway.
There will also be a "learn more" link explaining why the page has been labeled as such, as well as another "learn more" link with more information on Facebook's fact-checking program.
The company also said it would expand penalties for individual Facebook accounts that repeatedly share misinformation, meaning other users will see less of their content in their News Feed.
Finally, Facebook has redesigned the notifications that pop up when users share content that fact-checkers have labeled as false. The notification will now include the fact-checker's article that explains why the post is misleading, together with an option to share that article. Users will also be notified that posts from accounts that repeatedly share fake news will be positioned lower in the News Feed, making it less likely for other users to see them.
In the past couple of years, Facebook has introduced a number of measures to combat misinformation on the platform. These include message forwarding limits on Messenger, prompts encouraging users to read an article before sharing it, warning labels on fake news, and -- most famously -- blocking Donald Trump from using the platform. Despite these efforts, the company still has a long way to go before it can say it has really gotten rid of fake news.