Facebook has presented new ways to inform people when they are interacting with content that a fact-checker has rated as misinformation, and it is taking stronger action against people who repeatedly share misinformation on the platform. The company said it is working to ensure fewer people see misinformation about COVID-19 and vaccines, climate change, elections, and other topics.
“We want to give people more information before they like a Page that has repeatedly shared content that fact-checkers have rated, so you’ll see a pop up if you go to like one of these Pages,” Facebook announced.
In the pop-up window, Facebook users can click the “Find More” option to learn more about the misinformation published on that Page, and also to read more about Facebook’s fact-checking program.
“This will help people make an informed decision about whether they want to follow the Page,” explained Facebook.
The company said that since it launched its fact-checking program in late 2016, the focus has been on reducing viral misinformation. As of yesterday, it is reducing the distribution of all News Feed posts from an individual’s Facebook account if that person repeatedly shares content that has been rated by one of its fact-checking partners. Facebook notes that it already reduces the reach of individual debunked posts in News Feed.
Facebook has also redesigned the notifications people receive when they share fact-checked content, making it easier to understand when this happens. The notification includes the fact-checker’s article debunking the claim, as well as a prompt to share that article with their followers. It also includes a notice that people who repeatedly share false information may have their posts moved lower in News Feed, so other people are less likely to see them.