Facebook is tightening its policies on the sharing of false information and will now begin taking steps to call out repeat offenders.
On Wednesday, the company said in a statement that it would begin flagging pages that continue to share misinformation.
“We want to give people more information before they like a Page that has repeatedly shared content that fact-checkers have rated, so you’ll see a pop up if you go to like one of these Pages,” the company explained of its new policy. “You can also click to learn more, including that fact-checkers said some posts shared by this Page include false information and a link to more information about our fact-checking program.”
Facebook said that this new pop-up will help users make more informed decisions about whether they want to follow a page. The social media giant added that it will expand its penalties for individual accounts that share false information.
“We will reduce the distribution of all posts in News Feed from an individual’s Facebook account if they repeatedly share content that has been rated by one of our fact-checking partners,” the company said. “We already reduce a single post’s reach in News Feed if it has been debunked.”
In addition, Facebook will now notify users when they share content that a fact-checker has deemed false, and will prompt them to share an updated article with the correct information. People who repeatedly share false information will have their posts moved lower in News Feed, where content is less likely to be seen.
The updated policies are the latest action taken by Facebook to curb misinformation on the site. Earlier this month, the company announced that it would begin testing a prompt to ensure that users properly understand news articles before sharing them on their feed. Twitter introduced a similar measure last year.