Facebook to take action against users repeatedly sharing misinformation

Facebook Inc. (FB.O) announced it would take “stronger” action against users who repeatedly share false material on the network.

Facebook said in a blog post that it would reduce the distribution of all posts in a user’s news feed if the user repeatedly shares content that has been rated false by one of the company’s fact-checking partners.

It added that it was also launching tools to notify users when they were interacting with content that had been fact-checked.

During the COVID-19 pandemic, false claims and conspiracy theories have spread widely on social media platforms such as Facebook and Twitter.

“Whether it’s false or misleading content about COVID-19 and vaccines, climate change, elections, or other topics, we’re making sure fewer people see misinformation on our apps,” the company said in a statement.

Facebook announced earlier this year that it had removed 1.3 billion fake accounts between October and December, ahead of a House Committee on Energy and Commerce hearing into how digital platforms are handling misinformation.

