According to the latest announcement from Facebook, users who regularly share content labeled as fake news will see reduced distribution of their new posts.
Facebook tightens control of fake news. (Illustration)

Facebook has recently taken "strong" action against users who repeatedly share false information on its social platform. In the announcement, Facebook said it will reduce the reach of all of a user's News Feed posts if that person regularly shares content labeled as fake news by one of its fact-checking partners. Facebook will also notify users when they are interacting with content that has been reviewed by its fact-checking partners.

False information and conspiracy theories have recently proliferated on large social platforms such as Facebook and Twitter. Addressing this, Facebook stated: "Whether it's fake or misleading information about Covid-19, vaccines, climate change, elections or other topics, we are working to ensure that fewer and fewer people see that kind of information on our platforms."

Earlier this year, Facebook said it had removed 1.3 billion fake accounts between October and December of last year. The move comes ahead of a hearing by the US House of Representatives Energy and Commerce Committee on how technology platforms tackle the problem of disinformation.