Facebook has been fighting the fake news problem for a long time. The social media giant has rolled out several new features and added more safeguards to battle fake news on its platform, but the problem has not diminished. Now the company is adding a publisher info button to links to fight fake news.

Facebook thinks giving users info about publishers will lessen fake news

The social networking site evidently believes that showing users the Wikipedia entries about publishers, along with some related articles, will give them more context about the links they see. Recently, the company began testing a new "i" button on News Feed links. The button opens an informational panel about the publisher. Sara Su, a Facebook product manager, told the news site TechCrunch, "People have told us that they want more information about what they’re reading."

Su added, "They want better tools to help them understand if an article is from a publisher they trust and evaluate if the story itself is credible." The panel will display a link to the publisher's full profile, which should help users better gauge the publisher's reputation and the credibility of the news content, and decide whether the publisher is authentic or fake. If the publisher has no Wikipedia page, the info will be missing, which itself gives users a clue about whether the content is legitimate.

The button will also surface Related Articles on all links, not just on the more popular articles. If the article is part of a Trending topic, trending information may appear as well. Previously, the social network showed Related Articles only on some occasions, and displayed them on links without requiring an extra click.

What about fake content on Wikipedia?

The new change is part of the social media giant's ongoing initiative to improve content integrity. Su says, "This work reflects feedback from our community, including publishers who collaborated on the feature development as part of the Facebook Journalism Project."

TechCrunch reports that a Facebook spokesperson, when asked about the risk of Wikipedia entries being doctored with fake information, said, "Vandalism on Wikipedia is a rare and unfortunate event that is usually resolved quickly. We count on Wikipedia to quickly resolve such situations and refer you to them for information about their policies and programs that address vandalism."

The social network also said that Related Articles will cover the same topic and come from a wide variety of publishers that regularly post news content on Facebook and see high engagement from their communities. Su added, "As we continue the test, we’ll continue listening to people’s feedback to understand what types of information are most useful and explore ways to extend the feature. We will apply what we learn from the test to improve the experience people have on Facebook, advance news literacy, and support an informed community."

 
