Cleaning up the News Feed of a network with more than 2 billion users is an ongoing process, so just where is Facebook in its efforts to clean up content that violates its rules and downplay fake news? Facebook has shared an updated look at how the company is working to clean up the News Feed, along with improvements coming to Messenger and Instagram. The updates feed more data into determining the quality of links, expand third-party fact-checking, add trust indicators, and let Messenger users know when an account is verified.
The list of updates, both recent and upcoming, continues Facebook’s strategy of “remove, reduce, and inform.” Facebook explains that content that goes against its Community Standards is removed, while content that doesn’t violate those rules but draws frequent complaints (like clickbait) is downplayed in the news feed. The inform strategy adds tools that help users make sense of what they are seeing.
Facebook is also rolling out a resource that lets users track changes to the platform’s Community Standards. The standards will be revisited regularly and updated based on expert recommendations, trends, discussion, and other factors, and the new section will make those updates easy to follow.
Facebook will also begin building collaborative tools that help fight fake news. The network is currently working with experts to develop a system that will surface fake news faster without introducing bias. The existing third-party fact-checking program is also growing, with the Associated Press stepping up efforts against misinformation in videos, as well as in Spanish-language content.
A new ranking signal, Click-Gap, looks at a website’s inbound and outbound links, comparing how often the site is clicked on Facebook with how often it is reached from the rest of the web. Websites that draw a disproportionate share of their clicks from Facebook will be demoted in the news feed; the network says a wide click gap can suggest low-quality content.
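To make the idea concrete, here is a rough sketch of how a click-gap-style signal could be computed. Facebook has not published its actual formula, so the function names, inputs, and the demotion threshold below are all hypothetical illustrations of the ratio the article describes, not the real implementation.

```python
# Rough sketch of a click-gap-style signal: compare the clicks a domain
# gets from Facebook with the clicks it gets from the rest of the web.
# All names, inputs, and the threshold are hypothetical; Facebook's
# actual ranking formula is not public.

def click_gap(facebook_clicks: int, web_clicks: int) -> float:
    """Ratio of on-platform clicks to off-platform clicks.

    A value well above 1.0 means the domain gets disproportionately
    more of its traffic from Facebook than from the rest of the web,
    which the signal treats as a hint of low-quality content.
    """
    # Guard against division by zero for domains with no off-platform traffic.
    return facebook_clicks / max(web_clicks, 1)

def should_demote(facebook_clicks: int, web_clicks: int,
                  threshold: float = 5.0) -> bool:
    """Flag a domain for demotion when its click gap exceeds a cutoff."""
    return click_gap(facebook_clicks, web_clicks) > threshold

# Example: a site with 50,000 clicks from Facebook but only 2,000 from
# the rest of the web has a gap of 25.0 and would be flagged.
print(should_demote(50_000, 2_000))  # True
```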
Facebook also recently added “Trust Indicators” to the “i” context button that accompanies posts. The indicators, created by a group of news organizations called the Trust Project, use factors like fact-checking practices, ethics statements, and corrections, as well as who owns and funds the publication. The Page Quality tab will also expand to include details on clickbait.
Other changes apply specifically to Groups. A new Group Quality feature gives administrators an overview of removed posts and flagged fake news. Groups that repeatedly share misinformation will also be demoted in the news feed. Users who leave a Group can now delete their posts and comments as well, with access to the deletion option even after leaving.
Messenger is also gaining a badge for verified accounts, beginning this week. Mass messages are now easier to spot thanks to a label indicating when a message was forwarded, a change launched earlier this year. Expanded Messenger settings give users more control over who can message them, and the block tool will see improvements as well.
For Instagram, the company says inappropriate posts that don’t violate community standards will be excluded from the Explore page as well as hashtag pages.