Facebook-owned social media platform Instagram is testing the same fact-checking systems as its parent company in a bid to combat misinformation.
Facebook is making a vocal push into privacy and accuracy after a string of scandals in the past few years, including one that could see it fined up to $5 billion.
Facebook’s fact-checking partners, of which there are now more than 50 in 30 countries, review posts on the social network.
If a link, image, or video is rated as false, its future reach in the News Feed is restricted, and users are warned before sharing it.
The Poynter Institute, a US-based journalism advocate and administrator of the International Fact-Checking Network, reports that Instagram is now testing similar policies, using image-recognition technology to find content already flagged on Facebook.
The posts will then be hidden from hashtags and Instagram’s Explore tab, obscuring them from people who don’t already follow the accounts that posted them.
Instagram is also considering adding pop-ups when people search for topics rife with misinformation, such as the anti-vaccination movement.
Aaron Sharockman, executive director of the Poynter-owned fact-checker PolitiFact, said the move is a good step for the platform.
“Instagram is a place where a lot of people, particularly young people, get their news… it’s a space where misinformation can live.”
YouTube also began rolling out fact-checking features in March.