Facebook has admitted it was “far too slow to recognise” Russian election interference and the spread of fake news on the social networking site.
In a blog post on social networks and their influence, product manager Samidh Chakrabarti said that at its worst social media “allows people to spread misinformation and corrode democracy”.
Alongside the debate over interference in the 2016 US election, the social media giant is also investigating Russian influence on UK politics in connection with a parliamentary inquiry into fake news.
Mr Chakrabarti said Facebook did not have all the answers on how to combat misinformation, but the firm knew it had a duty to respond to it.
“I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t,” he said.
“That’s why we have a moral duty to understand how these technologies are being used and what can be done to make communities like Facebook as representative, civil and trustworthy as possible.”
He said Russia had used social media as an “information weapon” around the 2016 presidential election, working in part by promoting “inauthentic pages”, something Mr Chakrabarti said the site was now fixing.
“It’s abhorrent to us that a nation-state used our platform to wage a cyberwar intended to divide society,” he said.
“We’re working to make politics on Facebook more transparent. We’re making it possible to visit an advertiser’s page and see the ads they’re currently running.
“We’ll soon also require organisations running election-related ads to confirm their identities so we can show viewers of their ads who exactly paid for them. Finally, we’ll archive electoral ads and make them searchable to enhance accountability.”
On the subject of fake news and misinformation, Mr Chakrabarti said new ways to report false information and partnerships with third-party fact checkers were being used to curb its rise.
“In the public debate over false news, many believe Facebook should use its own judgment to filter out misinformation.
“We’ve chosen not to do that because we don’t want to be the arbiters of truth, nor do we imagine this is a role the world would want for us.
“Instead, we’ve made it easier to report false news and have taken steps in partnership with third-party fact checkers to rank these stories lower in News Feed. Once our fact checking partners label a story as false, we’re able to reduce future impressions of the story on Facebook by 80%.”
He also warned of the dangers of social media “echo chambers”, in which users see only viewpoints they already agree with, adding that the firm was testing a “related articles” feature that shows users other articles about the stories they are reading.
The company has also said it is hiring over 10,000 people to work on safety and security around the platform.
At the beginning of the year, Facebook founder Mark Zuckerberg said his aim for 2018 was to “fix” the site’s handling of abuse and misinformation.