It’s no secret that Facebook has a fake news problem. Critics have accused the social network of allowing false and hoax news stories to run rampant, with some suggesting that Facebook contributed to Donald Trump’s election by letting hyper-partisan websites spread false and misleading information.
Mark Zuckerberg has addressed the issue twice since Election Day, most notably in a carefully worded statement that reads: “Of all the content on Facebook, more than 99 per cent of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics.”
Still, it’s hard to visit Facebook without seeing phoney headlines like “FBI Agent Suspected in Hillary Email Leaks Found Dead in Apparent Murder-Suicide” or “Pope Francis Shocks World, Endorses Donald Trump for President, Releases Statement” promoted by no-name news sites like the Denver Guardian and Ending The Fed.
Gizmodo has learned that the company is, in fact, concerned about the issue, and has been having a high-level internal debate since May about how the network approaches its role as the largest news distributor in the US. The debate includes questions over whether the social network has a duty to prevent misinformation from spreading to the 44 per cent of Americans who get their news from the social network.
According to two sources with direct knowledge of the company’s decision-making, Facebook executives conducted a wide-ranging review of products and policies earlier this year, with the goal of eliminating any appearance of political bias. One source said high-ranking officials were briefed on a planned News Feed update that would have identified fake or hoax news stories, but would have disproportionately impacted right-wing news sites by downgrading or removing their content from people’s feeds. According to the source, the update was shelved and never released to the public. It’s unclear whether the update had other deficiencies that caused it to be scrubbed.
“They absolutely have the tools to shut down fake news,” said the source, who asked to remain anonymous, citing fear of retribution from the company. The source added that “there was a lot of fear about upsetting conservatives after Trending Topics,” and that “a lot of product decisions got caught up in that.”
In an emailed statement, Facebook did not answer Gizmodo’s direct questions about whether the company built a News Feed update that was capable of identifying fake or hoax news stories, nor whether such an update would disproportionately impact right-wing or conservative-leaning sites. Instead, Facebook said it “did not build and withhold any News Feed changes based on their potential impact on any one political party”. The full statement:
We did not build and withhold any News Feed changes based on their potential impact on any one political party. We always work to make News Feed more meaningful and informative, and that includes examining the quality and accuracy of items shared, such as clickbait, spam and hoaxes. Mark himself said, “I want to do everything I can to make sure our teams uphold the integrity of our products.” This includes continuously reviewing updates to make sure we are not exhibiting unconscious bias.
A New York Times report published Saturday cited conversations with current Facebook employees and stated that “The Trending Topics episode paralysed Facebook’s willingness to make any serious changes to its products that might compromise the perception of its objectivity”. Our sources echoed that sentiment, with one saying Facebook had an “internal culture of fear” following the Trending Topics episode.
The sources are referring to a controversy that started in May, when Gizmodo published a story in which former Facebook workers revealed that the trending news section was run by human “curators” exercising editorial judgement, rather than populated by an algorithm, as the company had earlier claimed. One former curator said that they routinely observed colleagues suppressing stories on conservative topics. Facebook denied the allegations, then later fired its entire trending news team. The layoffs were followed by several high-profile blunders, in which the company allowed fake news stories (or hoaxes) to trend on the website. One such story said that Fox News had fired Megyn Kelly for being “a closet liberal who actually wants Hillary to win”.
After Gizmodo’s stories were published, Facebook vehemently fought the notion that it was hostile to conservative views. In May, Mark Zuckerberg invited several high-profile conservatives to a meeting at Facebook’s campus, and said he planned to keep “inviting leading conservatives and people from across the political spectrum to talk with me about this and share their points of view”. Joel Kaplan, Facebook’s vice president of global public policy, emphasised in a post that Facebook was “a home for all voices, including conservatives”.
“There was a lot of regrouping,” the source told Gizmodo, “and I think that it was the first time the company felt its role in the media challenged.”
As Facebook scrambled to do damage control, the company continued to roll out changes to News Feed, which weighs thousands of factors to determine which stories users see most frequently. In June, the company rolled out several updates to prioritise updates from friends and family and downgrade spam. But according to one source, a third update — one that would have down-ranked fake news and hoax stories in the News Feed — was never publicly released.
Facebook has addressed its hoax problem before. In a January 2015 update, the company promised to show fewer fake news stories by giving users a tool to report fake stories in their feeds. It wrote:
The strength of our community depends on authentic communication. The feedback we’ve gotten tells us that authentic stories are the ones that resonate most. That’s why we work hard to understand what type of stories and posts people consider genuine — so we can show more of them in News Feed. And we work to understand what kinds of stories people find misleading, sensational and spammy, to make sure people see those less.
Facebook’s efforts have had mixed results. Earlier this year, BuzzFeed News studied thousands of fake news posts published on Facebook, and found that while the average engagement on fake posts fell considerably from January 2015 to December 2015, the reach of fake posts skyrocketed in 2016, during the lead-up to the presidential election. (A Facebook spokesperson told BuzzFeed that “we have seen a decline in shares on most hoax sites and posts,” but declined to produce specific numbers.) Another BuzzFeed investigation this spring found that a group of young Macedonian publishers were running huge networks of popular Facebook pages filled with fake conservative news, targeted at Trump supporters in the US, on websites such as TrumpVision365.com, USConservativeToday.com and USADailyPolitics.com.
“We can’t read everything and check everything,” Adam Mosseri, head of Facebook’s news feed, said in an August TechCrunch interview. “So what we’ve done is we’ve allowed people to mark things as false. We rely heavily on the community to report content.”
In a Facebook post published after the election, former Facebook product designer Bobby Goodlatte blamed the social network for boosting the visibility of “highly partisan, fact-light media”, and for not taking bigger steps to combat the spread of fake news in the lead-up to the election. “A bias towards truth isn’t an impossible goal for News Feed,” he wrote. “But it’s now clear that democracy suffers if our news feeds incentivise bullshit.”
Additional reporting by Kevin Roose.