Following internal questioning from employees before last Tuesday's election and executive-level rumination after it, Facebook CEO Mark Zuckerberg has made his official statement on whether the social network that is "not a media company" adversely influenced the election. His conclusion: nope, it didn't.
As the New York Times reported yesterday, "several vice presidents and executives of the social network" began questioning the role that Facebook played in this year's presidential election almost immediately after it was over. Sources who wished to remain anonymous said that "top executives concluded that they should address the issue and assuage staff concerns at a quarterly all-hands meeting."
Last night, Zuckerberg attempted to address the public's concerns with a note on his page. After acknowledging the controversy and questions over FB's responsibility to prevent the spread of fake news and hoaxes, he said:
Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.
Already, it would seem, Zuckerberg is claiming that the company has internal tools that can distinguish real news from fake. If the company has no way of determining the legitimacy of news published through its platform, then that number is a complete fiction.
He went on to explain that they are looking into tools that would be able to make the judgments that he himself just said the company has already made:
That said, we don't want any hoaxes on Facebook. Our goal is to show people the content they will find most meaningful, and people want accurate news. We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here. We have made progress, and we will continue to work on this to improve further.
He then described his fear of Facebook becoming the arbiter of truth.
While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.
Here a truly difficult issue arises from Facebook's refusal to acknowledge its status as a media company. Over time, the public has learned which media outlets it wants to trust and which it doesn't. Sometimes those outlets are worthy of trust and sometimes they aren't. Some mix opinion with cited sources; others just report the straight facts they have uncovered. There are also outlets that write whatever they want, without sources or facts.
Facebook, on the other hand, delivers a steady stream of outlets no one has ever heard of but might trust because, among other things, a friend said they should. Many will not read the story at all but will see a headline like "FBI AGENT SUSPECTED IN HILLARY EMAIL LEAKS FOUND DEAD IN APPARENT MURDER-SUICIDE" and let it seep into the way they see the world. That particular assertion came from The Denver Guardian, a site that very carefully constructs fake news in a way that isn't instantly recognisable.
Facebook's position is that it doesn't want to determine what the truth is; it only wants to control what people see, relying on algorithms that favour engagement over quality. A story about an FBI agent involved in Hillary's email investigation being murdered is hot, and people will engage. The fact that no one else is reporting it only drives them to engage further, with comments like, "Why is no one else reporting this?! This is an outrage!!"
But Zuckerberg does acknowledge Facebook's influence in the election when it suits him:
Overall, I am proud of our role giving people a voice in this election. We helped more than 2 million people register to vote, and based on our estimates we got a similar number of people to vote who might have stayed home otherwise. We helped millions of people connect with candidates so they could hear from them directly and be better informed. Most importantly, we gave tens of millions of people tools to share billions of posts and reactions about this election. A lot of that dialog may not have happened without Facebook.
When it comes to favourable subjects like dialogue and voter registration, he asserts, Facebook is a powerful tool in the political machine.
The internet is an unprecedented force of communication and delivery of information. Sometimes that information is the news that keeps us informed. Facebook's strategy has always been to tear through the major facets of the internet and take them over for its own profit. All the way back in 2007, Fred Vogelstein wrote for Wired that Zuckerberg "has transformed his company from second-tier social network to full-fledged platform that organizes the entire Internet."
When a young Zuckerberg visited his alma mater, Exeter, in 2007, a blog post on the school's website recounted a Q&A session that Zuckerberg held at an assembly:
Zuckerberg also explained why he has turned down offers to buy out Facebook. "It's not because of the amount of money. For me and my colleagues, the most important thing is that we create an open information flow for people. Having media corporations owned by conglomerates is just not an attractive idea to me."
From the beginning, it would seem, Zuckerberg's plan was to take down "media corporations owned by conglomerates" in favour of an "open information flow for people." At this point, what is happening certainly seems hostile to established media, and by algorithmically manipulating the news feed, the company is actively working against the idea of an open flow of information. Additionally, by allowing sponsored posts and advertisements masquerading as news stories, or fake news stories masquerading as advertisements, it has further throttled the flow of quality information.
Doc Searls, an information and technology researcher, pointed out on his personal Harvard blog that the ads served to him directly next to Mark Zuckerberg's note were indeed for fake news sources. He wrote:
Besides being false and misleading clickbait, they are not from espn.com. They're from http://espn.com-magazines.online, and bait for a topic switch: to pitching a diet supplement called Alpha Fuel. The pages with the pitches are made to look like ESPN ones, logos and all. But they're fake. I would think it can't be too hard to prevent this kind of obviously dishonest and misleading shit. It's also been going on for some time. See here. Why hasn't it been stopped?
This is a clear example of something Facebook could correct but has, for the moment, chosen not to. Is Mark Zuckerberg trying to say that this academic researcher wants to receive fake news based on the sites he likes and the friends he connects with? If so, the algorithm could certainly use some tweaking, because Searls considers it "obviously dishonest and misleading shit."
More than 1.04 billion people in the world use Facebook. That's an audience no media company on Earth has. The social network has obfuscated on the question of responsibility, saying, "No, we are a tech company, not a media company" -- the argument being that media is just one of many services it bundles and that it doesn't directly produce the content. As Gizmodo's J.K. Trotter put it, "No lie or falsehood or hoax is more consequential than Facebook's belief that it is not a media company, and thus can shirk the responsibilities of one -- beginning with a basic fidelity to the truth."
Facebook isn't just any media company; it is the world's largest, and it is currently being quite reckless with its unprecedented position.