The New York Times Magazine has an interesting story out this week about Facebook CEO Mark Zuckerberg, fake news and Facebook's role as the world's most prominent information distributor. It's all part of Facebook's ongoing public relations freak-out over the prevalence of fake news and hoaxes spread on the platform. The company is trying to fix the problem now, but it sure is funny to see Zuck constantly rolled out for a series of interviews on something he brushed off as a "crazy idea" just a few months ago.
Facebook and Zuckerberg's public stance on fake news, you see, evolved in three stages. First came outright dismissal; next, arm's-length pondering; and finally, a full-throated and multi-faceted war on the plague of hoaxes and falsehoods being disseminated on Facebook.
Zuck's first stab at the fake news problem — which flared up immediately after the election of Donald Trump — relied on a strategy he hoped would make it go away quickly. Well-sourced New York Times reporter Mike Isaac wrote on Twitter that, according to "feedback" he was getting, Zuck tried to "outsmart" the wave of outrage and fingers pointed at Facebook over its culpability in the fake news sausage-making machine.
This was echoed in Zuckerberg's own public statements. "Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way — I think is a pretty crazy idea," he said right after the US election. "Voters make decisions based on their lived experience. I do think there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is they saw some fake news. If you believe that, then I don't think you have internalized the message the Trump supporters are trying to send in this election."
It's apparent here just how badly Zuckerberg wanted the problem to go away — because this defence doesn't make a whole lot of sense. Nobody was alleging that people saw some revelatory fake news on News Feed and suddenly felt impassioned to vote for Donald Trump, or the other way around. The prevailing criticism at the time was that Facebook was eroding its users' shared sense of what is and isn't true, and that inflammatory fake news helped to harden political divides among them.
Facebook also released data in an attempt to downplay the issue — less than one per cent of content posted on Facebook was false, the company said in November. That may be true, but the company never gave any context for the number, such as how much of that one per cent users actually clicked on, or how often it got shared.
Then came the about-face. The magazine's story does a good job of encapsulating Zuckerberg's dramatic shift on Facebook's issue with fake news, or as the company publicly refers to it, "false news". One Zuck interview for the piece was done in January, during which he did what he seems to do in almost every interview: Read from memorised talking points.
He speaks quickly but often unloads full paragraphs of thought, and sometimes his arguments are so polished that they sound rehearsed, which happened often that morning. "2016 was an interesting year for us," he said as the three of us, plus a P.R. executive, sat around a couple of couches in the glass-walled conference room where he conducts many of his meetings.
At this point, we've entered Stage 2. As evidenced by Zuck's answers, he was more open to the idea of Facebook having a role in the fake news fiasco.
After the election, Zuckerberg offered a few pat defences of Facebook's role. "I'm actually quite proud of the impact that we were able to have on civic discourse over all," he said when we spoke in January. Misinformation on Facebook was not as big a problem as some believed it was, but Facebook nevertheless would do more to battle it, he pledged. Echo chambers were a concern, but if the source was people's own confirmation bias, was it really Facebook's problem to solve?
It was hard to tell how seriously Zuckerberg took the criticisms of his service and its increasingly paradoxical role in the world. He had spent much of his life building a magnificent machine to bring people together. By the most literal measures, he'd succeeded spectacularly, but what had that connection wrought?
But after this interview, the almighty gods at Facebook summoned the reporter, Farhad Manjoo, back for a second interview. This time, Zuckerberg had clarified his message.
Zuckerberg wanted Facebook to become a global news distributor that is run by machines, rather than by humans who would try to look at every last bit of content and exercise considered judgment. "It's something I think we're still figuring out," he told me in January. "There's a lot more to do here than what we've done. And I think we're starting to realise this now as well."
It struck me as an unsatisfying answer, and it later became apparent that Zuckerberg seemed to feel the same way. On a Sunday morning about a month after the first meeting, I got a call from a Facebook spokesman. Zuckerberg wanted to chat again. Could Mike and I come back on Monday afternoon?
We met again in the same conference room. Same Zuck outfit, same P.R. executive. But the Zuckerberg who greeted us seemed markedly different. He was less certain in his pronouncements than he had been the month before, more expansive and questioning. Earlier that day, Zuckerberg's staff had sent me a draft of a 5,700-word manifesto that, I was told, he spent weeks writing. The document, "Building Global Community," argued that until now, Facebook's corporate goal had merely been to connect people. But that was just Step 1. According to the manifesto, Facebook's "next focus will be developing the social infrastructure for community — for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all." If it was a nebulous crusade, it was also vast in its ambition.
Since its initial defence in November, Facebook has mounted an all-out assault on fake news. It launched the Facebook Journalism Project, hired former CNN host Campbell Brown to lead Facebook's news partnerships, and has partnered with Snopes and PolitiFact to flag fake stories.
All of this is pretty incredible when you consider that just a few months ago Zuck tried to kill the issue in its tracks with some deft PR manoeuvring. Because that's all the responsibility Facebook wanted to take: It was just a PR problem it could overcome, not an actual issue it wanted to own. It's good that Facebook is finally trying to fix the problem, but this saga is an important lesson in how Facebook operates as a well-oiled machine — one obsessed with controlling and moulding the narrative around the company and its many controversies.