He may be the most beleaguered and mistrusted CEO of any major tech company, but if an enormous new profile of Mark Zuckerberg makes one thing clear, it’s that he’s not as dumb as Twitter’s CEO.
If you need to catch up on the trials and tribulations of Zuck, the New Yorker has a 14,000-word profile that takes you down memory lane and offers some fresh insight into the evolving public character of the Facebook founder. If you think Zuckerberg is an out-of-touch guy who struggles with basic human understanding and lives in denial of the enormous responsibilities his company carries, the piece probably won’t change your mind much. But at least he made it through this one without appearing to give Holocaust deniers a pass.
At one point, the subject of Facebook’s recent decision to ban Alex Jones and several of his pages related to Infowars comes up, and he essentially goes on record saying the decision was made because Apple did it first.
Jones has become the poster child of the debate around speech, misinformation, and abuse on social media over the course of the summer. As a professional conspiracy theorist, Jones has a sizable audience that is largely driven by social media. One theory he’s pushed in the past is that the Sandy Hook school shooting that claimed the lives of 20 children and six adults was a hoax. Parents of the victims have experienced years of harassment from Jones’ followers who believe they are “crisis actors,” and the parents are currently suing Jones. Over the last few months, calls for social media companies to remove him from their platforms for violating their policies on harassment, hate speech, and misinformation reached a fever pitch.
In July, the parents of one victim of the shooting wrote an open letter directly to Mark Zuckerberg detailing the harassment they’ve experienced and asking him to take action. At first, Facebook flirted with a suspension and made excuses about its terms of service. Then Apple stepped up and banned Jones’ podcasts from its platform. That sparked a wave of “courage” among tech companies, and basically everyone but Twitter proceeded with their own bans. In today’s profile, Zuckerberg said Apple prompted, or at least accelerated, the decision to go for a full ban:
I asked Zuckerberg why Facebook had wavered in its handling of the situation. He was prickly about the suggestion: “I don’t believe that it is the right thing to ban a person for saying something that is factually incorrect.”
Jones seemed a lot more than factually incorrect, I said.
“O.K., but I think the facts here are pretty clear,” he said, homing in. “The initial questions were around misinformation.” He added, “We don’t take it down and ban people unless it’s directly inciting violence.” He told me that, after Jones was reduced, more complaints about him flooded in, alerting Facebook to older posts, and that the company was debating what to do when Apple announced its ban. Zuckerberg said, “When they moved, it was, like, O.K., we shouldn’t just be sitting on this content and these enforcement decisions. We should move on what we know violates the policy. We need to make a decision now.”
For his part, Dorsey reportedly banned Jones, citing videos of his IRL harassment of CNN reporter Oliver Darcy as a violation of Twitter’s terms of service.
Zuckerberg doesn’t look great in this situation, either. Apple decides what can and can’t be on its platform, and gives itself a lot of wiggle room in its terms of service. Facebook often pretends it’s hamstrung by its policies, but it’s a private company just like Apple, with all the same options at its disposal. The fact that Zuckerberg says he decided the company “should move on what we know violates the policy” is a blatant admission that he could have acted sooner. It looked even worse for Dorsey to finally make a move after he personally experienced Alex Jones screaming in his face. In both cases, it demonstrated that real enforcement of policies comes at the whim of a couple of dudes.
For a while, Twitter tried to say that Jones hadn’t violated its policies either, but CNN collected numerous examples demonstrating that wasn’t true at all. These companies just don’t like making decisions on what can and can’t be said by a notable figure. It’s an understandable feeling, because doing so invites criticism that they are too large, have too much power, and pose too much danger as the gatekeepers of speech. It’s the kind of predicament that’s tough to reconcile, but having billions of dollars certainly helps you ignore it.
The New Yorker can spill thousands of words probing Zuckerberg’s psyche and speaking to colleagues about how he’s growing into his unprecedented role of social media Pope to 2.2 billion users, but it’s still the same Zuckerberg who would apparently rather think about scaling and “community” than the real-world consequences his company might be involved in.
Facebook has been aware of its role in violence and ethnic cleansing in Myanmar since at least 2014. It entered a market that it knew little about, where traditional media to inform the public was extremely limited, and found that it had built the perfect weapon for organising mob violence and propaganda. We’ve seen similar situations in Sri Lanka, Libya, the Philippines, and India. One Sri Lankan official characterised the situation to the New York Times, “The germs are ours, but Facebook is the wind.”
But Zuckerberg keeps repeating the same talking points about being “slow” to recognise the problem and how it’s going to take time to fix it. He told the New Yorker that he plans to have 100 people working on translation and moderation in Myanmar by the end of the year. The fact that a company can connect 2 billion people in a little over a decade but can’t hire 100 people over the course of a few years is telling. But the real issue is scale, and the inability of current technology to keep up with that scale.
Time and again, Zuckerberg talks about needing time to build the right systems to address moderation issues. Last week, he posted an update to his Facebook page saying that his personal mission to “fix” Facebook will likely continue through the end of 2019. What he really means is that the algorithms and machine learning systems that will apparently one day moderate content won’t be ready for a while, and that the technology sector, in general, moves relatively slowly.
Facebook and Twitter like to say that you control what you see on their platforms even as algorithmic choices are being made for you every second, and they want to flip that dynamic so they can say that algorithms control what content gets taken down. Unfortunately, the breakneck pace of technological growth is what got the tech giants into this mess, and technology’s actual sluggishness is now preventing them from excusing their way out of it.