Mark Zuckerberg needs to hand over the keys, a team of researchers argues. In a new paper published in the peer-reviewed journal PNAS (Proceedings of the National Academy of Sciences of the United States of America), a diverse group makes the case that the social media problem rises to the level of urgency that climate change presents. At this point, it seems ludicrous not to muster all available resources to check a system that has fomented genocidal violence, platformed an insurrection, increased vaccine resistance during a pandemic, and imperiled asylum seekers, to name a few.
Seventeen academics — misinformation researchers, tech ethicists, climate scientists, biologists, psychological theorists, and anthropologists — write that we should treat the study of the social network disaster as a “crisis discipline” akin to climate change and public health. A crisis discipline demands urgent cross-disciplinary collaboration to understand and address the problem, drawing on lab and field work, global climate modelling, mathematical prediction, and ecological models. They list a few good things about dispersed social collaboration that works (Wikipedia…) and social media’s potential (promoting “the voices of historically disenfranchised groups”). That doesn’t much resemble the actual aftermath, they remind us, where amplified misinformation and paranoia pose a serious threat in a world already facing a climate crisis, the threat of nuclear war, a pandemic, racism, hatred, famine, inequality, etc.
Tell me something I haven’t gotten from documentaries and podcasts and books and daily blogs and political squabbles, you say. So they focus less on specific crimes and catastrophes, asking us instead to consider social media from an evolutionary standpoint. They liken the network to “collective behaviour,” akin to locusts devouring everything in their path rather than hunter-gatherer bands of maybe 100 people. The leaderless structures that make that drone-like behaviour possible are “complex systems,” such as the global economy. The potential for disaster grows exponentially along with the system: “When perturbed, complex systems tend to exhibit finite resilience followed by catastrophic, sudden, and often irreversible changes in functionality,” they write.
They argue that this shit just doesn’t work with a for-profit model that algorithmically prioritises the spread of emotion-driven content and herds people into echo chambers where noise equals attention. Because there’s little incentive for companies to share exactly what they’re doing, or to change the model, they write: “This raises the possibility that some business models may be fundamentally incompatible with a healthy society.” In other words, unplug Facebook. “Decisions that impact the structure of society should not be guided by voices of individual stakeholders but instead by values such as nonmaleficence, benevolence, autonomy, and justice.”
Taking this to its next logical step, they suggest elevating the social media architect to a solemn, respected post, one requiring something like a Hippocratic oath.
Zuckerberg isn’t pledging to do anything anytime soon. The researchers’ more immediately actionable proposal would combine behavioural science with a macro understanding of algorithmic manipulation, reasoning that we “lack the scientific framework we would need to answer even the most basic questions that technology companies and their regulators face.” It’s not that we lack case studies; the international human rights group Avaaz has developed its own tools to study billions of algorithmically boosted instances of misinformation-sharing on Facebook, including catching the network’s failure to identify Steve Bannon’s entirely foreseeable astroturf mission.
A body of literature would at least leave fewer easy-outs for tech CEOs, who’ve weaseled out of hearings with weak apologies, a couple of promises, and inscrutable nonsense.