A Brief History Of Stephen Hawking Being A Bummer

Stephen Hawking is at it again, saying it’s a “near certainty” that a self-inflicted disaster will befall humanity within the next thousand years or so. It’s not the first time the world’s most famous physicist has raised the alarm on the apocalypse, and he’s starting to become a real downer. Here are some of the other times Hawking has said the end is nigh — and why he needs to start changing his message.

Speaking to the Radio Times recently ahead of his BBC Reith Lecture, Hawking said that ongoing developments in science and technology are poised to create “new ways things can go wrong”. The scientist pointed to nuclear war, global warming and genetically engineered viruses as some of the most serious culprits.

“Although the chance of a disaster on planet Earth in a given year may be quite low, it adds up over time, becoming a near certainty in the next thousand or ten thousand years,” he was quoted as saying. “By that time we should have spread out into space, and to other stars, so it would not mean the end of the human race. However, we will not establish self-sustaining colonies in space for at least the next hundred years, so we have to be very careful in this period.”

Hawking is starting to sound like a broken record. Sure, he claims to be an optimist about humanity’s ingenuity in coming up with ways to control the dangers. But he has no problem coming up with all these ominously specific, horrible things that could happen to us in the future. He’s not wrong to highlight these risks, but when it comes to what we’re actually supposed to do about them, his answers are frustratingly simplistic and opaque, in sharp contrast to his predictions of doom.

Hawking’s warnings go back at least a decade. In 2006, he posted a question online asking,

In a world that is in chaos politically, socially and environmentally, how can the human race sustain another 100 years?

The comment touched a nerve, prompting more than 25,000 people to chime in with their personal opinions. A number of people expressed their disappointment with Hawking for failing to answer his own question. As one respondent wrote, “It is humbling to know that this question was asked by one of the most intelligent humans on the planet … without already knowing a clear answer.” To clarify, Hawking later wrote, “I don’t know the answer. That is why I asked the question.”

The following year, Hawking warned the audience at a news conference in Hong Kong that “life on Earth is at the ever-increasing risk of being wiped out by a disaster, such as sudden global nuclear war, a genetically engineered virus or other dangers we have not yet thought of”.

Some of Hawking’s biggest concerns have to do with AI, which he says could be “our worst mistake in history”. In 2014, Hawking, along with physicists Max Tegmark and Frank Wilczek, described the potential benefits of AI as being huge, but said we cannot predict what will happen once this power is magnified. As the scientists wrote:

One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand. Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.

Last year, Hawking pushed for responsible AI oversight, and he signed an open letter calling for a ban on “offensive autonomous weapons beyond meaningful human control”.

But not all of the dangers cited by Hawking are home-grown. In addition to asteroids and giant comets, Hawking has said we also need to worry about an alien invasion. As he told London’s Sunday Times back in 2010:

We only have to look at ourselves to see how intelligent life might develop into something we wouldn’t want to meet. I imagine they might exist in massive ships, having used up all the resources from their home planet. Such advanced aliens would perhaps become nomads, looking to conquer and colonize whatever planets they can reach… If aliens ever visit us, I think the outcome would be much as when Christopher Columbus first landed in America, which didn’t turn out very well for the Native Americans.

It’s evident from this and other quotes that Hawking has a particularly grim view of humanity. In the book Stephen Hawking: His Life and Work, he argued that computer viruses should be considered a new form of life: “Maybe it says something about human nature, that the only form of life we have created so far is purely destructive. Talk about creating life in our own image.”

And as Hawking likes to stress, we need to flee the sinking ship. To guarantee our long-term prospects, he has argued time and time again that we need to get off this planet and start colonizing other worlds, saying “we have no future if we don’t go into space.”

To be fair, Hawking is the world’s most famous scientist, so anything he says is bound to get extra media attention and scrutiny. And his ideas haven’t emerged from a vacuum (or a black hole, for that matter). Over the past 15 years, an increasing number of European scientists — many of them based in the UK — have become concerned about so-called “existential risks”. Once dismissed as the ruminations of alarmist Chicken Littles, the subject has now crept into academia and formal institutions.

Oxford philosopher Nick Bostrom kicked it all off in 2002 with his highly influential paper, “Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards”. Bostrom argued that accelerating technological progress is shifting our species into a dangerous — and potentially insurmountable — new phase, with emerging threats that “could cause our extinction or destroy the potential of Earth-originating life”. Since the paper’s publication, the term “existential risks” has steadily come into general use.

Sir Martin Rees giving a TED talk: Can we prevent the end of the world?

In 2003, esteemed physicist Sir Martin Rees published a book on the topic: Our Final Hour: A scientist’s warning: How terror, error, and environmental disaster threaten humankind’s future in this century, on Earth and beyond. Another influential book came in 2008, Global Catastrophic Risks, which was edited by Bostrom and Milan M. Cirkovic.

In Britain, the potential for existential risks is being studied by philosophers, scientists and futurists at Oxford’s Future of Humanity Institute, and at the University of Cambridge’s newly minted Centre for the Study of Existential Risk. The subject hasn’t really gained much traction elsewhere, though it is a concern of the US-based Institute for Ethics and Emerging Technologies.

So Hawking is clearly not alone. But he happens to occupy a unique position from which he can proclaim his warnings about the future and potentially influence our response. The problem is that such a highly respected figure keeps spouting these grim proclamations without also offering viable solutions. In doing so, he’s perpetuating a defeatist attitude, and even a certain degree of misanthropy.

Clearly, it’s important to get the word out before it’s too late, but it’s not enough to just be the bearer of bad news. Moving forward, let’s hope he can use that big brain of his to come up with something more productive.
[Telegraph, BBC]
Top image: Illustration by Jim Cooke, photo by AP; middle image: Lwp Kommunikáció

