-
These 23 Principles Could Help Us Avoid An AI Apocalypse
Science fiction author Isaac Asimov famously predicted that we’ll one day have to program robots with a set of laws that protect us from our mechanical creations. But before we get there, we need rules to ensure that, at the most fundamental level, we’re developing AI responsibly and safely. At a recent gathering, a group…
-
These Are The Most Serious Catastrophic Threats Faced By Humanity
Oxford’s Global Priorities Project has compiled a list of catastrophes — both natural and self-inflicted — that could kill off 10 per cent or more of the human population. It’s a real buzzkill of a report, and it warns that any of these catastrophes could happen within the next five years.