A “lethal” weaponised drone “hunted down” and “remotely engaged” human targets without its handlers’ say-so during a conflict in Libya last year, according to a United Nations report first covered by New Scientist this week. Whether there were any casualties remains unclear, but if confirmed, the incident would likely mark the first recorded death caused by an autonomous killer robot.
In March 2020, a Kargu-2 attack quadcopter, which the report called a “lethal autonomous weapon system,” targeted retreating soldiers and convoys led by the Libyan National Army’s Khalifa Haftar during a civil conflict with Libyan government forces.
“The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability,” the UN Security Council’s Panel of Experts on Libya wrote in the report.
It remains unconfirmed whether any soldiers were killed in the attack, although the UN experts imply as much. The drone, which can be directed to self-destruct on impact, was “highly effective” during the conflict in question when used in combination with unmanned combat aerial vehicles, according to the panel. The battle resulted in “significant casualties,” it continued, noting that Haftar’s forces had virtually no defence against remote aerial attacks.
The Kargu-2 is a so-called loitering munition that uses machine learning algorithms and real-time image processing to autonomously track and engage targets. According to its Turkish manufacturer, STM, it’s specifically designed for asymmetric warfare and anti-terrorist operations and has two operating modes, autonomous and manual. Several units can also be linked together to form a swarm of kamikaze drones.
Zachary Kallenborn, a research affiliate with the Unconventional Weapons and Technology Division of the National Consortium for the Study of Terrorism and Responses to Terrorism, said this incident could mark a terrifying turning point in global warfare. Writing for the Bulletin of the Atomic Scientists, he called the Kargu-2’s deployment “a new chapter in autonomous weapons, one in which they are used to fight and kill human beings based on artificial intelligence.”
Meaning you can add “flying killer robots” to your list of plausible fears that science fiction predicted.
Several human rights watchdogs and non-governmental organisations have petitioned for a global ban on lethal autonomous weapons systems. However, a coalition of UN members, including the U.S., has fiercely argued that preemptive legal regulations aren’t necessary given our current technology’s limitations, effectively stalling any progress on the issue.