DW report on cyber and autonomous weapons: “Future Wars – and How to Prevent Them”

In DW’s new 45-minute report “Future Wars – and How to Prevent Them,” Germany’s Foreign Minister Heiko Maas warns that an international technological arms race is underway: “We’re right in the middle of it.”

“We’re right in the middle of it. That’s the reality we have to deal with. (…) The world must take an interest in the fact that we’re moving towards a situation with cyber or autonomous weapons where everyone can do as they please. We don’t want that.”

The DW report, premiering on DW English on June 11, shows how artificial intelligence is making militaries faster, smarter and more efficient, and how the same technologies have the potential to destabilize the world. Worst-case scenarios include a cyber intrusion into a nuclear early-warning system or "flash wars" erupting when autonomous weapons systems react to one another.

“This is a race which cuts across the military and the civilian fields,” said Amandeep Singh Gill, former chair of the UN group of governmental experts on lethal autonomous weapons.

While political talks on controlling such weapons have repeatedly stalled as major powers pursue their own interests, international experts from the worlds of politics, diplomacy, intelligence, academia and activism spoke to DW about how the world could become a safer place.

When tensions in the Caucasus erupted into a war between Azerbaijan and Armenia in late 2020, autonomous weapons experts like Ulrike Franke from the European Council on Foreign Relations saw a watershed in warfare. “The really important aspect of the conflict in Nagorno-Karabakh, in my view, was the use of these loitering munitions, so-called ‘kamikaze drones’ – these pretty autonomous systems,” said Franke. Loitering munitions detect a target, fly into it and destroy it on impact with an on-board payload of explosives, hence the nickname.

“Since the conflict, you could definitely see a certain uptick in interest in loitering munitions,” said Franke. “We have seen more armed forces around the world acquiring or wanting to acquire these loitering munitions.”

Franke said: "Some actors may be forced to adopt a certain level of autonomy, at least defensively, because human beings would not be able to deal with autonomous attacks as fast as would be necessary." This critical factor of speed could even lead to "flash wars" erupting out of nowhere, with autonomous systems reacting to each other in a spiral of escalation.

Harvard Law School human rights law lecturer Bonnie Docherty is an architect of the Campaign to Stop Killer Robots, a high-profile alliance of NGOs demanding a global treaty to ban lethal autonomous weapons. "The overarching obligation of the treaty should be to maintain meaningful human control over the use of force," she told DW. "It should be a treaty that governs all weapons operating with autonomy that choose targets and fire on them based on sensor inputs rather than human inputs."

“Russia has been particularly vehement in its objections,” Docherty said. But it’s not alone. “Some of the other states developing autonomous weapon systems such as Israel, the US, UK and others have certainly been unsupportive of a new treaty.”

While Germany’s Foreign Minister has been a vocal proponent of a global ban, he does not support the Campaign to Stop Killer Robots. “We don’t reject it in substance – we’re just saying that we want others to be included,” Maas told DW. “Military powers that are technologically in a position not just to develop autonomous weapons but also to use them. We need to include them.”
