Fully autonomous killer robots could soon be used in warfare, experts warn

Fully autonomous killer robots could soon be deployed on the battlefield after rapid advances in drone technology in Ukraine, ushering in a terrifying new age of warfare.

Military and artificial intelligence experts say the longer the war lasts, the more likely it becomes that drones will be used to identify, select and attack targets without help from humans.

Russia has already been pummelling Kyiv with Shahed-136 drones supplied by Iran, wreaking terror with the Unmanned Aerial Vehicles (UAVs) which overwhelmed air defences.

Ukraine also fields semi-autonomous attack drones and AI-equipped counter-drone weapons. None of them is entirely autonomous yet, but that chilling prospect could soon become a reality.


Experts say it may be only a matter of time before either Russia or Ukraine, or both, deploy them, a revolution in military technology as profound as the introduction of the machine gun.

‘Many states are developing this technology,’ said Zachary Kallenborn, a George Mason University weapons innovation analyst. ‘Clearly, it’s not all that difficult.’

The sense of inevitability extends to activists, who have tried for years to ban killer drones but now believe they must settle for trying to restrict the weapons’ offensive use.

Ukraine’s digital transformation minister, Mykhailo Fedorov, agrees that fully autonomous killer drones are ‘a logical and inevitable next step’ in weapons development. He said Ukraine has been doing ‘a lot of R&D in this direction.’

‘I think that the potential for this is great in the next six months,’ Fedorov told The Associated Press in a recent interview.


Ukrainian rescuers work at the site of a residential building destroyed by a Russian drone strike

Firefighters work after a drone attack on buildings in Kyiv, with fully autonomous robots a near prospect

Ukrainian Lt. Col. Yaroslav Honchar, co-founder of the combat drone innovation nonprofit Aerorozvidka, said in a recent interview near the front that human war fighters simply cannot process information and make decisions as quickly as machines.

Ukrainian military leaders currently prohibit the use of fully independent lethal weapons, although that could change, he said.

‘We have not crossed this line yet – and I say “yet” because I don’t know what will happen in the future,’ said Honchar, whose group has spearheaded drone innovation in Ukraine, converting cheap commercial drones into lethal weapons.

Russia could obtain autonomous AI from Iran or elsewhere. The long-range Shahed-136 exploding drones supplied by Iran have crippled Ukrainian power plants and terrorised civilians but are not especially smart. Iran has other drones in its evolving arsenal that it says feature AI.

Without a great deal of trouble, Ukraine could make its semi-autonomous weaponized drones fully independent in order to better survive battlefield jamming, their Western manufacturers say.


Those drones include the U.S.-made Switchblade 600 and the Polish Warmate, which both currently require a human to choose targets over a live video feed. AI finishes the job. The drones, technically known as ‘loitering munitions,’ can hover for minutes over a target, awaiting a clean shot.

‘The technology to achieve a fully autonomous mission with Switchblade pretty much exists today,’ said Wahid Nawabi, CEO of AeroVironment, its maker. That will require a policy change – to remove the human from the decision-making loop – that he estimates is three years away.

Drones can already recognise targets such as armoured vehicles using catalogued images. 

But there is disagreement over whether the technology is reliable enough to ensure that the machines don’t err and take the lives of noncombatants.

The AP asked the defence ministries of Ukraine and Russia if they have used autonomous weapons offensively – and whether they would agree not to use them if the other side similarly agreed. Neither responded.

This undated photograph released by the Ukrainian military’s Strategic Communications Directorate shows the wreckage of what Kyiv has described as an Iranian Shahed drone downed near Kupiansk

A Switchblade 600 loitering missile drone manufactured by AeroVironment is displayed at the Eurosatory arms show in Villepinte, north of Paris

If either side were to go on the attack with full AI, it might not even be a first.

An inconclusive U.N. report suggested that killer robots debuted in Libya’s internecine conflict in 2020, when Turkish-made Kargu-2 drones in full-automatic mode killed an unspecified number of combatants.

A spokesman for STM, the manufacturer, said the report was based on ‘speculative, unverified’ information and ‘should not be taken seriously.’ He told the AP the Kargu-2 cannot attack a target until the operator tells it to do so.

Fully autonomous AI is already helping to defend Ukraine. Utah-based Fortem Technologies has supplied the Ukrainian military with drone-hunting systems that combine small radars and unmanned aerial vehicles, both powered by AI. 

The radars are designed to identify enemy drones, which the UAVs then disable by firing nets at them – all without human assistance.

The number of AI-endowed drones keeps growing. Israel has been exporting them for decades. Its radar-killing Harpy can hover over anti-aircraft radars for up to nine hours, waiting for them to power up.

Ukrainian soldiers shoot at a drone that appeared in the sky seconds before it fired on buildings in Kyiv

Other examples include Beijing’s Blowfish-3 unmanned weaponized helicopter. Russia has been working on a nuclear-tipped underwater AI drone called the Poseidon. 

The Dutch are currently testing a ground robot with a .50-caliber machine gun.

Honchar believes Russia, whose attacks on Ukrainian civilians have shown little regard for international law, would have used killer autonomous drones by now if the Kremlin had them.

‘I don’t think they’d have any scruples,’ agreed Adam Bartosiewicz, vice president of WB Group, which makes the Warmate.

AI is a priority for Russia. President Vladimir Putin said in 2017 that whoever dominates that technology will rule the world. 

In a December 21 speech, he expressed confidence in the Russian arms industry’s ability to embed AI in war machines, stressing that ‘the most effective weapons systems are those that operate quickly and practically in an automatic mode.’

Russian officials already claim their Lancet drone can operate with full autonomy.

‘It’s not going to be easy to know if and when Russia crosses that line,’ said Gregory C. Allen, former director of strategy and policy at the Pentagon’s Joint Artificial Intelligence Centre.


Switching a drone from remote piloting to full autonomy might not be perceptible. To date, drones able to work in both modes have performed better when piloted by a human, Allen said.

The technology is not especially complicated, said University of California-Berkeley professor Stuart Russell, a top AI researcher. 

In the mid-2010s, colleagues he polled agreed that graduate students could, in a single term, produce an autonomous drone ‘capable of finding and killing an individual, let’s say, inside a building,’ he said.

An effort to lay international ground rules for military drones has so far been fruitless. Nine years of informal United Nations talks in Geneva made little headway, with major powers including the United States and Russia opposing a ban. 

The last session, in December, ended with no new round scheduled.

Washington policymakers say they won’t agree to a ban because rivals developing drones cannot be trusted to use them ethically.

Toby Walsh, an Australian academic who, like Russell, campaigns against killer robots, hopes to achieve a consensus on some limits, including a ban on systems that use facial recognition and other data to identify or attack individuals or categories of people.

‘If we are not careful, they are going to proliferate much more easily than nuclear weapons,’ said Walsh, author of ‘Machines Behaving Badly.’ ‘If you can get a robot to kill one person, you can get it to kill a thousand.’

Scientists also worry about AI weapons being repurposed by terrorists. In one feared scenario, the U.S. military spends hundreds of millions writing code to power killer drones. Then it gets stolen and copied, effectively giving terrorists the same weapon.

To date, the Pentagon has neither clearly defined ‘an AI-enabled autonomous weapon’ nor authorised a single such weapon for use by U.S. troops, said Allen, the former Defence Department official. Any proposed system must be approved by the chairman of the Joint Chiefs of Staff and two undersecretaries.

That’s not stopping the weapons from being developed across the U.S. Projects are underway at the Defence Advanced Research Projects Agency, military labs, academic institutions and in the private sector.

The Pentagon has emphasised using AI to augment human warriors. The Air Force is studying ways to pair pilots with drone wingmen. 

A booster of the idea, former Deputy Defence Secretary Robert O. Work, said in a report last month that it ‘would be crazy not to go to an autonomous system’ once AI-enabled systems outperform humans – a threshold that he said was crossed in 2015, when computer vision eclipsed that of humans.

Humans have already been pushed out in some defensive systems. Israel’s Iron Dome missile shield is authorised to open fire automatically, although it is said to be monitored by a person who can intervene if the system goes after the wrong target.

Multiple countries, and every branch of the U.S. military, are developing drones that can attack in deadly synchronised swarms, according to Kallenborn, the George Mason researcher.

So will future wars become a fight to the last drone?

That’s what Putin predicted in a 2017 televised chat with engineering students: ‘When one party’s drones are destroyed by drones of another, it will have no other choice but to surrender.’


Jack Newman and Associated Press