Drones over Ukraine: fears of Russian ‘killer robots’ have failed to materialise
Drones have played a starring role in Ukraine’s defence against the ongoing Russian attack. Before the invasion, experts believed Russia’s own fleets of “killer robots” would be a far more potent weapon, yet to date they have hardly been seen.
What’s going on? Ukraine’s drone program grew from a crowd-funded group of hobbyists, who appear to know and like their technology – even if it isn’t the cutting edge. Russia, on the other hand, seems to have swarms of next-generation autonomous weapons, but generals may lack faith in the technology.
Drone vs drone
Less is known about Russia’s drones, particularly new models with artificial intelligence (AI) capabilities. Last year, the Russian Ministry of Defence announced the creation of a special AI department with its own budget, which would begin its work in December 2021.
Just before invading Ukraine, Russian forces were seen testing new “swarm” drones, as well as unmanned autonomous weapons capable of tracking and shooting down enemy aircraft. However, there is no evidence they have been used in Ukraine for that purpose.
This isn’t the first time drones with lethal capabilities have featured on the world stage. Russia deployed “interceptor” drones to defend against hostile aircraft after it annexed Crimea in 2014, and in 2020 Azerbaijan used drones against Armenia during the Nagorno-Karabakh conflict. The US has also committed to providing Ukraine with its highly portable “suicide drone”, the Switchblade.
Are drones the future of warfare?
As warfare becomes more technologically advanced than ever before, AI-powered drones are reshaping how military power is projected.
As far back as 2017, Russian President Vladimir Putin said the development of AI raises “colossal opportunities and threats that are difficult to predict”, warning that “the one who becomes the leader in this sphere will be the ruler of the world”.
The Russian leader predicted future wars would be fought by drones, and “when one party’s drones are destroyed by drones of another, it will have no other choice but to surrender”.
Yet since Russia invaded Ukraine, it seems to be Ukrainian drones that are being used to greatest effect – predominantly by targeting Russian logistic elements supplying fuel or ammunition to frontline forces.
Ukrainian soldiers have reportedly been using drones bought off the shelf to locate Russian military targets and to help coordinate artillery strikes. Reports have even emerged of Ukrainian soldiers jury-rigging explosives to homemade drones before flying them at Russian tanks.
Footage of drone strikes is also proving a potent information weapon, with Ukrainian soldiers uploading clips to social media.
Where are Russia’s drones?
One possible reason for their absence is that the drones are being held in reserve for a later escalation of the conflict. Drones can deliver chemical, biological or even nuclear weapons without endangering a human pilot – and Russia’s current strategy suggests it may not shrink from using banned weapons.
Another possible reason is logistics. Given widespread reports of Russian military vehicles breaking down, Russia may not be able to support drone operations in Ukraine.
According to experts at the RAND Corporation, however, one of the biggest reasons may be a lack of trust in the technology.
Why is trust so important?
A lack of trust produces significant problems of its own. Researchers have long been aware of “automation bias”: the tendency to trust a machine’s decisions simply because they come from a machine. Misplaced trust in machines – especially ones making life-and-death decisions – can have catastrophic results.
One way to improve trust in military drones could be to limit them to simple roles. A drone acting simply as an airborne camera leaves little room for doubt about what it sees, whereas a drone that scans video footage to identify targets (what the military calls a “decision support system”) is far more likely to make a fatal mistake.
Another way to improve trust in drones is to avoid arming them with lethal weapons altogether, or to design them to disarm enemy soldiers rather than kill them. In 2007, John Canning, a researcher at the Naval Surface Warfare Center, suggested future autonomous weapons might target rifles or ammunition rather than the humans holding them.
In the age of autonomous warfare, the limit will be how far we trust machines. As lethal drones become more common and familiar, how satisfied are we that these drones will make the right decisions? To use these weapons we will need to trust them, but first we will need to make sure that trust is justified.
22 May 2022