Killer robots: Replacing troops with machines will hasten war, Amnesty warns

Segun Adewole

Published 2 November 2021

Global human rights advocacy group Amnesty International has warned that replacing troops with machines will make it easier for countries to decide to go to war.

Amnesty, which has launched a campaign with Stop Killer Robots, said it would be an assault on humanity if machines were allowed to make life-or-death decisions.

In a statement titled ‘Global: A critical opportunity to ban killer robots – while we still can,’ the human rights group warned that leading countries have developed technologies that would be disastrous if used on the battlefield.

Amnesty also lamented that negotiations on a new treaty to address the threat posed by killer robots are being stalled by some powerful states.

The statement read, “Amnesty International and the Stop Killer Robots campaign today unveiled a social media filter which provides a terrifying glimpse of the future of war, policing and border control. Escape the Scan, a filter for Instagram and Facebook, is part of a major campaign calling for a new international law to ban autonomous weapons systems. It uses augmented reality (AR) technology to depict aspects of weapons systems that are already in development, such as facial recognition, movement sensors, and the ability to launch attacks on ‘targets’ without meaningful human control.

“Several countries are investing heavily in the development of autonomous weapons, despite the devastating human rights implications of giving machines control over the use of force. In December, a group of UN experts will meet to decide whether to begin negotiating new international law on autonomy in weapons systems. Amnesty International and Stop Killer Robots have launched a petition calling on all governments to voice their support for negotiations.

Treaty to stop proliferation of killer robots

The statement quoted Amnesty International’s Senior Advisor on Military, Security and Policing, Verity Coyle, as calling for a legally binding international treaty to stop the proliferation of killer robots before it is too late.

“We are stumbling into a nightmare scenario, a world where drones and other advanced weapons can choose and attack targets without human control. This filter is designed to give people an idea of what killer robots could soon be capable of, and show why we must act urgently to maintain human control over the use of force,” Coyle said.

“Allowing machines to make life-or-death decisions is an assault on human dignity, and will likely result in devastating violations of the laws of war and human rights. It will also intensify the digital dehumanisation of society, reducing people to data points to be processed. We need a robust, legally binding international treaty to stop the proliferation of killer robots – before it’s too late.”

“We have had a decade of talks on autonomous weapons at the United Nations, but these are being blocked by the same states that are developing the weapons,” said Ousman Noor of the Stop Killer Robots campaign.

“The UN Secretary General, the International Committee of the Red Cross, Nobel Prize Winners, thousands of scientists, roboticists and tech workers, are all calling for a legal treaty to prevent these weapons – governments need to draw a line against machines that can choose to kill.”

Powerful states stalling talks on legal framework for weapons autonomy

On December 2, 2021, the Group of Governmental Experts to the Convention on Conventional Weapons will begin critical talks on whether to proceed with negotiations on a new treaty to address the threat posed by killer robots.

According to Amnesty, “So far 66 states have called for a new, legally binding framework on autonomy in weapons systems. But progress has been stalled by a small number of powerful states, including Russia, Israel and the US, who regard the creation of a new international law as premature.

“The replacement of troops with machines will make the decision to go to war easier. What’s more, machines can’t make complex ethical choices within the context of unpredictable battlefield or real world scenarios; there is no substitute for human decision making. We have already seen how technologies like facial, emotion, gait and vocal recognition fail to recognize women, people of colour and persons with disabilities; and how they cause immense human rights harms even when they “work”. Employing these technologies on the battlefield, in law enforcement or border control would be disastrous.

“Despite these concerns, countries including the US, China, Israel, South Korea, Russia, Australia, India, Turkey and the UK are investing heavily in the development of autonomous systems. For example, the UK is developing an unmanned drone which can fly in autonomous mode and identify a target within a programmed area. China is creating small drone “swarms” which could be programmed to attack anything that emits a body temperature, while Russia has built a robot tank which can be fitted with a machine gun or grenade launcher.”
