Earlier this month, the United Kingdom’s top soldier said that in the future the British Army could be made up of robots working alongside humans in and around the frontline. While Chief of the Defence Staff Gen. Sir Nick Carter suggested that these robots and other autonomous platforms could fill roles that reduce the dependency on human soldiers, he did not suggest that they would be “terminator”-style fighting machines.
However, Russia and China have been working to develop autonomous fighting machines, and as other nations explore the possibility of armed robots, the trend has been viewed with alarm by groups such as Human Rights Watch, which has long opposed any “killer robots.” Along with other groups and leading academics, Human Rights Watch has called for a ban on autonomous platforms that have the ability to take a human life.
Last month Human Rights Watch renewed its call for a ban and released a twenty-five-page report, “New Weapons, Proven Precedent: Elements of and Models for a Treaty on Killer Robots,” which outlined the key elements for a future agreement. These included a call to maintain meaningful human control over the use of force and to prohibit weapons systems that operate without such control.
“International law was written for humans, not machines, and needs to be strengthened to retain meaningful human control over the use of force,” said Bonnie Docherty, senior arms researcher at Human Rights Watch, which coordinates the Campaign to Stop Killer Robots. “A new international treaty is the only effective way to prevent the delegation of life-and-death decisions to machines.”
The report was co-published with the Harvard Law School International Human Rights Clinic, where Docherty is associate director of armed conflict and civilian protection.
The report suggests that fully autonomous weapons could usher in a new era of armed conflict, comparable to the advent of air warfare or the proliferation of nuclear weapons. Among the concerns is that lethal autonomous weapons systems, or “killer robots,” could select and engage targets without meaningful human control. The report also identified legal and policy precedent for each of the proposed treaty elements.
“Killer robots present distinctive challenges but constructing a new treaty does not require starting from scratch,” Docherty added. “Existing international law and principles of artificial intelligence provide ample precedent showing that it is legally, politically, and practically possible to develop a new treaty on killer robots.”
The international non-governmental organization suggested that past treaties could serve as a basis for determining how autonomous systems could be regulated. These included the United Nations Convention on Certain Conventional Weapons (CCW), which was concluded in October 1980 and entered into force in December 1983. It sought to prohibit or restrict the use of certain conventional weapons considered excessively injurious or whose effects were deemed indiscriminate.
The convention covered landmines, booby traps, incendiary weapons, blinding laser weapons and the clearance of explosive remnants of war. While lethal autonomous weapons were not included, multiple groups have called for such killer robots to be addressed in a similar manner. However, Human Rights Watch noted that a new treaty does not have to be negotiated under the auspices of the CCW.
As more nations move forward with the development of autonomous weapons, critics of such platforms agree that now is the time to act.
“There’s no time to waste when it comes to preventing development of fully autonomous weapons,” Docherty added. “It’s crucial for governments to begin negotiations and swiftly adopt a new international ban treaty to retain meaningful human control over the use of force.”
Peter Suciu is a Michigan-based writer who has contributed to more than four dozen magazines, newspapers and websites. He is the author of several books on military headgear including A Gallery of Military Headdress, which is available on Amazon.com.