AI begins ushering in an age of killer robots

The motorcycle’s growling engine was no match for the silent drone as it stalked Babenko. “Push, push more. Pedal to the metal, man,” his colleagues called out over a walkie-talkie as the drone swooped toward him. “You’re screwed, screwed!”

If the drone had been armed with explosives, and if his colleagues hadn’t disengaged the autonomous tracking, Babenko would have been killed.

Vyriy is just one of many Ukrainian companies working on a major leap forward in the weaponization of consumer technology, driven by the war with Russia. The pressure to outthink the enemy, along with huge flows of investment, donations, and government contracts, has turned Ukraine into a Silicon Valley for autonomous drones and other weaponry.

What the companies are creating is technology that makes human judgment about targeting and firing increasingly tangential. The widespread availability of off-the-shelf devices, easy-to-design software, powerful automation algorithms, and specialized artificial intelligence microchips has pushed a deadly innovation race into uncharted territory, fueling a potential era of killer robots.

The most advanced versions of the technology that allows drones and other machines to act autonomously have been made possible by deep learning, a form of AI that uses large amounts of data to identify patterns and make decisions. Deep learning has helped generate popular large language models, most prominently OpenAI’s GPT-4, but it also helps models interpret and respond in real time to video and camera footage. That means software that once helped a drone follow a snowboarder down a mountain can now become a deadly tool.

In more than a dozen interviews with Ukrainian entrepreneurs, engineers, and military units, a picture emerged of a near future when swarms of self-guided drones can coordinate attacks and machine guns with computer vision can automatically shoot down soldiers. More outlandish creations, such as a hovering unmanned copter that wields machine guns, are also being developed.

The weapons are cruder than the slick stuff of science-fiction blockbusters, like “The Terminator” and its T-1000 liquid-metal assassin, but they are a step toward such a future. While these weapons aren’t as advanced as expensive military-grade systems made by the United States, China, and Russia, what makes the developments significant is their low cost — just thousands of dollars or less — and ready availability.

Except for the munitions, many of these weapons are built with code found online and components such as hobbyist computers.

Some US officials said they worried that these capabilities could soon be used to carry out terrorist attacks.

For Ukraine, the technologies could provide an edge against Russia, which is also developing autonomous killer gadgets — or simply help it keep pace. The systems raise the stakes in an international debate about the ethical and legal ramifications of AI on the battlefield. Human rights groups and United Nations officials want to limit the use of autonomous weapons for fear that they may trigger a new global arms race that could spiral out of control.

In Ukraine, such concerns are secondary to fighting off an invader.

“We need maximum automation,” said Mykhailo Fedorov, Ukraine’s minister of digital transformation, who has led the country’s efforts to use tech startups to expand advanced fighting capabilities. “These technologies are fundamental to our victory.”

Autonomous drones like Vyriy’s have already been used in combat to hit Russian targets, according to Ukrainian officials and video verified by The New York Times. Fedorov said the government was working to fund drone companies to help them rapidly increase production.

Major questions loom about what level of automation is acceptable. For now, the drones require a pilot to lock onto a target, keeping a “human in the loop” — a phrase often invoked by policymakers and AI ethicists. Ukrainian soldiers have raised concerns about the potential for malfunctioning autonomous drones to hit their own forces. In the future, constraints on such weapons may not exist.

Ukraine has “made the logic brutally clear of why autonomous weapons have advantages,” said Stuart Russell, an AI scientist and professor at the University of California, Berkeley, who has warned about the dangers of weaponized AI. “There will be weapons of mass destruction that are cheap, scalable, and easily available in arms markets all over the world.”

Makeshift factories and labs have sprung up across Ukraine to build remote-controlled machines of all sizes, from long-range aircraft and attack boats to cheap drones — abbreviated as FPVs, for first-person view, because they are guided by a pilot wearing virtual-reality-like goggles that give a view from the drone. Many are precursors to machines that will eventually act on their own.

Efforts to automate FPV flights began last year but were slowed by setbacks in building flight control software, according to Fedorov, who said those problems had been resolved. The next step was to scale the technology with more government spending, he said, adding that about 10 companies were already making autonomous drones.

“We already have systems which can be mass-produced, and they’re now extensively tested on the front lines, which means they’re already actively used,” Fedorov said.

Often, battlefield demands pull together engineers and soldiers. Oleksandr Yabchanka, a commander in Da Vinci Wolves, a battalion known for its innovation in weaponry, recalled how the need to defend the “road of life” — a route used to supply troops fighting Russians along the eastern front line in Bakhmut — had spurred invention. Imagining a solution, he posted an open request on Facebook for a computerized, remote-controlled machine gun.

Within several months, Yabchanka had a working prototype from a firm called Roboneers. The gun proved almost immediately useful to his unit.

“We could sit in the trench drinking coffee and smoking cigarettes and shoot at the Russians,” he said.

This article originally appeared in The New York Times.

By Paul Mozur and Adam Satariano