Boston Dynamics Promises Not to Make a Robocop


Boston Dynamics, the DARPA-backed robotics company known for unsettling videos in which nearly 200-pound humanoid robots perform backflips, uncanny dances, and various forms of horrifyingly aggressive parkour, says it isn’t interested in weaponizing its robots.

In an open letter this week, Boston Dynamics joined five other robotics makers in a pledge not to weaponize their advanced-mobility, general-purpose robots, or the software that makes them tick. The companies said they would carefully review their customers’ intended applications for the bots “when possible” and pledged to explore features that could somehow mitigate risks. Stating the obvious, the companies wrote that weaponization of advanced robotics “raises new risks of harm and serious ethical issues” and could harm public trust in the technology.

The robot makers went on to encourage policymakers to explore ways to promote the safe use of robots and urged other researchers and developers to join the pledge.

“We are convinced that the benefits for humanity of these technologies strongly outweigh the risk of misuse, and we are excited about a bright future in which humans and robots work side by side to tackle some of the world’s challenges,” the companies wrote.

This isn’t the first time Boston Dynamics has tried to convince the public that its buff hunks of metal aren’t actually weapons of war. Last year the company lashed out at a group of artists who attached a paintball gun to the company’s dog-like “Spot” robot. The event, dubbed “Spot’s Rampage,” let viewers remotely control Spot and use the gun to decimate an art gallery. Boston Dynamics released a statement condemning the artists’ “provocative use” of Spot and said it condemned any portrayal of its technology “that promotes violence, harm, or intimidation.” Apparently, that doesn’t extend to portrayals of the machines standing side by side with cops or armed military forces.

“Provocative art can help push useful dialogue about the role of technology in our lives,” Boston Dynamics wrote. “This art, however, fundamentally misrepresents Spot and how it is being used to benefit our daily lives.”

This week’s open letter once again appeared to call out tinkerers modifying robots with weapon attachments. The companies said there was an urgency to publish it in part due to public concern “caused by a small number of people who have visibly publicized their makeshift efforts to weaponize commercially available robots.”

Boston Dynamics’ version of “don’t be evil”

Boston Dynamics explicitly prohibits the use of its robots to harm other people in its Terms and Conditions of Sale. The company’s end-user license agreement instructs partners to only use its products in ways that benefit humanity or to perform jobs otherwise harmful to humans, such as inspecting incidents at petroleum facilities or collecting data from radiation zones. In other words, Boston Dynamics has a type of “don’t be evil” clause akin to Google’s.

However, those principles seem… let’s just say up for interpretation, considering law enforcement agencies and military units are already testing and using Boston Dynamics robots all around the world. In Massachusetts, for example, videos from 2019 show a Boston Dynamics Spot robot using a long-reaching claw to stealthily open a door.

So, with the sale agreement in mind, would Massachusetts police violate Boston Dynamics’ principles if they shot and killed an innocent man after Spot popped open his front door? What if Spot stores and hands out the ammunition used by an infantry rifleman during war? Or what if the New York Police Department used the robot to surveil and single out individuals it intended to stop and frisk? We may never know the answer to that last scenario since New York residents forced the NYPD to cut short its $94,000 contract with Boston Dynamics last year after videos surfaced of the robot entering a public housing project.

When asked over email whether its law enforcement partnerships potentially contradict its claims around harm and weaponization, a Boston Dynamics spokesperson referred Gizmodo to this blog post expanding on the company’s ethical principles. The blog post reiterates the company’s commitment to police departments, fire brigades, and the military, which it says can use robots to “help keep people out of harm’s way.”

“We also support the safe, ethical, and effective use of robots in public safety, such as Spot being used for hazardous gas detection, unexploded ordnance inspection, suspicious package investigation, search and rescue, and other hazards,” the spokesperson said.

Boston Dynamics did not directly comment on whether certain police uses would potentially violate its commitment to not doing harm. Similarly, Boston Dynamics would not say whether it would constitute a violation of its principles if Spot or Atlas, hypothetically, stomped a human to death.

But even if you take Boston Dynamics at its word for some reason that it doesn’t want to weaponize its machines, that commitment really doesn’t mean much once the robots leave its facilities. A pledged commitment to avoid working with law enforcement or military entities, on the other hand, could potentially go much further in preventing some of the worst harms to humans that the company appears to be so worried about. Instead, the letter reads more like an effort by the signatory robotics companies to cover their own asses rather than make actual business sacrifices to avoid weaponized robots.

Update, 1:20 p.m.: Added comment from Boston Dynamics.


Mack DeGeurin