Massachusetts could become the first state in the nation to regulate weapons attached to robots, under a bill proposed on Beacon Hill Tuesday.
The legislative proposal filed by state Representative Lindsay Sabadosa and Senator Michael Moore would ban the manufacture, sale, or operation of a robot or drone with an attached weapon. The bill would also ban the use of robots to threaten, harass, or physically restrain people.
However, the state’s ban on robots with attached weapons would not apply to the US military, defense contractors, or law enforcement bomb squads. And private companies developing antiweaponization technology, such as a robot that automatically shuts down upon detecting gunfire, could apply for case-by-case waivers from the attorney general. Violations would be punishable by fines of $5,000 to $25,000.
The proposal follows a call last year from robot developers including Boston Dynamics in Waltham for policy makers to outlaw weapons attached to autonomous or remotely controlled devices. While Boston Dynamics and its rivals do not sell robots with attached weapons, videos have cropped up online displaying devices that have been modified with attached guns. Some are made to resemble Boston Dynamics’ Spot, a dog-like robot, with an attached automatic gun — modifications the company doesn’t permit on its devices.
“These efforts arise from the instances we’ve seen of folks buying off-the-shelf robots, weaponizing them, having them walk around shooting, and then putting those videos on YouTube,” said Brendan Schulman, vice president of policy and government relations at Boston Dynamics.
Sabadosa, who represents a district centered around Northampton, said she was sensitive to the concerns raised by the robotics companies about the misuse of their products. “We’re in this moment of burgeoning technology and things are really changing,” she said. “So it was important to get something on the books as soon as possible.”
The bill is not primarily aimed at dealing with law enforcement agencies’ use of robots but does have several provisions to regulate the sector. Police could not use a robot to enter a private dwelling without a warrant except in “exigent circumstances.” And law enforcement agencies would have to disclose information about their use of advanced robotics under public records requests.
The legislation has support from industry, including the trade group MassRobotics, as well as from the civil rights group ACLU of Massachusetts. The bill will next be assigned to relevant legislative committees, which could hold hearings on the proposal later this year or in 2024.
“We hope this will resonate with people and we’ll do some education, and we’ll hope to move it along,” Sabadosa said.
The approach makes sense to ethicists who have studied uses of artificial intelligence.
“Since the use of drones and robots as weapons is already presumably illegal, it wouldn’t be particularly stringent, and it would be a good thing, to outlaw attaching weapons to them,” said Boston University philosophy professor Juliet Floyd.
Nir Eisikovits, director of UMass Boston’s Center for Applied Ethics, also supported the bill’s approach and warned that law enforcement’s use of robots could be problematic. “There are credible concerns about bias in law enforcement even when it’s not equipped with autonomous weapons systems,” he said. “It’s hard to see an argument for allowing it to use such weapons. This is particularly true because AI systems are themselves frequently plagued by algorithmic bias problems.”
Northeastern University professor Denise Garcia, a member of the school’s Institute for Experiential Robotics, would like to see lawmakers go even further and block weaponized and automated robots even from the battlefield.
“Weaponizing drones and robots, AI-assisted or non-AI assisted, should be prohibited,” said Garcia, who is also vice chair of the International Committee for Robot Arms Control. “The US is already the most violent country with the highest homicide rates in the developed world. Weaponizing robots and drones could make it all worse.”