Faster, Better, Stronger: Meet the U.S. Army’s Optionally Manned Tank

When confronted with a large group of fast-approaching, unidentified enemy armored vehicles, future Army tanks will need the ability to receive and organize incoming surveillance data, identify an enemy target and take the necessary defensive measures. Perhaps a forward-operating drone captures surveillance video of the approaching attackers and transmits the images directly to a tank engineered with artificial intelligence (AI)-enabled computing able to instantly find moments of tactical relevance in the footage. That program could then identify the threat and present organized information to human decision-makers in position to counterattack.
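The Army has not published the software behind this concept, but the kind of video triage the scenario describes can be sketched in general terms. The short Python example below is a hypothetical illustration only: detect_vehicles is a stand-in for whatever onboard object detector a real system would use, and the logic simply forwards the frames that contain a high-confidence vehicle detection.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Detection:
    label: str        # e.g., "tracked_vehicle"
    confidence: float  # detector confidence, 0.0-1.0

def detect_vehicles(frame: Dict) -> List[Detection]:
    """Stand-in for an onboard object detector.

    A real system would run a trained model on the tank's AI-enabled
    computer; this placeholder just returns canned results attached to
    each frame so the filtering logic can be demonstrated.
    """
    return frame.get("detections", [])

def tactically_relevant_frames(frames: List[Dict], threshold: float = 0.8) -> List[Dict]:
    """Keep only frames containing a high-confidence vehicle detection."""
    relevant = []
    for frame in frames:
        detections = detect_vehicles(frame)
        if any(d.confidence >= threshold and "vehicle" in d.label for d in detections):
            relevant.append(frame)
    return relevant

if __name__ == "__main__":
    feed = [
        {"id": 1, "detections": []},                                    # empty terrain
        {"id": 2, "detections": [Detection("tracked_vehicle", 0.93)]},  # likely armor
    ]
    print([f["id"] for f in tactically_relevant_frames(feed)])  # -> [2]
```

The point of a filter like this is not to decide anything on its own, but to reduce hours of raw video to the handful of moments a crew actually needs to see.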

These types of anticipated future warfare scenarios are precisely why the Army expects AI to be heavily incorporated into engineering designs for its emerging Abrams replacement, the Optionally Manned Tank. Prototyping and design-configuration options will be presented to Army decision-makers in 2023, following several years of ongoing design study, virtual assessment, simulation and conceptual exploration into what the Army's future tank should be.

The success of a mission of this kind naturally hinges upon the speed of data collection, analysis and transmission, each of which shortens the time necessary to complete the sensor-to-shooter cycle.

Maximizing the speed of a coordinated, informed attack is clearly a major objective of the effort. Perhaps a machine, programmed to instantly bounce incoming information off a vast existing database that includes a threat library, could identify the approaching armored vehicles as Russian tanks. It could then operate with an immediate understanding of the kinds of weapons, sensors and threats the approaching enemy platform might present. This computer-generated information, including the results of nearly instant analysis, can inform human decision-makers and put them in position to draw upon attributes unique to human decision-making and cognition to determine an optimal response.
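To make the threat-library idea concrete, the hypothetical sketch below matches an observed set of visual features against a small, made-up library and returns the closest entry along with its known weapons and sensors. The library contents, feature names and matching rule are illustrative assumptions, not real classification data or a description of any fielded Army system.

```python
# Hypothetical threat-library lookup. The entries, feature names and
# matching rule below are illustrative assumptions, not real data.
THREAT_LIBRARY = {
    "T-72B3": {
        "features": {"tracked", "low_turret", "125mm_gun"},
        "weapons": ["125mm smoothbore gun", "coaxial machine gun"],
        "sensors": ["thermal gunner's sight"],
    },
    "BMP-3": {
        "features": {"tracked", "troop_compartment", "100mm_gun"},
        "weapons": ["100mm gun", "30mm autocannon"],
        "sensors": ["day/night sight"],
    },
}

def identify_threat(observed_features: set):
    """Return the library entry whose feature set best overlaps the observation."""
    best_name, best_score = None, 0
    for name, entry in THREAT_LIBRARY.items():
        score = len(entry["features"] & observed_features)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, THREAT_LIBRARY.get(best_name)

if __name__ == "__main__":
    name, entry = identify_threat({"tracked", "low_turret", "125mm_gun"})
    if entry:
        print(f"Best match: {name}; known weapons: {entry['weapons']}")
```

In practice the matching would be done by trained models against far richer signature data, but the output would serve the same purpose the article describes: handing the crew an organized picture of what the machine believes it is seeing.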

“We will use AI to reduce the cognitive burden, but we allow human reason and decision making to assess those items not exactly tangible where there is not a ‘1s and 0s’ solution,” Brig. Gen. Ross Coffman, director of the Next-Generation Combat Vehicle Cross-Functional Team, told The National Interest in an interview.

Coffman further pointed to more subjective cognitive phenomena that are unique to human decision-making and not calculable by mathematically engineered computer algorithms.

Elements of human experience and less calculable variables such as intuition, emotion, intention or anticipation all represent characteristics that cannot be fully replicated, captured or analyzed by machines. While machines are much faster when it comes to data aggregation, analysis and transmission, there are clear limits when it comes to the analysis of less quantifiable phenomena.

“Humans are better at game theory than machines,” Coffman said.

All of these advances in autonomy and AI bear upon longstanding ethical and doctrinal questions that are gaining attention at the Pentagon as technology changes quickly. While there is some ongoing discussion about the possible use of AI-capable autonomous weapons to destroy targets without human intervention for purely defensive purposes, existing Department of Defense doctrine requires that humans remain “in the loop” when it comes to the use of lethal force.

“Our enemies shoot on ‘detect.’ We shoot on ‘identify.’ We must get our sensors to the point where we can identify as they detect, particularly with non-line-of-sight factors, air-ground coordination, manned-unmanned teaming and shared situational awareness between platforms,” Coffman said.

Given the respective advantages of humans and machines, many weapons developers and warfare futurists believe the optimal approach is to leverage and combine the best of both, merging them into an ideal, sought-after blend. This concept provides the fundamental rationale for the Army’s pursuit of new levels of “manned-unmanned” teaming.

Kris Osborn is the defense editor for The National Interest. Osborn previously served at the Pentagon as a Highly Qualified Expert with the Office of the Assistant Secretary of the Army (Acquisition, Logistics & Technology). Osborn has also worked as an anchor and on-air military specialist at national TV networks. He has appeared as a guest military expert on Fox News, MSNBC, The Military Channel, and The History Channel. He also holds a master’s degree in comparative literature from Columbia University.

Image: Reuters.

