Human-machine teaming is at the heart of the military services' plans, leaders told lawmakers on Capitol Hill.
When discussing the future of autonomous and unmanned systems, one thing is becoming clear: military officials want greater teaming between humans and machines, as laid out in the Defense Department’s Third Offset Strategy. Deputy Secretary of Defense Robert Work has said a goal of such teaming is to “help humans make better decisions faster.”
That’s going to require collaboration with industry, while still tailoring new technologies to the specific needs of the military, the service branches’ three leads on autonomous technology told lawmakers during a hearing last week.
The Air Force, for example, is seeking the right balance of human and machine teaming by pairing increasingly capable hardware and software with unique human abilities in perception, judgment and innovation, said Greg Zacharias, the service’s chief scientist.
Enabling human-autonomous teams to operate effectively in high-tempo, uncertain and complex decision environments, Zacharias said, requires not just sensors that gather data but also “reasoning systems” that can help make sense of that data and recommend decisions.
“The overall goal here is to enable systems to react appropriately to their environment and perform situationally appropriate tasks synchronized and integrated with autonomous, human or machine systems,” he said.
And while the military can learn from commercial developments, it’s also important to note that the services operate in different environments, said Jonathan Bornstein, chief of the Autonomous Systems Division, Vehicle Technology Directorate, at the Army Research Laboratory. Bornstein noted that commercial technologies tend to operate best in structured, rather than dynamic, environments. Google’s driverless cars, for instance, operate on structured highway systems, while the military deals with dynamic environments, “where we don’t know things in advance. We have to have organic sensing and reasoning powers on board the vehicle…there’s a distinct difference there.” But the military can still leverage the commercial sector in more structured environments such as logistics, forward operating bases or convoy operations.
He said air and ground vehicles could serve as “wingmen,” enabling resupply and sustainment, and providing cognitive decision tools.
Even if commercial technologies operate in different environments, public/private collaboration is key. “On the commercial side, we’re working with the DIU X…to try and work with some of the folks that are doing some of the advanced technologies and machine learning, pattern recognition, robotics and so forth,” Zacharias said. DIU X, or Defense Innovation Unit Experimental, is the DOD outpost in Silicon Valley (https://defensesystems.com/articles/2015/04/23/dod-carter-cyber-strategy-silicon-valley.aspx) aimed at increasing outreach, partnership and innovation with the nation’s top technology companies.
Appearing in his first congressional hearing since being named to the newly established post of deputy assistant secretary of the Navy for unmanned systems, Frank Kelley talked about the need for platform-independent systems, as well as the need for new technologies for communication and precision guidance underwater.
Zacharias also discussed the notion that autonomy does not just have to exist in the traditional platforms most are familiar with. People tend to think of autonomy as being in motion, given remotely piloted aircraft, Google’s cars or the Navy’s undersea unmanned systems. These systems all have sensors, such as GPS; they all have onboard smarts, such as the ability to set waypoints to reach a particular location; and they all have some type of motor or locomotion system that allows them to move within their environment.
But while the sensors and motors are important, the real advances in autonomy are happening in the middle part – the onboard smarts, or the system’s brain. “So if you think about removing those onboard smarts to a ground-based system and putting them, say, in a command and control center or a planning center, then you’ve got autonomy at rest,” he said. “So many of the advances that we’re going to see in this area may come from data feeds or other sensors or satellite imagery, but they’re going to be in these ground-based situations and they’ll have a sense part and a think part and an output part. It might be a natural language generator like the Siri interface or the visualization.”
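The sense/think/output framing Zacharias describes can be illustrated with a minimal control-loop sketch. This is a hypothetical toy written for this article, not any DOD or Air Force system; every function name here is an assumption made purely for illustration.

```python
# Toy sketch of a sense -> think -> output loop, illustrating the
# "sense part, think part, output part" framing. All names hypothetical.

def sense():
    """Gather raw data -- a stand-in for GPS, data feeds or imagery."""
    return {"position": (3, 4), "waypoint": (0, 0)}

def think(observation):
    """The 'onboard smarts': turn sensed data into a recommended step
    toward the waypoint. This logic could just as easily run in a
    ground-based planning center -- 'autonomy at rest'."""
    px, py = observation["position"]
    wx, wy = observation["waypoint"]
    step_x = 1 if wx > px else (-1 if wx < px else 0)
    step_y = 1 if wy > py else (-1 if wy < py else 0)
    return (step_x, step_y)

def output(step):
    """Output part: drive a motor, or simply present a recommendation
    to a human decision-maker (e.g. via text or visualization)."""
    return f"recommended move: {step}"

if __name__ == "__main__":
    print(output(think(sense())))  # recommended move: (-1, -1)
```

The point of the sketch is that only `output` is tied to a moving platform; `sense` and `think` work the same whether the "brain" is on a vehicle or in a command and control center.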