Harnessing the full power of sensor fusion

The Army is refining tools and techniques that synthesize intelligence for ground operations

Sensor fusion and dissemination is all about getting left of the boom. In other words, the goal of sensor fusion is to have actionable intelligence before an improvised explosive device detonates rather than to the right of the boom, after the device has exploded.

Fusion definitions

When discussing the challenge of fusing data from myriad sources, members of Project Manager Distributed Common Ground System-Army refer to the following levels of sensor fusion.

Level 0 fusion (source preprocessing) is the initial processing accomplished at or near the sensor that organizes the collected data into a usable form for the system or person who will receive it. This is where characteristics of the collected objects are standardized and compiled.
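
As a rough illustration of what Level 0 preprocessing could look like in software, the Python sketch below maps one sensor-specific record into a common schema. The field names, units and the `StandardObservation` structure are illustrative assumptions, not DCGS-A specifics.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class StandardObservation:
    """Common schema a downstream fusion process might consume (illustrative)."""
    sensor_id: str
    time_utc: datetime
    lat_deg: float
    lon_deg: float
    modality: str      # e.g., "EO", "IR", "SIGINT"
    attributes: dict   # remaining characteristics of the detected object

def preprocess_raw_record(raw: dict) -> StandardObservation:
    """Level 0-style preprocessing: convert one sensor-specific record
    into the standardized form. The keys expected in `raw` are assumptions."""
    core = {"platform", "epoch_seconds", "latitude", "longitude", "sensor_type"}
    return StandardObservation(
        sensor_id=raw["platform"],
        time_utc=datetime.fromtimestamp(raw["epoch_seconds"], tz=timezone.utc),
        lat_deg=float(raw["latitude"]),
        lon_deg=float(raw["longitude"]),
        modality=raw.get("sensor_type", "unknown"),
        attributes={k: v for k, v in raw.items() if k not in core},
    )
```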

Level 1 fusion (entity refinement) takes new inputs, validates and normalizes the new inputs, correlates them with an existing entity, and updates the knowledge about that entity. In this way, it identifies what the collector physically detects and resolves information conflicts. This fusion level reduces redundancy of reported entities, provides the last known disposition or status of an entity, and makes this information available in a database.
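
A minimal sketch of the Level 1 idea, assuming a simple distance gate for correlation: a new observation either updates the nearest known entity or registers a new one. The 0.5 km gate, the entity dictionary layout and the `refine` helper are hypothetical, not drawn from the program.

```python
import math

GATE_KM = 0.5  # assumed correlation threshold

def distance_km(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance between two points (haversine)."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def refine(entities, obs):
    """Correlate a new observation with the closest entity inside the gate,
    update that entity's disposition, or create a new entity if none matches."""
    best = None
    for ent in entities:
        d = distance_km(ent["lat"], ent["lon"], obs["lat"], obs["lon"])
        if d <= GATE_KM and (best is None or d < best[0]):
            best = (d, ent)
    if best:
        ent = best[1]
        ent.update(lat=obs["lat"], lon=obs["lon"], last_seen=obs["time"])
        ent["reports"] += 1   # redundant reports collapse into one entity
    else:
        entities.append({"lat": obs["lat"], "lon": obs["lon"],
                         "last_seen": obs["time"], "reports": 1})
    return entities
```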

Level 2 fusion (situation refinement) comprises several separate processes. It aggregates individual entities or elements into larger entities or forces, determines how those entities are related and working together, interprets their actions, determines the larger activities and places the smaller actions in that context. It also hypothesizes what the enemy might be doing or will do and assesses the accuracy of those hypotheses. Level 2 fusion has three subcategories (a grouping sketch follows the list):

  • 2A — “How are the entities related?”
  • 2B — “What’s going on with those related entities?”
  • 2C — “What is the behavior of the entities?”
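
The grouping sketch referenced above illustrates the Level 2A question in the simplest possible terms: entities within an assumed link distance are aggregated into one larger group. Real situation refinement would draw on far richer relationships than geography alone; the 2 km link distance and flat-earth approximation are purely illustrative.

```python
import math

def approx_km(a, b):
    """Small-area distance approximation between two {lat, lon} dicts."""
    dlat = (a["lat"] - b["lat"]) * 111.0
    dlon = (a["lon"] - b["lon"]) * 111.0 * math.cos(math.radians(a["lat"]))
    return math.hypot(dlat, dlon)

def aggregate(entities, link_km=2.0):
    """Union-find clustering: entities closer than link_km end up in one group."""
    parent = list(range(len(entities)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(len(entities)):
        for j in range(i + 1, len(entities)):
            if approx_km(entities[i], entities[j]) <= link_km:
                parent[find(i)] = find(j)

    groups = {}
    for i, ent in enumerate(entities):
        groups.setdefault(find(i), []).append(ent)
    return list(groups.values())
```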

Level 3 fusion (threat refinement) interprets events and actions, determines objectives and how elements plan to operate, and predicts future actions and their potential effects on operations. Level 3 is now primarily cognitive, although automated processing can help with some elements.

Level 4 fusion (process refinement) consists of assessing the entire fusion process and related activities to improve the timeliness, relevance and accuracy of information and/or intelligence. Its review includes sensors, collectors, analysts, algorithms, information management systems and staffs.

Level 5 fusion (user refinement) comprises a set of processes that connect users to the rest of the fusion process so that they can visualize the feedback/control interface to enhance or improve these products.

Source: Program Executive Office for Intelligence, Electronic Warfare and Sensors

The fusion challenge requires collecting raw data from dozens of sensor sources and reconciling it so that multiple sensors picking up intelligence about the same target don't make the data appear to describe multiple targets. In addition, the data must be structured properly and then shared in real time for uses such as targeting, or transferred to a database for future uses such as convoy planning.

All those functions need to be invisible to users.

“It’s about disseminating the right information,” said Brig. Gen. Thomas Cole, the Army’s program executive officer for intelligence, electronic warfare, and sensors at Fort Monmouth, N.J. “It’s no different than when you’re driving down the road in your car and you want to know what’s going on in the road in front of you and you have some display in your car that tells you ‘traffic is congested.’ You don’t care whether that comes from a satellite or from an [unmanned aerial vehicle] or from a policeman standing in the road putting data into a handheld device. It can be from any source. And then, the more sources you have, the better picture you have on what is going on.

“The same is true of the soldier," Cole said. "The soldier doesn’t care where the data is coming from. They just want to know reliably and timely what’s going on in their immediate vicinity.”

The Army’s primary tool for sensor fusion and dissemination is the Distributed Common Ground System-Army, which is managed by one of eight project managers in the Program Executive Office for Intelligence, Electronic Warfare and Sensors. DCGS-A is responsible for the collection, reconciliation, normalization and dissemination of sensor data. One of the program's primary challenges is dealing with the constant stream of new sensor capabilities and the volume of data they produce.

“Older UAVs have analog [electro-optical, infrared] sensors; newer sensors provide digital data of high-definition quality,” said Kam Lok, acting chief engineer at DCGS-A. “The native form of data today requires a significant increase in bandwidth to bring data from the sensor platform to the ground for processing. Storage capacity for digital information must also increase.”
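
To see why the bandwidth concern follows, a back-of-envelope calculation of raw, uncompressed video bit rates is enough. The frame sizes, frame rate and color depth below are generic assumptions rather than figures for any specific sensor.

```python
def raw_bitrate_mbps(width, height, bits_per_pixel=24, fps=30):
    """Raw (uncompressed) video bit rate in megabits per second."""
    return width * height * bits_per_pixel * fps / 1e6

sd = raw_bitrate_mbps(640, 480)     # legacy standard-definition frame size
hd = raw_bitrate_mbps(1920, 1080)   # high-definition frame size
print(f"SD ~{sd:.0f} Mbit/s raw, HD ~{hd:.0f} Mbit/s raw, ~{hd/sd:.1f}x more")
```

Compression narrows the gap in practice, but the roughly order-of-magnitude difference in raw data is what drives the demand for more downlink bandwidth and ground storage.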

A number of enhancements to DCGS are being developed to deal with those issues, both for the short and long term.

Deployment of DCGS-A Version 3 systems to brigade combat teams and battalions in Iraq and Afghanistan is 70 percent complete. By the time it is 100 percent complete in late fiscal 2010 or early fiscal 2011, the Army will be moving toward the next-generation DCGS, which it calls Mobile Basic. Scheduled for deployment in fiscal 2011 with user tests in mid-2011 and a production decision by late 2011, Mobile Basic will eventually reduce the number of vehicles carrying DCGS sensors from 27 to nine.

“The challenge is dealing with the legacy equipment and software and bringing that into Mobile Basic rather than creating everything from scratch,” said Lt. Col. Scott Hamann, product manager of DCGS-A Mobile Systems.

In the long term, the DCGS organization is looking at what it calls the Tactical Signals Intelligence Super Cloud. “We’re looking to make the DCGS global enterprise more similar to commercial counterparts like Google where they have huge server farms,” said Sam Fusaro, deputy project manager of DCGS-A. “Now the computing is done within the current systems themselves.”

An initial prototype is expected to be tested in late fiscal 2010, with large-scale testing and development not expected until after fiscal 2012.

The DCGS-A team wants to get to the point where it is not only reporting data, but also predicting actions based on the data.

“We do an excellent job of normalizing and deconflicting data and of giving that one common picture of everything going on at present,” Fusaro said. “Where we need to improve is in taking what’s going on in a current situation and, with confidence, predict what will happen based on what we’re witnessing — predictive analysis. We’re missing a technology piece...a better algorithm that simulates gray matter and how commanders think. We haven’t been able to model the cognitive process yet.”

Such predictive capability would give soldiers the agility to anticipate enemies' patterns, said Lt. Col. Thomas Gloor, product manager of DCGS-A Intelligence Fusion.

And that ability would help the Army get left of the boom.

About the Author

Barry Rosenberg is editor-in-chief of Defense Systems. Follow him on Twitter: @BarryDefense.
