Battlespace Tech

Army, startup put their chips on data fusion capability

Graphics chips can more quickly process the data that goes into visualizations.

The geospatial intelligence community is starting to embrace big data tools as vendors in the sector leverage the processing power in graphics chips to build ever-expanding geospatial databases.

Graphics chipmaker Nvidia Corp. has been fueling the shift away from standard CPU processing architectures as geospatial intelligence vendors assemble huge databases with data analytics capabilities. As the amount of data gathered by more sensors and other inputs grows, so too does the need for visual information and real-time analytics to make sense of it all.

GIS Federal, an enterprise cloud computing and big data startup nurtured by the U.S. Army, has, among other things, developed big data computational engines. Its GPU-powered database is billed as being able to crunch and fuse data from different sources to, for example, track enemy movements.

GIS Federal, based in Arlington, Va., launched a geospatial intelligence research effort with the Army to determine whether graphics chips traditionally used in high-end video games could supply the processing power needed for geospatial intelligence applications. When the project began several years ago, real-time, cloud-based access to detailed geospatial data was just getting off the ground, as were data visualization techniques that have become a mainstay of the big data era.

The company said it leveraged GPU acceleration technology from Nvidia to develop a distributed database it claims can deliver near real-time access to predictive analytics. That means potential customers like the Army could use the database, for instance, to spot an enemy target at a remote location.

The startup's chief technology officer, Nima Negahban, said the patented database uses the computational power of a multi-core GPU engine to tie "the threads that the GPU provides to the data."
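Negahban's description of tying GPU threads to the data is the classic data-parallel pattern: one lightweight thread scores each record simultaneously. A minimal illustration of that idea, using Python with NumPy's vectorized operations standing in for actual GPU kernels (the field layout and scoring rule here are hypothetical, not GIS Federal's):

```python
import numpy as np

# Hypothetical sensor records: (longitude, latitude, signal_strength).
records = np.array([
    [127.1, 39.0, 0.82],
    [127.3, 39.2, 0.15],
    [126.9, 38.8, 0.97],
])

# On a GPU, one thread would evaluate each record in parallel.
# A vectorized comparison models the same one-thread-per-element idea:
# flag every record whose signal strength crosses a detection threshold.
THRESHOLD = 0.5
hits = records[:, 2] > THRESHOLD

print(records[hits])  # only the records that cleared the threshold
```

The payoff of this layout is that adding more records does not add more passes over the data; the work fans out across threads instead.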

The new big data tool gives geospatial intelligence analysts a deeper search capability as sensors collect more data and datasets grow larger and more diverse, Negahban said.

The Army has long sought a data fusion capability that could combine sensor data from the battlefield, make sense of it all, then distribute it where and when it is needed. Internal efforts began in the 1980s but got bogged down, largely due to information overload, insufficient algorithms and a lack of processing power.

Commercially developed big data and chip solutions could be moving the service closer to its data fusion goal.

Graphics processor specialist Nvidia has long pushed GPU technology as equal to or better than standard multi-core microprocessors at the heart of computational engines. Supercomputer makers have tended to agree, and have been blending GPUs and CPUs into today’s fastest machines.

The ability to process and render graphics makes GPUs well suited to geospatial intelligence applications, where fused data is presented on maps displaying real-time information. Those capabilities have been augmented by the rise of big data tools like predictive analytics that leverage evolving algorithms to allow users to sift through and make sense of a profusion of sensor and other data.

GIS Federal claims the combination of more processing power and big data analytics allows its database to "rasterize," or render, the products of data analytics as an image or a video on a map, and do it in near real-time.
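Rasterizing analytic output amounts to binning point results into map pixels. A rough sketch of the idea, with synthetic detection points and a small grid (this is an illustration of the general technique, not GIS Federal's actual pipeline):

```python
import numpy as np

# Synthetic detections: (longitude, latitude) pairs from an analytic query.
lons = np.array([127.0, 127.1, 127.1, 127.4])
lats = np.array([39.0, 39.0, 39.1, 39.3])

# Bin the points into a coarse 4x4 raster over a bounding box.
# Each cell's count becomes a pixel intensity on the map overlay;
# a GPU would compute many such tiles concurrently.
heatmap, _, _ = np.histogram2d(
    lons, lats, bins=4,
    range=[[126.9, 127.5], [38.9, 39.4]],
)

print(heatmap)  # 4x4 grid of detection counts, ready to color as pixels
```

Scaling this from four points to billions is where the GPU's parallelism matters: each tile of the map can be rasterized independently, which is why near real-time rendering becomes feasible.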

Hence, the Army's long march toward a data fusion capability may be moving closer to reality simply by leveraging a combination of commercial chip and data analytics technologies. 

About the Author

George Leopold is a contributing editor for Defense Systems and author of Calculated Risk: The Supersonic Life and Times of Gus Grissom. Connect with him on Twitter at @gleopold1.
