Looking for a sharper image with sensors for unmanned platforms
Shift to high-definition sensors requires many changes
- By Terry Costlow
- Apr 06, 2012
The trends in unmanned aircraft systems (UASes) might all be summed up in one word: more. More users want more data from more UASes. And they’re also pushing for more resolution.
Over the past couple of years, high-definition imagery has become more viable on these aircraft, making it cost-effective to give analysts and other users high-resolution images that can be expanded to show very fine detail. While video cameras are leading the way as users transition to high resolution, other sensors are also moving to high definition. That’s prompting a number of changes to UAS electronics.
The decision to upgrade to high-resolution video isn’t made lightly. It’s not just the camera that has to be replaced. Everything from cabling to processors and storage systems has to be upgraded to handle the huge increase in data collected by these cameras.
“When you go to a 1920-by-1080-pixel image, pretty much the entire infrastructure needs to be replaced,” said Andrew Haylett, product and technology specialist at Curtiss-Wright Controls Defense Solutions. “You’ve got 12 times more data than you’ve been dealing with. You need an order of magnitude more storage to save it, and you’ve got more than 10 times more data to send over the same satellite link.”
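A rough way to see where a figure like "12 times more data" could come from (the article doesn't state the baseline; the standard-definition and frame-rate assumptions below are illustrative, not from the source):

```python
# Back-of-the-envelope pixel-rate comparison. Assumed formats (not given
# in the article): NTSC-style SD at 720x480 / 30 fps vs. HD at
# 1920x1080 / 60 fps.
sd_pixels_per_sec = 720 * 480 * 30      # ~10.4 million pixels/sec
hd_pixels_per_sec = 1920 * 1080 * 60    # ~124.4 million pixels/sec

ratio = hd_pixels_per_sec / sd_pixels_per_sec
print(round(ratio, 1))  # -> 12.0
```

Under these assumptions the raw pixel rate grows by exactly the factor of 12 quoted above; different baseline formats would give a different, but similar-order, multiple.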
The drive to high definition is also occurring in other imaging sensors. Short-wave infrared (SWIR), midwave infrared and other cameras are also being enhanced to give users better images.
“Resolution is now 640 by 512 pixels for SWIR; high definition is a year or two out,” said David Strong, vice president of marketing for FLIR’s Government Systems division. “Midwave infrared has gone from 640 by 512 to 1280 by 720.”
When all these sensors gather high-definition imagery, processing and transferring data become critical issues. That brings additional importance to data-compression techniques. Over the past couple of years, JPEG2000 and MPEG-4 have become popular techniques for compressing video images. One benefit of JPEG2000 in particular is that it compresses each frame independently, so problems with one frame won’t impact the next image.
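The error-containment property can be sketched in a toy way (illustrative only: JPEG2000 uses wavelet transforms, not zlib, and real inter-frame codecs like MPEG-4 use motion-compensated prediction, not byte deltas):

```python
import zlib

# Toy contrast between intra-frame and inter-frame coding.
frames = [bytes((i * 7 + j) % 256 for j in range(256)) for i in range(4)]

# Intra-frame coding: each frame is compressed on its own, so any
# single frame decodes even if its neighbors are lost or corrupted.
intra = [zlib.compress(f) for f in frames]
decoded = [zlib.decompress(c) for c in intra]
assert decoded == frames

# Inter-frame (delta) coding: each frame is stored as a diff against
# the previous one, so decoding frame N requires every earlier frame.
def delta(prev, cur):
    return bytes((c - p) % 256 for p, c in zip(prev, cur))

def undelta(prev, d):
    return bytes((p + x) % 256 for p, x in zip(prev, d))

deltas = [frames[0]] + [delta(frames[i - 1], frames[i]) for i in range(1, 4)]
cur = deltas[0]
for d in deltas[1:]:
    cur = undelta(cur, d)   # one corrupted delta would break all later frames
assert cur == frames[3]
```

The trade-off is the one the article implies: independent frames contain errors, while inter-frame prediction usually compresses better but lets a single bad frame propagate until the next full frame.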
Another bandwidth-reduction technique is to drop every other frame. Some users go further, dropping three of every four. “If you’ve got a 60 Hz video feed, you’re getting 15 frames per second, which is still good resolution,” Haylett said.
The systems that compress data and perform some image processing must work at very high speeds without requiring a lot of power or taking up much space. Currently, field programmable gate arrays (FPGAs) are commonly used for these chores.
FPGAs can be programmed for specific jobs, so one device can process images as quickly as three or four quad-core microprocessors, saving space and reducing power consumption. However, semiconductor companies like Texas Instruments are promoting the use of multicore digital signal processors (DSPs). DSP-based systems can hit low power and weight points for use on small unmanned aerial vehicles (UAVs), according to Sandeep Kumar, multicore product line manager at Texas Instruments.
Even when images are compressed, there’s a lot of data gathered by the many sensors on a UAV. Moving this data efficiently requires a lot of networking capability. As bandwidth requirements rise and the need to share data with many different users also increases, there’s a push to use more standardized technologies. Many UAVs use high-speed Gigabit Ethernet for on-board communications, making it easier to connect sensors from many different sources.
When data is sent to terrestrial stations, many users are turning to the TCP/IP technology that’s become almost ubiquitous in the commercial world. Using the protocols common to both Ethernet and the Internet makes it much simpler to share data with coalition partners and others who are authorized to see data collected by UAVs.
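The article doesn't describe a specific wire format; as a generic sketch of the kind of TCP-based delivery it alludes to, sensor frames are commonly sent over a byte stream with a simple length prefix so any standards-compliant receiver knows where each frame ends (the framing scheme and names below are illustrative assumptions, not from the source):

```python
import socket
import struct

# Minimal length-prefixed framing over a TCP-style byte stream.
# socket.socketpair() stands in for a real network connection.

def send_frame(sock, payload: bytes) -> None:
    # 4-byte big-endian length header, then the frame bytes.
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_exact(sock, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf += chunk
    return buf

def recv_frame(sock) -> bytes:
    (length,) = struct.unpack("!I", recv_exact(sock, 4))
    return recv_exact(sock, length)

a, b = socket.socketpair()
send_frame(a, b"sensor-frame-0001")
print(recv_frame(b))  # -> b'sensor-frame-0001'
```

Because the transport is plain TCP/IP, any coalition partner with an authorized connection and knowledge of the framing can consume the stream, which is the interoperability point the article is making.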
Though the drive for real-time transmissions makes these satellite links more important, there’s still a need to store all frames at full resolution. These images can be examined for deeper analysis, and they are also used for training.
Solid-state storage systems are also being upgraded to meet the demands of high-resolution cameras. Flash memory chip capacities continue to rise rapidly, driven in large part by the demands of consumer products like smartphones.
For example, Intel and Micron unveiled a 128-gigabit flash chip late last year. SanDisk also rolled out a device that squeezes 128 gigabits onto a die that’s only 170 square millimeters. These advances will let storage suppliers push capacities well beyond the four-terabyte level now provided by many storage modules.
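To put those capacities in context, a quick sketch of the arithmetic (using decimal units as storage vendors do, and ignoring over-provisioning and error-correction overhead, which the article doesn't address):

```python
# Capacity arithmetic: how many 128-gigabit flash dies does a
# 4-terabyte storage module need?
gb_per_die = 128 / 8            # 128 Gbit = 16 GB per die
dies_for_4tb = 4000 / gb_per_die  # 4 TB = 4000 GB (decimal units)
print(int(dies_for_4tb))  # -> 250
```

So today's 4 TB modules already stack on the order of 250 such dies; each generational jump in per-die capacity cuts that count, which is what lets suppliers push module capacities higher in the same space.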