Program would improve 3D imaging by going flat

The Pentagon’s top research arm wants to make laser-based, 3D imaging systems significantly smaller, lighter and cheaper by changing the way they collect images. In short: get rid of the lenses, mirrors and other components of a telescope and replace them with light-emitting and -detecting semiconductor dots on a wafer-thin disc.

The result could be a truly disruptive technology that would change what’s possible with surveillance and detection systems of all sizes.

The Defense Advanced Research Projects Agency is pursuing the research under a program it calls Modular Optical Aperture Building Blocks, or MOABB, which seeks to create ultra-compact light detection and ranging (LIDAR) systems. The agency has set a Proposer’s Day for Dec. 17 for potential participants in the program, which DARPA said could go for five years with $58 million in funding.

LIDAR, which dates to the 1960s, is already used in a number of sensing and mapping applications. It works in a way similar to radar, but instead of sending and receiving reflected radio waves, it sends pulses of laser light toward a surface and measures the time it takes for them to reflect back. With high-speed pulses and fine-grained reception, it can create a high-resolution map of the area it’s measuring, as well as track movements in that area.
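The time-of-flight principle behind that measurement is simple enough to express in a few lines. The sketch below is illustrative only (the function name and example timing are not from DARPA's announcement): range is the round-trip travel time of a laser pulse multiplied by the speed of light, halved because the pulse travels out and back.

```python
# Illustrative sketch of LIDAR time-of-flight ranging:
# distance = (speed of light * round-trip time) / 2.

C = 299_792_458.0  # speed of light in vacuum, meters per second

def range_from_round_trip(seconds: float) -> float:
    """Return target distance in meters for a measured round-trip pulse time."""
    return C * seconds / 2.0

# A pulse returning after roughly 667 nanoseconds indicates a target
# about 100 meters away -- on the order of the "football field's
# distance" cited in DARPA's announcement.
print(round(range_from_round_trip(667e-9), 1))
```

Velocity tracking works the same way: repeated pulses yield a series of range measurements, and the change in range over time gives the target's speed along the line of sight.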

But as DARPA pointed out in announcing the MOABB program, the practical applications for LIDAR are limited by the size, weight and expense of the equipment. LIDAR systems use telescopes to collect their images, which means lenses and mirrors, mechanics and gears for focusing, and the internal space required to allow those components to work together. DARPA is looking for a digital alternative in the form of “flat optics.”

The program envisions an array of 10,000 semiconductor dots on a DVD-sized disc to send and receive pulses of light, creating inexpensive systems that work as well as, or better than, current LIDAR systems.

Such a system could allow aircraft or small unmanned vehicles to map an area and even see through tree cover to detect hidden threats, DARPA said. “You would be able to fly a MOABB-enabled helicopter or drone low over a lush forest canopy and be able to effectively peel back the leaves and see a sniper or a tank underneath,” program manager Joshua Conway said in DARPA’s announcement. “It could instantaneously give you the range and velocity of everything up to a football field’s distance away with the resolution of a camera.”

Such systems also could be used, for instance, for collision avoidance for small UAVs that operate indoors, such as the systems DARPA is researching under its Fast Lightweight Autonomy program, as well as for giving robots more precise motor control and improving immersive training systems. “Every machine that interacts with the 3D world—whether it is a manufacturing robot, UAV, car, or smartphone—could have a chip- or wafer-scale LIDAR on it,” Conway said.

DARPA plans to run MOABB in three phases, with the first focusing on developing the light-emitting and -detecting cells that can be scaled into larger arrays. Phases 2 and 3 will focus on building arrays consisting of 100 to 10,000 cells.

Advanced registration, which closes Dec. 14, is required for the Proposer’s Day. Additional information is available via DARPA’s solicitation.

About the Author

Kevin McCaney is a former editor of Defense Systems and GCN.
