The next generation of drones may be able to see in the dark without GPS
- By Katherine Owens
- Sep 25, 2017
Drones provide valuable data for warfighters by acting as their eyes in inaccessible areas, using camera technology to “see” and record their surroundings. Yet, unlike the human eye, cameras don’t adjust well to darkness. That’s why a research team at the University of Zurich has developed a technology that will allow the military to operate drones in dark environments.
Drones “need to react quickly to moving objects in uneven lighting conditions. Conventional video cameras are too slow and specialized high-frame rate cameras produce too much data to process in real time,” said a spokesperson from iniLabs, an industry collaborator.
To solve this problem, the newly developed Dynamic Vision Sensor (DVS) technology mimics the human eye’s ability to focus on specific areas of low light intensity, according to Davide Scaramuzza, Director of the Robotics and Perception Group at the University of Zurich.
Rather than constantly updating the entire image frame, the DVS devotes its processing power to the parts of the scene that change. It retains the unchanged portions of the frame and combines them with fresh, highly focused input from the low-light areas to complete the view.
“Instead of wastefully sending entire images at fixed frame rates, only the local pixel-level changes caused by movement in a scene are transmitted,” explained Scaramuzza.
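The idea Scaramuzza describes can be sketched in a few lines. The snippet below is a minimal, illustrative simulation of event-style output, not the sensor’s actual firmware: it compares two frames and emits an event only for pixels whose log-intensity changed beyond a threshold, so a mostly static scene produces almost no data. The function name, threshold value, and toy frames are all assumptions for illustration.

```python
import numpy as np

def dvs_events(prev, curr, threshold=0.2):
    """Emit DVS-style events: one (x, y, polarity) tuple per pixel whose
    log-intensity changed by more than the threshold. Unchanged pixels
    produce no output at all, unlike a fixed-frame-rate camera."""
    eps = 1e-6  # avoid log(0) on black pixels
    delta = np.log(curr + eps) - np.log(prev + eps)
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    polarity = np.sign(delta[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return list(zip(xs.tolist(), ys.tolist(), polarity.tolist()))

# A mostly static 4x4 scene in which one pixel brightens and one dims:
prev = np.full((4, 4), 0.5)
curr = prev.copy()
curr[1, 2] = 0.9   # brightened
curr[3, 0] = 0.2   # dimmed
events = dvs_events(prev, curr)
# Only the two changed pixels generate events; the other 14 send nothing.
```

Because only changed pixels transmit, the data rate scales with motion in the scene rather than with resolution and frame rate, which is what makes real-time processing feasible on a small drone.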
The ability to “see” in the dark is an important component of developing non-GPS reliant drones.
“Small, low-cost unmanned aircraft rely heavily on tele-operators and GPS,” explained JC Ledé, Program Manager for the Fast Lightweight Autonomy program at DARPA. However, GPS signals are susceptible to jamming and disruption. As a result, DARPA, industry developers, and research teams are working on alternative methods for position, navigation, and timing (PNT), including ways to “see” in the dark.
The human eye is serving as inspiration for other solutions to GPS-less navigation. For example, one approach uses two stereo camera sensors attached to the front of the drone, according to Dr. Camillo J. Taylor, part of a research team at the University of Pennsylvania. A stereo camera has multiple lenses and separate image sensors for each lens, which means that it can use triangulation algorithms to imitate the human eye’s ability to perceive distance and three-dimensionality.
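For a rectified stereo pair, the triangulation mentioned above reduces to a standard pinhole-camera relation: depth Z = f · B / d, where f is the focal length in pixels, B the baseline between the two lenses, and d the disparity (the pixel offset of the same point between the two images). The sketch below shows that relation; the parameter values are illustrative and not taken from the Penn team’s system.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a point seen by a rectified stereo pair:
    Z = f * B / d, with focal length in pixels, baseline in meters,
    and disparity in pixels. Nearer objects have larger disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 700 px focal length, 10 cm baseline, 14 px disparity
z = depth_from_disparity(700.0, 0.10, 14.0)  # depth in meters
```

Note how depth resolution degrades with range: at large distances the disparity shrinks toward zero, which is why a drone’s short stereo baseline works best for nearby obstacles.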
According to the Zurich research team, which also received some DARPA funding, the DVS technology has not been operating alone. When installed on a drone alongside a more conventional camera system, the DVS increased the quality of the data collected by about 85 percent, the team reported.
It has not been announced whether DVS technology will be deployed on military drones; however, Scaramuzza said that drones carrying the combination of DVS and conventional cameras would be well suited to future urban warfare environments.