Piloting ain't easy. In addition to developing an intuition for the physics of flight, a pilot has to learn how to fly by instruments and how to fly safely among other aircraft. A speck on the horizon may be another aircraft heading straight toward the pilot; course must be changed before the speck starts growing, at which point it may be too late to avoid a collision. With an increasing number of airplanes in the sky, and the prospect of even more unmanned aircraft claiming a share of that space, there is a growing need for technology that captures the basic skills needed to avoid aerial collisions.
The neuromorphics lab and the ECE department at Boston University are collaborating with NASA Langley to use animal visual systems as inspiration for the development of electrical circuits that use a camera to spot distant aircraft and, if necessary, adjust a flight path to avoid an accident. Such a system could serve as an additional warning system for pilots, much as a computer assists a radiologist in flagging suspicious image regions on a CT scan. These systems could also help pilot unmanned aerial vehicles (UAVs), completely robotic aircraft that will only be allowed to share the airspace with people once their flying abilities are comparable to those of human pilots.
The project's central theme leverages optic flow to allow an aircraft to understand how it and other objects are moving through the world. Optic flow is the visual motion that accompanies body movement; for example, the optic flow produced by driving toward a tunnel is the expansion of the tunnel's entrance over time. Moving in different ways produces different patterns of optic flow: moving to the right makes the whole world shift to the left, which can be distinguished from the expansion of the world experienced when moving forward. Because of this correspondence between body movement and the experience of optic flow, mathematical models can suggest how to run the equation in reverse: estimating where the body is moving given the optic flow it sees. While we don't know which 'equation' our brains use, we know that we have this capability: a person can easily drive a car in a video game where she sees the screen but lacks the kinematic sensation of having her body move with the virtual car.
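To make the "run the equation in reverse" idea concrete, here is a minimal sketch (not the project's actual code; all names are illustrative) of one classic instance: under pure forward translation, every flow vector points away from a single image point, the focus of expansion (FOE), which marks the direction of heading. Requiring each flow vector v at image point p to be parallel to (p - foe) gives one linear constraint per point, and stacking them yields a small least-squares problem.

```python
# Sketch: recovering the focus of expansion (FOE) from optic-flow vectors.
# Under pure forward translation, (p - foe) x v = 0 for every image point p
# with flow vector v, i.e. vy*fx - vx*fy = vy*x - vx*y. Summing the normal
# equations over all points gives a 2x2 system solved in closed form.

def estimate_foe(points, flows):
    """Least-squares FOE from (x, y) image points and their (vx, vy) flows."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (x, y), (vx, vy) in zip(points, flows):
        r = vy * x - vx * y          # right-hand side of this point's constraint
        a11 += vy * vy               # accumulate A^T A ...
        a12 += -vy * vx
        a22 += vx * vx
        b1 += vy * r                 # ... and A^T b
        b2 += -vx * r
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det,
            (a11 * b2 - a12 * b1) / det)

# Synthetic expansion field radiating from a known FOE at (4.0, -2.0):
true_foe = (4.0, -2.0)
pts = [(x, y) for x in range(-10, 11, 5) for y in range(-10, 11, 5)]
flw = [(0.1 * (x - true_foe[0]), 0.1 * (y - true_foe[1])) for x, y in pts]
fx, fy = estimate_foe(pts, flw)
print(round(fx, 3), round(fy, 3))  # recovers roughly (4.0, -2.0)
```

Real flow fields mix translation with rotation and are corrupted by noise, so practical ego-motion estimators are considerably more involved, but the underlying move is the same: invert the mapping from body motion to flow.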
Honeybees use optic flow for, among other strategies, compensating for wind when searching for flowers and for maintaining a constant height above the ground; these methods have proved equally simple and effective for slow-moving robots navigating tight spaces and for hovering helicopters. Quickly and accurately computing optic flow from a camera, however, requires intense computing power. UAVs cannot carry heavy payloads, so the team plans to eventually condense the artificial visual system, including the optic flow component, onto an FPGA (field-programmable gate array), a lightweight programmable hardware platform.
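The honeybee altitude rule is simple enough to sketch in a few lines. The flow of the ground seen looking straight down (ventral flow) scales as forward speed divided by height, so holding that flow at a setpoint implicitly regulates altitude. The controller below is a toy proportional loop with made-up gains, only meant to show the principle, not a flight-ready design.

```python
# Sketch of the honeybee "ventral flow" rule for altitude hold: keep the
# optic-flow rate of the ground beneath the aircraft constant. Ventral flow
# is forward_speed / height (rad/s), so a proportional controller on the
# flow error drives the height toward speed / target_flow.

def ventral_flow(speed, height):
    return speed / height  # angular rate of ground texture seen below

def altitude_step(height, speed, target_flow, gain=0.5):
    """One control step: flow too high means the ground is too close, so climb."""
    error = ventral_flow(speed, height) - target_flow
    return height + gain * error

h = 2.0                       # start well below the implied setpoint
for _ in range(200):
    h = altitude_step(h, speed=10.0, target_flow=1.0)
print(round(h, 3))            # converges toward 10.0 m (= speed / target_flow)
```

A nice property of this rule, and part of why it suits small robots, is that it needs no explicit measurement of height or speed, only the flow itself.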
While some knowledge about the world can be directly programmed into the artificial visual system, recognizing distant objects as potentially dangerous may best be learned through practice. We currently use FlightGear for simulator training, but in the end there is no better training experience (nor a better way to develop a neuromorphic system) than to let the system test-pilot an actual aircraft, the eventual goal of the project.
NASA may eventually use UAVs to automatically collect weather statistics; more generally, UAVs can take on a variety of dangerous flight tasks such as firefighting. Making autonomous flight easier, however, can also encourage illicit and immoral uses. Military drone strikes can breach humanitarian law, and UAV surveillance by police units can erode privacy protections. Drone aircraft still require a remote human operator, but increasingly automated flight raises new ethical questions that are best met by proactive engagement throughout society. We hope that, by creating systems that make manned and unmanned flight safer, the world may also become a safer place to live.