Fully Neuromorphic Vision and Control for Autonomous Drone Flight

Background and Research Motivation

Over the past decade, deep artificial neural networks (ANNs) have driven major advances in artificial intelligence, particularly in visual processing. Despite their high accuracy, however, these methods demand substantial, energy-hungry computational resources, which makes them difficult to deploy in resource-constrained settings such as small flying robots.

To address this issue, neuromorphic hardware mimics the sparse, asynchronous processing of the biological brain to achieve more efficient perception and computation. In robotics, event cameras paired with spiking neural networks (SNNs) on neuromorphic hardware promise low latency and low power consumption. However, the limitations of current embedded neuromorphic processors and the difficulty of training spiking neural networks have so far confined these technologies to low-dimensional perception and action tasks.

To tackle these problems, the paper presents a fully neuromorphic vision-to-control pipeline for controlling a drone in flight. Specifically, the authors trained a spiking neural network that accepts raw event-camera data directly and outputs low-level control commands, enabling vision-based autonomous flight.

Source of Research

This paper was authored by F. Paredes-Vallés, J. J. Hagenaars, J. Dupeyroux, et al., from the Micro Air Vehicle Laboratory at the Faculty of Aerospace Engineering, Delft University of Technology, the Netherlands. It was published in Science Robotics (Sci. Robot. 9, eadi0591, 2024) on May 15, 2024.

Research Process

(a) Research Workflow

The research includes several steps, detailed as follows:

  1. Data Collection and Neural Network Training:

    • A DVS240 event camera was used to record real event data; facing downward, it captured a static, highly textured planar surface for training and evaluating the neural network.
    • A spiking neural network with five layers and 28,800 neurons was trained with a self-supervised learning method, its weights updated via backpropagation through time, and the trained pipeline was assessed in simulation.
    • The vision network was trained on the recorded real event data, learning to map raw events to estimates of the drone's ego-motion.
  2. Neural Network Structure and Implementation:

    • The architecture comprises an input layer, three autoregressive encoder layers, and a pooling layer, with 7,200 neurons and 506,400 synapses. The network processes independent 16x16-pixel regions of interest (ROIs) and produces an optical flow estimate for each ROI (a minimal sketch of this per-ROI spiking processing appears after this list).
    • The control part uses an evolutionary algorithm to learn a linear decoding layer that maps the optical flow estimates produced by the vision network to low-level flight control commands (an illustrative evolutionary-strategy sketch also appears after this list).
  3. Experiments and Validation:

    • The neural network was deployed on Intel's Loihi neuromorphic processor, running at 200 Hz with an idle power draw of 0.94 W that increases by only 7 to 12 mW while the network is executing.
    • Flight experiments show that the network accurately controls the drone's ego-motion, performing tasks such as hovering, landing, and lateral maneuvers, and remaining stable even while yawing.
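
To make the per-ROI spiking processing described above concrete, the following is a minimal sketch, not the authors' Loihi implementation: leaky integrate-and-fire (LIF) neurons whose weights are shared across 16x16-pixel ROIs, each ROI keeping its own membrane state, with a linear readout from spike activity to a per-ROI optical flow estimate. Layer sizes, constants, and all names are illustrative assumptions.

```python
# A minimal, illustrative sketch (not the authors' Loihi implementation):
# leaky integrate-and-fire (LIF) neurons with one weight matrix shared across
# all 16x16-pixel ROIs, each ROI keeping its own membrane state, plus a linear
# readout that turns spike activity into a per-ROI optical flow estimate.
# Layer sizes, constants, and variable names are assumptions for illustration.
import numpy as np

class SharedLIFLayer:
    """Discrete-time LIF layer: shared weights, independent state per ROI."""
    def __init__(self, n_in, n_out, n_rois, leak=0.9, threshold=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 0.1, size=(n_in, n_out))
        self.leak, self.threshold = leak, threshold
        self.v = np.zeros((n_rois, n_out))              # membrane potential per ROI

    def step(self, spikes_in):                          # spikes_in: (n_rois, n_in)
        self.v = self.leak * self.v + spikes_in @ self.w    # leaky integration
        spikes_out = (self.v >= self.threshold).astype(float)
        self.v = np.where(spikes_out > 0, 0.0, self.v)      # hard reset on spike
        return spikes_out

def events_to_rois(events, height, width, roi=16):
    """Bin (x, y, polarity) events into a frame and split it into flat ROIs."""
    frame = np.zeros((height, width))
    for x, y, _polarity in events:
        frame[y, x] = 1.0
    return np.stack([frame[r:r + roi, c:c + roi].ravel()
                     for r in range(0, height, roi)
                     for c in range(0, width, roi)])    # (n_rois, roi*roi)

height, width, roi = 64, 64, 16
n_rois = (height // roi) * (width // roi)
encoder = SharedLIFLayer(n_in=roi * roi, n_out=32, n_rois=n_rois)
readout = np.random.default_rng(1).normal(0.0, 0.1, size=(32, 2))

events = [(10, 12, 1), (11, 12, 1), (40, 33, -1)]       # toy event stream
spikes = encoder.step(events_to_rois(events, height, width, roi))
flow_per_roi = spikes @ readout                          # (n_rois, 2): (u, v) per ROI
print(flow_per_roi.shape)
```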
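
Likewise, a rough sketch of how an evolutionary strategy could tune a linear decoding layer is shown below. The toy vertical-landing rollout, in which an optical-flow divergence observation is mapped to a thrust command, stands in for the paper's simulator; the dynamics, fitness function, and all parameters are placeholder assumptions rather than the authors' setup.

```python
# Illustrative sketch (assumptions throughout): evolving a linear decoding
# layer with a simple Gaussian-perturbation evolutionary strategy. The toy
# vertical-landing rollout maps an optical-flow divergence observation to a
# thrust command; its dynamics, the fitness function, and all parameters are
# placeholders standing in for the paper's simulator, not the authors' setup.
import numpy as np

rng = np.random.default_rng(0)
GRAVITY, DT = 9.81, 0.02

def rollout(weights):
    """Toy landing episode: reward a steady descent that ends with low speed."""
    h, v, cost = 10.0, 0.0, 0.0                  # height [m], vertical velocity [m/s]
    for _ in range(500):
        divergence = -v / max(h, 0.1)            # ventral-flow divergence observation
        thrust = float(np.clip(weights[0] * divergence + weights[1],
                               0.0, 2.0 * GRAVITY))
        v += (thrust - GRAVITY) * DT             # simplistic vertical dynamics
        h += v * DT
        cost += (divergence - 0.5) ** 2 * DT     # track a constant-divergence descent
        if h <= 0.0:
            break
    return -(cost + abs(v))                      # higher fitness = smoother touchdown

pop_size, generations, sigma = 32, 50, 0.2
best = np.array([0.0, GRAVITY])                  # start from a hover-like decoder
for _ in range(generations):
    population = best + sigma * rng.normal(size=(pop_size, best.size))
    scores = np.array([rollout(candidate) for candidate in population])
    best = population[np.argmax(scores)]         # simple elitist selection

print("evolved decoder weights:", best)
```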

(b) Core Experimental Results

  1. Visual Part Results:

    • Using a self-supervised contrast-maximization framework, the network learns accurate optical flow estimation by motion-compensating (warping) the events from the event camera (see the sketch after this list).
    • The network captures motion information from the input event stream and maintains accuracy even during fast rotations of roughly 4 rad/s.
  2. Control Part Results:

    • The control part was trained and validated in simulation, and its performance transferred to the real world as expected.
    • Despite hardware limitations and model simplifications, flight experiments demonstrated that the network enables the drone to execute ego-motion maneuvers such as horizontal flight and vertical landing accurately.
  3. Energy Consumption and Efficiency:

    • Measurements of Loihi's operating power across different flight sequences show a clear efficiency advantage for neuromorphic hardware when processing sparse event inputs, with power consumption well below that of a Jetson Nano in its 10 W mode (a back-of-the-envelope energy estimate follows this list).
    • Although most of Loihi's power draw is idle power, its overall energy consumption remains much lower than that of embedded GPU platforms.
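
To illustrate the contrast-maximization principle behind the self-supervised loss mentioned above, here is a minimal sketch: events are warped along a candidate optical flow to a common reference time, and the sharpness (variance) of the resulting image of warped events scores how well that flow compensates the motion. Nearest-pixel accumulation and a plain variance objective are simplifications; the paper uses a learned, differentiable variant of this idea.

```python
# Minimal sketch of the contrast-maximization idea behind the self-supervised
# loss: warp events along a candidate optical flow to one reference time and
# score the sharpness (variance) of the resulting image of warped events.
# Nearest-pixel accumulation and a plain variance objective are simplifying
# assumptions; the paper uses a learned, differentiable variant of this idea.
import numpy as np

def warped_event_image(events, flow, t_ref, height, width):
    """Accumulate (x, y, t, polarity) events warped by a constant (u, v) flow."""
    image = np.zeros((height, width))
    u, v = flow
    for x, y, t, _polarity in events:
        xw = int(round(x - u * (t - t_ref)))     # move the event along the flow
        yw = int(round(y - v * (t - t_ref)))
        if 0 <= xw < width and 0 <= yw < height:
            image[yw, xw] += 1.0
    return image

def contrast(events, flow, t_ref, height, width):
    """Variance of the warped-event image: higher = better motion compensation."""
    return warped_event_image(events, flow, t_ref, height, width).var()

# Toy stream: a vertical edge moving at 5 px/s in x. The true flow yields a
# sharper (higher-contrast) image than an uncompensated one.
events = [(10 + 5.0 * t, y, t, 1)
          for t in np.linspace(0.0, 1.0, 50) for y in range(5, 15)]
print(contrast(events, (5.0, 0.0), 0.0, 20, 64),     # correct flow: sharp
      contrast(events, (0.0, 0.0), 0.0, 20, 64))     # no compensation: blurred
```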
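
Finally, a back-of-the-envelope check of the quoted Loihi figures: with roughly 0.94 W idle plus at most about 12 mW of dynamic power while the network runs at 200 Hz, the energy per network evaluation comes out to a few millijoules. The calculation below uses only the numbers quoted above; the exact per-sequence figures in the paper will differ.

```python
# Rough energy-per-inference estimate from the figures quoted above
# (idle 0.94 W, up to +12 mW while running, 200 Hz network rate).
idle_w, dynamic_w, rate_hz = 0.94, 0.012, 200.0
energy_per_eval_mj = (idle_w + dynamic_w) * 1000.0 / rate_hz
print(f"~{energy_per_eval_mj:.1f} mJ per network evaluation on Loihi")  # ~4.8 mJ
```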

(c) Research Conclusion

This paper demonstrates an effective fully neuromorphic vision-to-control pipeline, highlighting the potential of neuromorphic hardware for low-latency, low-power autonomous flight control. The experiments suggest that neuromorphic processing can run complex deep neural networks on small drones, bringing them closer to the agility and flexibility of flying animals such as insects.

Future research can further optimize the input/output bandwidth and interfaces of neuromorphic processors, enhancing visual processing and control performance in real-world applications. Eventually, transitioning to mixed-signal hardware may lead to greater efficiency improvements, though accompanied by significant development and deployment challenges.

(d) Research Highlights

  • Novelty: This study is the first to demonstrate a fully neuromorphic vision-to-control pipeline enabling autonomous flight of a drone.
  • Practicality: The experiments demonstrate successful sim-to-real transfer in real-world flight, showing the practical application potential of this technology.
  • Energy efficiency: Compared with conventional embedded GPUs, neuromorphic processing offers excellent energy efficiency and speed, making it especially suitable for resource-constrained small flying vehicles.

(e) Other Valuable Information

Beyond the experimental results and their significance discussed above, the paper documents each experimental step and method in detail, making the implementation clear and reproducible and providing a valuable reference for follow-up research and practical applications.

With its detailed data analysis and experimental results, this paper lays a solid foundation for exploring the potential of neuromorphic hardware for autonomous navigation on small robots and paves the way for further technological advances and practical applications.