Event-based visual localization and map formation on Intel's neuromorphic research chip Loihi

Animals, even ones as small as bees (with a total of only about 1M neurons), robustly navigate complex environments. This requires efficient pose and velocity estimation (odometry) and map formation. Such a capability would enable many industry and consumer applications that cannot afford large batteries for on-device GPUs or cannot send data to the cloud for external processing, such as small autonomous drones or human-machine interfaces based on light-weight augmented reality (AR) glasses. However, only limited real-time, low-power systems for visual odometry are currently available, since visual processing classically requires high data throughput and intensive computation.

In this project, we are working on a visual odometry algorithm for event-based cameras, inspired by hyperdimensional computing and findings in neuroscience, which we plan to implement on Intel's neuromorphic research chip Loihi. See Rebecq et al. (2017) for a non-neuromorphic implementation of an event-based visual odometry algorithm.
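
To give a rough idea of the ingredients involved, the sketch below (not part of the project code; resolution, dimensionality, and function names are illustrative assumptions) shows how a stream of DVS-style events (x, y, timestamp, polarity) could be encoded with hyperdimensional computing: pixel coordinates are bound by element-wise multiplication of random bipolar hypervectors and bundled over a short time window, so that views can be compared with a simple dot product.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative sensor resolution and hypervector dimensionality.
    WIDTH, HEIGHT, DIM = 64, 64, 1024

    # Random bipolar hypervectors for the x and y coordinate axes.
    x_vecs = rng.choice([-1, 1], size=(WIDTH, DIM))
    y_vecs = rng.choice([-1, 1], size=(HEIGHT, DIM))

    def encode_event(x, y):
        """Bind the x and y hypervectors (element-wise product) to get
        a quasi-orthogonal code for the pixel location."""
        return x_vecs[x] * y_vecs[y]

    def encode_event_window(events):
        """Bundle (sum) the codes of all events in a short time window
        into a single hypervector describing the current view."""
        bundled = np.zeros(DIM)
        for x, y, t, p in events:
            bundled += encode_event(x, y)
        return np.sign(bundled)  # binarize back to bipolar

    # Toy event stream: (x, y, timestamp, polarity) tuples.
    events = [(10, 20, 0.001, 1), (11, 20, 0.002, -1), (40, 5, 0.003, 1)]
    view_code = encode_event_window(events)

    # Similarity between two view codes (e.g. current view vs. a stored
    # map location) can be measured with a normalized dot product.
    other_code = encode_event_window([(10, 21, 0.010, 1), (12, 20, 0.011, 1)])
    similarity = view_code @ other_code / DIM
    print(f"view similarity: {similarity:.3f}")

In the project itself, such operations would be mapped onto spiking neurons on Loihi rather than run as dense NumPy arithmetic; the sketch only illustrates the binding-and-bundling idea behind hyperdimensional encoding of event data.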

Project level: Semester project, Bachelor, or Master thesis.

Literature
Rebecq, H., Gallego, G. and Scaramuzza, D. 2016. EMVS: Event-based Multi-View Stereo. In: Proceedings of the British Machine Vision Conference 2016. British Machine Vision Association, pp. 63.1–63.11.

Rebecq, H., Horstschaefer, T., Gallego, G. and Scaramuzza, D. 2017. EVO: A Geometric Approach to Event-Based 6-DOF Parallel Tracking and Mapping in Real Time. IEEE Robotics and Automation Letters 2(2), pp. 593–600.

Renner, A., Evanusa, M. and Sandamirskaya, Y. 2019. Event-based attention and tracking on neuromorphic hardware. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops.

Sandamirskaya, Y. 2013. Dynamic neural fields as a step toward cognitive neuromorphic architectures. Frontiers in Neuroscience 7, p. 276.

Requirements

Experience in Python; experience with, or a strong interest in, spiking neural networks and brain-inspired algorithms.

Contact

Alpha Renner alpren (at) ini.uzh.ch