Master's thesis on Evolving Optimized Neuromorphic Hardware Architectures
Introduction and background
Rather than learning everything from scratch while interacting with the environment, which would be highly costly, our brain takes advantage of neural architectures developed through evolution. These architectures have small-world connectivity: neurons talk mostly to their neighbors, which reduces the energy required to send information to distant nodes. This can be thought of as a pre-optimized network topology.
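As a concrete illustration (not part of the project code), the small-world regime can be reproduced with a Watts-Strogatz graph in networkx: compared to a regular ring lattice, a few rewired long-range edges shorten average path lengths while local clustering stays high.

```python
import networkx as nx

# Watts-Strogatz graphs: n nodes, each linked to its k nearest neighbors,
# with rewiring probability p controlling how many long-range shortcuts exist.
n, k = 200, 8
regular     = nx.connected_watts_strogatz_graph(n, k, p=0.0)  # ring lattice, no shortcuts
small_world = nx.connected_watts_strogatz_graph(n, k, p=0.1)  # small-world regime
random_like = nx.connected_watts_strogatz_graph(n, k, p=1.0)  # fully rewired, random-like

for name, g in [("regular", regular), ("small-world", small_world), ("random", random_like)]:
    print(name,
          "clustering:", round(nx.average_clustering(g), 3),
          "avg path length:", round(nx.average_shortest_path_length(g), 2))
```

The small-world graph keeps most connections local (high clustering, hence cheap, short wires) while a handful of shortcuts keep the average path length low.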
Neuromorphic technologies take inspiration from the workings of the brain to design more efficient and intelligent systems. They use event-driven parallel processing to reduce the power consumption of the “spiking neural networks” running on them. When these systems are scaled up, multiple “neural cores” are required, which work in parallel and communicate with each other through “event routers”. The architecture of the chip, including the number of neurons per core, the fan-in, the fan-out and the number of cores, is typically decided by the designer based on hardware constraints.
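As a rough sketch of the design space involved (the parameter names below are illustrative and not taken from any specific chip), these architectural choices can be captured in a small configuration object:

```python
from dataclasses import dataclass

@dataclass
class CoreArchitecture:
    """Illustrative parameters of a multi-core neuromorphic chip (hypothetical names)."""
    num_cores: int          # number of neural cores working in parallel
    neurons_per_core: int   # neurons implemented in each core
    fan_in: int             # maximum incoming synapses per neuron
    fan_out: int            # maximum outgoing connections per neuron

    def total_neurons(self) -> int:
        return self.num_cores * self.neurons_per_core

    def max_synapses(self) -> int:
        # Upper bound on the synapses the chip can hold, limited by fan-in.
        return self.total_neurons() * self.fan_in

# Example: a hypothetical 16-core chip with 256 neurons per core.
chip = CoreArchitecture(num_cores=16, neurons_per_core=256, fan_in=64, fan_out=128)
print(chip.total_neurons(), chip.max_synapses())
```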
In this project, we will search for an optimized hardware architecture that is constrained not only by the hardware but also from an algorithmic point of view, inspired by how the brain’s topology has evolved. Since small-world connectivity inherently reduces energy consumption by reducing data movement, we will use evolutionary strategies to find an optimized small-world architecture for a class of problems such as speech recognition. Neural Architecture Search is currently a very important topic in the machine learning community, and we will exploit it for neuromorphic hardware optimization. This project is a collaboration between the Institute of Neuroinformatics and the University of Tennessee.
Methods
Evolutionary Optimization for Neuromorphic Systems (EONS) is an automated design method for spiking neural networks on neuromorphic systems [1]. EONS uses an evolutionary algorithm to design the topology and/or the parameters of a spiking neural network for neuromorphic hardware deployment. It can target a specific neuromorphic hardware design, or it can be used as part of a co-design process to determine characteristics of the neuromorphic architecture. Additionally, EONS can be used to define network topology and architecture requirements alongside a parameter-training approach.
[1] Schuman, Catherine D., J. Parker Mitchell, Robert M. Patton, Thomas E. Potok, and James S. Plank. "Evolutionary optimization for neuromorphic systems." In Proceedings of the Neuro-inspired Computational Elements Workshop, pp. 1-9. 2020.
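To make the co-design idea concrete, the sketch below shows a minimal evolutionary loop in the spirit of EONS; it is not the EONS API. Candidate connectivity graphs are mutated and selected with a fitness that mixes a placeholder task score with a small-world measure. Names such as task_score are hypothetical stand-ins for, e.g., SHD classification accuracy.

```python
import random
import networkx as nx

def small_worldness(g: nx.Graph) -> float:
    # Simple proxy: reward high clustering and short average paths.
    if not nx.is_connected(g):
        return 0.0
    return nx.average_clustering(g) / nx.average_shortest_path_length(g)

def task_score(g: nx.Graph) -> float:
    # Placeholder: in the project this would be the accuracy of a spiking
    # network built from the connectivity graph g and evaluated on SHD.
    return random.random()

def mutate(g: nx.Graph, n_rewires: int = 2) -> nx.Graph:
    # Rewire a few edges at random while keeping the node set fixed.
    child = g.copy()
    nodes = list(child.nodes)
    for _ in range(n_rewires):
        u, v = random.choice(list(child.edges))
        w = random.choice(nodes)
        if w != u and not child.has_edge(u, w):
            child.remove_edge(u, v)
            child.add_edge(u, w)
    return child

def fitness(g: nx.Graph, alpha: float = 0.5) -> float:
    # Weighted mix of task performance and small-world structure.
    return alpha * task_score(g) + (1 - alpha) * small_worldness(g)

# Evolve a small population of candidate connectivity graphs.
population = [nx.connected_watts_strogatz_graph(100, 6, 0.1) for _ in range(10)]
for generation in range(20):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]                              # keep the best half
    children = [mutate(random.choice(parents)) for _ in range(5)]
    population = parents + children
print("best fitness:", fitness(population[0]))
```

In the actual project, the fitness evaluation would come from EONS and the trained spiking network, and the small-world term would be imposed as a constraint rather than a fixed weighting.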
Requirements
- Strong programming skills in Python.
- In-depth skills in neural network training, hyperparameter optimization and debugging.
Tasks involved in the project:
- Study evolutionary strategies and understand the EONS code.
- Introduce the constraint of small-world connectivity into the EONS code.
- Train a spiking neural network with small-world connectivity on the Spiking Heidelberg Digits (SHD) dataset (see the sketch after this list).
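As a starting point for the last task, here is a minimal sketch of how a small-world connectivity constraint could be imposed on a recurrent weight matrix. The actual SHD training would use a spiking-network framework together with the EONS tooling; the input and output sizes below (700 channels, 20 classes) are SHD's published dimensions, while the hidden size and graph parameters are illustrative.

```python
import numpy as np
import networkx as nx

# SHD: 700 input channels, 20 output classes (digits 0-9 in English and German).
N_IN, N_HIDDEN, N_OUT = 700, 256, 20

# Build a small-world graph over the hidden neurons and turn it into a
# binary connectivity mask for the recurrent weight matrix.
g = nx.connected_watts_strogatz_graph(N_HIDDEN, k=8, p=0.1)
mask = nx.to_numpy_array(g, dtype=np.float32)        # 1 where an edge exists

rng = np.random.default_rng(0)
w_rec = rng.normal(0.0, 0.1, size=(N_HIDDEN, N_HIDDEN)).astype(np.float32)
w_rec *= mask                                        # prune non-small-world synapses

print("recurrent connection density:", mask.mean())
```

During training, the same mask would be re-applied after each weight update so that pruned connections stay at zero and the small-world structure is preserved.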
Contact
If you are interested in this project for a Master's thesis, please contact Melika Payvand and Katie Schuman at melika (at) ini.uzh.ch and cschuman (at) utk.edu and include your CV.