Neuromorphic Compiler: Mapping a learned connectivity matrix onto small-world graphical hardware

Motivation


As Artificial Intelligence (AI) becomes an ever larger part of our daily lives, “Edge Computing” is emerging as a solution to reduce the power footprint of running neural network models by removing the need to send sensory data to remote computing units. Custom AI accelerators are at the forefront of edge computing solutions for processing sensory information locally, so network models must be able to run on the accelerator hardware. This is not trivial, however, as trained networks do not take the specific underlying architecture of the hardware into account. The objective of this master’s thesis is to develop a graph embedding algorithm for mapping a trained weight matrix onto an available hardware architecture.

Project Description


Background


Recently, we have proposed a novel neuromorphic hardware architecture, the Mosaic, which is inspired by the local connectivity structure of the cortex [2]. The Mosaic is a spiking recurrent neural network accelerator built on scalable emerging memory technologies, i.e., Resistive Random Access Memory (RRAM). Its network architecture has a specific connection pattern featuring the small-world property, which makes it extremely area- and energy-efficient and brings additional computational benefits.
In order to use the Mosaic, we need a method to allocate neurons to the network structure imposed by the Mosaic architecture. At the mathematical level, this problem is closely related to graph isomorphism, so we will draw on tools from graph theory to solve it.

Basic definitions


Neural Network: A graph with neurons as its nodes and synapses as its edges
Spatial Embedding: A list of points in 2D space, where every point corresponds to one neuron
Mosaic Architecture: A model of the Mosaic, with neuron-cores containing the neurons and routing-cores connecting neurons across cores
Mosaic Neuron Embedding: A map assigning each neuron to a neuron-core
Mosaic Connectivity Embedding: A map assigning each synapse to the routing-cores it traverses
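These definitions can be made concrete with plain Python data structures. The sketch below is purely illustrative: the toy network, core names, and router names are invented for this example and are not part of the Mosaic specification.

```python
# Toy neural network: 4 neurons; synapses as directed (pre, post) edges.
synapses = [(0, 1), (1, 2), (2, 0), (2, 3)]

# Spatial Embedding: one 2D point per neuron.
spatial_embedding = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (0.5, 1.0), 3: (1.5, 1.0)}

# Mosaic Neuron Embedding: each neuron assigned to a neuron-core.
neuron_embedding = {0: "core_A", 1: "core_A", 2: "core_B", 3: "core_B"}

# Mosaic Connectivity Embedding: each synapse assigned to the routing-cores
# it traverses (empty when both neurons share a neuron-core).
connectivity_embedding = {
    (0, 1): [],             # within core_A
    (1, 2): ["router_AB"],  # core_A -> core_B
    (2, 0): ["router_AB"],  # core_B -> core_A
    (2, 3): [],             # within core_B
}
```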

Approach


The Mosaic architecture consists of neuron-cores containing the neurons and routing-cores that implement the synapses. Within each neuron-core the routing costs are negligible, so the first step will be to cluster the neurons so that as many connections as possible stay within a neuron-core.
Then we need to decide the position of the neuron-cores in the Mosaic. The key insight is that since the Mosaic is built on a 2D surface, it has an inherent 2D structure: the further apart two neuron-cores are, the more routing is needed between them. We will leverage this notion by using geometric embeddings [1], a recent tool in graph theory in which every node of a graph is assigned a spatial coordinate in a way that preserves distances. In such embeddings, two connected nodes will be close in the embedding space, while disconnected nodes will be far apart.
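As an illustration of the geometric-embedding idea, the sketch below (a toy example, not the thesis pipeline) computes a 2D spectral embedding from the graph Laplacian of a small two-cluster network. The low, non-constant Laplacian eigenvectors place nodes of the same densely connected cluster on the same side of the embedding:

```python
import numpy as np

# Toy undirected network: two densely connected clusters (triangles 0-1-2
# and 3-4-5) joined by a single long-range edge, a minimal small-world motif.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Spectral embedding: the low eigenvectors of the graph Laplacian L = D - A
# give each node a coordinate; strongly connected nodes land close together.
L = np.diag(A.sum(axis=1)) - A
eigvals, eigvecs = np.linalg.eigh(L)  # eigenvalues in ascending order
coords = eigvecs[:, 1:3]              # 2D embedding, skipping the constant eigenvector
```

The first embedding dimension (the Fiedler vector) separates the two triangles: nodes 0 and 1 receive coordinates with the same sign, while node 5, in the other cluster, receives the opposite sign.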
The thesis will therefore have three main goals. First, to build a set-up in which a Mosaic embedding can be evaluated in terms of feasibility and the amount of routing it requires. Second, to provide a heuristic that maps a given neural network onto the Mosaic, and to evaluate it using this set-up. Finally, to propose a new training algorithm for neural networks in which the embedding and its evaluation are part of the training, so that the trained network is guaranteed to fit onto the Mosaic.

Requirements


• Strong programming skills in Python.
• Familiarity with graph theory.
• Background in the following fields is appreciated: network embedding, neuromorphic architectures, or machine learning.

Tasks involved in this project


Task 1: Evaluation framework


• Create a graph representing the Mosaic architecture.
• For a given Mosaic Neuron Embedding, estimate how much routing will be required.
• For a given Mosaic Connectivity Embedding, check that all routing-cores have enough capacity.
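A minimal sketch of the routing estimate, under the simplifying assumptions that neuron-cores sit on a 2D grid and that each inter-core synapse costs one routing hop per unit of Manhattan distance between cores; all names and the toy instance are illustrative:

```python
def routing_cost(synapses, neuron_to_core, core_position):
    """Total routing hops implied by a Mosaic Neuron Embedding."""
    total = 0
    for pre, post in synapses:
        x1, y1 = core_position[neuron_to_core[pre]]
        x2, y2 = core_position[neuron_to_core[post]]
        total += abs(x1 - x2) + abs(y1 - y2)  # Manhattan distance in hops
    return total

# Toy instance: 4 neurons on 2 cores placed at grid positions (0,0) and (1,0).
synapses = [(0, 1), (1, 2), (2, 3), (3, 0)]
cores = {0: "A", 1: "A", 2: "B", 3: "B"}
positions = {"A": (0, 0), "B": (1, 0)}
cost = routing_cost(synapses, cores, positions)  # only (1,2) and (3,0) cross cores
```

A capacity check for a Mosaic Connectivity Embedding would similarly count, per routing-core, how many synapses are assigned to it and compare against the hardware limit.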

Task 2: Generate an embedding algorithm


• Import trained neural networks and turn them into graphs.
• Test different balanced clustering methods on the graph.
• Embed the network of clusters into a 2D space.
• Using the 2D embedding, determine which cluster corresponds to which neuron-core in the Mosaic architecture.
• For a given Mosaic Neuron Embedding, generate a Mosaic Connectivity Embedding.
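The cluster-to-core assignment step can be sketched as a small combinatorial search. The toy below (an assumption-laden illustration, not the proposed heuristic) exhaustively tries every assignment of three clusters to three cores on a line and keeps the one with the least weighted Manhattan routing; brute force only scales to a handful of cores, which is exactly why the thesis calls for a heuristic:

```python
from itertools import permutations

# Inter-cluster traffic: number of synapses between each pair of toy clusters.
traffic = {(0, 1): 10, (1, 2): 1, (0, 2): 1}
core_pos = [(0, 0), (1, 0), (2, 0)]  # three neuron-cores on a line

def total_routing(assign):
    """Weighted Manhattan routing for a cluster -> core assignment tuple."""
    cost = 0
    for (a, b), w in traffic.items():
        (x1, y1), (x2, y2) = core_pos[assign[a]], core_pos[assign[b]]
        cost += w * (abs(x1 - x2) + abs(y1 - y2))
    return cost

# Exhaustive search over all 3! assignments.
best = min(permutations(range(3)), key=total_routing)
```

The heavily connected clusters 0 and 1 end up on adjacent cores, since placing them at opposite ends of the line would multiply the dominant traffic term by the larger distance.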

Task 3: Architecture-aware training


• Familiarize yourself with the existing method to train small-world neural networks.
• Define a loss function that penalizes synapses based on their routing cost.
• Design an iterative algorithm that trains a network, embeds it into the Mosaic, and repeats.
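One possible shape for such a routing-aware penalty term, sketched here as an illustration rather than the thesis algorithm: charge each synapse in proportion to its weight magnitude and to the Manhattan distance between the cores of its two neurons, so that training is pushed toward connectivity that is cheap to embed in the Mosaic.

```python
import numpy as np

def routing_penalty(W, neuron_core_xy):
    """Sum of |W[i, j]| times the Manhattan distance between the grid
    positions of neuron i's and neuron j's cores (positions are a toy
    assumption)."""
    n = W.shape[0]
    penalty = 0.0
    for i in range(n):
        for j in range(n):
            dx = abs(neuron_core_xy[i][0] - neuron_core_xy[j][0])
            dy = abs(neuron_core_xy[i][1] - neuron_core_xy[j][1])
            penalty += abs(W[i, j]) * (dx + dy)
    return penalty

W = np.array([[0.0, 1.0], [0.5, 0.0]])  # toy 2-neuron weight matrix
xy = [(0, 0), (1, 0)]                   # the two neurons sit on adjacent cores
p = routing_penalty(W, xy)              # (1.0 + 0.5) * 1 hop
```

Added to the task loss with a tunable coefficient, this term makes the embedding cost differentiable-in-spirit feedback for the iterative train-embed loop described above.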

Contact


If you are interested in this project for a Master’s thesis, please contact Melika Payvand and Pau Vilimelis Aceituno at {melika, pau}@ini.uzh.ch.

References


[1] Peng Cui, Xiao Wang, Jian Pei, and Wenwu Zhu. A survey on network embedding. IEEE Transactions on Knowledge and Data Engineering, 31(5):833–852, 2018.
[2] Thomas Dalgaty*, Filippo Moro*, Yigit Demirag*, Alessio De Pra, Giacomo Indiveri, Elisa Vianello, and Melika Payvand. The neuromorphic mosaic: re-configurable in-memory small-world graphs. 2021.

© 2022 Institut für Neuroinformatik