Time-invariant shape recognition using tactile sensors on neuromorphic hardware (in collaboration with the University of Chicago)
This project focuses on classifying object shapes from data generated by tracing the shapes across a tactile sensor array. A dataset has been collected that includes shapes with varying numbers of sides, sizes, locations, trace speeds, and trace widths. The goal is to develop a system capable of recognizing object shapes independently of these characteristics, with an initial focus on size and trace speed. The work builds on previously presented research [1], now leveraging spiking neural networks (SNNs) implemented on neuromorphic hardware, specifically the Dynapse chip [2] developed at the Institute of Neuroinformatics (INI). The system will process tactile data to spatially reproduce object shapes on-chip, enabling both shape classification and clustering based on tactile patterns.
The Dynapse chip is a neuromorphic processor designed for efficient, real-time processing of spiking neural networks. It features an event-driven architecture that mimics the behavior of biological neurons and synapses, making it particularly suitable for processing spatiotemporal data such as tactile information. The chip supports large-scale SNNs.
SNNs are a class of neural networks that encode information as discrete events, or spikes, in both time and space. Unlike traditional artificial neural networks, SNNs process data dynamically and can capture temporal dependencies inherent in tactile interactions.
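To make the notion of discrete spike events concrete, the sketch below simulates a single leaky integrate-and-fire (LIF) neuron in plain Python/NumPy. This is a generic textbook model, not the Dynapse's analog neuron circuit; all parameter values (time constant, threshold) are illustrative assumptions.

```python
import numpy as np

def lif_simulate(input_current, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """Simulate a single leaky integrate-and-fire neuron.

    Returns the membrane-potential trace and the indices of the
    time steps at which the neuron emitted a spike.
    Parameter values here are illustrative, not Dynapse settings.
    """
    v = 0.0
    trace, spikes = [], []
    for t, i_in in enumerate(input_current):
        # Leaky integration of the input current toward i_in
        v += dt / tau * (-v + i_in)
        if v >= v_thresh:       # threshold crossing -> discrete spike event
            spikes.append(t)
            v = v_reset         # reset membrane after the spike
        trace.append(v)
    return np.array(trace), spikes

# A constant suprathreshold drive produces a regular spike train:
# information is carried by *when* the spikes occur, not by a
# continuous activation value as in a conventional ANN.
trace, spike_times = lif_simulate(np.full(200, 2.0))
```

The key contrast with a traditional artificial neuron is visible in the output: instead of one real-valued activation, the neuron produces a sequence of event times, which is what allows SNNs to represent the temporal structure of a tactile trace directly.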
In this project, the SNN will be designed to map tactile sensor inputs onto the chip, preserving the spatial structure of the tactile data to reproduce object shapes. The network will be trained to classify shapes independently of speed and size variations, using clustering mechanisms to group similar shapes together.
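The idea of preserving spatial structure can be sketched as a topographic mapping from taxel coordinates to neuron indices: neighbouring taxels drive neighbouring neurons, so the spike pattern on the chip mirrors the traced shape. The grid size and threshold below are hypothetical placeholders, not properties of the actual sensor array or the Dynapse routing.

```python
import numpy as np

# Hypothetical sensor layout; the real taxel count depends on the dataset.
GRID_ROWS, GRID_COLS = 4, 4

def taxel_to_neuron(row, col, n_cols=GRID_COLS):
    """Row-major mapping from a taxel coordinate to an on-chip neuron ID,
    so spatially adjacent taxels map to adjacent neuron indices."""
    return row * n_cols + col

def frame_to_events(frame, threshold=0.5):
    """Convert one tactile frame (a 2D pressure array) into a list of
    neuron IDs that should spike: taxels above threshold emit an event."""
    events = []
    for r in range(frame.shape[0]):
        for c in range(frame.shape[1]):
            if frame[r, c] > threshold:
                events.append(taxel_to_neuron(r, c))
    return events
```

With such a mapping, streaming the frames of a trace into the network reproduces the contact pattern, and the downstream layers can operate on the spatial spike pattern rather than on raw pressure values.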
Methodology
1. The tactile sensors will be mapped spatially onto the Dynapse chip to reproduce object shapes as spiking patterns that persist over time.
2. A spiking neural network will be designed to process the tactile data. The network will be optimized for real-time processing on the Dynapse hardware.
3. The system will classify objects based on their shapes while clustering similar shapes into groups. This process will account for variations in speed and size, relying on the dynamic capabilities of the SNN.
4. A demonstration system will be developed to visualize the spatial reproduction of shapes and real-time classification on the Dynapse chip.
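The steps above can be sketched in software to show why accumulating spikes spatially and normalising the result gives invariance to trace speed: a slower trace emits more spikes per taxel but the same spatial pattern. This is a plain NumPy illustration of the principle, not the on-chip SNN implementation; the templates and nearest-template classifier are stand-ins for the learned clustering.

```python
import numpy as np

def accumulate_spike_map(event_frames, grid_shape):
    """Step 1: accumulate per-neuron spike counts over a whole trace,
    so the shape persists spatially after the stylus has moved on."""
    spike_map = np.zeros(grid_shape)
    for frame in event_frames:       # each frame: taxel activity (0/1)
        spike_map += frame
    return spike_map

def normalise(spike_map):
    """Normalising the accumulated map removes the overall spike-count
    scaling introduced by trace speed."""
    total = spike_map.sum()
    return spike_map / total if total > 0 else spike_map

def classify(spike_map, templates):
    """Steps 2-3 (simplified): nearest-template classification on the
    normalised spatial maps; `templates` maps labels to reference maps."""
    query = normalise(spike_map)
    dists = {label: np.linalg.norm(query - normalise(t))
             for label, t in templates.items()}
    return min(dists, key=dists.get)
```

For example, a square traced slowly (three identical frames accumulated) and the same square traced quickly (one frame) yield identical normalised maps, so both are assigned the same label.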
Expected Outcomes
The expected outcomes of this project include:
- A fully implemented SNN on the Dynapse chip
- A functional demo showcasing real-time tactile shape reproduction
- A proof of concept for using neuromorphic hardware in shape-based object classification tasks, with potential applications in robotics and prosthetics.
Available material
Requirements
MATLAB programming for working with the existing dataset (the data can be ported to Python if desired); Python programming for working with the Dynapse; familiarity with signal processing and neural networks.
Contacts
Elisa Donati (elisa@ini.uzh.ch), Chiara De Luca (chiaradeluca@ini.uzh.ch), Mark Iskarous (miskarous@uchicago.edu)
References
[1] M. M. Iskarous, Z. Chaudhry, F. Li, S. Bello, S. Sankar, A. Slepyan, N. Chugh, C. L. Hunt, R. J. Greene, and N. V. Thakor, "Invariant neuromorphic representations of tactile stimuli improve robustness of a real-time texture classification system," arXiv preprint arXiv:2411.17060, 2024.
[2] S. Moradi, N. Qiao, F. Stefanini, and G. Indiveri, “A Scalable Multicore Architecture With Heterogeneous Memory Structures for Dynamic Neuromorphic Asynchronous Processors (DYNAPs),” IEEE Trans. Biomed. Circuits Syst., vol. 12, no. 1, pp. 106–122, Feb. 2018, doi: 10.1109/TBCAS.2017.2759700.