Linking structure and function in RNNs trained on many cognitive tasks

Recurrent neural networks (RNNs) are a widely used tool for studying neural computation [1]. They can be trained on neuroscience-inspired tasks and reverse-engineered to generate novel hypotheses about the underlying computational mechanisms [2,3]. However, a general understanding of how the network connectivity gives rise to the observed network dynamics has been established only in specific cases and remains largely lacking.

In our previous work, we studied this question of linking structure and function in unconstrained RNNs and developed the concept of “operative dimensions” [4]. Operative dimensions provide a tool for identifying the subspace of the network connectivity in which a specific functional module is implemented. We studied operative dimensions in RNNs trained on neuroscience-inspired tasks [2,3] and showed that (1) individual functional modules can be related to specific subspaces of the network connectivity and (2) a large subspace of the connectivity in unconstrained, trained RNNs is irrelevant for solving the task.
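As a toy illustration of the underlying idea (not the exact method of [4], whose definition is more involved), one can score a candidate connectivity direction by removing the corresponding rank-1 component from the recurrent weight matrix and measuring how much the network dynamics change. The function and variable names below (`run_rnn`, `operative_score`) are illustrative, and the autonomous tanh-RNN is a deliberately simplified stand-in for a trained task network:

```python
import numpy as np

def run_rnn(W, x0, steps=50):
    """Iterate the autonomous RNN x_{t+1} = tanh(W x_t) and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(np.tanh(W @ xs[-1]))
    return np.array(xs)

def operative_score(W, q, x0, steps=50):
    """Functional impact of removing the rank-1 connectivity component along direction q.

    A large score marks q as an "operative" direction: the dynamics depend
    strongly on the part of W acting along it (toy definition, for illustration).
    """
    q = q / np.linalg.norm(q)
    W_reduced = W - np.outer(W @ q, q)  # subtract the component of W acting along q
    base = run_rnn(W, x0, steps)
    reduced = run_rnn(W_reduced, x0, steps)
    return np.linalg.norm(base - reduced)

# Toy example: a random (untrained) network, scored along the standard basis.
rng = np.random.default_rng(0)
N = 20
W = rng.standard_normal((N, N)) / np.sqrt(N)
x0 = rng.standard_normal(N)
scores = [operative_score(W, e, x0) for e in np.eye(N)]
```

In a trained network, one would instead rank many candidate directions by such a functional-impact score and ask whether the high-scoring subspace is low-dimensional and module-specific.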

The goal of this project is to apply operative dimensions to RNNs trained on many cognitive tasks [5] and to study how these RNNs implement different functional modules in their network connectivity.
Recent work studied these multi-task RNNs at the level of network function and found that different functional modules rely on distinct clusters of recurrent units [6]. Motivated by these findings, we aim here to understand whether these functional modules are similarly represented in the underlying network connectivity.

Requirements for this project are programming skills (MATLAB or Python), familiarity with neural networks, and basic knowledge of linear algebra and standard data-analysis methods such as principal component analysis.

Dr. Renate Krause: rekrau (at)
Subject Line: “Master Thesis/Project”
Laboratory of Prof. Valerio Mante
Institute of Neuroinformatics
University of Zurich and ETH Zurich

[1] Barak, O. Recurrent neural networks as versatile tools of neuroscience research. Current Opinion in Neurobiology, 46, 1-6 (2017).
[2] Sussillo, D., & Barak, O. Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks. Neural computation, 25, 626-649 (2013).
[3] Mante, V., Sussillo, D., Shenoy, K. V., & Newsome, W. T. Context-dependent computation by recurrent dynamics in prefrontal cortex. Nature, 503, 78-84 (2013).
[4] Krause, R., Cook, M., Kollmorgen, S., Mante, V., & Indiveri, G. Operative dimensions in unconstrained connectivity of recurrent neural networks. Advances in Neural Information Processing Systems (NeurIPS) (2022).
[5] Yang, G. R., Joglekar, M. R., Song, H. F., Newsome, W. T., & Wang, X. J. Task representations in neural networks trained to perform many cognitive tasks. Nature Neuroscience, 22, 297-306 (2019).
[6] Driscoll, L., Shenoy, K., & Sussillo, D. Flexible multitask computation in recurrent networks utilizes shared dynamical motifs. bioRxiv (2022).

© 2023 Institut für Neuroinformatik