Video-Based Tracking of Groups of Songbirds

We have built a setup for measuring behavior in groups (2–8 animals) of freely behaving songbirds. The setup is equipped with three cameras that continuously monitor the animals. Our goal is to track the position, body orientation, and identity (and possibly additional posture points) of the birds in 3D from the camera recordings. Several deep-learning-based tools have recently been published for this purpose (1–4). While these methods work well for tracking a single animal in 2D in short videos, tracking multiple animals in 3D over exceptionally long recordings remains challenging (we want to track the animals continuously over multiple weeks). The task of this thesis will be to develop a labeling tool for annotating multi-camera training frames, to build a data-processing pipeline for reliably tracking the birds over long video recordings, and to evaluate the pipeline's performance on recordings from the setup. Some manual labeling of training frames will be required in the process.
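A core step in any such pipeline is lifting 2D detections from the three camera views into 3D. As a minimal sketch (not the pipeline itself, and assuming calibrated cameras with known 3×4 projection matrices), a keypoint detected in two or more views can be triangulated with the standard direct linear transform (DLT):

```python
import numpy as np

def triangulate_dlt(proj_mats, points_2d):
    """Triangulate one 3D point from its 2D projections in two or
    more calibrated cameras via the direct linear transform (DLT).

    proj_mats: list of 3x4 camera projection matrices.
    points_2d: list of (x, y) pixel coordinates, one per camera.
    Returns the 3D point as a length-3 numpy array.
    """
    rows = []
    for P, (x, y) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on the
        # homogeneous 3D point X: x * (P[2] @ X) = P[0] @ X and
        # y * (P[2] @ X) = P[1] @ X.
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.asarray(rows)
    # The least-squares solution is the right singular vector
    # associated with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

In practice the pipeline would triangulate every keypoint in every frame (robustly, e.g. discarding low-confidence detections), but the linear algebra above is the underlying operation.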

References:
1. Lauer J, Zhou M, Ye S, Menegas W, Nath T, Rahman MM, et al. Multi-animal pose estimation and tracking with DeepLabCut. bioRxiv. 2021 Apr 30.
2. Pereira TD, Tabris N, Li J, Ravindranath S, Papadoyannis ES, Wang ZY, et al. SLEAP: Multi-animal pose tracking. bioRxiv. 2020 Sep 2.
3. Marks M, Qiuhan J, Sturman O, von Ziegler L, Kollmorgen S, von der Behrens W, et al. SIPEC: the deep-learning Swiss knife for behavioral data analysis. bioRxiv. 2020 Oct 26.
4. Mathis MW, Mathis A. Deep learning tools for the measurement of animal behavior in neuroscience. Curr Opin Neurobiol. 2020;60:1–11.

Requirements

• Python
• SSH / Docker / bash scripting / git (can be learned during the project)
• Basic machine learning knowledge
• Plus: computer vision skills

Contact

Linus Rüttimann, rlinus (at) ini.ethz.ch

© 2021 Institut für Neuroinformatik