Cognitive and Emotional Modulation via Music-Based Entrainment
The current study explores how different genres of music influence motor coordination, reaction time, and physiological arousal in healthy young adults. Using the Muse S/2 wearable device, we acquire simultaneous EEG and PPG recordings during an interactive task designed to synchronize motor actions with the rhythm of music. The task, “Tempo-Matching Motion Grid,” requires users to react to visual cues in time with musical beats, challenging their ability to entrain to dynamic auditory stimuli.
While many studies focus on passive music listening or generic attention tasks, this project introduces a closed-loop interactive paradigm where participants' behavior is actively shaped by the evolving tempo and complexity of music. By recording EEG and PPG data during performance, we aim to investigate how genre, tempo, and individual preference affect cognitive enhancement and emotional regulation, with potential applications in neuroadaptive interfaces, mental training, or personalized therapy.
To move toward a full system capable of decoding user state and adapting stimulation in real time, this project focuses first on collecting and analyzing a multimodal dataset across music genres. Future steps could involve building predictive models to identify optimal genre-task pairings for each individual.
Methodology
- Design an interactive “rhythm coordination task” implemented in Python or Unity: participants match taps to beats while tracking visual cues on a grid.
- Acquire EEG (4 channels) and PPG using the Muse headset, recording signals continuously during each task block (across 5 music genres and silence).
- Preprocess EEG (filtering, artifact rejection) and extract power in the theta, alpha, and beta bands; analyze PPG for heart rate (HR) and heart rate variability (HRV).
- Compute behavioral metrics: reaction time, timing error, and coordination accuracy per genre.
- Correlate physiological measures (EEG/PPG) with task performance and subjective feedback (valence/arousal questionnaires).
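As a starting point for the rhythm coordination task, the beat schedule and grid cue positions for one block can be generated offline before playback. This is a minimal sketch: the function name `make_trial`, the 3x3 grid default, and the seeded randomization are illustrative assumptions, not a prescribed design.

```python
import itertools
import random


def make_trial(bpm, n_beats, grid=(3, 3), seed=0):
    """Return [(beat_time_s, (row, col)), ...] for one task block.

    Beat times follow a fixed tempo; each beat gets a random grid cell
    where the visual cue will appear (seeded for reproducible blocks).
    """
    rng = random.Random(seed)
    cells = list(itertools.product(range(grid[0]), range(grid[1])))
    period = 60.0 / bpm  # seconds per beat
    return [(i * period, rng.choice(cells)) for i in range(n_beats)]
```

A Unity implementation would schedule the same `(time, cell)` pairs against the audio clock instead.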
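For the band-power step, one common approach is Welch's method on each EEG channel. The sketch below assumes the Muse's nominal 256 Hz EEG sampling rate; the exact band limits (theta 4-8 Hz, alpha 8-13 Hz, beta 13-30 Hz) and the `band_powers` helper name are illustrative choices, not fixed by the protocol.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # nominal Muse EEG sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}


def band_powers(eeg, fs=FS):
    """Mean PSD per band for one EEG channel (1-D array)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)  # 2-s windows
    return {
        name: psd[(freqs >= lo) & (freqs < hi)].mean()
        for name, (lo, hi) in BANDS.items()
    }
```

In practice this would run after filtering and artifact rejection, per channel and per task block.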
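For the PPG analysis, HR and a time-domain HRV measure (RMSSD) can be derived from detected pulse peaks. The 64 Hz PPG rate, the 0.4 s minimum peak spacing (caps detection at ~150 bpm), and the mean-amplitude peak threshold are assumptions for this sketch.

```python
import numpy as np
from scipy.signal import find_peaks

FS_PPG = 64  # assumed Muse PPG sampling rate (Hz)


def hr_hrv(ppg, fs=FS_PPG):
    """Heart rate (bpm) and RMSSD (ms) from one raw PPG segment."""
    peaks, _ = find_peaks(ppg, distance=int(0.4 * fs), height=np.mean(ppg))
    ibi = np.diff(peaks) / fs                 # inter-beat intervals (s)
    hr = 60.0 / ibi.mean()
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2)) * 1000.0  # ms
    return hr, rmssd
```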
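The behavioral metrics can be computed by matching each tap to its nearest beat. The 120 ms hit tolerance below is an assumed value for illustration; the actual threshold would be set during protocol design.

```python
import numpy as np


def timing_metrics(tap_times, beat_times, tol=0.12):
    """Mean absolute timing error (s) and coordination accuracy.

    tap_times, beat_times: timestamps in seconds on a shared clock.
    tol: a tap within +/- tol of a beat counts as a hit (assumed 120 ms).
    """
    beats = np.asarray(beat_times)
    errors = np.array([np.min(np.abs(beats - t)) for t in tap_times])
    return {
        "mean_abs_error": errors.mean(),
        "accuracy": float(np.mean(errors <= tol)),  # fraction of hits
    }
```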
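For the final correlation step, per-block physiological features can be tested against per-block performance scores; a Pearson correlation is one simple choice. The `correlate_features` helper and its dict-based interface are assumptions of this sketch.

```python
import numpy as np
from scipy.stats import pearsonr


def correlate_features(physio, performance):
    """Pearson r (and p-value) between each physiological feature
    and a performance score.

    physio: {feature_name: per-block values}, e.g. alpha power per block.
    performance: per-block scores (e.g. coordination accuracy), same length.
    """
    perf = np.asarray(performance)
    return {name: pearsonr(np.asarray(vals), perf)
            for name, vals in physio.items()}
```

The same interface extends to subjective valence/arousal ratings by treating them as additional per-block scores.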
Expected Outcomes
- Hands-on experience in EEG/PPG acquisition and synchronization using wearable sensors (Muse S/2).
- A fully implemented interactive task and controlled protocol for cross-genre testing.
- A multimodal dataset with EEG, PPG, and performance metrics under six experimental conditions (five genres plus silence).
- Insights into how music type and tempo modulate cognitive and emotional states during motor tasks.
Available Material
- Muse S/2 headset (4-channel EEG + PPG)
Requirements
- Python programming (for task, data collection, and analysis)
- Interest in music cognition, affective neuroscience, or biofeedback systems
Contact
Elisa Donati (elisa@ini.uzh.ch)