The neural activation patterns evoked by music listening can reveal whether a subject did or did not receive music training. In the current exploratory study, we approach this two-group (musicians and nonmusicians) classification problem through a computational framework composed of the following steps: acoustic feature extraction; acoustic feature selection; trigger selection; EEG signal processing; and multivariate statistical analysis. We are particularly interested in analyzing the brain data on a global level, considering its activity registered in electroencephalogram (EEG) signals at a given time instant. The results of our experiment—with 26 volunteers (13 musicians and 13 nonmusicians) who listened to Hungarian Dance No. 5 by Johannes Brahms—show that it is possible to linearly differentiate musicians and nonmusicians, with classification accuracies that range from 69.2% (test set) to 93.8% (training set), despite the limited sample sizes available. Additionally, given the whole-brain vector navigation method described and implemented here, our results suggest that it is possible to highlight the most expressive and discriminant changes in the participants' brain activity patterns depending on the acoustic feature extracted from the audio.
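The final step of the pipeline—linearly separating two groups of 13 subjects each from high-dimensional EEG feature vectors—can be sketched with a Fisher linear discriminant. The snippet below is a minimal illustration on synthetic data, not the authors' actual pipeline; the feature dimensionality, class separations, and regularization constant are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for whole-brain EEG feature vectors: 13 "musicians"
# and 13 "nonmusicians", each a d-dimensional vector (d = 8 is arbitrary).
d = 8
musicians = rng.normal(loc=0.5, scale=1.0, size=(13, d))
nonmusicians = rng.normal(loc=-0.5, scale=1.0, size=(13, d))

X = np.vstack([musicians, nonmusicians])
y = np.array([1] * 13 + [0] * 13)

# Fisher linear discriminant: w = Sw^{-1} (mu1 - mu0), with a small
# ridge term because the within-class scatter is near-singular at n = 26.
mu1, mu0 = X[y == 1].mean(axis=0), X[y == 0].mean(axis=0)
Sw = np.cov(X[y == 1], rowvar=False) + np.cov(X[y == 0], rowvar=False)
w = np.linalg.solve(Sw + 1e-6 * np.eye(d), mu1 - mu0)

# Project onto w and classify by the midpoint of the projected class means.
proj = X @ w
threshold = (proj[y == 1].mean() + proj[y == 0].mean()) / 2
pred = (proj > threshold).astype(int)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.3f}")
```

With real EEG data the dimensionality far exceeds the 26 available samples, which is why the gap between training and test accuracy reported above (93.8% vs. 69.2%) is to be expected; the regularized inverse in the sketch stands in for whatever dimensionality-reduction or regularization scheme the full method employs.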
A Whole Brain EEG Analysis of Musicianship
This study was financially supported in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior – Brazil (CAPES) – Finance Code 001, and the INCT MACC, process 465586/2014-7. We would like to thank all the volunteers who took part in the experiments.
Estela Ribeiro, Carlos Eduardo Thomaz; A Whole Brain EEG Analysis of Musicianship. Music Perception 1 September 2019; 37 (1): 42–56. doi: https://doi.org/10.1525/mp.2019.37.1.42