Predicting Dominant Beat Frequency from Brain Responses while Listening to Music

Source
Proceedings of the 2021 IEEE International Conference on Bioinformatics and Biomedicine (BIBM 2021)
Date Issued
2021-01-01
Author(s)
Pandey, Pankaj
Ahmad, Nashra
Miyapuram, Krishna Prasad  
Lomas, Derek
DOI
10.1109/BIBM52615.2021.9669750
Abstract
Modern neuroscience has shown that the brain is profoundly rhythmic and that the frequencies of neural rhythms are responsive to the frequencies of musical rhythms. We collected electroencephalography (EEG) responses to 12 naturalistic music stimuli (songs) from 20 participants. We retrieved the tempo and its sub-harmonics from the stimuli and used this information to predict the beats in the brain responses using machine learning techniques. We observed a hierarchy of beats in each song, with a specific beat frequency dominating the others (i.e., higher in magnitude). This led us to form three groups of songs and their brain responses, each group corresponding to the beat frequency that dominated the beat hierarchy of its songs. We used short segments of 1, 3, and 5 seconds of brain responses rather than the entire song duration. We created two sets for classification of the three groups of brain responses and utilized two spatial filtering techniques, the mean across electrodes (ME) and the first principal component (PC1), along with a Dense method using data from all electrodes. This was followed by feature extraction using band power. We developed univariate and multivariate classification models to demonstrate the significance of each frequency band that represents a beat frequency. The Dense method outperformed ME and PC1. Features related to the eighth note produced the maximum discrimination between classes. We also observed a positive correlation between window length and the rate of correct prediction: accuracy improved significantly from the one-second to the five-second window in both sets. We achieved maximum accuracies of 70% for binary and 56% for ternary classification, which is 20% above chance-level accuracy. Random Forest and kNN performed better than SVM. This work contributes to the growing body of knowledge on the underlying neural mechanisms of rhythm processing in the brain.
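
The abstract outlines a pipeline of windowing, spatial filtering, band-power feature extraction, and classification. The Python sketch below illustrates one plausible reading of that pipeline on synthetic data; the sampling rate, candidate beat frequencies, bandwidth, and classifier settings are illustrative assumptions, not the authors' exact configuration.

# Hypothetical sketch of the pipeline described in the abstract: segment EEG
# into short windows, spatially filter (mean across electrodes or first
# principal component), extract band power around beat-related frequencies,
# and classify. Shapes, frequencies, and parameters are assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 250                      # assumed EEG sampling rate (Hz)
WINDOW_S = 3                  # window length in seconds (paper uses 1, 3, 5)
BEAT_FREQS = [1.0, 2.0, 4.0]  # assumed tempo sub-harmonics (e.g., quarter/eighth-note rates)

def spatial_filter(window, method="ME"):
    """Reduce a (channels, samples) window to a single time series."""
    if method == "ME":                     # mean across electrodes
        return window.mean(axis=0)
    pca = PCA(n_components=1)              # PC1 across channels
    return pca.fit_transform(window.T).ravel()

def band_power_features(signal, fs=FS, half_bw=0.25):
    """Mean PSD power in a narrow band around each candidate beat frequency."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), fs * 2))
    feats = []
    for f0 in BEAT_FREQS:
        mask = (freqs >= f0 - half_bw) & (freqs <= f0 + half_bw)
        feats.append(psd[mask].mean())
    return np.asarray(feats)

# Synthetic stand-in data: 60 windows, 32 channels, WINDOW_S-second windows,
# with three dominant-beat classes (the ternary set described in the abstract).
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((60, 32, FS * WINDOW_S))
y = rng.integers(0, 3, size=60)

X = np.stack([band_power_features(spatial_filter(w, "ME")) for w in X_raw])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("ternary CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

On real data, the "Dense" variant the abstract reports as best would skip the reduction to a single series and concatenate band-power features from every electrode instead.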
URI
https://d8.irins.org/handle/IITG2025/26396
Subjects
EEG | Entrainment | Machine Learning | Rhythm