Pankaj Pandey et al.

Research in music perception and brain activity has led to the development of Music Brain-Computer Interface systems. While previous studies have focused on aspects such as song identification, stimulus-response correlation, and inter-subject correlation, they have largely overlooked individual differences in subjective experience. In this study, we aim to identify listener-specific neural signatures by analyzing EEG data from six naturalistic music datasets collected in the USA, Greece, and India, involving a total of 161 listeners. Our feature representation pipeline decomposes the EEG signal into five primary brain waves and extracts 21 linear and non-linear features, on which we train Random Forest classifiers to predict listener identity and to determine which features best discriminate among listeners. The results demonstrate that neural oscillations in higher frequency bands are crucial for distinguishing subjective differences: beta waves are the most effective predictors, yielding an average accuracy of 91.03% across all datasets, and the Hjorth Mobility feature shows the highest predictive ability. We also observe that the frontal region is the most sensitive in capturing features unique to a person's identity. Finally, we discuss prior studies consistent with our finding that individual-specific information is encoded in high-frequency brain rhythms. These findings have significant implications for biometrics in Music Brain-Computer Interface technology, offering insights into personalized user experiences and a basis for future advances in this area.
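As a rough illustration of the pipeline described above, the sketch below band-passes each EEG channel into the beta range, computes the Hjorth Mobility feature per channel, and trains a scikit-learn Random Forest on the resulting feature vectors. The band edges, Butterworth filter, sampling rate, epoching, and synthetic data are assumptions chosen for demonstration; the paper's exact preprocessing and its full 21-feature set are not reproduced here.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.ensemble import RandomForestClassifier

# Canonical EEG band edges in Hz (assumed; the paper does not state its exact cut-offs)
BANDS = {
    "delta": (0.5, 4.0),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 13.0),
    "beta":  (13.0, 30.0),
    "gamma": (30.0, 45.0),
}

def bandpass(signal, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter for one EEG channel."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)

def hjorth_mobility(x):
    """Hjorth Mobility: sqrt(var(x') / var(x)), with x' the first difference."""
    dx = np.diff(x)
    return np.sqrt(np.var(dx) / np.var(x))

def beta_mobility_features(epochs, fs):
    """Per-epoch feature vector: beta-band Hjorth Mobility of each channel.

    epochs: array of shape (n_epochs, n_channels, n_samples)
    """
    low, high = BANDS["beta"]
    feats = np.empty((epochs.shape[0], epochs.shape[1]))
    for i, epoch in enumerate(epochs):
        for c, channel in enumerate(epoch):
            feats[i, c] = hjorth_mobility(bandpass(channel, low, high, fs))
    return feats

# Illustrative usage with synthetic data standing in for the EEG recordings
rng = np.random.default_rng(0)
fs = 128                                   # assumed sampling rate
X = rng.standard_normal((40, 32, fs * 2))  # 40 epochs, 32 channels, 2 s each
y = rng.integers(0, 4, size=40)            # listener-identity labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(beta_mobility_features(X, fs), y)
```

Hjorth Mobility is a frequency-weighted measure of signal variability (an estimate of the dominant frequency of the signal), so its strong predictive ability is consistent with the finding that higher-frequency dynamics carry listener-specific information.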