29 — Musical NeuroPicks: a consumer-grade BCI for on-demand music streaming services

Kalaganis et al. (arXiv:1709.01116)

Read on 19 September 2017
#music  #brainwaves  #BCI  #machine-learning  #neural-network  #deep-learning  #signal-processing  #EEG 

This paper explores an experimental approach to teaching music streaming services to pick music you’ll enjoy: the authors propose a machine-learning pipeline built around EEG signals recorded from a listener’s brain during streaming playback. The model infers the listener’s preference for each track from those brain signals, and the inferred preferences are then used to generate recommendations.
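To fix ideas, here is a minimal sketch of the closed loop being proposed. This is my paraphrase, not the authors’ code: every function, score, and similarity rule below is a placeholder for a component the paper describes only abstractly.

```python
# Minimal sketch of the proposed closed loop (my paraphrase, not the authors' code).
# enjoyment_score stands in for the learned EEG -> preference model; the
# recommendation rule and similarity function are placeholders.
import numpy as np

def enjoyment_score(eeg_window: np.ndarray) -> float:
    """Stand-in for the learned model mapping an EEG window to a preference score."""
    return float(np.tanh(eeg_window.mean()))   # placeholder value in (-1, 1)

def pick_next_song(eeg_window, candidates, similarity_to_current):
    """Favour candidates similar to the current song when it was (predictedly)
    enjoyed, and dissimilar ones when it was not."""
    score = enjoyment_score(eeg_window)
    return max(candidates, key=lambda song: score * similarity_to_current(song))

# Toy usage with fake EEG and a hand-written similarity lookup.
eeg = np.random.randn(14, 500)
print(pick_next_song(eeg, ["song_a", "song_b"],
                     lambda s: {"song_a": 0.9, "song_b": 0.1}[s]))
```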

The authors first ground their approach in literature correlating activity in particular brain regions, and particular EEG frequency bands, with music listening. In particular, they use $\gamma$-band and $\beta$-band brainwaves as markers of subjective music appreciation.
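A minimal sketch of what isolating those bands might look like in practice; the sampling rate, band edges, and filter choice are my assumptions, not values taken from the paper.

```python
# Sketch: isolating beta- and gamma-band EEG activity with a band-pass filter.
# Sampling rate, band edges, and filter order are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed EEG sampling rate in Hz

def bandpass(eeg, low_hz, high_hz, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter applied along the time axis."""
    b, a = butter(order, [low_hz / (fs / 2), high_hz / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

eeg = np.random.randn(14, 60 * FS)   # stand-in for (channels, samples) from one song
beta = bandpass(eeg, 13, 30)         # beta band, roughly 13-30 Hz
gamma = bandpass(eeg, 30, 70)        # gamma band, roughly 30-70 Hz here
```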

Next comes an abstract overview of the signal-processing pipeline that converts the multichannel EEG time series into an embedding of semantic information (namely, music preference). The $\gamma$ and $\beta$ bands are each split into low and high sub-bands, and random fixed-length subsamples of the resulting multichannel time series are drawn from each listener’s recording of each song.
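A sketch of the random-subsampling step as I understand it; the window length, segment count, channel count, and sub-band count are assumptions made purely for illustration.

```python
# Sketch: drawing random fixed-length segments from the sub-band-filtered,
# multichannel recording of one listener hearing one song. Window length,
# segment count, channel count, and sub-band count are all assumptions.
import numpy as np

rng = np.random.default_rng(0)
FS = 250  # assumed sampling rate in Hz

def random_segments(recording, n_segments=20, win_sec=2.0, fs=FS):
    """recording: (n_subbands * n_channels, n_samples) filtered EEG.
    Returns (n_segments, rows, window) random crops along the time axis."""
    win = int(win_sec * fs)
    starts = rng.integers(0, recording.shape[-1] - win, size=n_segments)
    return np.stack([recording[:, s:s + win] for s in starts])

# e.g. 4 sub-bands (low/high beta, low/high gamma) x 14 channels, 60 s of audio
recording = np.random.randn(4 * 14, 60 * FS)
segments = random_segments(recording)
print(segments.shape)   # (20, 56, 500) -> training examples for the preference model
```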

The paper closes by demonstrating the utility of two methods for extracting music preference from EEG data: NeuroPicks and NeuroPicksVQ. The latter uses vector quantization to normalize features across individuals and so minimize per-user training overhead. But because the subjects were instructed not to move or change position during the experiments, this technology is still far from deployable in the wild.
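To make the vector-quantization idea concrete, here is a hedged sketch of how a shared codebook could normalize segment features across listeners. The feature representation, codebook size, and use of k-means are my assumptions, not the paper’s exact design.

```python
# Sketch of the vector-quantization idea behind NeuroPicksVQ (assumptions:
# per-segment feature vectors, a k-means codebook, 64 codewords). A codebook
# learned from many listeners gives a new user a shared vocabulary, so less
# per-user training is needed.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Pretend per-segment feature vectors (e.g. band power per channel) pooled
# across many listeners and songs.
pooled_features = rng.random((5000, 56))

codebook = KMeans(n_clusters=64, n_init=10, random_state=0).fit(pooled_features)

# A new listener's segments are mapped onto the shared codewords; the
# preference model then works on these discrete, user-normalized symbols.
new_user_segments = rng.random((200, 56))
codes = codebook.predict(new_user_segments)   # shape (200,), values in 0..63
```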