The brain sciences have revealed that the brain lives "on the edge of chaos," exhibiting "a self-organized criticality" that is precariously balanced between normalcy and madness. Over the course of history, humans have used various agents and activities to shape, influence, and control this living chaos, ranging from substances such as caffeine, sugar, and drugs, to activities such as the arts (including music), social discourse and therapy, and meditation. Of these, music plays a pervasive role in shaping our moods, helping us transition between different mental states, and maintaining those states for extended periods of time. While we have long used music and other techniques to control and shape this internal chaos, it is only in the last century or so that quantitative instrumentation of this massively complex system, which comprises close to 100 billion neurons networked into a 1,000-trillion-synapse edifice, has become possible. More recently, affordable wearable neurorecording devices (e.g., consumer EEG headsets) have come onto the market, making the quantitative study of the influence of music on brain dynamics feasible at a large, crowd-sourced scale. To come to terms with the complexity of this 1,000-trillion-synapse edifice, we need to gather data on a vast scale and analyze it creatively. This paper shows that data from a wearable EEG can be used to track subtle differences in brain state as the subject listens to variations on a musical piece. We demonstrate proof-of-concept that wearable-EEG recordings can differentiate the electrophysiological response to a mechanically generated piano performance from the response to an expressive, human-performed version of the same piece. As no conscious effort is required of the listener, this approach has the potential to remove the stated/revealed-preference effect in biomusicology and beyond.
Keywords: Biomusicology, Wearable technology, EEG, Human-computer interface, Music and brain
Registration date: 2015-10-06