An emotion model for music using brain waves

College

College of Computer Studies

Department/Unit

Advance Research Institute for Informatics, Computing and Networking

Document Type

Conference Proceeding

Source Title

Proceedings of the 13th International Society for Music Information Retrieval Conference, ISMIR 2012

First Page

265

Last Page

270

Publication Date

12-1-2012

Abstract

Every person reacts differently to music. The task, then, is to identify a specific set of music features that have a significant effect on emotion for an individual. Previous research has used self-reported emotions or tags to annotate short segments of music with discrete labels. Our approach uses an electroencephalograph to record the subject's reaction to music. The emotion spectrum analysis method is used to analyze the electric potentials and provide continuous-valued annotations of four emotional states for different segments of the music. Music features are obtained by processing music information from the MIDI files, which are separated into several segments using a windowing technique. The extracted music features are used in two separate supervised classification algorithms to build the emotion models. The classifiers achieve a minimum error rate of 5% when predicting the emotion labels. © 2012 International Society for Music Information Retrieval.
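As a rough illustration only, and not taken from the paper, the sketch below shows how windowed per-segment music features could be paired with discretized emotion annotations and fed to a supervised classifier. The feature dimensions, the argmax discretization of the four continuous emotion scores, and the choice of a scikit-learn SVC are all assumptions standing in for the paper's actual features, annotations, and algorithms.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Stand-in for windowed music features (e.g. per-segment pitch/tempo
    # statistics extracted from MIDI): 200 segments x 12 hypothetical features.
    X = rng.normal(size=(200, 12))

    # Stand-in for EEG-derived annotations: four continuous emotion scores per
    # segment, discretized here to the dominant emotion for classification.
    emotion_scores = rng.random(size=(200, 4))
    y = emotion_scores.argmax(axis=1)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    # Train one supervised classifier and report its held-out error rate.
    clf = SVC(kernel="rbf").fit(X_train, y_train)
    print(f"held-out error rate: {1.0 - clf.score(X_test, y_test):.2%}")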

Disciplines

Computer Sciences

Keywords

Electroencephalography; Music

