Music Emotion and Genre Recognition Toward New Affective Music Taxonomy

Jonghwa Kim and Lars Larsen

Published 2010 in "Audio Engineering Society Convention 128", Paper #8018


The exponential growth of electronic music distribution creates a natural demand for fine-grained musical metadata. Based on the observation that a primary motive for listening to music is its emotional effect, diversion, and the memories it awakens, we propose a novel affective music taxonomy that combines a global music genre taxonomy (e.g., Classical, Jazz, Rock/Pop, and Rap) with emotion categories such as Joy, Sadness, Anger, and Pleasure in a complementary way. In this paper, we address all essential stages of an automatic genre/emotion recognition system, from systematic music data collection to performance evaluation of various machine learning algorithms. In particular, we present a novel classification scheme, called the consecutive dichotomous decomposition tree (CDDT), which is specifically parametrized for multi-class classification problems with an extremely large number of classes, e.g., the sixteen music categories in our case. An average recognition accuracy of 75% across the 16 music categories demonstrates the practical feasibility of the proposed affective music taxonomy.
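The paper itself details the CDDT scheme; as a rough intuition, a dichotomous decomposition tree reduces a many-class problem to a cascade of binary decisions, each node splitting the remaining set of classes into two subsets. The following minimal sketch illustrates that idea only; the class names, features, and threshold-based node classifiers are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of a dichotomous decomposition cascade: each internal
# node holds a binary decision over features; leaves are class labels.
# Features and thresholds are illustrative, not the paper's parametrization.

class Node:
    def __init__(self, decide, left, right):
        self.decide = decide  # binary classifier: features -> True (go left) / False (go right)
        self.left = left      # subtree (Node) or leaf class label (str)
        self.right = right

def classify(node, x):
    """Descend the binary cascade until a leaf (class label) is reached."""
    while isinstance(node, Node):
        node = node.left if node.decide(x) else node.right
    return node

# Toy tree over four emotion categories using two made-up features
# x = (tempo_bpm, spectral_brightness):
tree = Node(lambda x: x[0] < 100,  # slow vs. fast tempo
            Node(lambda x: x[1] < 0.5, "Sadness", "Pleasure"),
            Node(lambda x: x[1] < 0.5, "Anger", "Joy"))

print(classify(tree, (80, 0.3)))   # slow, dark    -> Sadness
print(classify(tree, (140, 0.8)))  # fast, bright  -> Joy
```

With sixteen categories, such a cascade needs only four consecutive binary decisions per sample, which is what makes the decomposition attractive for large label sets.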
