Weina Zhu
School of Information Science, Yunnan University, China
Jan Drewes
Center for Mind/Brain Sciences (CIMeC), University of Trento, Rovereto, Italy
ABSTRACT
Traditional content-based music classification and retrieval rely on classic musical features such as pitch, tempo, melody, timbre and rhythm. Music, however, also conveys and evokes emotion, which is one of its most important qualities, yet automatic classifiers of emotion in music have achieved only limited success. In this study, we constructed a multi-feature representation of music emotion by combining several musical features, and used a linear classifier on this representation to determine the emotion of a given piece of music. We evaluated the classification on music pieces that had been labeled by human subjects according to the music plane.
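The pipeline the abstract describes (combine several per-piece musical features into one vector, then fit a linear classifier over the labeled pieces) could be sketched as below. The feature values here are synthetic placeholders standing in for real descriptors such as tempo, pitch and timbre statistics, and the least-squares fit is one simple choice of linear classifier, not necessarily the one used in the paper.

```python
import numpy as np

# Hypothetical feature matrix: each row combines several musical
# descriptors (e.g., mean tempo, pitch, timbre statistics) for one piece.
rng = np.random.default_rng(0)
n_per_class, n_features = 20, 5

# Two synthetic emotion classes (0 and 1) with shifted feature means.
X = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(n_per_class, n_features)),
    rng.normal(loc=3.0, scale=1.0, size=(n_per_class, n_features)),
])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Fit a linear classifier by least squares: w maps features to labels.
Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

def predict(features):
    """Threshold the linear score at 0.5 to choose a class."""
    fb = np.hstack([features, np.ones((len(features), 1))])
    return (fb @ w > 0.5).astype(int)

accuracy = (predict(X) == y).mean()
```

In practice the placeholder rows would be replaced by features extracted from audio (for example with a feature-extraction toolbox), and the classifier would be evaluated on held-out pieces rather than the training set.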
How to cite this article
Weina Zhu and Jan Drewes, 2013. Emotion-Based Music Classification. Information Technology Journal, 12: 4472-4475.
DOI: 10.3923/itj.2013.4472.4475
URL: https://scialert.net/abstract/?doi=itj.2013.4472.4475