Show simple item record

dc.contributor.author     Dan Liu                                         en_US
dc.contributor.author     Lie Lu                                          en_US
dc.contributor.author     Hong-Jiang Zhang                                en_US
dc.contributor.editor     Holger H. Hoos                                  en_US
dc.contributor.editor     David Bainbridge                                en_US
dc.date.accessioned       2004-10-21T04:26:24Z
dc.date.available         2004-10-21T04:26:24Z
dc.date.issued            2003-10-26                                      en_US
dc.identifier.isbn        0-9746194-0-X                                   en_US
dc.identifier.uri         http://jhir.library.jhu.edu/handle/1774.2/14
dc.description.abstract   Music mood describes the inherent emotional meaning of a music clip. It is helpful in music understanding, music search, and other music-related applications. In this paper, a hierarchical framework is presented to automate the task of mood detection from acoustic music data, following music psychological theories of Western cultures. Three feature sets, intensity, timbre, and rhythm, are extracted to represent the characteristics of a music clip. An approach for tracking mood over a whole piece of music is also presented. Experimental evaluations indicate that the proposed algorithms produce satisfactory results.   en_US
dc.format.extent          72847 bytes
dc.format.mimetype        application/pdf
dc.language.iso           en_US
dc.publisher              Johns Hopkins University                        en_US
dc.subject                Perception and Cognition                        en_US
dc.subject                Music Analysis                                  en_US
dc.title                  Automatic Mood Detection from Acoustic Music Data   en_US
dc.type                   Article                                         en_US
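
The abstract describes a pipeline that summarizes a music clip with intensity, timbre, and rhythm feature sets. The paper's exact feature definitions are not given in this record, so the sketch below uses assumed librosa-based stand-ins (RMS energy for intensity, spectral centroid and MFCCs for timbre, tempo and onset strength for rhythm); the function name extract_mood_features is hypothetical and this is not the authors' implementation.

```python
# Illustrative sketch only: librosa-based stand-ins for the paper's
# intensity/timbre/rhythm feature sets, which this record does not define.
import numpy as np
import librosa

def extract_mood_features(path: str) -> dict:
    """Summarize a music clip with intensity-, timbre-, and
    rhythm-style features, loosely following the abstract."""
    y, sr = librosa.load(path, sr=22050, mono=True)

    # Intensity stand-in: frame-wise RMS energy, summarized by mean/std.
    rms = librosa.feature.rms(y=y)[0]

    # Timbre stand-ins: spectral centroid (brightness) and MFCCs (texture).
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

    # Rhythm stand-ins: global tempo estimate plus onset-strength spread.
    onset_env = librosa.onset.onset_strength(y=y, sr=sr)
    tempo, _ = librosa.beat.beat_track(onset_envelope=onset_env, sr=sr)

    return {
        "intensity_mean": float(np.mean(rms)),
        "intensity_std": float(np.std(rms)),
        "timbre_centroid_mean": float(np.mean(centroid)),
        "timbre_mfcc_mean": mfcc.mean(axis=1),
        # Recent librosa versions may return tempo as a length-1 array.
        "rhythm_tempo": float(np.atleast_1d(tempo)[0]),
        "rhythm_onset_std": float(np.std(onset_env)),
    }
```

A classifier trained on such clip-level vectors, applied to successive segments, would give the kind of mood tracking over a whole piece that the abstract mentions.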

