
dc.contributor.author	Hartmann, Martín Ariel
dc.date.accessioned	2011-08-03T07:33:32Z
dc.date.available	2011-08-03T07:33:32Z
dc.date.issued	2011
dc.identifier.other	oai:jykdok.linneanet.fi:1181512
dc.identifier.uri	https://jyx.jyu.fi/handle/123456789/36531
dc.description.abstract	Automatic musical genre classification is an important music information retrieval task, since it can be applied for practical purposes such as organizing data collections in the digital music industry. However, the task remains an open problem: state-of-the-art classification performance is still far from satisfactory. Moreover, the algorithms most commonly used for this task are not designed to model music perception. This study proposes a framework for testing different musical features for music genre classification and evaluates classification performance based on two musical descriptors. The focus is on automatic classification of music into genres based on audio content. The performance of two sets of timbral descriptors, the sub-band fluxes and the mel-frequency cepstral coefficients, is compared. These particular descriptors were chosen because of their differing ease of interpretation from a perceptual point of view. Classification performance was assessed using a variety of music datasets, learning algorithms, feature selection approaches, and combinatorial feature subsets derived from these descriptors. The results were evaluated in terms of overall classification accuracy, generalization capability, and the relevance of the descriptors as indicated by feature ranking. According to the results, the sub-band fluxes, perceptually motivated descriptors of polyphonic timbre, outperformed the widely used mel-frequency cepstral coefficients: they yielded higher classification accuracies and showed a lower tendency to overfit. In short, this study supports the use of perceptually interpretable timbre descriptors for musical genre classification and suggests the sub-band flux set for further content-based tasks in music information retrieval.
dc.format.extent	79 s
dc.format.mimetype	application/pdf
dc.language.iso	eng
dc.rights	This publication is copyrighted. You may download, display and print it for Your own personal use. Commercial use is prohibited.	en
dc.rights	Julkaisu on tekijänoikeussäännösten alainen. Teosta voi lukea ja tulostaa henkilökohtaista käyttöä varten. Käyttö kaupallisiin tarkoituksiin on kielletty.	fi
dc.subject.other	music information retrieval
dc.subject.other	music genre classification
dc.subject.other	polyphonic timbre
dc.subject.other	feature ranking
dc.title	Testing a spectral-based feature set for audio genre classification
dc.identifier.urn	URN:NBN:fi:jyu-2011080311207
dc.type.dcmitype	Text	en
dc.type.ontasot	Pro gradu -tutkielma	fi
dc.type.ontasot	Master’s thesis	en
dc.contributor.tiedekunta	Humanistinen tiedekunta	fi
dc.contributor.tiedekunta	Faculty of Humanities	en
dc.contributor.laitos	Musiikin laitos	fi
dc.contributor.laitos	Department of Music	en
dc.contributor.yliopisto	University of Jyväskylä	en
dc.contributor.yliopisto	Jyväskylän yliopisto	fi
dc.contributor.oppiaine	Music, Mind and Technology (maisteriohjelma)	fi
dc.contributor.oppiaine	Master's Degree Programme in Music, Mind and Technology	en
dc.subject.method	mallintaminen (modelling)
dc.date.updated	2011-08-03T07:33:32Z
dc.rights.accesslevel	openAccess	fi
dc.type.publication	masterThesis
dc.contributor.oppiainekoodi	3054
dc.subject.yso	musiikki (music)
dc.subject.yso	genret (genres)
dc.subject.yso	sähköiset palvelut (electronic services)
dc.subject.yso	luokitus (classification)
dc.format.content	fulltext
dc.type.okm	G2
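
As a rough illustration of the kind of pipeline the abstract describes (extracting timbral descriptors from audio clips and comparing their genre classification performance), the sketch below computes a clip-level MFCC descriptor and an approximate sub-band flux descriptor, then cross-validates a classifier on each. The library choices (librosa, scikit-learn), the band edges, the RBF-kernel SVM, and the evaluate helper with its paths/genres inputs are illustrative assumptions, not the toolchain or exact feature definitions used in the thesis.

# Illustrative sketch only: approximates the feature-extraction and
# classification pipeline summarized in the abstract. Library choices,
# band edges, and the averaging scheme are assumptions for demonstration.
import numpy as np
import librosa
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score


def mfcc_features(y, sr, n_mfcc=13):
    """Clip-level MFCC descriptor: mean of the frame-wise coefficients."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)


def subband_flux_features(y, sr, n_fft=2048, hop_length=512):
    """Rough sub-band flux descriptor: frame-to-frame change of summed
    magnitude within octave-wide bands, averaged over the clip."""
    spec = np.abs(librosa.stft(y, n_fft=n_fft, hop_length=hop_length))
    freqs = librosa.fft_frequencies(sr=sr, n_fft=n_fft)
    edges = [0, 50, 100, 200, 400, 800, 1600, 3200, 6400, sr / 2]  # assumed band edges
    fluxes = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = spec[(freqs >= lo) & (freqs < hi), :]   # magnitude bins in this band
        diff = np.diff(band.sum(axis=0))               # change in band energy per frame
        fluxes.append(np.sqrt(np.mean(diff ** 2)))     # RMS flux for the band
    return np.array(fluxes)


def evaluate(paths, labels, extractor):
    """Cross-validated accuracy of one descriptor set with an SVM."""
    X = np.vstack([extractor(*librosa.load(p, sr=22050)) for p in paths])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    return cross_val_score(clf, X, labels, cv=5).mean()


# Hypothetical usage: paths and genres would come from a labelled dataset.
# acc_flux = evaluate(paths, genres, subband_flux_features)
# acc_mfcc = evaluate(paths, genres, mfcc_features)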

