Show simple item record

dc.contributor.author: Zuo, Xin
dc.contributor.author: Zhang, Chi
dc.contributor.author: Hämäläinen, Timo
dc.contributor.author: Gao, Hanbing
dc.contributor.author: Fu, Yu
dc.contributor.author: Cong, Fengyu
dc.date.accessioned: 2022-09-19T07:15:54Z
dc.date.available: 2022-09-19T07:15:54Z
dc.date.issued: 2022
dc.identifier.citation: Zuo, X., Zhang, C., Hämäläinen, T., Gao, H., Fu, Y., & Cong, F. (2022). Cross-Subject Emotion Recognition Using Fused Entropy Features of EEG. Entropy, 24(9), Article 1281. https://doi.org/10.3390/e24091281
dc.identifier.other: CONVID_156549549
dc.identifier.uri: https://jyx.jyu.fi/handle/123456789/83275
dc.description.abstract: Emotion recognition based on electroencephalography (EEG) has attracted considerable interest in fields such as health care, user experience evaluation, and human–computer interaction (HCI), as it plays an important role in daily human life. Although various approaches have been proposed to detect emotion states in previous studies, the dynamic changes of EEG under different emotions still need further study before emotion states can be detected accurately. Entropy-based features have proven effective at mining the complexity information in EEG in many areas. However, different entropy features vary in how well they reveal the implicit information in EEG. To improve system reliability, in this paper we propose a framework for EEG-based cross-subject emotion recognition using fused entropy features and a Bidirectional Long Short-Term Memory (BiLSTM) network. Features including approximate entropy (AE), fuzzy entropy (FE), Rényi entropy (RE), differential entropy (DE), and multi-scale entropy (MSE) are first calculated to study dynamic emotional information. Then, we train a BiLSTM classifier on the entropy features to identify different emotions. Our results show that MSE of EEG is more efficient than the other single-entropy features in recognizing emotions. With fused entropy features, the BiLSTM achieves an accuracy of 70.05%, an improvement over any single-type feature. [en]
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: MDPI AG
dc.relation.ispartofseries: Entropy
dc.rights: CC BY 4.0
dc.subject.other: emotion recognition
dc.subject.other: EEG
dc.subject.other: feature fusion
dc.subject.other: MSE
dc.subject.other: BiLSTM
dc.title: Cross-Subject Emotion Recognition Using Fused Entropy Features of EEG
dc.type: article
dc.identifier.urn: URN:NBN:fi:jyu-202209194618
dc.contributor.laitos: Informaatioteknologian tiedekunta [fi]
dc.contributor.laitos: Faculty of Information Technology [en]
dc.contributor.oppiaine: Tekniikka [fi]
dc.contributor.oppiaine: Tietotekniikka [fi]
dc.contributor.oppiaine: Secure Communications Engineering and Signal Processing [fi]
dc.contributor.oppiaine: Engineering [en]
dc.contributor.oppiaine: Mathematical Information Technology [en]
dc.contributor.oppiaine: Secure Communications Engineering and Signal Processing [en]
dc.type.uri: http://purl.org/eprint/type/JournalArticle
dc.type.coar: http://purl.org/coar/resource_type/c_2df8fbb1
dc.description.reviewstatus: peerReviewed
dc.relation.issn: 1099-4300
dc.relation.numberinseries: 9
dc.relation.volume: 24
dc.type.version: publishedVersion
dc.rights.copyright: © 2022 by the authors. Licensee MDPI, Basel, Switzerland.
dc.rights.accesslevel: openAccess [fi]
dc.subject.yso: fysiologiset vaikutukset (physiological effects)
dc.subject.yso: entropia (entropy)
dc.subject.yso: mittausmenetelmät (measurement methods)
dc.subject.yso: ihmisen ja tietokoneen vuorovaikutus (human-computer interaction)
dc.subject.yso: aivot (brain)
dc.subject.yso: neuroverkot (neural networks)
dc.subject.yso: EEG
dc.subject.yso: tunteet (emotions)
dc.format.content: fulltext
jyx.subject.uri: http://www.yso.fi/onto/yso/p11511
jyx.subject.uri: http://www.yso.fi/onto/yso/p5009
jyx.subject.uri: http://www.yso.fi/onto/yso/p20083
jyx.subject.uri: http://www.yso.fi/onto/yso/p38007
jyx.subject.uri: http://www.yso.fi/onto/yso/p7040
jyx.subject.uri: http://www.yso.fi/onto/yso/p7292
jyx.subject.uri: http://www.yso.fi/onto/yso/p3328
jyx.subject.uri: http://www.yso.fi/onto/yso/p3485
dc.rights.url: https://creativecommons.org/licenses/by/4.0/
dc.relation.dataset: https://bcmi.sjtu.edu.cn/home/seed/
dc.relation.doi: 10.3390/e24091281
jyx.fundinginformation: This research was funded by the National Natural Science Foundation of China (no. 61703069 and 62001312), the National Foundation in China (no. JCKY2019110B009), the Fundamental Research Funds for the Central Universities (no. DUT21GF301) and the Science and Technology Planning Project of Liaoning Province (no. 2021JH1/10400049).
dc.type.okm: A1
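
To make the pipeline in the abstract concrete, the following is a minimal sketch in Python (NumPy and PyTorch) of the kind of processing described: a simplified sample-entropy-based multi-scale entropy (MSE) and a Gaussian differential entropy (DE) computed from one EEG window, followed by a small BiLSTM classifier. This is an illustration only, not the authors' implementation; the window length, number of scales, the m and r parameters, the hidden size, and the three-class output are assumptions made for the example.

# Illustrative sketch only; parameters and feature layout are assumptions, not the paper's settings.
import numpy as np
import torch
import torch.nn as nn


def sample_entropy(x, m=2, r=None):
    """Simplified sample entropy of a 1-D signal (O(N^2) pairwise version)."""
    x = np.asarray(x, dtype=float)
    r = 0.2 * x.std() if r is None else r

    def match_count(length):
        # Embed the signal into overlapping templates of the given length.
        emb = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        # Count matching pairs within tolerance r, excluding self-matches.
        return (np.sum(dist <= r) - len(emb)) / 2.0

    b, a = match_count(m), match_count(m + 1)
    return np.nan if a == 0 or b == 0 else -np.log(a / b)


def multiscale_entropy(x, max_scale=5, m=2):
    """MSE: sample entropy of coarse-grained versions of the signal."""
    r = 0.2 * np.std(x)                                   # fixed tolerance across scales
    feats = []
    for s in range(1, max_scale + 1):
        n = len(x) // s
        coarse = x[: n * s].reshape(n, s).mean(axis=1)    # non-overlapping window means
        feats.append(sample_entropy(coarse, m=m, r=r))
    return np.array(feats)


def differential_entropy(x):
    """DE of an (assumed) Gaussian signal: 0.5 * ln(2*pi*e*variance)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))


class BiLSTMClassifier(nn.Module):
    """Bidirectional LSTM over a sequence of per-window feature vectors."""

    def __init__(self, n_features, hidden=64, n_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):             # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.fc(out[:, -1])    # classify from the last time step


if __name__ == "__main__":
    eeg_window = np.random.randn(1000)                    # one synthetic channel window
    feats = np.concatenate([multiscale_entropy(eeg_window),
                            [differential_entropy(eeg_window)]])
    model = BiLSTMClassifier(n_features=feats.size)
    seq = torch.tensor(feats, dtype=torch.float32).repeat(1, 4, 1)  # fake (batch=1, time=4) sequence
    print(model(seq).shape)                               # -> torch.Size([1, 3])

In a fused-feature setting along the lines of the paper, the per-window entropy values from all channels and all entropy types would be concatenated into one feature vector per time step before being fed to the BiLSTM; the exact fusion and training details are given in the article itself.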

