
dc.contributor.author	Zhou, Dongdong
dc.contributor.author	Xu, Qi
dc.contributor.author	Wang, Jian
dc.contributor.author	Zhang, Jiacheng
dc.contributor.author	Hu, Guoqiang
dc.contributor.author	Kettunen, Lauri
dc.contributor.author	Chang, Zheng
dc.contributor.author	Cong, Fengyu
dc.date.accessioned	2023-02-20T10:45:58Z
dc.date.available	2023-02-20T10:45:58Z
dc.date.issued	2021
dc.identifier.citation	Zhou, D., Xu, Q., Wang, J., Zhang, J., Hu, G., Kettunen, L., Chang, Z., & Cong, F. (2021). LightSleepNet : A Lightweight Deep Model for Rapid Sleep Stage Classification with Spectrograms. In EMBC 2021 : 43rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society (pp. 43-46). IEEE. Annual International Conference of the IEEE Engineering in Medicine and Biology Society. https://doi.org/10.1109/embc46164.2021.9629878
dc.identifier.other	CONVID_102379324
dc.identifier.uri	https://jyx.jyu.fi/handle/123456789/85535
dc.description.abstract	Deep learning has achieved unprecedented success in sleep stage classification tasks, which starts to pave the way for potential real-world applications. However, deployment of deep neural networks is hindered by their enormous size and the resulting costs in computation, storage, network bandwidth, power consumption, and hardware complexity. Practical applications such as wearable sleep monitoring devices therefore call for simple and compact models. In this paper, we propose a lightweight model, LightSleepNet, for rapid sleep stage classification based on spectrograms. Our model is built from far fewer parameters than existing ones. Furthermore, we convert the raw EEG data into spectrograms to speed up the training process. We evaluate the model performance on several public sleep datasets with different characteristics. Experimental results show that our lightweight model with spectrogram input achieves overall accuracy and Cohen’s kappa (SHHS100: 86.7%-81.3%, Sleep-EDF: 83.7%-77.5%, Sleep-EDF-v1: 88.3%-84.5%) comparable to state-of-the-art methods on the experimental datasets.	en
dc.format.mimetype	application/pdf
dc.language.iso	eng
dc.publisher	IEEE
dc.relation.ispartof	EMBC 2021 : 43rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society
dc.relation.ispartofseries	Annual International Conference of the IEEE Engineering in Medicine and Biology Society
dc.rights	In Copyright
dc.subject.other	deep learning
dc.subject.other	training
dc.subject.other	power demand
dc.subject.other	sleep
dc.subject.other	computational modeling
dc.subject.other	biological system modeling
dc.subject.other	brain modeling
dc.title	LightSleepNet : A Lightweight Deep Model for Rapid Sleep Stage Classification with Spectrograms
dc.type	conference paper
dc.identifier.urn	URN:NBN:fi:jyu-202302201794
dc.contributor.laitos	Informaatioteknologian tiedekunta	fi
dc.contributor.laitos	Faculty of Information Technology	en
dc.contributor.oppiaine	Tekniikka	fi
dc.contributor.oppiaine	Tietotekniikka	fi
dc.contributor.oppiaine	Computing, Information Technology and Mathematics	fi
dc.contributor.oppiaine	Secure Communications Engineering and Signal Processing	fi
dc.contributor.oppiaine	Laskennallinen tiede	fi
dc.contributor.oppiaine	Engineering	en
dc.contributor.oppiaine	Mathematical Information Technology	en
dc.contributor.oppiaine	Computing, Information Technology and Mathematics	en
dc.contributor.oppiaine	Secure Communications Engineering and Signal Processing	en
dc.contributor.oppiaine	Computational Science	en
dc.type.uri	http://purl.org/eprint/type/ConferencePaper
dc.relation.isbn	978-1-7281-1180-3
dc.type.coar	http://purl.org/coar/resource_type/c_5794
dc.description.reviewstatus	peerReviewed
dc.format.pagerange	43-46
dc.relation.issn	2375-7477
dc.type.version	acceptedVersion
dc.rights.copyright	© 2021, IEEE
dc.rights.accesslevel	openAccess	fi
dc.type.publication	conferenceObject
dc.relation.conference	Annual International Conference of the IEEE Engineering in Medicine and Biology Society
dc.subject.yso	mallintaminen (modelling)
dc.subject.yso	syväoppiminen (deep learning)
dc.subject.yso	unitutkimus (sleep research)
dc.subject.yso	neuroverkot (neural networks)
dc.subject.yso	signaalinkäsittely (signal processing)
dc.subject.yso	EEG
dc.format.content	fulltext
jyx.subject.uri	http://www.yso.fi/onto/yso/p3533
jyx.subject.uri	http://www.yso.fi/onto/yso/p39324
jyx.subject.uri	http://www.yso.fi/onto/yso/p21988
jyx.subject.uri	http://www.yso.fi/onto/yso/p7292
jyx.subject.uri	http://www.yso.fi/onto/yso/p12266
jyx.subject.uri	http://www.yso.fi/onto/yso/p3328
dc.rights.url	http://rightsstatements.org/page/InC/1.0/?language=en
dc.relation.doi	10.1109/embc46164.2021.9629878
jyx.fundinginformation	This work was supported by the National Natural Science Foundation of China (Grant No. 91748105), the National Foundation in China (No. JCKY2019110B009, 2020-JCJQ-JJ-252), the Fundamental Research Funds for Central Universities [DUT2019, DUT20LAB303] at Dalian University of Technology in China, scholarships from the China Scholarship Council (No. 201806060164, No. 202006060226), and the CAAI-Huawei MindSpore Open Fund (CAAIXSJLJJ-2020-024A).
dc.type.okm	A4
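The abstract in this record describes converting raw EEG epochs into spectrograms and feeding them to a compact deep model (LightSleepNet). The paper's actual architecture and spectrogram settings are not given in this record, so the following is only a minimal sketch of the general pipeline, assuming single-channel 30 s EEG epochs sampled at 100 Hz, scipy for the time-frequency transform, and a deliberately tiny PyTorch CNN as a stand-in classifier; all layer sizes, STFT parameters, and names here are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch (not the authors' implementation): raw EEG epoch -> log spectrogram -> tiny CNN.
# Assumptions: 1-channel EEG, 30 s scoring epochs at 100 Hz, 5 sleep stages (W, N1, N2, N3, REM).
import numpy as np
from scipy.signal import spectrogram
import torch
import torch.nn as nn

FS = 100          # sampling rate in Hz (assumed)
EPOCH_SEC = 30    # standard 30 s scoring epoch
N_STAGES = 5      # W, N1, N2, N3, REM

def eeg_to_spectrogram(epoch: np.ndarray) -> np.ndarray:
    """Convert one raw EEG epoch of shape (FS * EPOCH_SEC,) to a log-power spectrogram."""
    f, t, sxx = spectrogram(epoch, fs=FS, nperseg=128, noverlap=64)
    return np.log(sxx + 1e-10).astype(np.float32)  # log scale compresses the dynamic range

class TinySleepCNN(nn.Module):
    """A deliberately small CNN used here only as a placeholder for the lightweight model."""
    def __init__(self, n_classes: int = N_STAGES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # global pooling keeps the classifier head tiny
        )
        self.classifier = nn.Linear(16, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

if __name__ == "__main__":
    epoch = np.random.randn(FS * EPOCH_SEC)       # fake raw EEG epoch for demonstration
    spec = eeg_to_spectrogram(epoch)              # (freq_bins, time_bins)
    x = torch.from_numpy(spec)[None, None]        # -> (batch=1, channel=1, F, T)
    logits = TinySleepCNN()(x)
    print(logits.shape)                           # torch.Size([1, 5])
```

Precomputing spectrograms this way moves the time-frequency transform out of the training loop, which is the speed-up rationale mentioned in the abstract; the classifier itself is kept small so it could plausibly run on constrained hardware such as wearable monitors.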

