Show simple item record

dc.contributor.author: Hao, Xinyu
dc.contributor.author: Xu, Hongming
dc.contributor.author: Zhao, Nannan
dc.contributor.author: Yu, Tao
dc.contributor.author: Hämäläinen, Timo
dc.contributor.author: Cong, Fengyu
dc.date.accessioned: 2024-03-21T07:32:25Z
dc.date.available: 2024-03-21T07:32:25Z
dc.date.issued: 2024
dc.identifier.citation: Hao, X., Xu, H., Zhao, N., Yu, T., Hämäläinen, T., & Cong, F. (2024). Predicting pathological complete response based on weakly and semi-supervised joint learning in breast cancer multi-parametric MRI. Biomedical Signal Processing and Control, 93, Article 106164. https://doi.org/10.1016/j.bspc.2024.106164
dc.identifier.other: CONVID_207426349
dc.identifier.uri: https://jyx.jyu.fi/handle/123456789/94001
dc.description.abstract: Neoadjuvant chemotherapy (NAC) is the primary treatment used to reduce tumor size in early breast cancer. Patients who achieve a pathological complete response (pCR) after NAC have a significantly higher five-year survival rate. However, accurately predicting whether a patient will achieve pCR remains challenging due to the limited availability of manually annotated MRI data. This study develops a weakly and semi-supervised joint learning model that integrates multi-parametric MR images to predict pCR to NAC in breast cancer patients. First, an attention-based multi-instance learning model is designed to characterize the representation of multi-parametric MR images in a weakly supervised learning setting. The Mean-Teacher learning framework is then developed to locate tumor regions for extracting radiomic features in a semi-supervised learning setting. Finally, all extracted MR imaging features are fused to predict pCR to NAC. Our experiments were conducted on a cohort of 442 patients with multi-parametric MR images and NAC outcomes. The results demonstrate that our proposed model, which leverages multi-parametric MRI data, achieves an AUC of over 0.85 in predicting pCR to NAC, outperforming other comparative methods. (en)
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: Elsevier
dc.relation.ispartofseries: Biomedical Signal Processing and Control
dc.rights: CC BY-NC-ND 4.0
dc.subject.other: weakly-supervised learning
dc.subject.other: semi-supervised learning
dc.subject.other: attention mechanism
dc.subject.other: pathological complete response
dc.subject.other: breast cancer
dc.title: Predicting pathological complete response based on weakly and semi-supervised joint learning in breast cancer multi-parametric MRI
dc.type: article
dc.identifier.urn: URN:NBN:fi:jyu-202403212544
dc.contributor.laitos: Informaatioteknologian tiedekunta (fi)
dc.contributor.laitos: Faculty of Information Technology (en)
dc.type.uri: http://purl.org/eprint/type/JournalArticle
dc.type.coar: http://purl.org/coar/resource_type/c_2df8fbb1
dc.description.reviewstatus: peerReviewed
dc.relation.issn: 1746-8094
dc.relation.volume: 93
dc.type.version: acceptedVersion
dc.rights.copyright: © 2024 the Authors
dc.rights.accesslevel: embargoedAccess (fi)
dc.subject.yso: hoitomenetelmät (treatment methods)
dc.subject.yso: magneettikuvaus (magnetic resonance imaging)
dc.subject.yso: rintasyöpä (breast cancer)
dc.subject.yso: lääkehoito (drug therapy)
dc.format.content: fulltext
jyx.subject.uri: http://www.yso.fi/onto/yso/p392
jyx.subject.uri: http://www.yso.fi/onto/yso/p12131
jyx.subject.uri: http://www.yso.fi/onto/yso/p20019
jyx.subject.uri: http://www.yso.fi/onto/yso/p10851
dc.rights.url: https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.relation.doi: 10.1016/j.bspc.2024.106164
jyx.fundinginformation: This work was supported by the National Natural Science Foundation of China (Grant No. 82102135), the Fundamental Research Funds for the Central Universities (Grant Nos. DUT22YG114 and DUT23YG130), the Natural Science Foundation of Liaoning Province (Grant No. 2022-YGJC-36), and a scholarship from the China Scholarship Council (No. 202006060060).
dc.type.okm: A1
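The abstract above names two standard learnable components: attention-based multi-instance pooling over multi-parametric MR image embeddings, and a Mean-Teacher framework in which a teacher network tracks an exponential moving average (EMA) of the student's weights. The following is a minimal PyTorch sketch of these two generic techniques, not the authors' implementation; all module names, dimensions, and hyperparameters (embedding size 512, EMA decay 0.99, a linear stand-in for the student network) are illustrative assumptions.

import copy
import torch
import torch.nn as nn

class AttentionMILPooling(nn.Module):
    # Scores each instance embedding (e.g. one MRI slice or patch) and
    # aggregates the bag with softmax-normalized attention weights.
    def __init__(self, dim=512, hidden=128):
        super().__init__()
        self.attention = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, instances):
        # instances: (num_instances, dim) tensor of instance embeddings
        scores = self.attention(instances)       # (num_instances, 1)
        weights = torch.softmax(scores, dim=0)   # attention over the bag
        return (weights * instances).sum(dim=0)  # (dim,) bag-level embedding

@torch.no_grad()
def update_teacher(student, teacher, ema_decay=0.99):
    # Mean-Teacher step: teacher parameters follow an exponential moving
    # average of the student parameters after each optimizer step.
    for t, s in zip(teacher.parameters(), student.parameters()):
        t.mul_(ema_decay).add_(s, alpha=1.0 - ema_decay)

# Usage sketch: pool 20 slice embeddings, then run one EMA update.
pool = AttentionMILPooling()
bag = pool(torch.randn(20, 512))     # one MRI summarized as a 512-d vector
student = nn.Linear(512, 2)          # hypothetical stand-in for the student
teacher = copy.deepcopy(student)     # teacher initialized as an exact copy
update_teacher(student, teacher)

In the Mean-Teacher scheme generally, the teacher's more stable predictions on unlabeled images supervise the student through a consistency loss, which is what allows a tumor-localization network to learn from scans that lack manual annotations.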


Files in this item


This item appears in the following collection(s)


Except where otherwise noted, this item's license is CC BY-NC-ND 4.0.