dc.contributor.author: Vedernikov, Alexander
dc.contributor.author: Sun, Zhaodong
dc.contributor.author: Kykyri, Virpi-Liisa
dc.contributor.author: Pohjola, Mikko
dc.contributor.author: Nokia, Miriam
dc.contributor.author: Li, Xiaobai
dc.date.accessioned: 2024-12-19T07:23:52Z
dc.date.available: 2024-12-19T07:23:52Z
dc.date.issued: 2024
dc.identifier.citation: Vedernikov, A., Sun, Z., Kykyri, V.-L., Pohjola, M., Nokia, M., & Li, X. (2024). Analyzing Participants’ Engagement during Online Meetings Using Unsupervised Remote Photoplethysmography with Behavioral Features. In 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) (pp. 389-399). IEEE. IEEE Computer Society Conference on Computer Vision and Pattern Recognition workshops. https://doi.org/10.1109/cvprw63382.2024.00044
dc.identifier.other: CONVID_243254305
dc.identifier.uri: https://jyx.jyu.fi/handle/123456789/99086
dc.description.abstract: Engagement measurement finds application in healthcare, education, and services. Physiological and behavioral features are both viable for this purpose, but traditional physiological measurement is impractical because it requires contact sensors. We demonstrate the feasibility of unsupervised remote photoplethysmography (rPPG) as an alternative to contact sensors for deriving heart rate variability (HRV) features, and then fuse these with behavioral features to measure engagement in online group meetings. Firstly, a unique Engagement Dataset of online interactions among social workers is collected with granular engagement labels, offering insight into virtual meeting dynamics. Secondly, a pre-trained rPPG model is customized to reconstruct rPPG signals from meeting videos in an unsupervised manner, enabling the calculation of HRV features. Thirdly, the feasibility of estimating engagement from HRV features using short observation windows is demonstrated, with a notable enhancement when longer observation windows of two to four minutes are used. Fourthly, the effectiveness of behavioral cues fused with physiological data is evaluated, and this fusion further enhances engagement estimation performance. An accuracy of 94% is achieved when only HRV features are used, eliminating the need for contact sensors or ground-truth signals; adding behavioral cues raises the accuracy to 96%. Facial analysis thus offers precise engagement measurement, beneficial for future applications.
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: IEEE
dc.relation.ispartof: 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
dc.relation.ispartofseries: IEEE Computer Society Conference on Computer Vision and Pattern Recognition workshops
dc.rights: In Copyright
dc.subject.other: accuracy
dc.subject.other: atmospheric measurements
dc.subject.other: estimation
dc.subject.other: web conferencing
dc.subject.other: sensor phenomena and characterization
dc.subject.other: photoplethysmography
dc.subject.other: particle measurements
dc.title: Analyzing Participants’ Engagement during Online Meetings Using Unsupervised Remote Photoplethysmography with Behavioral Features
dc.type: conference paper
dc.identifier.urn: URN:NBN:fi:jyu-202412197897
dc.contributor.laitos: Psykologian laitos (fi)
dc.contributor.laitos: Department of Psychology (en)
dc.type.uri: http://purl.org/eprint/type/ConferencePaper
dc.relation.isbn: 979-8-3503-6548-1
dc.type.coar: http://purl.org/coar/resource_type/c_5794
dc.description.reviewstatus: peerReviewed
dc.format.pagerange: 389-399
dc.relation.issn: 2160-7508
dc.type.version: acceptedVersion
dc.rights.copyright: © IEEE
dc.rights.accesslevel: embargoedAccess
dc.type.publication: conferenceObject
dc.relation.conference: IEEE/CVF Computer Society Conference on Computer Vision and Pattern Recognition Workshops
dc.subject.yso: fotopletysmografia (photoplethysmography)
dc.subject.yso: mittausmenetelmät (measurement methods)
dc.subject.yso: osallistujat (participants)
dc.subject.yso: vuorovaikutus (interaction)
dc.subject.yso: arviointi (evaluation)
dc.subject.yso: mittaus (measurement)
dc.subject.yso: etäkokoukset (remote meetings)
dc.format.content: fulltext
jyx.subject.uri: http://www.yso.fi/onto/yso/p40244
jyx.subject.uri: http://www.yso.fi/onto/yso/p20083
jyx.subject.uri: http://www.yso.fi/onto/yso/p25967
jyx.subject.uri: http://www.yso.fi/onto/yso/p10591
jyx.subject.uri: http://www.yso.fi/onto/yso/p7413
jyx.subject.uri: http://www.yso.fi/onto/yso/p4794
jyx.subject.uri: http://www.yso.fi/onto/yso/p28060
dc.rights.url: http://rightsstatements.org/page/InC/1.0/?language=en
dc.relation.doi: 10.1109/cvprw63382.2024.00044
jyx.fundinginformation: This work was supported by the Research Council of Finland (formerly the Academy of Finland) Academy Professor project EmotionAI (grants 336116, 345122) and the Finnish Work Environment Fund (project 200414). The authors also acknowledge CSC – IT Center for Science, Finland, for providing computational resources.
dc.type.okm: A4
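
Note on the HRV step described in the abstract above: the sketch below is a minimal, hedged illustration in Python, not the authors' code, of how time-domain HRV features such as SDNN and RMSSD can be derived from a reconstructed rPPG waveform and combined with behavioral features. The sampling rate, peak-detection settings, and concatenation-style fusion are all illustrative assumptions; the paper's actual pipeline may differ.

    # Illustrative sketch only; assumes a 1-D rPPG waveform sampled at the
    # video frame rate. Not the authors' implementation.
    import numpy as np
    from scipy.signal import find_peaks

    def hrv_features(rppg, fs=30.0):
        """Basic time-domain HRV features from a reconstructed rPPG signal."""
        # Systolic peaks approximate heartbeats; a 0.5 s minimum peak
        # distance (at most ~120 bpm) suppresses spurious detections.
        peaks, _ = find_peaks(rppg, distance=int(0.5 * fs))
        ibi = np.diff(peaks) / fs * 1000.0           # inter-beat intervals, ms
        mean_hr = 60000.0 / ibi.mean()               # mean heart rate, bpm
        sdnn = ibi.std(ddof=1)                       # overall variability (SDNN), ms
        rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))  # short-term variability (RMSSD), ms
        return np.array([mean_hr, sdnn, rmssd])

    def fused_features(rppg, behavioral, fs=30.0):
        # One plausible reading of "fusing" physiological and behavioral
        # features: simple concatenation before a downstream classifier.
        return np.concatenate([hrv_features(rppg, fs), behavioral])

Feeding the output of fused_features to any standard classifier would complete a baseline engagement estimator; the 94%/96% accuracies quoted in the abstract come from the paper's own, more elaborate pipeline.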

