dc.contributor.author | Vedernikov, Alexander | |
dc.contributor.author | Sun, Zhaodong | |
dc.contributor.author | Kykyri, Virpi-Liisa | |
dc.contributor.author | Pohjola, Mikko | |
dc.contributor.author | Nokia, Miriam | |
dc.contributor.author | Li, Xiaobai | |
dc.date.accessioned | 2024-12-19T07:23:52Z | |
dc.date.available | 2024-12-19T07:23:52Z | |
dc.date.issued | 2024 | |
dc.identifier.citation | Vedernikov, A., Sun, Z., Kykyri, V.-L., Pohjola, M., Nokia, M., & Li, X. (2024). Analyzing Participants’ Engagement during Online Meetings Using Unsupervised Remote Photoplethysmography with Behavioral Features. In <i>2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)</i> (pp. 389-399). IEEE. IEEE Computer Society Conference on Computer Vision and Pattern Recognition workshops. <a href="https://doi.org/10.1109/cvprw63382.2024.00044" target="_blank">https://doi.org/10.1109/cvprw63382.2024.00044</a> | |
dc.identifier.other | CONVID_243254305 | |
dc.identifier.uri | https://jyx.jyu.fi/handle/123456789/99086 | |
dc.description.abstract | Engagement measurement finds application in healthcare, education, and services. Physiological and behavioral features are both viable, but traditional physiological measurement is impractical because it requires contact sensors. We demonstrate the feasibility of unsupervised remote photoplethysmography (rPPG) as an alternative to contact sensors for deriving heart rate variability (HRV) features, which are then fused with behavioral features to measure engagement in online group meetings. Firstly, a unique Engagement Dataset of online interactions among social workers is collected with fine-grained engagement labels, offering insight into virtual meeting dynamics. Secondly, a pre-trained rPPG model is customized to reconstruct rPPG signals from meeting videos in an unsupervised manner, enabling the calculation of HRV features. Thirdly, the feasibility of estimating engagement from HRV features using short observation windows is demonstrated, with a notable improvement when longer observation windows of two to four minutes are used. Fourthly, the effectiveness of behavioral cues fused with physiological data is evaluated, further improving engagement estimation performance. An accuracy of 94% is achieved when only HRV features are used, eliminating the need for contact sensors or ground-truth signals; adding behavioral cues raises the accuracy to 96%. Facial analysis thus offers precise engagement measurement, beneficial for future applications. | en
dc.format.mimetype | application/pdf | |
dc.language.iso | eng | |
dc.publisher | IEEE | |
dc.relation.ispartof | 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) | |
dc.relation.ispartofseries | IEEE Computer Society Conference on Computer Vision and Pattern Recognition workshops | |
dc.rights | In Copyright | |
dc.subject.other | accuracy | |
dc.subject.other | atmospheric measurements | |
dc.subject.other | estimation | |
dc.subject.other | web conferencing | |
dc.subject.other | sensor phenomena and characterization | |
dc.subject.other | photoplethysmography | |
dc.subject.other | particle measurements | |
dc.title | Analyzing Participants’ Engagement during Online Meetings Using Unsupervised Remote Photoplethysmography with Behavioral Features | |
dc.type | conference paper | |
dc.identifier.urn | URN:NBN:fi:jyu-202412197897 | |
dc.contributor.laitos | Psykologian laitos | fi |
dc.contributor.laitos | Department of Psychology | en |
dc.type.uri | http://purl.org/eprint/type/ConferencePaper | |
dc.relation.isbn | 979-8-3503-6548-1 | |
dc.type.coar | http://purl.org/coar/resource_type/c_5794 | |
dc.description.reviewstatus | peerReviewed | |
dc.format.pagerange | 389-399 | |
dc.relation.issn | 2160-7508 | |
dc.type.version | acceptedVersion | |
dc.rights.copyright | © IEEE | |
dc.rights.accesslevel | embargoedAccess | fi |
dc.type.publication | conferenceObject | |
dc.relation.conference | IEEE/CVF Computer Society Conference on Computer Vision and Pattern Recognition Workshops | |
dc.subject.yso | fotopletysmografia | |
dc.subject.yso | mittausmenetelmät | |
dc.subject.yso | osallistujat | |
dc.subject.yso | vuorovaikutus | |
dc.subject.yso | arviointi | |
dc.subject.yso | mittaus | |
dc.subject.yso | etäkokoukset | |
dc.format.content | fulltext | |
jyx.subject.uri | http://www.yso.fi/onto/yso/p40244 | |
jyx.subject.uri | http://www.yso.fi/onto/yso/p20083 | |
jyx.subject.uri | http://www.yso.fi/onto/yso/p25967 | |
jyx.subject.uri | http://www.yso.fi/onto/yso/p10591 | |
jyx.subject.uri | http://www.yso.fi/onto/yso/p7413 | |
jyx.subject.uri | http://www.yso.fi/onto/yso/p4794 | |
jyx.subject.uri | http://www.yso.fi/onto/yso/p28060 | |
dc.rights.url | http://rightsstatements.org/page/InC/1.0/?language=en | |
dc.relation.doi | 10.1109/cvprw63382.2024.00044 | |
jyx.fundinginformation | This work was supported by the Research Council of Finland (former Academy of Finland) Academy Professor project EmotionAI (grants 336116, 345122), and the Finnish Work Environment Fund (Project 200414). The authors also acknowledge CSC-IT Center for Science, Finland, for providing computational resources. | |
dc.type.okm | A4 | |