
dc.contributor.advisor: Piitulainen, Harri
dc.contributor.advisor: Kujala, Jan
dc.contributor.advisor: Avela, Janne
dc.contributor.author: Nyländen, Paavo
dc.date.accessioned: 2024-06-24T11:26:01Z
dc.date.available: 2024-06-24T11:26:01Z
dc.date.issued: 2024
dc.identifier.uri: https://jyx.jyu.fi/handle/123456789/96128
dc.description.abstract: This study investigated the feasibility of decoding proprioceptive and tactile stimuli applied to the fingers from magnetoencephalography (MEG) signals using support vector machines (SVMs). With the advancement of neuroprosthetic brain-computer interfaces (BCIs), there is a growing need to enhance the control of these devices by accurately decoding subtle sensory stimuli. This research focuses on the temporal dynamics of MEG responses to such stimuli, investigating their potential to inform the development of more dexterous and responsive neuroprostheses. Ten healthy adult participants were recruited, and a custom-built four-finger pneumatic actuator integrated with a tactile stimulator delivered stimuli to the index, middle, ring, and little fingers. MEG data were recorded with a 306-channel Elekta Neuromag system at a 1000 Hz sampling rate. Preprocessing included noise-reduction techniques such as oversampled temporal projection (OTP), temporal signal space separation (tSSS), and independent component analysis (ICA). Features for decoding were extracted from the temporal changes in MEG signals using a sliding time-window analysis, and SVMs were employed for classification. Results indicated that proprioceptive stimuli applied to different fingers yielded slightly higher and more consistent classification accuracies (70–73%) than tactile stimuli (around 67–72%). Classification between proprioceptive and tactile stimuli applied to the same finger achieved even higher accuracies, averaging around 90%. These findings suggest that the temporal characteristics of MEG signals can be used effectively to decode sensory stimuli, providing a solid foundation for future BCI applications. Further research should consider expanding the sample size, exploring different feature-selection methods, and utilizing electroencephalography (EEG) for practical, non-invasive BCI implementations. [en]
dc.format.extent: 53
dc.language.iso: en
dc.rights: In Copyright [en]
dc.subject.other: proprioception
dc.subject.other: tactile stimuli
dc.subject.other: support vector machine
dc.title: Decoding four-finger proprioceptive and tactile stimuli from magnetoencephalography
dc.type: master thesis
dc.identifier.urn: URN:NBN:fi:jyu-202406244974
dc.type.ontasot: Master's thesis [en]
dc.type.ontasot: Pro gradu -tutkielma [fi]
dc.contributor.tiedekunta: Liikuntatieteellinen tiedekunta [fi]
dc.contributor.tiedekunta: Faculty of Sport and Health Sciences [en]
dc.contributor.laitos: Liikunta- ja terveystieteet [fi]
dc.contributor.laitos: Sport and Health Sciences [en]
dc.contributor.yliopisto: Jyväskylän yliopisto [fi]
dc.contributor.yliopisto: University of Jyväskylä [en]
dc.contributor.oppiaine: Biomekaniikka [fi]
dc.contributor.oppiaine: Biomechanics [en]
dc.type.coar: http://purl.org/coar/resource_type/c_bdcc
dc.rights.accesslevel: openAccess
dc.type.publication: masterThesis
dc.contributor.oppiainekoodi: 5012
dc.subject.yso: MEG
dc.subject.yso: aivot
dc.subject.yso: koneoppiminen
dc.subject.yso: signaalianalyysi
dc.subject.yso: MEG
dc.subject.yso: brain
dc.subject.yso: machine learning
dc.subject.yso: signal analysis
dc.rights.url: https://rightsstatements.org/page/InC/1.0/
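
The abstract above describes a pipeline of OTP, tSSS, and ICA preprocessing followed by sliding time-window SVM classification of finger stimuli. Below is a minimal sketch, assuming MNE-Python and scikit-learn, of how such a pipeline could be assembled; the file name, trigger channel, event codes, epoch window, and ICA settings are illustrative assumptions rather than details from the thesis, and the per-time-point SlidingEstimator is only one common way to realize a sliding time-window analysis.

import mne
from mne.decoding import SlidingEstimator, cross_val_multiscore
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Load raw 306-channel Elekta Neuromag data (file name is hypothetical).
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)

# Noise reduction: oversampled temporal projection (OTP), then
# temporal signal space separation (tSSS).
raw = mne.preprocessing.oversampled_temporal_projection(raw)
raw = mne.preprocessing.maxwell_filter(raw, st_duration=10.0)

# ICA for physiological artifacts; in practice the artifact components
# would be inspected and listed in ica.exclude before apply().
ica = mne.preprocessing.ICA(n_components=30, random_state=0)
ica.fit(raw)
ica.apply(raw)

# Epoch around stimulus triggers; event codes 1-4 stand in for the
# index, middle, ring, and little fingers (assumed coding).
events = mne.find_events(raw)
event_id = {"index": 1, "middle": 2, "ring": 3, "little": 4}
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=(None, 0), preload=True)

X = epochs.get_data(picks="meg")   # shape: (n_epochs, n_channels, n_times)
y = epochs.events[:, 2]            # finger labels

# Linear SVM trained independently at each time sample, i.e. decoding
# based on the temporal evolution of the evoked responses.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
time_decoder = SlidingEstimator(clf, scoring="accuracy", n_jobs=1)
scores = cross_val_multiscore(time_decoder, X, y, cv=5)
print("Mean accuracy per time point:", scores.mean(axis=0))

Here cross_val_multiscore returns a folds-by-time-points accuracy matrix, so the printed mean traces decoding accuracy across the course of the evoked response rather than a single overall score.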


