What do you have in mind? ERP markers of visual and auditory imagery
Proverbio, A. M., Tacchini, M., & Jiang, K. (2023). What do you have in mind? ERP markers of visual and auditory imagery. Brain and Cognition, 166, Article 105954. https://doi.org/10.1016/j.bandc.2023.105954
Published in: Brain and Cognition
Date: 2023
Copyright: © 2023 The Authors. Published by Elsevier Inc.
This study investigated the psychophysiological markers of imagery processes through EEG/ERP recordings. Visual and auditory stimuli representing 10 different semantic categories were shown to 30 healthy participants. After a given interval, and prompted by a light signal, participants were asked to activate a mental image corresponding to the semantic category while synchronized electrical potentials were recorded. Unprecedented electrophysiological markers of imagination were recorded in the absence of sensory stimulation. Distinct peaks were identified at specific scalp sites and latencies during imagination of infants (centroparietal positivity, CPP, and late CPP), human faces (anterior negativity, AN), animals (anterior positivity, AP), music (P300-like), speech (N400-like), affective vocalizations (P2-like) and sensory (visual vs. auditory) modality (PN300). Overall, the perception and imagery conditions shared some common electro/cortical markers, but during imagery the category-dependent modulation of ERPs occurred at longer latencies and at more anterior sites than in the perceptual condition. These ERP markers could be valuable tools for BCI systems (pattern recognition, classification, or AI algorithms) applied to patients with disorders of consciousness (e.g., in a vegetative or comatose state) or locked-in patients (e.g., spinal-injury or ALS patients).
Publisher: Elsevier BV
ISSN: 0278-2626
Publication in research information system: https://converis.jyu.fi/converis/portal/detail/Publication/176420994
Additional information about funding
This project, entitled "Reading mental representations through EEG signals", was funded by a grant from the University of Milano-Bicocca (ATE – Fondo di Ateneo N° 31159-2019-ATE-0064).
Related items
Showing items with similar title or keywords.
- Editorial: Visual mismatch negativity (vMMN): A unique tool in investigating automatic processing
  Astikainen, Piia; Kreegipuu, Kairi; Czigler, István (Frontiers Media SA, 2022)
- The scope and limits of implicit visual change detection
  Lyyra, Pessi (University of Jyväskylä, 2014)
- Event-related brain potential markers of visual and auditory perception: A useful tool for brain computer interface systems
  Proverbio, Alice Mado; Tacchini, Marta; Jiang, Kaijun (Frontiers Media SA, 2022)
  Objective: A majority of BCI systems, enabling communication with patients with locked-in syndrome, are based on electroencephalogram (EEG) frequency analysis (e.g., linked to motor imagery) or P300 detection. Only recently, ...
- The bilateral field advantage effect in memory precision
  Zhang, Yin; Ye, Chaoxiong; Roberson, Debi; Zhao, Guang; Xue, Chengbo; Liu, Qiang (Sage Publications Ltd., 2018)
  Previous research has demonstrated that visual working memory performance is better when visual items are allocated in both left and right visual fields than within only one hemifield. This phenomenon is called the bilateral ...
- Somatosensory Deviance Detection ERPs and Their Relationship to Analogous Auditory ERPs and Interoceptive Accuracy
  Kangas, Elina S.; Vuoriainen, Elisa; Li, Xueqiao; Lyyra, Pessi; Astikainen, Piia (Hogrefe Publishing Group, 2022)
  Automatic deviance detection has been widely explored in terms of mismatch responses (mismatch negativity or mismatch response) and P3a components of event-related potentials (ERPs) under a predictive coding framework; ...