Irrelevant task suppresses the N170 of automatic attention allocation to fearful faces
Dou, H., Liang, L., Ma, J., Lu, J., Zhang, W., & Li, Y. (2021). Irrelevant task suppresses the N170 of automatic attention allocation to fearful faces. Scientific Reports, 11, Article 11754. https://doi.org/10.1038/s41598-021-91237-9
Published in: Scientific Reports
© 2021 the Authors
Recent research has provided evidence that stimulus-driven attentional bias for threats can be modulated by top-down goals. However, it remains essential to determine whether, and to what extent, top-down goals can affect the early stage of attentional processing and its underlying neural mechanism. In this study, we collected electroencephalographic data from 28 healthy volunteers performing a modified spatial cueing task. The results revealed that in the irrelevant task there was no significant difference in reaction time (RT) between fearful and neutral faces. In the relevant task, RTs to fearful faces were faster than those to neutral faces in the valid cue condition, whereas RTs to fearful faces were slower than those to neutral faces in the invalid cue condition. The N170 component showed a pattern similar to the RT results. Specifically, in the relevant task, fearful faces at the cued target position evoked a larger N170 amplitude than neutral faces, whereas this effect was suppressed in the irrelevant task. These results suggest that an irrelevant task may inhibit early attention allocation to fearful faces, and that top-down goals can modulate the early attentional bias for threatening facial expressions. ...
Publisher: Nature Publishing Group
Additional information about funding: This study was financially supported by the National Natural Science Foundation of China (Grant No. 81171289).
Showing items with similar title or keywords.
Negative and Positive Bias for Emotional Faces: Evidence from the Attention and Working Memory Paradigms. Xu, Qianru; Ye, Chaoxiong; Gu, Simeng; Hu, Zhonghua; Lei, Yi; Li, Xueyan; Huang, Lihui; Liu, Qiang (Hindawi Publishing Corporation, 2021) Visual attention and visual working memory (VWM) are two major cognitive functions in humans, and they have much in common. A growing body of research has investigated the effect of emotional information on visual attention ...
Xu, Qianru; Ruohonen, Elisa; Ye, Chaoxiong; Li, Xueqiao; Kreegipuu, Kairi; Stefanics, Gabor; Luo, Wenbo; Astikainen, Piia (Frontiers Research Foundation, 2018) It is not known to what extent the automatic encoding and change detection of peripherally presented facial emotion is altered in dysphoria. The negative bias in automatic face processing in particular has rarely been ...
Detecting and interpreting affect- and emotion-based communication signals using machine vision. Nyrhinen, Riku (2019) The innate human ability to understand complex emotions and their expression is a valuable non-verbal communication skill, and mastering it in real-world tasks would give intelligent computers and robots new ...
Putkinen, Vesa; Makkonen, Tommi; Eerola, Tuomas (Oxford University Press, 2017) Previous studies indicate that positive mood broadens the scope of visual attention, which can manifest as heightened distractibility. We used event-related potentials (ERP) to investigate whether music-induced positive ...
Rantanen, Matti; Hautala, Jarkko; Loberg, Otto; Nuorva, Jaakko; Hietanen, Jari K.; Nummenmaa, Lauri; Astikainen, Piia (Wiley-Blackwell, 2021) Depressed individuals exhibit an attentional bias towards mood-congruent stimuli, yet evidence for biased processing of threat-related information in human interaction remains scarce. Here, we tested whether an attentional ...