视觉工作记忆存储的表征单位 [The Representation Unit of Visual Working Memory]
Liu, R., Guo, L., Cheng, Y., Li, Q., & Ye, C. (2022). 视觉工作记忆存储的表征单位 [The Representation Unit of Visual Working Memory]. Xin li xue jin zhan, 12(03), 868-875. https://doi.org/10.12677/ap.2022.123103
Published in: Xin li xue jin zhan
Date: 2022
Copyright: © 2022 The Authors.
Visual working memory is a limited-capacity storage system in which people can flexibly process representations to complete a task. Among studies on the capacity of visual working memory, the representation unit is a primary subject. Previous research has employed the change-detection paradigm and the recall paradigm to explore the nature of representations when features from different dimensions, or features at different levels of the same dimension, are stored in visual working memory. These studies have given rise to two competing hypotheses: the object-based representation hypothesis holds that diverse features are bound to the object and stored as a whole, whereas the feature-based representation hypothesis maintains that independent feature representations have a separate capacity limit for each dimension. Moreover, as a goal-driven memory system, visual working memory is inevitably influenced by the experimental procedure and the task demand. By comparing and summarizing previous debates on the representation unit, we attempt to reach a unified account of the two hypotheses. This benefits our understanding of the capacity of visual working memory. Future work on related problems should take the impact of task demand into account to improve experimental design.
...
Publisher: Hans Publishers
ISSN: 2160-7273
Publication in research information system: https://converis.jyu.fi/converis/portal/detail/Publication/117588253
Additional information about funding: This research was funded by the National Natural Science Foundation of China (31700948).
Related items
Showing items with similar title or keywords.
- Sustained attention required for effective dimension-based retro-cue benefit in visual working memory
  Liu, Ruyi; Guo, Lijing; Sun, Hong-jin; Parviainen, Tiina; Zhou, Zifang; Cheng, Yuxin; Liu, Qiang; Ye, Chaoxiong (Association for Research in Vision and Ophthalmology (ARVO), 2023). In visual working memory (VWM) tasks, participants' performances can be improved through the use of dimension-based retro-cues, which direct internal attention to prioritize a particular dimension (e.g., color or orientation) ...
- The differential impact of face distractors on visual working memory across encoding and delay stages
  Ye, Chaoxiong; Xu, Qianru; Pan, Zhihu; Nie, Qi-Yang; Liu, Qiang (Springer Nature, 2024). External distractions often occur when information must be retained in visual working memory (VWM)—a crucial element in cognitive processing and everyday activities. However, the distraction effects can differ if they occur ...
- 不同情绪面孔的视觉工作记忆表现差异 [The Performance Difference of Visual Working Memory between Various Emotional Faces]
  Li, Qianru; Guo, Lijing; Zhou, Zifang; Liu, Ruyi; Cheng, Yuxin; Ye, Chaoxiong (Hans Publishers, 2022). Among social-emotional stimuli, emotional faces occupy an important position, which specifically refer to human faces with certain facial expressions. The visual working memory is a limited workspace where information can ...
- The inhibitory effect of long-term associative representation on working memory
  Zhang, Yin; Liang, Tengfei; Ye, Chaoxiong; Liu, Qiang (Science Press, 2020). Studies on how long-term memory affects working memory (WM) have found that long-term memory can enhance WM processing. However, these studies only use item memory as the representation of long-term memory. In addition to ...
- The impact of retro-cue validity on working memory representation: Evidence from electroencephalograms
  Fu, Xueying; Ye, Chaoxiong; Liang, Tengfei; Hu, Zhonghua; Li, Ziyuan; Liu, Qiang (Association for Research in Vision and Ophthalmology, 2022)