心理科学进展
Advances in Psychological Science
2015, Issue 7, pp. 1109–1117
王苹 潘治辉 张立洁 陈煦海
facial emotion; vocal emotion; integration; neural oscillation; ERPs
The integrated processing of facial and vocal emotional information is an important skill for social interaction, and in recent years it has increasingly attracted attention in psychology and neuroscience. Current research has fairly systematically examined the behavioral performance of, and the factors influencing, the integration of bimodal emotional information, and has answered well the two questions of concern to cognitive neuroscience, namely "when" and "where" integration occurs. However, two key questions still lack systematic study: can facial and vocal emotional information be integrated into a coherent emotional object, and how does the brain merge the bimodal emotional information into one? Therefore, this project plans to systematically manipulate the emotional salience of facial and vocal stimuli and the task demands, introduce dynamic facial-vocal stimuli to increase external validity, and combine behavioral and electrophysiological techniques to mine the data from multiple angles, in particular by introducing neural oscillation (time-frequency and coherence) analyses, in order to systematically examine whether dynamic facial and vocal emotional information can be integrated into a coherent emotional object, and to clarify, at the level of neural oscillations, the mechanism by which bimodal emotional information is integrated.
The integration of facial and vocal emotion is an important skill for successful communication that has intrigued psychologists and neuroscientists in recent years. Previous studies have elaborated on the behavioral performance of, and the factors influencing, facial-vocal emotion integration, as well as "when" and "where" information from the two modalities is integrated. However, it remains an open question whether the integration of facial and vocal emotion follows the principles of multisensory integration (e.g., the principle of inverse effectiveness), and how the bimodal emotional information merges into a coherent emotional object. Therefore, taking "whether facial-vocal emotion integration obeys the principle of inverse effectiveness" as the main line, we designed six experiments that systematically manipulate the emotional salience of dynamic facial-vocal emotional stimuli and the task demands. Moreover, using multi-dimensional analyses of behavioral and EEG data, especially time-frequency and coherence analyses of the EEG data, we aim to answer the two questions proposed above and to further reveal the neurophysiological mechanism of facial-vocal emotion integration.