TY - JOUR
T1 - Emotion recognition from spatiotemporal EEG representations with hybrid convolutional recurrent neural networks via wearable multi-channel headset
AU - Chen, Jingxia
AU - Jiang, Dongmei
AU - Zhang, Yanning
AU - Zhang, Pengwei
N1 - Publisher Copyright:
© 2020 Elsevier B.V.
PY - 2020/3/15
Y1 - 2020/3/15
N2 - Emotion recognition from electroencephalography (EEG) has been an important research direction in affective brain–computer interactions (ABCI). Integrating EEG-based emotion recognition algorithms into ABCI can make the user experience more complete and engaging. In this paper, we propose a new EEG data representation that transforms 1D chain-like EEG vector sequences into 2D mesh-like matrix sequences. The mesh structure of the matrix at each time point corresponds to the electrode layout of the EEG headset, which better represents the spatial correlation of EEG signals among physically adjacent electrodes. A sliding window is then used to divide the 2D matrix sequences into segments containing an equal number of time points, and each segment is treated as an EEG sample that integrates the temporal and spatial correlations of the raw EEG recordings. We also propose cascaded and parallel hybrid convolutional recurrent neural networks to accurately predict the emotional category of each EEG sample. Extensive binary emotion classification experiments on valence and arousal are carried out on the large-scale DEAP dataset (32 subjects, 9,830,400 EEG recordings). The experimental results demonstrate that the classification accuracies of both proposed hybrid networks on our spatiotemporal EEG representation exceed 93%, outperforming the most recent baseline methods and other deep learning models in the within-subject validation scenario. Our proposed methods and hybrid models effectively improve the accuracy and robustness of EEG emotion classification and can be applied to develop affective BCI applications using consumer multi-electrode EEG headsets.
AB - Emotion recognition from electroencephalography (EEG) has been an important research direction in affective brain–computer interactions (ABCI). Integrating EEG-based emotion recognition algorithms into ABCI can make the user experience more complete and engaging. In this paper, we propose a new EEG data representation that transforms 1D chain-like EEG vector sequences into 2D mesh-like matrix sequences. The mesh structure of the matrix at each time point corresponds to the electrode layout of the EEG headset, which better represents the spatial correlation of EEG signals among physically adjacent electrodes. A sliding window is then used to divide the 2D matrix sequences into segments containing an equal number of time points, and each segment is treated as an EEG sample that integrates the temporal and spatial correlations of the raw EEG recordings. We also propose cascaded and parallel hybrid convolutional recurrent neural networks to accurately predict the emotional category of each EEG sample. Extensive binary emotion classification experiments on valence and arousal are carried out on the large-scale DEAP dataset (32 subjects, 9,830,400 EEG recordings). The experimental results demonstrate that the classification accuracies of both proposed hybrid networks on our spatiotemporal EEG representation exceed 93%, outperforming the most recent baseline methods and other deep learning models in the within-subject validation scenario. Our proposed methods and hybrid models effectively improve the accuracy and robustness of EEG emotion classification and can be applied to develop affective BCI applications using consumer multi-electrode EEG headsets.
KW - Affective BCI
KW - Convolutional recurrent neural network
KW - EEG
KW - Emotion recognition
KW - Spatiotemporal feature
KW - Wearable multi-electrode headset
UR - http://www.scopus.com/inward/record.url?scp=85079854277&partnerID=8YFLogxK
U2 - 10.1016/j.comcom.2020.02.051
DO - 10.1016/j.comcom.2020.02.051
M3 - Article
AN - SCOPUS:85079854277
SN - 0140-3664
VL - 154
SP - 58
EP - 65
JO - Computer Communications
JF - Computer Communications
ER -
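
Note: the abstract above describes a 1D-to-2D mesh transform over the electrode layout followed by sliding-window segmentation of the EEG recordings. The short Python sketch below illustrates that pipeline under stated assumptions; the grid size, the channel-to-position mapping, and the window length are hypothetical choices for illustration, not the configuration used in the paper.

# Hypothetical sketch of the 1D -> 2D mesh transform and sliding-window
# segmentation described in the abstract. The grid resolution (9 x 9), the
# channel-to-grid mapping, and the window settings are assumptions made
# for illustration only.
import numpy as np

GRID_H, GRID_W = 9, 9  # assumed mesh resolution covering the scalp layout

# Assumed (row, col) positions for a few of the 32 DEAP channels; a full
# mapping would place every electrode according to its 10-20 location.
CHANNEL_POS = {
    0: (1, 3),   # e.g. Fp1
    1: (2, 2),   # e.g. AF3
    2: (2, 0),   # e.g. F7
    # ... remaining channels would be added here
}

def to_mesh(frame_1d):
    """Scatter one time point's channel readings onto a 2D mesh."""
    mesh = np.zeros((GRID_H, GRID_W), dtype=np.float32)
    for ch, (r, c) in CHANNEL_POS.items():
        mesh[r, c] = frame_1d[ch]
    return mesh

def segment(eeg, win_len=128, step=128):
    """Slide a fixed-length window over a (time, channels) recording and
    return samples shaped (n_segments, win_len, GRID_H, GRID_W)."""
    meshes = np.stack([to_mesh(frame) for frame in eeg])  # (time, H, W)
    starts = range(0, len(meshes) - win_len + 1, step)
    return np.stack([meshes[s:s + win_len] for s in starts])

# Example with random data standing in for one trial
# (63 s at an assumed 128 Hz sampling rate).
eeg_trial = np.random.randn(8064, 32).astype(np.float32)
samples = segment(eeg_trial)
print(samples.shape)  # (63, 128, 9, 9) with the assumed window settings

Each resulting sample is a sequence of 2D meshes, which is the kind of input a hybrid convolutional recurrent network (CNN over each mesh, RNN over the window's time steps) could consume.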