Feature hypergraph representation learning on spatial-temporal correlations for EEG emotion recognition

Cogn Neurodyn. 2023 Oct;17(5):1271-1281. doi: 10.1007/s11571-022-09890-3. Epub 2022 Oct 10.

Abstract

Electroencephalogram (EEG) has become popular in emotion recognition for its capability of reflecting real emotional states. Existing graph-based methods have made initial progress in representing pairwise spatial relationships, but they leave higher-order relationships among EEG channels and within EEG series unexplored. Constructing a hypergraph is a general way of representing such higher-order relations. In this paper, we propose a spatial-temporal hypergraph convolutional network (STHGCN) to capture the higher-order relationships that exist in EEG recordings. STHGCN is a two-block hypergraph convolutional network in which feature hypergraphs are constructed over the spectrum, space, and time domains to explore spatial and temporal correlations under specific emotional states, namely the correlations among EEG channels and the dynamic relationships among time stamps. Moreover, a self-attention mechanism is combined with the hypergraph convolutional network to initialize and update the relationships of EEG series. The experimental results demonstrate that the constructed feature hypergraphs effectively capture the correlations among valuable EEG channels and within valuable EEG series, leading to the best emotion recognition accuracy among graph-based methods. In addition, compared with other competitive methods, the proposed method achieves state-of-the-art results on the SEED and SEED-IV datasets.
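To make the architecture described above concrete, the following is a minimal sketch of a hypergraph convolution layer combined with self-attention over time stamps, assuming the standard HGNN propagation rule X' = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Θ. The class names, feature dimensions, and the way the two components are wired together are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: hypergraph convolution over EEG channels plus
# self-attention over time stamps. Names, shapes, and the incidence
# matrix construction are assumptions for illustration only.
import torch
import torch.nn as nn


class HypergraphConv(nn.Module):
    """One hypergraph convolution layer (standard HGNN formulation)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.theta = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x, H, edge_weight=None):
        # x: (num_nodes, in_dim) node features, e.g. per-channel spectral features
        # H: (num_nodes, num_edges) hypergraph incidence matrix
        if edge_weight is None:
            edge_weight = torch.ones(H.size(1), device=H.device)
        Dv = H @ edge_weight                                   # node degrees
        De = H.sum(dim=0)                                      # hyperedge degrees
        Dv_inv_sqrt = torch.diag(Dv.clamp(min=1e-12).pow(-0.5))
        De_inv = torch.diag(De.clamp(min=1e-12).pow(-1.0))
        W = torch.diag(edge_weight)
        # Normalized propagation: Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}
        prop = Dv_inv_sqrt @ H @ W @ De_inv @ H.t() @ Dv_inv_sqrt
        return torch.relu(prop @ self.theta(x))


class TemporalSelfAttention(nn.Module):
    """Self-attention weighting the dynamic relationships of time stamps."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, seq):
        # seq: (batch, time_steps, dim)
        out, _ = self.attn(seq, seq, seq)
        return out


# Toy usage: 62 EEG channels with 5 spectral features each; a random
# incidence matrix stands in for hyperedges grouping correlated channels.
x = torch.randn(62, 5)
H = (torch.rand(62, 16) > 0.8).float()
node_emb = HypergraphConv(5, 32)(x, H)          # (62, 32)

# Toy temporal sequence: 10 time stamps of pooled 32-d features for one trial.
seq = torch.randn(1, 10, 32)
weighted = TemporalSelfAttention(32)(seq)       # (1, 10, 32)
```

In practice, the incidence matrices would be built from the spectrum, space, and time domains as the abstract describes (for example, by grouping channels or time stamps with similar features into hyperedges), rather than sampled at random as in this toy usage.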

Keywords: EEG; Emotion recognition; Hypergraph learning; Self-attention mechanism.