EEG Scalogram Analysis in Emotion Recognition: A Swin Transformer and TCN-Based Approach


Pesen S. T., Ali Altuncu M.

Computers, Materials and Continua, vol.84, no.3, pp.5597-5611, 2025 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 84 Issue: 3
  • Publication Date: 2025
  • DOI Number: 10.32604/cmc.2025.066702
  • Journal Name: Computers, Materials and Continua
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Aerospace Database, Communication Abstracts, Compendex, INSPEC, Metadex, zbMATH, Civil Engineering Abstracts
  • Page Numbers: pp.5597-5611
  • Keywords: Continuous wavelet transform, EEG, emotion recognition, Swin Transformer, temporal convolutional network
  • Kocaeli University Affiliated: Yes

Abstract

EEG signals are widely used in emotion recognition because they reflect involuntary physiological responses. However, their high dimensionality and continuous variability in the time-frequency plane make them challenging to analyze, so advanced deep learning methods are needed to extract meaningful features and improve classification performance. This study proposes a hybrid model that integrates the Swin Transformer and a Temporal Convolutional Network (TCN) for EEG-based emotion recognition. EEG signals are first converted into scalogram images using the Continuous Wavelet Transform (CWT), and classification is performed on these images. The Swin Transformer extracts spatial features from the scalogram images, while the TCN learns long-term temporal dependencies. In addition, attention mechanisms are integrated to highlight the essential features extracted by both models. The effectiveness of the proposed model was evaluated on the SEED dataset, which is widely used in emotion recognition, where it consistently achieved high performance across all emotion classes, with accuracy, precision, recall, and F1-score of 97.53%, 97.54%, 97.53%, and 97.54%, respectively. Compared with traditional transfer learning models, the proposed approach improved accuracy by 1.43% over ResNet-101, 1.81% over DenseNet-201, and 2.44% over VGG-19, and it also outperformed many recent CNN-, RNN-, and Transformer-based methods reported in the literature.
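The first stage of the pipeline described above, converting a 1-D EEG segment into a CWT scalogram, can be sketched with a naive complex-Morlet transform in plain NumPy. This is a minimal illustration, not the paper's implementation: the sampling rate, frequency grid, and Morlet parameter `w` are illustrative assumptions, and the synthetic sine segment stands in for a real SEED recording.

```python
import numpy as np

def morlet_cwt(signal, fs, freqs, w=6.0):
    """Naive CWT: correlate the signal with complex Morlet wavelets
    centred at each frequency in `freqs`. Returns |coefficients|,
    i.e. a scalogram array of shape (len(freqs), len(signal))."""
    n = len(signal)
    t = (np.arange(n) - n // 2) / fs          # wavelet support, centred at 0
    scalogram = np.empty((len(freqs), n))
    for i, f in enumerate(freqs):
        s = w / (2.0 * np.pi * f)             # Gaussian width for centre frequency f
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2.0 * s**2))
        wavelet /= np.sqrt(s)                 # rough energy normalisation across scales
        scalogram[i] = np.abs(np.convolve(signal, wavelet, mode="same"))
    return scalogram

# Illustrative 1-second "EEG" segment: a pure 10 Hz (alpha-band) sine.
fs = 200
t = np.arange(fs) / fs
segment = np.sin(2.0 * np.pi * 10.0 * t)

freqs = np.linspace(2.0, 40.0, 20)            # assumed 2-40 Hz analysis grid
scalogram = morlet_cwt(segment, fs, freqs)    # (20, 200) image-like array
```

In the proposed model such scalograms are rendered as images and passed to the Swin Transformer for spatial feature extraction; here the raw coefficient-magnitude array stands in for that image, and the 10 Hz input shows up as the most energetic row of the scalogram.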