Abstract

Emotion recognition based on EEG has been implemented in numerous studies, and two observations recur across them. First, reported performance depends strongly on the validation scheme: cross-subject validation is more difficult than subject-dependent validation because of the high variability between EEG recordings caused by domain shifts. Second, a large number of channels requires extensive computation, and efforts to reduce the channel count are hampered by the drop in performance that accompanies fewer channels; an effective channel-reduction approach is therefore needed to maintain performance. In this paper, we propose combining 2D EEG input in the form of scalograms, a CNN, and channel selection based on power spectral density ratios coupled with the Relief method. The power ratio is derived from the power spectral density of each frequency band. Across trial selections under various conditions, the combination of the proposed scalogram input and PR-Relief (power ratio-Relief) produced a stable classification rate. The Database for Emotion Analysis Using Physiological Signals (DEAP) was employed for analysis. Experimental results indicate that the proposed method increases the accuracy of cross-subject emotion recognition using 10 channels by 2.71% for valence and 1.96% for arousal. Using 10 channels for subject-dependent validation, accuracy for the valence and arousal classes increased by 2.41% and 1.2%, respectively. Consequently, by combining input interpretation with a stable channel selection method, the proposed collaborative method achieves better results.
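As an illustration only (not the authors' implementation), the pairing of per-channel power-ratio features with Relief-style channel scoring described in the abstract can be sketched as follows. The band edges, the periodogram-based PSD, and the simplified single-neighbor Relief variant are all assumptions for this toy example.

```python
import numpy as np

def band_power_ratio(eeg, fs, bands=((4, 8), (8, 13), (13, 30), (30, 45))):
    """Per-channel band-power ratios from a periodogram PSD.
    eeg: (n_channels, n_samples). Returns (n_channels, n_bands)."""
    freqs = np.fft.rfftfreq(eeg.shape[1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg, axis=1)) ** 2
    total = psd.sum(axis=1, keepdims=True)  # normalize by total power
    ratios = [psd[:, (freqs >= lo) & (freqs < hi)].sum(axis=1) for lo, hi in bands]
    return np.stack(ratios, axis=1) / total

def relief_channel_scores(features, labels):
    """Simplified Relief: one feature per channel; score each channel by how
    much it separates the nearest same-class and different-class trials.
    features: (n_trials, n_channels); labels: (n_trials,)."""
    n, d = features.shape
    rng = features.max(axis=0) - features.min(axis=0)
    rng[rng == 0] = 1.0
    x = (features - features.min(axis=0)) / rng  # scale features to [0, 1]
    w = np.zeros(d)
    for i in range(n):
        dist = np.abs(x - x[i]).sum(axis=1)  # L1 distance to every trial
        dist[i] = np.inf
        same = labels == labels[i]
        hit = np.argmin(np.where(same, dist, np.inf))    # nearest same-class trial
        miss = np.argmin(np.where(~same, dist, np.inf))  # nearest other-class trial
        w += np.abs(x[i] - x[miss]) - np.abs(x[i] - x[hit])
    return w / n

# Toy PSD check: a pure 10 Hz tone should land almost entirely in the alpha band.
fs = 128
t = np.arange(2 * fs) / fs
tone = np.sin(2 * np.pi * 10 * t)[None, :]
ratios = band_power_ratio(tone, fs)  # ratios[0, 1] (8-13 Hz band) dominates

# Toy Relief check: 8 synthetic "channels", channel 0 carries class information.
gen = np.random.default_rng(0)
labels = np.repeat([0, 1], 20)
feats = gen.normal(size=(40, 8))
feats[:, 0] += 3.0 * labels          # informative channel
scores = relief_channel_scores(feats, labels)
ranking = np.argsort(scores)[::-1]   # channels ordered by Relief score
```

Channel selection in this sketch amounts to keeping the top-k entries of `ranking`; the paper's PR-Relief couples the same idea to power-ratio features extracted per channel.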

Original language: English
Pages (from-to): 110136-110150
Number of pages: 15
Journal: IEEE Access
Volume: 11
DOIs
Publication status: Published - 2023

Keywords

  • Channel selection
  • cross-subject
  • emotion recognition
  • scalogram
  • validation

Fingerprint

Dive into the research topics of 'Cross-Subject Channel Selection Using Modified Relief and Simplified CNN-Based Deep Learning for EEG-Based Emotion Recognition'.