Electroencephalogram-based emotional stress recognition according to audiovisual stimulation using spatial frequency convolutional gated transformer

  • Hyoung-Gook Kim (Department of Electronic Convergence Engineering, Kwangwoon University) ;
  • Dong-Ki Jeong (Department of Electronic Convergence Engineering, Kwangwoon University) ;
  • Jin-Young Kim (Department of ICT Convergence System Engineering, Chonnam National University)
  • Received : 2022.07.20
  • Accepted : 2022.09.16
  • Published : 2022.09.30

Abstract

In this paper, we propose a method that combines convolutional neural networks and an attention mechanism to improve the recognition of emotional stress from electroencephalogram (EEG) signals. In the proposed method, EEG signals are decomposed into five frequency bands, and a convolutional neural network layer is applied to each band to capture the spatial information of the EEG features. A gated transformer-based attention mechanism then learns the salient frequency information within each band and, through inter-band mapping, learns complementary frequency information that is reflected in the final attention representation. Through EEG stress recognition experiments on the DEAP dataset and with six participating subjects, we show that the proposed method improves EEG-based stress recognition performance compared with existing methods.
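The pipeline described in the abstract can be made concrete with a short sketch. The following is a minimal, illustrative PyTorch implementation of the overall flow: a small CNN extracts spatial features from each of the five frequency-band maps, a gated attention block over the five band tokens models salient and complementary inter-band information, and a linear head predicts the stress class. The class names (BandSpatialCNN, GatedTransformerBlock, SFCGT), layer sizes, the 9 x 9 electrode-grid input, and the sigmoid-gated fusion are assumptions made for illustration and do not reproduce the authors' exact architecture.

```python
# Illustrative sketch of the spatial-frequency convolutional gated transformer pipeline.
# Shapes, layer sizes, and the gating formulation are assumptions, not the paper's code.
import torch
import torch.nn as nn

class BandSpatialCNN(nn.Module):
    """Per-band CNN that turns a 2-D electrode map into a spatial feature vector."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, feat_dim, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # (B, feat_dim, 1, 1)
        )

    def forward(self, x):                     # x: (B, 1, H, W) band-power map
        return self.net(x).flatten(1)         # (B, feat_dim)

class GatedTransformerBlock(nn.Module):
    """Self-attention over the five band tokens, with a learned gate in place of
    the plain residual connection (a simplified stand-in for the gated transformer)."""
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.gate = nn.Linear(2 * dim, dim)   # gate computed from [token, attention output]
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, x):                     # x: (B, bands, dim)
        a, _ = self.attn(x, x, x)             # salient / complementary inter-band information
        g = torch.sigmoid(self.gate(torch.cat([x, a], dim=-1)))
        x = self.norm1(g * a + (1 - g) * x)   # gated fusion instead of a plain residual
        return self.norm2(x + self.ff(x))

class SFCGT(nn.Module):
    """Five band-wise CNNs -> gated attention over bands -> stress classifier."""
    def __init__(self, bands=5, dim=64, classes=2):
        super().__init__()
        self.band_cnns = nn.ModuleList(BandSpatialCNN(dim) for _ in range(bands))
        self.block = GatedTransformerBlock(dim)
        self.head = nn.Linear(dim, classes)

    def forward(self, x):                     # x: (B, bands, H, W) per-band electrode maps
        tokens = torch.stack(
            [cnn(x[:, b:b + 1]) for b, cnn in enumerate(self.band_cnns)], dim=1)
        fused = self.block(tokens).mean(dim=1)  # pool the attended band representations
        return self.head(fused)

if __name__ == "__main__":
    model = SFCGT()
    dummy = torch.randn(8, 5, 9, 9)           # 8 trials, 5 bands, assumed 9x9 electrode grid
    print(model(dummy).shape)                 # torch.Size([8, 2])
```

In this sketch the gate interpolates between each band token and its attention output; this is a simplified stand-in for the gated transformer of Tao et al. [10], used here only to convey the structure of the method.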

Keywords

Acknowledgements

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korean government (Ministry of Education) in 2018 (NRF-2018R1D1A1B07041783) and by a research grant from Kwangwoon University in 2022.

References

  1. H. M. Burke, M. C. Davis, C. Otte, and D. C. Mohr, "Depression and cortisol responses to psychological stress: a meta-analysis," Psychoneuroendocrinology, 30, 846-856 (2005). https://doi.org/10.1016/j.psyneuen.2005.02.010
  2. N. Sharma and T. Gedeon, "Objective measures, sensors and computational techniques for stress recognition: A survey," Comput. Methods Programs Biomed. 108, 1287-1301 (2012). https://doi.org/10.1016/j.cmpb.2012.07.003
  3. D. Li, L. Xie, B. Chai, Z. Wang, and H. Yang, "Spatial-frequency convolutional self-attention network for EEG emotion recognition," Appl. Soft Comput. 122, 108740 (2022). https://doi.org/10.1016/j.asoc.2022.108740
  4. S. Issa, Q. Peng, X. You, and W. Ali, "Emotion assessment using EEG brain signals and stacked sparse autoencoder," J. Inf. Assur. Secur. 14, 20-29 (2019).
  5. Y. Song, X. Jia, L. Yang, and L. Xie, "Transformer-based spatial-temporal feature learning for EEG decoding," arXiv preprint arXiv:2106.11170 (2021).
  6. H. J. Eun, "Basics of electroencephalography for neuropsychiatrist" (in Korean), J. Korean Neuropsychiatr Assoc. 58, 76-104 (2019). https://doi.org/10.4306/jknpa.2019.58.2.76
  7. A. Nguyen, K. Pham, D. Ngo, T. Ngo, and L. Pham, "An analysis of state-of-the-art activation functions for supervised deep neural network," Proc. ICSSE, 215-220 (2021).
  8. A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, and A. N. Gomez, "Attention is all you need," Adv. Neural Inf. Process. Syst. 30, 5998-6008 (2017).
  9. K. Cho, B. Van Merrienboer, D. Bahdanau, and Y. Bengio, "On the properties of neural machine translation: Encoder-decoder approaches," arXiv preprint arXiv:1409.1259 (2014).
  10. Y. Tao, T. Sun, A. Muhamed, S. Genc, D. Jackson, A. Arsanjani, S. Yaddanapudi, L. Li, and P. Kumar, "Gated transformer for decoding human brain EEG signals," Proc. IEEE EMBC, 125-130 (2021).
  11. S. Koelstra, C. Muhl, M. Soleymani, J. S. Lee, A. Yazdani, T. Ebrahimi, T. Pun, A. Nijholt, and I. Patras, "DEAP: A database for emotion analysis; Using physiological signals," IEEE Trans. Affective Comput. 3, 18-31 (2012). https://doi.org/10.1109/T-AFFC.2011.15
  12. M. J. Hasan and J. M. Kim, "A hybrid feature pool-based emotional stress state detection algorithm using EEG signals," Brain Sci. 9, 376 (2019). https://doi.org/10.3390/brainsci9120376
  13. D. Shon, K. Im, J. H. Park, D. S. Lim, B. Jang, and J. M. Kim, "Emotional stress state detection using genetic algorithm-based feature selection on EEG signals," Int. J. Environ. Res. Public Health, 15, 2461 (2018). https://doi.org/10.3390/ijerph15112461
  14. A. Martinez-Rodrigo, B. Garcia-Martinez, A. Huerta, and R. Alcaraz, "Detection of negative stress through spectral features of electroencephalographic recordings and a convolutional neural network," Sensors, 21, 3050 (2021). https://doi.org/10.3390/s21093050
  15. X. Li, D. Song, P. Zhang, G. Yu, Y. Hou, and B. Hu, "Emotion recognition from multi-channel EEG data through convolutional recurrent neural network," Proc. IEEE BIBM, 352-359 (2016).
  16. J. X. Chen, D. M. Jiang, and Y. N. Zhang, "A hierarchical bidirectional GRU model with attention for EEG-based emotion classification," IEEE Access, 7, 118530-118540 (2019). https://doi.org/10.1109/ACCESS.2019.2936817