
Implementing an Adaptive Neuro-Fuzzy Model for Emotion Prediction Based on Heart Rate Variability (HRV)


  • Park, Sung Soo (SKKU Business School, Sungkyunkwan University) ;
  • Lee, Kun Chang (SKKU Business School/SAIHST (Samsung Advanced Institute for Health Science & Technology), Sungkyunkwan University)
  • Received : 2018.10.10
  • Accepted : 2019.01.20
  • Published : 2019.01.28

Abstract

Accurate prediction of emotion is an important issue for patient-centered medical device development and emotion-related psychology fields. Although there have been many studies on emotion prediction, no prior study has applied heart rate variability (HRV) together with a neuro-fuzzy approach to the problem. We propose ANFEP (Adaptive Neuro-Fuzzy system for Emotion Prediction), which predicts emotion from HRV. ANFEP bases its core functions on ANFIS (Adaptive Neuro-Fuzzy Inference System), which integrates neural networks with fuzzy systems to train predictive models. To validate the proposed model, 50 participants took part in an experiment in which emotion was elicited with auditory stimuli; their heart rate variability was measured and used as input to the ANFEP model. The ANFEP model with STDRR and RMSSD as inputs and two membership functions per input variable showed the best results. Applied to these HRV metrics, ANFEP proved significantly more robust than benchmark methods such as linear regression, support vector regression, a neural network, and a random forest. The results show that reliable emotion prediction is possible with fewer inputs, and further work is needed to develop a more accurate and reliable emotion recognition system.

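For readers who want to reproduce the two HRV inputs, the following minimal sketch (an illustration, not the authors' code) computes STDRR, the standard deviation of RR intervals, and RMSSD, the root mean square of successive RR differences. The function name hrv_features and the sample values are assumptions for demonstration; RR intervals are assumed to be in milliseconds.

```python
import numpy as np

def hrv_features(rr_ms):
    """Compute STDRR and RMSSD from a series of RR intervals (milliseconds)."""
    rr_ms = np.asarray(rr_ms, dtype=float)
    stdrr = np.std(rr_ms, ddof=1)          # standard deviation of RR intervals
    diffs = np.diff(rr_ms)                 # successive RR-interval differences
    rmssd = np.sqrt(np.mean(diffs ** 2))   # root mean square of successive differences
    return {"STDRR": stdrr, "RMSSD": rmssd}

# Example with made-up RR intervals (not data from the study):
print(hrv_features([812, 805, 790, 820, 801, 795, 810]))
```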



Fig. 1. Structure of ANFIS
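
Fig. 1 shows the standard five-layer Sugeno-type ANFIS on which ANFEP is built. The NumPy sketch below is an illustrative forward pass with two inputs (STDRR, RMSSD) and two Gaussian membership functions per input, i.e., four rules, matching the configuration described in the abstract. The function names and all parameter values are illustrative assumptions, not the trained values from the paper.

```python
import numpy as np
from itertools import product

def gauss(x, mean, sigma):
    # Layer 1: Gaussian membership degree of a crisp input.
    return np.exp(-0.5 * ((x - mean) / sigma) ** 2)

def anfis_forward(x, premise, consequent):
    """x: (2,) inputs; premise: two (mean, sigma) pairs per input;
    consequent: (4, 3) linear parameters [p, q, r] per rule."""
    # Layer 1: membership degrees, shape (2 inputs, 2 MFs).
    mu = np.array([[gauss(x[i], m, s) for (m, s) in premise[i]] for i in range(2)])
    # Layer 2: rule firing strengths (product of one MF per input).
    w = np.array([mu[0, i] * mu[1, j] for i, j in product(range(2), range(2))])
    # Layer 3: normalized firing strengths.
    w_bar = w / w.sum()
    # Layer 4: first-order Sugeno rule outputs f_k = p*x1 + q*x2 + r.
    f = consequent @ np.append(x, 1.0)
    # Layer 5: overall output is the sum of weighted rule outputs.
    return float(np.dot(w_bar, f))

# Illustrative premise parameters on a rough millisecond scale for STDRR and RMSSD.
premise = [[(30.0, 15.0), (70.0, 15.0)],   # two MFs for STDRR
           [(20.0, 10.0), (50.0, 10.0)]]   # two MFs for RMSSD
consequent = np.random.default_rng(0).normal(size=(4, 3))
print(anfis_forward(np.array([45.0, 28.0]), premise, consequent))
```

In training, ANFIS tunes the premise (mean, sigma) parameters and the consequent linear parameters jointly, typically by a hybrid of gradient descent and least squares.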


Fig. 2. Process of ANFEP


Fig. 3. Error of ANFIS Structure (Training & Checking)


Fig. 4. RMSE by Epoch


Fig. 5. Function plots for STDRR


Fig. 6. Function plots for RMSSD


Fig. 7. Valence prediction using ANFEP

Table 1. ANFEP model performance


Table 2. Result of Emotion prediction model

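Table 2 reports the comparison against the benchmark methods named in the abstract. The scikit-learn sketch below, run on synthetic data rather than the study's 50-participant recordings, illustrates how such a comparison of valence prediction from the two HRV features could be set up; the coefficients generating the synthetic target are arbitrary assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the study data: 50 samples of [STDRR, RMSSD] and a valence score.
rng = np.random.default_rng(42)
X = rng.normal(loc=[50.0, 35.0], scale=[15.0, 10.0], size=(50, 2))
y = 0.03 * X[:, 0] - 0.02 * X[:, 1] + rng.normal(scale=0.3, size=50)

models = {
    "Linear regression": LinearRegression(),
    "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf")),
    "Neural network": make_pipeline(StandardScaler(),
                                    MLPRegressor(hidden_layer_sizes=(10,),
                                                 max_iter=2000, random_state=0)),
    "Random forest": RandomForestRegressor(n_estimators=200, random_state=0),
}

# 5-fold cross-validated RMSE for each benchmark model.
for name, model in models.items():
    rmse = -cross_val_score(model, X, y, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    print(f"{name:20s} RMSE = {rmse:.3f}")
```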
