• Title/Summary/Keyword: Extract Emotion

Study of Emotion Recognition based on Facial Image for Emotional Rehabilitation Biofeedback (정서재활 바이오피드백을 위한 얼굴 영상 기반 정서인식 연구)

  • Ko, Kwang-Eun; Sim, Kwee-Bo
    • Journal of Institute of Control, Robotics and Systems / v.16 no.10 / pp.957-962 / 2010
  • To recognize a human's emotion from a facial image, we first need to extract emotional features with a feature extraction algorithm and then classify the emotional state with a pattern classification method. The AAM (Active Appearance Model) is a well-known method for representing non-rigid objects such as faces and facial expressions. A Bayesian network is a probability-based classifier that can represent the probabilistic relationships among a set of facial features. In this paper, our approach to facial feature extraction combines AAM with FACS (Facial Action Coding System) to automatically model and extract facial emotional features. To recognize facial emotion, we use DBNs (Dynamic Bayesian Networks) to model and understand the temporal phases of facial expressions in image sequences. The recognition results can be used for biofeedback-based rehabilitation of the emotionally disabled.
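
The temporal modeling step can be illustrated with the simplest dynamic Bayesian network, a two-state hidden Markov model filtered over frames. The states, the FACS action unit, and every probability below are invented for illustration and are not taken from the paper:

```python
# Minimal sketch: a two-state hidden Markov model (the simplest dynamic
# Bayesian network) tracking "neutral" vs "happy" across frames of discrete
# facial action-unit observations. All probabilities are illustrative.

STATES = ["neutral", "happy"]

prior = {"neutral": 0.8, "happy": 0.2}
trans = {"neutral": {"neutral": 0.9, "happy": 0.1},
         "happy":   {"neutral": 0.2, "happy": 0.8}}
# AU12 = lip-corner puller in FACS, typically associated with smiling.
emit  = {"neutral": {"AU_none": 0.9, "AU12": 0.1},
         "happy":   {"AU_none": 0.3, "AU12": 0.7}}

def forward(observations):
    """Return the filtered state distribution after each observation."""
    belief = dict(prior)
    history = []
    for obs in observations:
        # predict: propagate the belief through the transition model
        pred = {s: sum(belief[p] * trans[p][s] for p in STATES) for s in STATES}
        # update: weight by the observation likelihood and renormalise
        unnorm = {s: pred[s] * emit[s][obs] for s in STATES}
        z = sum(unnorm.values())
        belief = {s: unnorm[s] / z for s in STATES}
        history.append(belief)
    return history

beliefs = forward(["AU_none", "AU12", "AU12"])
print(round(beliefs[-1]["happy"], 3))
```

Two consecutive AU12 observations pull the belief toward "happy" even though the prior favors "neutral", which is the temporal-phase behavior the DBN approach exploits.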

Emotion Recognition Using Template Vector and Neural-Network (형판 벡터와 신경망을 이용한 감성인식)

  • Joo, Young-Hoon; Oh, Jae-Heung
    • Journal of the Korean Institute of Intelligent Systems / v.13 no.6 / pp.710-715 / 2003
  • In this paper, we propose a new method for intelligently recognizing a human's emotion using a template vector and a neural network. In the proposed method, emotion is divided into four classes (surprise, anger, happiness, sadness). The method is based on template vector extraction and template location recognition using color differences. Because it is difficult to extract the skin color area correctly from a single color space, we propose an extraction method that uses several color spaces with a template vector for each. We then apply the back-propagation algorithm to the template vectors derived from the feature points. Finally, we show the practical applicability of the proposed method.
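
The multi-color-space idea can be sketched as a per-pixel test that combines evidence from two spaces; the YCbCr and HSV threshold ranges below are common rules of thumb for skin detection, not the paper's actual values:

```python
import colorsys

# Illustrative sketch: classify a pixel as skin only if it passes tests in
# both YCbCr and HSV, since a single color space is often insufficient.
# Thresholds are widely cited rules of thumb, not the paper's values.

def rgb_to_ycbcr(r, g, b):
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    """True if the 8-bit RGB pixel passes skin tests in both spaces."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    ycbcr_ok = 77 <= cb <= 127 and 133 <= cr <= 173
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    hsv_ok = (h <= 50 / 360 or h >= 340 / 360) and 0.1 <= s <= 0.7
    return ycbcr_ok and hsv_ok

print(is_skin(220, 170, 140))  # a typical skin tone
print(is_skin(40, 90, 200))    # a blue background pixel
```

Running the combined test over a whole image yields the skin mask from which template locations would then be found.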

Emotion Classification Using EEG Spectrum Analysis and Bayesian Approach (뇌파 스펙트럼 분석과 베이지안 접근법을 이용한 정서 분류)

  • Chung, Seong Youb; Yoon, Hyun Joong
    • Journal of Korean Society of Industrial and Systems Engineering / v.37 no.1 / pp.1-8 / 2014
  • This paper proposes an emotion classifier for EEG signals based on Bayes' theorem and machine learning with a perceptron convergence algorithm. Emotions are represented on the valence and arousal dimensions. Fast Fourier transform spectrum analysis is used to extract features from the EEG signals. To verify the proposed method, we use an open database for emotion analysis using physiological signals (DEAP) and compare our classifier with C-SVC, one of the support vector machines. Emotion is defined as a two-level class and a three-level class in both the valence and arousal dimensions. For the two-level case, the accuracy of valence and arousal estimation is 67% and 66%, respectively; for the three-level case, 53% and 51%. Compared with the best case of C-SVC, the proposed classifier gave 4% and 8% more accurate estimates of valence and arousal for the two-level class, and showed similar performance to the best case of C-SVC for the three-level class.
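
A stripped-down version of the pipeline might look as follows: extract a band power from an EEG epoch with a plain DFT, then apply Bayes' theorem. The class-conditional Gaussian densities are invented stand-ins, not parameters learned from DEAP:

```python
import cmath, math

# Sketch under simplified assumptions: compute alpha-band power from one
# EEG epoch with a naive DFT, then classify valence with Bayes' rule using
# illustrative (not learned) Gaussian class likelihoods.

FS = 128  # Hz; DEAP distributes EEG down-sampled to 128 Hz

def band_power(signal, lo, hi):
    """Sum of |DFT coefficient|^2 / n over bins whose frequency is in [lo, hi)."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2):
        freq = k * FS / n
        if lo <= freq < hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2 / n
    return power

def gaussian(x, mu, sigma):
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def classify_valence(alpha_power):
    # Equal priors; the densities below are made up for the sketch.
    p_high = gaussian(alpha_power, mu=20.0, sigma=8.0) * 0.5
    p_low = gaussian(alpha_power, mu=60.0, sigma=8.0) * 0.5
    return "high" if p_high > p_low else "low"

# Synthetic one-second epoch: a pure 10 Hz (alpha-band) oscillation.
epoch = [math.sin(2 * math.pi * 10 * t / FS) for t in range(FS)]
alpha = band_power(epoch, 8, 13)
print(classify_valence(alpha))
```

In practice one would use an FFT, several bands and channels, and densities estimated from labeled DEAP trials.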

Vision System for NN-based Emotion Recognition (신경회로망 기반 감성 인식 비젼 시스템)

  • Lee, Sang-Yun; Kim, Sung-Nam; Joo, Young-Hoon; Park, Chang-Hyun; Sim, Kwee-Bo
    • Proceedings of the KIEE Conference / 2001.07d / pp.2036-2038 / 2001
  • In this paper, we propose a neural-network-based method for intelligently recognizing a human's emotion using a vision system. In the proposed method, emotion is divided into four classes (surprise, anger, happiness, sadness). We use R, G, B (red, green, blue) color image data together with gray image data to achieve a high success rate in feature point extraction. To this end, we propose an algorithm that extracts four feature points (eyebrow, eye, nose, mouth) from a face image acquired by a color CCD camera and derives feature vectors from them. We then apply the back-propagation algorithm to the secondary feature vector (positions of and distances among the feature points). Finally, we show the practical applicability of the proposed method.
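
The back-propagation step can be sketched with a tiny one-hidden-layer network mapping a 2-D secondary feature vector to one of two emotion classes. The features (normalized distances among facial feature points), the training data, and the network size are all invented for illustration:

```python
import math, random

# Toy sketch of back-propagation on a secondary feature vector. Inputs are
# two hypothetical normalized geometric features; the target is a binary
# emotion label. Data and dimensions are made up.

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyMLP:
    def __init__(self, n_in=2, n_hid=4):
        self.w1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hid)]
        self.b1 = [0.0] * n_hid
        self.w2 = [random.uniform(-1, 1) for _ in range(n_hid)]
        self.b2 = 0.0

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        self.o = sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)
        return self.o

    def train_step(self, x, y, lr=0.5):
        o = self.forward(x)
        delta_o = (o - y) * o * (1 - o)               # output error term
        for j, h in enumerate(self.h):
            delta_h = delta_o * self.w2[j] * h * (1 - h)
            self.w2[j] -= lr * delta_o * h            # hidden->output update
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * delta_h * xi    # input->hidden update
            self.b1[j] -= lr * delta_h
        self.b2 -= lr * delta_o

# e.g. (mouth opening, eyebrow raise): 1 = "surprise", 0 = "happiness"
data = [([0.9, 0.8], 1), ([0.8, 0.9], 1), ([0.7, 0.1], 0), ([0.6, 0.2], 0)]
net = TinyMLP()
for _ in range(2000):
    for x, y in data:
        net.train_step(x, y)
print([round(net.forward(x)) for x, _ in data])
```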

Development of Context Awareness and Service Reasoning Technique for Handicapped People (멀티 모달 감정인식 시스템 기반 상황인식 서비스 추론 기술 개발)

  • Ko, Kwang-Eun; Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.19 no.1 / pp.34-39 / 2009
  • As a subjective phenomenon, human emotion has an impulsive character and unconsciously expresses intentions and needs. These carry contextual information about users of ubiquitous computing environments or intelligent robot systems. Indicators from which a user's emotion can be perceived include facial images, voice signals, and biological signal spectra. In this paper, we produce separate facial and voice emotion recognition results from facial images and voice, to increase the convenience and efficiency of emotion recognition. We also extract the best-fitting features from the image and sound to raise the recognition rate, and implement a multi-modal emotion recognition system based on feature fusion. Finally, using the recognition results, we propose a method for reasoning about ubiquitous computing services based on a Bayesian network and a ubiquitous context scenario.
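
One common form of feature fusion can be sketched as follows; the paper does not specify its fusion scheme in this abstract, so the z-score-then-concatenate approach, the feature names, and the values are all assumptions for illustration:

```python
# Hypothetical sketch of feature-level fusion: z-score each modality's
# feature vector separately, so that neither modality dominates by raw
# scale, then concatenate into one vector for a downstream classifier.

def zscore(vec):
    mean = sum(vec) / len(vec)
    std = (sum((v - mean) ** 2 for v in vec) / len(vec)) ** 0.5 or 1.0
    return [(v - mean) / std for v in vec]

def fuse(facial_features, voice_features):
    return zscore(facial_features) + zscore(voice_features)

facial = [12.0, 30.5, 8.2]   # e.g. geometric distances between landmarks
voice = [210.0, 0.03, 55.0]  # e.g. pitch (Hz), jitter, short-time energy
fused = fuse(facial, voice)
print(len(fused))
```

The fused vector would then feed a single classifier, rather than combining two classifiers' decisions.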

Analysis and Application of Power Consumption Patterns for Changing the Power Consumption Behaviors (전력소비행위 변화를 위한 전력소비패턴 분석 및 적용)

  • Jang, MinSeok; Nam, KwangWoo; Lee, YonSik
    • Journal of the Korea Institute of Information and Communication Engineering / v.25 no.4 / pp.603-610 / 2021
  • In this paper, we extract a user's power consumption patterns and model optimal consumption patterns by applying the user's environment and emotion. Based on a comparative analysis of these two pattern sets, we present an efficient power consumption method through changes in the user's consumption behavior. To extract significant consumption patterns, vector standardization and binary data transformation are used, k-means clustering is applied to learn over the ensemble sets, and a support factor is applied according to the value of k. The optimal consumption pattern model is generated by applying forced and emotion-based control based on the learning results for ensemble sets with relatively low average consumption. Through experiments, we validate that the method can be applied to various windows by adjusting the number or size of clusters, enabling forced and emotion-based control according to the user's intentions by identifying the correlation between the number of clusters and the consistency ratios.
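
The pattern-extraction step can be roughly sketched as standardizing each daily load profile and clustering with plain k-means. The profiles, the 24-point window, and k = 2 are illustrative, not the paper's data or settings:

```python
import random

# Rough sketch: standardise each daily 24-point load profile, then cluster
# the profiles with plain k-means. Synthetic data, illustrative k.

random.seed(1)

def standardize(profile):
    mean = sum(profile) / len(profile)
    std = (sum((v - mean) ** 2 for v in profile) / len(profile)) ** 0.5 or 1.0
    return [(v - mean) / std for v in profile]

def kmeans(points, k, iters=50):
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each profile to its nearest center (squared Euclidean)
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # recompute centers as cluster means; keep old center if cluster empty
        centers = [[sum(col) / len(cl) for col in zip(*cl)] if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Two synthetic consumption shapes: evening-peak vs daytime-peak households.
evening = [standardize([1.0] * 18 + [5.0 + random.random()] * 6) for _ in range(5)]
daytime = [standardize([1.0] * 8 + [5.0 + random.random()] * 8 + [1.0] * 8)
           for _ in range(5)]
centers, clusters = kmeans(evening + daytime, k=2)
print(sorted(len(c) for c in clusters))
```

Varying k and the window size, as the paper describes, amounts to rerunning this step over different slices of the consumption history.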

Facial Expression Recognition with Fuzzy C-Means Clustering Algorithm and Neural Network Based on Gabor Wavelets

  • Youngsuk Shin; Chansup Chung; Lee, Yillbyung
    • Proceedings of the Korean Society for Emotion and Sensibility Conference / 2000.04a / pp.126-132 / 2000
  • This paper presents facial expression recognition based on Gabor wavelets, using a fuzzy C-means (FCM) clustering algorithm and a neural network. Features of facial expressions are extracted in two steps. In the first step, the Gabor wavelet representation provides edge extraction of the major face components, using the average value of the image's 2-D Gabor wavelet coefficient histogram. In the next step, we extract sparse features of facial expressions from the extracted edge information using the FCM clustering algorithm. The facial expression recognition results are compared with dimensional values of internal states derived from semantic ratings of emotion-related words. The dimensional model can recognize not only the six facial expressions related to Ekman's basic emotions but also expressions of various internal states.
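
The first step can be illustrated by building a 2-D Gabor kernel and comparing its response on a vertical edge at two orientations; the wavelength, sigma, and kernel size below are illustrative choices, not the paper's parameters:

```python
import math

# Sketch of Gabor-based edge extraction: a Gabor kernel oriented along an
# edge's gradient responds much more strongly than one oriented across it.
# Parameters (wavelength lam, sigma, 9x9 size) are illustrative.

def gabor_kernel(size, theta, lam=4.0, sigma=2.0):
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # rotate coordinates by theta, then apply Gaussian * cosine
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            g = math.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
            row.append(g * math.cos(2 * math.pi * xr / lam))
        kernel.append(row)
    return kernel

def response(image, kernel):
    """Magnitude of the filter response at the image centre."""
    half = len(kernel) // 2
    cy, cx = len(image) // 2, len(image[0]) // 2
    acc = sum(kernel[j + half][i + half] * image[cy + j][cx + i]
              for j in range(-half, half + 1) for i in range(-half, half + 1))
    return abs(acc)

# 9x9 image with a vertical edge through the centre (dark left, bright right)
edge = [[0.0] * 4 + [1.0] * 5 for _ in range(9)]
r_vertical = response(edge, gabor_kernel(9, theta=0.0))
r_horizontal = response(edge, gabor_kernel(9, theta=math.pi / 2))
print(r_vertical > r_horizontal)
```

A bank of such kernels at several orientations and scales, applied across the whole image, yields the coefficient maps whose histogram the paper thresholds.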

Study of the mechanism of exterior and interior disease by the Nature and the Emotion (성정(性情)에 의한 표리병증(表裏病證)의 발생기전에 관한 고찰(考察))

  • Ko, Woo-Seok; Lee, Soo-Kyung; Lee, Eui-Ju; Ko, Byung-Hee; Song, Il-Byung
    • Journal of Sasang Constitutional Medicine / v.16 no.2 / pp.44-51 / 2004
  • 1. Objectives: This paper aims to clarify the concepts of 'deep adherence of the Nature' and 'sudden explosion of the Emotion' and to deepen understanding of the mechanisms of the Exterior Disease and the Interior Disease caused by the Nature and the Emotion. 2. Methods: The study is mainly based on "The exterior disease is injured by the Nature, the interior disease is injured by the Emotion" and deals with Seongmyungron, Sadanron, Hwarkchungron, and Jangburon. 3. Results and conclusions: (1) In each constitution, exterior disease breaks out through deep adherence of the Nature, while interior disease arises from sudden explosion of the Emotion. (2) 'Deep adherence of the Nature' is the state in which the Ear, Eye, Nose, and Mouth, which manage environmental conditions, cannot function sufficiently because of wrong minds in the Jaw, Breast, Umbilicus, and Abdomen. 'Sudden explosion of the Emotion' is the state in which the Lung, Spleen, Liver, and Kidney, which conduct human affairs, do not act appropriately for each constitution because of negative minds in the Head, Shoulder, Waist, and Hip. (3) Exterior disease in each constitution results from the Ear, Eye, Nose, and Mouth failing to extract clear energy from the 'Jin, Go, You, Yaek sea', leaving a lack of 'Shin, Qi, Hyul, Jeong'. (4) Interior disease in each constitution results from the Lung, Spleen, Liver, and Kidney failing to draw clear juice from the 'Lee, Mak, Hyul, Jeong sea', so that a shortage of 'Jin, Go, You, Yaek' ensues.

An Emotion Recognition Technique using Speech Signals (음성신호를 이용한 감정인식)

  • Jung, Byung-Wook; Cheun, Seung-Pyo; Kim, Youn-Tae; Kim, Sung-Shin
    • Journal of the Korean Institute of Intelligent Systems / v.18 no.4 / pp.494-500 / 2008
  • In developing human interface technology, the interactions between human and machine are important, and research on emotion recognition supports them. This paper presents an algorithm for emotion recognition based on personalized speech signals. The proposed approach extracts speech-signal characteristics for emotion recognition using PLP (perceptual linear prediction) analysis. The PLP technique was originally designed to suppress speaker-dependent components in features used for automatic speech recognition, but later experiments demonstrated its efficiency for speaker recognition tasks. Accordingly, this paper proposes an algorithm that can evaluate personal emotion from speech signals in real time, using personalized emotion patterns built by PLP analysis. The experimental results show that the maximum recognition rate for the speaker-dependent system is above 90%, whereas the average recognition rate is 75%. The proposed system has a simple structure yet is efficient enough to be used in real time.
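
Two stages of PLP can be sketched in isolation: warping a power spectrum onto the Bark scale with triangular critical bands, then applying the cube-root intensity-loudness compression. The band count, sampling rate, and toy spectrum are assumptions; real PLP additionally applies equal-loudness pre-emphasis and fits an all-pole (linear-prediction) model:

```python
import math

# Simplified sketch of two PLP stages: Bark-scale critical-band integration
# of a one-sided power spectrum, followed by cube-root compression.

def hz_to_bark(f):
    # One common approximation of the Bark scale.
    return 6.0 * math.asinh(f / 600.0)

def bark_band_energies(power_spectrum, fs, n_bands=8):
    n = len(power_spectrum)
    freqs = [k * fs / (2.0 * (n - 1)) for k in range(n)]
    max_bark = hz_to_bark(fs / 2.0)
    width = max_bark / (n_bands + 1)
    centers = [(b + 1) * width for b in range(n_bands)]
    energies = []
    for c in centers:
        e = 0.0
        for f, p in zip(freqs, power_spectrum):
            d = abs(hz_to_bark(f) - c)
            if d < width:                      # triangular band weighting
                e += p * (1.0 - d / width)
        energies.append(e ** (1.0 / 3.0))      # intensity-loudness power law
    return energies

# Toy input: a 129-point half-spectrum with all energy in one low bin.
spectrum = [0.0] * 129
spectrum[10] = 100.0
feat = bark_band_energies(spectrum, fs=8000)
print(feat.index(max(feat)))
```

The resulting compact band-energy vector is the kind of per-frame feature a personalized emotion pattern would be built from.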

Emotion Communication through MotionTypography Based on Movement Analysis (모션타이포그래피의 움직임을 통한 감성전달)

  • Son, Min-Jeong; Lee, Hyun-Ju
    • Journal of Digital Contents Society / v.12 no.4 / pp.541-550 / 2011
  • Motion typography is crucial to effective emotional communication in the digital society. In this paper, we study movements that represent emotion through motion typography, with two goals: first, to define an emotional measure based on emotion assessments by the public; second, to investigate the image characteristics corresponding to movements. We collect emotional words through literature and experimental surveys and extract representative emotional words using the KJ method and clustering analysis. As a result, the emotional axes selected for motion typography are 'calm to active' and 'soft to stiff', and viewers feel a specific emotional state from certain movements of motion typography. If the relationship between motion-typographic visual elements and emotional words is also investigated, it can serve as a guideline that helps the public easily produce motion typography.