• Title/Summary/Keyword: Emotion Recognition and Expression


Recognition and Generation of Facial Expression for Human-Robot Interaction (로봇과 인간의 상호작용을 위한 얼굴 표정 인식 및 얼굴 표정 생성 기법)

  • Jung Sung-Uk;Kim Do-Yoon;Chung Myung-Jin;Kim Do-Hyoung
    • Journal of Institute of Control, Robotics and Systems, v.12 no.3, pp.255-263, 2006
  • Over the last decade, face analysis (e.g., face detection, face recognition, and facial expression recognition) has been a lively and expanding research field. As computer-animated agents and robots bring a social dimension to human-computer interaction, interest in this field is growing rapidly. In this paper, we introduce an artificial emotion mimic system that can recognize human facial expressions and also generate the recognized facial expression. To recognize human facial expressions in real time, we propose a facial expression classification method performed by weak classifiers obtained using new rectangular feature types. In addition, we generate artificial facial expressions using a robotic system developed on the basis of biological observation. Finally, experimental results on facial expression recognition and generation demonstrate the validity of our robotic system.
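
As a rough illustration of the recognition side, the sketch below shows how a generic rectangular (Haar-like) feature can be evaluated in constant time from an integral image and fed to a threshold weak classifier. The paper's new rectangular feature types are not reproduced here, so every function below is an illustrative assumption about the general technique.

```python
import numpy as np

def integral_image(img):
    """Cumulative sums so any rectangle sum costs four lookups."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] (exclusive ends) from integral image ii."""
    total = ii[r1 - 1, c1 - 1]
    if r0 > 0:
        total -= ii[r0 - 1, c1 - 1]
    if c0 > 0:
        total -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0:
        total += ii[r0 - 1, c0 - 1]
    return total

def two_rect_feature(ii, r, c, h, w):
    """Generic two-rectangle feature: left half minus right half."""
    left = rect_sum(ii, r, c, r + h, c + w // 2)
    right = rect_sum(ii, r, c + w // 2, r + h, c + w)
    return left - right

def weak_classify(value, threshold, polarity=1):
    """Threshold weak learner of the kind combined by boosting."""
    return 1 if polarity * value < polarity * threshold else 0
```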

Classification of Three Different Emotion by Physiological Parameters

  • Jang, Eun-Hye;Park, Byoung-Jun;Kim, Sang-Hyeob;Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea, v.31 no.2, pp.271-279, 2012
  • Objective: This study classified three different emotional states (boredom, pain, and surprise) using physiological signals. Background: Emotion recognition studies have tried to recognize human emotions from physiological signals, and such recognition is important for applying emotion detection to human-computer interaction systems. Method: A total of 122 college students participated in this experiment. Three different emotional stimuli were presented to the participants, and physiological signals, i.e., EDA (electrodermal activity), SKT (skin temperature), PPG (photoplethysmogram), and ECG (electrocardiogram), were measured for 1 minute as a baseline and for 1 to 1.5 minutes during the emotional state. The signals were analyzed over 30-second windows from both the baseline and the emotional state, and 27 features were extracted. Emotion classification was performed by discriminant function analysis (DFA; SPSS 15.0) using the difference values obtained by subtracting the baseline values from the emotional-state values. Results: Physiological responses during the emotional states differed significantly from those during the baseline, and the emotion classification accuracy was 84.7%. Conclusion: Our study showed that emotions can be classified from various physiological signals. However, future work is needed to obtain additional signals from other modalities, such as facial expression, face temperature, or voice, to improve the classification rate, and to examine the stability and reliability of this result compared with the accuracy of emotion classification using other algorithms. Application: This work improves the chances of recognizing various human emotions from physiological signals and can be applied to human-computer interaction systems for emotion recognition. It can also be useful in developing emotion theory, profiling emotion-specific physiological responses, and establishing the basis for emotion recognition systems in human-computer interaction.
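
The classification step can be pictured with the following minimal sketch, which mirrors the baseline-subtraction and discriminant-analysis procedure using scikit-learn's LDA as a stand-in for SPSS DFA. The arrays and their contents are hypothetical placeholders, not the study's data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder arrays: one row per trial, 27 columns for the features
# extracted from the EDA/SKT/PPG/ECG windows.
baseline_feats = rng.random((122, 27))   # 30 s baseline window
emotion_feats = rng.random((122, 27))    # 30 s emotional-state window
labels = rng.integers(0, 3, 122)         # 0=boredom, 1=pain, 2=surprise

# The paper classifies the difference (emotion minus baseline), which
# removes per-participant offsets in the physiological signals.
X = emotion_feats - baseline_feats

# LDA is the scikit-learn analogue of SPSS discriminant function analysis.
lda = LinearDiscriminantAnalysis()
print(cross_val_score(lda, X, labels, cv=5).mean())
```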

Korean Facial Expression Emotion Recognition based on Image Meta Information (이미지 메타 정보 기반 한국인 표정 감정 인식)

  • Hyeong Ju Moon;Myung Jin Lim;Eun Hee Kim;Ju Hyun Shin
    • Smart Media Journal, v.13 no.3, pp.9-17, 2024
  • Due to the recent pandemic and the development of ICT, the use of non-face-to-face and unmanned systems is expanding, and understanding emotions in non-face-to-face communication has become very important. Because emotion recognition methods for various facial expressions are required to understand emotions, artificial intelligence research is being conducted to improve facial expression emotion recognition on image data. However, existing research on facial expression emotion recognition requires high computing power and long training times because it relies on large amounts of data to improve accuracy. To address these limitations, this paper proposes a method that uses age and gender, which are image meta information, to recognize facial expressions even with a small amount of data. For facial expression emotion recognition, a face is detected in the original image with the YOLO Face model, age and gender are classified with a VGG model to obtain the image meta information, and seven emotions are then recognized with an EfficientNet model. When the proposed meta-information-based classification model was compared with a model trained on all the data, the proposed model achieved higher accuracy.
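
The three-stage pipeline can be summarized in the following sketch. The callables `detect_face`, `classify_age_gender`, and `recognize_emotion` are hypothetical stand-ins for the YOLO Face, VGG, and EfficientNet models, since the authors' code is not given here.

```python
from typing import Callable, Tuple
import numpy as np

def run_pipeline(
    image: np.ndarray,
    detect_face: Callable[[np.ndarray], Tuple[int, int, int, int]],
    classify_age_gender: Callable[[np.ndarray], Tuple[str, str]],
    recognize_emotion: Callable[[np.ndarray, str, str], str],
) -> str:
    # Stage 1: face detection (YOLO Face in the paper).
    x, y, w, h = detect_face(image)
    face = image[y:y + h, x:x + w]

    # Stage 2: meta information (VGG-based age/gender in the paper);
    # the meta labels route the face to an age/gender-specific model.
    age_group, gender = classify_age_gender(face)

    # Stage 3: 7-class emotion recognition (EfficientNet in the paper),
    # trained per meta-information subset of the data.
    return recognize_emotion(face, age_group, gender)
```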

Difficulty in Facial Emotion Recognition in Children with ADHD (주의력결핍 과잉행동장애의 이환 여부에 따른 얼굴표정 정서 인식의 차이)

  • An, Na Young;Lee, Ju Young;Cho, Sun Mi;Chung, Young Ki;Shin, Yun Mi
    • Journal of the Korean Academy of Child and Adolescent Psychiatry, v.24 no.2, pp.83-89, 2013
  • Objectives : It is known that children with attention-deficit hyperactivity disorder (ADHD) experience significant difficulty in recognizing facial emotion, which involves processing of emotional facial expressions rather than speech, compared to children without ADHD. The objective of this study was to investigate the differences in facial emotion recognition between children with ADHD and normal control children. Methods : The children in our study were recruited from the Suwon Project, a cohort comprising a non-random convenience sample of 117 nine-year-old ethnic Koreans. The parents of the participants completed questionnaires including the Korean version of the Child Behavior Checklist, the ADHD Rating Scale, and the Kiddie-Schedule for Affective Disorders and Schizophrenia-Present and Lifetime Version. The Facial Expression Recognition Test of the Emotion Recognition Test was used to evaluate facial emotion recognition, and the ADHD Rating Scale was used to assess ADHD. Results : Children with ADHD (N=10) showed impaired Emotional Differentiation and Contextual Understanding compared with normal controls (N=24). We found no statistically significant difference between the two groups in the recognition of positive facial emotions (happiness and surprise) or negative facial emotions (anger, sadness, disgust, and fear). Conclusion : The results of our study suggest that facial emotion recognition may be closely associated with ADHD, after controlling for covariates, although more research is needed.

The Effect of Impulsivity and the Ability to Recognize Facial Emotion on the Aggressiveness of Children with Attention-Deficit Hyperactivity Disorder (주의력결핍 과잉행동장애 아동에서 감정인식능력 및 충동성이 공격성에 미치는 영향)

  • Bae, Seung-Min;Shin, Dong-Won;Lee, Soo-Jung
    • Journal of the Korean Academy of Child and Adolescent Psychiatry, v.20 no.1, pp.17-22, 2009
  • Objectives : A higher level of aggression has been reported for children with attention-deficit/hyperactivity disorder (ADHD) than for non-ADHD children, and aggression has been shown to have a negative effect on the social functioning of children with ADHD. The ability to recognize facial emotion expressions has also been related to aggression. In this study, we examined whether impulsivity and dysfunctional recognition of facial emotion expressions could explain the aggressiveness of children with ADHD. Methods : Sixty-seven children with ADHD participated in this study. We measured the ability to recognize facial emotion expressions using the Emotion Recognition Test (ERT), aggression using the T score of the aggression subscale of the Child Behavior Checklist (CBCL), and impulsivity using the ADHD Diagnostic System (ADS). Results : The teacher-rated level of aggression was related to the score for recognizing negative affect, but after controlling for the effect of impulsivity this relationship was no longer significant. Only the visual commission error score explained the level of aggression of children with ADHD. Conclusion : Impulsivity seems to play a major role in explaining the aggression of children with ADHD. The clinical implication of this study is that effective intervention for controlling impulsivity may be expected to reduce aggression in children with ADHD.
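
The covariate logic of this analysis can be illustrated with a two-step hierarchical regression sketch: test whether emotion-recognition scores still explain aggression once impulsivity is entered first. The data and variable names below are invented placeholders, with statsmodels standing in for whatever package the authors used.

```python
import numpy as np
import statsmodels.api as sm

n = 67
rng = np.random.default_rng(0)
impulsivity = rng.normal(size=n)   # e.g., ADS visual commission errors
neg_affect = rng.normal(size=n)    # e.g., ERT negative-affect score
aggression = impulsivity + rng.normal(size=n)  # e.g., CBCL aggression T score

# Step 1: impulsivity alone.
step1 = sm.OLS(aggression, sm.add_constant(impulsivity)).fit()

# Step 2: add emotion recognition; if its coefficient loses significance,
# impulsivity accounts for the aggression-recognition link.
X2 = sm.add_constant(np.column_stack([impulsivity, neg_affect]))
step2 = sm.OLS(aggression, X2).fit()
print(step1.rsquared, step2.rsquared, step2.pvalues)
```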


Hybrid Facial Representations for Emotion Recognition

  • Yun, Woo-Han;Kim, DoHyung;Park, Chankyu;Kim, Jaehong
    • ETRI Journal, v.35 no.6, pp.1021-1028, 2013
  • Automatic facial expression recognition is a widely studied problem in computer vision and human-robot interaction. There has been a range of studies on facial descriptors for facial expression recognition, and some prominent descriptors were presented in the first Facial Expression Recognition and Analysis challenge (FERA2011). In that competition, the Local Gabor Binary Pattern Histogram Sequence descriptor showed the most powerful description capability. In this paper, we introduce hybrid facial representations for facial expression recognition that have more powerful description capability with lower dimensionality. Our descriptors consist of a block-based descriptor and a pixel-based descriptor: the block-based descriptor represents micro-orientation and micro-geometric structure information, and the pixel-based descriptor represents texture information. We validate our descriptors on two public databases, and the results show that they perform well with relatively low dimensionality.
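
A hybrid block-based-plus-pixel-based structure of this kind might be sketched as follows. The concrete choices here (uniform LBP histograms per block plus downsampled raw intensities) are illustrative assumptions, not the authors' actual descriptors.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def hybrid_descriptor(face, grid=(4, 4), P=8, R=1):
    """face: 2-D grayscale array. Returns one concatenated feature vector."""
    lbp = local_binary_pattern(face, P, R, method="uniform")
    n_bins = P + 2  # number of uniform LBP codes

    # Block-based part: per-block histograms keep coarse facial geometry.
    bh, bw = face.shape[0] // grid[0], face.shape[1] // grid[1]
    block_hists = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = lbp[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            hist, _ = np.histogram(block, bins=n_bins, range=(0, n_bins))
            block_hists.append(hist / max(hist.sum(), 1))

    # Pixel-based part: downsampled intensities as global texture cues.
    pixels = face[::8, ::8].astype(float).ravel()
    pixels /= max(np.linalg.norm(pixels), 1e-8)

    return np.concatenate([np.concatenate(block_hists), pixels])
```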

Emotion Labor and Emotional Exhaustion : The Role of Emotional Intelligence (감정노동, 감성지능이 종업원의 감정고갈에 미치는 영향에 관한 연구)

  • Hong, Yong-Ki
    • Management & Information Systems Review, v.25, pp.243-273, 2008
  • A new research paradigm is emerging within organizational behavior, in both theory and empiricism, based on increasing recognition of the importance of emotions in organizational life. This paper suggests that emotional intelligence acts as a moderating variable in the relationship between emotional labor and emotional exhaustion. More specifically, it proposes that emotional intelligence, the ability to understand and manage emotions in oneself and others, contributes to effective emotion management in organizations. Four major aspects of emotional intelligence are described: appraisal and expression of emotion in oneself, appraisal and recognition of emotion in others, regulation of emotion in oneself, and use of emotion to facilitate performance. Emotional labor likewise consists of four aspects: frequency of appropriate emotional display, attentiveness to required display rules, variety of emotions to be displayed, and emotional dissonance. The purpose of this research is to investigate the impact of emotional labor on employees' emotional exhaustion and to explore the moderating effect of emotional intelligence between the two. Data were collected through a questionnaire from 147 employees of a service company. Hierarchical multiple regression analysis showed that employees' emotional exhaustion is negatively affected by emotional labor, and that two aspects of emotional intelligence, regulation of emotion in oneself and use of emotion to facilitate performance, moderate the relationship between emotional labor and emotional exhaustion. These results indicate the importance of encouraging genuine expression of individual emotions, generating and maintaining a sound emotional climate and cooperative conditions, and managing a meaningful environment for organizational life.
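
In regression terms, the moderation hypothesis amounts to a significant labor-by-intelligence interaction after the main effects, as in the following sketch. The generated variables merely stand in for the survey scales (N=147 in the study).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
labor = rng.normal(size=147)   # emotional-labor scale (mean-centered)
ei = rng.normal(size=147)      # emotional-intelligence scale (mean-centered)
exhaustion = labor - 0.3 * labor * ei + rng.normal(size=147)

# Columns: constant, main effects, then the interaction term.
X = sm.add_constant(np.column_stack([labor, ei, labor * ei]))
model = sm.OLS(exhaustion, X).fit()
print(model.params, model.pvalues[3])  # interaction p-value = moderation test
```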


Effects of the facial expression presenting types and facial areas on the emotional recognition (얼굴 표정의 제시 유형과 제시 영역에 따른 정서 인식 효과)

  • Lee, Jung-Hun;Park, Soo-Jin;Han, Kwang-Hee;Ghim, Hei-Rhee;Cho, Kyung-Ja
    • Science of Emotion and Sensibility, v.10 no.1, pp.113-125, 2007
  • The aim of the experimental studies described in this paper was to investigate the effects of facial area (face/eyes/mouth) and presentation type (dynamic vs. static facial expressions) on emotion recognition. Using seven-second displays, Experiment 1 examined basic emotions and Experiment 2 examined complex emotions. The results of the two experiments showed that dynamic facial expressions produced higher emotion recognition than static ones, and that for dynamic images the eye area had a stronger recognition effect than the mouth area. These results suggest that dynamic properties should be considered in studies of emotion with facial expressions, for complex as well as basic emotions. However, the properties of each emotion must be considered, because the emotions did not benefit equally from dynamic presentation. Furthermore, this study shows that which facial area conveys an emotional state most accurately depends on the particular emotion.
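
The 2 (presentation type) x 3 (facial area) design implied above corresponds to a two-way ANOVA on recognition accuracy. The sketch below uses randomly generated data purely to show the design, not the study's results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "ptype": np.repeat(["dynamic", "static"], 90),
    "area": np.tile(np.repeat(["face", "eyes", "mouth"], 30), 2),
    "accuracy": rng.normal(0.7, 0.1, 180),  # placeholder scores
})
model = ols("accuracy ~ C(ptype) * C(area)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # main effects + interaction
```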


Recognition of Facial Emotion Using Multi-scale LBP (멀티스케일 LBP를 이용한 얼굴 감정 인식)

  • Won, Chulho
    • Journal of Korea Multimedia Society, v.17 no.12, pp.1383-1392, 2014
  • In this paper, we propose a method for facial emotion recognition that automatically determines the optimal radius through a multi-scale LBP operation, generalizing the variation of the radius, combined with boosted learning. Examining the distribution of selected feature vectors, $LBP_{8,1}$ was the most common at 31%, and $LBP_{8,1}$ and $LBP_{8,2}$ together accounted for 57.5%, while $LBP_{8,3}$, $LBP_{8,4}$, and $LBP_{8,5}$ accounted for 18.5%, 12.0%, and 12.0%, respectively. Patterns with relatively large radii were thus found to express facial characteristics well. For neutral and angry expressions, $LBP_{8,1}$ and $LBP_{8,2}$ were the main contributors, whereas for laughter and surprise the share of $LBP_{8,3}$ was greater than or equal to that of $LBP_{8,1}$. Radii greater than 1 or 2 were therefore useful for recognizing specific emotions. The facial expression recognition rate of the proposed multi-scale LBP method was 97.5%, and the superiority of the proposed method was confirmed through various experiments.
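
The multi-scale idea can be sketched as follows: compute uniform LBP histograms at radii 1 through 5 (the $LBP_{8,1}$ to $LBP_{8,5}$ patterns above), concatenate them, and let a boosted learner weight the useful radii. The helper below is an assumption about the general technique, not the paper's implementation.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def multiscale_lbp_histogram(face, P=8, radii=(1, 2, 3, 4, 5)):
    """face: 2-D grayscale array. One histogram block per radius."""
    feats = []
    for R in radii:
        lbp = local_binary_pattern(face, P, R, method="uniform")
        hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2))
        feats.append(hist / max(hist.sum(), 1))
    return np.concatenate(feats)

# A booster (e.g., sklearn's AdaBoostClassifier) trained on these stacked
# histograms implicitly selects the most useful radius per emotion, which
# is how radius-specific statistics like those above can be read off.
```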

Hand Gesture Recognition Using an Infrared Proximity Sensor Array

  • Batchuluun, Ganbayar;Odgerel, Bayanmunkh;Lee, Chang Hoon
    • International Journal of Fuzzy Logic and Intelligent Systems, v.15 no.3, pp.186-191, 2015
  • Hand gestures are the most common tool used to interact with and control various electronic devices. In this paper, we propose a novel hand gesture recognition method using fuzzy-logic-based classification with a new type of sensor array. In some cases, the feature patterns of hand gesture signals cannot be uniquely distinguished and recognized when people perform the same gesture in different ways; moreover, differences in hand shape and in the skeletal articulation of the arm influence the process. A variety of features were extracted, and efficient features that make gestures distinguishable were selected. However, similar feature patterns exist across different hand gestures, so fuzzy logic is applied to classify them. Fuzzy rules are defined based on the many feature patterns of the input signal, and an adaptive neuro-fuzzy inference system was used to generate the fuzzy rules automatically, classifying hand gestures from a small number of feature patterns as input. In addition, emotion expression was performed after hand gesture recognition for the resulting human-robot interaction. The proposed method was tested on many hand gesture datasets and validated with different evaluation metrics. Experimental results show that, compared with existing methods, our method recognizes more hand gestures robustly and produces the corresponding emotion expressions in real time.
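
Fuzzy-rule classification of gesture features can be pictured with the toy sketch below. The membership functions, features, and gesture labels are all invented for illustration; the paper itself learns its rules with ANFIS rather than hand-coding them.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def classify(duration, peak):
    """Classify a gesture from two hypothetical IR-proximity features."""
    # Fuzzify each feature into linguistic terms.
    short, long_ = tri(duration, 0, 0.2, 0.5), tri(duration, 0.3, 1.0, 2.0)
    weak, strong = tri(peak, 0, 0.2, 0.6), tri(peak, 0.4, 1.0, 1.5)

    # Rule strengths via min (fuzzy AND); the strongest rule wins.
    rules = {
        "swipe": min(short, strong),
        "hover": min(long_, weak),
        "press": min(long_, strong),
    }
    return max(rules, key=rules.get)

print(classify(duration=0.15, peak=0.9))  # -> "swipe"
```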