Title/Summary/Keyword: Facial emotion


Facial expression recognition based on pleasure and arousal dimensions (쾌 및 각성차원 기반 얼굴 표정인식)

  • 신영숙;최광남
    • Korean Journal of Cognitive Science
    • /
    • v.14 no.4
    • /
    • pp.33-42
    • /
    • 2003
  • This paper presents a new system for facial expression recognition based on a dimension model of internal states. The facial expression information is extracted in three steps. In the first step, a Gabor wavelet representation extracts the edges of face components. In the second step, sparse features of facial expressions are extracted by applying the fuzzy C-means (FCM) clustering algorithm to neutral faces, and in the third step, they are extracted by applying the Dynamic Model (DM) to the expression images. Finally, we show that facial expressions can be recognized on the dimension model of internal states using a multi-layer perceptron. The two-dimensional structure of emotion shows that it is possible to recognize not only facial expressions related to basic emotions but also expressions of various other emotions.
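
A minimal sketch of this kind of pipeline, assuming Gabor filtering for edge features, a small hand-rolled fuzzy C-means step, and a multi-layer perceptron over discretized pleasure/arousal labels; the library calls, image size, and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of a Gabor -> fuzzy C-means -> MLP pipeline.
# Assumptions: faces are grayscale arrays; pleasure/arousal targets are
# discretized into class labels. Not the paper's actual implementation.
import numpy as np
from skimage.filters import gabor
from sklearn.neural_network import MLPClassifier

def gabor_features(img):
    """Concatenate Gabor magnitude responses at a few orientations."""
    feats = []
    for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
        real, imag = gabor(img, frequency=0.3, theta=theta)
        feats.append(np.sqrt(real**2 + imag**2).mean(axis=0))
    return np.concatenate(feats)

def fuzzy_cmeans(X, c=8, m=2.0, n_iter=50):
    """Plain fuzzy C-means; returns soft memberships used as sparse features."""
    rng = np.random.default_rng(0)
    U = rng.dirichlet(np.ones(c), size=len(X))          # membership matrix
    for _ in range(n_iter):
        W = U**m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]    # weighted centroids
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)               # renormalize rows
    return U

def train(face_images, y):
    """face_images: list of 2D arrays; y: discretized emotion-dimension labels."""
    G = np.array([gabor_features(im) for im in face_images])
    U = fuzzy_cmeans(G)                                  # sparse soft features
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)
    return clf.fit(np.hstack([G, U]), y)
```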


A Comparative Analysis on Facial Expression in Advertisements -By Utilising Facial Action Coding System(FACS) (광고 속의 얼굴 표정에 따른 비교 연구 -FACS를 활용하여)

  • An, Kyoung Hee
    • The Journal of the Korea Contents Association
    • /
    • v.19 no.3
    • /
    • pp.61-71
    • /
    • 2019
  • Due to the limited time length of advertisements, facial expressions, among the types of nonverbal communication, are much more expressive and convincing in appealing to customers. The purpose of this paper is to investigate not only how facial expressions are portrayed but also how they convey emotion in TV advertisements. The research subjects are two TV advertisements, known as among the most touching commercials, that enjoyed wide popularity with customers. The research method is the Facial Action Coding System (FACS), which is based on the theoretical perspective of discrete emotions and designed to measure specific facial muscle movements. This research analyses the implications of the facial expressions in both TV ads by using FACS, drawing on psychology as well as anatomy. The results show that facial expressions portraying a conflict of emotional states, followed by the heroine's dramatic emotional relief, could move customers' emotions more.
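
FACS codes an expression as a combination of numbered action units (AUs); a hedged sketch of the kind of AU-to-emotion lookup such an analysis relies on is shown below. The prototype combinations follow commonly cited EMFACS-style prototypes and are illustrative, not the paper's coding data.

```python
# Hedged sketch: mapping FACS action-unit (AU) combinations to basic emotions.
# Prototypes are a commonly cited subset; real FACS coding scores AU
# intensities frame by frame rather than matching exact sets.
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser, brow lowerer, lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers, upper lid raiser, jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer, lid raiser/tightener, lip tightener
    "disgust":   {9, 15, 16},    # nose wrinkler, lip corner/lower lip depressors
}

def match_emotions(observed_aus):
    """Return emotions whose prototype AUs are all present in the observed set."""
    observed = set(observed_aus)
    return [emo for emo, aus in EMOTION_PROTOTYPES.items() if aus <= observed]

print(match_emotions([6, 12, 25]))   # ['happiness']
```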

Video-based Facial Emotion Recognition using Active Shape Models and Statistical Pattern Recognizers (Active Shape Model과 통계적 패턴인식기를 이용한 얼굴 영상 기반 감정인식)

  • Jang, Gil-Jin;Jo, Ahra;Park, Jeong-Sik;Seo, Yong-Ho
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.14 no.3
    • /
    • pp.139-146
    • /
    • 2014
  • This paper proposes an efficient method for automatically distinguishing various facial expressions. To recognize emotions from facial expressions, facial images are obtained with digital cameras and a number of feature points are extracted. The extracted feature points are then transformed into 49-dimensional feature vectors that are robust to scale and translational variations, and the facial emotions are recognized by statistical pattern classifiers such as Naive Bayes, the multi-layer perceptron (MLP), and the support vector machine (SVM). In experiments with 5-fold cross validation, the SVM performed best among the classifiers, achieving 50.8% accuracy for six-emotion classification and 78.0% for three emotions.
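
A minimal sketch of the evaluation protocol described here, assuming the 49-dimensional landmark-derived vectors are already computed; the classifier hyperparameters are illustrative, not the paper's settings.

```python
# Sketch of the three-classifier comparison under 5-fold cross validation.
# X: (n_samples, 49) landmark-derived feature vectors; y: emotion labels.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

def compare_classifiers(X, y):
    models = {
        "NaiveBayes": GaussianNB(),
        "MLP": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000),
        "SVM": SVC(kernel="rbf", C=1.0),
    }
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5)   # 5-fold CV accuracy
        print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```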

Toward an integrated model of emotion recognition methods based on reviews of previous work (정서 재인 방법 고찰을 통한 통합적 모델 모색에 관한 연구)

  • Park, Mi-Sook;Park, Ji-Eun;Sohn, Jin-Hun
    • Science of Emotion and Sensibility
    • /
    • v.14 no.1
    • /
    • pp.101-116
    • /
    • 2011
  • Current research on emotion detection classifies emotions by using information from facial, vocal, and bodily expressions, or from physiological responses. This study reviewed three representative emotion recognition methods, each grounded in a psychological theory of emotion. First, the literature on emotion recognition methods based on facial expressions was reviewed; these studies are supported by Darwin's theory. Second, emotion recognition methods based on physiological changes were reviewed; these rely on James' theory. Lastly, emotion recognition based on multimodality (i.e., combining signals from the face, dialogue, posture, or the peripheral nervous system) was reviewed; these studies draw on both Darwin's and James' theories. In each part, research findings were examined along with the theoretical background on which each method relies. This review proposes the need for an integrated model of emotion recognition to advance the field. The integrated model suggests that emotion recognition methods should include other physiological signals, such as brain responses or facial temperature, should be based on a multidimensional model, and should take cognitive appraisal factors during emotional experience into consideration.


Emotion Recognition and Expression System of User using Multi-Modal Sensor Fusion Algorithm (다중 센서 융합 알고리즘을 이용한 사용자의 감정 인식 및 표현 시스템)

  • Yeom, Hong-Gi;Joo, Jong-Tae;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.18 no.1
    • /
    • pp.20-26
    • /
    • 2008
  • As intelligent robots and computers become increasingly common, human-robot (or human-computer) interaction is growing more important, and emotion recognition and expression are indispensable for it. In this paper, we first extract emotional features from speech signals and facial images. Second, we apply both Bayesian Learning (BL) and Principal Component Analysis (PCA); finally, we classify five emotion patterns (normal, happy, angry, surprised, and sad). We also experiment with decision fusion and feature fusion to improve the emotion recognition rate. In the decision fusion method, the outputs of each recognition system are combined through a fuzzy membership function; in the feature fusion method, superior features are selected with the Sequential Forward Selection (SFS) method and fed to a neural network based on a multi-layer perceptron (MLP) to classify the five emotion patterns. The recognized result is then applied to a 2D facial shape to express the emotion.
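
A minimal sketch of the feature-fusion path, assuming pre-extracted speech and facial feature vectors; scikit-learn's SequentialFeatureSelector stands in for the paper's SFS step, and all shapes and settings are illustrative.

```python
# Sketch of feature-level fusion: concatenate speech and face features,
# select a subset by sequential forward selection, classify with an MLP.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neural_network import MLPClassifier

def feature_fusion_train(X_speech, X_face, y, n_select=20):
    X = np.hstack([X_speech, X_face])              # early (feature) fusion
    base = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000)
    sfs = SequentialFeatureSelector(base, n_features_to_select=n_select,
                                    direction="forward")
    X_sel = sfs.fit_transform(X, y)                # greedy forward selection
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000)
    return sfs, clf.fit(X_sel, y)                  # selector + fitted classifier
```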

Divide and Conquer Strategy for CNN Model in Facial Emotion Recognition based on Thermal Images (얼굴 열화상 기반 감정인식을 위한 CNN 학습전략)

  • Lee, Donghwan;Yoo, Jang-Hee
    • Journal of Software Assessment and Valuation
    • /
    • v.17 no.2
    • /
    • pp.1-10
    • /
    • 2021
  • The ability to recognize human emotions with computer vision is a very important task with many potential applications, so the demand for emotion recognition using not only RGB images but also thermal images is increasing. Compared to RGB images, thermal images have the advantage of being less affected by lighting conditions, but their low resolution requires a more sophisticated recognition method. In this paper, we propose a Divide-and-Conquer-based CNN training strategy to improve the performance of emotion recognition from facial thermal images. The proposed method first uses confusion matrix analysis to merge similar, easily confused emotion classes into class groups and trains a classifier over those groups; it then divides the problem so that the emotions within each group are recognized again as the actual emotions. In experiments, the proposed method improved accuracy in all tests compared with recognizing all of the presented emotions with a single CNN model.
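
A hedged sketch of the two-stage idea: find class pairs a first model confuses, group them, then resolve within each group. The grouping heuristic and the generic scikit-learn stand-ins below are illustrative; the paper trains CNNs on thermal images.

```python
# Sketch of a divide-and-conquer classifier: a coarse model predicts a
# class group, then a per-group model resolves the actual emotion.
import numpy as np
from sklearn.metrics import confusion_matrix

def confused_pairs(y_true, y_pred, threshold=0.2):
    """Return label pairs whose mutual confusion rate exceeds threshold."""
    cm = confusion_matrix(y_true, y_pred).astype(float)
    cm /= np.maximum(cm.sum(axis=1, keepdims=True), 1)  # row-normalize
    pairs = []
    for i in range(len(cm)):
        for j in range(i + 1, len(cm)):
            if cm[i, j] + cm[j, i] > threshold:
                pairs.append((i, j))                     # candidates to group
    return pairs

def predict_two_stage(coarse, fine_by_group, X):
    """coarse predicts a group id; fine_by_group[g] resolves the emotion."""
    groups = coarse.predict(X)
    return np.array([fine_by_group[g].predict(x[None])[0]
                     for g, x in zip(groups, X)])
```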

Exploring facial emotion processing in individuals with psychopathic traits during the implicit/explicit tasks: An ERP study (암묵적/외현적 과제에서 나타난 정신병질특성집단의 얼굴 정서 처리: 사건관련전위 연구)

  • Lee, Ye-Ji;Kim, Young Youn
    • Korean Journal of Forensic Psychology
    • /
    • v.12 no.2
    • /
    • pp.99-120
    • /
    • 2021
  • This study examined differences in facial emotion processing related to psychopathic traits. On the basis of the Psychopathic Personality Inventory-Revised (Lee & Park, 2008), students were divided into a psychopathic-trait group (n=15) and a control group (n=15). Participants performed two tasks consisting of negative (angry, fearful, sad) and neutral faces. Event-related potentials (ERPs) were recorded while participants categorized gender in the implicit task and emotion in the explicit task. We analyzed late positive potential (LPP) amplitudes to investigate differences in emotion processing between the psychopathic-trait group and the control group. In the implicit task, there was no significant difference between the groups. However, in the explicit task there was a significant interaction between emotion and group at the frontocentral region: the psychopathic-trait group showed greater LPP amplitudes for the neutral faces than for the negative faces, whereas the control group showed similar LPP amplitudes for the neutral and negative faces at the frontocentral site. These results may reflect abnormalities in emotional processing in individuals with psychopathic traits.
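
For readers unfamiliar with the measure, LPP amplitude is typically quantified as the mean voltage over a late post-stimulus window at selected electrodes; a plain-numpy sketch follows. The 400-800 ms window and frontocentral channel picks are common conventions assumed here, not the paper's exact parameters.

```python
# Sketch: mean LPP amplitude from epoched EEG, per condition.
# epochs: (n_trials, n_channels, n_times) in microvolts, time-locked to
# face onset; sfreq in Hz; tmin is the epoch start relative to onset.
import numpy as np

def lpp_mean_amplitude(epochs, sfreq, tmin=-0.2, win=(0.4, 0.8), picks=None):
    times = np.arange(epochs.shape[-1]) / sfreq + tmin   # seconds
    mask = (times >= win[0]) & (times <= win[1])         # LPP window
    data = epochs if picks is None else epochs[:, picks, :]
    erp = data.mean(axis=0)                              # average over trials
    return erp[:, mask].mean()                           # mean over picks & window
```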

Comparison Between Core Affect Dimensional Structures of Different Ages using Representational Similarity Analysis (표상 유사성 분석을 이용한 연령별 얼굴 정서 차원 비교)

  • Jongwan Kim
    • Science of Emotion and Sensibility
    • /
    • v.26 no.1
    • /
    • pp.33-42
    • /
    • 2023
  • Previous emotion studies employing facial expressions have focused on differences between age groups within each emotion category. Instead, Kim (2021) compared representations of facial expressions in a lower-dimensional emotion space, but reported only descriptive comparisons without statistical significance testing. This research used representational similarity analysis (Kriegeskorte et al., 2008) to directly compare empirical datasets from young, middle-aged, and old groups against conceptual models. In addition, individual-differences multidimensional scaling (Carroll & Chang, 1970) was conducted to explore individual weights on the emotional dimensions for each age group. The results revealed that the old group was the least similar to the other age groups in the empirical datasets and under the valence model, and that the arousal dimension was weighted least in the old group compared with the other groups. This study directly tested the differences between the three age groups in terms of empirical datasets, conceptual models, and weights on the emotion dimensions.
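
A minimal sketch of the RSA step: build a representational dissimilarity matrix (RDM) per group and correlate the RDMs' upper triangles. Euclidean distance and Spearman correlation are common defaults assumed here for illustration.

```python
# Sketch of representational similarity analysis (RSA) between groups.
# ratings: (n_expressions, n_dims) per group, e.g., valence/arousal ratings.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

def rdm(ratings):
    """Representational dissimilarity matrix over facial expressions."""
    return squareform(pdist(ratings, metric="euclidean"))

def rsa_similarity(rdm_a, rdm_b):
    """Spearman correlation of the two RDMs' upper triangles."""
    iu = np.triu_indices_from(rdm_a, k=1)
    rho, p = spearmanr(rdm_a[iu], rdm_b[iu])
    return rho, p
```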

Developmental Changes in Emotional-States and Facial Expression (정서 상태와 얼굴표정간의 연결 능력의 발달)

  • Park, Soo-Jin;Song, In-Hae;Ghim, Hei-Rhee;Cho, Kyung-Ja
    • Science of Emotion and Sensibility
    • /
    • v.10 no.1
    • /
    • pp.127-133
    • /
    • 2007
  • The present study investigated whether the ability to read emotional states from facial expressions changes with age (3-year-olds, 5-year-olds, and university students), sex (male, female), the presented area of the facial expression (whole face, eyes only), and the type of emotion (basic, complex). Thirty-two facial expressions of emotional states that are relatively strongly linked with emotion vocabulary were used as stimuli. The stimuli were collected by photographing professional actors performing the facial expressions. Each participant was presented with stories designed to evoke particular emotions and was then asked to choose the facial expression that the protagonist would have made in the situation described. The results showed that facial expression reading ability improves with age. Participants also performed better in the face condition than in the eyes condition, and with basic emotions than with complex emotions. While females showed no performance difference between the presented areas, males performed better in the face condition than in the eyes condition. The results demonstrate that age, the presented area of the facial expression, and the type of emotion affect the estimation of other people's emotions from facial expressions.


Variation of facial temperature to 3D visual fatigue evoked (3D 시각피로 유발에 따른 안면 온도 변화)

  • Hwang, Sung Teac;Park, SangIn;Won, Myoung Ju;Whang, MinCheol
    • Science of Emotion and Sensibility
    • /
    • v.16 no.4
    • /
    • pp.509-516
    • /
    • 2013
  • As the visual fatigue induced by 3D visual stimulation has raised safety concerns in the industry, this study aims to quantify visual fatigue by measuring facial temperature changes. Facial temperature was measured for one minute before and after watching a visual stimulus. Whether visual fatigue had occurred was assessed through subjective evaluations and high-level cognitive tasks. The difference between the changes that occurred after watching a 2D stimulus and a 3D stimulus was computed in order to associate facial temperature changes with the visual fatigue induced by watching 3D content. The results showed significant differences in the subjective evaluations and in the high-level cognitive tasks, and ERP latency increased after watching 3D stimuli. There were significant differences in the maximum temperature at the forehead and at the tip of the nose. A previous study showed that 3D visual fatigue activates the sympathetic nervous system, whose activation is known to increase the heart rate as well as blood flow into the face through the carotid artery system. When watching 2D or 3D stimuli, sympathetic activation thus dictates the blood flow, which in turn influences facial temperature. This study is meaningful in that it is one of the first investigations into the possibility of measuring 3D visual fatigue with thermal images.
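
A hedged sketch of the quantification described here: average temperature in facial regions of interest before and after viewing, compared with a paired t-test. The ROI coordinates are hypothetical placeholders, and the statistical test is an assumed stand-in for the paper's analysis.

```python
# Sketch: per-ROI facial temperature change before vs. after viewing.
# frames_before/after: (n_frames, H, W) thermal images in degrees Celsius.
import numpy as np
from scipy.stats import ttest_rel

ROIS = {"forehead": (slice(10, 40), slice(60, 120)),
        "nose_tip": (slice(90, 110), slice(80, 100))}   # hypothetical boxes

def roi_means(frames, roi):
    return frames[:, roi[0], roi[1]].mean(axis=(1, 2))   # per-frame ROI mean

def temperature_change(frames_before, frames_after):
    for name, roi in ROIS.items():
        before = roi_means(frames_before, roi)
        after = roi_means(frames_after, roi)
        n = min(len(before), len(after))                 # paired test needs equal n
        t, p = ttest_rel(after[:n], before[:n])
        print(f"{name}: mean change {after.mean() - before.mean():+.2f} C, p={p:.3f}")
```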