• Title/Summary/Keyword: Facial emotion

Search results: 311

Effect of Depressive Mood on Identification of Emotional Facial Expression (우울감이 얼굴 표정 정서 인식에 미치는 영향)

  • Ryu, Kyoung-Hi;Oh, Kyung-Ja
    • Science of Emotion and Sensibility, v.11 no.1, pp.11-21, 2008
  • This study examined the effect of depressive mood on the identification of emotional facial expressions. Participants were screened from 305 college students on the basis of their BDI-II scores: students scoring higher than 14 (upper 20%) were assigned to the Depression Group and those scoring lower than 5 (lower 20%) to the Control Group. A final sample of 20 students in each group was presented with facial expression stimuli of increasing emotional intensity, slowly changing from a neutral expression to a full-intensity happy, sad, angry, or fearful expression. The results showed a significant Group-by-Emotion interaction (especially for happy and sad expressions), suggesting that depressive mood affects the processing of emotional stimuli such as facial expressions. Implications of this result for mood-congruent information processing are discussed.


Implementation of Multi Channel Network Platform based Augmented Reality Facial Emotion Sticker using Deep Learning (딥러닝을 이용한 증강현실 얼굴감정스티커 기반의 다중채널네트워크 플랫폼 구현)

  • Kim, Dae-Jin
    • Journal of Digital Contents Society, v.19 no.7, pp.1349-1355, 2018
  • Recently, a variety of content services over the internet have become popular; among them, MCN (Multi Channel Network) platform services have spread with the generalization of smartphones. The MCN platform is based on streaming, and various features are added to improve the service; among these, augmented reality sticker services using face recognition are widely used. In this paper, we implement an MCN platform that overlays augmented reality stickers on the face through facial emotion recognition in order to further increase viewer interest. Seven facial emotions are classified using deep learning, and the corresponding emotion sticker is applied to the face. To implement the proposed platform, emotion stickers were applied on the client side, and servers capable of streaming the result were designed.
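As a rough illustration of the final step described in this abstract (mapping a deep model's seven-class emotion output to an AR sticker), here is a minimal Python sketch. The emotion labels, sticker file names, and confidence threshold are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Seven emotion classes commonly used in facial emotion recognition;
# the sticker file names below are hypothetical placeholders.
EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "neutral", "sadness", "surprise"]
STICKERS = {e: f"sticker_{e}.png" for e in EMOTIONS}

def softmax(logits):
    """Convert raw model scores into probabilities."""
    z = np.exp(logits - logits.max())
    return z / z.sum()

def pick_sticker(logits, threshold=0.5):
    """Map a model's 7-class output to a sticker; fall back to the
    neutral sticker when the top class is not confident enough."""
    probs = softmax(np.asarray(logits, dtype=float))
    top = int(probs.argmax())
    if probs[top] < threshold:
        return STICKERS["neutral"]
    return STICKERS[EMOTIONS[top]]
```

In a streaming client, `logits` would come from the face-emotion model on each frame; the threshold keeps the sticker from flickering on uncertain predictions.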

Emotion Recognition of Korean and Japanese using Facial Images (얼굴영상을 이용한 한국인과 일본인의 감정 인식 비교)

  • Lee, Dae-Jong;Ahn, Ui-Sook;Park, Jang-Hwan;Chun, Myung-Geun
    • Journal of the Korean Institute of Intelligent Systems, v.15 no.2, pp.197-203, 2005
  • In this paper, we propose an emotion recognition method using facial images to effectively design a human interface. The facial database consists of the six basic human emotions (happiness, sadness, anger, surprise, fear, and disgust), which are known to be common regardless of nation and culture. Emotion recognition is performed on the facial images after applying the discrete wavelet transform; feature vectors are then extracted using PCA and LDA. Experimental results show that happiness, sadness, and anger are recognized better than surprise, fear, and disgust. In particular, Japanese subjects show lower performance for the disgust emotion, and recognition rates for Korean subjects are generally higher than for Japanese subjects.
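The feature-extraction pipeline sketched in this abstract (wavelet decomposition followed by subspace projection) can be illustrated with a minimal numpy sketch. It uses a one-level Haar low-pass approximation and PCA only; the paper's LDA stage and actual face data are not reproduced, and the toy images are random:

```python
import numpy as np

def haar_approx(img):
    """One level of a 2-D Haar wavelet: keep the low-frequency (LL) band
    by averaging each 2x2 block, halving the resolution."""
    h, w = img.shape
    img = img[:h - h % 2, :w - w % 2]
    return (img[0::2, 0::2] + img[0::2, 1::2] +
            img[1::2, 0::2] + img[1::2, 1::2]) / 4.0

def pca_fit(X, n_components):
    """PCA via SVD on mean-centred rows; returns (mean, components)."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def pca_transform(X, mean, components):
    return (X - mean) @ components.T

# Toy data: eight random 16x16 "face" images.
rng = np.random.default_rng(0)
faces = rng.random((8, 16, 16))
ll = np.stack([haar_approx(f) for f in faces])  # (8, 8, 8) LL bands
X = ll.reshape(len(ll), -1)                     # flatten to feature vectors
mean, comps = pca_fit(X, n_components=4)
features = pca_transform(X, mean, comps)        # (8, 4) projected features
```

In the paper's setting, the PCA features would be further projected with LDA before classification; the wavelet step mainly reduces dimensionality and suppresses high-frequency noise first.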

Facial Image Analysis Algorithm for Emotion Recognition (감정 인식을 위한 얼굴 영상 분석 알고리즘)

  • Joo, Y.H.;Jeong, K.H.;Kim, M.H.;Park, J.B.;Lee, J.;Cho, Y.J.
    • Journal of the Korean Institute of Intelligent Systems, v.14 no.7, pp.801-806, 2004
  • Although emotion recognition is an important technology demanded in various fields, it remains an unsolved problem; in particular, algorithms based on human facial images need to be developed. In this paper, we propose a facial image analysis algorithm for emotion recognition, composed of a face region extraction algorithm and a facial component extraction algorithm. To obtain robust performance under various illumination conditions, a fuzzy color filter is proposed for the face region extraction step. In the facial component extraction step, a virtual face model is used to provide prior information for high-accuracy analysis. Finally, simulations are given to check and evaluate the performance.
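The fuzzy color filter idea mentioned in this abstract can be sketched as fuzzy membership functions over skin chromaticity combined with a min t-norm. The membership ranges below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular fuzzy membership function rising from a, peaking at b,
    falling to c; returns values in [0, 1]."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def skin_membership(rgb):
    """Fuzzy skin likelihood for an RGB image with channels in [0, 1].
    Membership is computed over the chromaticities r = R/(R+G+B) and
    g = G/(R+G+B); the ranges below are assumed, not from the paper."""
    s = rgb.sum(axis=-1) + 1e-8
    r, g = rgb[..., 0] / s, rgb[..., 1] / s
    mu_r = triangular(r, 0.30, 0.45, 0.60)   # assumed skin range for r
    mu_g = triangular(g, 0.25, 0.33, 0.40)   # assumed skin range for g
    return np.minimum(mu_r, mu_g)            # fuzzy AND (min t-norm)

# Toy 2x2 image: left column skin-like tones, right column blue and green.
img = np.array([[[0.8, 0.55, 0.40], [0.1, 0.1, 0.9]],
                [[0.7, 0.50, 0.35], [0.0, 1.0, 0.0]]])
mask = skin_membership(img) > 0.5
```

Using chromaticity rather than raw RGB discards overall brightness, which is one way such a filter can stay robust under varying illumination.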

Recognition of Facial Emotion Using Multi-scale LBP (멀티스케일 LBP를 이용한 얼굴 감정 인식)

  • Won, Chulho
    • Journal of Korea Multimedia Society, v.17 no.12, pp.1383-1392, 2014
  • In this paper, we propose a method that automatically determines the optimal radius for facial emotion recognition by generalizing the radius of the LBP operator to multiple scales and applying boosting-based learning. Looking at the distribution of the selected feature vectors, $LBP_{8,1}$ was the most common at 31%, and $LBP_{8,1}$ and $LBP_{8,2}$ together accounted for 57.5%, while $LBP_{8,3}$, $LBP_{8,4}$, and $LBP_{8,5}$ accounted for 18.5%, 12.0%, and 12.0%, respectively. Patterns with relatively larger radii were found to express facial characteristics well. For the neutral and anger expressions, $LBP_{8,1}$ and $LBP_{8,2}$ were dominant, whereas for laughter and surprise the proportion of $LBP_{8,3}$ was greater than or equal to that of $LBP_{8,1}$; radii greater than 1 or 2 were thus useful for recognizing specific emotions. The facial expression recognition rate of the proposed multi-scale LBP method was 97.5%, and its superiority was confirmed through various experiments.
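A basic $LBP_{8,R}$ operator of the kind generalized in this paper can be sketched as follows. This minimal numpy version uses nearest-neighbour sampling on the circle of radius R and is an illustration, not the authors' implementation:

```python
import numpy as np

def lbp_8r(img, radius):
    """Basic LBP_{8,R}: threshold 8 neighbours sampled on a circle of the
    given radius against the centre pixel (nearest-neighbour sampling)."""
    h, w = img.shape
    angles = 2 * np.pi * np.arange(8) / 8
    dy = np.rint(radius * np.sin(angles)).astype(int)
    dx = np.rint(radius * np.cos(angles)).astype(int)
    centre = img[radius:h - radius, radius:w - radius]
    out = np.zeros_like(centre, dtype=np.uint8)
    for bit, (oy, ox) in enumerate(zip(dy, dx)):
        neigh = img[radius + oy:h - radius + oy, radius + ox:w - radius + ox]
        out |= (neigh >= centre).astype(np.uint8) << bit
    return out

def lbp_histogram(img, radius):
    """256-bin histogram of LBP codes, usable as a texture feature vector."""
    codes = lbp_8r(img, radius)
    return np.bincount(codes.ravel(), minlength=256)
```

Concatenating `lbp_histogram(img, r)` for several radii yields the multi-scale feature pool from which a booster can then select the most discriminative radius per emotion.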

A Study on the System of Facial Expression Recognition for Emotional Information and Communication Technology Teaching (감성ICT 교육을 위한 얼굴감성 인식 시스템에 관한 연구)

  • Song, Eun Jee
    • The Journal of Korean Institute for Practical Engineering Education, v.4 no.2, pp.171-175, 2012
  • Recently, research on emotional ICT (Information and Communication Technology), which recognizes and communicates human emotion through information technology, has been increasing. For instance, there is research on phones and services that perceive users' emotions by detecting their voices, facial expressions, and biometric data; in short, emotions that used to be read only by humans are now inferred by digital equipment. Among the many ICT research areas, emotion recognition from the face is expected to be the most effective and natural human interface. This paper surveys emotional ICT and examines the mechanism of a facial expression recognition system as an example.


Development of Emotional Feature Extraction Method based on Advanced AAM (Advanced AAM 기반 정서특징 검출 기법 개발)

  • Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems, v.19 no.6, pp.834-839, 2009
  • Extracting emotional features from facial images is a key element in recognizing a person's emotional state. In this paper, we propose an Advanced AAM, an improved version of a previously proposed facial expression recognition system based on a Bayesian network using FACS and AAM. This is a study of the most efficient way to select optimal facial feature regions for recognizing the emotions of arbitrary users in a generalized HCI system environment. To this end, we apply statistical shape analysis to input images normalized with the Advanced AAM, use FACS as the facial expression and emotion analysis scheme, and study the automatic extraction of emotional features for arbitrary users.

Emotional Expression of the Virtual Influencer "Luo Tianyi (洛天依)" in Digital

  • Guangtao Song;Albert Young Choi
    • International Journal of Advanced Culture Technology, v.12 no.2, pp.375-385, 2024
  • In the context of contemporary digital media, virtual influencers have become an increasingly important form of socialization and entertainment, in which emotional expression is a key factor in attracting viewers. In this study, we take Luo Tianyi, a Chinese virtual influencer, as an example to explore how emotions are expressed and perceived through facial expressions in different types of videos. Using Paul Ekman's Facial Action Coding System (FACS) and six basic emotion classifications, the study systematically analyzes Luo Tianyi's emotional expressions in three types of videos, namely Music show, Festivals and Brand Cooperation. During the study, Luo Tianyi's facial expressions and emotional expressions were analyzed through rigorous coding and categorization, as well as matching the context of the video content. The results show that Enjoyment is the most frequently expressed emotion by Luo Tianyi, reflecting the centrality of positive emotions in content creation. Meanwhile, the presence of other emotion types reveals the virtual influencer's efforts to create emotionally rich and authentic experiences. The frequency and variety of emotions expressed in different video genres indicate Luo Tianyi's diverse strategies for communicating and connecting with viewers in different contexts. The study provides an empirical basis for understanding and utilizing virtual influencers' emotional expressions, and offers valuable insights for digital media content creators to design emotional expression strategies. Overall, this study is valuable for understanding the complexity of virtual influencer emotional expression and its importance in digital media strategy.

Difference of Facial Skin Temperature Responses between Fear and Joy (공포와 기쁨 정서 간 안면온도 반응의 차이)

  • Eum, Yeong-Ji;Eom, Jin-Sup;Sohn, Jin-Hun
    • Science of Emotion and Sensibility, v.15 no.1, pp.1-8, 2012
  • Many emotion studies have investigated physiological responses to specific emotions using parameters such as heart rate, blood volume flow, and skin conductance; very few, however, have detected them through facial skin temperature. The purpose of the present study was to observe differences in facial skin temperature, measured with a thermal camera, while participants watched scenes that evoke fear or joy. There were 98 participants in total: undergraduate students in adulthood and middle and high school students in adolescence. Facial temperature was measured before and after presenting the emotional stimulus to capture the change between the two time points, with temperature values extracted from the forehead, the inner corners of the eyes, the bridge of the nose, the tip of the nose, and the cheeks. Under the fear stimulus, temperatures at the bridge and tip of the nose decreased significantly. Under the joy stimulus, temperatures at the forehead and the inner corners of the eyes increased significantly while the temperature at the tip of the nose decreased; the nose-tip temperature thus decreased under both fear and joy. These results can be described as follows: as arousal rises, sympathetic nervous activity increases, which in turn reduces blood flow in the peripheral vessels under the nose. The facial temperature changes evoked by fear or joy in this study matched previous studies that measured fingertip temperature while participants experienced emotions. Our results may help to develop emotion-measuring techniques and to establish computer systems that detect human emotions.
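The before/after region-of-interest comparison described in this abstract can be sketched as follows. The ROI coordinates and toy temperature values are illustrative assumptions, not the study's data:

```python
import numpy as np

def roi_mean(frame, roi):
    """Mean temperature inside a rectangular ROI given as (y0, y1, x0, x1)."""
    y0, y1, x0, x1 = roi
    return float(frame[y0:y1, x0:x1].mean())

def roi_deltas(before, after, rois):
    """Per-region temperature change (after minus before) for named ROIs."""
    return {name: roi_mean(after, r) - roi_mean(before, r)
            for name, r in rois.items()}

# Toy thermal frames: the nose tip cools by 0.4 degC, the forehead is unchanged.
before = np.full((10, 10), 34.0)
after = before.copy()
after[6:9, 3:7] -= 0.4
rois = {"forehead": (0, 3, 2, 8), "nose_tip": (6, 9, 3, 7)}
deltas = roi_deltas(before, after, rois)
```

With real thermal frames, the sign and magnitude of each ROI delta would then be tested across participants per emotion condition.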


The Effects of Emotional Contexts on Infant Smiling (정서 유발 맥락이 영아의 미소 얼굴 표정에 미치는 영향)

  • Hong, Hee Young;Lee, Young
    • Korean Journal of Child Studies, v.24 no.6, pp.15-31, 2003
  • This study examined the effects of emotion-inducing contexts on the types of infants' smiles. Facial expressions of forty-five 11- to 15-month-old infants were videotaped in an experimental lab under positive and negative emotional contexts. Infants' smiles were identified as Duchenne or non-Duchenne smiles based on FACS (Facial Action Coding System; Ekman & Friesen, 1978), and the duration of each smile type was analyzed. Overall, infants smiled more in the positive than in the negative emotional context. Duchenne smiles were more likely in the positive than in the negative context, and within the positive context more likely in the peek-a-boo than in the melody-toy condition. Non-Duchenne smiling did not differ by context.
