• Title/Summary/Keyword: Facial Emotion Expression


A Study on The Expression of Digital Eye Contents for Emotional Communication (감성 커뮤니케이션을 위한 디지털 눈 콘텐츠 표현 연구)

  • Lim, Yoon-Ah; Lee, Eun-Ah; Kwon, Jieun
    • Journal of Digital Convergence, v.15 no.12, pp.563-571, 2017
  • The purpose of this paper is to establish the emotional expression factors of digital eye contents that can be applied to digital environments. The emotions applicable to a smart doll are derived, and we suggest guidelines for the expressive factors of each emotion. First, we review the concepts and characteristics of emotional expression shown in eyes, drawing on publications, animation, and actual video. Second, we identify six emotions - Happy, Angry, Sad, Relaxed, Sexy, Pure - and extract the emotional expression factors. Third, we analyze the extracted factors to establish a guideline for the emotional expression of digital eyes. As a result, this study finds that the factors that distinguish and represent each emotion fall into four categories: eye shape, gaze, iris size, and effect. These can be used to enhance emotional communication in digital contents such as animations, robots, and smart toys.
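The abstract names four expressive categories (eye shape, gaze, iris size, effect) across six emotions. A minimal sketch of how such a guideline might be encoded follows; the per-emotion values below are placeholders for illustration, not the paper's actual guideline.

```python
# Hypothetical encoding of the four expressive categories for the six
# target emotions; the concrete values are illustrative placeholders.
EYE_EXPRESSION_GUIDELINE = {
    #  emotion : (eye_shape,      gaze,       iris_size, effect)
    "happy":   ("curved_up",     "forward",  "normal",  "sparkle"),
    "angry":   ("slanted_inner", "forward",  "small",   "none"),
    "sad":     ("curved_down",   "downward", "large",   "tears"),
    "relaxed": ("half_closed",   "forward",  "normal",  "none"),
    "sexy":    ("narrowed",      "sideways", "large",   "highlight"),
    "pure":    ("round",         "upward",   "large",   "sparkle"),
}

def eye_parameters(emotion):
    """Look up the four expressive factors for one of the six emotions."""
    shape, gaze, iris, effect = EYE_EXPRESSION_GUIDELINE[emotion]
    return {"eye_shape": shape, "gaze": gaze, "iris_size": iris, "effect": effect}
```

A smart-doll renderer could consume such a table directly, mapping a recognized emotion to concrete drawing parameters.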

Improving the Processing Speed and Robustness of Face Detection for a Psychological Robot Application (심리로봇적용을 위한 얼굴 영역 처리 속도 향상 및 강인한 얼굴 검출 방법)

  • Ryu, Jeong Tak; Yang, Jeen Mo; Choi, Young Sook; Park, Se Hyun
    • Journal of Korea Society of Industrial Information Systems, v.20 no.2, pp.57-63, 2015
  • Compared to other emotion recognition technologies, facial expression recognition has the merits of being non-contact, non-intrusive, and convenient. To be applied to a psychological robot, the vision system must quickly and accurately extract the face region as a preliminary step to facial expression recognition. In this paper, we remove the background from an input image using YCbCr skin-color segmentation and then use Haar-like features for robust face detection. Removing the background from the input image improved processing speed and yielded robust face detection.
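The skin-segmentation step of the pipeline described above can be sketched as follows. The CbCr thresholds used here are commonly cited values for skin detection; the paper's exact bounds are not given, so treat them as assumptions.

```python
def rgb_to_ycbcr(r, g, b):
    """Convert one 8-bit RGB pixel to YCbCr (ITU-R BT.601, full range)."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b, cb_range=(77, 127), cr_range=(133, 173)):
    """Classify a pixel as skin using widely cited CbCr bounds
    (an assumption; the paper does not state its thresholds)."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return cb_range[0] <= cb <= cb_range[1] and cr_range[0] <= cr <= cr_range[1]

def skin_mask(image):
    """image: rows of (r, g, b) tuples -> rows of booleans marking skin candidates.
    Non-skin pixels would be zeroed out before running the Haar-like face detector."""
    return [[is_skin(*px) for px in row] for row in image]
```

In practice, the masked image would then be passed to a Haar-cascade detector (e.g. OpenCV's `CascadeClassifier`), which now scans far fewer candidate windows.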

Brain Activation to Facial Expressions Among Alcoholics (알코올 중독자의 얼굴 표정 인식과 관련된 뇌 활성화 특성)

  • Park, Mi-Sook; Lee, Bae Hwan; Sohn, Jin-Hun
    • Science of Emotion and Sensibility, v.20 no.4, pp.1-14, 2017
  • The purpose of this study was to investigate the neural substrates for recognizing facial expressions among alcoholics by using functional magnetic resonance imaging (fMRI). Abstinent inpatient alcoholics (n=18 males) and demographically similar social drinkers (n=16 males) participated in the study. The participants viewed pictures from the Japanese Female Facial Expression Database (JAFFE) and evaluated the intensity of the facial expressions. The alcoholics showed reduced activation in limbic areas, including the amygdala and hippocampus, while recognizing emotional facial expressions compared to the nonalcoholic controls. On the other hand, the alcoholics showed greater brain activation than the controls in the left lingual (BA 19)/fusiform gyrus, the left middle frontal gyrus (BA 8/9/46), and the right superior parietal lobule (BA 7) during the viewing of emotional faces. In sum, specific brain regions associated with the recognition of facial expressions among alcoholics were identified. The findings of the present study could inform the development of interventions for alcoholism.

Development of an Emotion Recognition Robot using a Vision Method (비전 방식을 이용한 감정인식 로봇 개발)

  • Shin, Young-Geun; Park, Sang-Sung; Kim, Jung-Nyun; Seo, Kwang-Kyu; Jang, Dong-Sik
    • IE interfaces, v.19 no.3, pp.174-180, 2006
  • This paper deals with a robot system that recognizes a human's expression from a detected face and then displays the corresponding emotion. The face detection method is as follows. First, the RGB color space is converted to the CIELab color space. Second, skin candidate regions are extracted. Third, a face is detected through geometric interrelations using a face filter. The positions of the eyebrows, eyes, and mouth are then extracted as preliminary data for expression recognition, and their changes are sent to a robot through serial communication. The robot operates an installed motor to display the human's expression. Experimental results on 10 persons show 78.15% accuracy.
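The serial-communication step, in which facial-feature changes are sent to the robot, might be sketched as below. The frame layout (STX byte, three payload bytes, checksum, ETX byte) and the -100..100 value range are hypothetical illustrations; the paper does not specify its protocol.

```python
def pack_expression_frame(eyebrow, eye, mouth):
    """Pack three feature-change values (assumed range -100..100) into a framed
    byte message: 0x02 start, 3-byte payload, checksum, 0x03 end.
    This frame format is a hypothetical sketch, not the paper's protocol."""
    payload = bytes((v + 100) & 0xFF for v in (eyebrow, eye, mouth))
    checksum = sum(payload) & 0xFF
    return bytes([0x02]) + payload + bytes([checksum, 0x03])

# The resulting frame could be written to the robot with pyserial, e.g.:
#   import serial
#   ser = serial.Serial("/dev/ttyUSB0", 9600)   # port/baud are assumptions
#   ser.write(pack_expression_frame(10, -5, 30))
```

The checksum lets the robot-side firmware reject corrupted frames before driving its motors.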

Emotional Interface Technologies for Service Robot (서비스 로봇을 위한 감성인터페이스 기술)

  • Yang, Hyun-Seung; Seo, Yong-Ho; Jeong, Il-Woong; Han, Tae-Woo; Rho, Dong-Hyun
    • The Journal of Korea Robotics Society, v.1 no.1, pp.58-65, 2006
  • The emotional interface is an essential technology for a robot to provide proper service to the user. In this research, we developed emotional components for the service robot: a neural-network-based facial expression recognizer, emotion expression technologies based on 3D graphical facial expression and joint movements that consider a user's reaction, and behavior selection technology for emotion expression. We used our humanoid robots, AMI and AMIET, as the test-beds of our emotional interface, and studied the emotional interaction between a service robot and a user by integrating the developed technologies. The emotional interface technology can enhance the performance of friendly interaction with the service robot, increase the diversity of its services and its value-added for humans, and thereby promote market growth and contribute to the popularization of robots.


Multimodal Parametric Fusion for Emotion Recognition

  • Kim, Jonghwa
    • International journal of advanced smart convergence, v.9 no.1, pp.193-201, 2020
  • The main objective of this study is to investigate the impact of additional modalities on the performance of emotion recognition using speech, facial expressions, and physiological measurements. In order to compare different approaches, we designed a feature-based recognition system as a benchmark, which carries out linear supervised classification followed by leave-one-out cross-validation. For the classification of four emotions, it turned out that bimodal fusion in our experiment improves the recognition accuracy of the unimodal approach, while the performance of trimodal fusion varies strongly depending on the individual. Furthermore, we observed extremely high disparity between single-class recognition rates, while no single modality performed best in our experiment. Based on these observations, we developed a novel fusion method, called parametric decision fusion (PDF), which builds emotion-specific classifiers and exploits the advantage of a parametrized decision process. Using the PDF scheme, we achieved a 16% improvement in accuracy for subject-dependent recognition and 10% for subject-independent recognition compared to the best unimodal results.
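The core idea of parametric decision fusion - giving each emotion its own set of per-modality weights rather than one global fusion rule - can be sketched as follows. This is a simplified illustration under the assumption that the learned parameters are per-class linear weights; the paper's exact parametrization is not given in the abstract.

```python
def parametric_decision_fusion(modality_scores, class_weights):
    """Fuse per-modality classifier scores with emotion-specific weights.

    modality_scores: {modality: {emotion: score}}  (outputs of unimodal classifiers)
    class_weights:   {emotion: {modality: weight}} (the learned fusion parameters)

    Returns (best_emotion, fused_scores). Simplified sketch of the PDF idea.
    """
    fused = {}
    for emotion, weights in class_weights.items():
        # Each emotion combines the modalities with its own weighting,
        # so a modality that is unreliable for one class can still
        # dominate the decision for another.
        fused[emotion] = sum(weights[m] * modality_scores[m][emotion]
                             for m in weights)
    return max(fused, key=fused.get), fused
```

For example, with speech scoring {joy: 0.8, anger: 0.2} and face scoring {joy: 0.4, anger: 0.6}, class-specific weights can let speech dominate the joy decision while the face channel dominates anger.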

Emotional Recognition According to General Characteristics of Stroke Patients (뇌졸중 환자의 일반적 특성에 따른 정서인식의 차이)

  • Park, Sungho; Kim, Minho
    • Journal of The Korean Society of Integrative Medicine, v.3 no.1, pp.63-69, 2015
  • Purpose: The purpose of this study was to investigate differences in emotion recognition according to the general characteristics of stroke patients. Method: The subjects were 38 stroke patients receiving rehabilitation at S Hospital in Busan. The eMETT program was used to assess emotional recognition. Result: Age and duration of disease showed statistically significant differences in emotion recognition ability scores, and gender and lesion showed statistically significant differences for some emotions (p<.05). Conclusion: The results show that emotion recognition ability differs according to the general characteristics of stroke patients. Future research should include standardized studies and interventions targeting stroke patients and normal controls.

Facial expression recognition-based contents preference inference system (얼굴 표정 인식 기반 컨텐츠 선호도 추론 시스템)

  • Lee, Yeon-Gon; Cho, Durkhyun; Jang, Jun Ik; Suh, Il Hong
    • Proceedings of the Korean Society of Computer Information Conference, 2013.01a, pp.201-204, 2013
  • As the variety and volume of digital contents have grown explosively, content preference voting has gained strong influence. However, the current method, in which content consumers must vote directly, suffers from low participation rates and a high risk of manipulation. This paper therefore proposes a system that automatically infers content preference by recognizing the emotions revealed in the consumer's facial expressions. The proposed system aims to provide a more convenient, efficient, and reliable service by removing the burden, inconvenience, and manipulation risk of the existing manual preference voting systems. We describe a concrete method for building the preference inference system and demonstrate its practicality and efficiency through experiments.


Detection of Face Expression Based on Deep Learning (딥러닝 기반의 얼굴영상에서 표정 검출에 관한 연구)

  • Won, Chulho; Lee, Bub-ki
    • Journal of Korea Multimedia Society, v.21 no.8, pp.917-924, 2018
  • Recently, studies using LBP and SVM have been conducted as image-based methods for facial emotion recognition. LBP, introduced by Ojala et al., is widely used in the field of image recognition due to its high discrimination of objects, robustness to illumination change, and simple operation. In addition, CS (Center-Symmetric)-LBP, a modified form of LBP, is widely used for face recognition. In this paper, we propose a method to detect four facial expressions - expressionless, happiness, surprise, and anger - using a deep neural network. The validity of the proposed method is verified in terms of accuracy. Using the existing LBP feature parameters, it was confirmed that the method using the deep neural network is superior to the method using Adaboost and SVM classifiers.
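The LBP and CS-LBP features the abstract references can be illustrated on a single 3x3 pixel patch. This is a generic sketch of the standard operators, not the paper's implementation; the neighbor ordering is one common convention.

```python
def lbp_code(patch):
    """Basic 3x3 LBP: threshold the 8 neighbors against the center pixel and
    read them clockwise from the top-left as an 8-bit code (one convention)."""
    c = patch[1][1]
    neighbors = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                 patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    return sum(1 << i for i, p in enumerate(neighbors) if p >= c)

def cs_lbp_code(patch, threshold=0):
    """CS-LBP: compare the 4 center-symmetric neighbor pairs instead of
    comparing against the center, giving a compact 4-bit code."""
    pairs = [(patch[0][0], patch[2][2]), (patch[0][1], patch[2][1]),
             (patch[0][2], patch[2][0]), (patch[1][2], patch[1][0])]
    return sum(1 << i for i, (a, b) in enumerate(pairs) if a - b > threshold)
```

A full feature vector is then the histogram of these codes over image cells: CS-LBP's 16-bin histogram per cell is much shorter than LBP's 256 bins, which is why it is popular for face description.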

Using AI Facial Expression Recognition, Healing and Advertising Service Tailored to User's Emotion (인공지능 표정 인식 기술을 활용한 사용자 감정 맞춤 힐링·광고 서비스)

  • Kim, Minsik; Jeong, Hyeon-woo; Moon, Yoonji; Moon, Jaehyun
    • Proceedings of the Korea Information Processing Society Conference, 2021.11a, pp.1160-1163, 2021
  • The DOOH (Digital Out-of-Home) advertisement market is developing steadily, and its use cases are also increasing. In the advertising market, personalized services are actively being provided alongside technological development. On the other hand, personalized services are difficult to provide in DOOH and are based only on personal information, not emotions. This study aims to construct personalized DOOH services by using AI facial expression recognition, and suggests a solution optimized for the interaction between user and service by providing healing content and advertisements.