• Title/Summary/Keyword: Emotional Facial Expression Change

Emotion Training: Image Color Transfer with Facial Expression and Emotion Recognition (감정 트레이닝: 얼굴 표정과 감정 인식 분석을 이용한 이미지 색상 변환)

  • Kim, Jong-Hyun
    • Journal of the Korea Computer Graphics Society / v.24 no.4 / pp.1-9 / 2018
  • We propose an emotional training framework that can detect early symptoms of schizophrenia by analyzing emotions through changes in facial expression. Microsoft's Emotion API is used to obtain facial expressions and emotion values at the current moment, and these values are analyzed to recognize subtle facial expressions that change over time. Emotion states are classified with a peak-analysis-based variance method in order to measure the emotions appearing in facial expressions over time. The proposed method assesses deficits in emotion recognition and expressive ability by identifying characteristics that deviate from the emotional state changes classified according to Ekman's six basic emotions. Finally, the analyzed values are integrated into an image color transfer framework so that users can easily recognize and train their own emotional changes.
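
The pipeline described in this abstract (per-frame emotion scores from an expression recognizer, followed by a peak-analysis-based variance measure over time) can be sketched roughly as below. This is a minimal illustration only: `score_frame` is a hypothetical stand-in for the Microsoft Emotion API client used in the paper, and the emotion list and thresholds are assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import find_peaks

# Hypothetical stand-in for a per-frame emotion scorer such as the
# Microsoft Emotion API used in the paper; it should return a dict of
# scores for Ekman's six basic emotions for one video frame.
def score_frame(frame) -> dict:
    raise NotImplementedError("plug in an emotion-recognition backend here")

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def emotion_time_series(frames):
    """Stack per-frame emotion scores into an array of shape (T, 6)."""
    return np.array([[score_frame(f)[e] for e in EMOTIONS] for f in frames])

def summarize_emotion_change(series, prominence=0.1):
    """Peak-analysis-based variance measure per emotion channel.

    For each emotion, detect peaks in its score over time and report the
    peak count and variance as rough indicators of how strongly that
    emotion fluctuates (the prominence threshold is illustrative).
    """
    summary = {}
    for i, name in enumerate(EMOTIONS):
        channel = series[:, i]
        peaks, _ = find_peaks(channel, prominence=prominence)
        summary[name] = {"n_peaks": len(peaks), "variance": float(np.var(channel))}
    return summary
```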

The Effect of Emotional Expression Change, Delay, and Background at Retrieval on Face Recognition (얼굴자극의 검사단계 표정변화와 검사 지연시간, 자극배경이 얼굴재인에 미치는 효과)

  • Youngshin Park
    • Korean Journal of Culture and Social Issue / v.20 no.4 / pp.347-364 / 2014
  • The present study investigated how emotional expression change, test delay, and background influence face recognition. In Experiment 1, participants were presented with negative faces at the study phase and given a standard old-new recognition test in which targets showed negative or neutral expressions of the same faces. In Experiment 2, participants studied negative faces and were tested with an old-new face recognition test in which targets showed negative or positive expressions. In Experiment 3, participants were presented with neutral faces at the study phase and had to identify the same faces regardless of negative or neutral expression at the recognition test. In all three experiments, participants were assigned to either an immediate or a delayed test, and target faces were presented on both white and black backgrounds. Results of Experiments 1 and 2 indicated higher recognition rates for negative faces than for neutral or positive faces, and facial expression consistency enhanced face recognition memory. In Experiment 3, the advantage of facial expression consistency was demonstrated by higher recognition rates for neutral faces at test. In all three experiments, memory performance was enhanced when facial expressions were consistent across encoding and retrieval, and the effect of facial expression change differed across background conditions. The findings suggest that facial expression change makes face identification harder, and that delay and background also affect face recognition.

Mood Suggestion Framework Using Emotional Relaxation Matching Based on Emotion Meshes

  • Kim, Jong-Hyun
    • Journal of the Korea Society of Computer and Information / v.23 no.8 / pp.37-43 / 2018
  • In this paper, we propose a framework that automatically suggests a mood using an emotion analysis method based on facial expression change. We use Microsoft's Emotion API to calculate and analyze emotion values in facial expressions and recognize emotions that change over time. In this step, standard deviations based on peak analysis are used to measure and classify emotional changes. The difference between the classified emotion and the normal emotion is then calculated and used to recognize emotional abnormality. The user's emotion is matched to a relatively relaxed emotion using histograms and emotional meshes, and the relaxed emotion is presented to the user through images. The proposed framework helps users easily recognize emotional changes and train their emotions through emotional relaxation.
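
A rough sketch of the suggestion step described above, assuming per-frame emotion scores are already available: an emotion histogram, a standard-deviation-based abnormality check, and a histogram-matching stand-in for the paper's emotion-mesh matching. The function names, the emotion list, the deviation threshold, and the chi-square distance are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

EMOTIONS = ["anger", "contempt", "disgust", "fear",
            "happiness", "neutral", "sadness", "surprise"]

def emotion_histogram(series):
    """Normalized histogram of the dominant emotion per frame.

    `series` is a (T, 8) array of per-frame emotion scores, e.g. as
    returned by an emotion-recognition API.
    """
    dominant = series.argmax(axis=1)
    hist = np.bincount(dominant, minlength=len(EMOTIONS)).astype(float)
    return hist / max(hist.sum(), 1.0)

def detect_abnormality(series, k=2.0):
    """Flag frames whose score deviates more than k standard deviations
    from the per-emotion mean, a rough proxy for the paper's
    standard-deviation/peak-based classification (k is illustrative)."""
    mean, std = series.mean(axis=0), series.std(axis=0) + 1e-8
    return np.abs(series - mean) > k * std

def suggest_relaxed_emotion(user_hist, reference_hists):
    """Pick the reference mood whose histogram is closest to the user's,
    standing in for the paper's emotion-mesh matching step.

    `reference_hists` maps a mood name to a 'relaxed' target histogram.
    """
    def chi2(p, q):
        return 0.5 * np.sum((p - q) ** 2 / (p + q + 1e-8))
    return min(reference_hists, key=lambda m: chi2(user_hist, reference_hists[m]))
```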

The Effect of Cognitive Movement Therapy on Emotional Rehabilitation for Children with Affective and Behavioral Disorder Using Emotional Expression and Facial Image Analysis (감정표현 표정의 영상분석에 의한 인지동작치료가 정서·행동장애아 감성재활에 미치는 영향)

  • Byun, In-Kyung; Lee, Jae-Ho
    • The Journal of the Korea Contents Association / v.16 no.12 / pp.327-345 / 2016
  • The purpose of this study was to carry out a cognitive movement therapy program for children with affective and behavioral disorders, grounded in neuroscience, psychology, motor learning, muscle physiology, biomechanics, human motion analysis, and movement control, and to quantify characteristics of expressions and gestures according to changes of facial expression caused by emotional change. We could observe problematic expressions of children with affective disorders and estimate the effectiveness of the movement therapy program from the changes in their facial expressions. By quantifying emotion and behavior therapy analysis together with kinematic analysis, these converged measurement and analysis methods for human development can also be expected to accumulate data for early detection and treatment of developmental disorders. Therefore, the results of this study could be extended to the disabled, the elderly, and the sick as well as to children.

Accurate Visual Working Memory under a Positive Emotional Expression in Face (얼굴표정의 긍정적 정서에 의한 시각작업기억 향상 효과)

  • Han, Ji-Eun; Hyun, Joo-Seok
    • Science of Emotion and Sensibility / v.14 no.4 / pp.605-616 / 2011
  • The present study examined memory accuracy for faces with positive, negative, and neutral emotional expressions to test whether emotional content affects visual working memory (VWM) performance. Participants remembered a set of face pictures whose facial expressions were randomly assigned from pleasant, unpleasant, and neutral emotional categories. Their task was to report the presence or absence of an emotion change by comparing the remembered set against a set of test faces displayed after a short delay. Change detection accuracies for the pleasant, unpleasant, and neutral conditions were compared under two memory exposure durations (500 ms vs. 1000 ms). At 500 ms, accuracy in the pleasant condition was higher than in both the unpleasant and neutral conditions; the difference disappeared when the duration was extended to 1000 ms. The results indicate that a positive facial expression can improve VWM accuracy relative to negative or neutral expressions, especially when there is not enough time to form durable VWM representations.

A Study on Character's Emotional Appearance in Distinction Focused on 3D Animation "Inside Out" (3D 애니메이션 "인사이드 아웃" 분석을 통한 감성별 캐릭터 외형특징 연구)

  • Ahn, Duck-ki; Chung, Jean-Hun
    • Journal of Digital Convergence / v.15 no.2 / pp.361-368 / 2017
  • This study analyzes the distinctive appearance characteristics associated with emotional change, as visual forms of psychology, in character development in the 3D animation industry. To this end, the study examines the five emotion characters from Pixar's animation Inside Out to demonstrate the psychological effects reflected in a character's visual appearance. As prior grounding, it analyzes visual representations of both emotional facial expression and emotional color expression using Paul Ekman's and Robert Plutchik's research on basic human emotions. The purpose of this study is to present a visual guideline for the appearance of emotional characters, drawing on the variety of human expressions, for differentiated character development in animation production.

Transfer Learning for Face Emotions Recognition in Different Crowd Density Situations

  • Amirah Alharbi
    • International Journal of Computer Science & Network Security / v.24 no.4 / pp.26-34 / 2024
  • Most human emotions are conveyed through facial expressions, which are the predominant source of emotional data. This research investigates the impact of crowds on human emotions by analysing facial expressions. It examines how crowd behaviour, face recognition technology, and deep learning algorithms contribute to understanding how emotions change at different levels of crowding. The study identifies common emotions expressed during congestion, differences between crowded and less crowded areas, and changes in facial expressions over time. The findings can inform urban planning and crowd event management by providing insights for developing coping mechanisms for affected individuals. However, limitations and challenges in reliable facial expression analysis are also discussed, including age- and context-related differences.
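
This entry relies on transfer learning with deep networks for facial emotion recognition. A minimal sketch of that general technique is shown below, assuming a PyTorch/torchvision setup; the backbone (ResNet-18), the seven emotion classes, and the optimizer settings are illustrative choices, since the abstract does not state them.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_EMOTIONS = 7  # e.g. Ekman's six basic emotions plus neutral (assumption)

def build_model(freeze_backbone: bool = True) -> nn.Module:
    """ImageNet-pretrained backbone with a new emotion classification head."""
    model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    if freeze_backbone:
        for p in model.parameters():
            p.requires_grad = False
    # Replace the ImageNet classifier head with an emotion classifier.
    model.fc = nn.Linear(model.fc.in_features, NUM_EMOTIONS)
    return model

def train_one_epoch(model, loader, device="cpu"):
    """One epoch of fine-tuning on (image, emotion_label) batches."""
    model.to(device).train()
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(
        (p for p in model.parameters() if p.requires_grad), lr=1e-3)
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```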

Study on the Relationship Between 12Meridians Flow and Facial Expressions by Emotion (감정에 따른 얼굴 표정변화와 12경락(經絡) 흐름의 상관성 연구)

  • Park, Yu-Jin; Moon, Ju-Ho; Choi, Su-Jin; Shin, Seon-Mi; Kim, Ki-Tae; Ko, Heung
    • Journal of Physiology & Pathology in Korean Medicine / v.26 no.2 / pp.253-258 / 2012
  • Facial expression is an important method of communication. In Oriental medicine, the shape of the face changes according to emotion, and differences arise in physiology and pathology. To verify this theory, we studied the correlation between emotional facial expressions and the flow of the meridians and collaterals. The facial regions were divided by meridian: the outer brow belongs to the Gallbladder meridian, the inner brow to the Bladder meridian, the medial canthus to the Bladder meridian, the lateral canthus to the Gallbladder meridian, the upper eyelid to the Bladder meridian, the lower eyelid to the Stomach meridian, the central cheeks to the Stomach meridian, the lateral cheeks to the Small Intestine meridian, and the upper and lower lips, lip corners, and chin to the Small and Large Intestine meridians. Six meridians and collaterals were associated with happiness, which indicates that happiness has high importance in facial expression. Five were associated with anger, and four with fear and sadness, which indicates that fear and sadness have lower importance in facial expression than the other emotions. Based on the yang meridians, which originally flow downward in the body, the anterograde-to-retrograde ratios were happiness 3:4, anger 2:5, sadness 5:3, and fear 4:1. Based on the meridian flow of the face, the ratios were happiness 5:2, anger 3:4, sadness 3:5, and fear 4:1. We found that the actual change in meridian and collateral flow by emotion does not correspond to the expected change.

Representation of Dynamic Facial Image Graphics for Multi-Dimensional Data (다차원 데이터의 동적 얼굴 이미지그래픽 표현)

  • 최철재; 최진식; 조규천; 차홍준
    • Journal of the Korea Computer Industry Society / v.2 no.10 / pp.1291-1300 / 2001
  • This article studies a visualization technique, based on dynamic graphics that can change in real time, in which a facial image is manipulated as the graphic representation of multi-dimensional data. The key idea is as follows: feature points of the human face and the parameter control values obtained from existing image recognition algorithms are mapped to the multi-dimensional data, and the image is synthesized to create a virtual face whose emotional expression changes with the data. The proposed DyFIG system is implemented as a complete module, and we present a facial graphics module that can express emotional expressions, realizing a technology for describing and expressing emotional data.
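
The core idea above, mapping multi-dimensional data onto facial control parameters and rendering an expressive face, is in the spirit of Chernoff-style face graphics. A minimal sketch follows; the parameter set and the schematic drawing are illustrative assumptions, not the DyFIG system's actual feature points or control values.

```python
import numpy as np
import matplotlib.pyplot as plt

def data_to_face_params(x):
    """Map a data vector with values in [0, 1] onto facial control parameters."""
    x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
    return {
        "mouth_curve": 2.0 * x[0] - 1.0,   # -1 = frown ... +1 = smile
        "eye_openness": 0.2 + 0.8 * x[1],  # relative eye size
        "brow_slant": 2.0 * x[2] - 1.0,    # eyebrow tilt, -1 ... +1
    }

def draw_face(params, ax):
    """Render the control parameters as a simple schematic face."""
    ax.add_patch(plt.Circle((0.0, 0.0), 1.0, fill=False))        # head outline
    for sx in (-0.35, 0.35):                                      # two eyes
        ax.add_patch(plt.Circle((sx, 0.3), 0.1 * params["eye_openness"]))
        brow_x = np.array([sx - 0.15, sx + 0.15])
        brow_y = 0.55 + 0.1 * params["brow_slant"] * np.sign(sx) * np.array([-1.0, 1.0])
        ax.plot(brow_x, brow_y, color="k")
    mouth_x = np.linspace(-0.4, 0.4, 50)                          # mouth curve
    ax.plot(mouth_x, -0.35 + params["mouth_curve"] * (0.9 * mouth_x ** 2 - 0.15), color="k")
    ax.set_aspect("equal")
    ax.axis("off")

fig, ax = plt.subplots()
draw_face(data_to_face_params([0.9, 0.7, 0.6]), ax)  # e.g. a "positive" data point
plt.show()
```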

Efficient Emotional Relaxation Framework with Anisotropic Features Based Dijkstra Algorithm

  • Kim, Jong-Hyun
    • Journal of the Korea Society of Computer and Information / v.25 no.4 / pp.79-86 / 2020
  • In this paper, we propose an efficient emotional relaxation framework using a Dijkstra algorithm based on anisotropic features. Emotional relaxation is as important as emotion analysis: a framework that can automatically alleviate a person's depression or loneliness is very important for HCI (Human-Computer Interaction). In this paper, 1) emotion values that change with facial expression are calculated using Microsoft's Emotion API; 2) using the differences in these emotion values, abnormal feelings such as depression or loneliness are recognized; and 3) finally, an emotional-mesh-based matching process that considers the emotion histogram and anisotropic characteristics is proposed, which suggests emotional relaxation to the user. The proposed system can easily recognize changes of emotion from face images and train personal emotion through emotional relaxation.
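
The abstract names a Dijkstra algorithm over emotion meshes with anisotropic features. A minimal sketch of that idea is shown below: Dijkstra's shortest path on a directed emotion graph whose edge weights are anisotropic in the sense that transitions toward relaxed states cost less than transitions away from them. The graph, node names, and weights are illustrative, not taken from the paper.

```python
import heapq

# Directed "emotion graph"; weights are direction-dependent so that moving
# toward a relaxed state is cheaper than moving away from it (illustrative).
EMOTION_GRAPH = {
    "sadness": {"neutral": 1.0, "anger": 3.0},
    "anger": {"neutral": 1.5, "sadness": 3.0},
    "neutral": {"contentment": 0.5, "sadness": 2.5, "anger": 2.5},
    "contentment": {"happiness": 0.5, "neutral": 1.0},
    "happiness": {"contentment": 1.0},
}

def dijkstra(graph, source):
    """Standard Dijkstra shortest-path over the weighted emotion graph."""
    dist, prev, heap = {source: 0.0}, {}, [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, prev

def relaxation_path(graph, current, target="contentment"):
    """Cheapest sequence of emotion transitions from the user's current
    (possibly abnormal) emotion toward a relaxed target state."""
    dist, prev = dijkstra(graph, current)
    if target not in dist:
        return None
    path, node = [target], target
    while node != current:
        node = prev[node]
        path.append(node)
    return list(reversed(path))

print(relaxation_path(EMOTION_GRAPH, "sadness"))  # ['sadness', 'neutral', 'contentment']
```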