• Title/Summary/Keyword: 표정분석 (facial expression analysis)

Search results: 393

Analysis and Synthesis of Facial Expression using Base Faces (기준얼굴을 이용한 얼굴표정 분석 및 합성)

  • Park, Moon-Ho;Ko, Hee-Dong;Byun, Hye-Ran
    • Journal of KIISE:Software and Applications / v.27 no.8 / pp.827-833 / 2000
  • Facial expression is an effective tool for expressing human emotion. In this paper, a facial expression analysis method based on base faces and their blending ratios is proposed. Seven base faces were chosen as axes for describing and analyzing arbitrary facial expressions: surprise, fear, anger, disgust, happiness, sadness, and expressionlessness. Each facial expression was built by fitting a generic 3D facial model to a facial image. Two comparable methods, Genetic Algorithms and Simulated Annealing, were used to search for the blending ratio of the base faces. The usefulness of the proposed method for facial expression analysis was demonstrated by the facial expression synthesis results.

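The blending-ratio search described in the abstract above can be sketched as a simulated annealing loop over convex weights. This is a toy illustration under stated assumptions, not the authors' implementation: the seven base faces are stand-in 3-component feature vectors rather than fitted 3D face models, and the step size and cooling schedule are guesses.

```python
import math
import random

# Hypothetical base-face feature vectors (stand-ins for the paper's seven
# base expressions: surprise, fear, anger, disgust, happiness, sadness,
# expressionless). Real features would come from a fitted 3D face model.
BASE_FACES = [
    [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
    [0.5, 0.5, 0.0], [0.0, 0.5, 0.5], [0.5, 0.0, 0.5], [0.3, 0.3, 0.3],
]

def blend(weights):
    """Weighted sum of the base-face vectors."""
    return [sum(w * f[i] for w, f in zip(weights, BASE_FACES))
            for i in range(len(BASE_FACES[0]))]

def error(weights, target):
    """Squared distance between the blended face and the target expression."""
    return sum((b - t) ** 2 for b, t in zip(blend(weights), target))

def normalize(w):
    """Clamp to non-negative and rescale so the blending ratios sum to 1."""
    w = [max(x, 0.0) for x in w]
    s = sum(w)
    return [x / s for x in w] if s > 0 else [1.0 / len(w)] * len(w)

def anneal(target, steps=5000, temp0=1.0, seed=0):
    """Simulated annealing search for the blending ratios of the base faces."""
    rng = random.Random(seed)
    w = normalize([rng.random() for _ in BASE_FACES])
    e = error(w, target)
    best_w, best_e = w, e
    for k in range(steps):
        temp = temp0 * (1.0 - k / steps) + 1e-9   # linear cooling schedule
        cand = normalize([x + rng.gauss(0.0, 0.1) for x in w])
        ce = error(cand, target)
        # Accept improvements always, worse moves with Boltzmann probability.
        if ce < e or rng.random() < math.exp(-(ce - e) / temp):
            w, e = cand, ce
            if e < best_e:
                best_w, best_e = w, e
    return best_w, best_e
```

The same error function could be driven by a genetic algorithm instead, which is the paper's other search method.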

An Integrated Stress Analysis System using Facial and Voice Sentiment (표정과 음성 감성 분석을 통한 통합 스트레스 분석 시스템)

  • Lee, Aejin;Chun, Jiwon;Yu, Suhwa;Kim, Yoonhee
    • Proceedings of the Korea Information Processing Society Conference / 2021.11a / pp.9-12 / 2021
  • As more people in modern society suffer from severe stress, the need for an effective stress measurement system has emerged. This study proposes an integrated stress analysis system based on the facial expression and voice sentiment of a person in a video. After the subject's facial expression and voice sentiment are analyzed, a stress index is derived and quantified from each sentiment value. We show that the higher the integrated stress index derived from the facial and voice stress indices, the higher the stress intensity.

Development of an intelligent camera for multiple body temperature detection (다중 체온 감지용 지능형 카메라 개발)

  • Lee, Su-In;Kim, Yun-Su;Seok, Jong-Won
    • Journal of IKEEE / v.26 no.3 / pp.430-436 / 2022
  • In this paper, we propose an intelligent camera for multiple body temperature detection. The proposed camera combines an optical sensor (4056*3040) and a thermal sensor (640*480), and detects abnormal symptoms by analyzing a person's facial expression and body temperature in the acquired images. The optical and thermal cameras operate simultaneously: an object is detected in the optical image, and the facial region and expression are analyzed from that object. The coordinates of the facial region in the optical image are then mapped onto the thermal image, where the maximum temperature within the region is measured and displayed on the screen. Abnormal symptoms are detected using the three analyzed facial expressions (neutral, happy, sad) together with the body temperature values. To evaluate the performance of the proposed camera, the optical image processing stages (object detection, facial region detection, and expression analysis) were tested on the Caltech, WIDER FACE, and CK+ datasets, yielding accuracy scores of 91%, 91%, and 84%, respectively.
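The optical-to-thermal coordinate mapping this abstract describes can be sketched as a rescaling of the detected face box between the two sensor resolutions, followed by a maximum-temperature lookup. The function names and box format are hypothetical; the paper's actual mapping may also involve calibration offsets between the two sensors.

```python
import numpy as np

# Sensor resolutions from the abstract: optical 4056x3040, thermal 640x480.
OPT_W, OPT_H = 4056, 3040
THM_W, THM_H = 640, 480

def to_thermal_box(opt_box):
    """Scale an (x1, y1, x2, y2) face box from optical to thermal coordinates."""
    x1, y1, x2, y2 = opt_box
    sx, sy = THM_W / OPT_W, THM_H / OPT_H
    return (int(x1 * sx), int(y1 * sy), int(x2 * sx), int(y2 * sy))

def max_temp(thermal, opt_box):
    """Maximum temperature value inside the mapped face region of a thermal frame."""
    x1, y1, x2, y2 = to_thermal_box(opt_box)
    return float(thermal[y1:y2, x1:x2].max())
```

An abnormal-symptom decision would then combine this temperature with the classified expression (neutral/happy/sad), as the abstract outlines.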

Real-time Recognition System of Facial Expressions Using Principal Component of Gabor-wavelet Features (표정별 가버 웨이블릿 주성분특징을 이용한 실시간 표정 인식 시스템)

  • Yoon, Hyun-Sup;Han, Young-Joon;Hahn, Hern-Soo
    • Journal of the Korean Institute of Intelligent Systems / v.19 no.6 / pp.821-827 / 2009
  • Human emotion is reflected in facial expressions, so recognizing facial expressions is a good way to understand people's emotions. Conventional facial expression recognition systems select interest points and extract features without analyzing their physical meaning; finding the interest points takes a long time, and it is hard to estimate their positions accurately. Moreover, to implement facial expression recognition on a real-time embedded system, the algorithm must be simplified and its resource usage reduced. In this paper, we propose a real-time facial expression recognition algorithm that projects grid points onto an expression space based on Gabor wavelet features. A facial expression is simply described by a feature vector in the expression space and is classified by a neural network whose resource requirements are dramatically reduced. The proposed system handles five expressions: anger, happiness, neutral, sadness, and surprise. In experiments, the average execution time was 10.251 ms and the recognition rate was measured at 87~93%.
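A minimal sketch of the feature pipeline this abstract describes: responses of a small Gabor filter bank sampled at grid points, followed by projection onto principal components. The filter-bank parameters, grid spacing, and function names here are illustrative assumptions, not the paper's values.

```python
import numpy as np

def gabor_kernel(size=15, theta=0.0, freq=0.2, sigma=3.0):
    """Real part of a Gabor filter: a Gaussian-windowed sinusoid at angle theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def grid_features(image, step=8, size=15,
                  thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Gabor responses sampled on a sparse grid of points over the face image."""
    kernels = [gabor_kernel(size, t) for t in thetas]
    half = size // 2
    feats = []
    for r in range(half, image.shape[0] - half, step):
        for c in range(half, image.shape[1] - half, step):
            patch = image[r - half:r + half + 1, c - half:c + half + 1]
            feats.extend(float(np.sum(patch * k)) for k in kernels)
    return np.array(feats)

def pca_project(X, n_components=10):
    """Project feature vectors onto their leading principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```

In the paper the projected vectors are then classified by a compact neural network; any lightweight classifier could stand in for it in a sketch like this.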

Analyzing facial expression of a learner in e-Learning system (e-Learning에서 나타날 수 있는 학습자의 얼굴 표정 분석)

  • Park, Jung-Hyun;Jeong, Sang-Mok;Lee, Wan-Bok;Song, Ki-Sang
    • Proceedings of the Korea Contents Association Conference / 2006.05a / pp.160-163 / 2006
  • If an instruction system understood the interest and engagement of a learner in real time, it could introduce engaging elements when the learner grows tired of learning, and act as an adaptive tutoring system that helps the learner with difficult material. Current facial expression recognition research mainly deals with adults' everyday expressions such as anger, hatred, fear, sadness, surprise, and gladness, which may not match the expressions a learner shows in e-Learning. To recognize a learner's feelings, the facial expressions of learners in e-Learning must first be studied: collecting as many expression pictures as possible and examining the meaning of each expression. As a preliminary study, this work analyzes the feelings of learners in e-Learning and the facial expressions related to those feelings, in order to establish a facial expression database.


A system for facial expression synthesis based on a dimensional model of internal states (내적상태 차원모형에 근거한 얼굴표정 합성 시스템)

  • 한재현;정찬섭
    • Korean Journal of Cognitive Science / v.13 no.3 / pp.11-21 / 2002
  • Parke and Waters' muscle-based face deformation model [1] was used to develop a system that synthesizes facial expressions when the pleasure-displeasure and arousal-sleep coordinates of an internal state are specified. Facial expressions sampled from the database developed by Chung, Oh, Lee and Byun [2], together with its underlying model of internal states, were used to find rules for face deformation. The internal-state model included dimensional and categorical values for the sampled facial expressions. To derive deformation rules for each expression, changes in the lengths of 21 facial muscles were measured, and a set of multiple regression analyses was performed to relate the muscle lengths to internal states. The resulting deformation rules produced natural-looking expressions when internal states were specified by pleasure-displeasure and arousal-sleep coordinates. This result implies that rules derived from a large-scale database, with regression analyses capturing the variation of individual muscles, can serve as a useful and powerful tool for synthesizing facial expressions.

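The regression step this abstract describes, mapping the two internal-state coordinates (pleasure-displeasure, arousal-sleep) to the length changes of 21 facial muscles, amounts to one multiple regression per muscle. A sketch on synthetic data, with all names, dimensions, and noise levels assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_muscles = 100, 21

# Synthetic stand-in for the sampled database: internal-state coordinates
# and the muscle length changes they produce under unknown linear rules.
states = rng.uniform(-1, 1, size=(n_samples, 2))       # (pleasure, arousal)
true_coef = rng.normal(size=(2, n_muscles))            # unknown deformation rules
muscles = states @ true_coef + 0.01 * rng.normal(size=(n_samples, n_muscles))

# One multiple regression per muscle, solved jointly by least squares;
# the column of ones estimates each muscle's intercept.
X = np.hstack([states, np.ones((n_samples, 1))])
coef, *_ = np.linalg.lstsq(X, muscles, rcond=None)

def muscle_lengths(pleasure, arousal):
    """Predict the 21 muscle length changes for a given internal state."""
    return np.array([pleasure, arousal, 1.0]) @ coef
```

In the system itself, the predicted muscle lengths would then drive the Parke-Waters deformation model to render the expression.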

Dynamic Facial Expression of Fuzzy Modeling Using Probability of Emotion (감정확률을 이용한 동적 얼굴표정의 퍼지 모델링)

  • Gang, Hyo-Seok;Baek, Jae-Ho;Kim, Eun-Tae;Park, Min-Yong
    • /
    • 2007.04a
    • /
    • pp.401-404
    • /
    • 2007
  • This paper demonstrates that a 2D emotion recognition database can be applied in 3D by using mirror projection. Using emotion probabilities, facial expressions are generated based on fuzzy modeling, and a facial expression function is proposed by applying fuzzy theory to the three basic movements that drive expressions. The proposed method applies the feature vectors used for 2D emotion recognition to 3D via multiple images obtained through mirror projection, and models, on a fuzzy basis, the nonlinear facial expressions of the real subject for the basic emotions. Expressions are represented by the six basic emotions (happiness, sadness, disgust, anger, surprise, and fear); the mean value of each emotion is used for its probability, and dynamic facial expressions are generated from the six emotion probabilities. The proposed method is applied to a 3D humanoid avatar and compared with the expression vectors of the real subject.


Mapping facial expression onto internal states (얼굴표정에 의한 내적상태 추정)

  • 한재현;정찬섭
    • Proceedings of the Korean Society for Emotion and Sensibility Conference / 1998.04a / pp.118-123 / 1998
  • As basic data for building a model of the relationship between facial expressions and internal states, the correspondence between the two was investigated. We found that two internal states separated by the psychologically minimal meaningful distance stand in a one-to-one correspondence with mutually distinguishable facial expressions. Multiple regression analyses of the relationship between facial-expression dimension values and internal-state dimension values showed that the pleasure-displeasure state is sensitively reflected in the width of the mouth, and the arousal-sleep state in the degree to which the eyes and mouth are open. The twelve facial-expression dimensions explained around 40% of the variation along the internal-state dimensions. The fact that a linear model has such high predictive power suggests that a relatively simple mathematical correspondence exists between the two variables.


Stability Analysis of a Stereo-Camera for Close-range Photogrammetry (근거리 사진측량을 위한 스테레오 카메라의 안정성 분석)

  • Kim, Eui Myoung;Choi, In Ha
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.39 no.3 / pp.123-132 / 2021
  • To determine 3D (three-dimensional) positions using a stereo-camera in close-range photogrammetry, camera calibration must first determine not only the interior orientation parameters of each camera but also the relative orientation parameters between the cameras. As time passes after calibration, the interior and relative orientation parameters of non-metric cameras may change due to internal instability or external factors. In this study, to evaluate the stability of the stereo-camera, the stability of the two single cameras and of the stereo-camera was analyzed, and the three-dimensional positioning accuracy was evaluated using checkpoints. In three camera calibration experiments over four months, the root mean square error of the two single cameras was ±0.001 mm, and that of the stereo-camera ranged from ±0.012 mm to ±0.025 mm. In addition, as the distance accuracy at the checkpoints was ±1 mm, the interior and relative orientation parameters of the stereo-camera were considered stable over that period.

The Effect of Cognitive Movement Therapy on Emotional Rehabilitation for Children with Affective and Behavioral Disorder Using Emotional Expression and Facial Image Analysis (감정표현 표정의 영상분석에 의한 인지동작치료가 정서·행동장애아 감성재활에 미치는 영향)

  • Byun, In-Kyung;Lee, Jae-Ho
    • The Journal of the Korea Contents Association / v.16 no.12 / pp.327-345 / 2016
  • The purpose of this study was to carry out a cognitive movement therapy program for children with affective and behavioral disorders, grounded in neuroscience, psychology, motor learning, muscle physiology, biomechanics, human motion analysis, and movement control, and to quantify the characteristics of expressions and gestures as facial expressions change with emotional state. We observed the problematic expressions of children with affective disorders, and estimated the effectiveness of the movement therapy program from the changes in their facial expressions. By quantifying emotion and behavior therapy analysis and kinematic analysis through converged measurement and analytic methods for human development, the study is also expected to accumulate data for the early detection and treatment of developmental disorders. The results of this study could therefore be applied more broadly to the disabled, the elderly, and the sick, as well as to children.