• Title/Summary/Keyword: information of emotion

Search Results: 1,326

Development of an Emotion Recognition Robot using a Vision Method (비전 방식을 이용한 감정인식 로봇 개발)

  • Shin, Young-Geun;Park, Sang-Sung;Kim, Jung-Nyun;Seo, Kwang-Kyu;Jang, Dong-Sik
    • IE interfaces
    • /
    • v.19 no.3
    • /
    • pp.174-180
    • /
    • 2006
  • This paper deals with a robot system that recognizes a person's expression from a detected face and then reproduces that emotion. The face detection method is as follows. First, the RGB color space is converted to the CIELab color space. Second, a skin-color candidate region is extracted. Third, a face is detected through the geometrical interrelation of facial features using a face filter. The positions of the eyebrows, eyes, and mouth are then extracted as the preliminary data for expression recognition. The detected changes in these features are sent to the robot through serial communication, and the robot drives an installed motor to reproduce the person's expression. Experimental results on 10 persons show 78.15% accuracy.
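The skin-color face detection pipeline described in this abstract (RGB to CIELab conversion, skin candidate extraction, geometric face filtering) can be illustrated with a short OpenCV sketch. The threshold values and the geometric filter below are illustrative assumptions, not the paper's parameters.

```python
# Hypothetical sketch of the skin-candidate face detection step described above.
# Thresholds and the geometric face filter are illustrative assumptions.
import cv2
import numpy as np

def detect_face_candidates(bgr_image: np.ndarray) -> list[tuple[int, int, int, int]]:
    """Return bounding boxes of skin-colored regions that roughly match face geometry."""
    # Step 1: convert the image (BGR in OpenCV) to the CIELab color space.
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2Lab)

    # Step 2: extract a skin candidate region by thresholding the a/b channels
    # (example range; real skin models are usually learned from data).
    lower, upper = np.array([20, 135, 130]), np.array([255, 175, 180])
    skin_mask = cv2.inRange(lab, lower, upper)
    skin_mask = cv2.morphologyEx(skin_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # Step 3: keep regions whose geometry is plausible for a face
    # (aspect ratio close to 1 and sufficient area).
    contours, _ = cv2.findContours(skin_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    faces = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h > 2000 and 0.6 < w / float(h) < 1.4:
            faces.append((x, y, w, h))
    return faces
```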

Design of a Mirror for Fragrance Recommendation based on Personal Emotion Analysis (개인의 감성 분석 기반 향 추천 미러 설계)

  • Hyeonji Kim;Yoosoo Oh
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.28 no.4
    • /
    • pp.11-19
    • /
    • 2023
  • The paper proposes a smart mirror system that recommends fragrances based on user emotion analysis. It combines natural language processing techniques, namely embedding techniques (CountVectorizer and TF-IDF) and machine learning classification models (DecisionTree, SVM, RandomForest, SGD Classifier), to build candidate models and compares their results. Based on this comparison, the paper constructs a personal emotion-based fragrance recommendation mirror model on top of the SVM and word-embedding pipeline-based emotion classifier, which showed the highest performance. The proposed system implements a personalized fragrance recommendation mirror based on emotion analysis and provides web services using the Flask web framework. The paper uses the Google Cloud Speech API to recognize users' voices and speech-to-text (STT) to convert the voice input into text data. The proposed system also provides users with information about weather, humidity, location, quotes, time, and schedule management.
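As a rough illustration of the embedding/classifier comparison this abstract describes, the following scikit-learn sketch compares CountVectorizer and TF-IDF with the four listed classifiers and keeps the best-scoring pipeline. The sample sentences, labels, and cross-validation setup are illustrative assumptions, not the paper's dataset or tuning.

```python
# Minimal sketch of a vectorizer/classifier comparison; toy data is assumed.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import SGDClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

texts = [
    "i feel wonderful and happy today",
    "such a joyful and pleasant surprise",
    "this is so frustrating and annoying",
    "i am really angry about the delay",
    "what a calm and quiet evening",
    "the room feels peaceful and calm",
]
labels = ["joy", "joy", "anger", "anger", "calm", "calm"]

vectorizers = {"count": CountVectorizer(), "tfidf": TfidfVectorizer()}
classifiers = {
    "svm": SVC(),
    "tree": DecisionTreeClassifier(),
    "forest": RandomForestClassifier(),
    "sgd": SGDClassifier(),
}

# Compare every vectorizer/classifier pair and keep the best-scoring pipeline.
best_name, best_score = None, -1.0
for vname, vec in vectorizers.items():
    for cname, clf in classifiers.items():
        pipeline = make_pipeline(vec, clf)
        score = cross_val_score(pipeline, texts, labels, cv=2).mean()
        if score > best_score:
            best_name, best_score = f"{vname}+{cname}", score

print(best_name, best_score)
```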

A Study on the System of Facial Expression Recognition for Emotional Information and Communication Technology Teaching (감성ICT 교육을 위한 얼굴감성 인식 시스템에 관한 연구)

  • Song, Eun Jee
    • The Journal of Korean Institute for Practical Engineering Education
    • /
    • v.4 no.2
    • /
    • pp.171-175
    • /
    • 2012
  • Recently, research on ICT (Information and Communication Technology) that recognizes and communicates human emotion through information technology has been increasing. For instance, there are studies on phones and services that perceive users' emotions by detecting their voices, facial expressions, and biometric data. In short, emotions that used to be judged only by humans are now predicted by digital equipment. Among these ICT research topics, emotion recognition from the face is expected to be the most effective and natural human interface. This paper studies sensitivity ICT and examines the mechanism of a facial expression recognition system as an example of sensitivity ICT.


Emotional Memory Mechanism Depending on Emotional Experience (감정적 경험에 의존하는 정서 기억 메커니즘)

  • Yeo, Ji Hye;Ham, Jun Seok;Ko, Il Ju
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.5 no.4
    • /
    • pp.169-177
    • /
    • 2009
  • In some cases, people respond differently to the same joke or thoughtless behavior: sometimes they like it and laugh, at other times they feel annoyed or angry. This is explained by the fact that past experiences are stored in emotional memory and therefore cause different responses. When people face a similar situation or feel a similar emotion, they evoke the emotion experienced in the past, and that emotional memory affects the current emotion. This paper proposes a mechanism for emotional memory using a SOM, based on the similarity between emotional memory and the SOM learning algorithm. It is assumed that the emotional memory mechanism also has the characteristics of associative memory, long-term memory, and short-term memory in its process of remembering emotional experience, which are known characteristics of remembering factual experience; these characteristics were then applied. The designed emotional memory mechanism was applied to a toy hammer game, and the change in the power of the toy hammer caused by different responses to the same stimulus was measured. Because the mechanism can express various emotions in response to the same stimulus, it is expected to be applicable to fields such as games and robot engineering.
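The SOM-based emotional memory idea can be sketched roughly as below: past emotional experiences, encoded as simple emotion vectors, are stored on a small map, and a similar stimulus recalls the nearest remembered emotion. The grid size, learning rate, and (valence, arousal) encoding are illustrative assumptions rather than the paper's design.

```python
# Minimal self-organizing map (SOM) sketch of an emotional memory; all parameters are assumed.
import numpy as np

class EmotionalMemorySOM:
    def __init__(self, grid=(5, 5), dim=2, lr=0.5, sigma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = rng.random((grid[0], grid[1], dim))  # one emotion vector per node
        self.lr, self.sigma = lr, sigma

    def _bmu(self, x):
        # Best-matching unit: node whose stored emotion is closest to the input.
        d = np.linalg.norm(self.weights - x, axis=2)
        return np.unravel_index(np.argmin(d), d.shape)

    def remember(self, x):
        # Store an emotional experience by pulling the BMU neighborhood toward it.
        x = np.asarray(x, dtype=float)
        bi, bj = self._bmu(x)
        for i in range(self.weights.shape[0]):
            for j in range(self.weights.shape[1]):
                dist2 = (i - bi) ** 2 + (j - bj) ** 2
                h = np.exp(-dist2 / (2 * self.sigma ** 2))  # neighborhood influence
                self.weights[i, j] += self.lr * h * (x - self.weights[i, j])

    def recall(self, x):
        # Evoke the remembered emotion most similar to the current stimulus.
        return self.weights[self._bmu(np.asarray(x, dtype=float))]

# Example: repeated negative experiences bias the emotion recalled for a similar stimulus.
memory = EmotionalMemorySOM()
for _ in range(20):
    memory.remember([0.2, 0.9])   # assumed (valence, arousal) of an annoying joke
print(memory.recall([0.25, 0.85]))
```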

Trust and facial information in online negotiation (온라인 협상에서 얼굴 정보에 따른 신뢰감 비교)

  • Ji, Jae-Yeong;Han, Gwang-Hui
    • Proceedings of the Korean Society for Emotion and Sensibility Conference
    • /
    • 2007.05a
    • /
    • pp.153-156
    • /
    • 2007
  • This study examines whether facial information affects the level of trust in online negotiation. Conditions in which the negotiation proceeded via synchronous text communication or voice were compared with conditions in which facial information was added; the level of trust was higher when facial information was added.


Character's facial expression realization program with emotional state dimensions (내적 정서 상태 차원에 근거한 캐릭터 표정생성 프로그램 구현)

  • Ko, Hae-Young;Lee, Jae-Sik;Kim, Jae-Ho
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.12 no.3
    • /
    • pp.438-444
    • /
    • 2008
  • In this study, we propose an automatic facial expression generation tool that can efficiently generate various facial expressions for animation characters expressing various emotions. For this purpose, 9 types of facial expressions were defined by adopting Russell's pleasure-displeasure and arousal-sleep coordinate values of internal states. By rendering specific coordinate values on the two basic dimensions of emotion (i.e., pleasure-displeasure and arousal-sleep) and their combinations, the proposed tool could yield various facial expressions reflecting subtle changes in characters' emotions. The tool is expected to provide a useful method of generating character faces with various emotions.
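A hedged sketch of the core mapping this abstract describes: binning a point on Russell's pleasure-displeasure and arousal-sleep dimensions into one of 9 expression types. The expression labels and thresholds below are assumptions for illustration; the paper's own 9 categories are not reproduced here.

```python
# Assumed 3x3 grid over Russell's two emotion dimensions; labels and cutoffs are illustrative.
EXPRESSIONS = [
    ["tense", "excited", "delighted"],     # high arousal row
    ["displeased", "neutral", "pleased"],  # mid arousal row
    ["bored", "relaxed", "content"],       # low arousal row
]

def expression_for(pleasure: float, arousal: float) -> str:
    """Map (pleasure, arousal) in [-1, 1] x [-1, 1] to one of 9 expression labels."""
    def bin_index(v: float) -> int:
        if v > 0.33:
            return 2
        if v < -0.33:
            return 0
        return 1

    col = bin_index(pleasure)          # displeasure -> pleasure, left to right
    row = 2 - bin_index(arousal)       # sleep -> arousal, bottom to top
    return EXPRESSIONS[row][col]

print(expression_for(0.8, 0.7))    # "delighted"
print(expression_for(-0.6, -0.5))  # "bored"
```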

Impact Analysis of nonverbal multimodals for recognition of emotion expressed virtual humans (가상 인간의 감정 표현 인식을 위한 비언어적 다중모달 영향 분석)

  • Kim, Jin Ok
    • Journal of Internet Computing and Services
    • /
    • v.13 no.5
    • /
    • pp.9-19
    • /
    • 2012
  • A virtual human used for HCI in digital contents expresses various emotions across modalities such as facial expression and body posture. However, few studies have considered combinations of such nonverbal multimodal cues in emotion perception. Computational engine models have to consider how a combination of nonverbal modalities like facial expression and body posture will be perceived by users in order to implement an emotional virtual human. This paper analyzes the impact of nonverbal multimodal cues in the design of an emotion-expressing virtual human. First, the relative impacts of different modalities are analyzed by exploring emotion recognition for each modality of the virtual human. Then, an experiment evaluates the contribution of congruent facial and postural expressions to the recognition of basic emotion categories, as well as the valence and activation dimensions. The impact of incongruent multimodal expressions on the recognition of superposed emotions, which are known to be frequent in everyday life, is also measured. Experimental results show that congruence of the virtual human's facial and postural expressions facilitates perception of emotion categories; categorical recognition is influenced mainly by the facial expression modality, whereas the postural modality is preferred for judging the level of the activation dimension. These results will be used in the implementation of an animation engine system and behavior synchronization for an emotion-expressing virtual human.
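The finding that the facial modality dominates categorical judgements while the postural modality dominates the activation estimate could be reflected in a simple weighted fusion, sketched below. The weights, labels, and scores are illustrative assumptions, not values from the experiment.

```python
# Assumed weighted fusion of two nonverbal modalities; all numbers are illustrative.
from dataclasses import dataclass

@dataclass
class ModalityReading:
    category_scores: dict[str, float]  # e.g. {"joy": 0.7, "anger": 0.1}
    activation: float                  # 0 (calm) .. 1 (aroused)

def fuse(facial: ModalityReading, postural: ModalityReading,
         w_face_category: float = 0.7, w_posture_activation: float = 0.7):
    """Fuse facial and postural cues into a perceived emotion category and activation level."""
    categories = set(facial.category_scores) | set(postural.category_scores)
    # Facial cues weighted more heavily for the emotion category.
    fused_scores = {
        c: w_face_category * facial.category_scores.get(c, 0.0)
           + (1 - w_face_category) * postural.category_scores.get(c, 0.0)
        for c in categories
    }
    category = max(fused_scores, key=fused_scores.get)
    # Postural cues weighted more heavily for the activation level.
    activation = (w_posture_activation * postural.activation
                  + (1 - w_posture_activation) * facial.activation)
    return category, activation

face = ModalityReading({"joy": 0.8, "anger": 0.1}, activation=0.4)
body = ModalityReading({"joy": 0.3, "anger": 0.5}, activation=0.9)
print(fuse(face, body))  # the facial cue wins the category; posture drives activation
```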