• Title/Summary/Keyword: emotion technology

Search Results: 802

AE-Artificial Emotion

  • Xuyan, Tu;Liqun, Han
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2003.10a / pp.146-149 / 2003
  • This paper proposes the concept of "Artificial Emotion" (AE). The goal of AE is the simulation, extension, and expansion of natural emotion, especially human emotion. The objects of AE are machine emotion and emotion machines. The contents of AE are emotion recognition, emotion measurement, emotion understanding, emotion representation, emotion generation, emotion processing, emotion control, and emotion communication. The methodology, technology, scientific significance, and application value of artificial emotion are discussed.


Review on Discrete, Appraisal, and Dimensional Models of Emotion (정서의 심리적 모델: 개별 정서 모델, 평가 모델, 차원 모델을 중심으로)

  • Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea / v.30 no.1 / pp.179-186 / 2011
  • Objective: This study reviews three representative psychological perspectives that explain the scientific construct of emotion: the discrete emotion model, the appraisal model, and the dimensional model. Background: Because emotion-sensitive interfaces are a fusion area of emotion science and engineering technology, their development requires a balanced mixture of scientific emotion theory and practical engineering technology. Extensional theories of emotional structure can provide engineers with knowledge relevant to the functional application of such systems. Method: First, the literature on the basic emotion model and the circuit model within the discrete emotion model, along with their representative theories, was reviewed. Second, classical and modern theories of the appraisal model, which emphasizes cognitive appraisal of emotion-provoking events, were reviewed. Lastly, dimensional theories that describe emotion along dimensions, and their representative theories, were reviewed. Results: The paper compares the three models on the central points of each, and comments on the need for a comprehensive alternative to each model, the componential model of Scherer (2001), which describes numerous emotional aspects. Conclusion: The review nevertheless suggests the need for an evolved comprehensive model that takes social-context effects and discrete neural circuits into consideration, while pinpointing the limitations of the componential model. Application: Insights obtained from extensive scientific research on human emotion can be valuable in the development of emotion-sensitive interfaces and emotion recognition technology.

Improved Two-Phase Framework for Facial Emotion Recognition

  • Yoon, Hyunjin;Park, Sangwook;Lee, Yongkwi;Han, Mikyong;Jang, Jong-Hyun
    • ETRI Journal / v.37 no.6 / pp.1199-1210 / 2015
  • Automatic emotion recognition based on facial cues, such as facial action units (AUs), has received huge attention in the last decade due to its wide variety of applications. Current computer-based automated two-phase facial emotion recognition procedures first detect AUs from input images and then infer target emotions from the detected AUs. However, more robust AU detection and AU-to-emotion mapping methods are required to deal with the error accumulation problem inherent in the multiphase scheme. Motivated by our key observation that a single AU detector does not perform equally well for all AUs, we propose a novel two-phase facial emotion recognition framework, where the presence of AUs is detected by group decisions of multiple AU detectors and a target emotion is inferred from the combined AU detection decisions. Our emotion recognition framework consists of three major components - multiple AU detection, AU detection fusion, and AU-to-emotion mapping. The experimental results on two real-world face databases demonstrate an improved performance over the previous two-phase method using a single AU detector in terms of both AU detection accuracy and correct emotion recognition rate.
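
The group-decision idea in this framework can be sketched as majority voting over per-detector AU decisions, followed by a rule-based AU-to-emotion lookup. This is a minimal illustration under stated assumptions, not the paper's actual system: the `EMOTION_RULES` table (loosely following FACS conventions) and the detector votes are hypothetical, and the paper's fusion and mapping components are learned rather than hand-coded.

```python
from collections import Counter

def fuse_au_decisions(detector_votes):
    """Majority-vote fusion: each detector reports the set of AUs it
    believes present; an AU is accepted if more than half the detectors agree."""
    counts = Counter(au for votes in detector_votes for au in votes)
    threshold = len(detector_votes) / 2
    return {au for au, c in counts.items() if c > threshold}

# Hypothetical AU-to-emotion rules (simplified from FACS conventions).
EMOTION_RULES = {
    "happy":    {6, 12},      # cheek raiser + lip corner puller
    "sad":      {1, 4, 15},   # inner brow raiser + brow lowerer + lip corner depressor
    "surprise": {1, 2, 26},   # brow raisers + jaw drop
}

def map_aus_to_emotion(active_aus):
    """Pick the emotion whose rule set overlaps most with the detected AUs."""
    best = max(EMOTION_RULES, key=lambda e: len(EMOTION_RULES[e] & active_aus))
    return best if EMOTION_RULES[best] & active_aus else "neutral"

# Three hypothetical detectors disagree on AU 15; fusion resolves it.
votes = [{6, 12}, {6, 12, 15}, {12}]
aus = fuse_au_decisions(votes)   # AUs 6 and 12 win the group decision
print(map_aus_to_emotion(aus))   # happy
```

The point of the group decision is that a single detector's error on one AU (here, the spurious AU 15) is outvoted before the emotion mapping ever sees it.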

Emotion Recognition based on Tracking Facial Keypoints (얼굴 특징점 추적을 통한 사용자 감성 인식)

  • Lee, Yong-Hwan;Kim, Heung-Jun
    • Journal of the Semiconductor & Display Technology / v.18 no.1 / pp.97-101 / 2019
  • Understanding and classifying human emotion are important tasks in human-machine communication systems. This paper proposes an emotion recognition method that extracts facial keypoints using the Active Appearance Model (AAM) and classifies the human emotion with a proposed classification model of the facial features. The appearance model captures expression variations, which the proposed classification model evaluates as the facial expression changes. The method classifies four basic emotions (normal, happy, sad, and angry). To evaluate performance, we measured the success rate on common datasets, achieving a best accuracy of 93% and an average of 82.2% in facial emotion recognition. The results show that the proposed method performs well in emotion recognition compared to existing schemes.
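
A rough sketch of keypoint-based classification of this kind: track keypoints (with AAM in the paper), take their displacement from a neutral-face template as a feature vector, and assign the class with the nearest mean displacement. All numbers below are hypothetical toy values, and nearest-mean matching stands in for the paper's actual classification model.

```python
import numpy as np

# Hypothetical training data: each sample is a flattened vector of (dx, dy)
# keypoint offsets from a neutral-face template -- toy values for illustration.
train = {
    "normal": np.array([[0.0, 0.0, 0.0, 0.0]]),
    "happy":  np.array([[0.8, -0.5, 0.9, -0.4]]),   # lip corners raised
    "sad":    np.array([[-0.6, 0.5, -0.7, 0.6]]),   # lip corners lowered
    "angry":  np.array([[0.1, 0.9, 0.0, 1.0]]),     # brows lowered
}

def classify(displacement):
    """Assign the emotion whose mean displacement vector is closest (L2)."""
    means = {emo: vecs.mean(axis=0) for emo, vecs in train.items()}
    return min(means, key=lambda e: np.linalg.norm(displacement - means[e]))

print(classify(np.array([0.7, -0.4, 0.8, -0.5])))  # closest to the happy template
```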

Comparative Study of Emotion Evaluation Based on Lighting Scenario of Office, Meeting Room, Lounge, and OA Room (사무실, 회의실, 휴게실, OA실의 조명 시나리오에 따른 감성평가 비교 연구 - 동·서양인을 대상으로 -)

  • Lee, Min-Jin;Cho, Mee-Ryoung;Ko, Jae-Kyu;Kim, Ju-Hyun
    • Journal of the Korean Institute of Illuminating and Electrical Installation Engineers / v.27 no.9 / pp.23-35 / 2013
  • In this study, we collected emotion vocabulary related to LED system lighting, both newly gathered terms and existing terms from previous research, and selected 35 emotion terms using the KJ method with a panel of professionals. Targeting Eastern and Western participants, we compared emotion evaluations while varying lighting elements across four spaces with different behavior patterns: an office, a meeting room, a lounge, and an OA room. Consistent with existing emotion research, three factor axes were derived: "futurity/functionality," "stability," and "activity." Comparing the Eastern and Western results, Easterners rated the functional aspects of the LED system lighting highest for the office, meeting room, and lounge, whereas Westerners showed a difference, placing greater weight on stability. Based on these results, we intend to use the findings to evaluate emotional responses to lighting elements in actual test beds.

Dynamic Emotion Classification through Facial Recognition (얼굴 인식을 통한 동적 감정 분류)

  • Han, Wuri;Lee, Yong-Hwan;Park, Jeho;Kim, Youngseop
    • Journal of the Semiconductor & Display Technology / v.12 no.3 / pp.53-57 / 2013
  • Human emotions are expressed in various ways: through language, facial expressions, and gestures. The facial expression in particular carries much information about human emotion. These vague human emotions appear not as a single emotion but as a combination of various emotions. This paper proposes an emotional expression algorithm using the Active Appearance Model (AAM) and the Fuzzy k-Nearest Neighbor (k-NN) classifier, which represents facial expressions in a way that matches such vague human emotions. Applying the Mahalanobis distance to the class centers, the algorithm determines the degree of inclusion between the center class and each class, and this inclusion level expresses the intensity of each emotion. The resulting emotion recognition system can recognize complex emotions using the Fuzzy k-NN classifier.
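
The fuzzy classification step can be illustrated with a standard Keller-style fuzzy k-NN membership computation, which returns a degree of inclusion per class rather than a hard label. This sketch uses Euclidean distance and toy feature vectors; the paper applies the Mahalanobis distance to class centers, so treat this purely as an illustration of graded memberships.

```python
import numpy as np

def fuzzy_knn(query, X, y, k=3, m=2.0):
    """Fuzzy k-NN: return a membership degree per class, so a face can be
    e.g. mostly 'happy' with a residual 'sad' component. Distance weighting
    uses exponent 2/(m-1); m=2 gives inverse-squared-distance weights."""
    d = np.linalg.norm(X - query, axis=1)
    idx = np.argsort(d)[:k]                        # k nearest neighbors
    w = 1.0 / np.maximum(d[idx], 1e-9) ** (2.0 / (m - 1.0))
    memberships = {}
    for c in set(y):
        mask = np.array([y[i] == c for i in idx])
        memberships[c] = float(w[mask].sum() / w.sum())
    return memberships

# Toy 2-D feature vectors (standing in for AAM expression parameters).
X = np.array([[0.0, 0.0], [0.1, 0.1], [1.0, 1.0], [1.1, 0.9]])
y = ["happy", "happy", "sad", "sad"]
print(fuzzy_knn(np.array([0.2, 0.2]), X, y, k=3))
```

The memberships sum to 1, so they can be read directly as the emotion intensities the abstract describes.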

Emotion Recognition Implementation with Multimodalities of Face, Voice and EEG

  • Udurume, Miracle;Caliwag, Angela;Lim, Wansu;Kim, Gwigon
    • Journal of information and communication convergence engineering / v.20 no.3 / pp.174-180 / 2022
  • Emotion recognition is an essential component of complete interaction between human and machine. The difficulty of emotion recognition stems from the different forms in which emotion is expressed, such as visual, sound, and physiological signals. Recent advancements in the field show that combined modalities, such as visual, voice, and electroencephalography signals, lead to better results than single modalities used separately. Previous studies have explored multiple modalities for accurate prediction of emotion; however, the number of studies on real-time implementation is limited because of the difficulty of running multiple emotion recognition modalities simultaneously. In this study, we propose an emotion recognition system for real-time implementation. Our model is built with a multithreading block that runs each modality in a separate thread for continuous synchronization. We first achieved emotion recognition for each modality separately before enabling the multithreaded system. To verify the correctness of the results, we compared the accuracy of unimodal and multimodal emotion recognition in real time. The experimental results demonstrated real-time user emotion recognition by the proposed model and confirmed the effectiveness of multimodality: the multimodal model obtained an accuracy of 80.1%, compared to the unimodal models' accuracies of 70.9%, 54.3%, and 63.1%.
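
The multithreading-with-fusion idea can be sketched as one worker thread per modality pushing predictions into a shared queue, followed by weighted late fusion of the per-modality class probabilities. The stand-in predictors and fusion weights below are hypothetical placeholders, not the paper's trained models.

```python
import queue
import threading

def modality_worker(name, predict, frames, out_q):
    """Each modality (face, voice, EEG) runs in its own thread and pushes
    (modality, class-probabilities) results onto a shared queue."""
    for frame in frames:
        out_q.put((name, predict(frame)))

def fuse(per_modality, weights):
    """Weighted late fusion: pick the class with the highest weighted sum
    of per-modality probabilities."""
    classes = next(iter(per_modality.values())).keys()
    return max(classes, key=lambda c: sum(
        weights[m] * p[c] for m, p in per_modality.items()))

# Hypothetical stand-in predictors -- real ones would be trained models.
predictors = {
    "face":  lambda x: {"happy": 0.7, "sad": 0.3},
    "voice": lambda x: {"happy": 0.4, "sad": 0.6},
    "eeg":   lambda x: {"happy": 0.6, "sad": 0.4},
}
out_q = queue.Queue()
threads = [threading.Thread(target=modality_worker,
                            args=(name, fn, [None], out_q))
           for name, fn in predictors.items()]
for t in threads: t.start()
for t in threads: t.join()

results = dict(out_q.get() for _ in predictors)
print(fuse(results, {"face": 0.5, "voice": 0.2, "eeg": 0.3}))  # happy
```

The shared queue is what keeps the modalities synchronized: each thread produces results at its own rate, and the fusion step consumes them as a batch.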

Sijo Literature Therapeutic Research on the Structuring of Emotion-DNA

  • Park, In-Kwa
    • International Journal of Advanced Culture Technology / v.5 no.1 / pp.26-31 / 2017
  • In this study, Emotion-DNA is constructed in the same way that human DNA constructs the human body, and it is copied and translated in the same way that human DNA copies and translates itself. We attempted to embody the mind through Emotion-DNA, analogous to the symbols "A, T, G, C, U" that make up the chromosomes of the human body. The result is a diagram of the flow of emotions that literary works set in motion in the human body. These schemes present new directions for the therapeutic analysis of literary works and for the creation of therapeutic literature. In this study, we analyzed nominal emotion-syllables as a framework for the structuring of emotional DNA. As a result, through this structuring, we judged that the therapeutic action on the human body contained in the Rated Sijo can be more concrete and powerful than that of works in other genres.

Mutant Emotion Coded by Sijo

  • Park, Inkwa
    • International Journal of Advanced Culture Technology / v.7 no.2 / pp.188-194 / 2019
  • Emotion is always mutating; this is a principle of literary treatment. In literature, sadness is not simply sadness, and a 'loving emotion' is not simply a 'loving emotion.' Even as we love, loving is sadness, and loving is crying; this crying in turn becomes love. This study demonstrates mutant emotion in a form that could be coded for deep-learning AI. We explored the Sijo "Streams that cried last night," because it was well suited to studying mutant emotion. The result was that this Sijo encodes mutant emotion: almost continuously, sadness codes were spawned and concentrated, so the Sijo builds the emotion of love out of sadness. If this line of study continues, we believe our lives can become much happier and the methods of literary therapy can be further improved.

Classification of Three Different Emotion by Physiological Parameters

  • Jang, Eun-Hye;Park, Byoung-Jun;Kim, Sang-Hyeob;Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea / v.31 no.2 / pp.271-279 / 2012
  • Objective: This study classified three different emotional states (boredom, pain, and surprise) using physiological signals. Background: Emotion recognition studies have tried to recognize human emotion from physiological signals; such recognition is important for applying emotion detection in human-computer interaction systems. Method: 122 college students participated in this experiment. Three different emotional stimuli were presented to the participants, and physiological signals, i.e., EDA (electrodermal activity), SKT (skin temperature), PPG (photoplethysmogram), and ECG (electrocardiogram), were measured for 1 minute as a baseline and for 1~1.5 minutes during the emotional state. The signals were analyzed over 30 seconds of both the baseline and the emotional state, and 27 features were extracted. Statistical analysis for emotion classification was performed by discriminant function analysis (DFA, SPSS 15.0) using the difference values obtained by subtracting the baseline values from the emotional-state values. Results: Physiological responses during emotional states differed significantly from those during baseline, and the emotion classification accuracy was 84.7%. Conclusion: Our study identified that emotions can be classified from various physiological signals. Future work should obtain additional signals from other modalities, such as facial expression, face temperature, or voice, to improve the classification rate, and should examine the stability and reliability of this result by comparing it with the accuracy of emotion classification using other algorithms. Application: This work can help emotion recognition studies recognize various human emotions using physiological signals and can be applied to human-computer interaction systems for emotion recognition. It can also be useful in developing emotion theory, profiling emotion-specific physiological responses, and establishing the basis for emotion recognition systems in human-computer interaction.
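
The baseline-subtraction step described in the Method can be sketched as follows; a nearest-centroid classifier stands in for the paper's discriminant function analysis, and all physiological values are illustrative toy numbers, not the study's data.

```python
import numpy as np

def difference_features(baseline, emotional):
    """Classify on (emotional - baseline) differences, removing each
    participant's resting-state offset, as in the study's DFA input."""
    return emotional - baseline

# Toy data: rows are participants, columns are physiological features
# (e.g. mean EDA, SKT, heart rate) -- values are illustrative only.
baseline = np.array([[2.0, 33.0, 70.0], [2.2, 33.5, 72.0], [1.9, 32.8, 69.0]])
surprise = np.array([[3.5, 32.5, 85.0], [3.8, 33.0, 88.0], [3.3, 32.2, 84.0]])
boredom  = np.array([[2.1, 33.2, 68.0], [2.2, 33.6, 70.0], [2.0, 33.0, 67.0]])

centroids = {
    "surprise": difference_features(baseline, surprise).mean(axis=0),
    "boredom":  difference_features(baseline, boredom).mean(axis=0),
}

def classify(base, emo):
    """Nearest-centroid stand-in for the paper's discriminant analysis."""
    d = difference_features(base, emo)
    return min(centroids, key=lambda c: np.linalg.norm(d - centroids[c]))

print(classify(np.array([2.1, 33.0, 71.0]), np.array([3.6, 32.4, 86.0])))
```

Subtracting the baseline first is the key design choice: it makes the classifier compare emotional *reactivity* across participants rather than their absolute physiological levels.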