• Title/Summary/Keyword: expression of emotions


Emotion Adjustment Method for Diverse Expressions of Same Emotion Depending on Each Character's Characteristics (캐릭터 성격에 따른 동일 감정 표현의 다양화를 위한 감정 조정 방안)

  • Lee, Chang-Sook;Um, Ky-Hyun;Cho, Kyung-Eun
    • Journal of Korea Game Society / v.10 no.2 / pp.37-47 / 2010
  • Along with language, emotion is an effective means of expression. By expressing our emotions in addition to speaking, we can deliver our message better. Because each person expresses the same emotion differently, expression is a useful gauge of individual personality. To avoid monotonous emotional expression in virtual characters, it is therefore necessary to adjust the creation and deletion of the same emotion depending on each character's personality. This paper attempts to define the personality characteristics that affect each emotion and proposes a method to adjust the emotions. The relationship between a particular emotion and personality characteristics is defined by matching the significance of specified personality characteristics with their lexical meaning. In addition, using the Raw Score, the weighted values needed for the adjustment, continuance, and deletion of each emotion are defined, and the emotion is adjusted accordingly. When the same emotion was adjusted using actual personality test data, different results were observed by personality. The work uses the NEO Personality Inventory (NEO-PI), which consists of 5 broad domains and 30 sub-domains.
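
The adjustment the abstract describes — scaling, keeping, or deleting the same base emotion according to personality — can be sketched as follows. The trait names follow NEO-PI domains, but the weights, threshold, and formula are illustrative assumptions, not the paper's Raw-Score method.

```python
# Hypothetical sketch: personality-weighted adjustment of one emotion.
# Trait weights and the deletion threshold are illustrative assumptions.

def adjust_emotion(base_intensity, trait_scores, trait_weights, threshold=0.1):
    """Scale a raw emotion intensity by personality traits; delete it below threshold."""
    weight = sum(trait_scores.get(t, 0.0) * w for t, w in trait_weights.items())
    adjusted = base_intensity * (1.0 + weight)
    adjusted = max(0.0, min(1.0, adjusted))          # clamp to [0, 1]
    return adjusted if adjusted >= threshold else 0.0  # "deletion" of a weak emotion

# Two characters feel the same event (base anger 0.5) differently.
anger_weights = {"neuroticism": 0.6, "agreeableness": -0.4}
calm = {"neuroticism": 0.2, "agreeableness": 0.9}
volatile = {"neuroticism": 0.9, "agreeableness": 0.1}
```

Under these made-up weights the volatile character ends up with visibly stronger anger than the calm one, and an emotion that falls below the threshold is deleted outright — the diversification of the same emotion the paper aims at.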

Children's Interpretation of Facial Expression onto Two-Dimension Structure of Emotion (정서의 이차원 구조에서 유아의 얼굴표정 해석)

  • Shin, Young-Suk;Chung, Hyun-Sook
    • Korean Journal of Cognitive Science / v.18 no.1 / pp.57-68 / 2007
  • This study explores children's categories of emotion understanding from facial expressions on a two-dimensional structure of emotion. Eighty-nine children aged 3 to 5 were asked to match facial expressions to fourteen emotion terms. The facial expressions used in the experiment were photographs rated by 54 university students on a nine-point scale for the degree of expression on each of the two dimensions (pleasure-displeasure and arousal-sleep). The results showed that children were more stable on the arousal dimension than on the pleasure-displeasure dimension. Sadness, sleepiness, anger, and surprise were understood very well on the two dimensions, but fear and boredom showed instability on the pleasure-displeasure dimension. In particular, 3-year-old children perceived the degree of arousal-sleep better than that of pleasure-displeasure.
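
The two-dimensional structure the study uses can be made concrete with coordinates: each emotion term sits at a (pleasure, arousal) point, and an expression is interpreted as the nearest term. The coordinates below are illustrative, not the rated photographs; note how fear and anger nearly coincide on the pleasure-displeasure axis, which mirrors why that dimension is harder to judge.

```python
import math

# Hypothetical (pleasure, arousal) coordinates, each in [-1, 1].
EMOTION_SPACE = {
    "happiness": (0.8, 0.6),
    "sadness": (-0.7, -0.3),
    "anger": (-0.6, 0.8),
    "sleepiness": (0.0, -0.9),
    "fear": (-0.6, 0.7),
    "boredom": (-0.3, -0.6),
}

def nearest_emotion(pleasure, arousal):
    """Interpret a rated facial expression as the closest emotion term in 2-D."""
    return min(EMOTION_SPACE,
               key=lambda e: math.dist((pleasure, arousal), EMOTION_SPACE[e]))
```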


Effect of emotional expression method type on subjective happiness

  • Kim, Jungae
    • International Journal of Advanced Culture Technology / v.9 no.4 / pp.126-134 / 2021
  • This study was a cross-sectional research study that analyzed the effect of emotional expression type on happiness level with structured questionnaires. A total of 186 participants, from their 20s to their 60s, took part. Data were collected from September 1 to 15, 2021, from those who voluntarily agreed to the study. The collected data were subjected to frequency analysis, t-tests, and regression analysis with SPSS 18.0. The analysis found that people who perform superficial acting, displaying emotions they do not actually feel, were less satisfied with their current life (t=-10.482, p<0.01), while people who express their genuine inner feelings showed high levels of happiness. These results suggest that improving happiness requires expressing emotion actively and truthfully in one's physical environment.
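
The t statistic the study reports compares two independent groups' satisfaction scores. A minimal version of that comparison (Welch's t, which does not assume equal variances) can be written in plain Python; the scores below are made up for illustration, not the study's data.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Illustrative life-satisfaction scores (not the study's data):
surface_acting = [2.1, 2.4, 1.9, 2.6, 2.2]   # displayed but unfelt emotion
genuine_expression = [3.8, 4.1, 3.6, 3.9, 4.2]
t = welch_t(surface_acting, genuine_expression)  # strongly negative, like t = -10.482
```

A strongly negative t, as in the paper, means the surface-acting group's mean sits well below the other group's relative to sampling error.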

Video Expression Recognition Method Based on Spatiotemporal Recurrent Neural Network and Feature Fusion

  • Zhou, Xuan
    • Journal of Information Processing Systems / v.17 no.2 / pp.337-351 / 2021
  • Automatically recognizing facial expressions in video sequences is challenging because there is little direct correlation between facial features and subjective emotions in video. To overcome this, a video facial expression recognition method using a spatiotemporal recurrent neural network and feature fusion is proposed. First, the video is preprocessed, and a double-layer cascade structure is used to detect a face in each video image. Two deep convolutional neural networks then extract the temporal-domain and spatial-domain facial features from the video: a spatial convolutional neural network extracts spatial information features from each frame of the static expression images, and a temporal convolutional neural network extracts dynamic information features from the optical-flow information across multiple frames of expression images. A multiplicative fusion is performed on the spatiotemporal features learned by the two networks. Finally, the fused features are input to a support vector machine to perform the facial expression classification task. Experimental results on the eNTERFACE, RML, and AFEW6.0 datasets show that the proposed method achieves recognition rates as high as 88.67%, 70.32%, and 63.84%, respectively, and comparative experiments show higher recognition accuracy than other recently reported methods.
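
The multiplicative fusion step can be sketched independently of the networks: assuming each stream emits a feature vector per clip, the fused feature is their element-wise product, renormalized before it is handed to the classifier. The vectors and the renormalization choice below are illustrative, not the paper's exact pipeline.

```python
# Sketch of multiplicative feature fusion for two-stream video features.

def l2_normalize(v):
    """Scale a vector to unit length (no-op for the zero vector)."""
    n = sum(x * x for x in v) ** 0.5
    return [x / n for x in v] if n else v

def multiplicative_fusion(spatial_feat, temporal_feat):
    """Element-wise product of spatial (per-frame) and temporal (optical-flow) features."""
    assert len(spatial_feat) == len(temporal_feat)
    return l2_normalize([s * t for s, t in zip(spatial_feat, temporal_feat)])

# Toy feature vectors standing in for the two CNN outputs:
fused = multiplicative_fusion([0.5, 0.1, 0.8], [0.4, 0.9, 0.2])
# `fused` would then be passed to an SVM for expression classification.
```

Multiplicative fusion keeps only dimensions where both streams respond, which is one reason it can outperform simple concatenation for correlated modalities.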

Life-like Facial Expression of Mascot-Type Robot Based on Emotional Boundaries (감정 경계를 이용한 로봇의 생동감 있는 얼굴 표정 구현)

  • Park, Jeong-Woo;Kim, Woo-Hyun;Lee, Won-Hyong;Chung, Myung-Jin
    • The Journal of Korea Robotics Society / v.4 no.4 / pp.281-288 / 2009
  • Nowadays, many robots have evolved to imitate human social skills so that sociable interaction with humans is possible. Socially interactive robots require abilities different from those of conventional robots; for instance, human-robot interactions are accompanied by emotion, much like human-human interactions. Robot emotional expression is thus very important for humans. This is particularly true of facial expressions, which play an important role in communication among other non-verbal forms. In this paper, we introduce a method of creating lifelike facial expressions in robots using variations of the affect values that constitute the robot's emotions, based on emotional boundaries. The proposed method was examined in experiments with two facial-robot simulators.
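
The idea of emotional boundaries can be sketched as follows: each expression owns an interval on a continuous affect axis, so small variations of the affect value animate the face subtly while crossing a boundary switches the displayed expression. The one-dimensional axis, intervals, and expression names are illustrative assumptions, not the paper's model.

```python
# Hypothetical boundaries on a 1-D affect axis in [-1, 1].
EMOTION_BOUNDARIES = [
    (-1.0, -0.3, "sad"),
    (-0.3, 0.3, "neutral"),
    (0.3, 1.0, "happy"),
]

def expression_for(affect):
    """Map a continuous affect value to a facial expression via its boundary."""
    for low, high, expr in EMOTION_BOUNDARIES:
        if low <= affect < high:
            return expr
    return "happy" if affect >= 1.0 else "sad"  # clamp values outside the axis

# Varying affect inside one interval drives subtle motion of the same
# expression; crossing a boundary (e.g. 0.3) changes the expression itself.
```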


Emotion Recognition and Expression System of User using Multi-Modal Sensor Fusion Algorithm (다중 센서 융합 알고리즘을 이용한 사용자의 감정 인식 및 표현 시스템)

  • Yeom, Hong-Gi;Joo, Jong-Tae;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.18 no.1 / pp.20-26 / 2008
  • As intelligent robots and computers become increasingly common, interaction between them and humans grows ever more important, and emotion recognition and expression are indispensable for such interaction. In this paper, we first extract emotional features from speech signals and facial images. We then apply Bayesian Learning (BL) and Principal Component Analysis (PCA) to classify five emotion patterns: normal, happy, anger, surprise, and sad. To enhance the emotion recognition rate, we experiment with both decision fusion and feature fusion. In decision fusion, the output values of each recognition system are combined using fuzzy membership functions. In feature fusion, superior features are selected through the Sequential Forward Selection (SFS) method and fed to a neural network based on a Multi-Layer Perceptron (MLP) to classify the five emotion patterns. The recognized result is then applied to a 2D facial model to express the emotion.
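
The decision-fusion branch can be sketched without the underlying classifiers: each modality yields a score per emotion, and the fused decision takes the emotion with the best combined score. The scores and the weighted-average rule below are an illustrative stand-in for the paper's fuzzy membership functions.

```python
# Sketch of decision fusion over the paper's five emotion patterns.
EMOTIONS = ["normal", "happy", "anger", "surprise", "sad"]

def fuse_decisions(speech_scores, face_scores, w_speech=0.5):
    """Combine per-emotion scores from two modalities and pick the winner."""
    fused = {e: w_speech * speech_scores[e] + (1 - w_speech) * face_scores[e]
             for e in EMOTIONS}
    return max(fused, key=fused.get)

# Toy per-modality scores (would come from the speech and face recognizers):
speech = {"normal": 0.1, "happy": 0.6, "anger": 0.1, "surprise": 0.1, "sad": 0.1}
face = {"normal": 0.2, "happy": 0.3, "anger": 0.1, "surprise": 0.3, "sad": 0.1}
```

Here speech is confident about "happy" while the face is ambiguous, so the fused decision follows the more confident modality — the behavior decision fusion is meant to provide.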

AI Chatbot-Based Daily Journaling System for Eliciting Positive Emotions (긍정적 감정 유발을 위한 AI챗봇기반 일기 작성 시스템)

  • Jun-Hyeon Kim;Mi-Kyeong Moon
    • The Journal of the Korea institute of electronic communication sciences / v.19 no.1 / pp.105-112 / 2024
  • In contemporary society, the expression of emotions and self-reflection are considered pivotal factors with a positive impact on stress management and mental well-being, thereby highlighting the significance of journaling. However, traditional journaling methods have posed challenges for many individuals due to constraints in terms of time and space. Recent rapid advancements in chatbot and emotion analysis technologies have garnered significant attention as essential tools to address these issues. This paper introduces an artificial intelligence chatbot that integrates the GPT-3 model and emotion analysis technology, detailing the development process of a system that automatically generates journals based on users' chat data. Through this system, users can engage in journaling more conveniently and efficiently, fostering a deeper understanding of their emotions and promoting positive emotional experiences.
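
The journaling flow the abstract describes — collect a day's chat messages, summarize them, and emit a dated entry — can be sketched as below. In the paper the summarization is done by GPT-3; `summarize` here is a hypothetical placeholder for that model call, not a real API.

```python
from datetime import date

def summarize(messages):
    """Placeholder for the language-model call that condenses the chat into prose."""
    return " ".join(messages)  # a real system would call a model such as GPT-3 here

def make_journal_entry(user_messages, entry_date=None):
    """Turn one day's chat messages into a dated journal entry."""
    entry_date = entry_date or date.today()
    return f"{entry_date.isoformat()}\n{summarize(user_messages)}"

entry = make_journal_entry(["Had a calm morning.", "Finished my report."],
                           date(2024, 1, 5))
```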

On the Implementation of a Facial Animation Using the Emotional Expression Techniques (FAES : 감성 표현 기법을 이용한 얼굴 애니메이션 구현)

  • Kim Sang-Kil;Min Yong-Sik
    • The Journal of the Korea Contents Association / v.5 no.2 / pp.147-155 / 2005
  • In this paper, we present FAES (a Facial Animation with Emotion and Speech) system for speech-driven face animation with emotions. We animate cartoon faces not only from the input speech but also from emotions derived from the speech signal, and our system ensures smooth transitions and exact representation in the animation. To do this, after collecting training data, we built a database using a Support Vector Machine (SVM) to recognize four categories of emotion: neutral, dislike, fear, and surprise, enabling speech-driven animation with emotions. The system was trained on young Koreans and focuses only on Korean emotional facial expressions. Experimental results demonstrate that more emotional areas are expanded and that the accuracies of emotion recognition and continuous speech recognition increase by 7% and 5%, respectively, compared with the previous method.
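
The classification step can be sketched as mapping a speech-derived feature vector to one of the four emotion categories. The paper uses an SVM; a nearest-centroid classifier stands in for it here, and both the two-dimensional features and the centroid values are illustrative assumptions.

```python
import math

# Hypothetical per-emotion centroids in a toy 2-D speech-feature space
# (e.g. normalized pitch and energy); a trained SVM would replace this.
CENTROIDS = {
    "neutral": [0.2, 0.3],
    "dislike": [0.7, 0.2],
    "fear": [0.8, 0.8],
    "surprise": [0.3, 0.9],
}

def classify_emotion(feature_vec):
    """Assign a speech feature vector to the nearest emotion centroid."""
    return min(CENTROIDS, key=lambda e: math.dist(feature_vec, CENTROIDS[e]))
```

The predicted label would then select the emotion channel driving the cartoon face alongside the lip-sync from speech.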


Effects of Korean Marine Police's Emotive Dissonance on Job Burnout: Focused on Moderating Effects of Emotional Intelligence (해양경찰공무원의 감정적 부조화와 직무소진의 영향관계: 감정지능의 조절효과 분석)

  • Lim, You-Seok
    • Journal of the Korean Society of Marine Environment & Safety / v.22 no.4 / pp.328-334 / 2016
  • The everyday life of police officers requires them to face numerous criminal acts and enter compromising crime scenes, where officers may be compelled to suppress personal expressions of negative emotion. Such negative emotions can undermine members' confidence in effective job performance and their contribution to duties within the marine police force, so controlling the emotive dissonance of organization members greatly helps the development of the organization. This study focuses on emotional dissonance among marine police officers to verify its impact on job burnout and to consider the moderating effect of emotional intelligence. The results are as follows. First, in the relationship between the sub-factors of emotional dissonance and job burnout, marine police did not feel emotionally jarred because they consciously tried to abstain from emotional engagement, but this also reduced emotional intelligence related to desired emotions. Second, the emotional intelligence of the marine police had a significant impact on job burnout. Third, emotional intelligence was confirmed to have a statistically significant moderating effect on the relationship between emotional dissonance and job burnout among marine police officials. Finally, a comparison of direct and moderated effects showed that greater emotional intelligence altered how emotional dissonance led to job burnout among marine police.

The relationship between autistic features and empathizing-systemizing traits (자폐성향과 공감-체계화능력 간의 관계)

  • Cho, Kyung-Ja;Kim, Jung-K.
    • Science of Emotion and Sensibility / v.14 no.2 / pp.245-256 / 2011
  • This study consists of two sections examining the relationship between autistic features and empathizing-systemizing traits. In the first section, 355 university students' EQ, SQ-R, and AQ were measured. AQ was found to be negatively correlated with EQ and with the D score (the relative difference between an individual's EQ and SQ-R), but not significantly related to SQ-R; that is, a subject has a high AQ if his EQ is relatively lower than his SQ-R. In the second section, the subjects were divided into two groups based on their AQ scores: those with a tendency toward autism and those without. The test measured how the two groups differed in facial expression recognition according to the tendency toward autism, the facial area presented (whole face, eyes alone, mouth alone), and the type of emotion (basic and complex). Subjects with a tendency toward autism were worse at judging facial expressions than those without. Subjects also judged better with basic emotions than with complex emotions, and with the whole face than with eyes alone or mouth alone. In the eyes-alone condition especially, subjects with a tendency toward autism were worse at judging facial expressions. This study suggests that empathizing traits and facial expression recognition are related to the tendency toward autism.
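
The D score and the reported correlations can be sketched numerically. The exact standardization the paper (and the EQ/SQ-R literature) uses is not given here, so the z-score difference below and all the scores are illustrative assumptions; the sketch only shows the shape of the result: AQ correlating negatively with D.

```python
import math
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

def d_scores(eqs, sqrs):
    """Relative difference between EQ and SQ-R, each z-scored over the sample
    (an illustrative version of the paper's D score)."""
    ze = [(e - mean(eqs)) / stdev(eqs) for e in eqs]
    zs = [(s - mean(sqrs)) / stdev(sqrs) for s in sqrs]
    return [e - s for e, s in zip(ze, zs)]

# Made-up scores in which higher AQ pairs with lower EQ relative to SQ-R:
aq = [12, 18, 25, 31, 38]
eq = [55, 48, 40, 33, 25]
sqr = [60, 62, 61, 63, 64]
d = d_scores(eq, sqr)
r = pearson_r(aq, d)  # strongly negative, matching the reported AQ-D relation
```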
