• Title/Summary/Keyword: expression of emotions


Emotion Recognition based on Tracking Facial Keypoints (얼굴 특징점 추적을 통한 사용자 감성 인식)

  • Lee, Yong-Hwan;Kim, Heung-Jun
    • Journal of the Semiconductor & Display Technology
    • /
    • v.18 no.1
    • /
    • pp.97-101
    • /
    • 2019
  • Understanding and classifying human emotion is an important task in human-machine communication systems. This paper proposes a novel emotion recognition method that extracts facial keypoints using an Active Appearance Model together with a proposed classification model of the facial features. The appearance-model scheme captures variations in expression, which the proposed classification model evaluates according to changes in the human facial expression. The proposed method classifies four basic emotions (normal, happy, sad, and angry). To evaluate its performance, we measure the success rate on common datasets, achieving a best accuracy of 93% and an average of 82.2% in facial emotion recognition. The results show that the proposed method performs well on emotion recognition compared to existing schemes.
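The pipeline this abstract describes — track facial keypoints, derive expression-variation features, then classify into four basic emotions — can be sketched roughly as follows. This is an illustrative assumption, not the paper's actual model: the displacement features and the nearest-centroid classifier are stand-ins for the AAM-based features and the proposed classification model.

```python
import numpy as np

EMOTIONS = ["normal", "happy", "sad", "angry"]

def keypoint_features(keypoints, neutral_keypoints):
    """Feature vector = displacement of tracked keypoints from the
    neutral-face configuration (a stand-in for AAM shape variation)."""
    return (np.asarray(keypoints, float) - np.asarray(neutral_keypoints, float)).ravel()

def classify(features, centroids):
    """Nearest-centroid classifier over per-emotion mean feature vectors."""
    dists = {emo: np.linalg.norm(features - c) for emo, c in centroids.items()}
    return min(dists, key=dists.get)
```

In practice the per-emotion centroids would be the mean feature vectors computed from a labeled training set.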

An Intelligent Emotion Recognition Model Using Facial and Bodily Expressions

  • Jae Kyeong Kim;Won Kuk Park;Il Young Choi
    • Asia pacific journal of information systems
    • /
    • v.27 no.1
    • /
    • pp.38-53
    • /
    • 2017
  • As sensor technologies and image processing technologies make collecting information on users' behavior easy, many researchers have examined automatic emotion recognition based on facial expressions, body expressions, and tone of voice, among others. In multimodal settings using facial and body expressions, most studies have relied on conventional cameras, and thus on a limited amount of information, because conventional cameras generally produce only two-dimensional images. In the present research, we propose an artificial neural network-based model using a high-definition webcam and Kinect to recognize users' emotions from facial and bodily expressions while they watch a movie trailer. We validate the proposed model in a naturally occurring field environment rather than in an artificially controlled laboratory environment. The results of this research will be helpful for the wide use of emotion recognition models in advertisements, exhibitions, and interactive shows.
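The bimodal classification this abstract outlines — concatenating facial and bodily feature vectors and feeding them to a neural network — can be sketched with a single linear layer. The feature dimensions, weights, and label set below are hypothetical; the authors' actual network architecture is not specified here.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over class scores."""
    e = np.exp(z - z.max())
    return e / e.sum()

def predict_emotion(facial_feats, body_feats, W, b, labels):
    """Concatenate facial and body feature vectors (the bimodal input
    described in the abstract) and classify with one linear layer."""
    x = np.concatenate([facial_feats, body_feats])
    return labels[int(np.argmax(softmax(W @ x + b)))]
```

A trained model would replace `W` and `b` with learned parameters; deeper layers would follow the same concatenate-then-forward pattern.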

Emotion Recognition and Expression Method using Bi-Modal Sensor Fusion Algorithm (다중 센서 융합 알고리즘을 이용한 감정인식 및 표현기법)

  • Joo, Jong-Tae;Jang, In-Hun;Yang, Hyun-Chang;Sim, Kwee-Bo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.13 no.8
    • /
    • pp.754-759
    • /
    • 2007
  • In this paper, we propose the Bi-Modal Sensor Fusion Algorithm, an emotion recognition method that classifies four emotions (happy, sad, angry, surprise) by using facial images and speech signals together. We extract feature vectors from the speech signal using acoustic features without language features and classify the emotional pattern using a neural network. From the facial image, we select features of the mouth, eyes, and eyebrows, and the extracted feature vectors are reduced to low-dimensional feature vectors by Principal Component Analysis (PCA). We then propose a method that fuses the emotion recognition results obtained from the facial image and the speech signal.
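The PCA dimensionality-reduction step in this abstract can be sketched as follows. This is a generic PCA projection with assumed dimensions, not the authors' actual facial-feature extraction.

```python
import numpy as np

def pca_reduce(X, k):
    """Project feature vectors X (n_samples x n_features) onto the top-k
    principal components, as in the abstract's facial-feature reduction."""
    Xc = X - X.mean(axis=0)                      # center the data
    cov = np.cov(Xc, rowvar=False)               # feature covariance matrix
    vals, vecs = np.linalg.eigh(cov)             # symmetric eigendecomposition
    top = vecs[:, np.argsort(vals)[::-1][:k]]    # top-k eigenvectors
    return Xc @ top                              # low-dimensional features
```

The reduced facial features and the acoustic features would then each feed a classifier, whose outputs are fused into a single emotion decision.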

An Artificial Emotion Model for Expression of Game Character (감정요소가 적용된 게임 캐릭터의 표현을 위한 인공감정 모델)

  • Kim, Ki-Il;Yoon, Jin-Hong;Park, Pyoung-Sun;Kim, Mi-Jin
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2008.02b
    • /
    • pp.411-416
    • /
    • 2008
  • The development of games has brought about the birth of game characters that are visually very realistic. At present, there is much enthusiasm for giving characters emotions through such devices as avatars and emoticons. However, in the freely changing environment of a game, these devices merely express the value derived from a first input rather than creating expressions of emotion that actively respond to their surroundings; as such, there are as yet no displays of deep emotion among game characters. In light of this, the present article proposes the 'CROSS (Character Reaction on Specific Situation) Model AE Engine' for game characters, in order to develop characters that actively express action and emotion within the changing environment of games. This is accomplished by classifying the emotional components applicable to game characters based on the OCC model, one of the best-known cognitive psychological models. Game-play situations analyzed from a commercial RPG are then systematized with an ontology.


A Comparative Analysis on Facial Expression in Advertisements -By Utilising Facial Action Coding System(FACS) (광고 속의 얼굴 표정에 따른 비교 연구 -FACS를 활용하여)

  • An, Kyoung Hee
    • The Journal of the Korea Contents Association
    • /
    • v.19 no.3
    • /
    • pp.61-71
    • /
    • 2019
  • Due to the limited duration of advertisements, facial expressions, among the types of nonverbal communication, are especially expressive and convincing in appealing to customers. The purpose of this paper is to investigate both how facial expressions are portrayed and how they convey emotion in TV advertisements. The research subjects are the TV advertisements of and , which enjoyed wide popularity among customers and are known as among the most touching commercials. The research method is the Facial Action Coding System (FACS), based on the theoretical perspective of discrete emotions and designed to measure specific facial muscle movements. This research analyzes the implications of facial expressions in both TV ads by using FACS, which is grounded in psychology as well as anatomy. The results show that facial expressions portraying conflicting emotional states and the heroine's dramatic emotional relief could move customers' emotions more.

Emotional Intelligence, Academic Motivation, and Achievement among Health Science Students in Saudi Arabia: A Self-Deterministic Approach

  • Mahrous, Rasha Mohammed;Bugis, Bussma Ahmed;Sayed, Samiha Hamdi
    • Journal of Korean Academy of Nursing
    • /
    • v.53 no.6
    • /
    • pp.571-583
    • /
    • 2023
  • Purpose: This study used a self-deterministic approach to explore the relationship between emotional intelligence (EI), academic motivation (AM), and achievement among health science students. Methods: A descriptive cross-sectional study was conducted in three cities of Saudi Arabia (Dammam, Riyadh, and Jeddah). A convenience sample of 450 students was recruited using the multistage cluster sampling technique. The online survey contained three sections: students' basic data and academic achievement level, the modified Schutte self-report inventory, and the Academic Motivation Scale. Results: This study revealed moderate overall scores for EI (57.1%), AM (55.6%), and grade point average (GPA) (57.6%). The overall EI score, its domains, and GPA had significant positive correlations with overall AM and intrinsic and extrinsic motivation (p < .01). Amotivation had an insignificant correlation with GPA (p < .05), but it was negatively correlated with EI and its domains (p < .01). Multiple regression analysis showed that EI domains predicted 5.0% of GPA variance: emotions appraisal and expression (β = .02, p = .024), regulation (β = .11, p = .032), and utilization (β = .24, p < .01). EI domains also predicted 26.0% of AM variance: emotions appraisal and expression (β = .11, p = .04), regulation (β = .33, p < .01), and utilization (β = .23, p < .01). Moreover, AM predicted 4.0% of the variance in GPA: intrinsic (β = .25, p = .004) and extrinsic (β = .11, p = .022) motivation. AM also predicted 25.0% of the variance in EI: intrinsic (β = .34, p < .01) and extrinsic (β = .26, p = .026) motivation. Conclusion: EI and AM have a bidirectional influence on each other, significantly shaping the GPA of health science students in Saudi Arabia, with intrinsic motivation playing a predominant role. Thus, promoting students' AM and EI is recommended to foster their academic achievement.

A Generation Methodology of Facial Expressions for Avatar Communications (아바타 통신에서의 얼굴 표정의 생성 방법)

  • Kim Jin-Yong;Yoo Jae-Hwi
    • Journal of the Korea Society of Computer and Information
    • /
    • v.10 no.3 s.35
    • /
    • pp.55-64
    • /
    • 2005
  • An avatar can be used as an auxiliary method for text and image communication in cyberspace. An intelligent communication method can also be utilized to achieve real-time communication, where intelligently coded data (joint angles for arm gestures and action units for facial emotions) are transmitted instead of real or compressed pictures. In this paper, to complement arm and leg gestures, a method of generating facial expressions that represent the sender's emotions is provided. A facial expression can be represented by Action Units (AUs); we suggest a methodology for finding appropriate AUs in avatar models of various shapes and structures. To maximize the efficiency of emotional expression, a comic-style facial model having only eyebrows, eyes, a nose, and a mouth is employed. The generation of facial emotion animation with these parameters is also investigated.
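The AU-based expression generation described above can be sketched as an emotion-to-Action-Unit lookup. The AU sets below follow commonly cited FACS combinations (e.g. happiness ≈ AU6+AU12) and are illustrative assumptions, not the paper's actual mapping for its comic-style avatar.

```python
# Hypothetical mapping from basic emotions to FACS Action Units,
# following commonly cited combinations (Ekman & Friesen's FACS).
EMOTION_TO_AUS = {
    "happiness": [6, 12],        # cheek raiser + lip corner puller
    "sadness":   [1, 4, 15],     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  [1, 2, 5, 26],  # brow raisers + upper lid raiser + jaw drop
}

def aus_for(emotion, intensity=1.0):
    """Return (AU, activation) pairs for an avatar's animation rig;
    intensity scales each activation in [0, 1]."""
    return [(au, intensity) for au in EMOTION_TO_AUS.get(emotion, [])]
```

An avatar renderer would translate each (AU, activation) pair into deformations of the eyebrow, eye, and mouth shapes of the comic-style model.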


Korean Emotional Speech and Facial Expression Database for Emotional Audio-Visual Speech Generation (대화 영상 생성을 위한 한국어 감정음성 및 얼굴 표정 데이터베이스)

  • Baek, Ji-Young;Kim, Sera;Lee, Seok-Pil
    • Journal of Internet Computing and Services
    • /
    • v.23 no.2
    • /
    • pp.71-77
    • /
    • 2022
  • In this paper, a database is collected for extending a speech synthesis model to one that synthesizes speech according to emotion and generates facial expressions. The database is divided into male and female data and consists of emotional speech and facial expressions. Two professional actors of different genders speak sentences in Korean. The sentences are divided into four emotions: happiness, sadness, anger, and neutrality, and each actor performs about 3,300 sentences per emotion. The 26,468 sentences collected by filming do not overlap and contain expressions that match the corresponding emotion. Since building a high-quality database is important for the performance of future research, the database is assessed on emotional category, intensity, and genuineness. To determine accuracy according to data modality, the database is divided into audio-video data, audio data, and video data.

Color and Blinking Control to Support Facial Expression of Robot for Emotional Intensity (로봇 감정의 강도를 표현하기 위한 LED 의 색과 깜빡임 제어)

  • Kim, Min-Gyu;Lee, Hui-Sung;Park, Jeong-Woo;Jo, Su-Hun;Chung, Myung-Jin
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2008.02a
    • /
    • pp.547-552
    • /
    • 2008
  • Humans and robots will have a closer relationship in the future, and we can expect the interaction between them to become more intense. To take advantage of people's innate ability to communicate, researchers have so far concentrated on facial expression. But for a robot to express emotional intensity, other modalities such as gesture, movement, sound, and color are also needed. This paper suggests that the intensity of emotion can be expressed with color and blinking, so that the result can be applied to LEDs. Color and emotion are definitely related; however, previous results are difficult to implement due to the lack of quantitative data. In this paper, we determined colors and blinking periods to express the six basic emotions (anger, sadness, disgust, surprise, happiness, fear). They were implemented on an avatar, and the intensities of emotions were evaluated through a survey. We found that color and blinking helped express the intensity of emotion for sadness, disgust, and anger. For fear, happiness, and surprise, color and blinking did not play an important role; however, they may be improved by adjusting the color or blinking.
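The color-and-blinking control described above can be sketched as a lookup from emotion to an LED color and blink period, with intensity modulating the blink rate. The specific RGB values and periods below are assumptions for demonstration; the paper derives its own quantitative values from survey data.

```python
# Illustrative emotion -> (RGB color, base blink period in seconds) table.
# These values are hypothetical placeholders, not the paper's results.
EMOTION_DISPLAY = {
    "anger":   ((255, 0, 0), 0.3),
    "sadness": ((0, 0, 255), 1.5),
    "disgust": ((0, 128, 0), 1.0),
}

def led_signal(emotion, intensity):
    """Return (rgb, blink_period_seconds) for an LED display.
    Intensity is clamped to [0, 1]; stronger emotion -> faster blinking."""
    rgb, base_period = EMOTION_DISPLAY[emotion]
    intensity = min(max(intensity, 0.0), 1.0)
    # Full intensity halves the blink period (doubles the blink rate).
    return rgb, base_period / (1.0 + intensity)
```

A robot controller would feed the returned period into its LED driver's on/off timer while holding the color constant.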


The Effect of Emotional Labor and Emotional Dissonance on Burnout and Turnover Intention for the Hotel's Employee (호텔종사원의 감정노동에 따른 감정부조화가 소진 및 이직의도에 미치는 영향)

  • Ahn, Dae-Hee;Park, Jong-Chul
    • The Journal of the Korea Contents Association
    • /
    • v.9 no.9
    • /
    • pp.335-345
    • /
    • 2009
  • This paper aims 1) to identify the causes of emotional labor and emotional dissonance among hotel employees, 2) to investigate the relationship between burnout and turnover intention by personal characteristics, and 3) to suggest strategic implications for hotel management decision-makers. Questionnaires were distributed to 400 hotel employees, of which 351 were used for data analysis. The results are as follows. First, the higher the surface acting, deep acting, and emotional deviance in emotional labor, the higher the emotional dissonance; however, the higher the expression of natural emotions, the lower the emotional dissonance. Second, the higher the emotional dissonance, the higher the burnout and turnover intention.