• Title/Summary/Keyword: expression of emotions

Non-verbal Emotional Expressions for Social Presence of Chatbot Interface (챗봇의 사회적 현존감을 위한 비언어적 감정 표현 방식)

  • Kang, Minjeong
    • The Journal of the Korea Contents Association / v.21 no.1 / pp.1-11 / 2021
  • The users of a chatbot messenger can be better engaged in the conversation if they feel intimacy with the chatbot. This can be achieved by the chatbot's effective expression of human emotions to its users. Thus motivated, this study aims to identify the emotional expressions of a chatbot that make people feel its social presence. Background research showed that facial expression is the most effective channel for conveying emotion and that movement is important for relational immersion. In a survey, we prepared moving text, moving gestures, and still emoticons representing five emotions: happiness, sadness, surprise, fear, and anger. We then asked participants which form best conveyed the chatbot's social presence for each emotion. We found that for an arousing, pleasant emotion such as 'happiness', people most prefer moving gestures and text, while for unpleasant emotions such as 'sadness' and 'anger', they prefer emoticons. Lastly, for neutral emotions such as 'surprise' and 'fear', people tend to select moving text, which delivers a clear meaning. We expect these results to be useful for developing emotional chatbots that enable more effective conversations with users.
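
As an editorial aside, the modality preferences this abstract reports can be captured in a small lookup table. The sketch below is illustrative only; the mapping follows the stated findings, but the function name and fallback are assumptions, not the paper's implementation.

```python
# Hypothetical emotion-to-modality table following the study's findings:
# arousing/pleasant emotions favor moving gestures and text, unpleasant
# ones favor emoticons, and 'surprise'/'fear' favor moving text.
PREFERRED_MODALITY = {
    "happiness": ["moving gesture", "moving text"],
    "sadness":   ["still emoticon"],
    "anger":     ["still emoticon"],
    "surprise":  ["moving text"],
    "fear":      ["moving text"],
}

def choose_modality(emotion: str) -> str:
    # Fall back to plain text for emotions outside the five studied (assumed).
    return PREFERRED_MODALITY.get(emotion, ["plain text"])[0]

print(choose_modality("happiness"))  # -> "moving gesture"
```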

The Implementation and Analysis of Facial Expression Customization for a Social Robot (소셜 로봇의 표정 커스터마이징 구현 및 분석)

  • Jiyeon Lee;Haeun Park;Temirlan Dzhoroev;Byounghern Kim;Hui Sung Lee
    • The Journal of Korea Robotics Society / v.18 no.2 / pp.203-215 / 2023
  • Social robots, which are mainly used by individuals, place greater emphasis on human-robot relationships (HRR) than other types of robots do. Emotional expression in robots is one of the key factors that imbue HRR with value, and emotions are mainly expressed through the face. However, because of cultural and preference differences, the desired robot facial expressions differ subtly from user to user. We expected that a robot facial expression customization tool might mitigate such difficulties and consequently improve HRR. To test this, we created a robot facial expression customization tool and a prototype robot, and implemented an emotion engine suitable for generating robot facial expressions in a dynamic human-robot interaction setting. In our experiments, users agreed that the availability of a customized version of the robot has a more positive effect on HRR than a predefined version. We also suggest recommendations for future improvements of the robot facial expression customization process.
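
A hypothetical sketch of what such a customization tool might expose: a predefined expression per emotion plus user overrides. The parameter names and values are assumptions for illustration; the paper does not specify them.

```python
from dataclasses import dataclass, replace

# Hypothetical facial parameters a customization tool could expose.
@dataclass
class FaceParams:
    eye_openness: float  # 0.0 (closed) .. 1.0 (wide open)
    brow_angle: float    # degrees; negative tilts into a frown
    mouth_curve: float   # -1.0 (downturned) .. 1.0 (full smile)

# Predefined expressions, akin to the paper's baseline robot version.
DEFAULTS = {
    "happiness": FaceParams(eye_openness=0.8, brow_angle=10.0, mouth_curve=0.9),
    "sadness":   FaceParams(eye_openness=0.4, brow_angle=-15.0, mouth_curve=-0.7),
}

def customize(emotion: str, **overrides) -> FaceParams:
    # Start from the predefined expression and apply the user's edits.
    return replace(DEFAULTS[emotion], **overrides)

print(customize("happiness", mouth_curve=0.5))  # a user's softer smile
```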

A Study on the Creative Expression of Fashion Illustration -Focusing on Creative Idea Technique- (패션일러스트레이션의 창의적 표현방법 연구 -창의적 발상기법을 중심으로-)

  • 김하림;유영선
    • Journal of the Korean Society of Clothing and Textiles / v.26 no.8 / pp.1153-1164 / 2002
  • The purpose of this study is to identify creative visual expression in fashion illustration through previous general theories on creativity. Creative visual expression in fashion illustration works can be summarized into eight categories: 'maximization & minimization', used especially in body expression to emphasize the illustrator's emotions and contemporary trends; 'inversion', seen in forms such as contra-perspective, upside-down images, rearrangement of parts, and diverted process; 'unusual uses', adding the effect of caricature, parody, and humor to fashion illustration works; 'extraordinary connection', seen in various combinations of animals, plants, objects, and humans; 'concealment & elimination', used frequently in creative visual expression and including a deformed human body, an abstracted human body, extreme value contrast, simple colors, and dress in silhouette; 'association', seen in various methods such as comparing similar shapes or describing a certain situation or detail from which the whole can be inferred; 'illusion', expressed in surreal, mysterious, and fairy-tale depictions; and 'substitution', imitating the composition and colors of masterpieces or copying their parts.

Valence of Social Emotions' Sense and Expression in SNS (SNS내 사회감성의 어휘적 의미와 표현에 대한 유의성)

  • Hyun, Hye-Jung;Whang, Min-Cheol
    • Journal of the Korea Society of Computer and Information / v.19 no.6 / pp.37-48 / 2014
  • As various social network services come into common use, social emotion is being highlighted as an important factor in the quality of communication in human life. To understand such social emotion, this study verifies and analyzes the significance of the lexical meaning and expression of emotion, as a basis for understanding the complex meaning of social emotion. The emotional expressions represented in SNS text messages, one of the major channels of communication, are examined to create scales of meaning and expression and to understand their differences in depth. The analysis showed that negative assessment factors outnumbered positive ones among social emotional factors, while positive ones were markedly more common in social emotional expression. Social emotional factors were classified by basic emotional elements and valences, whereas emotional expression carried complex meaning, with positive elements dominant in general.
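
A minimal sketch of lexicon-based valence scoring of SNS messages, in the spirit of the scales this abstract describes. The word list and weights are illustrative placeholders, not the study's lexical data.

```python
# Illustrative valence lexicon; the study's actual scales are not reproduced.
VALENCE = {"love": 0.9, "thanks": 0.7, "happy": 0.8,
           "annoyed": -0.6, "angry": -0.8, "disappointed": -0.7}

def message_valence(text: str) -> float:
    # Average the valence of known words; unknown words are ignored.
    hits = [VALENCE[w] for w in text.lower().split() if w in VALENCE]
    return sum(hits) / len(hits) if hits else 0.0

print(message_valence("so happy and thanks everyone"))  # 0.75 (positive)
```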

Exploration of Anger Expression Patterns of Female Nursing Students using Q Methodology (Q방법론을 활용한 여자 간호대학생의 분노표현 양상 탐색)

  • LEE, Eun-Ju;PARK, Euna
    • Journal of Fisheries and Marine Sciences Education / v.27 no.3 / pp.682-695 / 2015
  • The purpose of this study was to identify types of subjective perception of anger expression among female nursing students and to investigate the characteristics of each type. To achieve this, the study used Q methodology. Thirty participants completed the Q-sort activity, rating each of 32 Q statements relative to the others. The data were analyzed using the QUANL PC program. Three types of anger expression were extracted among female nursing students: 'embracive soothing', 'reasonable expression', and 'ambivalence over emotional expression'. The commonalities of the three types were talking about their anger with others, seeking help in religion, and pushing their opponent. Therefore, adequate strategies based on anger expression types need to be developed to resolve anger among female nursing students.
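
Q methodology factor-analyzes correlations between persons rather than items. The sketch below illustrates that idea under the study's 30-participant, 32-statement design; the random rankings and the PCA-based extraction are stand-ins, not the QUANL program's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in Q-sorts: 30 participants ranking 32 statements on a -4..+4 grid.
qsorts = rng.integers(-4, 5, size=(30, 32)).astype(float)

# Q methodology correlates persons with each other, not statements.
r = np.corrcoef(qsorts)  # 30 x 30 person-correlation matrix

# Leading factors of this matrix correspond to shared viewpoints ("types").
eigvals, eigvecs = np.linalg.eigh(r)
order = np.argsort(eigvals)[::-1][:3]  # keep three factors, as in the study
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])

# Participants loading strongly on the same factor share a perception type.
print(np.round(loadings[:5], 2))
```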

Dynamic Emotion Classification through Facial Recognition (얼굴 인식을 통한 동적 감정 분류)

  • Han, Wuri;Lee, Yong-Hwan;Park, Jeho;Kim, Youngseop
    • Journal of the Semiconductor & Display Technology / v.12 no.3 / pp.53-57 / 2013
  • Human emotions are expressed in various ways: through language, facial expressions, and gestures. The facial expression, in particular, carries much information about human emotion. These vague human emotions appear not as a single emotion but as a combination of various emotions. This paper proposes an emotional expression algorithm using the Active Appearance Model (AAM) and a Fuzzy k-Nearest Neighbor classifier, which estimates facial expressions in a way that matches such vague human emotions. Applying the Mahalanobis distance to class centers, the method determines the degree of inclusion between the center class and each class, and the intensity of an emotion follows from this inclusion level. Our emotion recognition system can thus recognize complex emotions using the Fuzzy k-NN classifier.
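
A minimal sketch of Fuzzy k-NN with a Mahalanobis distance, the classifier combination this abstract names; the features, memberships, and two-class setup are illustrative stand-ins rather than the paper's AAM pipeline.

```python
import numpy as np

def mahalanobis(x, y, inv_cov):
    d = x - y
    return float(np.sqrt(d @ inv_cov @ d))

def fuzzy_knn(x, train_x, train_u, inv_cov, k=3, m=2.0):
    """Return graded membership of x in each emotion class."""
    dists = np.array([mahalanobis(x, t, inv_cov) for t in train_x])
    idx = np.argsort(dists)[:k]
    # Keller-style inverse-distance weights; epsilon avoids division by zero.
    w = 1.0 / np.maximum(dists[idx], 1e-12) ** (2.0 / (m - 1.0))
    return (train_u[idx] * w[:, None]).sum(axis=0) / w.sum()

rng = np.random.default_rng(1)
feats = rng.normal(size=(20, 4))          # stand-in AAM appearance features
u = np.eye(2)[rng.integers(0, 2, 20)]     # crisp memberships: happy vs. sad
inv_cov = np.linalg.inv(np.cov(feats.T))  # Mahalanobis metric from the data
print(fuzzy_knn(rng.normal(size=4), feats, u, inv_cov))  # graded memberships
```

The graded output, rather than a single label, is what lets the classifier express intensity and mixtures of emotions.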

Korean Facial Expression Emotion Recognition based on Image Meta Information (이미지 메타 정보 기반 한국인 표정 감정 인식)

  • Hyeong Ju Moon;Myung Jin Lim;Eun Hee Kim;Ju Hyun Shin
    • Smart Media Journal / v.13 no.3 / pp.9-17 / 2024
  • Due to the recent pandemic and the development of ICT, the use of non-face-to-face and unmanned systems is expanding, and understanding emotions in non-face-to-face communication has become very important. Since emotion recognition methods covering various facial expressions are required, artificial intelligence-based research is being conducted to improve facial expression emotion recognition on image data. However, existing research requires high computing power and long training times because it relies on large amounts of data to improve accuracy. To address these limitations, this paper proposes a method of recognizing facial expressions from even a small amount of data by using age and gender, which are available as image meta information. For facial expression emotion recognition, a face is detected in the original image data using the Yolo Face model, age and gender are classified through a VGG model based on the image meta information, and seven emotions are then recognized using an EfficientNet model. Comparing the meta-information-based data classification model with a model trained on all the data showed that the proposed classification learning model achieves higher accuracy.
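
A minimal sketch of the detection-routing-recognition pipeline this abstract outlines, using torchvision stand-ins. The face detector is stubbed (the paper uses a Yolo Face model), the head sizes are assumptions, and no trained weights are loaded.

```python
import torch
from torchvision import models, transforms
from PIL import Image

def detect_face(img: Image.Image) -> Image.Image:
    # Placeholder for the Yolo Face detector: crop to its bounding box here.
    return img

prep = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])

meta_net = models.vgg16(weights=None)              # age/gender classifier
meta_net.classifier[6] = torch.nn.Linear(4096, 8)  # e.g. 4 age bands x 2 genders (assumed)

emo_net = models.efficientnet_b0(weights=None)     # emotion classifier
emo_net.classifier[1] = torch.nn.Linear(1280, 7)   # seven emotion classes

@torch.no_grad()
def recognize(path: str):
    x = prep(detect_face(Image.open(path).convert("RGB"))).unsqueeze(0)
    group = meta_net(x).argmax(1).item()   # meta info routes to a data group
    emotion = emo_net(x).argmax(1).item()  # group-specific model in the paper
    return group, emotion
```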

Facial Expression Training Digital Therapeutics for Autistic Children (자폐아를 위한 표정 훈련 디지털 치료제)

  • Jiyeon Park;Kyoung Won Lee;Seong Yong Ohm
    • The Journal of the Convergence on Culture Technology / v.9 no.1 / pp.581-586 / 2023
  • Recently, a drama featuring a lawyer with autism spectrum disorder has attracted much attention, raising interest in the difficulties faced by people with autism spectrum disorders. If the autism spectrum is detected early and proper education and treatment are provided, the prognosis can be improved, so the development of treatments is urgently needed. Drugs currently used to treat the autism spectrum often have side effects, so digital therapeutics, which have no side effects and can be supplied at scale, are drawing attention. In this paper, we introduce 'AEmotion', an application and digital therapeutic that provides emotion and facial expression learning for toddlers with autism spectrum disorder. The system is developed as a smartphone application to increase autistic children's interest in training and to make testing easy. Using machine learning, it consists of three main stages: an 'emotion learning' step for learning emotions with facial expression cards, an 'emotion identification' step for checking whether the user has understood emotions and facial expressions properly, and an 'expression training' step for making the appropriate facial expressions. We expect this system to help autistic toddlers who struggle with social interaction because of difficulty recognizing facial expressions and emotions.
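
A minimal sketch of the three-stage flow named in this abstract; the stage names follow the text, while the pass/advance logic is an assumption for illustration.

```python
from enum import Enum

# The three stages of 'AEmotion' as described in the abstract.
class Stage(Enum):
    EMOTION_LEARNING = 1        # study emotions with facial expression cards
    EMOTION_IDENTIFICATION = 2  # check that emotions were understood
    EXPRESSION_TRAINING = 3     # make the expression, checked by ML

def next_stage(stage: Stage, passed: bool) -> Stage:
    # Advance only when the child passes the current stage's check (assumed).
    if not passed or stage is Stage.EXPRESSION_TRAINING:
        return stage
    return Stage(stage.value + 1)

print(next_stage(Stage.EMOTION_LEARNING, passed=True))  # EMOTION_IDENTIFICATION
```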

Smart Affect Jewelry based on Multi-modal (멀티 모달 기반의 스마트 감성 주얼리)

  • Kang, Yun-Jeong
    • Journal of the Korea Institute of Information and Communication Engineering / v.20 no.7 / pp.1317-1324 / 2016
  • This work presents smart jewelry, built on the Arduino platform, that expresses emotions through colors. For emotional color expression, Plutchik's Wheel of Emotions model is applied to map the similarity between emotions and colors. The jewelry receives values recognized by its temperature, light, sound, pulse, and gyro sensors, which are easily accessible from a smartphone, and processes them with ontology-based inference rules to recognize the wearer's emotion. The color corresponding to the emotion recognized in the given context is then applied to the smart LED jewelry. The emotion and color combination extracted from the sensors' contextual information is reflected in the built-in smart LEDs according to the wearer's emotions. Smart jewelry that adds light to emotion can thus represent the emotions of a situation and serve as a tool of expression.
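
A minimal sketch of the emotion-to-LED-color mapping idea, loosely following Plutchik's Wheel of Emotions as the abstract does; the RGB values and the fallback are illustrative assumptions, not the paper's mapping.

```python
# Illustrative Plutchik-style emotion colors for the jewelry's LEDs.
PLUTCHIK_COLORS = {
    "joy":     (255, 220, 0),   # yellow
    "trust":   (120, 200, 80),  # light green
    "fear":    (0, 100, 0),     # dark green
    "sadness": (0, 0, 255),     # blue
    "anger":   (255, 0, 0),     # red
}

def led_color(emotion: str) -> tuple:
    # Default to white when inference yields an unmapped state (assumed).
    return PLUTCHIK_COLORS.get(emotion, (255, 255, 255))

print(led_color("joy"))  # -> (255, 220, 0)
```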

Impact Analysis of nonverbal multimodals for recognition of emotion expressed virtual humans (가상 인간의 감정 표현 인식을 위한 비언어적 다중모달 영향 분석)

  • Kim, Jin Ok
    • Journal of Internet Computing and Services / v.13 no.5 / pp.9-19 / 2012
  • A virtual human used as an HCI element in digital contents expresses various emotions across modalities such as facial expression and body posture. However, few studies have considered combinations of such nonverbal multimodal cues in emotion perception. To implement an emotional virtual human, computational engine models must consider how a combination of nonverbal modalities such as facial expression and body posture will be perceived by users. This paper analyzes the impact of nonverbal multimodal cues in the design of an emotion-expressing virtual human. First, the relative impacts of different modalities are analyzed by exploring emotion recognition for the virtual human. An experiment then evaluates the contribution of congruent facial and postural expressions to the recognition of basic emotion categories, as well as of the valence and activation dimensions. The impact of incongruent multimodal expressions on the recognition of superposed emotions, which are known to be frequent in everyday life, is also measured. Experimental results show that congruence between the virtual human's facial and postural expressions facilitates the perception of emotion categories, that categorical recognition is influenced mainly by the facial expression modality, and that the postural modality is preferred for judging the level of the activation dimension. These results will be used in the implementation of an animation engine system and behavior synchronization for an emotion-expressing virtual human.
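
A minimal sketch of fusing facial and postural cues into one emotion judgement. The facial weighting echoes the paper's finding that the facial modality dominates categorical recognition, but the weights and distributions are assumptions.

```python
import numpy as np

EMOTIONS = ["happiness", "sadness", "surprise", "fear", "anger"]

def fuse(face_probs, posture_probs, w_face=0.7):
    # Weighted combination; congruent cues reinforce one category, while
    # incongruent cues flatten the result, mirroring superposed emotions.
    face, posture = np.asarray(face_probs), np.asarray(posture_probs)
    return w_face * face + (1.0 - w_face) * posture

face = [0.70, 0.10, 0.10, 0.05, 0.05]     # facial cue: happiness
posture = [0.60, 0.10, 0.10, 0.10, 0.10]  # congruent posture
print(EMOTIONS[int(np.argmax(fuse(face, posture)))])  # -> "happiness"
```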