• Title/Summary/Keyword: expression of emotions

Search Results: 329

Development of Emotional Messenger for IPTV (IPTV를 위한 감성 메신저의 개발)

  • Sung, Min-Young;Paek, Seon-Uck;Ahn, Seong-Hye;Lee, Jun-Ha
    • The Journal of the Korea Contents Association / v.10 no.12 / pp.51-58 / 2010
  • In instant messenger environments, recognizing human emotions and representing them automatically with personalized 3D character animations facilitates the use of affect in machine-based communication, which contributes to enhanced communication. This paper describes an emotional messenger system developed for the automated recognition and expression of emotions on IPTVs (Internet Protocol televisions). Aiming for efficient delivery of users' emotions, we propose emotion estimation that assesses the affective content of given textual messages, character animation that supports both 3D rendering and video playback, and a smartphone-based input method. A demonstration and experiments validate the usefulness and performance of the proposed system.
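The emotion-estimation step described above can be pictured with a toy lexicon scorer. The word list, weights, and function below are illustrative assumptions, not the paper's actual method:

```python
# Illustrative emotion lexicon: word -> (emotion label, weight).
# These entries and weights are invented for this sketch.
EMOTION_LEXICON = {
    "happy": ("joy", 1.0), "great": ("joy", 0.8),
    "sad": ("sadness", 1.0), "miss": ("sadness", 0.5),
    "angry": ("anger", 1.0), "hate": ("anger", 0.9),
}

def estimate_emotion(message):
    """Return the dominant emotion label for a text message."""
    scores = {}
    for word in message.lower().split():
        word = word.strip(".,!?")          # strip trailing punctuation
        if word in EMOTION_LEXICON:
            emotion, weight = EMOTION_LEXICON[word]
            scores[emotion] = scores.get(emotion, 0.0) + weight
    # Highest accumulated score wins; no match means neutral.
    return max(scores, key=scores.get) if scores else "neutral"
```

A real system would feed the resulting label to the character-animation and rendering components.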

Emotional Expression System Based on Dynamic Emotion Space (동적 감성 공간에 기반한 감성 표현 시스템)

  • Sim Kwee-Bo;Byun Kwang-Sub;Park Chang-Hyun
    • Journal of the Korean Institute of Intelligent Systems / v.15 no.1 / pp.18-23 / 2005
  • Human emotion is difficult to define and classify. Such vague emotions appear not as single emotions but as combinations of various emotions, from which one salient emotion is expressed. This paper proposes an emotional expression algorithm using a dynamic emotion space, which produces facial expressions similar to vague human emotions. While existing avatars express several predefined emotions drawn from a database, our emotion expression system can produce an unlimited variety of facial expressions by expressing emotions based on a dynamically changing emotion space. To verify that our system can practically produce complex and varied human expressions, we implement it, run experiments, and confirm the efficacy of the emotional expression system based on the dynamic emotion space.
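The idea of blending co-occurring emotions into one point of a continuously changing emotion space can be sketched as a weighted average. The 2D coordinates below are invented for illustration and are not the paper's model:

```python
# Illustrative coordinates of some emotions in a 2D (valence, arousal)
# space; these values are assumptions, not the paper's data.
EMOTION_COORDS = {
    "joy": (0.8, 0.5), "surprise": (0.1, 0.9),
    "sadness": (-0.7, -0.4), "anger": (-0.6, 0.7),
}

def blend(emotions):
    """Blend weighted emotions into one dynamic point of the space.

    `emotions` maps emotion name -> weight; the result is the weighted
    average of their coordinates, so mixed states fall between the
    predefined emotion points instead of snapping to one of them.
    """
    total = sum(emotions.values())
    x = sum(w * EMOTION_COORDS[e][0] for e, w in emotions.items()) / total
    y = sum(w * EMOTION_COORDS[e][1] for e, w in emotions.items()) / total
    return (x, y)
```

An avatar driven by such a point can interpolate facial expressions continuously rather than picking one of several predefined database poses.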

Comparative Analysis for Emotion Expression Using Three Methods Based by CNN (CNN기초로 세 가지 방법을 이용한 감정 표정 비교분석)

  • Yang, Chang Hee;Park, Kyu Sub;Kim, Young Seop;Lee, Yong Hwan
    • Journal of the Semiconductor & Display Technology / v.19 no.4 / pp.65-70 / 2020
  • CNN techniques relevant to emotion detection include the basic CNN architecture, batch normalization, and dropout. We present the methods and data of three experiments in this paper, with the training database and the test database set up differently. The first experiment extracts emotions using batch normalization, which compensates for shifts in the input distribution. The second experiment extracts emotions using dropout, which also allows rapid computation. The third experiment uses a CNN with convolution and max-pooling only. All three results show a low detection rate; to address this problem, we will develop a deep learning algorithm using feature extraction methods specialized for the image processing field.
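The two training tricks compared above can be sketched as plain-Python forward passes. This is a minimal illustration of batch normalization and (inverted) dropout, not the paper's CNN:

```python
import math
import random

def batch_norm(batch, eps=1e-5):
    """Normalize each feature to zero mean and unit variance over the batch."""
    n = len(batch)
    dims = len(batch[0])
    means = [sum(row[d] for row in batch) / n for d in range(dims)]
    vars_ = [sum((row[d] - means[d]) ** 2 for row in batch) / n
             for d in range(dims)]
    return [[(row[d] - means[d]) / math.sqrt(vars_[d] + eps)
             for d in range(dims)]
            for row in batch]

def dropout(activations, rate=0.5, rng=None):
    """Zero each activation with probability `rate` at training time;
    scale survivors by 1/(1-rate) so the expected value is unchanged."""
    rng = rng or random.Random(0)
    return [a / (1.0 - rate) if rng.random() >= rate else 0.0
            for a in activations]
```

In a real CNN these would sit between the convolution/max-pooling layers and the classifier head; at test time dropout is disabled.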

Engine of computational Emotion model for emotional interaction with human (인간과 감정적 상호작용을 위한 '감정 엔진')

  • Lee, Yeon Gon
    • Science of Emotion and Sensibility / v.15 no.4 / pp.503-516 / 2012
  • In research on robots and software agents to date, computational emotion models have been system-dependent, so it is hard to separate an emotion model from its existing system and reuse it in a new one. I therefore introduce the Engine of computational Emotion model (hereafter, EE), designed to integrate with any robot or agent. The EE is an engine, i.e., software whose form is independent of inputs and outputs: it handles only the generation and processing of emotions, without the input (perception) and output (expression) phases. The EE can be interfaced with any inputs and outputs, and produces emotions not only from emotional stimuli themselves but also from the person's personality and current emotions. In addition, the EE can reside in any robot or agent as a kind of software library, or be used as a separate system that communicates with them. In the EE, the emotions are the primary emotions: joy, surprise, disgust, fear, sadness, and anger. Each emotion is a vector consisting of a string and a coefficient; the EE receives these vectors from the input interface and sends them to the output interface. In the EE, each emotion is connected to a list of emotional experiences, and these lists, consisting of a string and a coefficient for each experience, are used to generate and process emotional states. The emotional experiences consist of emotion vocabulary covering the variety of human emotional experience. The EE can be used to build interaction products that respond appropriately to human emotions. The significance of the study lies in developing a system that induces people to feel that a product sympathizes with them. The EE can therefore help deliver emotionally sympathetic services in products in the HRI and HCI areas.
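The vector representation described above (a string label plus a coefficient for each of the six primary emotions) suggests a minimal engine skeleton. The decay rule and method names below are assumptions for illustration, not the EE's actual design:

```python
# Sketch of an input/output-independent emotion engine: it only stores
# and updates emotion coefficients; perception and expression live in
# whatever system it is embedded in.
PRIMARY = ("joy", "surprise", "disgust", "fear", "sadness", "anger")

class EmotionEngine:
    def __init__(self, decay=0.9):
        self.state = {e: 0.0 for e in PRIMARY}
        self.decay = decay          # assumed per-step fading factor

    def receive(self, label, coefficient):
        """Accept an emotion vector (label, coefficient) from any input
        interface; coefficients are clamped to 1.0."""
        if label in self.state:
            self.state[label] = min(1.0, self.state[label] + coefficient)

    def step(self):
        """Let all emotions fade over time."""
        for e in self.state:
            self.state[e] *= self.decay

    def dominant(self):
        """Vector handed to the output (expression) interface."""
        label = max(self.state, key=self.state.get)
        return (label, self.state[label])
```

Because the class touches no sensors or renderers, it could be linked into a robot or agent as a library, matching the portability goal of the abstract.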

Emotional Experience and Expression of Deaf Adolescents (농인 청소년의 감정 경험 및 표현 특성)

  • Park, Ji-Eun;Kim, Eun-Ye;Jang, Un-Jung;Cheong, E-Nae;Eum, Young-Ji;Sohn, Jin-Hun
    • Science of Emotion and Sensibility / v.19 no.3 / pp.51-58 / 2016
  • This study examined differences between deaf and hearing adolescents in the emotions they experience and the intensity with which they express them. Three video clips were used to induce pleasure, anger, and sadness. While the participants watched the clips, their facial expressions were recorded. Experienced emotions were measured by self-report, and a third person rated the expressed emotions from the recorded facial images. The two groups (deaf and hearing) were compared on whether they experienced the same emotions and whether the third-person ratings corresponded with the self-rated scores. There was no significant difference in experienced emotion or its intensity. However, hearing adolescents showed more intense expressions of pleasure than they reported, while deaf adolescents showed less intense expressions of happiness than they reported. Thus, hearing people might not be able to detect and fully comprehend how deaf people feel in everyday circumstances. This further indicates that deaf adolescents may not receive enough support from hearing people when they express their feelings, and consequently risk misunderstandings, conflicts, or even a break in relationships.

Component Analysis for Constructing an Emotion Ontology (감정 온톨로지의 구축을 위한 구성요소 분석)

  • Yoon, Ae-Sun;Kwon, Hyuk-Chul
    • Korean Journal of Cognitive Science / v.21 no.1 / pp.157-175 / 2010
  • Understanding a dialogue participant's emotion is as important as decoding the explicit message in human communication. It is well known that non-verbal elements are more suitable than verbal elements for conveying a speaker's emotions. Written texts, however, contain a variety of linguistic units that express emotions. This study aims at analyzing the components for constructing an emotion ontology, which has numerous applications in Human Language Technology. A majority of the previous work in text-based emotion processing focused on the classification of emotions, the construction of dictionaries describing emotion, and the retrieval of those lexica in texts through keyword spotting and/or syntactic parsing techniques. The emotions retrieved or computed by that process did not show good results in terms of accuracy. Thus, a more sophisticated component analysis is proposed and linguistic factors are introduced in this study. (1) Five linguistic types of emotion expression are differentiated in terms of target (verbal/non-verbal) and method (expressive/descriptive/iconic). The correlations among them, as well as their correlation with the non-verbal expressive type, are also determined; this characteristic is expected to give our ontology greater adaptability in multi-modal environments. (2) As emotion-related components, this study proposes 24 emotion types, a 5-scale intensity (-2 to +2), and a 3-scale polarity (positive/negative/neutral), which can describe a variety of emotions in more detail and in a standardized way. (3) We introduce verbal expression-related components, such as 'experiencer', 'description target', 'description method' and 'linguistic features', which can classify and tag verbal expressions of emotions appropriately. (4) Adopting the linguistic tag sets proposed by ISO and TEI, and providing a mapping table between our classification of emotions and Plutchik's, our ontology can easily be employed for multilingual processing.
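The component inventory in (2) and (3) can be sketched as a validated annotation record. The field names below are illustrative assumptions, not the ontology's actual schema:

```python
from dataclasses import dataclass

# Allowed polarity values from the 3-scale polarity component.
POLARITY = ("positive", "negative", "neutral")

@dataclass
class EmotionAnnotation:
    """One annotated emotion expression (hypothetical record layout)."""
    emotion_type: str        # one of the 24 proposed emotion types
    intensity: int           # 5-scale intensity: -2 .. +2
    polarity: str            # positive / negative / neutral
    experiencer: str         # who feels the emotion
    description_method: str  # expressive / descriptive / iconic

    def __post_init__(self):
        # Enforce the standardized scales described in the abstract.
        if not -2 <= self.intensity <= 2:
            raise ValueError("intensity must be in -2..+2")
        if self.polarity not in POLARITY:
            raise ValueError("polarity must be positive/negative/neutral")
```

Validating the scales at construction time keeps every annotation in the corpus consistent with the standardized component definitions.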

Mobile Emoticon Use for Positive Behavior and Communication: Focusing on Male and Female College Students (커뮤니케이션을 위한 모바일 이모티콘의 긍정적 사용행태연구: 남녀대학생을 중심으로)

  • Ju, Youngae;Kim, Seonju;Kim, Woojoung
    • Journal of Families and Better Life / v.34 no.5 / pp.35-52 / 2016
  • The purpose of this study was to compare college students' use of emoticons as positive behavior and to analyze the factors influencing it. The survey consisted of 66 questions covering sociodemographic characteristics, personality traits, and two scales of emotional expression, and was administered to 340 college students. The results displayed significant differences in emoticon use depending on the gender and year of the students: women used more emoticons than men, and freshmen used more emoticons than students in any other year. Women used emoticons such as texts, images, and flash animations more than men did. For men, emotional expression was strongly related to the personality trait of integrity. The personality traits of sensitivity, openness, intimacy, and sincerity had a significant influence on emotional expression; an emoticon can therefore be a useful tool for expressing emotions. Male and female students with higher levels of congeniality expressed more emotions that were self-defensive ambivalent and relation-concerned ambivalent. However, female students with higher levels of self-defensive ambivalent emotional expressiveness used fewer positive emoticons, while male students with higher levels of self-defensive emotional expressiveness used more positive emoticons. Among groups that use mobile emoticons with their parents, whether frequently or not, higher levels of congeniality were associated with higher levels of self-defensive ambivalence and relation-concerned ambivalence. Those with higher levels of sincerity had low levels of self-defensive ambivalence and relation-concerned ambivalence, and those with higher levels of relation-concerned ambivalence used more positive emoticons.

Sex differences of children's facial expression discrimination based on two-dimensional model of emotion (정서의 이차원모델에서 아동의 얼굴표정 변별에서 성 차이)

  • Shin, Young-Suk
    • Korean Journal of Cognitive Science / v.21 no.1 / pp.127-143 / 2010
  • This study explores sex differences in children's discrimination of emotion from facial expressions, based on a two-dimensional model of emotion. The study group consisted of 92 children aged 40, 52, and 64 months, evenly split between male (50%) and female (50%) children. The 92 children were asked to choose the facial expressions related to twelve emotion terms. The facial expressions used in the experiment were photographs whose degree of expression on each of the two dimensions (pleasure-displeasure and arousal-sleep) had been rated on a nine-point scale by 54 university students. The findings showed that sex differences were more distinct on the arousal-sleep dimension than on the pleasure-displeasure dimension. On the arousal-sleep dimension, the emotions 'sleepiness', 'anger', 'comfort', and 'loneliness' showed large sex differences, exceeding a value of 1. In particular, while male children showed higher arousal than female children for emotions like sleepiness, anger, and loneliness, female children showed higher arousal than male children for the emotion of comfort.
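The two-dimensional stimulus ratings described above suggest a simple nearest-term matcher in the pleasure-arousal plane. The coordinates below are invented for illustration and are not the study's data:

```python
import math

# Hypothetical (pleasure, arousal) coordinates on 9-point scales for a
# few of the emotion terms; real values would come from rater data.
TERM_COORDS = {
    "sleepiness": (5.0, 1.5),
    "anger": (2.0, 7.5),
    "comfort": (7.0, 3.0),
    "loneliness": (3.0, 4.0),
}

def nearest_term(pleasure, arousal):
    """Return the emotion term whose rated position is closest to the
    given facial-expression coordinates (Euclidean distance)."""
    return min(TERM_COORDS,
               key=lambda t: math.dist((pleasure, arousal), TERM_COORDS[t]))
```

Comparing boys' and girls' choices against such nearest-term predictions is one way to quantify discrimination differences along each dimension.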

A Study on the Creation of Interactive Text Collage using Viewer Narratives (관람자 내러티브를 활용한 인터랙티브 텍스트 콜라주 창작 연구)

  • Lim, Sooyeon
    • The Journal of the Convergence on Culture Technology / v.8 no.4 / pp.297-302 / 2022
  • Contemporary viewers, familiar with digital space, show a desire for self-expression and use voice, text, and gestures as tools of expression. The purpose of this study is to create interactive art that expresses a narrative uttered by the viewer as a collage built from the viewer's own image, and that reproduces and expands the story through the viewer's movement. The proposed interactive art visualizes audio and video information acquired from the viewer as a text collage, and uses gesture information and a natural user interface so that viewers can interact easily and conveniently in real time and express personalized emotions. The three pieces of information obtained from the viewer are connected to each other to express the viewer's current, momentary emotions. The rigid narrative of the text gains a degree of freedom through the viewer's portrait images and gestures, and at the same time produces and expands a story structure close to reality. The artwork space created in this way is an experience space where the viewer's narrative is reflected, updated, and created in real time; it is a reflection of oneself. It also induces active appreciation through the viewer's active intervention and action.

The Influence of Children's Emotional Expression and Sociability, and Their Mothers' Communication Pattern on Their Prosocial Behavior (아동의 정서 표현성과 사교성, 어머니의 의사소통 유형이 아동의 친사회적 행동에 미치는 영향)

  • Song, Ha-Na;Choi, Kyoung-Sook
    • Journal of the Korean Home Economics Association
    • /
    • v.47 no.6
    • /
    • pp.1-10
    • /
    • 2009
  • This study investigated the influence of children's emotional expression and sociability, and their mothers' communication patterns, on the children's prosocial behavior. The participants were 65 preschool children aged 5 to 6 and their mothers. Each child-mother dyad was observed for 30 minutes in a lab setting designed to evaluate the child's socioemotional competence and the mother's socialization behavior. The videotaped data were analyzed by two coders for sharing behavior; the expression of happiness, sadness, anger, and anxiety; the children's sociability; and the mothers' communication strategies. Results showed that children's expression of anger and anxiety were the most significant predictors of their prosocial behavior. Mothers' punitive communication pattern negatively affected children's prosocial behavior; however, its explanatory power was not significant when compared with the children's emotional expression. The influence of negative emotions and their adverse role in interpersonal interactions are discussed.