• Title/Summary/Keyword: Emotion Communication


Effect of Verbal and Non-verbal Salesperson Communication in Service Encounters on Customer Emotions and Service Quality Perceptions -Focus on National Brands- (캐주얼의류매장 판매원의 커뮤니케이션이 감정유형과 서비스품질지각에 미치는 영향 -내셔널브랜드를 중심으로-)

  • Lee, Ok-Hee
    • Journal of the Korean Society of Clothing and Textiles
    • /
    • v.37 no.1
    • /
    • pp.51-63
    • /
    • 2013
  • This study investigates the effect of verbal and nonverbal communication on customer emotions and service quality perceptions. The subjects were customers of a fashion shop in Sunchon, South Korea, and questionnaires were collected by convenience sampling from July to August 2010. Data from 335 customers of a national brand were analyzed through reliability analysis, factor analysis, and multiple regression analysis. The results are as follows. First, the verbal communication of service providers has a significant impact on customer emotion. Second, postures/proxemics and physical appearance/paralanguage (two of the three factors of nonverbal communication) have significant positive influences on customers' positive emotions, and kinesics has a significant negative effect on customers' negative emotions. Third, the verbal communication of service providers has a considerable impact on customer service quality perceptions. Fourth, regarding the relationship between nonverbal communication and service quality, all three factors of nonverbal communication (postures/proxemics, physical appearance/paralanguage, and kinesics) have significant positive influences on customer service quality perceptions. Fifth, customer emotions have a significant impact on customer service quality perceptions.

The Relationship between Collegiate Athletes' Communication and Problem-Solving Capacity: The Mediating Effect of Cognitive Emotion Regulation Strategy (대학 운동선수들의 의사소통과 문제해결능력의 관계: 인지적 정서조절전략의 매개효과)

  • Choi, Youngjun
    • 한국체육학회지인문사회과학편
    • /
    • v.58 no.3
    • /
    • pp.67-78
    • /
    • 2019
  • The purpose of this study was to examine the mediating effects of adaptive and maladaptive cognitive emotion regulation strategies in the relationship between communication competence and problem-solving capacity. The subjects were 189 male collegiate athletes. The results were as follows: communication competence had a positive influence on problem-solving capacity, and adaptive emotion regulation strategies had a partial mediating effect on the relationship between the two. The maladaptive emotion regulation strategies, however, did not have a statistically significant relationship with either communication competence or problem-solving capacity. These results suggest that communication competence and customized adaptive emotion regulation strategies are necessary to improve the problem-solving capacity of collegiate athletes.

Design and Implementation of Dynamic Emotion System for Affective Robots (감성로봇을 위한 동적 감성시스템의 설계와 구현)

  • Lee, Yong-Woo;Kim, Jong-Bok;Kim, Sung-Hoon;Suh, Il-Hong;Park, Myung-Kwan
    • Proceedings of the IEEK Conference
    • /
    • 2006.06a
    • /
    • pp.927-928
    • /
    • 2006
  • In this paper, we propose a dynamic emotion system consisting of a state equation and an output equation from control theory. In our emotion system, the state equation accepts external stimuli and generates emotions, while the output equation modifies the intensity of those emotions in accordance with personality and circumstances. The validity of the proposed emotion system is shown by two simulations that express emotions according to personality and circumstances.
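The state-equation/output-equation structure described in this abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the linear decay dynamics, and all coefficients below are invented assumptions for illustration only.

```python
# Hedged sketch of a control-theoretic emotion system (illustrative, not
# the paper's actual model): a state equation integrates external stimuli
# into emotion intensities, and an output equation rescales them by
# personality and circumstance.

def step_emotion(state, stimulus, decay=0.9, gain=0.5):
    """State equation: next emotion intensities from the current state
    plus an external stimulus (element-wise, one value per emotion)."""
    return [decay * s + gain * u for s, u in zip(state, stimulus)]

def express_emotion(state, personality, context=1.0):
    """Output equation: modulate internal intensities by per-emotion
    personality weights and a circumstance factor."""
    return [context * p * s for s, p in zip(state, personality)]

# Two emotions, e.g. [joy, anger]; all numbers are made up.
state = [0.0, 0.0]
state = step_emotion(state, stimulus=[1.0, 0.2])     # external stimulus arrives
out = express_emotion(state, personality=[1.2, 0.5]) # an excitable, calm-tempered agent
```

The split mirrors the abstract's design choice: the same internal emotional state can produce different expressed intensities for different personalities or circumstances, simply by changing the output-equation parameters.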


An Emotion Classification Based on Fuzzy Inference and Color Psychology

  • Son, Chang-Sik;Chung, Hwan-Mook
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.4 no.1
    • /
    • pp.18-22
    • /
    • 2004
  • It is difficult to understand a person's emotion, since it is subjective and vague. Therefore, we propose a method that effectively classifies human emotions into two types (single emotion and composite emotion). To verify the validity of the proposed method, we conducted two experiments, one based on general fuzzy inference and the other on the $\alpha$-cut, and compared the results. In the first experiment, emotions were classified by fuzzy inference; in the second, emotions were classified by the $\alpha$-cut. Our experimental results showed that the classification of emotion based on the $\alpha$-cut was more definite than that based on fuzzy inference.
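The two classification rules compared in this abstract can be sketched side by side. This is a hedged illustration, not the paper's fuzzy system: the emotion names, membership values, and threshold are all assumed for the example.

```python
# Illustrative sketch of fuzzy-membership classification vs. an alpha-cut.
# Membership degrees would normally come from a fuzzy inference system;
# here they are hard-coded for demonstration.

def fuzzy_classify(memberships):
    """Full fuzzy inference result: pick the emotion with the highest
    membership degree, however small the margin."""
    return max(memberships, key=memberships.get)

def alpha_cut(memberships, alpha=0.6):
    """Alpha-cut: keep only emotions with membership >= alpha. One
    survivor suggests a single emotion; several suggest a composite one."""
    return {e: m for e, m in memberships.items() if m >= alpha}

m = {"joy": 0.8, "sadness": 0.3, "anger": 0.65}
fuzzy_classify(m)   # -> 'joy'
alpha_cut(m, 0.6)   # -> {'joy': 0.8, 'anger': 0.65}, i.e. a composite emotion
```

The alpha-cut gives a crisper ("more definite") decision in the sense the abstract describes: weak memberships are discarded outright instead of lingering in the result.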

Emotion Recognition based on Multiple Modalities

  • Kim, Dong-Ju;Lee, Hyeon-Gu;Hong, Kwang-Seok
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.12 no.4
    • /
    • pp.228-236
    • /
    • 2011
  • Emotion recognition plays an important role in human-computer interaction research, as it allows more natural, human-like communication between humans and computers. Most previous work on emotion recognition focused on extracting emotions from face, speech, or EEG information separately. Therefore, this paper presents a novel approach that combines face, speech, and EEG to recognize human emotion. The individual matching scores obtained from face, speech, and EEG are combined using a weighted summation, and the fused score is used to classify the emotion. In the experiments, the proposed approach gives an improvement of more than 18.64% over the most successful unimodal approach, and also performs better than approaches integrating only two modalities. These results confirm that the proposed approach achieves a significant performance improvement and that the proposed method is very effective.
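The weighted-summation fusion this abstract describes is straightforward to sketch. The scores, weights, and emotion classes below are invented for illustration; the paper's actual matchers and weight values are not given here.

```python
# Hedged sketch of score-level fusion across modalities: each modality
# produces a matching score per emotion class, scores are combined by a
# weighted sum, and the arg-max class is the fused decision.

def fuse_scores(modality_scores, weights):
    """modality_scores: {modality: {emotion: score}}; weights: {modality: w}.
    Returns (winning emotion, fused score per emotion)."""
    emotions = next(iter(modality_scores.values())).keys()
    fused = {
        e: sum(weights[m] * modality_scores[m][e] for m in modality_scores)
        for e in emotions
    }
    return max(fused, key=fused.get), fused

# Made-up per-modality matching scores for two emotion classes.
scores = {
    "face":   {"joy": 0.7, "anger": 0.2},
    "speech": {"joy": 0.4, "anger": 0.5},
    "eeg":    {"joy": 0.6, "anger": 0.3},
}
label, fused = fuse_scores(scores, {"face": 0.5, "speech": 0.3, "eeg": 0.2})
# label == "joy": the face modality's confidence outweighs speech's dissent.
```

Score-level fusion like this is attractive precisely because the modalities stay independent: each matcher can fail or disagree, and the weights decide how much each opinion counts.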

Design of Intelligent Emotion Recognition Model

  • Kim, Yi-gon
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.11 no.7
    • /
    • pp.611-614
    • /
    • 2001
  • Voice is one of the most efficient communication media, and it carries several kinds of information about the speaker, the context, emotion, and so on. Human emotion is expressed in speech, gestures, and physiological phenomena (breathing, pulse, etc.). In this paper, an emotion recognition model using neuro-fuzzy inference to recognize emotion from the voice signal is presented and simulated.


A Study on the Emotion State Classification using Multi-channel EEG (다중채널 뇌파를 이용한 감정상태 분류에 관한 연구)

  • Kang, Dong-Kee;Kim, Heung-Hwan;Kim, Dong-Jun;Lee, Byung-Chae;Ko, Han-Woo
    • Proceedings of the KIEE Conference
    • /
    • 2001.07d
    • /
    • pp.2815-2817
    • /
    • 2001
  • This study describes emotion classification using two different feature extraction methods for four-channel EEG signals. One method is linear prediction analysis based on an AR model; the other uses cross-correlation coefficients in the ${\theta}$, ${\alpha}$, and ${\beta}$ frequency bands. Using the linear predictor coefficients and the cross-correlation coefficients, a classification test for four emotions (anger, sadness, joy, and relaxation) is performed with a neural network. Comparing the results of the two methods, the linear predictor coefficients produce better results than the cross-correlation coefficients for emotion classification.
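The AR-model feature extraction mentioned in this abstract is classically done by computing the signal's autocorrelation and running the Levinson-Durbin recursion to obtain linear predictor coefficients. The sketch below is a generic pure-Python version of that standard recursion, not the paper's code; the model order and toy signal are assumptions.

```python
# Hedged sketch: linear prediction (AR) coefficients as features for an
# EEG segment, via autocorrelation + the Levinson-Durbin recursion.

def autocorr(x, max_lag):
    """Biased autocorrelation r[0..max_lag] of a 1-D signal."""
    n = len(x)
    return [sum(x[i] * x[i + k] for i in range(n - k)) for k in range(max_lag + 1)]

def levinson_durbin(r, order):
    """Solve the AR normal equations; return predictor coefficients
    a[1..order] such that x[n] is approximated by sum_j a[j] * x[n - j]."""
    a = [0.0] * (order + 1)
    e = r[0]                      # prediction error energy
    for i in range(1, order + 1):
        # Reflection coefficient for this order.
        k = (r[i] - sum(a[j] * r[i - j] for j in range(1, i))) / e
        new_a = a[:]
        new_a[i] = k
        for j in range(1, i):
            new_a[j] = a[j] - k * a[i - j]
        a, e = new_a, e * (1 - k * k)
    return a[1:]

# Toy, strongly autocorrelated "EEG" segment; 4 LPC values per channel
# would serve as the feature vector fed to the neural network.
x = [1.0, 0.9, 0.7, 0.6, 0.4, 0.3, 0.1, 0.0]
coeffs = levinson_durbin(autocorr(x, 4), order=4)
```

A low fixed order keeps the feature vector compact and comparable across channels, which is presumably why AR coefficients work well as classifier inputs here.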


A Multimodal Emotion Recognition Using the Facial Image and Speech Signal

  • Go, Hyoun-Joo;Kim, Yong-Tae;Chun, Myung-Geun
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.5 no.1
    • /
    • pp.1-6
    • /
    • 2005
  • In this paper, we propose an emotion recognition method using facial images and speech signals. Six basic emotions are investigated: happiness, sadness, anger, surprise, fear, and dislike. Facial expression recognition is performed using multi-resolution analysis based on the discrete wavelet transform, and feature vectors are obtained through ICA (Independent Component Analysis). For emotion recognition from the speech signal, the recognition algorithm is performed independently for each wavelet subband, and the final result is obtained from a multi-decision making scheme. After merging the facial and speech emotion recognition results, we obtained better performance than previous methods.

Multiclass Music Classification Approach Based on Genre and Emotion

  • Jonghwa Kim
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.16 no.3
    • /
    • pp.27-32
    • /
    • 2024
  • Reliable and fine-grained musical metadata are required for efficient search of rapidly growing collections of music files. In particular, since the primary motives for listening to music are its emotional effect, diversion, and the memories it awakens, emotion classification along with genre classification of music is crucial. In this paper, as an initial approach towards a "ground-truth" dataset for music emotion and genre classification, we carefully generated a music corpus through labeling by a large number of ordinary people. To verify the suitability of the dataset through classification results, we extracted features according to the MPEG-7 audio standard and applied different machine learning models, based on statistics and deep neural networks, to automatically classify the dataset. Using standard hyperparameter settings, we reached an accuracy of 93% for genre classification and 80% for emotion classification, and believe that our dataset can serve as a meaningful comparative dataset in this research field.

Emotion Recognition Method using Gestures and EEG Signals (제스처와 EEG 신호를 이용한 감정인식 방법)

  • Kim, Ho-Duck;Jung, Tae-Min;Yang, Hyun-Chang;Sim, Kwee-Bo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.13 no.9
    • /
    • pp.832-837
    • /
    • 2007
  • Electroencephalography (EEG) has been used to record the activity of the human brain in psychology for many years. As technology develops, the neural basis of the functional areas of emotion processing is gradually being revealed, so we use EEG to measure the fundamental areas of the human brain that control emotion. Hand gestures such as shaking and head gestures such as nodding are often used as body language for communication, and their recognition is important because they are a useful communication medium between humans and computers; gesture recognition research typically relies on computer vision. Most existing studies on emotion recognition use either EEG signals or gestures alone. In this paper, we use EEG signals and gestures together for human emotion recognition, selecting driver emotion as a specific target. The experimental results show that using both EEG signals and gestures achieves higher recognition rates than using EEG signals or gestures alone. For both EEG signals and gestures, Interactive Feature Selection (IFS), a method based on reinforcement learning, is used for feature selection.