• Title/Summary/Keyword: Emotional valence

Acoustic parameters for induced emotion categorizing and dimensional approach (자연스러운 정서 반응의 범주 및 차원 분류에 적합한 음성 파라미터)

  • Park, Ji-Eun;Park, Jeong-Sik;Sohn, Jin-Hun
    • Science of Emotion and Sensibility
    • /
    • v.16 no.1
    • /
    • pp.117-124
    • /
    • 2013
  • This study examined how precisely MFCC, LPC, energy, and pitch-related parameters of speech data, which have mainly been used in voice recognition systems, could predict vocal emotion categories as well as the dimensions of vocal emotion. 110 college students participated in the experiment. To obtain more realistic emotional responses, well-defined emotion-inducing stimuli were used. Because the dimensional approach is more useful for realistic emotion classification, the study analyzed the relationship between the MFCC, LPC, energy, and pitch parameters of the speech data and four emotional dimensions (valence, arousal, intensity, and potency), and identified the best vocal cue parameters for predicting each dimension by stepwise multiple regression analysis. Emotion categorization accuracy analyzed by LDA was 62.7%, and all four dimensional regression models were statistically significant (p < .001). These results suggest that the parameters could also be applied to spontaneous vocal emotion recognition.

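The abstract above relies on standard speech features (MFCC, energy, pitch) followed by LDA classification. Below is a minimal sketch of that pipeline using librosa and scikit-learn on synthetic audio; the synthetic utterances, labels, and parameter choices are illustrative assumptions, not the study's setup.

```python
import numpy as np
import librosa
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def extract_features(y, sr):
    """Mean/std MFCC, RMS energy, and pitch statistics for one utterance."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)        # (13, n_frames)
    rms = librosa.feature.rms(y=y)                            # frame-level energy
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)             # frame-level pitch (Hz)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1),
                           [rms.mean(), rms.std(), np.nanmean(f0), np.nanstd(f0)]])

# Synthetic stand-ins for induced-emotion utterances (real data would be speech recordings).
sr = 16000
rng = np.random.default_rng(0)
utterances, labels = [], []
for label, base_f0 in [("joy", 220.0), ("fear", 160.0)]:
    for _ in range(5):
        t = np.arange(0, 1.0, 1 / sr)
        tone = np.sin(2 * np.pi * (base_f0 + rng.normal(0, 5)) * t)
        utterances.append(tone + 0.01 * rng.standard_normal(t.size))
        labels.append(label)

X = np.vstack([extract_features(y, sr) for y in utterances])
clf = LinearDiscriminantAnalysis().fit(X, labels)             # LDA emotion categorization
print(clf.score(X, labels))
```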

The Emotional Effect of Emoticon on Interpreting Text Message (메시지 해석에 이모티콘이 미치는 정서적 효과 - 휴대전화 문자 메시지 상황을 중심으로)

  • Ahn, Won-Mi;Kim, Jong-Wan;Han, Kwang-Hee
    • Journal of the HCI Society of Korea
    • /
    • v.5 no.1
    • /
    • pp.11-18
    • /
    • 2010
  • Cell phone users frequently send and receive text messages with emoticons in their everyday lives, but emoticons and their emotional effect on messages in the mobile context have received little attention. The purpose of this study is to investigate the emotional effect of emoticons on the interpretation of text messages, considering the valence of both texts and emoticons. Study 1 showed the effect of valenced emoticons on neutral messages, and also revealed that the presence of an emoticon, regardless of its valence, decreases the perceived rigidness and increases the familiarity of a message. In Study 2, which used positive and negative text messages, we found that emoticons still affect the assessment of the sender's mood state and emotion. The interaction between the valences of text and emoticon showed that participants in the incongruent-pair condition perceived more sarcasm.


Audio and Video Bimodal Emotion Recognition in Social Networks Based on Improved AlexNet Network and Attention Mechanism

  • Liu, Min;Tang, Jun
    • Journal of Information Processing Systems
    • /
    • v.17 no.4
    • /
    • pp.754-771
    • /
    • 2021
  • In the task of continuous-dimension emotion recognition, the parts that highlight emotional expression are not the same in each modality, and the influence of different modalities on the emotional state also differs. Therefore, this paper studies the fusion of the two most important modalities in emotion recognition (voice and facial expression) and proposes a dual-modal emotion recognition method that combines an improved AlexNet network with an attention mechanism. After simple preprocessing of the audio and video signals, the first step uses prior knowledge to extract audio features. Then, facial expression features are extracted by the improved AlexNet network. Finally, a multimodal attention mechanism fuses the facial expression and audio features, and an improved loss function is used to handle the missing-modality problem, improving the robustness of the model and the performance of emotion recognition. Experimental results show that the concordance correlation coefficients of the proposed model in the arousal and valence dimensions were 0.729 and 0.718, respectively, which are superior to several comparison algorithms.
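
The abstract above reports performance as the concordance correlation coefficient (CCC) on the arousal and valence dimensions. A minimal sketch of how CCC is typically computed for continuous emotion predictions follows; the array names and sample data are illustrative, not taken from the paper.

```python
import numpy as np

def concordance_correlation_coefficient(y_true, y_pred):
    """Lin's concordance correlation coefficient between two 1-D arrays."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_true, mean_pred = y_true.mean(), y_pred.mean()
    var_true, var_pred = y_true.var(), y_pred.var()           # population variance
    cov = np.mean((y_true - mean_true) * (y_pred - mean_pred))
    return 2.0 * cov / (var_true + var_pred + (mean_true - mean_pred) ** 2)

# Illustrative usage with synthetic continuous valence annotations and predictions.
rng = np.random.default_rng(0)
valence_true = rng.uniform(-1.0, 1.0, size=500)
valence_pred = valence_true * 0.8 + rng.normal(0.0, 0.2, size=500)
print(f"CCC (valence): {concordance_correlation_coefficient(valence_true, valence_pred):.3f}")
```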

Valence of Social Emotions' Sense and Expression in SNS (SNS내 사회감성의 어휘적 의미와 표현에 대한 유의성)

  • Hyun, Hye-Jung;Whang, Min-Cheol
    • Journal of the Korea Society of Computer and Information
    • /
    • v.19 no.6
    • /
    • pp.37-48
    • /
    • 2014
  • As a variety of social networks have come into common use, social emotion is being highlighted as an important factor in the quality of human communication. To understand such social emotion, this study verifies and analyzes the significance of the lexical meaning and expression of emotion, as a basis for understanding the complex meaning of social emotion. The emotional expressions represented in SNS text messages, one of the major channels of communication, are examined to create scales of meaning and expression and to understand their differences in depth. The analysis showed that negative assessment factors outnumbered positive ones among social emotional factors, whereas positive expressions were markedly more numerous among social emotional expressions. Social emotional factors were classified by basic emotional elements and valence, while emotional expressions carried complex meaning, with positive elements generally dominant.

2D Emotion Classification using Short-Time Fourier Transform of Pupil Size Variation Signals and Convolutional Neural Network (동공크기 변화신호의 STFT와 CNN을 이용한 2차원 감성분류)

  • Lee, Hee-Jae;Lee, David;Lee, Sang-Goog
    • Journal of Korea Multimedia Society
    • /
    • v.20 no.10
    • /
    • pp.1646-1654
    • /
    • 2017
  • Pupil size variation cannot be controlled intentionally by the user and includes various features such as blinking frequency and blink duration, so it is well suited to understanding the user's emotional state. In addition, ocular-feature-based emotion classification should be studied for virtual and augmented reality, which is expected to be applied in various fields. In this paper, we propose a novel emotion classification method based on a CNN fed with pupil size variation signals, which carry not only various ocular feature information but also temporal information. Compared with previous studies using the same database, the proposed method improved arousal and valence classification by 5.99% and 12.98%, respectively.
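
The title indicates that the pupil-size signal is converted with a short-time Fourier transform (STFT) into a time-frequency image before being fed to a CNN. A minimal sketch of that preprocessing step is given below, assuming a 1-D pupil-diameter signal sampled at 120 Hz; the sampling rate and window settings are illustrative, not those of the paper.

```python
import numpy as np
from scipy.signal import stft

fs = 120.0                      # assumed eye-tracker sampling rate in Hz
t = np.arange(0, 10, 1 / fs)    # 10 s of synthetic pupil-diameter data (mm)
pupil = 3.5 + 0.2 * np.sin(2 * np.pi * 0.3 * t) + 0.05 * np.random.randn(t.size)

# Short-time Fourier transform: 1-D signal -> 2-D time-frequency magnitude map.
freqs, times, Z = stft(pupil, fs=fs, nperseg=128, noverlap=96)
spectrogram = np.log1p(np.abs(Z))        # log-magnitude for a more CNN-friendly range

# Shape (freq_bins, time_frames); add batch/channel axes for a CNN input tensor.
cnn_input = spectrogram[np.newaxis, np.newaxis, :, :]
print(cnn_input.shape)
```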

A Comparative Study of Emotion Using the International Affective Picture System (국제정서사진체계를 사용하여 유발된 정서의 측정: 비교문화적 타당성 연구)

  • 이경화;김지은;이임갑;손진훈
    • Proceedings of the Korean Society for Emotion and Sensibility Conference
    • /
    • 1997.11a
    • /
    • pp.220-223
    • /
    • 1997
  • The International Affective Picture System (IAPS) developed by Lang and colleagues [1] is widely used in studies relating a variety of physiological indices to subjective emotions. In this study we investigated whether the IAPS can be used for Koreans without significant cultural biases in their subjective emotional reactions. Thirty IAPS picture slides were presented to a group of 52 college students, and a different set of 30 slides with similar three-dimensional emotion ratings was presented to another group of 42 students. For each slide, shown for 8 seconds, subjects were asked to give ratings on the Semantic Differential Scale (SDS) and the Self-Assessment Manikin (SAM) in the three dimensions of pleasure (valence), arousal, and dominance. Factor analysis was performed on the SDS ratings, and correlations between SDS and SAM ratings were calculated. Eighteen bipolar adjective pairs were grouped into the three dimensions of pleasure, arousal, and dominance, showing good agreement with the previous study. SAM ratings were highly correlated with two of the six SDS adjective pairs, those associated with the pleasure and dominance dimensions, but not with those associated with the arousal dimension, suggesting some cultural differences.

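The analysis in the abstract above rests on two standard steps: a factor analysis of the 18 bipolar SDS adjective pairs and correlations between SAM and SDS ratings. A minimal sketch of both steps follows, using scikit-learn and NumPy on synthetic rating data; the data shapes, scales, and variable names are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)
n_subjects, n_adjective_pairs = 52, 18

# Synthetic SDS ratings: 52 subjects x 18 bipolar adjective pairs (assumed 7-point scale).
sds = rng.integers(1, 8, size=(n_subjects, n_adjective_pairs)).astype(float)

# Step 1: three-factor solution, mirroring the pleasure/arousal/dominance grouping.
fa = FactorAnalysis(n_components=3, random_state=0)
fa.fit(sds)
loadings = fa.components_.T        # shape (18, 3): loading of each pair on each factor
print("Factor loadings (first 5 adjective pairs):\n", np.round(loadings[:5], 2))

# Step 2: correlation between one SAM dimension rating and each SDS adjective pair.
sam_pleasure = rng.integers(1, 10, size=n_subjects).astype(float)   # assumed 9-point SAM
corrs = [np.corrcoef(sam_pleasure, sds[:, j])[0, 1] for j in range(n_adjective_pairs)]
print("SAM(pleasure) vs SDS pair correlations:", np.round(corrs, 2))
```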

Psychophysiological Reactivity to Affective Visual Stimulation of Negative Emotional Valence: Comparative Analysis of Autonomic and Frontal EEG Responses to the IAPS and the KAPS

  • Sohn, Jin-Hun;Estate M. Sokhadze;Lee, Kyung-Hwa
    • Science of Emotion and Sensibility
    • /
    • v.3 no.2
    • /
    • pp.29-40
    • /
    • 2000
  • Autonomic and EEG responses were analyzed in 32 college students exposed to visual stimulation with the Korean Affective Picture System (KAPS) and 36 students exposed to the International Affective Picture System (IAPS). Cardiac, electrodermal, and electrocortical measures were recorded during 30 s of viewing affective pictures. Slides intended to elicit basic emotions (fear, anger, surprise, disgust, and sadness) were presented to subjects via a Kodak slide projector. The aim of the study was to differentiate the autonomic and EEG responses associated with the same negative-valence emotions elicited by KAPS and IAPS stimulation and to identify the influence of cultural relevance on physiological reactivity. Analysis of the results revealed significant differences in physiological responsiveness to emotionally negative slides from the KAPS and the IAPS. The typical response profile for all emotions elicited by the KAPS included HR acceleration (except for surprise), an increase in electrodermal activity, slow and fast alpha blocking, and a fast beta power increase in the EEG, which was not associated with significant asymmetry (except for fast alpha in sadness). Stimulation with the IAPS evoked HR deceleration, specific electrodermal responses with relatively high tonic electrodermal activation, alpha blocking and a fast beta increase, and was also accompanied by a theta power increase and marked frontal asymmetry (e.g., fast beta and theta asymmetries in sadness, fast alpha in fear). Physiological responses to fear- and anger-eliciting slides from the IAPS were significantly less pronounced and were accompanied by autonomic and EEG changes more typical of attention than of negative affect. The higher cardiovascular and electrodermal reactivity to fear observed with the KAPS, as compared with the IAPS, can be explained by cultural relevance and the higher effectiveness of the KAPS in producing certain emotions, such as fear, in Koreans.

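The EEG measures in the abstract above (alpha blocking, beta and theta power changes, frontal asymmetry) are conventionally derived from band power estimates. Below is a minimal sketch of computing band power with Welch's method and a frontal alpha asymmetry index from two frontal channels; the sampling rate, band limits, and channel names are common conventions assumed for illustration, not details from the paper.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band):
    """Approximate power of `signal` within a frequency band via Welch's PSD."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return np.trapz(psd[mask], freqs[mask])

fs = 256                                   # assumed EEG sampling rate (Hz)
rng = np.random.default_rng(1)
f3 = rng.standard_normal(30 * fs)          # 30 s of synthetic left-frontal EEG (F3)
f4 = rng.standard_normal(30 * fs)          # 30 s of synthetic right-frontal EEG (F4)

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
powers = {name: (band_power(f3, fs, b), band_power(f4, fs, b)) for name, b in bands.items()}

# Frontal alpha asymmetry: ln(right alpha power) - ln(left alpha power).
alpha_asym = np.log(powers["alpha"][1]) - np.log(powers["alpha"][0])
print({k: tuple(round(v, 4) for v in p) for k, p in powers.items()})
print("Frontal alpha asymmetry (F4 - F3):", round(alpha_asym, 4))
```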

Emotional effect of the Covid-19 pandemic on oral surgery procedures: a social media analysis

  • Altan, Ahmet
    • Journal of Dental Anesthesia and Pain Medicine
    • /
    • v.21 no.3
    • /
    • pp.237-244
    • /
    • 2021
  • Background: This study aimed to analyze Twitter users' emotional tendencies regarding oral surgery procedures before and after the worldwide coronavirus disease 2019 (COVID-19) pandemic. Methods: Tweets posted in English before and after the COVID-19 pandemic were included in the study. Popular tweets from 2019 were searched using the keywords "tooth removal", "tooth extraction", "dental pain", "wisdom tooth", "wisdom teeth", "oral surgery", "oral surgeon", and "OMFS". For 2020, another search was conducted by adding the words "COVID" and "corona" to the abovementioned keywords. The emotions underlying the tweets were analyzed using CrystalFeel - Multidimensional Emotion Analysis. In this analysis, we focused on four emotions: fear, anger, sadness, and joy. Results: A total of 1240 tweets posted before and after the COVID-19 pandemic were analyzed. There was a statistically significant difference between the distributions of emotions before and after the pandemic (p < 0.001). While the sense of joy decreased after the pandemic, anger and fear increased. There was also a statistically significant difference between the emotional valence distributions before and after the pandemic (p < 0.001). Negative emotional intensity was noted in 52.9% of the messages before the pandemic and in 74.3% of the messages after it, whereas positive emotional intensity was observed in 29.8% of the messages before the pandemic but in only 10.7% after it. Conclusion: Infectious diseases such as COVID-19 may lead to mental, emotional, and behavioral changes in people. Unpredictability, uncertainty, disease severity, misinformation, and social isolation may further increase dental anxiety and fear among people.
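
The before/after comparison of emotion distributions reported above is the kind of categorical comparison commonly tested with a chi-square test of independence. A minimal sketch with SciPy follows; the counts are made-up placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts of tweets per emotion; rows = before/after the pandemic.
#                  fear  anger  sadness  joy
counts = np.array([[120,  100,     90, 310],    # before (illustrative numbers)
                   [230,  180,    110, 100]])   # after  (illustrative numbers)

chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4g}")
```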

Effect of Emotional Incongruence in Negative Emotional Valence & Cross-modality (교차 양상과 부정 정서에서의 정서 불일치 효과에 따른 기억의 차이)

  • Kim, Soyeon;Han, Kwang-Hee
    • Science of Emotion and Sensibility
    • /
    • v.17 no.3
    • /
    • pp.107-116
    • /
    • 2014
  • This study suggests that when two emotions are presented through different modalities, such as auditory and visual, their incongruence influences subjects' arousal, recognition, and recall. The first hypothesis is that incongruent cross-modal presentation not only increases arousal more than congruent presentation, but also increases recall and recognition. The second hypothesis is that arousal modulates subjects' recall and recognition. To test the two hypotheses, the experimental conditions were manipulated to be congruent or incongruent by presenting positive or negative emotions visually and acoustically. As dependent variables, we measured recall and recognition rates, and arousal was measured with PAD (pleasure-arousal-dominance) scales. After eight days, recognition alone was measured again online. The behavioral experiment showed a significant difference in arousal before and after watching a movie clip (p < .001), but no difference between the congruent and incongruent conditions. There was also no significant difference in recognition performance between the congruent and incongruent conditions, but there was a main effect of the clips' emotion. Interestingly, when recognition rates were analyzed separately by the clips' emotion, there was a significant difference between the congruent and incongruent conditions only for the negative clip (p = .044), not for the positive clip; in that case, recognition was higher in the incongruent condition than in the congruent condition. Furthermore, for recall performance, there was a significant interaction between the emotion shown in the clips and the congruence condition (p = .039). These results demonstrate an incongruence effect with negative emotion, but an incongruence effect mediated by arousal could not be demonstrated. In conclusion, this study examined one way of conveying a story dramatically and its effect on memory, effects that are influenced by the subjects' perceived emotions (valence and arousal).
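
The key result above is an interaction between clip emotion (positive/negative) and congruence (congruent/incongruent) on recall. A minimal sketch of testing such a 2x2 interaction with statsmodels on synthetic data is shown below; the column names, factor levels, and cell means are illustrative assumptions, not the study's data or analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(7)
n_per_cell = 20
rows = []
for emotion in ("positive", "negative"):
    for congruence in ("congruent", "incongruent"):
        # Give the negative/incongruent cell a higher mean to mimic an interaction.
        mean = 0.6 + (0.15 if (emotion == "negative" and congruence == "incongruent") else 0.0)
        for score in rng.normal(mean, 0.1, size=n_per_cell):
            rows.append({"emotion": emotion, "congruence": congruence, "recall": score})

df = pd.DataFrame(rows)

# Two-way ANOVA including the emotion x congruence interaction term.
model = smf.ols("recall ~ C(emotion) * C(congruence)", data=df).fit()
print(anova_lm(model, typ=2))
```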

Improvement of a Context-aware Recommender System through User's Emotional State Prediction (사용자 감정 예측을 통한 상황인지 추천시스템의 개선)

  • Ahn, Hyunchul
    • Journal of Information Technology Applications and Management
    • /
    • v.21 no.4
    • /
    • pp.203-223
    • /
    • 2014
  • This study proposes a novel context-aware recommender system, which is designed to recommend items according to the customer's responses to the previously recommended item. Specifically, the proposed system predicts the user's emotional state from his or her responses (such as facial expressions and movements) to the previously recommended item, and then recommends items that are similar to the previous one when the emotional state is estimated as positive. If the customer's emotional state regarding the previously recommended item is regarded as negative, the system recommends items with characteristics opposite to the previous item. The proposed system consists of two sub-modules: (1) an emotion prediction module and (2) a responsive recommendation module. The emotion prediction module contains a model that predicts a customer's arousal level, a physiological and psychological state of being awake or reactive to stimuli, using the customer's reaction data, including facial expressions and body movements, which can be measured using Microsoft's Kinect sensor. The responsive recommendation module generates a recommendation list by using the results from the emotion prediction module. If a customer shows a high level of arousal for the previously recommended item, the module recommends the items most similar to it; otherwise, it recommends the items most dissimilar to it. To validate the performance and usefulness of the proposed recommender system, we conducted an empirical validation. In total, 30 undergraduate students participated in the experiment. We used 100 trailers of Korean movies released from 2009 to 2012 as the items for recommendation. For the experiment, we manually constructed a Korean movie trailer DB containing fields such as release date, genre, director, writer, and actors. To check whether recommendation based on customers' responses outperforms recommendation based on their demographic information, we compared the two. Recommendation performance was measured using two metrics: satisfaction and arousal levels. Experimental results showed that recommendation using customers' responses (i.e., our proposed system) outperformed recommendation using their demographic information with statistical significance.
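
The responsive recommendation rule described above (recommend the most similar items when predicted arousal is high, the most dissimilar items otherwise) can be sketched with a simple cosine-similarity ranking over item feature vectors. The threshold, feature encoding, and function names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def recommend_next(item_features, last_item_idx, arousal_level, k=5, threshold=0.5):
    """Return indices of the k items most similar to the last item if predicted
    arousal is high, otherwise the k most dissimilar items."""
    features = np.asarray(item_features, dtype=float)
    last = features[last_item_idx]
    norms = np.linalg.norm(features, axis=1) * np.linalg.norm(last)
    sims = features @ last / np.clip(norms, 1e-12, None)     # cosine similarity
    # Exclude the last item itself from either end of the ranking.
    sims[last_item_idx] = -np.inf if arousal_level >= threshold else np.inf
    order = np.argsort(sims)
    return order[::-1][:k] if arousal_level >= threshold else order[:k]

# Illustrative usage: 100 movie trailers with random metadata embeddings.
rng = np.random.default_rng(3)
trailer_vecs = rng.random((100, 16))
print(recommend_next(trailer_vecs, last_item_idx=10, arousal_level=0.8))  # similar items
print(recommend_next(trailer_vecs, last_item_idx=10, arousal_level=0.2))  # dissimilar items
```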