• Title/Summary/Keyword: emotion technology

Search Results: 802

Hotspot Analysis of Korean Twitter Sentiments (한국어 트위터 감정의 핫스팟 분석)

  • Lim, Joasang;Kim, Jinman
    • Journal of Korea Multimedia Society / v.18 no.2 / pp.233-243 / 2015
  • A hotspot is a spatial pattern in which properties or events of a space are densely concentrated in a particular area. While location information is easily captured thanks to the increasing use of mobile devices, emotion is not, unless it is elicited directly through a survey. Tweets offer a good way of analyzing such spatial sentiment, but relevant research is hard to find. We therefore analyzed emotional hotspots on Twitter using spatial autocorrelation. 10,142 tweets and their associated GPS data were extracted, and the sentiment of each tweet was classified as good or bad with a support vector machine. We used Moran's I and Getis-Ord $G_i^*$ for global and local spatial autocorrelation, respectively. Several statistically significant hotspots were found and drawn on a map of the Seoul metropolitan area. The results closely matched an earlier official survey of the happiness index.
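
The global statistic named in the abstract, Moran's I, can be sketched in a few lines. Below is a minimal NumPy illustration using inverse-distance weights; the weight scheme, function name, and toy data are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def morans_i(values, coords):
    """Global Moran's I with inverse-distance weights (w_ii = 0).

    values: 1-D array of sentiment scores (e.g. +1 good / -1 bad)
    coords: (n, 2) array of point locations
    """
    x = np.asarray(values, dtype=float)
    n = len(x)
    z = x - x.mean()                      # deviations from the mean
    # pairwise inverse-distance weights, zero on the diagonal
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    with np.errstate(divide="ignore"):
        w = np.where(d > 0, 1.0 / d, 0.0)
    num = (w * np.outer(z, z)).sum()      # weighted cross-products
    return (n / w.sum()) * num / (z ** 2).sum()
```

Positive values indicate spatial clustering of like sentiment (the hotspot case); values near zero indicate spatial randomness.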

Stylized Image Generation based on Music-image Synesthesia Emotional Style Transfer using CNN Network

  • Xing, Baixi;Dou, Jian;Huang, Qing;Si, Huahao
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.4 / pp.1464-1485 / 2021
  • The emotional style of a multimedia artwork is abstract content information. This study explores an emotional style transfer method and seeks a way of matching music with images of the appropriate emotional style. Deep Convolutional Neural Networks (DCNNs) can capture style and provide an iterative style-transfer solution for affective image generation. We learn image emotion features via DCNNs and map the affective style onto other images, setting the image emotion feature as the style target. Experiments covered affective image generation for eight emotion categories: dignified, dreaming, sad, vigorous, soothing, exciting, joyous, and graceful. A user study tested the synesthetic emotional style transfer results against ground-truth user perception triggered by music-image pair stimuli, and the transferred affective images proved effective for music-image emotional synesthesia.
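
The style objective in DCNN style transfer of this kind is commonly the Gatys-style comparison of Gram matrices of feature maps. A minimal NumPy sketch of that objective (the arrays stand in for DCNN activations; nothing here reproduces the paper's actual network or emotion features):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) feature map,
    capturing channel co-activation statistics (the 'style')."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def style_loss(feat_a, feat_b):
    """Squared Frobenius distance between Gram matrices; minimized
    iteratively over the generated image in style transfer."""
    ga, gb = gram_matrix(feat_a), gram_matrix(feat_b)
    return float(((ga - gb) ** 2).sum())
```

In a full pipeline this loss would be evaluated on several convolutional layers and back-propagated to the pixels of the generated image.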

The Study of Bio Emotion Cognition follow Stress Index Number by Multiplex SVM Algorithm (다중 SVM 알고리즘을 이용한 스트레스 지수에 따른 생체 감성 인식에 관한 연구)

  • Kim, Tae-Yeun;Seo, Dae-Woong;Bae, Sang-Hyun
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.5 no.1 / pp.45-51 / 2012
  • This paper presents a system that recognizes a user's emotions from biometric information (pulse, blood pressure, blood sugar, etc.) obtained through wireless sensors, in accordance with previously collected information about the user's stress index and a classification of colors and music. The system collects the inputs, saves them in a database, and classifies emotions according to the stress quotient using a multiple SVM (Support Vector Machine) algorithm. An experiment with 2,000 data sets showed approximately 87.7% accuracy.
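
A multiple-SVM classifier can be pictured as several one-vs-rest binary SVMs whose decision values are compared. The sketch below uses a Pegasos-style subgradient trainer on toy data; it illustrates the multi-class SVM idea only, and is not the authors' system or sensor pipeline:

```python
import numpy as np

def train_linear_svm(X, y, epochs=200, lam=0.01):
    """Binary linear SVM (labels +1/-1) via Pegasos-style subgradient descent."""
    rng = np.random.default_rng(0)
    n, d = X.shape
    w, b, t = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * (X[i] @ w + b)
            w *= 1.0 - eta * lam           # shrink from the L2 regularizer
            if margin < 1:                 # hinge-loss subgradient step
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

def one_vs_rest(X, y, n_classes):
    """Train one SVM per class (class k vs the rest)."""
    return [train_linear_svm(X, np.where(y == k, 1.0, -1.0))
            for k in range(n_classes)]

def predict(X, models):
    """Assign each sample to the class with the largest decision value."""
    scores = np.stack([X @ w + b for w, b in models], axis=1)
    return scores.argmax(axis=1)
```

In practice a library implementation (e.g. a kernel SVM with one-vs-rest wrapping) would replace this hand-rolled trainer.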

A Agent System Supporting Personalization based on Customer-Emotion (고객 감성 기반의 개인화를 지원하는 에이전트 시스템)

  • 고일석;김의재;신승수;나윤지
    • The Journal of the Korea Contents Association / v.2 no.1 / pp.113-119 / 2002
  • An e-business system must provide a convenient, easy interface and efficient functions, and deliver information and contents that satisfy customers. To this end, many studies are actively under way on e-business systems that use intelligent agent technology and on efficient e-business strategies. An e-business system also stores a great deal of information and content to support business activities and the relationships between a company and its customers. To reflect individual customers' preferences and adapt to changing business situations, individual user characteristics and emotion should be considered. This paper proposes an e-business system that uses intelligent agent technology based on user characteristics and emotion. The system provides a convenient user interface and improves customer loyalty. The results show that the proposed system handles customer preferences more efficiently than existing systems.


Component Analysis for Constructing an Emotion Ontology (감정 온톨로지의 구축을 위한 구성요소 분석)

  • Yoon, Aesun;Kwon, Hyuk-Chul
    • Annual Conference on Human and Language Technology / 2009.10a / pp.19-24 / 2009
  • In communication, understanding the interlocutors' emotions is as important as the content of the message itself. Although more emotional information is conveyed through non-verbal elements, text also contains diverse and rich linguistic markers of the speaker's emotions. The goal of this study is to design an emotion ontology applicable to human language technology. Prior work in text-based emotion processing classified emotions, compiled descriptive word lists for each emotion, and searched for them in text, so the accuracy of the extracted emotions was not high. By contrast, the emotion ontology proposed here has the following advantages. First, it classifies the categories of emotion expression by target of description (verbal vs. non-verbal) and by mode (expressive, descriptive, iconic), and establishes correspondences among the six heterogeneous categories, so that it can be applied in multimodal environments. Second, 24 emotion specifications were selected so that fine-grained emotions can be classified while remaining mutually distinguishable, with intensity and polarity as attributes enabling still finer classification. Third, attributes concerning the experiencer, the target and mode of description, and linguistic features were introduced so that emotion expressions appearing in text can be explicitly distinguished. The proposed emotion ontology was designed with extensibility in mind, so that it is not limited to Korean processing but can be used for multilingual processing.


Emotional Expression of the Virtual Influencer "Luo Tianyi(洛天依)" in Digital

  • Guangtao Song;Albert Young Choi
    • International Journal of Advanced Culture Technology / v.12 no.2 / pp.375-385 / 2024
  • In the context of contemporary digital media, virtual influencers have become an increasingly important form of socialization and entertainment, in which emotional expression is a key factor in attracting viewers. In this study, we take Luo Tianyi, a Chinese virtual influencer, as an example to explore how emotions are expressed and perceived through facial expressions in different types of videos. Using Paul Ekman's Facial Action Coding System (FACS) and six basic emotion classifications, the study systematically analyzes Luo Tianyi's emotional expressions in three types of videos, namely Music show, Festivals and Brand Cooperation. During the study, Luo Tianyi's facial expressions and emotional expressions were analyzed through rigorous coding and categorization, as well as matching the context of the video content. The results show that Enjoyment is the most frequently expressed emotion by Luo Tianyi, reflecting the centrality of positive emotions in content creation. Meanwhile, the presence of other emotion types reveals the virtual influencer's efforts to create emotionally rich and authentic experiences. The frequency and variety of emotions expressed in different video genres indicate Luo Tianyi's diverse strategies for communicating and connecting with viewers in different contexts. The study provides an empirical basis for understanding and utilizing virtual influencers' emotional expressions, and offers valuable insights for digital media content creators to design emotional expression strategies. Overall, this study is valuable for understanding the complexity of virtual influencer emotional expression and its importance in digital media strategy.

Enhancing Multimodal Emotion Recognition in Speech and Text with Integrated CNN, LSTM, and BERT Models (통합 CNN, LSTM, 및 BERT 모델 기반의 음성 및 텍스트 다중 모달 감정 인식 연구)

  • Edward Dwijayanto Cahyadi;Hans Nathaniel Hadi Soesilo;Mi-Hwa Song
    • The Journal of the Convergence on Culture Technology / v.10 no.1 / pp.617-623 / 2024
  • Identifying emotions through speech poses a significant challenge because of the complex relationship between language and emotion. Our paper takes on this challenge by employing feature engineering in a multimodal classification task involving both speech and text data. We evaluated two classifiers, Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM), each integrated with a BERT-based pre-trained model. Our assessment covers various performance metrics (accuracy, F-score, precision, and recall) across different experimental setups. The findings highlight the proficiency of both models in accurately discerning emotions from text and speech data.
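
Multimodal setups of this kind commonly concatenate a pooled speech embedding with the BERT text embedding before a classification head. A minimal NumPy sketch of that late-fusion step; the dimensions, names, and random projection are illustrative assumptions, not the paper's architecture:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def fuse_and_classify(audio_emb, text_emb, W, b):
    """Concatenate per-utterance audio and text embeddings,
    then apply a linear emotion head with softmax."""
    fused = np.concatenate([audio_emb, text_emb], axis=-1)
    return softmax(fused @ W + b)
```

In a trained system the audio embedding would come from the CNN or LSTM branch, the text embedding from BERT, and `W`, `b` would be learned jointly with fine-tuning.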

Intelligent Color Control for Display Panel (지능형 디스플레이 색상 조절)

  • Jo Jang-Gun;Kim Jong-Won;Seo Jae-Yong;Cho Hyun-Chan;Cho Tae-Hoon
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2006.05a / pp.237-240 / 2006
  • Sight accounts for the largest share of human perception among the senses. Given this importance of visual information, shaping the visual environment appropriately can benefit a person's emotion and body. In modern society people use many display units, whose basic colors are red, green, and blue; with these three colors, a monitor's color tone and brightness can be changed. A color level suited to an individual's environment can reduce stress or impart a comfortable feeling. We therefore standardize factors derived from human emotion and environment using fuzzy logic, and propose a method for applying the resulting Intelligent Color Control (ICC) to a display.
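
The fuzzy standardization step can be pictured as membership functions over a sensed environmental variable, defuzzified into an RGB gain. A toy sketch with made-up membership breakpoints and gains (the abstract does not specify the actual factors or rules of the ICC method):

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adjust_brightness(ambient, base_rgb):
    """Map ambient light (0..1) to an RGB gain via weighted-average
    (centroid-style) defuzzification over three fuzzy sets."""
    mu = {"dark":   tri(ambient, -0.5, 0.0, 0.5),
          "normal": tri(ambient,  0.0, 0.5, 1.0),
          "bright": tri(ambient,  0.5, 1.0, 1.5)}
    # rule outputs: dim the screen in dark rooms, boost it in bright ones
    gain = {"dark": 0.6, "normal": 1.0, "bright": 1.3}
    g = sum(mu[k] * gain[k] for k in mu) / sum(mu.values())
    return tuple(min(255, int(round(ch * g))) for ch in base_rgb)
```

A full system would add more input factors (e.g. user stress or preference) and a larger rule base, but the fuzzify/apply-rules/defuzzify structure is the same.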
