• Title/Summary/Keyword: emotion technology


Extraction of Speech Features for Emotion Recognition (감정 인식을 위한 음성 특징 도출)

  • Kwon, Chul-Hong;Song, Seung-Kyu;Kim, Jong-Yeol;Kim, Keun-Ho;Jang, Jun-Su
    • Phonetics and Speech Sciences
    • /
    • v.4 no.2
    • /
    • pp.73-78
    • /
    • 2012
  • Emotion recognition is an important technology in the field of human-machine interfaces. To apply speech technology to emotion recognition, this study aims to establish a relationship between emotional groups and their corresponding voice characteristics by investigating various speech features. Speech features related to both the speech source and the vocal tract filter are included. Experimental results show that the statistically significant speech parameters for classifying the emotional groups are mainly related to the speech source, such as jitter, shimmer, F0 (F0_min, F0_max, F0_mean, F0_std), harmonic parameters (H1, H2, HNR05, HNR15, HNR25, HNR35), and SPI.
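
A minimal sketch of how source-related parameters like these can be approximated with off-the-shelf tools: the example below assumes the librosa library and a hypothetical input file speech.wav, estimates F0 with the YIN tracker, and derives the F0 statistics plus a crude frame-based jitter figure. It illustrates the feature types named in the abstract, not the authors' extraction pipeline.

```python
# Sketch: approximate F0 statistics and jitter from a speech recording.
# Assumes librosa is installed; "speech.wav" is a hypothetical input path.
import numpy as np
import librosa

y, sr = librosa.load("speech.wav", sr=16000)

# YIN pitch track over a typical speech F0 range (75-500 Hz).
f0 = librosa.yin(y, fmin=75, fmax=500, sr=sr)
f0 = f0[np.isfinite(f0)]  # keep valid frames only

features = {
    "F0_min":  float(np.min(f0)),
    "F0_max":  float(np.max(f0)),
    "F0_mean": float(np.mean(f0)),
    "F0_std":  float(np.std(f0)),
}

# Crude jitter estimate: mean absolute difference between consecutive
# pitch periods, normalized by the mean period (frame-based approximation).
periods = 1.0 / f0
features["jitter_approx"] = float(np.mean(np.abs(np.diff(periods))) / np.mean(periods))

print(features)
```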

A Survey on Image Emotion Recognition

  • Zhao, Guangzhe;Yang, Hanting;Tu, Bing;Zhang, Lei
    • Journal of Information Processing Systems
    • /
    • v.17 no.6
    • /
    • pp.1138-1156
    • /
    • 2021
  • Emotional semantics are the highest level of semantics that can be extracted from an image. Constructing a system that can automatically recognize the emotional semantics of images would be significant for marketing, smart healthcare, and deep human-computer interaction. To understand the direction of image emotion recognition as well as its general research methods, we summarize the current development trends and shed light on potential future research. The primary contributions of this paper are as follows. We investigate the color, texture, shape, and contour features used for emotional semantics extraction. We establish two models that map images into emotional space and describe in detail the various processes in the image emotional semantic recognition framework. We also discuss important datasets and useful applications in the field, such as garment images and image retrieval. We conclude with a brief discussion of future research trends.
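
Low-level color descriptors of the kind surveyed here can be computed with a few lines of array code. The sketch below, assuming Pillow and NumPy and a hypothetical image photo.jpg, extracts per-channel mean and standard deviation in HSV space as a toy stand-in for the color features the paper discusses; texture, shape, and contour features would need further descriptors.

```python
# Sketch: simple color statistics in HSV space as low-level image features.
# Pillow and NumPy assumed; "photo.jpg" is a hypothetical input path.
import numpy as np
from PIL import Image

img = Image.open("photo.jpg").convert("HSV")
hsv = np.asarray(img, dtype=np.float32) / 255.0   # shape (H, W, 3)

# Per-channel mean and standard deviation: a 6-dimensional color descriptor.
means = hsv.reshape(-1, 3).mean(axis=0)
stds = hsv.reshape(-1, 3).std(axis=0)
color_features = np.concatenate([means, stds])

print(dict(zip(["H_mean", "S_mean", "V_mean", "H_std", "S_std", "V_std"],
               color_features.round(3))))
```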

Emotional Mechanism Impacting Adoption of Luxury Wearables in E-Tail

  • Lee, Eun-Jung
    • International Journal of Advanced Culture Technology
    • /
    • v.10 no.2
    • /
    • pp.281-291
    • /
    • 2022
  • Recently, the category of luxury wearables has expanded, yet the relevant research has been scarce. This study tests whether the emotional mechanism regarding luxury wearables within e-tail affects luxury brand perceptions. Furthermore, it tests the moderating effect of gender in this mechanism. A total of 393 responses from U.S. populations were collected through an international research company using online survey methods. In the results, the positive and direct effect of dominance on positive emotion was significant, and positive emotion significantly increased perceived brand luxury. However, no direct effect of dominance on perceived brand luxury was found. The moderating effect of gender in the relationship between positive emotion and perceived brand luxury was positive and significant, but the hypothesized moderating effect of gender was insignificant in the relationship between dominance and perceived brand luxury. Implications and study limitations are discussed.
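
Moderation effects of the kind hypothesized here are commonly tested with an interaction term in a regression model. The sketch below is a generic illustration using statsmodels on a tiny hypothetical DataFrame (the column names and values are invented); it is not the authors' model, measures, or data.

```python
# Sketch: testing a moderation effect with an interaction term.
# statsmodels and pandas assumed; df and its columns are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey-style data: brand_luxury, positive_emotion, gender (0/1).
df = pd.DataFrame({
    "brand_luxury":     [3.2, 4.1, 2.8, 4.5, 3.9, 2.5],
    "positive_emotion": [3.0, 4.2, 2.5, 4.8, 3.7, 2.2],
    "gender":           [0, 1, 0, 1, 1, 0],
})

# The positive_emotion:gender coefficient captures the moderation effect.
model = smf.ols("brand_luxury ~ positive_emotion * gender", data=df).fit()
print(model.summary())
```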

Proposal of Emotion Recognition Service in Mobile Health Application (모바일 헬스 애플리케이션의 감정인식 서비스 제안)

  • Ha, Mina;Lee, Yoo Jin;Park, Seung Ho
    • Design Convergence Study
    • /
    • v.15 no.1
    • /
    • pp.233-246
    • /
    • 2016
  • The mobile health industry has been combined with IT technology and is attracting attention. Health applications have been developed to support users in leading a healthy lifestyle. First, five mobile health applications were selected and reviewed in terms of their service trends. It turned out that none of those applications used emotional data, only physical data. Second, to extract users' emotions, existing technological research was sorted into categories, and the result implied that text-based emotion recognition technology is the most suitable for the mobile health service. To implement the service, the application was designed and the emotion recognition process was developed based on the contents of the research. A one-dimensional emotion model, which serves as the standard for classifying emotional data, and social network service content were set up as sources. Lastly, the suggested use of the health application was combined with persuasive technology. As a result, this paper proposed an overall service process, a concrete service scheme, and guidelines containing 15 services according to the five emotions and time. It is expected to serve as a direction for indicators that take the user's psychological context into account.
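
A one-dimensional (valence-style) emotion model of the kind described can be illustrated with a simple lexicon lookup that maps a text entry onto a single score and then onto coarse emotion bands. The sketch below is a toy example; the word lists and the five-band mapping are hypothetical and stand in for the paper's actual classification scheme.

```python
# Sketch: lexicon-based, one-dimensional (valence) text emotion scoring.
# The word lists and score bands are hypothetical illustrations only.
POSITIVE = {"happy", "great", "calm", "energetic", "good"}
NEGATIVE = {"tired", "sad", "stressed", "anxious", "bad"}

def valence_score(text: str) -> float:
    """Return a valence score in [-1, 1] from simple word counts."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def emotion_band(score: float) -> str:
    """Map the one-dimensional score onto five coarse emotion bands."""
    bands = [(-0.7, "very negative"), (-0.3, "negative"),
             (0.3, "neutral"), (0.7, "positive"), (1.01, "very positive")]
    for upper, label in bands:
        if score < upper:
            return label
    return "very positive"

print(emotion_band(valence_score("I feel stressed and tired today")))
```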

Real-time emotion analysis service with big data-based user face recognition (빅데이터 기반 사용자 얼굴인식을 통한 실시간 감성분석 서비스)

  • Kim, Jung-Ah;Park, Roy C.;Hwang, Gi-Hyun
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.18 no.2
    • /
    • pp.49-54
    • /
    • 2017
  • In this paper, we use a face database to detect human emotion in real time. Although human emotions are defined globally, real emotional perception comes from the subjective judgment of the observer. Therefore, judging human emotions using computer image-processing technology requires sophisticated techniques. To recognize emotion, the human face must first be detected accurately, and the emotion should then be recognized based on the detected face. In this paper, faces are detected and emotions are analyzed by matching the detected faces against the Cohn-Kanade Database, one of the standard face databases.
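
The face-detection step this pipeline relies on is commonly handled with OpenCV's bundled Haar cascade. The sketch below is a minimal illustration assuming opencv-python and a hypothetical input frame.jpg; it covers only the detection stage, not the emotion matching against the Cohn-Kanade data.

```python
# Sketch: Haar-cascade face detection, the first step of the pipeline.
# opencv-python assumed; "frame.jpg" is a hypothetical input image.
import cv2

# OpenCV ships a pre-trained frontal-face cascade with the package.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

frame = cv2.imread("frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Returns (x, y, w, h) boxes; each box would then be cropped and
# compared against the emotion-labeled reference database.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("frame_with_faces.jpg", frame)
print(f"Detected {len(faces)} face(s)")
```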


A Study on Emotion Classification using 4-Channel EEG Signals (4채널 뇌파 신호를 이용한 감정 분류에 관한 연구)

  • Kim, Dong-Jun;Lee, Hyun-Min
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.2 no.2
    • /
    • pp.23-28
    • /
    • 2009
  • This study describes an emotion classification method using two different feature parameters extracted from four-channel EEG signals. One of the parameters is the set of linear prediction coefficients based on AR modelling. The other is the set of cross-correlation coefficients across the θ, α, and β frequency bands of the FFT spectra. Using the linear prediction coefficients and the cross-correlation coefficients, an emotion classification test for four emotions, namely anger, sadness, joy, and relaxation, is performed with an artificial neural network. The results show that the linear prediction coefficients produced better emotion classification results than the cross-correlation coefficients of the FFT spectra.
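
Linear prediction (AR) coefficients of the kind used as features here can be estimated per channel by ordinary least squares on lagged samples. The sketch below, using NumPy and scikit-learn on synthetic placeholder signals, concatenates per-channel AR coefficients and feeds them to a small neural network; the AR order, labels, and data are hypothetical, not the paper's setup.

```python
# Sketch: AR (linear prediction) coefficients per EEG channel as features
# for a neural-network emotion classifier. NumPy / scikit-learn assumed;
# the signals, AR order, and labels are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier

def ar_coefficients(x: np.ndarray, order: int = 6) -> np.ndarray:
    """Fit an AR(order) model to a 1-D signal by least squares."""
    # Lagged design matrix: x[t] ~ sum_k a_k * x[t-k], k = 1..order
    X = np.column_stack([x[order - k - 1: -k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 40, 4, 512
eeg = rng.standard_normal((n_trials, n_channels, n_samples))   # placeholder EEG
labels = rng.integers(0, 4, size=n_trials)   # anger / sadness / joy / relaxation

# Feature vector per trial: AR coefficients concatenated over the 4 channels.
features = np.array([
    np.concatenate([ar_coefficients(trial[ch]) for ch in range(n_channels)])
    for trial in eeg
])

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(features, labels)
print("training accuracy:", clf.score(features, labels))
```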
