• Title/Abstract/Keyword: use of emotion

Search results: 708 items (processing time: 0.028 s)

뇌파 스펙트럼 분석과 베이지안 접근법을 이용한 정서 분류 (Emotion Classification Using EEG Spectrum Analysis and Bayesian Approach)

  • 정성엽;윤현중
    • 산업경영시스템학회지 / Vol. 37 No. 1 / pp. 1-8 / 2014
  • This paper proposes an emotion classifier for EEG signals based on Bayes' theorem and machine learning with a perceptron convergence algorithm. Emotions are represented on the valence and arousal dimensions. Fast Fourier transform (FFT) spectrum analysis is used to extract features from the EEG signals. To verify the proposed method, we use the DEAP open database for emotion analysis using physiological signals and compare the classifier with C-SVC, one of the support vector machine formulations. Emotions are defined as two-level and three-level classes in both the valence and arousal dimensions. For the two-level case, the accuracy of valence and arousal estimation is 67% and 66%, respectively; for the three-level case, 53% and 51%. Compared with the best case of the C-SVC, the proposed classifier gave 4% and 8% more accurate estimations of valence and arousal for the two-level classes. For the three-level classes, the proposed method showed performance similar to the best case of the C-SVC.
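The pipeline described above (FFT band-power features fed to a Bayes'-theorem classifier) can be sketched as follows. This is a minimal illustration, not the paper's code: the frequency-band edges and the Gaussian likelihood are assumptions.

```python
import numpy as np

def band_powers(eeg, fs, bands=((4, 8), (8, 13), (13, 30))):
    """FFT spectrum features: mean power in each frequency band.
    The theta/alpha/beta band edges here are assumed, not from the paper."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2
    return np.array([power[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in bands])

class GaussianBayes:
    """Two-level classifier via Bayes' theorem with Gaussian likelihoods."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        self.prior = np.array([np.mean(y == c) for c in self.classes])
        return self
    def predict(self, X):
        # log p(c|x) is proportional to log p(x|c) + log p(c)
        ll = -0.5 * (((X[:, None, :] - self.mu) ** 2) / self.var
                     + np.log(2 * np.pi * self.var)).sum(axis=2)
        return self.classes[np.argmax(ll + np.log(self.prior), axis=1)]
```

In a DEAP-style setup, `band_powers` would be applied per channel and the resulting vectors stacked into the feature matrix `X`.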

Audio and Video Bimodal Emotion Recognition in Social Networks Based on Improved AlexNet Network and Attention Mechanism

  • Liu, Min;Tang, Jun
    • Journal of Information Processing Systems / Vol. 17 No. 4 / pp. 754-771 / 2021
  • In continuous-dimension emotion recognition, the parts that highlight emotional expression differ across modes, and the influences of different modes on the emotional state are also different. This paper therefore studies the fusion of the two most important modes in emotion recognition, voice and visual expression, and proposes a dual-modal emotion recognition method that combines an improved AlexNet network with an attention mechanism. After simple preprocessing of the audio and video signals, prior knowledge is first used to extract audio features. Facial expression features are then extracted by the improved AlexNet network. Finally, a multimodal attention mechanism fuses the facial expression and audio features, and an improved loss function mitigates the modal-missing problem, improving the robustness of the model and the performance of emotion recognition. Experimental results show that the concordance correlation coefficients of the proposed model in the arousal and valence dimensions were 0.729 and 0.718, respectively, which is superior to several comparison algorithms.
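A bare-bones sketch of attention-weighted fusion of two modality feature vectors, in plain NumPy. The scoring vectors `w_a` and `w_f` stand in for learned attention parameters and are hypothetical; the paper's actual mechanism sits on top of AlexNet features and is more elaborate.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_fuse(audio_feat, face_feat, w_a, w_f):
    """Score each modality, turn the scores into attention weights,
    and return the weighted combination of the two feature vectors."""
    scores = np.array([w_a @ audio_feat, w_f @ face_feat])
    alpha = softmax(scores)  # attention weights over {audio, face}
    fused = alpha[0] * audio_feat + alpha[1] * face_feat
    return fused, alpha
```

One way such a scheme can cope with a missing modality is to mask the absent modality's score to negative infinity, which drives its attention weight to zero; this is an illustration of the idea, not the paper's loss-based solution.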

Sentiment Analysis on 'HelloTalk' App Reviews Using NRC Emotion Lexicon and GoEmotions Dataset

  • Simay Akar;Yang Sok Kim;Mi Jin Noh
    • 스마트미디어저널 / Vol. 13 No. 6 / pp. 35-43 / 2024
  • During the post-pandemic period, the interest in foreign language learning surged, leading to increased usage of language-learning apps. With the rising demand for these apps, analyzing app reviews becomes essential, as they provide valuable insights into user experiences and suggestions for improvement. This research focuses on extracting insights into users' opinions, sentiments, and overall satisfaction from reviews of HelloTalk, one of the most renowned language-learning apps. We employed topic modeling and emotion analysis approaches to analyze reviews collected from the Google Play Store. Several experiments were conducted to evaluate the performance of sentiment classification models with different settings. In addition, we identified dominant emotions and topics within the app reviews using feature importance analysis. The experimental results show that the Random Forest model with topics and emotions outperforms other approaches in accuracy, recall, and F1 score. The findings reveal that topics emphasizing language learning and community interactions, as well as the use of language learning tools and the learning experience, are prominent. Moreover, the emotions of 'admiration' and 'annoyance' emerge as significant factors across all models. This research highlights that incorporating emotion scores into the model and utilizing a broader range of emotion labels enhances model performance.
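Lexicon-based emotion scoring of the kind used above can be sketched in a few lines. The mini-lexicon below is a made-up stand-in: the real NRC Emotion Lexicon maps thousands of words to emotions such as joy and anger, and the GoEmotions label set adds categories like admiration and annoyance.

```python
import re
from collections import Counter

# Hypothetical mini-lexicon standing in for NRC / GoEmotions entries.
LEXICON = {
    "great": {"admiration", "joy"},
    "love": {"joy"},
    "crash": {"annoyance"},
    "ads": {"annoyance"},
    "helpful": {"admiration"},
}

def emotion_scores(review):
    """Count lexicon emotion labels over the tokens of one app review."""
    tokens = re.findall(r"[a-z']+", review.lower())
    scores = Counter()
    for tok in tokens:
        for emo in LEXICON.get(tok, ()):
            scores[emo] += 1
    return scores
```

Per-review counts like these become the emotion features that, together with topic proportions, feed the Random Forest classifier.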

구급차 안전사고에 대한 공분산 구조분석 (A Study on Behavioral Factors for the Safety of Ambulance Driving by Coefficiencial Structural Analysis)

  • 조진만;이태용
    • 한국응급구조학회지 / Vol. 4 No. 1 / pp. 95-100 / 2000
  • This study evaluates behavioral factors affecting the safety of ambulance driving and traffic accidents, and provides statistical information on the factors that could reduce ambulance traffic accidents. The main instrument was the Korean Self-Analysis Driver Opinionnaire, a questionnaire of 8 items measuring drivers' opinions and attitudes: driving courtesy, emotion, traffic law, speed, vehicle condition, use of drugs, high-risk behavior, and human factors. A total of 145 ambulance drivers in Taejon City and six other cities were surveyed from July 5 to July 11, 2000. The data were analyzed by path analysis with the SPSS and AMOS packages. The results are as follows: 1. Path analysis suggested that the risk of an ambulance traffic accident is most strongly affected by emotion control and speed: Y(Accident) = 0.88 X1(Emotion Control) + 0.92 X2(Speed) - 0.46 X3(Traffic Law) + E. 2. Covariance structural analysis gave the corresponding model Y(Accident) = 0.398 X1(Emotion Control) + 0.500 X2(Speed) - 0.263 X3(Traffic Law) + E.
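The path model from result 2 can be written directly as a function; the coefficients come from the abstract, and the residual term E defaults to zero.

```python
def accident_risk(emotion_control, speed, traffic_law, e=0.0):
    """Y(Accident) = 0.398 X1(Emotion Control) + 0.500 X2(Speed)
                     - 0.263 X3(Traffic Law) + E
    Coefficients as reported by the covariance structural analysis."""
    return 0.398 * emotion_control + 0.500 * speed - 0.263 * traffic_law + e
```

The signs track the abstract's conclusion: poorer emotion control and higher speed raise the modeled accident risk, while adherence to traffic law lowers it.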


딥러닝 기반의 다범주 감성분석 모델 개발 (Development of Deep Learning Models for Multi-class Sentiment Analysis)

  • 알렉스 샤이코니;서상현;권영식
    • 한국IT서비스학회지 / Vol. 16 No. 4 / pp. 149-160 / 2017
  • Sentiment analysis is the process of determining whether a piece of a document, text, or conversation expresses a positive, negative, neutral, or other emotion. Sentiment analysis has been applied to several real-world applications, such as chatbots. In the last five years, the practical use of chatbots has become prevalent in many fields of industry. In chatbot applications, to recognize the user's emotion, sentiment analysis must be performed in advance in order to understand the speaker's intent, and the specific emotion involves more than labeling sentences positive or negative. In this context, we propose deep learning models for multi-class sentiment analysis that identify a speaker's emotion, categorized as joy, fear, guilt, sadness, shame, disgust, or anger. We develop convolutional neural network (CNN), long short-term memory (LSTM), and multi-layer neural network models as deep neural network models for detecting emotion in a sentence, and also apply a word embedding step. In our experiments, the LSTM model performs best, compared with the convolutional and multi-layer neural networks. We also show the practical applicability of the deep learning models to sentiment analysis for chatbots.
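The LSTM that performed best above is built from a gated recurrence. A single time step can be sketched in NumPy; the stacked weight layout and gate ordering here follow a common convention and are not taken from the paper's code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step. W: (4H, D), U: (4H, H), b: (4H,);
    the four row-blocks are the input, forget, cell, and output gates."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i, f, g, o = z[:H], z[H:2*H], z[2*H:3*H], z[3*H:]
    c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)   # update cell memory
    h_new = sigmoid(o) * np.tanh(c_new)                # gated output
    return h_new, c_new
```

Running this step over a sentence's embedded word vectors and feeding the final hidden state into a softmax over the seven emotion classes yields a multi-class sentiment model of the kind the paper evaluates.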

신경회로망 기반 감성 인식 비젼 시스템 (Vision System for NN-based Emotion Recognition)

  • 이상윤;김성남;주영훈;박창현;심귀보
    • 대한전기학회:학술대회논문집 / 대한전기학회 2001년도 하계학술대회 논문집 D / pp. 2036-2038 / 2001
  • In this paper, we propose a neural-network-based emotion recognition method for intelligently recognizing human emotion with a vision system. In the proposed method, human emotion is divided into four emotions (surprise, anger, happiness, sadness). We use RGB (red, green, blue) color image data together with gray image data to obtain a highly reliable rate of feature point extraction. To this end, we propose an algorithm that extracts four feature points (eyebrow, eye, nose, mouth) from the face image acquired by a color CCD camera and derives feature vectors from them. We then apply a back-propagation algorithm to the secondary feature vector (positions of and distances among the feature points). Finally, we show the practical applicability of the proposed method.
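The "secondary feature vector" above (positions of and distances among the four feature points) can be sketched as follows; the exact composition of the paper's vector is not specified, so this is an assumed layout.

```python
import numpy as np

def secondary_features(points):
    """points: (4, 2) coordinates for eyebrow, eye, nose, mouth.
    Returns the 8 raw coordinates plus the 6 pairwise distances,
    the kind of vector fed to a back-propagation network."""
    points = np.asarray(points, dtype=float)
    dists = [np.linalg.norm(points[i] - points[j])
             for i in range(4) for j in range(i + 1, 4)]
    return np.concatenate([points.ravel(), dists])
```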


정서재활 바이오피드백을 위한 얼굴 영상 기반 정서인식 연구 (Study of Emotion Recognition based on Facial Image for Emotional Rehabilitation Biofeedback)

  • 고광은;심귀보
    • 제어로봇시스템학회논문지 / Vol. 16 No. 10 / pp. 957-962 / 2010
  • To recognize human emotion from a facial image, we first need to extract emotional features from the image with a feature extraction algorithm, and then classify the emotional state with a pattern classification method. The AAM (Active Appearance Model) is a well-known method for representing a non-rigid object such as a face or a facial expression. The Bayesian network is a probability-based classifier that can represent the probabilistic relationships among a set of facial features. In this paper, our approach to facial feature extraction is a proposed method that combines AAM with FACS (Facial Action Coding System) to automatically model and extract the facial emotional features. To recognize the facial emotion, we use DBNs (Dynamic Bayesian Networks) to model and understand the temporal phases of facial expressions in image sequences. The result of emotion recognition can then be used in biofeedback-based rehabilitation for the emotionally disabled.
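One time-slice of DBN inference over emotion states amounts to a predict-correct update. This sketch assumes a first-order transition matrix and per-frame observation likelihoods (e.g. from the AAM/FACS features); the paper's network structure is richer than this.

```python
import numpy as np

def dbn_update(belief, transition, likelihood):
    """Propagate the belief over emotion states through one frame:
    predict with the transition model, correct with the per-frame
    observation likelihood, then renormalize."""
    predicted = transition.T @ belief     # P(state_t | evidence up to t-1)
    posterior = predicted * likelihood    # multiply by P(obs_t | state_t)
    return posterior / posterior.sum()
```

Iterating this update over an image sequence is what lets the model track the temporal phases (onset, apex, offset) of an expression rather than classifying each frame in isolation.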

빅데이터 기반 사용자 얼굴인식을 통한 실시간 감성분석 서비스 (Real-time emotion analysis service with big data-based user face recognition)

  • 김정아;박찬홍;황기현
    • 융합신호처리학회논문지 / Vol. 18 No. 2 / pp. 49-54 / 2017
  • In this paper, emotions are recognized using a face database in order to detect human emotion in real time. Although human emotions are defined in dictionaries, emotion recognition in practice depends on the subjective judgment of the observer, so determining human emotion with computer image processing demands a high level of technology. To recognize an emotion, the human face must first be detected accurately, and the emotion must then be recognized from the detected face. In this paper, faces were detected by applying the Cohn-Kanade Database, one of the available face databases, to the detected face region.


An Emotion-based Image Retrieval System by Using Fuzzy Integral with Relevance Feedback

  • Lee, Joon-Whoan;Zhang, Lei;Park, Eun-Jong
    • 대한전자공학회:학술대회논문집 / 대한전자공학회 2008년도 하계종합학술대회 / pp. 683-688 / 2008
  • Emotional information processing simulates and recognizes human sensibility and emotion in order to realize a natural and harmonious human-machine interface. This paper proposes an emotion-based image retrieval method in which the user chooses a linguistic query from a set of emotional adjectives. The system then shows corresponding representative images that have been pre-evaluated by experts, and the user selects one of them to initiate traditional content-based image retrieval (CBIR). In this way, any CBIR system can easily be extended to emotion-based image retrieval. In the CBIR part of our system, we use several color and texture visual descriptors recommended by MPEG-7, and we propose a fuzzy similarity measure based on the Choquet integral. For communication between the system and the user, a relevance feedback mechanism represents human subjectivity in image retrieval. This improves retrieval performance and also satisfies the user's individual preference.
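The Choquet-integral similarity can be illustrated with the standard discrete Choquet integral over per-descriptor similarity scores. The fuzzy measure `mu` over descriptor coalitions is assumed given; how the paper fits it is not described here.

```python
def choquet(scores, mu):
    """Discrete Choquet integral of per-criterion scores in [0, 1]
    with respect to a fuzzy measure mu: frozenset(indices) -> weight."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    total, prev = 0.0, 0.0
    for k, i in enumerate(order):
        # Weight the increment by the measure of all criteria
        # scoring at least this high.
        total += (scores[i] - prev) * mu[frozenset(order[k:])]
        prev = scores[i]
    return total
```

Unlike a plain weighted average, the fuzzy measure can encode interaction between descriptors, e.g. color and texture agreeing can count for more than either similarity alone.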


임상간호사의 감성지능과 대인관계능력, 직무만족도의 관계 (The Relationships among Emotional Intelligence, Interpersonal Relationship, and Job Satisfaction of Clinical Nurses)

  • 고현록;김정희
    • 한국간호교육학회지 / Vol. 20 No. 3 / pp. 413-423 / 2014
  • Purpose: The purpose of this study was to investigate the relations among emotional intelligence, interpersonal relationship, and job satisfaction among clinical nurses. Methods: Data were collected by self-reported questionnaire from 315 nurses who had worked for more than 6 months at five general hospitals. The collected data were analyzed with descriptive statistics, t-test, ANOVA, Scheffe test, Pearson's correlation, and hierarchical regression using the SPSS 18.0 program. Results: The mean score of emotional intelligence was 3.42, with self emotional appraisal scoring highest. The mean score of interpersonal relationship was 3.44, with intimacy scoring highest. The mean score of job satisfaction was 3.04. Emotional intelligence and interpersonal relationship were positively correlated with job satisfaction. Hierarchical multiple regression analysis showed that use of emotion was the main factor affecting job satisfaction, explaining 30.8% of the variance in nurses' job satisfaction together with intimacy, regulation of emotion, position, and monthly salary. Conclusion: These findings indicate that emotional intelligence and interpersonal relationship, especially use and regulation of emotion and intimacy, contribute to nurses' job satisfaction. It is necessary to develop and implement programs to increase emotional intelligence and interpersonal relationship in order to improve job satisfaction.
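The hierarchical-regression quantity reported above, the variance explained when a predictor block enters after earlier variables, can be sketched with ordinary least squares. The variable names are illustrative; the study's actual blocks and controls are as listed in the abstract.

```python
import numpy as np

def r_squared(X, y):
    """OLS R^2 with an intercept column added."""
    Xb = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    resid = y - Xb @ beta
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - np.sum(resid ** 2) / ss_tot

def delta_r2(earlier, block, y):
    """R^2 gained by entering `block` (e.g. use-of-emotion scores)
    after the variables already in the model (e.g. position, salary)."""
    return r_squared(np.column_stack([earlier, block]), y) - r_squared(earlier, y)
```

A reported figure like "30.8% of the variance" corresponds to the cumulative R^2 of the final step's model, with each block's contribution read off as its delta.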