• Title/Summary/Keyword: Emotion recognition system

Search results: 220

Proposal of Emotion Recognition Service in Mobile Health Application (모바일 헬스 애플리케이션의 감정인식 서비스 제안)

  • Ha, Mina;Lee, Yoo Jin;Park, Seung Ho
    • Design Convergence Study
    • /
    • v.15 no.1
    • /
    • pp.233-246
    • /
    • 2016
  • The mobile health industry, combined with IT, has been attracting attention, and health applications have been developed to help users lead a healthy lifestyle. First, five mobile health applications were selected and reviewed in terms of their service trends; none of them handled any emotional data, only physical data. Second, to extract users' emotions, technological research was sorted into categories, and the results implied that text-based emotion recognition is the most suitable technology for the mobile health service. To implement the service, the application and the emotion recognition process were designed and developed based on these findings. A one-dimensional emotion model, the standard for classifying emotional data, and a social network service were set up as sources. Finally, the proposed health application was combined with persuasive technology. As a result, this paper proposes an overall service process, a concrete service scheme, and guidelines containing 15 services in accordance with the five emotions and time. It is expected to serve as a direction for indicators that consider an individual's psychological context.

Enhancing e-Learning Interactivity via Emotion Recognition Computing Technology (감성 인식 컴퓨팅 기술을 적용한 이러닝 상호작용 기술 연구)

  • Park, Jung-Hyun;Kim, InOk;Jung, SangMok;Song, Ki-Sang;Kim, JongBaek
    • The Journal of Korean Association of Computer Education
    • /
    • v.11 no.2
    • /
    • pp.89-98
    • /
    • 2008
  • Providing appropriate interaction between the learner and the e-Learning system is an essential factor in a successful e-Learning system. Although many interaction functions are employed in multimedia Web-based Instruction content, learners experience a lack of the real-time feedback a human educator would give, due to the limitations of Human-Computer Interaction techniques. In this paper, an emotion recognition system based on learner facial expressions has been developed and applied to a tutoring system. As a human educator would, the system observes learners' emotions from their facial expressions and provides pertinent feedback. Such varied feedback can motivate learners and relieve the isolation of self-directed e-Learning environments. The test results showed that the system may provide significant improvement in terms of interest and educational achievement.


Implementation of Pet Management System including Deep Learning-based Breed and Emotion Recognition SNS (딥러닝 기반 품종 및 감정인식 SNS를 포함하는 애완동물 관리 시스템 구현)

  • Inhwan Jung;Kitae Hwang;Jae-Moon Lee
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.23 no.3
    • /
    • pp.45-50
    • /
    • 2023
  • As the ownership of pets has steadily increased in recent years, the need for an effective pet management system has grown. In this study, we propose a pet management system with a deep learning-based emotion recognition SNS. The system detects emotions through pet facial expressions using a convolutional neural network (CNN) and shares them with a user community through SNS. Through SNS, pet owners can connect with other users, share their experiences, and receive support and advice for pet management. Additionally, the system provides comprehensive pet management, including pet health tracking and vaccination and appointment reminders. Furthermore, we added a function to manage and share pet walking records so that pet owners can share their walking experiences with other users. This study demonstrates the potential of utilizing AI technology to improve pet management systems and enhance the well-being of pets and their owners.
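The CNN-based recognition step described in this abstract can be sketched as a minimal forward pass. Everything here is illustrative: the emotion labels, filter count, and input size are hypothetical stand-ins, and the filters and classifier head are untrained random weights rather than the paper's trained model.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (single channel), the core CNN operation."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def classify_emotion(face, kernels, weights, labels):
    """Tiny CNN forward pass: conv -> ReLU -> global average pool -> linear -> softmax."""
    feats = np.array([np.maximum(conv2d(face, k), 0).mean() for k in kernels])
    probs = softmax(weights @ feats)
    return labels[int(np.argmax(probs))], probs

rng = np.random.default_rng(0)
labels = ["happy", "sad", "angry", "relaxed"]    # hypothetical emotion classes
face = rng.random((32, 32))                      # stand-in for a pet face crop
kernels = rng.standard_normal((8, 3, 3))         # 8 random 3x3 filters
weights = rng.standard_normal((len(labels), 8))  # untrained classifier head
emotion, probs = classify_emotion(face, kernels, weights, labels)
print(emotion)
```

A real system would learn the filters and head from labeled pet-face images; only the forward structure is shown here.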

Performance Enhancement of Phoneme and Emotion Recognition by Multi-task Training of Common Neural Network (공용 신경망의 다중 학습을 통한 음소와 감정 인식의 성능 향상)

  • Kim, Jaewon;Park, Hochong
    • Journal of Broadcast Engineering
    • /
    • v.25 no.5
    • /
    • pp.742-749
    • /
    • 2020
  • This paper proposes a method for recognizing both phonemes and emotions using a common neural network, together with a multi-task training method for that network. The common neural network performs the same function for both recognition tasks, which corresponds to the way humans recognize multiple kinds of information with a single auditory system. Multi-task training performs feature modeling that is commonly applicable to multiple kinds of information and provides generalized training, which improves performance by reducing the overfitting that occurs in conventional individual training for each task. A method for increasing phoneme recognition performance is also proposed that weights the phoneme task during multi-task training. When using the same feature vector and neural network, it is confirmed that the proposed common neural network with multi-task training provides higher performance than individual networks trained for each task.
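The weighted multi-task objective described above, a shared network feeding both a phoneme head and an emotion head with the phoneme loss up-weighted, can be sketched as follows. The layer sizes, class counts, and task weights (`w_phone`, `w_emo`) are hypothetical choices; the abstract does not specify the actual architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Shared trunk plus two task heads (linear layers, for illustration only).
W_shared = rng.standard_normal((16, 40)) * 0.1   # 40-dim acoustic feature -> 16-dim shared rep
W_phone = rng.standard_normal((5, 16)) * 0.1     # 5 hypothetical phoneme classes
W_emo = rng.standard_normal((4, 16)) * 0.1       # 4 hypothetical emotion classes

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def joint_loss(x, y_phone, y_emo, w_phone=2.0, w_emo=1.0):
    """Weighted multi-task cross-entropy; w_phone > w_emo emphasizes the phoneme task."""
    h = np.tanh(W_shared @ x)          # shared representation used by both tasks
    p_phone = softmax(W_phone @ x[:40] @ np.eye(40)[:, :16] if False else W_phone @ h)
    p_emo = softmax(W_emo @ h)
    return -(w_phone * np.log(p_phone[y_phone]) + w_emo * np.log(p_emo[y_emo]))

x = rng.standard_normal(40)            # one frame of acoustic features
loss = joint_loss(x, y_phone=2, y_emo=1)
print(loss)
```

Training would backpropagate this single scalar through both heads into the shared trunk, which is what gives the shared features their generalization across tasks.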

A Design and Implementation Digital Vessel Bio Emotion Recognition LED Control System (디지털 선박 생체 감성 인식 LED 조명 제어 시스템 설계 및 구현)

  • Song, Byoung-Ho;Oh, Il-Whan;Lee, Seong-Ro
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.48 no.2
    • /
    • pp.102-108
    • /
    • 2011
  • Existing vessel lighting control systems have several problems, including complex construction and high establishment and maintenance costs. In this paper, we designed a low-cost, high-performance lighting control system for a digital vessel environment. We propose a system that recognizes the user's emotions from biological information (pulse sensor, blood pressure sensor, blood sugar sensor, etc.) collected through wireless sensors, and controls the LED lights accordingly. The system classifies emotions using a backpropagation algorithm, trained on 3,000 data sets, and achieved about 88.7% accuracy. The classified emotion is then matched against the 20 color-emotion models in HP's 'The Meaning of Color' to find the most appropriate point, and the waves or frequencies of the red, green, and blue LED lamps are adjusted to control their brightness and contrast. With this method, the system saved about 20% of the electricity consumed.
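A minimal sketch of the backpropagation classifier described above, assuming three bio-signal features and three emotion classes (the abstract gives neither the real network shape nor the label set); the toy data and its labeling rule are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-ins for (pulse, blood pressure, blood sugar) readings, scaled to [0, 1].
X = rng.random((200, 3))
# Hypothetical rule mapping the signals to 3 emotion classes (illustration only).
y = (X.sum(axis=1) * 0.999 // 1).astype(int)  # 0, 1, or 2

def one_hot(y, n):
    out = np.zeros((len(y), n))
    out[np.arange(len(y)), y] = 1.0
    return out

T = one_hot(y, 3)
W1 = rng.standard_normal((3, 8)) * 0.5   # input -> hidden
W2 = rng.standard_normal((8, 3)) * 0.5   # hidden -> output

def forward(X):
    H = np.tanh(X @ W1)
    Z = H @ W2
    E = np.exp(Z - Z.max(axis=1, keepdims=True))
    return H, E / E.sum(axis=1, keepdims=True)

losses = []
for _ in range(300):                  # plain gradient descent with backpropagation
    H, P = forward(X)
    losses.append(-np.mean(np.sum(T * np.log(P + 1e-12), axis=1)))
    dZ = (P - T) / len(X)             # gradient of cross-entropy w.r.t. logits
    dW2 = H.T @ dZ
    dH = dZ @ W2.T
    dW1 = X.T @ (dH * (1 - H ** 2))   # tanh derivative
    W1 -= 0.5 * dW1
    W2 -= 0.5 * dW2

print(f"loss {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The paper's system would feed the predicted class into the LED color lookup; here only the training loop is shown.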

A Design and Implementation of Music & Image Retrieval Recommendation System based on Emotion (감성기반 음악.이미지 검색 추천 시스템 설계 및 구현)

  • Kim, Tae-Yeun;Song, Byoung-Ho;Bae, Sang-Hyun
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.47 no.1
    • /
    • pp.73-79
    • /
    • 2010
  • Emotion-aware computing can process human emotion through learning and adaptation, enabling more efficient interaction between humans and computers. Like sight and hearing, music and images are perceived in a short time yet leave a lasting impression, and understanding and translating human emotion is key to successful marketing. In this paper, we design a retrieval system that matches music and images to a user's emotion keyword (irritability, gloom, calmness, joy). The proposed system defines four emotional situations and uses a music/image and emotion ontology to retrieve normalized music and images. Sampling image feature information and measuring similarity yields the desired results, while paired correspondence analysis and factor analysis map the image emotion recognition information onto a single space for classification. Experimental findings show that the proposed system achieved an 82.4% matching rate for the four emotional conditions.

Emotion-based music visualization using LED lighting control system (LED조명 시스템을 이용한 음악 감성 시각화에 대한 연구)

  • Nguyen, Van Loi;Kim, Donglim;Lim, Younghwan
    • Journal of Korea Game Society
    • /
    • v.17 no.3
    • /
    • pp.45-52
    • /
    • 2017
  • This paper proposes a new strategy for emotion-based music visualization. An emotional LED lighting control system is suggested to help audiences enhance their musical experience. In the system, emotion in music is recognized by a proposed algorithm using a dimensional approach. The algorithm uses music emotion variation detection to overcome some weaknesses of Thayer's model in detecting emotion in a one-second music segment. In addition, the IRI color model is combined with Thayer's model to determine LED light colors corresponding to 36 different music emotions, which are represented on the LED lighting control system through colors and animations. The accuracy of the music emotion visualization reached over 60%.
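The dimensional mapping from a recognized music emotion to an LED color can be sketched as below. This is not the paper's IRI/Thayer mapping of 36 emotions, just an illustrative hue-wheel analogue: hue follows the angle of the (valence, arousal) point on the emotion plane and brightness follows its intensity.

```python
import colorsys
import math

def emotion_to_led(valence, arousal):
    """Map a (valence, arousal) point in [-1, 1]^2 to an RGB LED color.

    Illustrative mapping only: hue tracks the angle on the emotion plane,
    brightness tracks the distance from the neutral origin.
    """
    angle = math.atan2(arousal, valence)                         # position on the plane
    hue = (angle / (2 * math.pi)) % 1.0                          # angle -> hue wheel
    value = 0.5 + 0.5 * min(1.0, math.hypot(valence, arousal))   # stronger emotion = brighter
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, value)
    return tuple(round(c * 255) for c in (r, g, b))

rgb = emotion_to_led(0.8, 0.6)   # a point in the high-valence, high-arousal quadrant
print(rgb)
```

A real controller would then drive PWM duty cycles for the red, green, and blue channels from this tuple.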

Design of Emotion Recognition system utilizing fusion of Speech and Context based emotion recognition in Smartphone (스마트폰에서 음성과 컨텍스트 기반 감정인식 융합을 활용한 감정인식 시스템 설계)

  • Cho, Seong Jin;Lee, Seongho;Lee, Sungyoung
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2012.07a
    • /
    • pp.323-324
    • /
    • 2012
  • Unlike in the past, when services were delivered uniformly and one-way to consumers, many services in today's smartphone environment attempt to deliver personalized services more efficiently for each user. Among these efforts, research using emotion recognition aims to provide personalized services optimized for the user: by recognizing the user's emotion and providing a service that matches it, the user feels more immersed, which in turn can encourage use of a particular service. In this paper, we extract the user's emotion from user preferences and context information, fuse it with speech-based emotion recognition to increase accuracy, and propose a system design usable in real services. The proposed system compensates for errors in emotion judgments based on user preference and context recognition through speech-based emotion recognition, and minimizes the user's cost of using the emotion recognition system. The proposed system can be applied to applications and recommendation services on smartphones that make use of user emotion.
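The speech/context fusion this design describes can be sketched as a simple weighted late fusion of two emotion probability distributions. The weight `alpha`, the label set, and the scores are all hypothetical; the paper's actual fusion rule is not given in the abstract.

```python
def fuse(p_speech, p_context, alpha=0.6):
    """Weighted late fusion of two emotion probability distributions.

    alpha is an illustrative weight on the speech-based recognizer;
    (1 - alpha) weights the preference/context-based recognizer.
    """
    fused = {e: alpha * p_speech.get(e, 0.0) + (1 - alpha) * p_context.get(e, 0.0)
             for e in set(p_speech) | set(p_context)}
    total = sum(fused.values())
    return {e: v / total for e, v in fused.items()}   # renormalize to a distribution

p_speech = {"joy": 0.5, "anger": 0.3, "neutral": 0.2}    # toy speech-based scores
p_context = {"joy": 0.2, "anger": 0.1, "neutral": 0.7}   # toy context-based scores
fused = fuse(p_speech, p_context)
print(max(fused, key=fused.get))
```

When the two recognizers disagree, the fused distribution softens either one's error, which is the compensation effect the abstract describes.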


Enhancing Music Recommendation Systems Through Emotion Recognition and User Behavior Analysis

  • Qi Zhang
    • Journal of the Korea Society of Computer and Information
    • /
    • v.29 no.5
    • /
    • pp.177-187
    • /
    • 2024
  • Existing music recommendation systems do not sufficiently consider the discrepancy between the intended emotions conveyed by song lyrics and the actual emotions felt by users. In this study, we generate topic vectors for lyrics and user comments using the LDA model, and construct a user preference model by combining user behavior trajectories reflecting time decay effects and playback frequency, along with statistical characteristics. Empirical analysis shows that our proposed model recommends music with higher accuracy compared to existing models that rely solely on lyrics. This research presents a novel methodology for improving personalized music recommendation systems by integrating emotion recognition and user behavior analysis.
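The preference model described above, topic vectors combined with time-decayed playback frequency, can be sketched as follows. The exponential decay, the half-life, and the cosine-similarity ranking are illustrative choices and not necessarily the paper's exact formulation.

```python
import numpy as np

def preference_vector(topic_vecs, days_ago, play_counts, half_life=30.0):
    """Time-decayed, play-count-weighted average of song topic vectors.

    topic_vecs:  (n_songs, n_topics) LDA topic distributions of played songs
    days_ago:    how long ago each song was last played
    play_counts: playback frequency per song
    The 2^(-days/half_life) decay and the 30-day half-life are illustrative.
    """
    w = np.asarray(play_counts) * 2.0 ** (-np.asarray(days_ago) / half_life)
    v = (w[:, None] * topic_vecs).sum(axis=0) / w.sum()
    return v / v.sum()   # keep it a probability-like topic mix

def recommend(candidates, pref):
    """Rank candidate songs by cosine similarity of their topic vectors to the preference."""
    sims = [float(c @ pref / (np.linalg.norm(c) * np.linalg.norm(pref)))
            for c in candidates]
    return int(np.argmax(sims)), sims

history = np.array([[0.7, 0.2, 0.1],    # toy LDA topic mixes of two played songs
                    [0.1, 0.8, 0.1]])
pref = preference_vector(history, days_ago=[1, 60], play_counts=[5, 20])
best, sims = recommend(np.array([[0.6, 0.3, 0.1], [0.1, 0.1, 0.8]]), pref)
print(best, pref)
```

The old but heavily played song and the recent lightly played song end up with comparable weight, which is the trade-off the time-decay term encodes.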

Motion based Autonomous Emotion Recognition System: A Preliminary Study on Bodily Map according to Type of Emotional Stimuli (동작 기반 Autonomous Emotion Recognition 시스템: 감정 유도 자극에 따른 신체 맵 형성을 중심으로)

  • Jungeun Bae;Myeongul Jung;Youngwug Cho;Hyungsook Kim;Kwanguk (Kenny) Kim
    • Journal of the Korea Computer Graphics Society
    • /
    • v.29 no.3
    • /
    • pp.33-43
    • /
    • 2023
  • Emotions affect not only physical sensations but also physical movements, and the responses vary depending on the type of emotional stimulus. However, the effects of emotional stimuli on the activation of bodily movements have not been rigorously examined, nor have they been investigated in Autonomous Emotion Recognition (AER) systems. In this study, we compared the emotional responses of 20 participants to three types of emotional stimuli (words, pictures, and videos) and investigated the resulting activation or deactivation for the AER system. Dependent measures included emotional responses collected via computer-based self-report and bodily movements recorded with motion capture devices. The results suggest that video stimuli elicited higher levels of emotional movement, and that emotional movement patterns were similar across stimulus types for happiness, sadness, anger, and neutrality. Additionally, the bodily changes observed during video stimuli yielded the highest classification accuracy. These findings have implications for future research on the bodily changes elicited by emotional stimuli.