• Title/Summary/Keyword: Animal Emotion Recognition


The Effects of Animal Experience Activities on Young Children's Emotional Intelligence and Resilience (동물체험활동이 유아의 정서지능 및 심리적 건강성에 미치는 영향)

  • Lee, Soeun; Lim, Hui Yoon
    • Korean Journal of Childcare and Education, v.10 no.3, pp.121-135, 2014
  • The purpose of this study was to investigate the effects of animal experience activities on young children's emotional intelligence and resilience. Seventy 3- and 4-year-olds were divided into an experimental group, in which children participated in animal experience activities twelve times, and a control group. Data were analyzed with means, t-tests, and ANCOVA. The results revealed that children who took part in the animal experience activities scored higher on emotional intelligence, including utilization of emotion, recognition and consideration of others' emotions, recognition and expression of one's own emotions, and emotional adjustment and impulse control. In addition, 4-year-olds scored higher in resilience than 3-year-olds, and the interaction between group and age indicated that the positive effects of the animal experience activities were more pronounced in 4-year-olds.
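
A group comparison of the kind reported above can be sketched with an independent-samples t-test. All scores below are invented for illustration; the study's actual data, group sizes, and the pretest-adjusted ANCOVA are not reproduced here.

```python
import numpy as np
from scipy import stats

# Hypothetical post-test emotional-intelligence scores; values invented.
experimental = np.array([4.2, 3.9, 4.5, 4.1, 4.3, 4.0])
control      = np.array([3.4, 3.0, 3.5, 3.3, 3.2, 3.6])

# Independent-samples t-test comparing the two group means.
t, p = stats.ttest_ind(experimental, control)
print(f"t = {t:.2f}, p = {p:.4f}")
```

The reported ANCOVA additionally adjusts the post-test comparison for pretest scores, which a plain t-test does not.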

A Deep Learning System for Emotional Cat Sound Classification and Generation (감정별 고양이 소리 분류 및 생성 딥러닝 시스템)

  • Joo Yong Shim; SungKi Lim; Jong-Kook Kim
    • The Transactions of the Korea Information Processing Society, v.13 no.10, pp.492-496, 2024
  • Cats are known to express their emotions through a variety of vocalizations during interactions. These sounds reflect their emotional states, so understanding and interpreting them is crucial for more effective communication. Recent advancements in artificial intelligence have introduced research on emotion recognition, particularly the analysis of voice data with deep learning models. Against this background, the study aims to develop a deep learning system that classifies and generates cat sounds according to their emotional content. The classification model is trained to accurately categorize cat vocalizations by emotion. The sound generation model, built on deep learning architectures such as SampleRNN, is designed to produce cat sounds that reflect specific emotional states. The study finally proposes an integrated system that takes recorded cat vocalizations, classifies them by emotion, and generates cat sounds according to user requirements.
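
The classification stage can be sketched as a minimal, untrained pipeline: a rough log-spectrogram front end (standing in for the mel filterbank such systems typically use) followed by a linear softmax head. The emotion label set, feature sizes, and weights below are all assumptions for illustration; the paper's actual models are trained deep networks.

```python
import numpy as np

EMOTIONS = ["content", "hungry", "angry", "scared"]  # assumed label set

def log_spec_features(wave, n_fft=512, hop=256, n_bands=32):
    """Rough log-spectrogram feature: framed FFT magnitudes pooled into
    coarse bands (a crude stand-in for a true mel filterbank)."""
    frames = [wave[i:i + n_fft] for i in range(0, len(wave) - n_fft, hop)]
    spec = np.abs(np.fft.rfft(np.array(frames) * np.hanning(n_fft), axis=1))
    bands = spec[:, :(spec.shape[1] // n_bands) * n_bands]
    bands = bands.reshape(len(frames), n_bands, -1).mean(axis=2)
    return np.log1p(bands)

def classify(features, weights, bias):
    """Linear softmax head over time-averaged features (untrained)."""
    logits = features.mean(axis=0) @ weights + bias
    probs = np.exp(logits - logits.max())
    return probs / probs.sum()

rng = np.random.default_rng(0)
wave = rng.standard_normal(16000)        # 1 s of fake audio at 16 kHz
feats = log_spec_features(wave)
W = rng.standard_normal((32, len(EMOTIONS))) * 0.01
b = np.zeros(len(EMOTIONS))
probs = classify(feats, W, b)
print(dict(zip(EMOTIONS, probs.round(3))))
```

In the real system the softmax head would be replaced by a trained deep network, and the generation side would condition a model such as SampleRNN on the predicted emotion.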

The effect of orientation on recognizing object representation (규범적 표상의 방향성 효과)

  • Jung, Hyo-Sun; Lee, Seung-Bok; Jung, Woo-Hyun
    • Science of Emotion and Sensibility, v.11 no.4, pp.501-510, 2008
  • The purpose of this study was to investigate whether the orientation of the head position across different categories affects the reaction time and accuracy of object recognition. Fifty-four right-handed undergraduate students participated in the experiment. Participants performed word-picture matching tasks that varied in the head direction of the object (left-headed or right-headed) and the object category (natural: animal, or artificial: tool). Participants were asked to decide whether each picture matched the word that preceded it. For accuracy, no statistically significant difference was found for either animal or tool pictures, owing to a ceiling effect. For reaction time, the interaction between category and orientation was statistically significant, and of the main effects only category was significant. In the animal condition, reaction times were faster for left-headed than for right-headed presentations, whereas no statistically significant difference was found in the tool condition. Thus, the orientation of an object's canonical representation differs across categories. The faster reaction times in the animal condition imply that the canonical representation of animals is left-headed, possibly due to the orientation of the face.
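
The reported pattern of simple effects (an orientation difference for animals but not for tools) can be sketched with per-category t-tests. The reaction times below are invented to mirror that pattern; the study's actual analysis used a full factorial design.

```python
import numpy as np
from scipy import stats

# Hypothetical reaction times (ms); values invented to mirror the reported
# pattern: left-headed animals recognized faster, no difference for tools.
animal_left  = np.array([520, 530, 515, 525, 518, 528])
animal_right = np.array([580, 575, 585, 590, 578, 583])
tool_left    = np.array([560, 555, 565, 558, 562, 559])
tool_right   = np.array([561, 557, 563, 556, 560, 562])

results = {}
for name, a, b in [("animal", animal_left, animal_right),
                   ("tool", tool_left, tool_right)]:
    t, p = stats.ttest_ind(a, b)   # simple effect of orientation per category
    results[name] = (t, p)
    print(f"{name}: t = {t:.2f}, p = {p:.4f}")
```

A two-way ANOVA over category and orientation would additionally yield the interaction term the abstract reports.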


A Pilot Study on Outpainting-powered Pet Pose Estimation (아웃페인팅 기반 반려동물 자세 추정에 관한 예비 연구)

  • Gyubin Lee; Youngchan Lee; Wonsang You
    • Journal of the Institute of Convergence Signal Processing, v.24 no.1, pp.69-75, 2023
  • In recent years, there has been growing interest in deep learning-based animal pose estimation, especially in animal behavior analysis and healthcare. However, existing animal pose estimation techniques do not perform well when body parts are occluded or absent from the image. In particular, occlusion of a dog's tail or ears can significantly degrade performance in pet behavior and emotion recognition. In this paper, to address this problem, we propose a simple yet novel framework for pet pose estimation in which the pose is predicted on an outpainted image: body parts hidden outside the input image are reconstructed by an image outpainting network that precedes the pose estimation network. We conducted a preliminary study to test the feasibility of the proposed approach, assessing CE-GAN and BAT-Fill for image outpainting and SimpleBaseline for pet pose estimation. Our experimental results show that pet pose estimation on outpainted images generated with BAT-Fill outperforms pose estimation on the original input image without outpainting.
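
The outpaint-then-estimate pipeline can be sketched with stubs standing in for the real networks: `outpaint` plays the role of a model such as BAT-Fill and `estimate_pose` that of SimpleBaseline. Both stub bodies are placeholders (mirror padding and fixed dummy keypoints); only the coordinate shift back to the original frame reflects the pipeline's actual logic.

```python
import numpy as np

def outpaint(image, pad=32):
    """Stand-in for an outpainting network such as BAT-Fill: extends the
    canvas so body parts cropped at the border can be reconstructed.
    Here we merely mirror-pad; the real model hallucinates content."""
    return np.pad(image, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")

def estimate_pose(image):
    """Stand-in for a pose network such as SimpleBaseline: returns named
    (x, y) keypoints. Here it emits dummy coordinates for illustration."""
    h, w = image.shape[:2]
    return {"nose": (w * 0.5, h * 0.3), "tail_base": (w * 0.5, h * 0.8)}

def pose_with_outpainting(image, pad=32):
    """Proposed pipeline: outpaint first, estimate pose on the extended
    canvas, then shift keypoints back into the original coordinate frame."""
    extended = outpaint(image, pad)
    keypoints = estimate_pose(extended)
    return {k: (x - pad, y - pad) for k, (x, y) in keypoints.items()}

img = np.zeros((128, 96, 3), dtype=np.uint8)  # dummy pet photo
kp = pose_with_outpainting(img)
print(kp)
```

Note that keypoints recovered on the outpainted canvas may land outside the original image bounds (negative or oversized coordinates), which is precisely how the framework reports occluded body parts.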