• Title/Summary/Keyword: Gesture Pattern Recognition


AdaBoost-Based Gesture Recognition Using Time Interval Trajectory Features (시간 간격 특징 벡터를 이용한 AdaBoost 기반 제스처 인식)

  • Hwang, Seung-Jun;Ahn, Gwang-Pyo;Park, Seung-Je;Baek, Joong-Hwan
    • Journal of Advanced Navigation Technology, v.17 no.2, pp.247-254, 2013
  • With the recent spread of 3D smart TVs, 3D gesture recognition for controlling equipment has become a highly challenging task. In this paper, the AdaBoost algorithm is applied to 3D gesture recognition using a Kinect sensor. Time-interval trajectories of the hand, wrist, and arm are tracked with Kinect, and the AdaBoost algorithm is used to train and classify 3D gestures. Experimental results demonstrate that the proposed method successfully extracts trained gestures from continuous hand, wrist, and arm motion in real time.
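The abstract names AdaBoost over trajectory features but gives no implementation details. As a rough, hypothetical sketch of the boosting step only (toy 1-D features stand in for the Kinect trajectory features; none of this code is from the paper):

```python
import math

def train_adaboost(xs, ys, rounds=10):
    """Minimal AdaBoost with 1-D threshold stumps; labels are +1 / -1."""
    n = len(xs)
    w = [1.0 / n] * n                     # uniform sample weights
    ensemble = []                         # list of (alpha, threshold, polarity)
    thresholds = sorted(set(xs))
    for _ in range(rounds):
        # pick the stump with the lowest weighted error
        best = None
        for t in thresholds:
            for pol in (1, -1):           # stump: pol if x >= t else -pol
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if (pol if x >= t else -pol) != y)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = max(err, 1e-12)             # avoid log(0) on perfect stumps
        if err >= 0.5:
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, pol))
        # reweight: misclassified samples gain weight, then normalize
        w = [wi * math.exp(-alpha * y * (pol if x >= t else -pol))
             for wi, x, y in zip(w, xs, ys)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted vote of all stumps."""
    score = sum(alpha * (pol if x >= t else -pol)
                for alpha, t, pol in ensemble)
    return 1 if score >= 0 else -1

# toy demo: two well-separated gesture "feature" clusters
xs = [0.0, 1.0, 2.0, 10.0, 11.0, 12.0]
ys = [-1, -1, -1, 1, 1, 1]
model = train_adaboost(xs, ys, rounds=5)
```

In the paper's setting the stump features would be replaced by the time-interval trajectory features of hand, wrist, and arm.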

Recognition of hand gestures with different prior postures using EMG signals (사전 자세에 따른 근전도 기반 손 제스처 인식)

  • Hyun-Tae Choi;Deok-Hwa Kim;Won-Du Chang
    • Journal of Internet of Things and Convergence, v.9 no.6, pp.51-56, 2023
  • Hand gesture recognition is an essential technology for people who have difficulty communicating through spoken language. Electromyography (EMG), which is often used for hand gesture recognition, is expected to face difficulties because people's movements vary depending on their prior postures, yet studies on this subject are rare. In this study, we conducted tests to confirm whether prior postures affect the accuracy of gesture recognition. Data were recorded from 20 subjects with different prior postures. We achieved average accuracies of 89.6% and 52.65% when the prior states of the training and test data were identical and different, respectively. Accuracy increased when both prior states were considered, which confirms the need to account for a variety of prior states in EMG-based hand gesture recognition.
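The abstract does not describe its feature pipeline; windowed RMS is a common EMG preprocessing step and serves here only as a hypothetical illustration of what "EMG features" typically means (not the authors' pipeline):

```python
import math

def rms_features(signal, win, step):
    """Windowed root-mean-square features over a raw EMG channel.

    signal: list of samples; win: window length; step: hop size.
    Returns one RMS value per window, a standard EMG amplitude feature.
    """
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        seg = signal[start:start + win]
        feats.append(math.sqrt(sum(v * v for v in seg) / win))
    return feats

# toy demo: a quiet segment followed by a burst of muscle activity
emg = [0, 0, 0, 0, 4, -4, 4, -4]
features = rms_features(emg, win=4, step=4)   # [0.0, 4.0]
```

A classifier trained on such features from one prior posture would then be tested on another to reproduce the paper's comparison.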

Application of Sensor Network Using Multivariate Gaussian Function to Hand Gesture Recognition (Multivariate Gaussian 함수를 이용한 센서 네트워크의 수화 인식에의 적용)

  • Kim Sung-Ho;Han Yun-Jong;Bogdana Diaconescu
    • Journal of Institute of Control, Robotics and Systems, v.11 no.12, pp.991-995, 2005
  • Sensor networks are the result of the convergence of important technologies such as wireless communication and micro-electromechanical systems. In recent years, sensor networks have found wide applicability in fields such as health, environment and habitat monitoring, and military applications. An important step in many of these applications is the pattern classification and recognition of data collected by sensors installed or deployed in different ways. However, pattern classification and recognition are sometimes difficult to perform. A systematic approach to pattern classification based on modern learning techniques, such as multivariate Gaussian mixture models, can greatly simplify the process of developing and implementing real-time classification models. This paper proposes a new recognition system that is hierarchically composed of many sensor nodes with simple processing and wireless communication capabilities. The proposed system is able to classify sensed data using the multivariate Gaussian function. To verify its usefulness, the proposed system was applied to hand gesture recognition.
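As a minimal sketch of classification with a multivariate Gaussian function, here is a maximum-likelihood classifier assuming diagonal covariances for brevity (the paper may well use full covariances or mixtures; all names and data here are illustrative):

```python
import math

def fit_gaussian(samples):
    """Per-dimension mean and variance (diagonal covariance) for one class."""
    d, n = len(samples[0]), len(samples)
    mean = [sum(s[i] for s in samples) / n for i in range(d)]
    var = [max(sum((s[i] - mean[i]) ** 2 for s in samples) / n, 1e-6)
           for i in range(d)]                 # floor variance for stability
    return mean, var

def log_likelihood(x, mean, var):
    """Log-density of a diagonal multivariate Gaussian at point x."""
    return sum(-0.5 * (math.log(2 * math.pi * v) + (xi - m) ** 2 / v)
               for xi, m, v in zip(x, mean, var))

def classify(x, models):
    """Pick the class whose Gaussian assigns x the highest likelihood."""
    return max(models, key=lambda c: log_likelihood(x, *models[c]))

# toy demo: two classes of 2-D "sensor readings"
models = {
    "gesture_A": fit_gaussian([(0.0, 0.0), (0.2, 0.1), (-0.1, 0.2)]),
    "gesture_B": fit_gaussian([(5.0, 5.0), (5.1, 4.9), (4.8, 5.2)]),
}
```

In the hierarchical system described above, such a model could run on nodes that only have simple processing capability, since evaluation is a handful of multiplications per dimension.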

Emotion Recognition Based on Human Gesture (인간의 제스쳐에 의한 감정 인식)

  • Song, Min-Kook;Park, Jin-Bae;Joo, Young-Hoon
    • Journal of the Korean Institute of Intelligent Systems, v.17 no.1, pp.46-51, 2007
  • This paper presents gesture analysis for human-robot interaction. Understanding human emotions through gesture is one of the skills computers need in order to interact intelligently with their human counterparts. Gesture analysis consists of several processes, such as hand detection, feature extraction, and emotion recognition. For efficient operation, gestures are recognized with an HMM (Hidden Markov Model). We constructed a large gesture database, with which we verified our method. As a result, our method was successfully included and operated in a mobile system.
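As a hedged illustration of the HMM evaluation step such recognizers rely on, here is the standard forward algorithm for scoring an observation sequence against a discrete HMM (a toy model, not the paper's code; each gesture class would have its own HMM and the highest-scoring one wins):

```python
def forward_probability(obs, start, trans, emit):
    """Probability of an observation sequence under a discrete HMM.

    obs:   list of symbol indices
    start: start[s]      = P(first state is s)
    trans: trans[s][s2]  = P(next state s2 | current state s)
    emit:  emit[s][o]    = P(symbol o | state s)
    """
    states = range(len(start))
    # alpha[s] = P(obs so far, current state s)
    alpha = [start[s] * emit[s][obs[0]] for s in states]
    for o in obs[1:]:
        alpha = [sum(alpha[sp] * trans[sp][s] for sp in states) * emit[s][o]
                 for s in states]
    return sum(alpha)

# toy demo: 2 states, 2 observation symbols, all-uniform HMM
start = [0.5, 0.5]
trans = [[0.5, 0.5], [0.5, 0.5]]
emit = [[0.5, 0.5], [0.5, 0.5]]
p = forward_probability([0, 1, 0], start, trans, emit)   # 0.5**3 = 0.125
```

Production systems work in log space to avoid underflow on long sequences; the plain-probability form above is kept for clarity.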

A Framework for 3D Hand Gesture Design and Modeling (삼차원 핸드 제스쳐 디자인 및 모델링 프레임워크)

  • Kwon, Doo-Young
    • Journal of the Korea Academia-Industrial cooperation Society, v.14 no.10, pp.5169-5175, 2013
  • We present a framework for 3D hand gesture design and modeling. We adapted two pattern matching techniques, Dynamic Time Warping (DTW) and Hidden Markov Models (HMMs), to support the registration and evaluation of 3D hand gestures as well as their recognition. One key ingredient of our framework is a concept for convenient gesture design and registration using HMMs. DTW is used to recognize hand gestures with limited training data and to evaluate how similar a performed gesture is to its template gesture. We facilitate the use of visual sensors and body sensors for capturing both locative and inertial gesture information. In our experimental evaluation, we designed 18 example hand gestures and analyzed the performance of the recognition methods and gesture features under various conditions. We also discuss the variability between users in gesture performance.
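The DTW comparison the abstract describes can be sketched in a few lines; this is the textbook dynamic-programming recurrence on 1-D sequences, not the paper's own code (the paper's gestures are 3-D trajectories, where the per-point distance would be Euclidean rather than absolute difference):

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences.

    Returns the minimal cumulative cost of aligning a with b when
    either sequence may be locally stretched or compressed in time.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = minimal cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])        # local distance
            cost[i][j] = d + min(cost[i - 1][j],       # a stretched
                                 cost[i][j - 1],       # b stretched
                                 cost[i - 1][j - 1])   # one-to-one step
    return cost[n][m]

# toy demo: the same gesture performed at different speeds matches exactly
template = [1, 2, 3]
slow_performance = [1, 2, 2, 3]
score = dtw_distance(template, slow_performance)   # 0.0
```

This time-warping invariance is why DTW works with limited training data: a single template per gesture is often enough.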

Emotional Human Body Recognition by Using Extraction of Human Body from Image (인간의 움직임 추출을 이용한 감정적인 행동 인식 시스템 개발)

  • Song, Min-Kook;Joo, Young-Hoon;Park, Jin-Bae
    • Proceedings of the KIEE Conference, 2006.10c, pp.214-216, 2006
  • Expressive faces and human body gestures are among the main non-verbal communication channels in human-human interaction. Understanding human emotions through body gestures is a necessary skill both for humans and for computers interacting with their human counterparts. Gesture analysis consists of several processes, such as hand detection, feature extraction, and emotion recognition. Skin color information for tracking hand gestures is obtained from the face detection region. We reveal relationships between particular body movements and specific emotions using an HMM (Hidden Markov Model) classifier. The performance of the emotional human body recognition has been evaluated experimentally.


Gesture Recognition using MHI Shape Information (MHI의 형태 정보를 이용한 동작 인식)

  • Kim, Sang-Kyoon
    • Journal of the Korea Society of Computer and Information, v.16 no.4, pp.1-13, 2011
  • In this paper, we propose a gesture recognition system that recognizes motions using the shape information of the MHI (Motion History Image). The system acquires an MHI, which provides information on motion, from input images, and extracts gradient images from the MHI along the X and Y coordinates. It extracts shape information by applying shape context to each gradient image and uses the extracted pattern values as features. It recognizes motions by training and classifying the obtained feature values with an SVM (Support Vector Machine) classifier. The proposed system can recognize the motions of multiple people as well as the direction of movement by using the shape information of the MHI. In addition, it achieves a high recognition rate with a simple feature extraction method.
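The MHI the abstract builds on has a simple update rule. Below is a minimal sketch of the standard Bobick-Davis formulation, with pure-Python nested lists standing in for image arrays (not the paper's code; real systems use OpenCV on full frames):

```python
def update_mhi(mhi, motion_mask, tau):
    """One Motion History Image update step.

    Pixels where motion was detected are set to the maximum timestamp
    value tau; all other pixels decay by 1 toward 0. Recent motion is
    therefore bright and older motion fades, encoding direction of
    movement in the intensity gradient.
    """
    h, w = len(mhi), len(mhi[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if motion_mask[y][x]:
                out[y][x] = tau
            else:
                out[y][x] = max(0, mhi[y][x] - 1)
    return out

# toy demo: motion at the top-left pixel, then one frame of stillness
mhi = [[0, 0], [0, 0]]
mhi = update_mhi(mhi, [[1, 0], [0, 0]], tau=3)   # [[3, 0], [0, 0]]
mhi = update_mhi(mhi, [[0, 0], [0, 0]], tau=3)   # [[2, 0], [0, 0]]
```

The X/Y gradients of this fading intensity are what the paper feeds into shape context and the SVM.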

A Structure and Framework for Sign Language Interaction

  • Kim, Soyoung;Pan, Younghwan
    • Journal of the Ergonomics Society of Korea, v.34 no.5, pp.411-426, 2015
  • Objective: The goal of this study is to design the interaction structure and framework of a sign language recognition system. Background: In sign language, meaningful individual gestures are combined to construct a sentence, so it is difficult for a system to interpret and recognize the meaning of a hand gesture within a sequence of continuous gestures. To interpret the meaning of each individual gesture correctly, an interaction structure and framework are needed that can segment the individual gestures. Method: We analyzed 700 sign language words to structure sign language gesture interaction. First, we analyzed the transformational patterns of hand gestures. Second, we analyzed the movement of these transformational patterns. Third, we analyzed the types of gestures other than hand gestures. Based on this, we designed a framework for sign language interaction. Results: We elicited 8 patterns of hand gestures based on whether the gesture changes from its starting point to its ending point. We then analyzed hand movement based on 3 elements: pattern of movement, direction, and whether the movement repeats. Moreover, we defined 11 movements of gestures other than hand gestures and classified 8 types of interaction. The resulting framework for sign language interaction applies to more than 700 individual gestures of the sign language and can isolate individual gestures even within a sequence of continuous gestures. Conclusion: This study structured sign language interaction in 3 aspects: the transformational patterns of hand shape from starting point to ending point, hand movement, and gestures other than hand gestures. Based on this, we designed a framework that can recognize individual gestures and interpret their meaning more accurately when a meaningful individual gesture appears within a sequence of continuous gestures. Application: The interaction framework can be applied when developing a sign language recognition system. The structured gestures can be used for building sign language databases, developing automatic recognition systems, and studying action gestures in other areas.

Development of Emotion-Based Human Interaction Method for Intelligent Robot (지능형 로봇을 위한 감성 기반 휴먼 인터액션 기법 개발)

  • Joo, Young-Hoon;So, Jea-Yun;Sim, Kee-Bo;Song, Min-Kook;Park, Jin-Bae
    • Journal of the Korean Institute of Intelligent Systems, v.16 no.5, pp.587-593, 2006
  • This paper presents gesture analysis for human-robot interaction. Understanding human emotions through gesture is one of the skills computers need in order to interact intelligently with their human counterparts. Gesture analysis consists of several processes, such as hand detection, feature extraction, and emotion recognition. For efficient operation, gestures are recognized with an HMM (Hidden Markov Model). We constructed a large gesture database, with which we verified our method. As a result, our method was successfully included and operated in a mobile system.

On-line dynamic hand gesture recognition system for the korean sign language (KSL) (한글 수화용 동적 손 제스처의 실시간 인식 시스템의 구현에 관한 연구)

  • Kim, Jong-Sung;Lee, Chan-Su;Jang, Won;Bien, Zeungnam
    • Journal of the Korean Institute of Telematics and Electronics C, v.34C no.2, pp.61-70, 1997
  • Human hand gestures have long been used as a means of communication among people, interpreted as streams of tokens of a language. Signed language is a method of communication for hearing-impaired people, in which articulated gestures and postures of the hands and fingers are commonly used. This paper presents a system that recognizes the Korean Sign Language (KSL) and translates the recognition results into normal Korean text and sound. A pair of data gloves is used as the sensing device for detecting motions of the hands and fingers. In this paper, we propose a dynamic gesture recognition method that employs fuzzy feature analysis for efficient classification of hand motions and applies a fuzzy min-max neural network to on-line pattern recognition.
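As an illustration of the fuzzy min-max neural network's core primitive, here is Simpson's hyperbox membership function in its standard form (a sketch under the textbook formulation; the paper's exact variant and parameters may differ):

```python
def hyperbox_membership(x, vmin, wmax, gamma=1.0):
    """Membership of point x in a fuzzy min-max hyperbox [vmin, wmax].

    Returns 1.0 for points inside the box and decays linearly (at rate
    gamma) with distance outside it, per dimension; the overall
    membership is the minimum over dimensions. Each class is covered by
    a set of such hyperboxes, and a point is assigned to the class of
    its best-matching box.
    """
    def ramp(r):
        # clipped linear penalty for being outside the box on one side
        return max(0.0, min(1.0, gamma * r))
    return min(min(1.0 - ramp(xi - wi),    # penalty above the max corner
                   1.0 - ramp(vi - xi))    # penalty below the min corner
               for xi, vi, wi in zip(x, vmin, wmax))

# toy demo: a unit box in 2-D glove-feature space (values are illustrative)
inside = hyperbox_membership((0.5, 0.5), (0.0, 0.0), (1.0, 1.0))   # 1.0
near = hyperbox_membership((1.5, 0.5), (0.0, 0.0), (1.0, 1.0))     # 0.5
```

Training in the full algorithm expands and contracts these boxes on-line as labeled samples arrive, which is what makes the network suitable for the real-time recognition described above.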
