• Title/Abstract/Keyword: Hands Gesture


연속DP와 칼만필터를 이용한 손동작의 추적 및 인식 (Tracking and Recognizing Hand Gestures using Kalman Filter and Continuous Dynamic Programming)

  • 문인혁;금영광
    • 대한전자공학회:학술대회논문집
    • /
    • 대한전자공학회 2002년도 하계종합학술대회 논문집(3)
    • /
    • pp.13-16
    • /
    • 2002
  • This paper proposes a method to track hand gestures and recognize gesture patterns using a Kalman filter and continuous dynamic programming (CDP). The hand positions are predicted by the Kalman filter, and the pixels corresponding to the hands are extracted by a skin-color filter. The center of gravity of the hands is used as the input pattern vector. The input gesture is then recognized by matching it against reference gesture patterns using CDP. Experimental results on recognizing a circle-shaped gesture and intention gestures such as “Come on” and “Bye-bye” show that the proposed method is feasible for hand gesture-based human-computer interaction.

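The pipeline in the abstract above (Kalman prediction of the hand position, skin-color extraction, CDP matching of the centroid trajectory against reference patterns) can be illustrated with a small sketch. This is not the authors' code: the constant-velocity model, the process-noise parameter q, and the simplified DTW-style CDP recurrence (free start point, normalized end cost) are all assumptions.

```python
# Hypothetical sketch, not the authors' code: a constant-velocity Kalman predict
# step for the hand centroid, plus a simplified CDP/DTW-style spotting score of a
# reference gesture trajectory inside a streaming input trajectory.
import numpy as np

def kalman_predict(x, P, dt=1.0, q=1e-2):
    """Predict step for state [px, py, vx, vy] under a constant-velocity model."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    Q = q * np.eye(4)                       # assumed process-noise covariance
    return F @ x, F @ P @ F.T + Q

def cdp_score(inputs, reference):
    """Normalized alignment cost of `reference` (R x 2) ending at each frame of `inputs` (T x 2)."""
    T, R = len(inputs), len(reference)
    d = np.linalg.norm(inputs[:, None, :] - reference[None, :, :], axis=2)  # T x R local costs
    D = np.full((T, R), np.inf)
    D[:, 0] = d[:, 0]                       # the match may start at any input frame
    for t in range(T):
        for r in range(1, R):
            prev = D[t, r - 1]
            if t > 0:
                prev = min(prev, D[t - 1, r], D[t - 1, r - 1])
            D[t, r] = d[t, r] + prev
    return D[:, R - 1] / R                  # low values mark likely gesture end points

# Usage: a circular reference gesture vs. a noisy circular input trajectory.
x0, P0 = np.array([0.0, 0.0, 0.1, 0.0]), np.eye(4)
x1, P1 = kalman_predict(x0, P0)             # predicted hand state for the next frame
theta = np.linspace(0, 2 * np.pi, 30)
ref = np.c_[np.cos(theta), np.sin(theta)]
obs = ref + 0.05 * np.random.randn(*ref.shape)
print(cdp_score(obs, ref).min())            # small score -> circle gesture detected
```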

A Structure and Framework for Sign Language Interaction

  • Kim, Soyoung;Pan, Younghwan
    • 대한인간공학회지
    • /
    • Vol. 34, No. 5
    • /
    • pp.411-426
    • /
    • 2015
  • Objective: The goal of this study is to design the interaction structure and framework of a system that recognizes sign language. Background: In sign language, meaningful individual gestures are combined to construct a sentence, so it is difficult for a system to interpret and recognize the meaning of a hand gesture within a sequence of continuous gestures. Therefore, to interpret the meaning of each individual gesture correctly, an interaction structure and framework are needed that can segment individual gestures. Method: We analyzed 700 sign language words to structure sign language gesture interaction. First, we analyzed the transformational patterns of the hand shape. Second, we analyzed the movement associated with these transformational patterns. Third, we analyzed the types of gestures other than hand gestures. Based on this, we designed a framework for sign language interaction. Results: We derived 8 hand-shape patterns based on whether the gesture changes from the starting point to the ending point. We then analyzed hand movement in terms of 3 elements: movement pattern, direction, and whether the movement is repeated. Moreover, we defined 11 movements of gestures other than hand gestures and classified 8 types of interaction. The resulting framework for sign language interaction applies to more than 700 individual sign language gestures and can classify an individual gesture even within a sequence of continuous gestures. Conclusion: For sign language interaction, this study structured gestures in 3 aspects: the transformational pattern of the hand shape from starting point to ending point, the hand movement, and gestures other than hand gestures. Based on this, we designed a framework that can recognize individual gestures and interpret their meaning more accurately when a meaningful individual gesture appears within a sequence of continuous gestures. Application: The interaction framework can be applied when developing sign language recognition systems. The structured gestures can also be used to build sign language databases, to develop automatic recognition systems, and to study action gestures in other areas.
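
The three structural aspects in the abstract above (hand-shape transformation pattern, movement described by pattern/direction/repetition, and non-hand gestures) map naturally onto a small data model. The sketch below is a hypothetical encoding, not the paper's specification; the enum members, field names, and the toy boundary rule are placeholders, since the abstract does not list the concrete 8 patterns, 11 non-hand gestures, or 8 interaction types.

```python
# Hypothetical encoding of the structural elements the framework analyzes:
# hand-shape transformation pattern, movement (pattern, direction, repetition),
# and non-hand gestures. All names are placeholders.
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List

class HandShapePattern(Enum):        # 8 patterns in the paper; names assumed
    UNCHANGED = auto()
    CHANGED_AT_END = auto()

class Direction(Enum):               # assumed direction labels
    UP = auto()
    DOWN = auto()
    LEFT = auto()
    RIGHT = auto()

@dataclass
class Movement:
    pattern: str                     # e.g. "linear", "circular" (assumed)
    direction: Direction
    repeated: bool                   # the paper's third element: repetition

@dataclass
class SignGesture:
    hand_shape: HandShapePattern     # start-to-end hand-shape transformation
    movement: Movement
    non_hand: List[str] = field(default_factory=list)   # subset of the 11 non-hand gestures

def is_boundary(prev: SignGesture, cur: SignGesture) -> bool:
    """Toy rule: treat a change of hand-shape pattern as a boundary between signs."""
    return prev.hand_shape is not cur.hand_shape

# Usage: two consecutive gestures with different hand-shape patterns form a boundary.
a = SignGesture(HandShapePattern.UNCHANGED, Movement("linear", Direction.UP, False))
b = SignGesture(HandShapePattern.CHANGED_AT_END, Movement("circular", Direction.LEFT, True))
print(is_boundary(a, b))             # True
```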

기본 요소분류기를 이용한 가상현실용 실시간 동적 손 제스처 인식 시스템의 구현에 관한 연구 (On-line dynamic hand gesture recognition system for virtual reality using elementary component classifiers)

  • 김종성;이찬수
    • 전자공학회논문지C
    • /
    • Vol. 34C, No. 9
    • /
    • pp.68-76
    • /
    • 1997
  • This paper presents a system which recognizes dynamic hand gestures for virtual reality (VR). A dynamic hand gesture is a method of communication between a computer and a human who uses gestures, especially both hands and fingers. Since human hands and fingers are not identical in physical dimensions, the same form of a gesture produced by two persons with their hands may not yield the same numerical values when obtained through electronic sensors. In this paper, we apply a fuzzy min-max neural network and a feature analysis method using fuzzy logic for on-line pattern recognition.

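Several entries in this list rely on a fuzzy min-max neural network for posture classification. The sketch below is a minimal, assumed illustration of the core idea (Simpson-style hyperbox membership over normalized features, followed by a winner-take-all class decision); the toy feature vectors, hyperboxes, and sensitivity parameter gamma are not taken from the paper.

```python
# A minimal, assumed illustration (not the authors' implementation) of the core of a
# fuzzy min-max classifier: hyperbox membership plus a winner-take-all decision.
import numpy as np

def hyperbox_membership(x, v, w, gamma=4.0):
    """Membership of point x in the hyperbox with min corner v and max corner w."""
    above = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, x - w)))
    below = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, v - x)))
    return np.mean((above + below) / 2)     # 1.0 inside the box, decaying outside

def classify(x, boxes):
    """boxes: list of (v, w, label); returns the label of the most activated hyperbox."""
    return max((hyperbox_membership(x, v, w), label) for v, w, label in boxes)[1]

# Usage with two toy hyperboxes over normalized finger-flexion features.
boxes = [(np.array([0.1, 0.1]), np.array([0.4, 0.4]), "fist"),
         (np.array([0.6, 0.6]), np.array([0.9, 0.9]), "open hand")]
print(classify(np.array([0.2, 0.3]), boxes))   # -> "fist"
```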

On-line Korean Sign Language (KSL) Recognition using Fuzzy Min-Max Neural Network and Feature Analysis

  • Zeungnam Bien;Kim, Jong-Sung
    • 한국지능시스템학회:학술대회논문집
    • /
    • 한국퍼지및지능시스템학회 1995년도 추계학술대회 학술발표 논문집
    • /
    • pp.85-91
    • /
    • 1995
  • This paper presents a system which recognizes the Korean Sign Language (KSL) and translates it into normal Korean speech. A sign language is a method of communication for the deaf-mute, who use gestures, especially of both hands and fingers. Since human hands and fingers are not identical in physical dimensions, the same form of a gesture produced by two signers with their hands may not produce the same numerical values when obtained through electronic sensors. In this paper, we propose a dynamic gesture recognition method based on feature analysis for efficient classification of hand motions, and on a fuzzy min-max neural network for on-line pattern recognition.


Automatic Gesture Recognition for Human-Machine Interaction: An Overview

  • Konkina, Nataliia
    • International Journal of Computer Science & Network Security
    • /
    • Vol. 22, No. 1
    • /
    • pp.129-138
    • /
    • 2022
  • With the increasing reliance on computing systems in our everyday life, there is a constant need to improve the ways users interact with such systems so that interaction becomes more natural, effective, and convenient. In the initial computing revolution, interaction between humans and machines was limited, and the machines were not necessarily meant to be intelligent. This created the need for systems that can automatically identify and interpret human actions. Automatic gesture recognition is one of the popular methods by which users can control systems with their gestures. It covers various kinds of tracking, including the whole body, hands, head, face, etc. We also touch upon a different line of work, including Brain-Computer Interface (BCI) and Electromyography (EMG), as potential additions to the gesture recognition regime. In this work, we present an overview of several applications of automated gesture recognition systems and a brief look at the popular methods employed.

최대 공통 부열을 이용한 비전 기반의 양팔 제스처 인식 (Vision-Based Two-Arm Gesture Recognition by Using Longest Common Subsequence)

  • 최철민;안정호;변혜란
    • 한국통신학회논문지
    • /
    • Vol. 33, No. 5C
    • /
    • pp.371-377
    • /
    • 2008
  • This paper studies vision-based modeling and recognition of human two-arm gestures. We propose an overall framework for two-arm gesture recognition, from feature-point extraction to gesture classification. First, we propose a color-based two-hand tracking method for modeling two-arm gestures, and a gesture phrase analysis method that effectively selects the extracted trajectory information of both hands. We model gestures using the longest common subsequence (LCS) information of training data composed of sequences of the selected feature points, and propose a corresponding similarity measure. The proposed method was applied to the aircraft marshalling hand signals used at airports, and experiments demonstrate its efficiency and recognition performance.
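
A brief sketch of the matching step named in the title: the longest-common-subsequence length between a stored gesture model and an observed sequence, normalized into a similarity score. Quantizing the two-arm trajectory into per-frame symbols is an assumption for illustration; the paper selects its features via the gesture-phrase analysis described above.

```python
# Classic LCS dynamic program applied to hypothetical per-frame two-arm symbols.
def lcs_length(a, b):
    """O(len(a) * len(b)) longest common subsequence length."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[-1][-1]

def lcs_similarity(model, observed):
    """Normalized LCS score in [0, 1]; 1 means the whole model appears in order."""
    return lcs_length(model, observed) / max(len(model), 1)

# Usage: per-frame symbols combining left/right arm direction codes (hypothetical labels).
model    = ["R_up|L_up", "R_up|L_up", "R_out|L_out", "R_down|L_down"]
observed = ["R_up|L_up", "R_out|L_out", "R_out|L_out", "R_down|L_down", "R_rest|L_rest"]
print(lcs_similarity(model, observed))   # 0.75
```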

HRI 시스템에서 제스처 인식을 위한 Moving Mean-Shift 기반 사용자 손 위치 보정 알고리즘 (A Compensation Algorithm for the Position of User Hands Based on Moving Mean-Shift for Gesture Recognition in HRI System)

  • 김태완;권순량;이동명
    • 한국통신학회논문지
    • /
    • Vol. 40, No. 5
    • /
    • pp.863-870
    • /
    • 2015
  • This paper proposes a compensation algorithm for the position of user hands based on Moving Mean-Shift ($CAPUH_{MMS}$) to improve gesture recognition and processing performance when measuring hand position data in a Human Robot Interface (HRI) system equipped with a Kinect sensor. Using a real-time performance simulator developed in-house, the average error improvement ratio of $CAPUH_{MMS}$ for movement trajectories was compared with that of other compensation techniques, $CA_{KF}$ (a Kalman-filter-based compensation algorithm) and $CA_{LSM}$ (a least-squares-method-based compensation algorithm). Experimental results show that the average error improvement ratio of $CAPUH_{MMS}$ was 19.35% on average for up-and-down motion of both hands, which is 13.88% and 16.68% higher than that of $CA_{KF}$ and $CA_{LSM}$, respectively, and 28.54% on average for left-and-right motion of both hands, which is 9.51% and 17.31% higher than that of $CA_{KF}$ and $CA_{LSM}$, respectively.
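
The abstract does not spell out the Moving Mean-Shift procedure itself, so the sketch below is only a generic stand-in under stated assumptions: a sliding-window mean that shifts each raw Kinect hand sample toward the local mean to suppress jitter before gesture processing. The window size and blending weight alpha are assumed parameters, not values from the paper.

```python
# Generic stand-in for a moving-mean style compensation of noisy hand positions.
from collections import deque
import numpy as np

class MovingMeanCompensator:
    def __init__(self, window=5, alpha=0.7):
        self.buf = deque(maxlen=window)    # recent raw (x, y, z) hand positions
        self.alpha = alpha                 # 1.0 = fully trust the window mean

    def update(self, raw_xyz):
        """Return the compensated hand position for one new raw Kinect sample."""
        self.buf.append(np.asarray(raw_xyz, dtype=float))
        mean = np.mean(self.buf, axis=0)
        return (1 - self.alpha) * self.buf[-1] + self.alpha * mean

# Usage on a short, jittery stream of right-hand positions (meters).
comp = MovingMeanCompensator()
for raw in [(0.50, 1.02, 2.0), (0.52, 0.98, 2.0), (0.70, 1.00, 2.0)]:
    print(comp.update(raw))
```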

인간의 행동 인식을 위한 얼굴 방향과 손 동작 해석 (Analysis of Face Direction and Hand Gestures for Recognition of Human Motion)

  • 김성은;조강현;전희성;최원호;박경섭
    • 제어로봇시스템학회논문지
    • /
    • Vol. 7, No. 4
    • /
    • pp.309-318
    • /
    • 2001
  • In this paper, we describe methods that analyze human gestures. A human interface (HI) system for analyzing gestures extracts the head and hand regions after capturing image sequences of an operator's continuous behavior using CCD cameras. As gestures are performed with the operator's head and hand motions, we extract the head and hand regions to analyze gestures and calculate geometrical information from the extracted skin regions. Head motion is analyzed by obtaining the face direction. We model the head as an ellipsoid in 3D coordinates and locate facial features such as the eyes, nose and mouth on its surface. If we know the center of these feature points, the angle of that center within the ellipsoid gives the direction of the face. The hand region obtained from preprocessing may include the arm as well as the hand. To extract only the hand region, we find the wrist line that divides the hand and arm regions. After separating the hand region at the wrist line, we model the hand region as an ellipse for the analysis of hand data; the finger part is represented as a long and narrow shape. We extract hand information such as size, position, and shape.

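The "hand region as an ellipse" step above lends itself to a short sketch using OpenCV's ellipse fitting on the largest blob of a binary skin mask. This is an assumed illustration, not the paper's implementation; the skin mask and the wrist-line segmentation are taken as given, and the OpenCV 4.x findContours return signature is assumed.

```python
# Fit an ellipse to the largest contour of a binary skin mask and report
# its center, axis lengths and orientation (assumed illustration).
import cv2
import numpy as np

def hand_ellipse(mask):
    """mask: uint8 binary image, hand pixels = 255. Returns ((cx, cy), (w, h), angle) or None."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)   # assume the largest blob is the hand
    if len(hand) < 5:                           # fitEllipse needs at least 5 points
        return None
    return cv2.fitEllipse(hand)

# Usage with a synthetic blob standing in for a segmented hand region.
mask = np.zeros((120, 160), np.uint8)
cv2.ellipse(mask, (80, 60), (30, 15), 30, 0, 360, 255, -1)
print(hand_ellipse(mask))   # center near (80, 60); full axis lengths near 30 and 60
```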

한글 수화용 동적 손 제스처의 실시간 인식 시스템의 구현에 관한 연구 (On-line dynamic hand gesture recognition system for the Korean sign language (KSL))

  • 김종성;이찬수;장원;변증남
    • 전자공학회논문지C
    • /
    • Vol. 34C, No. 2
    • /
    • pp.61-70
    • /
    • 1997
  • Human hand gestures have long been used as a means of communication among people, being interpreted as streams of tokens for a language. Sign language is a method of communication for hearing-impaired persons; articulated gestures and postures of the hands and fingers are commonly used in sign language. This paper presents a system which recognizes the Korean Sign Language (KSL) and translates the recognition results into normal Korean text and sound. A pair of data gloves is used as the sensing device for detecting motions of the hands and fingers. In this paper, we propose a dynamic gesture recognition method that employs a fuzzy feature analysis method for efficient classification of hand motions and applies a fuzzy min-max neural network to on-line pattern recognition.


상태 오토마타와 기본 요소분류기를 이용한 가상현실용 실시간 인터페이싱 (Virtual Environment Interfacing based on State Automata and Elementary Classifiers)

  • 김종성;이찬수;송경준;민병의;박치항
    • 한국정보처리학회논문지
    • /
    • Vol. 4, No. 12
    • /
    • pp.3033-3044
    • /
    • 1997
  • This paper describes the implementation of a system that recognizes dynamic hand gestures in real time for the user interface, one of the basic elements of virtual reality. Human hand and finger data vary from person to person even for the same motion, and also vary over time, so that repeating the same motion yields different data. In addition, because the shape and physical structure of the fingers differ between individuals, the same hand posture made by two different people produces different measured values on typical sensing devices. It is also very difficult to clearly distinguish the start and end of a motion in a dynamic hand gesture. In this paper, state automata are used to segment each meaningful motion of a dynamic hand gesture and, with extensibility of the recognition range in mind, we propose a system that recognizes complete hand gestures by classifying hand movements, the elementary components, through feature analysis based on fuzzy theory, and by classifying hand postures with a fuzzy min-max neural network.

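The segmentation role of the state automaton described above (deciding where a meaningful motion starts and ends in the continuous data stream) can be sketched as a two-state machine driven by hand speed. The states, thresholds, and quiet-frame counter below are assumptions for illustration, not the automaton defined in the paper.

```python
# Toy two-state automaton that segments gesture start/end from hand-speed samples.
from enum import Enum

class State(Enum):
    IDLE = 0
    GESTURE = 1

class GestureSegmenter:
    def __init__(self, start_speed=0.15, end_speed=0.05, end_frames=3):
        self.state = State.IDLE
        self.start_speed, self.end_speed, self.end_frames = start_speed, end_speed, end_frames
        self.quiet = 0
        self.segment = []

    def feed(self, speed, sample):
        """Consume one frame; return a completed gesture segment or None."""
        if self.state is State.IDLE:
            if speed > self.start_speed:           # motion starts: open a new segment
                self.state, self.segment, self.quiet = State.GESTURE, [sample], 0
            return None
        self.segment.append(sample)
        self.quiet = self.quiet + 1 if speed < self.end_speed else 0
        if self.quiet >= self.end_frames:          # hand has settled: the gesture ended
            self.state, done, self.segment = State.IDLE, self.segment, []
            return done
        return None

# Usage: speed rises, stays high, then settles; samples are frame indices.
seg = GestureSegmenter()
for i, v in enumerate([0.0, 0.2, 0.3, 0.25, 0.02, 0.01, 0.01, 0.0]):
    out = seg.feed(v, i)
    if out:
        print("gesture frames:", out)   # [1, 2, 3, 4, 5, 6]
```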