• Title/Summary/Keyword: hand motion recognition

Search results: 145

Implementation of the Hand-motion Recognition based Auxiliary Input Device using Gyro Sensor (자이로센서를 이용한 손 동작 인식형 보조 입력장치 구현)

  • Park, Ki-Hong;Lee, Hyun-Jik;Kim, Yoon-Ho
    • Journal of Advanced Navigation Technology / v.13 no.4 / pp.503-508 / 2009
  • In this paper, we have designed an auxiliary input device based on hand-motion recognition. It is aimed at specific groups of users such as the disabled, rehabilitation patients, and the elderly. A gyro sensor is used to recognize hand motion in 3D space, and the RF transceiver operates in the 2.4 GHz band. The prototype board comprises a gyro sensor, an RF transmitter/receiver, an MCU for signal processing, and a USB connector. Experiments conducted to verify the prototype show that both mouse-style cursor motion and program control operate exactly as specified in the design.

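As a rough illustration of the mapping such a device performs, here is a minimal Python sketch that converts gyro angular rates into cursor deltas. The sensitivity constant, the dead zone, and the sample values are assumptions for illustration, not details from the paper.

```python
# Minimal sketch: mapping gyro angular rates to cursor pixel deltas.
# SENSITIVITY and DEAD_ZONE are assumed tuning constants, not the
# paper's values; the MCU/RF pipeline itself is not modeled here.

SENSITIVITY = 8.0   # pixels per (deg/s x s), assumed
DEAD_ZONE = 2.0     # deg/s below which we treat input as drift, assumed

def gyro_to_cursor_delta(yaw_rate_dps: float, pitch_rate_dps: float,
                         dt: float) -> tuple[int, int]:
    """Integrate angular rates over one sample period into pixel deltas."""
    def axis(rate: float) -> int:
        if abs(rate) < DEAD_ZONE:   # suppress small sensor drift
            return 0
        return round(rate * dt * SENSITIVITY)
    return axis(yaw_rate_dps), axis(pitch_rate_dps)

if __name__ == "__main__":
    # One simulated 10 ms sample: hand turning right and tilting up.
    dx, dy = gyro_to_cursor_delta(120.0, -45.0, dt=0.01)
    print(dx, dy)   # -> 10 -4
```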

Alphabetical Gesture Recognition using HMM (HMM을 이용한 알파벳 제스처 인식)

  • Yoon, Ho-Sub;Soh, Jung;Min, Byung-Woo
    • Proceedings of the Korean Information Science Society Conference / 1998.10c / pp.384-386 / 1998
  • The use of hand gestures provides an attractive alternative to cumbersome interface devices for human-computer interaction (HCI). Many methods for hand gesture recognition using visual analysis have been proposed, such as syntactic analysis, neural networks (NN), and hidden Markov models (HMM). In our research, an HMM-based approach is proposed for alphabetical hand gesture recognition. The preprocessing stage consists of three procedures: hand localization, hand tracking, and gesture spotting. The hand localization procedure detects candidate regions on the basis of skin color and motion in an image, using color histogram matching and time-varying edge difference techniques. The hand tracking algorithm finds the centroid of the moving hand region in each frame and connects those centroids to produce a trajectory. For gesture spotting and the feature database, the proposed approach uses mesh feature codes as the HMM codebook. In our experiments, 1300 alphabetical gestures and 1300 untrained gestures are used for training and testing, respectively. The experimental results demonstrate that the proposed approach yields a high, satisfactory recognition rate for images with different sizes, shapes, and skew angles.

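To make the HMM classification step concrete, here is a small sketch that scores a quantized gesture symbol sequence against per-letter HMMs with the scaled forward algorithm and picks the best letter. The model sizes and probabilities are toy values, not the paper's trained models.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """log P(obs | HMM) via the scaled forward algorithm.
    pi: initial probs (S,), A: transitions (S, S), B: emissions (S, V)."""
    alpha = pi * B[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        log_lik += np.log(s)
        alpha /= s
    return log_lik

def classify(obs, models):
    """models: {letter: (pi, A, B)} -> letter with the highest likelihood."""
    return max(models, key=lambda k: forward_log_likelihood(obs, *models[k]))

# Toy 2-state models over an 8-symbol mesh-code vocabulary (illustrative).
rng = np.random.default_rng(0)
def toy_model():
    pi = np.array([0.7, 0.3])
    A = np.array([[0.8, 0.2], [0.3, 0.7]])
    B = rng.dirichlet(np.ones(8), size=2)   # rows sum to 1
    return pi, A, B

models = {letter: toy_model() for letter in "ABC"}
print(classify([0, 1, 2, 3], models))
```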

Hand gesture based a pet robot control (손 제스처 기반의 애완용 로봇 제어)

  • Park, Se-Hyun;Kim, Tae-Ui;Kwon, Kyung-Su
    • Journal of Korea Society of Industrial Information Systems / v.13 no.4 / pp.145-154 / 2008
  • In this paper, we propose a pet robot control system that uses hand gesture recognition on image sequences acquired from a camera affixed to the pet robot. The proposed system consists of four steps: hand detection, feature extraction, gesture recognition, and robot control. The hand region is first detected in the input images using a skin color model in HSI color space and connected-component analysis. Next, hand shape and motion features are extracted from the image sequences, and the hand shape is used to classify meaningful gestures. The hand gesture is then recognized using HMMs (hidden Markov models), whose input is the symbol sequence obtained by quantizing the hand motion. Finally, the pet robot is controlled by the command corresponding to the recognized gesture. We defined four commands for the pet robot: sit down, stand up, lie flat, and shake hands. Experiments show that a user can control the pet robot through the proposed system.

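As a sketch of the recognition-to-control path, the following quantizes a hand-centroid trajectory into direction symbols (the HMM input the abstract describes) and dispatches a recognized gesture to one of the four commands. The 8-direction quantization and the command set are assumptions, and `send_to_robot` is a hypothetical stand-in for the robot interface.

```python
import math

def quantize_trajectory(points, n_dirs=8):
    """Map successive centroid pairs to one of n_dirs direction symbols."""
    symbols = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        ang = math.atan2(y1 - y0, x1 - x0) % (2 * math.pi)
        symbols.append(int(ang / (2 * math.pi) * n_dirs) % n_dirs)
    return symbols

COMMANDS = {"sit_down", "stand_up", "lie_flat", "shake_hands"}

def send_to_robot(command: str) -> None:
    """Hypothetical transport; a real system would forward this to the robot."""
    assert command in COMMANDS
    print(f"robot <- {command}")

# A recognizer (e.g. the abstract's HMMs) would map symbols to a command.
symbols = quantize_trajectory([(0, 0), (5, 1), (10, 2), (15, 2)])
print(symbols)            # mostly symbol 0: rightward motion
send_to_robot("sit_down")
```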

An Implementation of Real-Time Numeral Recognizer Based on Hand Gesture Using Both Gradient and Positional Information (기울기와 위치 정보를 이용한 손동작기반 실시간 숫자 인식기 구현)

  • Kim, Ji-Ho;Park, Yang-Woo;Han, Kyu-Phil
    • KIPS Transactions on Software and Data Engineering / v.2 no.3 / pp.199-204 / 2013
  • This paper presents an implementation method for a real-time, gesture-based numeral recognizer for various information devices. The proposed algorithm continuously captures the motion of a hand in open 3D space with the Kinect sensor. The captured hand motion is simplified with PCA in order to preserve trace consistency and to minimize trace variations due to noise and size changes. In addition, we propose a new HMM that uses both the gradient and the positional features of the simplified hand stroke. As a result, the proposed algorithm is robust to variations in the size and speed of hand motion. The combined model increases the recognition rate by up to 30%, and experimental results show that the proposed algorithm achieves a high recognition rate of about 98%.
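
A minimal sketch of the PCA simplification plus the two feature streams the abstract names, under assumed conventions (projection onto the two dominant axes; a 4x4 grid for positional codes). It is illustrative, not the paper's implementation.

```python
import numpy as np

def pca_project_2d(traj3d):
    """Project a 3D hand trace onto its two principal axes (via SVD)."""
    X = np.asarray(traj3d, dtype=float)
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T        # (N, 2): the dominant plane of the stroke

def stroke_features(traj2d, grid=4):
    """Gradient angles plus grid-cell position codes for each segment."""
    P = np.asarray(traj2d, dtype=float)
    P = (P - P.min(axis=0)) / (np.ptp(P, axis=0) + 1e-9)   # unit box
    d = np.diff(P, axis=0)
    gradients = np.arctan2(d[:, 1], d[:, 0])               # stroke slope
    cells = np.clip((P[:-1] * grid).astype(int), 0, grid - 1)
    positions = cells[:, 1] * grid + cells[:, 0]           # cell index
    return gradients, positions

# Tiny synthetic "stroke" rising along one axis with some depth noise.
traj = [(t, 0.5 * t, 0.01 * (t % 2)) for t in range(10)]
g, p = stroke_features(pca_project_2d(traj))
print(g.round(2), p)
```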

NATURAL INTERACTION WITH VIRTUAL PET ON YOUR PALM

  • Choi, Jun-Yeong;Han, Jae-Hyek;Seo, Byung-Kuk;Park, Han-Hoon;Park, Jong-Il
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2009.01a / pp.341-345 / 2009
  • We present an augmented reality (AR) application for cell phones in which users place a virtual pet on their palm and play and interact with it by moving their hands and fingers naturally. The application is fundamentally based on hand/palm pose recognition and finger motion estimation, which are the main concerns of this paper. We propose a fast and efficient hand/palm pose recognition method that uses natural features extracted from a hand image (e.g. the direction, width, and contour shape of the hand region) together with prior knowledge of hand shape and geometry (e.g. its approximate shape when the palm is open, and the length ratio between palm width and palm height). We also propose a natural interaction method that recognizes natural finger motion, such as opening and closing the palm, based on fingertip tracking. Based on the proposed methods, we developed and tested the AR application on an ultra-mobile PC (UMPC).

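One simple way to realize the open/closed-palm distinction the abstract mentions is contour solidity (contour area over convex-hull area): spread fingers lower it, a fist raises it. This OpenCV sketch is a common heuristic standing in for the paper's feature-based recognizer; the 0.8 threshold is an assumption.

```python
import cv2
import numpy as np

def palm_state(hand_mask: np.ndarray) -> str:
    """Classify a binary hand mask as 'open' or 'closed' by solidity."""
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return "no hand"
    hand = max(contours, key=cv2.contourArea)       # largest blob = hand
    hull = cv2.convexHull(hand)
    solidity = cv2.contourArea(hand) / max(cv2.contourArea(hull), 1e-9)
    return "closed" if solidity > 0.8 else "open"   # assumed threshold

# Synthetic test: a filled circle (fist-like blob) is highly solid.
mask = np.zeros((200, 200), np.uint8)
cv2.circle(mask, (100, 100), 60, 255, -1)
print(palm_state(mask))   # -> "closed"
```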

Development of Motion Recognition Platform Using Smart-Phone Tracking and Color Communication (스마트 폰 추적 및 색상 통신을 이용한 동작인식 플랫폼 개발)

  • Oh, Byung-Hun
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.17 no.5 / pp.143-150 / 2017
  • In this paper, we propose a novel motion recognition platform using smart-phone tracking and color communication. The interface requires only a camera and a personal smart-phone, rather than expensive equipment, to provide motion control. The platform recognizes the user's gestures by tracking the 3D distance and rotation angle of the smart-phone, which essentially acts as a motion controller in the user's hand. A color-coded communication method using RGB color combinations is also included in the interface. Users can conveniently send or receive text data through this function, and the data can be transferred continuously even while the user is performing gestures. We present results from implementing viable content based on the proposed motion recognition platform.
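
To illustrate the color-communication idea, here is a toy codec that packs text bytes three at a time into RGB frames (as might be flashed on the phone screen) and decodes them back. Framing, synchronization, and error handling are omitted; this byte-packing scheme is an assumption, not the paper's actual encoding.

```python
def encode_to_colors(text: str) -> list[tuple[int, int, int]]:
    """Pack UTF-8 bytes into (R, G, B) frames, zero-padded to a multiple of 3."""
    data = text.encode("utf-8")
    data += b"\x00" * (-len(data) % 3)
    return [tuple(data[i:i + 3]) for i in range(0, len(data), 3)]

def decode_from_colors(frames) -> str:
    """Recover the text from a sequence of (R, G, B) frames."""
    raw = bytes(channel for frame in frames for channel in frame)
    return raw.rstrip(b"\x00").decode("utf-8")

frames = encode_to_colors("hello")
print(frames)                          # [(104, 101, 108), (108, 111, 0)]
assert decode_from_colors(frames) == "hello"
```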

Analysis of Face Direction and Hand Gestures for Recognition of Human Motion (인간의 행동 인식을 위한 얼굴 방향과 손 동작 해석)

  • Kim, Seong-Eun;Jo, Gang-Hyeon;Jeon, Hui-Seong;Choe, Won-Ho;Park, Gyeong-Seop
    • Journal of Institute of Control, Robotics and Systems / v.7 no.4 / pp.309-318 / 2001
  • In this paper, we describe methods for analyzing human gestures. A human interface (HI) system for gesture analysis extracts the head and hand regions from image sequences of an operator's continuous behavior captured by CCD cameras. Because gestures are performed with the operator's head and hand motions, we extract the head and hand regions to analyze gestures and calculate geometrical information about the extracted skin regions. Head motion is analyzed by obtaining the face direction: we model the head as an ellipsoid in 3D coordinates and locate facial features such as the eyes, nose, and mouth on its surface. Given the center of these feature points, the angle of that center on the ellipsoid is the direction of the face. The hand region obtained from preprocessing may include the arm as well as the hand, so we find the wrist line that divides the hand and arm regions. After separating the hand region at the wrist line, we model it as an ellipse for the analysis of hand data, with the fingers represented as long, narrow shapes, and extract hand information such as size, position, and shape.

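A worked sketch of the geometric step: given the facial-feature center on an assumed head ellipsoid, recover yaw and pitch. The axis conventions (z toward the camera, y up) and the sample dimensions are assumptions.

```python
import math

def face_direction(center, semi_axes):
    """Yaw/pitch (degrees) of a feature-center point on an ellipsoid.

    center: (x, y, z) of the facial-feature center, relative to the
    ellipsoid center; semi_axes: (a, b, c) of the head ellipsoid.
    """
    cx, cy, cz = center
    a, b, c = semi_axes
    # Normalize to a unit sphere so the angles are not skewed by the axes.
    nx, ny, nz = cx / a, cy / b, cz / c
    yaw = math.degrees(math.atan2(nx, nz))                      # left/right
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, ny))))    # up/down
    return yaw, pitch

# Feature center shifted to the viewer's right on a 9 x 12 x 10 half-axis head.
print(face_direction(center=(4.5, 0.0, 8.66), semi_axes=(9.0, 12.0, 10.0)))
# -> roughly (30.0, 0.0): the face is turned about 30 degrees
```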

A Robust Fingertip Extraction and Extended CAMSHIFT based Hand Gesture Recognition for Natural Human-like Human-Robot Interaction (강인한 손가락 끝 추출과 확장된 CAMSHIFT 알고리즘을 이용한 자연스러운 Human-Robot Interaction을 위한 손동작 인식)

  • Lee, Lae-Kyoung;An, Su-Yong;Oh, Se-Young
    • Journal of Institute of Control, Robotics and Systems / v.18 no.4 / pp.328-336 / 2012
  • In this paper, we propose robust fingertip extraction and extended Continuously Adaptive Mean Shift (CAMSHIFT) based hand gesture recognition for natural, human-like HRI (Human-Robot Interaction). First, for efficient and rapid hand detection, hand candidate regions are segmented by combining a robust $YC_bC_r$ skin color model with AdaBoost using Haar-like features. From the extracted hand candidate regions, we estimate the palm region and fingertip position using distance-transform-based voting and geometrical features of the hand. From the hand orientation and palm center position, we find the optimal fingertip position and its orientation. Then, using extended CAMSHIFT, we reliably track the 2D hand gesture trajectory with the extracted fingertip. Finally, we apply conditional density propagation (CONDENSATION) to recognize pre-defined temporal motion trajectories. Experimental results show that the proposed algorithm not only rapidly extracts the hand region with an accurately extracted fingertip and angle but also robustly tracks the hand under different illumination, size, and rotation conditions. Using these results, we successfully recognize multiple hand gestures.
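
A baseline CAMSHIFT tracking loop with OpenCV, as a stand-in for the paper's extended CAMSHIFT: build a hue histogram of the initial hand window, back-project it into each frame, and let CamShift follow the mode. The skin hue range and the initial window are assumptions.

```python
import cv2
import numpy as np

def track_hand(frames, init_window):
    """Track a hand window across BGR frames; returns the center per frame."""
    x, y, w, h = init_window
    hsv = cv2.cvtColor(frames[0], cv2.COLOR_BGR2HSV)
    roi = hsv[y:y + h, x:x + w]
    skin = cv2.inRange(roi, (0, 40, 60), (25, 255, 255))   # rough skin hues
    hist = cv2.calcHist([roi], [0], skin, [32], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)

    window, centers = init_window, []
    for frame in frames[1:]:
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
        rotated_box, window = cv2.CamShift(backproj, window, criteria)
        centers.append(rotated_box[0])      # (cx, cy) of the tracked hand
    return centers
```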

On-line Motion Control of Avatar Using Hand Gesture Recognition (손 제스터 인식을 이용한 실시간 아바타 자세 제어)

  • Kim, Jong-Sung;Kim, Jung-Bae;Song, Kyung-Joon;Min, Byung-Eui;Bien, Zeung-Nam
    • Journal of the Korean Institute of Telematics and Electronics C / v.36C no.6 / pp.52-62 / 1999
  • This paper presents a system that recognizes dynamic hand gestures on-line for controlling the motion of a human avatar in a virtual environment (VE). A dynamic hand gesture is a method of communication between a computer and a human being using gestures, especially of both hands and fingers. The human avatar has 32 degrees of freedom (DOF) for natural motion in the VE and navigates using 8 pre-defined dynamic hand gestures. Inverse kinematics and dynamic kinematics are applied for real-time motion control of the avatar. In this paper, we apply a fuzzy min-max neural network and a feature analysis method using fuzzy logic to on-line dynamic hand gesture recognition.

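To illustrate the classifier family the abstract names, here is a small sketch of fuzzy min-max membership in the style of Simpson's hyperbox networks: each gesture class owns hyperboxes [v, w], and membership decays linearly outside a box. The sensitivity value gamma is an assumption, and training (hyperbox expansion/contraction) is omitted.

```python
import numpy as np

def membership(x, v, w, gamma=4.0):
    """Fuzzy membership of point x in the hyperbox [v, w] (Simpson-style)."""
    x, v, w = (np.asarray(t, dtype=float) for t in (x, v, w))
    below = np.clip(1.0 - gamma * np.maximum(0.0, v - x), 0.0, 1.0)
    above = np.clip(1.0 - gamma * np.maximum(0.0, x - w), 0.0, 1.0)
    return float(np.minimum(below, above).min())

def classify(x, class_boxes):
    """class_boxes: {gesture: [(v, w), ...]} -> gesture of the best hyperbox."""
    return max(class_boxes,
               key=lambda g: max(membership(x, v, w) for v, w in class_boxes[g]))

# Two toy gesture classes over a 2D feature space (illustrative values).
boxes = {
    "wave":  [((0.0, 0.0), (0.3, 0.3))],
    "point": [((0.6, 0.6), (1.0, 1.0))],
}
print(classify((0.25, 0.2), boxes))   # -> "wave"
```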

Analysis of 3D Motion Recognition using Meta-analysis for Interaction (기존 3차원 인터랙션 동작인식 기술 현황 파악을 위한 메타분석)

  • Kim, Yong-Woo;Whang, Min-Cheol;Kim, Jong-Hwa;Woo, Jin-Cheol;Kim, Chi-Jung;Kim, Ji-Hye
    • Journal of the Ergonomics Society of Korea / v.29 no.6 / pp.925-932 / 2010
  • Most research in the field of three-dimensional interaction has reported different accuracies depending on the sensing, mode, and method used, and interaction implementations have lacked consistency across application fields. This study therefore surveys research trends in three-dimensional interaction through a meta-analysis. Searching databases with related keywords yielded 153 domestic and 188 international papers covering three-dimensional interaction, and analytical coding tables selected 18 domestic and 28 international papers for analysis. Frequency analysis was carried out on the action method, element, number, and accuracy, and accuracy was then verified by the effect size in the meta-analysis. The effect size of sensor-based methods was higher than that of vision-based methods, but the difference was small, with an effect size of 0.02. For hand motion, however, the effect size of vision-based methods was higher than that of sensor-based methods; therefore, sensor-based three-dimensional interaction in general, and vision-based interaction using hand motion, are the more efficient choices to implement. This study provides a comprehensive analysis of three-dimensional motion recognition for interaction and suggests application directions for three-dimensional interaction.
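
As a worked example of the kind of computation behind the reported effect sizes, here is an inverse-variance (fixed-effect) pooling sketch. The study values below are invented for illustration, not the paper's data.

```python
import math

def pooled_effect(effects, variances):
    """Inverse-variance weighted mean effect size and its standard error."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return mean, se

# Three hypothetical studies comparing sensor-based vs. vision-based accuracy.
mean, se = pooled_effect(effects=[0.10, -0.05, 0.02],
                         variances=[0.04, 0.03, 0.05])
print(f"pooled effect = {mean:.3f}, 95% CI +/- {1.96 * se:.3f}")
```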