• Title/Summary/Keyword: 3D Gesture Recognition

A Hand Gesture Recognition System using 3D Tracking Volume Restriction Technique (3차원 추적영역 제한 기법을 이용한 손 동작 인식 시스템)

  • Kim, Kyung-Ho;Jung, Da-Un;Lee, Seok-Han;Choi, Jong-Soo
    • Journal of the Institute of Electronics and Information Engineers / v.50 no.6 / pp.201-211 / 2013
  • In this paper, we propose a hand tracking and gesture recognition system. Our system employs a depth capture device to obtain 3D geometric information about the user's bare hand. In particular, we build a flexible tracking volume and restrict the hand tracking area, so that we can avoid various problems that affect conventional object detection/tracking systems. The proposed system computes a running average of the hand position, and the tracking volume is actively adjusted according to statistical information computed from the uncertainty of the user's hand motion in 3D space. Once the position of the user's hand is obtained, the system attempts to detect stretched fingers in order to recognize finger gestures. To test the proposed framework, we built an NUI system using the proposed technique and verified that it performs very stably even when multiple objects are present simultaneously in a crowded environment, as well as when the scene is temporarily occluded. We also verified that the system maintains a running speed of 24-30 frames per second throughout the experiments.
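
The adaptive tracking volume described above can be pictured as a box centered on a running average of the hand position whose extent grows with the motion's variance. The Python sketch below illustrates only that idea; the class name, smoothing factor, sigma scaling, and millimetre units are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

class AdaptiveTrackingVolume:
    """Sketch of a tracking volume that follows the running average of the
    hand position and grows/shrinks with the motion's uncertainty."""

    def __init__(self, alpha=0.2, k_sigma=3.0, min_half_size=50.0):
        self.alpha = alpha                  # exponential running-average weight
        self.k_sigma = k_sigma              # half-size = k_sigma * std deviation
        self.min_half_size = min_half_size  # lower bound, assumed millimetres
        self.mean = None                    # running mean of (x, y, z)
        self.var = np.zeros(3)              # running variance per axis

    def update(self, hand_pos):
        p = np.asarray(hand_pos, dtype=float)
        if self.mean is None:
            self.mean = p
            return
        # exponentially weighted running average and variance
        diff = p - self.mean
        self.mean = self.mean + self.alpha * diff
        self.var = (1 - self.alpha) * (self.var + self.alpha * diff ** 2)

    def contains(self, point):
        """True if a 3D point lies inside the current tracking volume."""
        half = np.maximum(self.k_sigma * np.sqrt(self.var), self.min_half_size)
        return bool(np.all(np.abs(np.asarray(point) - self.mean) <= half))

# usage: feed each detected hand position, then reject depth points outside
volume = AdaptiveTrackingVolume()
for pos in [(100, 50, 800), (105, 52, 795), (110, 55, 790)]:
    volume.update(pos)
print(volume.contains((112, 56, 792)))   # near the recent positions -> True
print(volume.contains((500, 400, 300)))  # far outside the volume -> False
```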

Developing Interactive Game Contents using 3D Human Pose Recognition (3차원 인체 포즈 인식을 이용한 상호작용 게임 콘텐츠 개발)

  • Choi, Yoon-Ji;Park, Jae-Wan;Song, Dae-Hyeon;Lee, Chil-Woo
    • The Journal of the Korea Contents Association / v.11 no.12 / pp.619-628 / 2011
  • Vision-based 3D human pose recognition technology is commonly used to convey human gestures in HCI (Human-Computer Interaction). A 2D pose-model-based recognition method recognizes only simple 2D human poses in particular environments. In contrast, a 3D pose model, which describes the 3D skeletal structure of the human body, can recognize more complex 3D poses than a 2D pose model because it can use the joint angles and the shape information of body parts. In this paper, we describe the development of interactive game contents using a pose recognition interface based on 3D human body joint information. Our system is designed so that users can control the game contents with body motion without any additional equipment. Poses are recognized by comparing the current input pose with predefined pose templates, each consisting of 3D information for 14 human body joints. We implemented the game contents with our pose recognition system and confirmed the efficiency of the proposed system. In the future, we will improve the system so that poses can be recognized robustly in various environments.
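
A minimal illustration of the template comparison described above: the current 14-joint pose is normalized for position and scale, then matched against predefined templates by Euclidean distance. The function names, normalization, and distance threshold are assumptions for illustration, not the authors' code.

```python
import numpy as np

def normalize_pose(joints):
    """Center the 14 joint positions on their centroid and scale to unit size,
    so the comparison tolerates the player's location and body size."""
    j = np.asarray(joints, dtype=float)          # shape (14, 3)
    j = j - j.mean(axis=0)
    scale = np.linalg.norm(j)
    return j / scale if scale > 0 else j

def match_pose(current, templates, threshold=0.15):
    """Return the name of the closest predefined pose template, or None if no
    template is within the (illustrative) distance threshold."""
    cur = normalize_pose(current)
    best_name, best_dist = None, float("inf")
    for name, tpl in templates.items():
        dist = np.linalg.norm(cur - normalize_pose(tpl)) / len(tpl)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None
```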

Development of a Hand Posture Recognition System Using 3D Hand Model (3차원 손 모델을 이용한 비전 기반 손 모양 인식기의 개발)

  • Jang, Hyo-Young;Bien, Zeung-Nam
    • Proceedings of the KIEE Conference / 2007.04a / pp.219-221 / 2007
  • The recent shift toward ubiquitous computing requires more natural human-computer interaction (HCI) interfaces that provide high information accessibility. Hand gestures, i.e., gestures performed with one or two hands, are emerging as a viable technology to complement or replace conventional HCI technology. This paper deals with hand-posture recognition. Constructing a hand-posture database is important for hand-posture recognition. The human hand is composed of 27 bones, and the movement of its joints is modeled with 23 degrees of freedom. Even for the same hand-posture, captured images may differ depending on the user's characteristics and the relative position between the hand and the cameras. To ease the difficulty of defining hand-postures and to keep the database to an effective size, we present a method using a 3D hand model. The database is constructed from the hand joint angles for each hand-posture and the corresponding silhouette images obtained from many viewpoints by projecting the model onto image planes. The proposed method does not require additional equations to define the movement constraints of each joint. It also makes it easy to obtain images of one hand-posture from many viewpoints and distances, so the database can be constructed more precisely and concretely. The validity of the method is evaluated by applying it to the hand-posture recognition system.
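
One way to build such a silhouette database is to project the posed 3D hand model into virtual image planes placed at many viewpoints around the hand. The pinhole projection sketch below is a generic illustration under assumed camera parameters (focal length, image size, stand-in model points); it is not the paper's renderer.

```python
import numpy as np

def look_at_rotation(eye, target=np.zeros(3), up=np.array([0.0, 1.0, 0.0])):
    """Rotation matrix of a camera placed at `eye` looking toward `target`."""
    forward = target - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, up); right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    return np.stack([right, true_up, -forward])   # rows: camera axes

def project_points(points, eye, focal=500.0, image_size=(320, 240)):
    """Project 3D hand-model points into a virtual image plane (pinhole model)."""
    eye = np.asarray(eye, dtype=float)
    R = look_at_rotation(eye)
    cam = (np.asarray(points, dtype=float) - eye) @ R.T   # world -> camera frame
    z = -cam[:, 2]                                        # camera looks down -Z
    u = focal * cam[:, 0] / z + image_size[0] / 2
    v = focal * cam[:, 1] / z + image_size[1] / 2
    return np.stack([u, v], axis=1)

# a silhouette entry per posture could be built by projecting the posed model
# from many viewpoints distributed around the hand (three shown here)
viewpoints = [(300, 0, 0), (0, 200, 200), (200, 200, 200)]
hand_points = np.random.rand(50, 3) * 80 - 40   # stand-in for posed model vertices
database = {tuple(vp): project_points(hand_points, vp) for vp in viewpoints}
```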

Gesture based Input Device: An All Inertial Approach

  • Chang Wook;Bang Won-Chul;Choi Eun-Seok;Yang Jing;Cho Sung-Jung;Cho Joon-Kee;Oh Jong-Koo;Kim Dong-Yoon
    • International Journal of Fuzzy Logic and Intelligent Systems / v.5 no.3 / pp.230-245 / 2005
  • In this paper, we develop a gesture-based input device equipped with accelerometers and gyroscopes. The sensors provide inertial measurements, i.e., the accelerations and angular velocities produced by the movement of the device when a user inputs gestures on a planar surface or in 3D space. The gyroscope measurements are integrated to give the orientation of the device and are then used to compensate the accelerations. The compensated accelerations are doubly integrated to yield the position of the device. With this approach, a user's gesture input trajectories can be recovered without any external sensors. Three versions of the motion tracking algorithm are provided to cope with a wide spectrum of applications. A Bayesian-network-based recognition system then processes the recovered trajectories to identify the gesture class. Experimental results convincingly show the feasibility and effectiveness of the proposed gesture input device. To demonstrate the practical use of the proposed input method, we implemented a prototype system: a gesture-based remote controller (Magic Wand).
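
The reconstruction chain described in this abstract (integrate angular rate to orientation, rotate and gravity-compensate the accelerations, integrate twice to position) is classic strap-down dead reckoning and can be sketched as below. The first-order integration, the gravity convention, and the absence of drift correction are simplifications; this is not the authors' tracking algorithm.

```python
import numpy as np

def reconstruct_trajectory(gyro, accel, dt, g=np.array([0.0, 0.0, 9.81])):
    """Strap-down sketch: integrate angular rate to orientation, rotate
    body-frame accelerations into the world frame, remove gravity, then
    doubly integrate to position. Real devices also need drift correction."""
    R = np.eye(3)                          # body-to-world rotation
    vel = np.zeros(3)
    pos = np.zeros(3)
    trajectory = [pos.copy()]
    for k in range(len(gyro)):
        # first-order orientation update from angular velocity (rad/s)
        wx, wy, wz = gyro[k]
        omega = np.array([[0.0, -wz,  wy],
                          [ wz, 0.0, -wx],
                          [-wy,  wx, 0.0]])
        R = R @ (np.eye(3) + omega * dt)
        # compensate gravity, then integrate acceleration twice
        world_acc = R @ np.asarray(accel[k], dtype=float) - g
        vel = vel + world_acc * dt
        pos = pos + vel * dt
        trajectory.append(pos.copy())
    return np.array(trajectory)
```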

Effects of Spatio-temporal Features of Dynamic Hand Gestures on Learning Accuracy in 3D-CNN (3D-CNN에서 동적 손 제스처의 시공간적 특징이 학습 정확성에 미치는 영향)

  • Yeongjee Chung
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.23 no.3 / pp.145-151 / 2023
  • 3D-CNN is one of the deep learning techniques for learning time-series data. Such three-dimensional learning can generate many parameters, so it demands high-performance computation and the input can have a large impact on learning speed. When learning dynamic hand gestures in the spatiotemporal domain, improving the efficiency of 3D-CNN learning requires finding the optimal conditions of the input video data by analyzing learning accuracy as the spatiotemporal properties of the input change, without structural changes to the 3D-CNN model. First, the time ratio between dynamic hand gesture actions is adjusted by setting the frame sampling interval of the dynamic hand gesture video data. Second, through 2D cross-correlation analysis between classes, the similarity between image frames of the input video data is measured and normalized to obtain an average value between frames, and the learning accuracy is analyzed. Based on this analysis, this work proposes two methods to effectively select input video data for 3D-CNN deep learning of dynamic hand gestures. Experimental results show that the frame sampling interval and the similarity of image frames between classes can affect the accuracy of the learning model.
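
The frame-similarity measurement can be illustrated with a zero-lag, zero-mean normalized 2D cross-correlation between grayscale frames, averaged over consecutive frames of a clip. The zero-lag simplification, the function names, and the sampling interval shown are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def normalized_cross_correlation(frame_a, frame_b):
    """Zero-mean normalized 2D cross-correlation at zero lag: 1.0 for identical
    frames, near 0 for unrelated ones. Frames are equal-size grayscale arrays."""
    a = frame_a.astype(float) - frame_a.mean()
    b = frame_b.astype(float) - frame_b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def mean_frame_similarity(frames):
    """Average similarity between consecutive frames of a gesture clip; one
    possible way to quantify how redundant the sampled frames are."""
    scores = [normalized_cross_correlation(frames[i], frames[i + 1])
              for i in range(len(frames) - 1)]
    return float(np.mean(scores)) if scores else 0.0

# usage with every n-th frame of a clip (the frame sampling interval is the
# knob being adjusted); `clip` would be a (T, H, W) grayscale array
# clip = ...; print(mean_frame_similarity(clip[::3]))
```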

Motion Plane Estimation for Real-Time Hand Motion Recognition (실시간 손동작 인식을 위한 동작 평면 추정)

  • Jeong, Seung-Dae;Jang, Kyung-Ho;Jung, Soon-Ki
    • The KIPS Transactions: Part B / v.16B no.5 / pp.347-358 / 2009
  • In this paper, we develop a vision-based hand motion recognition system using a camera mounted on two rotational motors. Existing systems were implemented with a range camera or multiple cameras and have a limited working area. In contrast, we use an uncalibrated camera and obtain a wider working area through pan-tilt motion. Given the image sequence provided by the pan-tilt camera, color and pattern information are integrated into a tracking system in order to find the 2D position and direction of the hand. With this pose information, we estimate the 3D motion plane on which the gesture trajectory approximately lies. The 3D trajectory of the moving fingertip is projected onto the motion plane, so that the resolving power for linear gesture patterns is enhanced. We tested the proposed approach in terms of the accuracy of the trace angle and the dimensions of the working volume.
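
Fitting a motion plane and flattening the fingertip trajectory onto it can be sketched with a least-squares plane (via SVD) followed by projection into an in-plane 2D basis. This is a generic sketch of the idea, not the authors' estimation method.

```python
import numpy as np

def fit_motion_plane(points):
    """Least-squares plane through 3D trajectory points via SVD.
    Returns the plane centroid and unit normal."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]                              # direction of least variance
    return centroid, normal

def project_to_plane(points, centroid, normal):
    """Project 3D trajectory points onto the fitted plane and express them in
    2D plane coordinates, which flattens the gesture for recognition."""
    pts = np.asarray(points, dtype=float) - centroid
    pts -= np.outer(pts @ normal, normal)        # remove out-of-plane component
    # build an in-plane orthonormal basis (u, v)
    u = np.cross(normal, [0.0, 0.0, 1.0])
    if np.linalg.norm(u) < 1e-8:                 # normal parallel to the z axis
        u = np.cross(normal, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(normal, u)
    return np.stack([pts @ u, pts @ v], axis=1)
```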

Histogram Based Hand Recognition System for Augmented Reality (증강현실을 위한 히스토그램 기반의 손 인식 시스템)

  • Ko, Min-Su;Yoo, Ji-Sang
    • Journal of the Korea Institute of Information and Communication Engineering / v.15 no.7 / pp.1564-1572 / 2011
  • In this paper, we propose a new histogram-based hand recognition algorithm for augmented reality. A hand recognition system enables useful interaction between a user and a computer. However, vision-based hand gesture recognition is difficult because of viewing-angle dependency arising from the complexity of human hand shapes. The hand recognition system proposed in this paper is based on features derived from hand geometry. The proposed system consists of two steps: in the first step, the hand region is extracted from the image captured by a camera, and in the second step, hand gestures are recognized. We first extract the hand region by removing the background and using skin color information. We then recognize the hand shape by determining hand feature points from a histogram of the extracted hand region. Finally, we build an augmented reality application by controlling a 3D object with the recognized hand gesture. Experimental results show that the proposed algorithm achieves more than 91% accuracy for hand recognition with low computational cost.
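
The two-step pipeline above (skin-color hand extraction, then histogram-based feature points) can be illustrated roughly with OpenCV as below. The YCrCb thresholds and the column-histogram feature are common illustrative choices, not the paper's calibration.

```python
import cv2
import numpy as np

def extract_hand_region(frame_bgr):
    """Rough skin-color segmentation in YCrCb space; the threshold ranges are
    generic illustrative values, not tuned to any particular camera."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    mask = cv2.medianBlur(mask, 5)               # suppress small background speckles
    return mask

def column_histogram_features(mask):
    """Histogram of foreground pixels per image column; peaks and valleys of
    this profile are one simple way to locate fingertip-like feature points."""
    counts = (mask > 0).sum(axis=0).astype(float)
    if counts.max() > 0:
        counts /= counts.max()
    return counts
```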

Study on Virtual Reality (VR) Operating System Prototype (가상환경(VR) 운영체제 프로토타입 연구)

  • Kim, Eunsol;Kim, Jiyeon;Yoo, Eunjin;Park, Taejung
    • Journal of Broadcast Engineering / v.22 no.1 / pp.87-94 / 2017
  • This paper presents a prototype of a virtual reality operating system (VR OS) concept using a head-mounted display (HMD) and hand gesture recognition technology, built on a game engine (Unity3D). We designed and implemented a simple multitasking thread mechanism on top of the real-time environment provided by the Unity3D game engine. Our virtual reality operating system receives user input from a hand gesture recognition device (Leap Motion) to simulate a mouse and keyboard, and provides output via a head-mounted display (Oculus Rift DK2). As a result, our system gives users a broader and more immersive work environment by implementing a 360-degree workspace.
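
As a rough, engine-agnostic illustration of a frame-driven multitasking mechanism of the kind described, the sketch below runs several virtual "apps" cooperatively, one update per rendered frame. It is a conceptual Python stand-in only; the actual prototype runs inside the Unity3D real-time environment, and the class and app names here are invented for illustration.

```python
class VirtualApp:
    """One 'window' in the VR workspace; update() is called once per frame."""
    def __init__(self, name):
        self.name = name
        self.frames = 0

    def update(self, dt):
        self.frames += 1          # app-specific per-frame work would go here

class CooperativeScheduler:
    """Round-robin, frame-driven multitasking: every registered app gets one
    update() call per rendered frame, mirroring an engine's main-loop model."""
    def __init__(self):
        self.apps = []

    def register(self, app):
        self.apps.append(app)

    def tick(self, dt):
        for app in self.apps:
            app.update(dt)

scheduler = CooperativeScheduler()
for name in ("browser", "notepad", "media"):
    scheduler.register(VirtualApp(name))
for _ in range(90):               # simulate ~1.5 s at 60 fps
    scheduler.tick(1 / 60)
print([(a.name, a.frames) for a in scheduler.apps])
```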

Study on the Hand Gesture Recognition System and Algorithm based on Millimeter Wave Radar (밀리미터파 레이더 기반 손동작 인식 시스템 및 알고리즘에 관한 연구)

  • Lee, Youngseok
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.12 no.3 / pp.251-256 / 2019
  • In this paper, we propose a system and algorithm to recognize hand gestures based on millimeter wave radar in the 65 GHz band. The proposed system is composed of a millimeter wave radar board, an analog-to-digital conversion and data capture board, and a notebook computer that runs the gesture recognition algorithms. As feature vectors in the proposed algorithm, we use global and local Zernike moment descriptors, which are robust to distortion caused by rotation or scaling of 2D data. In the experiments, the performance of the proposed algorithm is evaluated and compared with that of algorithms using only a single global or local Zernike descriptor as the feature vector. In the analysis of the confusion matrices, the proposed algorithm shows better precision, accuracy, and sensitivity; its total performance index is 95.6%, compared with 88.4% and 84% for the other two methods.
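
The reported comparison reduces to per-class metrics read off a confusion matrix. The sketch below computes precision, sensitivity (recall), and overall accuracy from such a matrix; the three-gesture counts used are made-up illustrative numbers, not the paper's data.

```python
import numpy as np

def per_class_metrics(confusion):
    """Precision, sensitivity (recall), and overall accuracy from a confusion
    matrix whose rows are true gesture classes and columns are predictions."""
    cm = np.asarray(confusion, dtype=float)
    tp = np.diag(cm)
    precision = tp / np.clip(cm.sum(axis=0), 1e-12, None)    # per predicted class
    sensitivity = tp / np.clip(cm.sum(axis=1), 1e-12, None)  # per true class
    accuracy = tp.sum() / cm.sum()
    return precision, sensitivity, accuracy

# illustrative 3-gesture confusion matrix (counts are invented)
cm = [[48, 1, 1],
      [2, 46, 2],
      [0, 3, 47]]
p, s, a = per_class_metrics(cm)
print(p.round(2), s.round(2), round(a, 3))
```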

Gesture-based Table Tennis Game in AR Environment (증강현실과 제스처를 이용한 비전기반 탁구 게임)

  • Yang, Jong-Yeol;Lee, Sang-Kyung;Kyoung, Dong-Wuk;Jung, Kee-Chul
    • Journal of Korea Game Society / v.5 no.3 / pp.3-10 / 2005
  • We present a computer table tennis game driven by the player's swing motion. We need to transform real-world coordinates into virtual-world coordinates in order to hit the virtual ball, but we cannot obtain an accurate 3D position of the racket with a single camera and simple image processing alone. Therefore, we use the Augmented Reality (AR) concept to develop the game. This paper presents an AR table tennis game using gestures, and a method for developing 3D interaction games using only one camera, without any motion detection device or stereo cameras. We also use a scan-line method to recognize gestures for fast processing. The game is developed using ARToolKit and DirectX, which are widely used tools for game development.
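
The real-to-virtual coordinate transform mentioned above amounts to applying the 4x4 marker pose (model-view) matrix that an AR tracker such as ARToolKit estimates each frame. The sketch below, including the example matrix, scale factor, and point, is purely illustrative and is not the game's actual code.

```python
import numpy as np

def marker_to_virtual(point_marker, model_view, world_scale=1.0):
    """Map a 3D point given in the fiducial-marker frame into the virtual
    (game-world) frame using a 4x4 marker pose matrix estimated per frame.
    The scale factor is an assumed conversion into game units."""
    p = np.append(np.asarray(point_marker, dtype=float), 1.0)   # homogeneous
    p_virtual = np.asarray(model_view, dtype=float) @ p
    return p_virtual[:3] * world_scale

# illustrative pose: marker half a metre in front of the virtual origin
model_view = np.array([[1.0, 0.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0, 0.0],
                       [0.0, 0.0, 1.0, -0.5],
                       [0.0, 0.0, 0.0, 1.0]])
racket_tip_in_marker = (0.0, 0.0, 0.1)     # 10 cm above the marker plane
print(marker_to_virtual(racket_tip_in_marker, model_view))
```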
