• Title/Summary/Keyword: Hand-tracking


Subjective Evaluation on Perceptual Tracking Errors from Modeling Errors in Model-Based Tracking

  • Rhee, Eun Joo; Park, Jungsik; Seo, Byung-Kuk; Park, Jong-Il
    • IEIE Transactions on Smart Processing and Computing / v.4 no.6 / pp.407-412 / 2015
  • In model-based tracking, an accurate 3D model of the target object or scene is usually assumed to be known in advance, and the accuracy of that model must be guaranteed for accurate pose estimation. In many application domains, however, end users are not strongly distracted by the tracking errors that arise from certain levels of modeling error. In this paper, we examine perceptual tracking errors, which are predominantly caused by modeling errors, through subjective evaluation and compare them with computational tracking errors. We also discuss the tolerance for modeling errors by analyzing their permissible ranges.

Robust appearance feature learning using pixel-wise discrimination for visual tracking

  • Kim, Minji; Kim, Sungchan
    • ETRI Journal / v.41 no.4 / pp.483-493 / 2019
  • Considering the high dimensionality of video sequences, it is often challenging to acquire a dataset sufficient for training tracking models. From this perspective, we propose to revisit hand-crafted feature learning to avoid this dependence on large datasets. The proposed tracking approach is composed of two phases, detection and tracking, selected according to how severely the appearance of the target changes. The detection phase addresses severe and rapid variations by learning a new appearance model that classifies pixels into foreground (target) and background. We further combine raw pixel features of color intensity and spatial location with convolutional feature activations for robust target representation. The tracking phase tracks the target by searching for the frame region that best agrees, at the pixel level, with the model learned in the detection phase. Our two-phase approach results in efficient and accurate tracking, outperforming recent methods in various challenging cases of target appearance change.
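The pixel-wise foreground/background discrimination described above can be sketched with a simple color-histogram likelihood-ratio classifier. This is an illustrative stand-in, not the authors' model: it omits the spatial and convolutional features the paper combines, and the function names and 8-bin quantization are assumptions.

```python
import numpy as np

def build_pixel_classifier(frame, fg_mask, bins=8):
    """Build a per-color log-likelihood-ratio table from one labeled frame
    (quantized RGB histograms of foreground vs. background pixels)."""
    q = (frame // (256 // bins)).reshape(-1, 3)           # quantize channels
    idx = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    fg = np.bincount(idx[fg_mask.ravel()], minlength=bins**3) + 1.0
    bg = np.bincount(idx[~fg_mask.ravel()], minlength=bins**3) + 1.0
    return np.log(fg / fg.sum()) - np.log(bg / bg.sum())

def classify_pixels(frame, llr, bins=8):
    """Label each pixel foreground where the foreground model is more likely."""
    q = (frame // (256 // bins)).reshape(-1, 3)
    idx = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    return (llr[idx] > 0).reshape(frame.shape[:2])
```

The add-one smoothing keeps unseen colors from producing infinite log-ratios.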

Moving Object Tracking Using Active Contour Model (동적 윤곽 모델을 이용한 이동 물체 추적)

  • Han, Kyu-Bum; Baek, Yoon-Su
    • Transactions of the Korean Society of Mechanical Engineers A / v.27 no.5 / pp.697-704 / 2003
  • In this paper, a visual tracking system for arbitrarily shaped moving objects is proposed. Established tracking systems can be divided into model-based methods, which require a prior model of the target object, and image-based methods, which use image features. Model-based methods enable reliable tracking, but the shape must be simplified and the application is restricted to specific target models. Image-based methods, on the other hand, can run faster, but shape information is lost and the tracking is sensitive to image noise. The proposed system consists of an extraction process that detects the presence of a moving object and a tracking process that extracts the dynamic characteristics and shape information of the target objects. In particular, an active contour model is used to effectively track objects undergoing shape change. For contour initialization, the proposed method avoids semi-automatic operation and increases the convergence speed of the contour. For an efficient solution of the correspondence problem in multiple-object tracking, a variation function based on the positional structure of objects in the image frame and the snake energy level is proposed. Real-time tracking experiments on multiple moving objects verify the validity and effectiveness of the proposed system.
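The greedy active-contour (snake) update underlying this family of trackers can be sketched as follows. The energy terms and weight here (an external energy array plus a small continuity term scaled by `alpha`) are a generic textbook formulation, not the paper's, which adds its own initialization and variation function.

```python
import numpy as np

def greedy_snake(contour, e_ext, alpha=0.05, iters=40):
    """Greedy active contour: each control point moves to the 8-neighbour
    position minimizing external energy plus a continuity penalty
    (deviation of its spacing from the contour's mean spacing)."""
    h, w = e_ext.shape
    pts = contour.astype(int).copy()
    moves = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    for _ in range(iters):
        d = np.diff(np.vstack([pts, pts[:1]]), axis=0)
        mean_space = np.mean(np.hypot(d[:, 0], d[:, 1]))
        for i in range(len(pts)):
            prev = pts[i - 1]
            best, best_e = pts[i], np.inf
            for dy, dx in moves:
                y, x = pts[i, 0] + dy, pts[i, 1] + dx
                if not (0 <= y < h and 0 <= x < w):
                    continue
                cont = (np.hypot(y - prev[0], x - prev[1]) - mean_space) ** 2
                e = e_ext[y, x] + alpha * cont
                if e < best_e:
                    best_e, best = e, np.array([y, x])
            pts[i] = best
    return pts
```

With an external energy shaped like a well around the object boundary, the contour slides down the energy surface one pixel per iteration.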

Real-time hand tracking and recognition based on structured template matching (구조적 템플렛 매칭에 기반을 둔 실시간 손 추적 및 인식)

  • Kim, Song-Gook; Bae, Ki-Tae; Lee, Chil-Woo
    • Proceedings of the HCI Society of Korea Conference / 2006.02a / pp.1037-1043 / 2006
  • This paper proposes a system that allows applications on a large screen to be easily controlled with hand gestures, the most intuitive HCI modality, in a ubiquitous computing office environment. The gestures required for system control are predefined using hand-region information, changes in the position of the hand center, and finger shapes. First, to acquire the hand region efficiently, consecutive images are captured with an infrared camera. From the acquired frames, the centroid of the hand and the fingertips are detected using a structured template matching method. In the recognition stage, gestures are recognized by comparing the Euclidean distance between the two hands and the finger shape information against the predefined gestures. The proposed vision-based hand gesture control system offers many benefits for understanding human-computer interaction. Experimental results demonstrate the effectiveness of the proposed method.
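The template-matching step can be illustrated with a plain normalized cross-correlation search; the paper's structured variant and infrared preprocessing are not reproduced here, so treat this as a minimal baseline sketch.

```python
import numpy as np

def match_template_ncc(image, template):
    """Locate a template by zero-mean normalized cross-correlation,
    returning the (row, col) of the best-matching top-left corner."""
    th, tw = template.shape
    t = template - template.mean()
    best, best_pos = -np.inf, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            win = image[y:y + th, x:x + tw]
            w = win - win.mean()
            denom = np.sqrt((w * w).sum() * (t * t).sum())
            score = (w * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos
```

The zero-mean normalization makes the score invariant to uniform brightness and contrast changes, which matters for camera input.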


Tracking Algorithm For Golf Swing Using the Information of Pixels and Movements (화소 및 이동 정보를 이용한 골프 스윙 궤도 추적 알고리즘)

  • Lee, Hong-Ro; Hwang, Chi-Jung
    • The KIPS Transactions: Part B / v.12B no.5 s.101 / pp.561-566 / 2005
  • This paper presents a visual tracking algorithm for golf swing motion analysis that uses pixel information from video frames and the movement of the golf club, addressing the fixed-center-point assumption of model-based tracking methods. Model-based methods represent the upswing and downswing trajectories with a polynomial function, which presupposes that the golfer's center of gravity does not move; this assumption rarely holds for amateurs. In the proposed method, motion is first detected from pixel information in the swing frames. The club head and hands are then extracted using the parallel-line property of the club shaft and the positions of the club during the upswing and downswing. The golfer's center point is obtained by tracking the midpoint of the line between the center of the head and the feet. Experiments were conducted on data in which the center point moves substantially, and the proposed algorithm tracks the real trajectories of the club head, hands, and center point.
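The polynomial trajectory representation that the model-based baseline uses for the upswing and downswing can be sketched as a per-coordinate polynomial fit over time; the degree and helper name are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def fit_trajectory(points, degree=2):
    """Fit x(t) and y(t) polynomials to tracked club-head positions and
    return a function evaluating the smoothed trajectory at any time."""
    t = np.arange(len(points), dtype=float)
    pts = np.asarray(points, dtype=float)
    cx = np.polyfit(t, pts[:, 0], degree)   # coefficients for x(t)
    cy = np.polyfit(t, pts[:, 1], degree)   # coefficients for y(t)
    return lambda s: (np.polyval(cx, s), np.polyval(cy, s))
```

A fixed low-degree fit is exactly what breaks down when the center of gravity moves, which motivates the paper's pixel-and-movement approach.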

Implementation of Gesture Interface for Projected Surfaces

  • Park, Yong-Suk; Park, Se-Ho; Kim, Tae-Gon; Chung, Jong-Moon
    • KSII Transactions on Internet and Information Systems (TIIS) / v.9 no.1 / pp.378-390 / 2015
  • Image projectors can turn any surface into a display. Integrating surface projection with a user interface transforms it into an interactive display with many possible applications. Hand gesture interfaces are often used with projector-camera systems, but hand detection through color image processing is affected by the surrounding environment: a lack of illumination and color detail greatly hinders detection and lowers the recognition rate, and the projection itself can interfere with the camera image. To overcome these problems, a gesture interface based on depth images is proposed for projected surfaces. In this paper, a depth camera is used for hand recognition and for effectively extracting the hand area from the scene. A hand detection and finger tracking method based on depth images is proposed, and on this basis a touch interface for the projected surface is implemented and evaluated.
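The depth-based hand extraction can be sketched as a simple depth-band threshold followed by a centroid estimate for the touch point; the band limits (`near`, `far`, in millimetres) are hypothetical values, not the paper's calibration.

```python
import numpy as np

def segment_hand(depth, near=400, far=700):
    """Segment candidate hand pixels as those inside a depth band in
    front of the projection surface; return the mask and its centroid."""
    mask = (depth >= near) & (depth <= far)
    if not mask.any():
        return mask, None
    ys, xs = np.nonzero(mask)
    return mask, (ys.mean(), xs.mean())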

A Study on A Boundary Tracking Algorithm for Finger Crease Pattern Identification Algorithm (손가락 마디지문 패턴을 이용한 개인식별 알고리즘 구현을 위한 경계 추적 알고리즘에 관한 연구)

  • Jung, Hee-Cheol; Shin, Chango-Ho; Lee, Hyun-Youl; Choi, Hwan-Soo
    • Proceedings of the KIEE Conference / 1999.11c / pp.818-820 / 1999
  • In this paper, a new boundary tracking algorithm for extracting the finger area, which can be utilized by a finger crease pattern recognition algorithm, is proposed. Due to noise and irregular illumination, conventional boundary tracking algorithms such as skeleton-based methods are not suitable for typical boundary images of the hand. We therefore propose a new finger boundary tracking algorithm that utilizes a boundary-point-detection mask. We have observed that the proposed method provides stable and optimized boundary tracking.
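The abstract does not specify the paper's mask, but boundary-point detection on a binary hand mask can be sketched generically: a foreground pixel is a boundary point if any of its 4-neighbours is background.

```python
import numpy as np

def boundary_pixels(mask):
    """Mark foreground pixels that touch background in the 4-neighbourhood
    (a simple mask-based boundary-point detector)."""
    padded = np.pad(mask, 1)                     # background border
    core = padded[1:-1, 1:-1]
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    return core & ~interior
```

The padding treats the image border as background, so regions touching the frame edge still yield a closed boundary.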


Multiple Cues Based Particle Filter for Robust Tracking (다중 특징 기반 입자필터를 이용한 강건한 영상객체 추적)

  • Hossain, Kabir; Lee, Chi-Woo
    • Proceedings of the Korea Information Processing Society Conference / 2012.11a / pp.552-555 / 2012
  • The main goal of this paper is to develop a robust visual tracking algorithm based on particle filtering. Visual tracking with a particle filter is not an easy task due to cluttered environments and illumination changes. To deal with these problems, we develop an efficient observation model that combines robust phase correlation with motion information. Phase correlation provides a straightforward estimate of the rigid translational motion between two images, based on the well-known Fourier shift property, and has the advantage of being unaffected by intensity or contrast differences between the images. Motion cues, on the other hand, are well known and widely used due to their simplicity. We therefore integrate phase correlation with motion information in a particle filter framework for robust tracking. Experimental results show that tracking with the multiple-cue model is more reliable than with a single cue.
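The phase-correlation component rests on the Fourier shift property and can be sketched directly with FFTs. This minimal version estimates only the translation between two frames; the paper's motion cue and particle-filter integration are not reproduced.

```python
import numpy as np

def phase_correlation(a, b):
    """Estimate the translational shift between two equally sized images
    via phase-only correlation (the Fourier shift property)."""
    fa, fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = fa * np.conj(fb)
    cross /= np.abs(cross) + 1e-12           # keep phase, discard magnitude
    corr = np.fft.ifft2(cross).real          # peak marks the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # unwrap shifts larger than half the image size to negative values
    if dy > a.shape[0] // 2: dy -= a.shape[0]
    if dx > a.shape[1] // 2: dx -= a.shape[1]
    return dy, dx
```

Discarding the magnitude is what makes the estimate insensitive to intensity and contrast differences between the two images.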

Real-Time Hand Pose Tracking and Finger Action Recognition Based on 3D Hand Modeling (3차원 손 모델링 기반의 실시간 손 포즈 추적 및 손가락 동작 인식)

  • Suk, Heung-Il; Lee, Ji-Hong; Lee, Seong-Whan
    • Journal of KIISE: Software and Applications / v.35 no.12 / pp.780-788 / 2008
  • Modeling hand poses and tracking their movement is one of the challenging problems in computer vision. There are two typical approaches to reconstructing hand poses in 3D, depending on the number of cameras used: capturing images from multiple cameras (or a stereo camera), or capturing from a single camera. The former is relatively limited because of the environmental constraints of setting up multiple cameras. In this paper we propose a method for reconstructing 3D hand poses from a 2D image sequence captured by a single camera by means of belief propagation in a graphical model, and for recognizing a finger clicking motion using a hidden Markov model. We define a graphical model with hidden nodes representing the joints of a hand and observable nodes holding the features extracted from the 2D input sequence. To track hand poses in 3D, we use a belief propagation algorithm, which provides a robust and unified framework for inference in a graphical model. From the estimated 3D hand pose we extract each finger's motion, which is then fed into a hidden Markov model. To recognize natural finger actions, we consider the movements of all the fingers when recognizing a single finger's action. We applied the proposed method to a virtual keypad system and achieved a high recognition rate of 94.66% on 300 test samples.
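The decoding stage of a hidden Markov model such as the one used here for finger-action recognition is typically the Viterbi algorithm, sketched below in the log domain; the two-state toy parameters in the test are purely illustrative, not the paper's model.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state sequence of a discrete HMM.
    pi: initial probs (N,); A: transition (N,N); B: emission (N,M)."""
    T, N = len(obs), len(pi)
    logd = np.log(pi) + np.log(B[:, obs[0]])      # best score ending in each state
    back = np.zeros((T, N), dtype=int)            # backpointers
    for t in range(1, T):
        trans = logd[:, None] + np.log(A)         # score of each i -> j move
        back[t] = trans.argmax(axis=0)
        logd = trans.max(axis=0) + np.log(B[:, obs[t]])
    states = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):                 # follow backpointers
        states.append(int(back[t, states[-1]]))
    return states[::-1]
```

Working in log probabilities avoids the numerical underflow that plain probability products suffer on long observation sequences.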

Object Tracking using Feature Map from Convolutional Neural Network (컨볼루션 신경망의 특징맵을 사용한 객체 추적)

  • Lim, Suchang; Kim, Do Yeon
    • Journal of Korea Multimedia Society / v.20 no.2 / pp.126-133 / 2017
  • The conventional hand-crafted features used to track objects have limitations in object representation. Convolutional neural networks, which perform well in many areas of computer vision, are emerging as a way to overcome these limitations. A CNN extracts image features through multiple convolution layers and learns the kernels used for feature extraction by itself. In this paper, we use feature maps extracted from the convolution layers of a CNN to create an outline model of the object and use it for tracking. We propose a method to adaptively update the outline model to cope with the various environmental changes that affect tracking performance. The proposed algorithm was evaluated on the 11 environmental-change attributes of the CVPR2013 tracking benchmark and showed excellent results on six attributes.
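Using a convolutional feature map as a tracking template can be illustrated with a sliding-window cosine-similarity search over the map. The CNN feature extraction and the paper's adaptive outline-model update are not reproduced; this assumes a precomputed H x W x C feature map.

```python
import numpy as np

def track_by_feature_map(feat, template):
    """Localize a target in a feature map (H x W x C) by sliding-window
    cosine similarity against a C-channel template patch."""
    th, tw, _ = template.shape
    t = template.ravel()
    t = t / (np.linalg.norm(t) + 1e-12)           # unit-norm template
    best, best_pos = -np.inf, (0, 0)
    for y in range(feat.shape[0] - th + 1):
        for x in range(feat.shape[1] - tw + 1):
            w = feat[y:y + th, x:x + tw].ravel()
            score = (w @ t) / (np.linalg.norm(w) + 1e-12)
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos
```

In practice the template would be re-extracted and blended over time, which is the adaptive-update idea the paper develops.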