• Title/Summary/Keyword: hands tracking

Search results: 53

Real-Time Two Hands Tracking System

  • Liu, Nianjun;Lovell, Brian C.
    • Proceedings of the IEEK Conference
    • /
    • 2002.07c
    • /
    • pp.1491-1494
    • /
    • 2002
  • The paper introduces a novel system for real-time tracking of two hands based on unrestricted hand-skin segmentation using multiple color spaces. After color-based segmentation and pre-processing, a label set of regions is created to locate the two hands automatically. After normalization, template matching is used to distinguish the left hand from the right. An improved fast self-adaptive tracking algorithm is applied, and a Canny filter is used for hand detection.
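
The pipeline sketched in this abstract (skin-color thresholding, region labeling, picking the two hand regions) might look roughly like the following; the Cr/Cb thresholds and the assumption that the two largest skin regions are the hands are illustrative, not the paper's actual parameters.

```python
import numpy as np
from collections import deque

def skin_mask(cr, cb, cr_range=(133, 173), cb_range=(77, 127)):
    """Threshold Cr/Cb channels with a (hypothetical) skin-color range."""
    return ((cr >= cr_range[0]) & (cr <= cr_range[1]) &
            (cb >= cb_range[0]) & (cb <= cb_range[1]))

def label_regions(mask):
    """4-connected component labeling via BFS; returns label map and region sizes."""
    labels = np.zeros(mask.shape, dtype=int)
    sizes = {}
    next_label = 0
    for y, x in zip(*np.nonzero(mask)):
        if labels[y, x]:
            continue
        next_label += 1
        labels[y, x] = next_label
        q, count = deque([(y, x)]), 0
        while q:
            cy, cx = q.popleft()
            count += 1
            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    q.append((ny, nx))
        sizes[next_label] = count
    return labels, sizes

def two_largest(sizes):
    """Take the two biggest skin regions as the two hand candidates."""
    return sorted(sizes, key=sizes.get, reverse=True)[:2]
```

Template matching against normalized left/right hand templates would then decide which region is which.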


A Study on the Interaction with Virtual Objects through XR Hands (XR Hands를 통한 가상 객체들과의 상호 작용에 관한 연구)

  • BeomJun Jo;SeongKi Kim
    • Journal of the Korea Computer Graphics Society
    • /
    • v.30 no.3
    • /
    • pp.43-49
    • /
    • 2024
  • With the release of extended reality (XR) devices in which hand tracking serves as the main means of manipulation, hand tracking is currently one of the most promising technologies in XR. It offers advantages in immersion and realism, and as a result it is being employed in a range of fields, including education, business, and medical care. The archery motion requires using both hands at the same time, demands sophistication to hit the target, and has long held cultural and sporting significance. This study aimed to implement this archery motion. To that end, this paper used the XR Hands package provided by Unity to recognize hand movements, explored the underlying OpenXR, and finally implemented the archery motion and tested it on a Meta Quest 2.
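
The paper's Unity/XR Hands implementation is not reproduced here, but the two-handed interaction it describes can be illustrated device-agnostically as a small state machine; the states, the pinch-based grip/draw conditions, and the function names are hypothetical.

```python
def pinched(thumb_tip, index_tip, thresh=0.02):
    """Assume a pinch when thumb and index fingertips are within `thresh` meters."""
    dx = thumb_tip[0] - index_tip[0]
    dy = thumb_tip[1] - index_tip[1]
    dz = thumb_tip[2] - index_tip[2]
    return (dx * dx + dy * dy + dz * dz) ** 0.5 < thresh

def archery_step(state, left_pinch, right_pinch):
    """One tick of a hypothetical archery state machine:
    left pinch grips the bow, right pinch draws the string, releasing it fires."""
    if not left_pinch:
        return "idle"                      # dropping the bow resets everything
    if state == "idle" and right_pinch:
        return "drawing"
    if state == "drawing" and not right_pinch:
        return "fired"
    return state
```

In a real XR Hands setup, the fingertip positions would come from the tracked hand joints each frame.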

Face and Hand Tracking Algorithm for Sign Language Recognition (수화 인식을 위한 얼굴과 손 추적 알고리즘)

  • Park, Ho-Sik;Bae, Cheol-Soo
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.31 no.11C
    • /
    • pp.1071-1076
    • /
    • 2006
  • In this paper, we develop face and hand tracking for a sign language recognition system. The system is divided into two stages: an initial stage and a tracking stage. In the initial stage, we use skin features to localize the signer's face and hands. An ellipse model on the CbCr space is constructed and used to detect skin color. After the skin regions have been segmented, face and hand blobs are identified by size and facial features, under the assumption that the face moves less than the hands in this signing scenario. In the tracking stage, motion estimation is applied only to the hand blobs; the first and second derivatives are used to predict the positions of the hands. We observed errors in the tracked position between consecutive frames in which the velocity changes abruptly. To improve tracking performance, our proposed algorithm compensates for this error by using an adaptive search area to re-compute the hand blobs. The experimental results indicate that our proposed method decreases the prediction error by up to 96.87% with a negligible increase in computational complexity of up to 4%.
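
The prediction step described above, using first and second derivatives (velocity and acceleration) of the hand position plus an adaptively widened search area, might be sketched as follows; the function names and the search-radius gain are assumptions, not the paper's actual formulation.

```python
import numpy as np

def predict_position(p2, p1, p0):
    """Predict the next hand position from the last three observations using
    finite-difference velocity (first derivative) and acceleration (second)."""
    v = p0 - p1                    # first derivative
    a = (p0 - p1) - (p1 - p2)      # second derivative
    return p0 + v + a

def search_radius(last_error, base=10.0, gain=1.5):
    """Widen the search area when the last prediction error was large
    (a stand-in for the paper's adaptive search area)."""
    return base + gain * last_error
```

For a uniformly accelerating target at positions 0, 1, 3, this extrapolates the next position to 6.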

Hands-free Robot Control System Using Mouth Tracking (입 추적을 이용한 로봇 원격 제어 시스템)

  • Wang, Liang;Xu, Yongzhe;Ahmed, Minhaz;Rhee, Phill-Kyu
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2011.06c
    • /
    • pp.405-408
    • /
    • 2011
  • In this paper, we propose a robot remote control system based on mouth tracking. The main idea behind the work is to help disabled people who cannot operate a joystick or keyboard with their hands to control a robot. The mouth detection method in this paper is mainly based on the AdaBoost feature detection approach. By using the proposed new Haar-like features for detecting the corners of the mouth, the speed and accuracy of detection are improved. Combined with a Kalman filter, continuous and accurate mouth tracking is achieved. Meanwhile, gripping commands for the robot manipulator are issued by recognizing the user's mouth shape, such as a 'pout' or a 'grin'. To assess the validity of the method, a mouth detection experiment and a robot cargo transport experiment were conducted. The results indicate that the system realizes quick and accurate mouth tracking, and the robot operated successfully in moving to and bringing back items.
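
The Kalman-filter stage mentioned above can be illustrated with a minimal constant-velocity filter for a 2-D point such as a detected mouth corner; the noise parameters are illustrative, and the AdaBoost/Haar detection stage is not reproduced.

```python
import numpy as np

class Kalman2D:
    """Constant-velocity Kalman filter for a tracked 2-D point (e.g., a mouth corner)."""
    def __init__(self, dt=1.0, q=1e-2, r=1.0):
        self.F = np.eye(4)                         # state: [x, y, vx, vy]
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.zeros((2, 4))                  # we observe position only
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = q * np.eye(4)                     # process noise (illustrative)
        self.R = r * np.eye(2)                     # measurement noise (illustrative)
        self.x = np.zeros(4)
        self.P = np.eye(4)

    def step(self, z):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update with the detected position z = (x, y)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

Feeding the filter one mouth detection per frame yields a smoothed, lag-free track for a target moving at constant velocity.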

Robust 3D Hand Tracking based on a Coupled Particle Filter (결합된 파티클 필터에 기반한 강인한 3차원 손 추적)

  • Ahn, Woo-Seok;Suk, Heung-Il;Lee, Seong-Whan
    • Journal of KIISE:Software and Applications
    • /
    • v.37 no.1
    • /
    • pp.80-84
    • /
    • 2010
  • Hand tracking is an essential technique for hand gesture recognition, which is an efficient modality in Human-Computer Interaction (HCI). Recently, many researchers have focused on hand tracking using 3D hand models and have shown robust tracking results compared to 2D hand models. In this paper, we propose a novel 3D hand tracking method based on a coupled particle filter. It provides robust and fast tracking by estimating the global hand pose and the local finger motions separately and then using each estimate as a prior for the other. Furthermore, to improve robustness, we apply a multi-cue method that integrates color-based area matching and edge-based distance matching. In our experiments, the proposed method showed robust tracking results for complex hand motions against a cluttered background.
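
The coupled particle filter itself is more involved, but its basic building block, a single particle-filter iteration (diffuse, weight by an observation likelihood, resample), can be sketched as follows; in the paper's scheme two such filters would run, one for the global pose and one for the finger motions, each using the other's estimate as a prior. The noise scales here are illustrative.

```python
import numpy as np

def pf_step(particles, weights, observation, motion_std=1.0, obs_std=1.0, rng=None):
    """One particle-filter iteration: diffuse particles with the motion model,
    re-weight by a Gaussian observation likelihood, then resample."""
    rng = rng if rng is not None else np.random.default_rng(0)
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    d = np.linalg.norm(particles - observation, axis=1)
    w = weights * np.exp(-0.5 * (d / obs_std) ** 2)
    w = w / w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)  # resample
    n = len(particles)
    return particles[idx], np.full(n, 1.0 / n)
```

After a few iterations the particle cloud concentrates around the observed target.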

Computer Interface Using Head-Gaze Tracking (응시 위치 추적 기술을 이용한 인터페이스 시스템 개발)

  • 이정준;박강령;김재희
    • Proceedings of the IEEK Conference
    • /
    • 1999.06a
    • /
    • pp.516-519
    • /
    • 1999
  • Gaze detection finds the position on a monitor screen where a user is looking, using image processing and computer vision technology. We developed a computer interface system using this gaze detection technology. The system enables a user to control the computer without using their hands, so it can help handicapped users and is also useful for people whose hands are occupied with another task, especially in factory settings. For practical use, a command signal analogous to a mouse click is necessary, and we used eye winking to give this command to the system.
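
Using an eye wink as the click signal implies distinguishing a deliberate wink from noise or a long eye closure; a minimal frame-based debounce over an eye-openness score might look like this (the thresholds and frame counts are assumptions, and the paper's actual wink detector is not described in the abstract).

```python
def detect_wink(openness, closed_thresh=0.2, min_frames=2, max_frames=10):
    """Flag a wink when the eye-openness score drops below `closed_thresh`
    for a short run of frames; too-short runs are noise, too-long ones are
    sustained closures, and neither should trigger a click."""
    events, run = [], 0
    for i, v in enumerate(openness):
        if v < closed_thresh:
            run += 1
        else:
            if min_frames <= run <= max_frames:
                events.append(i - run)   # frame where the wink started
            run = 0
    return events
```

Each detected event would then be mapped to a mouse-click command at the current gaze position.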


Dynamic Manipulation of a Virtual Object in Marker-less AR system Based on Both Human Hands

  • Chun, Jun-Chul;Lee, Byung-Sung
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.4 no.4
    • /
    • pp.618-632
    • /
    • 2010
  • This paper presents a novel approach to robustly controlling augmented reality (AR) objects in a marker-less AR system through fingertip tracking and hand pattern recognition. One promising way to build a marker-less AR system is to use parts of the human body, such as the hand or face, in place of traditional fiducial markers. This paper introduces a real-time method for dynamically manipulating the overlaid virtual objects in a marker-less AR system using both hands and a single camera. The bare left hand serves as a virtual marker, and the right hand is used as a hand mouse. To build the marker-less system, we utilize a skin-color model for hand shape detection and curvature-based fingertip detection on the input video image. Using the detected fingertips, the camera pose is estimated to overlay virtual objects on the hand coordinate system. To manipulate the rendered virtual objects dynamically, a vision-based hand control interface is developed, which exploits fingertip tracking for moving the objects and pattern matching for initiating hand commands. The experiments demonstrate that the proposed system can control the objects dynamically and conveniently.
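
Curvature-based fingertip detection is commonly realized as a k-curvature test on the hand contour; the sketch below uses that standard formulation with illustrative values of k and the angle threshold (the paper's exact criterion may differ).

```python
import numpy as np

def fingertips(contour, k=5, angle_thresh=np.radians(60)):
    """k-curvature fingertip candidates: contour points where the angle between
    the vectors to the points k steps behind and ahead is sharp."""
    n = len(contour)
    tips = []
    for i in range(n):
        a = contour[(i - k) % n] - contour[i]
        b = contour[(i + k) % n] - contour[i]
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
        if np.arccos(np.clip(cos, -1.0, 1.0)) < angle_thresh:
            tips.append(i)
    return tips
```

On a smooth contour the angle stays wide, so only sharp protrusions such as extended fingertips pass the test.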

Object Tracking Using Particle Filter with an Improved Observe Method (개선된 Observe 기법을 적용한 Particle Filter 물체 추적)

  • Cho, Hyun-Joong;Lee, Chul-Woo;Jung, Jae-Gi;Kim, Jin-Yul
    • Proceedings of the IEEK Conference
    • /
    • 2009.05a
    • /
    • pp.210-212
    • /
    • 2009
  • In object tracking based on the particle filter algorithm, controlling the distribution of the samples properly is essential to tracking the target accurately. If the samples are spread too wide relative to the target size, tracking accuracy may degrade, as some samples can be caught by background clutter that resembles the target. If, on the other hand, the samples are spread too narrow, the particle filter may fail to track abrupt motion of the target. To solve this problem, we propose an improved particle filter that adopts a 're-weighting' technique at the observe step. We estimate the distribution of the current samples' weights by its mean and variance. The samples are then re-weighted so that a sample distribution proportional to the target scale is obtained at the next select step. The proposed tracking method avoids convergence to a local mean and improves the accuracy of the estimated target state.
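
The re-weighting idea, adjusting the weight distribution so the resampled spread tracks the target scale, might be sketched like this; the exponent rule below is a plausible stand-in, not the paper's actual formula.

```python
import numpy as np

def reweight(weights, samples, target_scale):
    """Sharpen or flatten the weights so that the weighted sample spread moves
    toward the target scale before the next select (resampling) step."""
    w = weights / weights.sum()
    mean = (w[:, None] * samples).sum(axis=0)
    spread = np.sqrt((w * ((samples - mean) ** 2).sum(axis=1)).sum())
    alpha = spread / target_scale        # > 1: samples too wide -> sharpen weights
    w = w ** alpha
    return w / w.sum()
```

Sharpened weights concentrate resampling near the weighted mean, narrowing the cloud; flattened weights do the opposite, letting it widen for abrupt motion.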


Tracking and Recognizing Hand Gestures using Kalman Filter and Continuous Dynamic Programming (연속DP와 칼만필터를 이용한 손동작의 추적 및 인식)

  • 문인혁;금영광
    • Proceedings of the IEEK Conference
    • /
    • 2002.06c
    • /
    • pp.13-16
    • /
    • 2002
  • This paper proposes a method to track hand gestures and recognize gesture patterns using a Kalman filter and continuous dynamic programming (CDP). The positions of the hands are predicted by the Kalman filter, and the pixels corresponding to the hands are extracted by a skin-color filter. The center of gravity of the hands serves as the input pattern vector. The input gesture is then recognized by matching it against reference gesture patterns using CDP. Experimental results on recognizing a circle-shaped gesture and intention gestures such as "Come on" and "Bye-bye" show that the proposed method is feasible for hand gesture-based human-computer interaction.
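
Continuous DP differs from plain DTW in that the reference gesture may start at any frame of the unsegmented input stream. A minimal 1-D version is sketched below; the paper matches 2-D hand-trajectory vectors, and its exact CDP step pattern may differ.

```python
def cdp_min_costs(ref, stream):
    """Simplified continuous DP: for each stream frame, the minimum accumulated
    cost of matching the whole reference pattern ending at that frame."""
    R, T = len(ref), len(stream)
    INF = float("inf")
    D = [[INF] * T for _ in range(R)]
    for t in range(T):
        for i in range(R):
            d = abs(stream[t] - ref[i])
            if i == 0:
                D[i][t] = d          # the pattern may start at any stream frame
            else:
                prev = min(D[i - 1][t - 1] if t > 0 else INF,
                           D[i][t - 1] if t > 0 else INF,
                           D[i - 1][t])
                D[i][t] = d + prev
    return D[-1]   # low values mark frames where the gesture likely ends
```

Thresholding the returned costs spots occurrences of the gesture inside a continuous stream of center-of-gravity coordinates.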


A New Eye Tracking Method as a Smartphone Interface

  • Lee, Eui Chul;Park, Min Woo
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.7 no.4
    • /
    • pp.834-848
    • /
    • 2013
  • To use a smartphone's many functions effectively, several kinds of human-phone interface are employed, such as touch, voice, and gesture. However, the essential touch interface cannot be used by people with hand disabilities or when both hands are busy. Although eye tracking is a superb human-computer interface method, it has not been applied to smartphones because of the small screen size, the frequently changing geometric relation between the user's face and the phone screen, and the low resolution of front cameras. In this paper, a new eye tracking method is proposed to act as a smartphone user interface. To maximize eye image resolution, a zoom lens and three infrared LEDs are adopted. Our proposed method has the following novelties. First, an appropriate camera specification and image resolution are analyzed for smartphone-based gaze tracking. Second, facial movement is allowed as long as one eye region is included in the image. Third, the method operates in both landscape and portrait screen modes. Fourth, only two LED reflection positions are used to calculate the gaze position, based on the 2D geometric relation between the reflection rectangle and the screen. Fifth, a prototype mock-up module was built to confirm the feasibility of applying the method to an actual smartphone. Experimental results showed a gaze estimation error of about 31 pixels at a screen resolution of 480×800 and an average hit ratio of 94.6% on a 5×4 icon grid.
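
The fourth point, computing gaze from two LED reflections via a 2D geometric relation, can be illustrated by the crude simplification below, which treats the two glints as opposite corners of the reflected screen rectangle and interpolates the pupil position linearly; this is a hypothetical reduction of the paper's method.

```python
def gaze_point(pupil, glint_a, glint_b, screen=(480, 800)):
    """Map the pupil position, expressed relative to the rectangle spanned by
    two corneal glints, onto screen pixel coordinates by linear interpolation."""
    nx = (pupil[0] - glint_a[0]) / (glint_b[0] - glint_a[0])
    ny = (pupil[1] - glint_a[1]) / (glint_b[1] - glint_a[1])
    return nx * screen[0], ny * screen[1]
```

A pupil midway between the glints maps to the center of the 480×800 screen under this simplification.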