• Title/Summary/Keyword: Fingertip Detection

Vision-Based Finger Action Recognition by Angle Detection and Contour Analysis

  • Lee, Dae-Ho; Lee, Seung-Gwan
    • ETRI Journal, v.33 no.3, pp.415-422, 2011
  • In this paper, we present a novel vision-based method of recognizing finger actions for use in electronic appliance interfaces. Human skin is first detected using color and consecutive motion information. Then, fingertips are detected by a novel scale-invariant angle detection based on a variable k-cosine. Fingertip tracking is implemented as detected-region-based tracking. By analyzing the contour of the tracked fingertip, fingertip parameters such as position, thickness, and direction are calculated. Finger actions such as moving, clicking, and pointing are recognized by analyzing these fingertip parameters. Experimental results show that the proposed angle detection correctly detects fingertips and that the recognized actions can be used to interface with electronic appliances.
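
A rough illustrative sketch of the k-cosine idea from the abstract above: each contour point is scored by the cosine of the angle spanned by its k-th neighbors along the contour, and sharp convex corners are kept as fingertip candidates. The fixed k, the threshold, and the (N, 2) NumPy contour format are assumptions made for illustration; the paper varies k to achieve scale invariance.

```python
import numpy as np

def k_cosine(contour, i, k):
    """Cosine of the angle at contour point i, spanned by the points
    k steps before and k steps after it along the closed contour."""
    p = contour[i].astype(float)
    a = contour[(i - k) % len(contour)].astype(float) - p
    b = contour[(i + k) % len(contour)].astype(float) - p
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom > 0 else -1.0

def fingertip_candidates(contour, k=16, cos_thresh=0.7):
    """Indices of contour points whose k-cosine exceeds the threshold,
    i.e. sharp corners such as fingertips (fixed k; assumed values)."""
    return [i for i in range(len(contour))
            if k_cosine(contour, i, k) > cos_thresh]
```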

Fingertip Touch Recognition using Shadow Information for General Wall Touch Screen (일반벽 터치 스크린의 손가락 터치 판별을 위한 그림자 정보의 사용)

  • Jeong, Hyun-Jeong; Hwang, Tae-Ryang; Choi, Yong-Gyun; Lee, Suk-Ho
    • Journal of Korea Multimedia Society, v.17 no.12, pp.1430-1436, 2014
  • We propose an algorithm that detects the touch of a fingertip on a general wall using shadow information. Nowadays, there is a demand for presentation systems that can perceive the presenter's actions, so that the presenter can use natural movements without extra interface hardware. One of the most fundamental techniques in this area is the detection of the fingertip and the recognition of the fingertip's touch on the screen. The proposed algorithm recognizes the touch of the fingertip without using depth information and therefore needs no depth or touch sensing devices. The proposed method computes the convex hull points of both the fingertip and the shadow of the fingertip, and then computes the distance between those points to decide whether a touch event has occurred. Using the proposed method, it is possible to develop a new projector device that can perceive a fingertip touch on a general wall.
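
A minimal sketch of the hull-distance test described above, assuming binary masks of the fingertip and of its shadow are already available (their extraction is not shown): a touch is declared when the two convex hulls come within a few pixels of each other. The distance threshold is an assumed value.

```python
import cv2
import numpy as np

def touch_detected(finger_mask, shadow_mask, touch_dist_px=5):
    """Declare a touch when the fingertip hull and the shadow hull
    come within touch_dist_px pixels of each other."""
    f_cnts, _ = cv2.findContours(finger_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    s_cnts, _ = cv2.findContours(shadow_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not f_cnts or not s_cnts:
        return False
    f_hull = cv2.convexHull(max(f_cnts, key=cv2.contourArea)).reshape(-1, 2).astype(float)
    s_hull = cv2.convexHull(max(s_cnts, key=cv2.contourArea)).reshape(-1, 2).astype(float)
    # Minimum pairwise distance between the vertices of the two hulls.
    d = np.linalg.norm(f_hull[:, None, :] - s_hull[None, :, :], axis=2)
    return bool(d.min() < touch_dist_px)
```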

Dynamic Manipulation of a Virtual Object in Marker-less AR system Based on Both Human Hands

  • Chun, Jun-Chul; Lee, Byung-Sung
    • KSII Transactions on Internet and Information Systems (TIIS), v.4 no.4, pp.618-632, 2010
  • This paper presents a novel approach to controlling augmented reality (AR) objects robustly in a marker-less AR system by fingertip tracking and hand pattern recognition. It is known that one of the promising ways to develop a marker-less AR system is to use parts of the human body, such as the hand or face, in place of traditional fiducial markers. This paper introduces a real-time method to manipulate the overlaid virtual objects dynamically in a marker-less AR system using both hands with a single camera. The left bare hand is treated as a virtual marker in the marker-less AR system, and the right hand is used as a hand mouse. To build the marker-less system, we utilize a skin-color model for hand shape detection and curvature-based fingertip detection from the input video image. Using the detected fingertips, the camera pose is estimated so that virtual objects can be overlaid on the hand coordinate system. In order to manipulate the virtual objects rendered in the marker-less AR system dynamically, a vision-based hand control interface is developed, which exploits fingertip tracking for the movement of the objects and pattern matching for hand command initiation. The experiments show that the proposed system can control the objects dynamically in a convenient fashion.
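
The skin-color segmentation step mentioned above could look roughly like the OpenCV sketch below; the fixed YCrCb bounds are generic assumed values, not the skin-color model used in the paper.

```python
import cv2
import numpy as np

def skin_mask_ycrcb(frame_bgr,
                    lower=(0, 133, 77), upper=(255, 173, 127)):
    """Segment likely skin pixels with fixed YCrCb bounds and clean the
    result with a morphological opening (illustrative values only)."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, np.array(lower, np.uint8), np.array(upper, np.uint8))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
```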

Fingertip Detection through Atrous Convolution and Grad-CAM (Atrous Convolution과 Grad-CAM을 통한 손 끝 탐지)

  • Noh, Dae-Cheol; Kim, Tae-Young
    • Journal of the Korea Computer Graphics Society, v.25 no.5, pp.11-20, 2019
  • With the development of deep learning technology, research is being actively carried out on user-friendly interfaces suitable for virtual reality or augmented reality applications. To support interfaces that use the user's hands, this paper proposes a deep learning-based fingertip detection method that enables the tracking of fingertip coordinates to select virtual objects or to write or draw in the air. The approximate fingertip region is first cropped from the input image using Grad-CAM, and then a convolutional neural network with atrous convolution is applied to the cropped image to detect the fingertip location. This method is simpler and easier to implement than existing object detection algorithms and does not require a preprocessing step for annotating objects. To verify the method, we implemented an air-writing application and showed that, with a recognition rate of 81% and a processing speed of 76 ms, it was possible to write smoothly in the air without delay, making the application usable in real time.
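
For readers unfamiliar with atrous (dilated) convolution, the PyTorch sketch below shows the basic building block: the dilation factor enlarges the receptive field of a 3x3 kernel without adding parameters. The channel counts and dilation rate are illustrative assumptions, not the network used in the paper.

```python
import torch
import torch.nn as nn

class AtrousBlock(nn.Module):
    """3x3 convolution with dilation: a 3x3 kernel with dilation 2 covers
    a 5x5 neighborhood while keeping the same number of weights."""
    def __init__(self, in_ch=3, out_ch=32, dilation=2):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3,
                              padding=dilation, dilation=dilation)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.conv(x))

x = torch.randn(1, 3, 64, 64)
y = AtrousBlock()(x)   # spatial size is preserved: (1, 32, 64, 64)
```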

A Robust Fingertip Extraction and Extended CAMSHIFT based Hand Gesture Recognition for Natural Human-like Human-Robot Interaction (강인한 손가락 끝 추출과 확장된 CAMSHIFT 알고리즘을 이용한 자연스러운 Human-Robot Interaction을 위한 손동작 인식)

  • Lee, Lae-Kyoung; An, Su-Yong; Oh, Se-Young
    • Journal of Institute of Control, Robotics and Systems, v.18 no.4, pp.328-336, 2012
  • In this paper, we propose robust fingertip extraction and extended Continuously Adaptive Mean Shift (CAMSHIFT) based hand gesture recognition for natural, human-like HRI (Human-Robot Interaction). First, for efficient and rapid hand detection, hand candidate regions are segmented by combining a robust $YC_bC_r$ skin color model with Haar-like feature-based AdaBoost detection. Using the extracted hand candidate regions, we estimate the palm region and fingertip position from distance-transform-based voting and geometrical features of the hand. From the hand orientation and palm center position, we find the optimal fingertip position and its orientation. Then, using the extended CAMSHIFT, we reliably track the 2D hand gesture trajectory with the extracted fingertip. Finally, we apply conditional density propagation (CONDENSATION) to recognize predefined temporal motion trajectories. Experimental results show that the proposed algorithm not only rapidly extracts the hand region with an accurately extracted fingertip and its angle, but also robustly tracks the hand under different illumination, size, and rotation conditions. Using these results, we successfully recognize multiple hand gestures.
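
The tracking stage can be sketched with plain OpenCV as below. This is the standard CamShift loop over a back-projected hue histogram, not the paper's extended variant; `hist` and `init_window` are assumed to come from the initial hand detection.

```python
import cv2

def track_hand_camshift(frames, init_window, hist):
    """Standard CamShift loop: back-project a hue histogram into each frame
    and let CamShift adapt the search window to the hand."""
    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
    window = init_window  # (x, y, w, h) of the initially detected hand
    for frame in frames:
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
        rot_box, window = cv2.CamShift(backproj, window, term)
        yield rot_box, window
```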

Hand Region Tracking and Fingertip Detection based on Depth Image (깊이 영상 기반 손 영역 추적 및 손 끝점 검출)

  • Joo, Sung-Il; Weon, Sun-Hee; Choi, Hyung-Il
    • Journal of the Korea Society of Computer and Information, v.18 no.8, pp.65-75, 2013
  • This paper proposes a method of tracking the hand region and detecting the fingertip using only depth images. In order to eliminate the influence of lighting conditions and obtain information quickly and stably, this paper proposes a tracking method that relies only on depth information, a method of using region growing to identify errors that can occur during the tracking process, and a method of detecting the fingertip that can be applied to the recognition of various gestures. First, the closest point of approach is identified through the process of transferring the center point in order to locate the tracking point, and the region is grown from that point to detect the hand region and boundary line. Next, the ratio of the invalid boundary obtained by region growing is used to calculate the validity of the tracking region and thereby judge whether the tracking is normal. If tracking is normal, the contour line is extracted from the detected hand region, and curvature, RANSAC, and the convex hull are used to detect the fingertip. Lastly, quantitative and qualitative analyses are performed to verify the performance in various situations and prove the efficiency of the proposed algorithm for tracking and detecting the fingertip.
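
The convex-hull part of the fingertip detection can be illustrated as follows, using convexity defects to keep only hull vertices adjacent to deep finger valleys. This is a generic OpenCV sketch, not the paper's exact combination of curvature, RANSAC, and convex hull; the depth threshold is an assumed value.

```python
import cv2

def fingertip_candidates(hand_mask, min_defect_depth=8000):
    """Fingertip candidates = convex hull vertices of the hand contour that
    border a sufficiently deep convexity defect (a valley between fingers).
    Defect depths are fixed-point values (pixels * 256)."""
    cnts, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not cnts:
        return []
    cnt = max(cnts, key=cv2.contourArea)
    hull_idx = cv2.convexHull(cnt, returnPoints=False)
    defects = cv2.convexityDefects(cnt, hull_idx)
    tips = []
    if defects is not None:
        for start, end, _far, depth in defects[:, 0]:
            if depth > min_defect_depth:
                tips.append(tuple(cnt[start][0]))
                tips.append(tuple(cnt[end][0]))
    return tips
```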

Skin Color Based Hand and Finger Detection for Gesture Recognition in CCTV Surveillance (CCTV 관제에서 동작 인식을 위한 색상 기반 손과 손가락 탐지)

  • Kang, Sung-Kwan; Chung, Kyung-Yong; Rim, Kee-Wook; Lee, Jung-Hyun
    • The Journal of the Korea Contents Association, v.11 no.10, pp.1-10, 2011
  • In this paper, we propose skin-color-based hand and finger detection for gesture recognition in CCTV surveillance. The aim of this paper is to present a methodology for hand detection and to propose a finger detection method. The detected hand and finger can be used to implement a non-contact mouse, and this technology can be used to control home devices such as a home theater or television. Skin color is used to segment the hand region from the background, and the contour is extracted from the segmented hand. Analysis of the contour gives the location of the fingertip within the hand. After detecting the location of the fingertip, the system tracks the fingertip using the R channel alone and applies difference images for hand motion recognition, which removes irrelevant image content and makes the system robust. We describe experiments on fingertip tracking and finger gesture recognition, and the results show an accuracy above 96%.
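
The differential (frame-difference) image mentioned above can be sketched as follows; the threshold and the median filter size are assumed values for illustration.

```python
import cv2

def motion_mask(prev_gray, cur_gray, thresh=25):
    """Binary mask of pixels that changed between consecutive grayscale
    frames, lightly denoised with a median filter."""
    diff = cv2.absdiff(prev_gray, cur_gray)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    return cv2.medianBlur(mask, 5)
```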

Tangible AR interaction based on fingertip touch using small-sized non-square markers

  • Park, Hyungjun; Jung, Ho-Kyun; Park, Sang-Jin
    • Journal of Computational Design and Engineering, v.1 no.4, pp.289-297, 2014
  • Although large markers are good for accurate marker recognition and tracking, they are easily occluded by other objects and degrade natural visualization and the level of immersion during user interaction in AR environments. In this paper, we propose an approach that exploits small rectangular markers to support tangible AR interaction based on fingertip touch. It adjusts the length, width, and interior area of the rectangular markers so that they fit elongated objects such as fingers. It also utilizes convex polygons to resolve the partial occlusion of a marker and properly enlarges the pattern area of a marker while adjusting its size, without degrading the quality of marker detection. User evaluations gave encouraging results: the approach provides more natural visualization and a higher level of immersion, and it is accurate and tangible enough to support a pseudo feeling of touching virtual products with the hands or fingertips during design evaluation of digital handheld products.
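
One simple way to realize the convex-polygon idea described above is to take the convex hull of the partially occluded marker contour and re-fit a quadrilateral to it, as in the sketch below; the paper's actual recovery procedure may differ, and the approximation tolerance is an assumed value.

```python
import cv2

def recover_marker_quad(marker_contour):
    """Approximate the marker boundary as a quadrilateral from its convex
    hull, so a fingertip covering part of one edge does not break the
    corner estimate. Returns None if no quadrilateral fit is found."""
    hull = cv2.convexHull(marker_contour)
    peri = cv2.arcLength(hull, True)
    quad = cv2.approxPolyDP(hull, 0.05 * peri, True)
    return quad if len(quad) == 4 else None
```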

A Study On Positioning Of Mouse Cursor Using Kinect Depth Camera (Kinect Depth 카메라를이용한 마우스 커서의 위치 선정에 관한 연구)

  • Goo, Bong-Hoe; Lee, Seung-Ho
    • Journal of IKEEE, v.18 no.4, pp.478-484, 2014
  • In this paper, we propose a new algorithm for positioning the mouse cursor using the fingertip direction with a Kinect depth camera. The proposed algorithm uses the palm center point, obtained from the distance transform, when the fingertip points toward the screen; otherwise, the algorithm uses the fingertip point. After image preprocessing, the palm center point is calculated from the distance transform result. When the finger points toward the camera, the distance between the fingertip point and the palm center becomes small, so positioning accuracy can be improved by using the palm center point. After removing the arm region from the image, the fingertip point is obtained as the pixel at the greatest distance from the center. To evaluate the accuracy of mouse positioning, we selected five arbitrary points and measured the error between the reference points and the resulting mouse positions over 500 trials. The results confirmed that our algorithm achieves an average error rate of less than 11%.
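
A minimal sketch of the palm-center and fingertip logic described above, assuming a binary hand mask as input: the palm center is taken as the maximum of the distance transform, and the fingertip as the contour point farthest from that center.

```python
import cv2
import numpy as np

def palm_center_and_fingertip(hand_mask):
    """Palm center = peak of the distance transform of the hand mask;
    fingertip = contour point farthest from the palm center."""
    dist = cv2.distanceTransform(hand_mask, cv2.DIST_L2, 5)
    _, _, _, center = cv2.minMaxLoc(dist)   # (x, y) of the palm center
    cnts, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cnt = max(cnts, key=cv2.contourArea).reshape(-1, 2).astype(float)
    d = np.linalg.norm(cnt - np.array(center, dtype=float), axis=1)
    fingertip = tuple(cnt[int(np.argmax(d))].astype(int))
    return center, fingertip
```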

Tangible AR Interaction based on Fingertip Touch Using Small-Sized Markers (소형 마커를 이용한 손가락 터치 기반 감각형 증강현실 상호작용 방안)

  • Jung, Ho-Kyun; Park, Hyungjun
    • Korean Journal of Computational Design and Engineering, v.18 no.5, pp.374-383, 2013
  • Various interaction techniques have been studied for providing the feeling of touch and improving immersion in augmented reality (AR) environments. Tangible AR interaction exploiting two types of simple objects (product-type and pointer-type) has attracted great interest for cost-effective design evaluation of digital handheld products. When the markers attached to the objects are kept large to obtain better marker recognition, the pointer-type object frequently and significantly occludes the product-type object, which degrades natural visualization and the level of immersion in the AR environment. In this paper, in order to overcome these problems, we propose tangible AR interaction using fingertip touch combined with small-sized markers. The proposed approach uses convex polygons to recover the boundaries of partially occluded AR markers. It also properly enlarges the pattern area of each AR marker so that marker sizes can be reduced without sacrificing the quality of marker detection. We empirically verified the quality of the proposed approach and applied it in the design evaluation of digital products. From the experimental results, we found that the approach is accurate enough to be applied to the design evaluation process and tangible enough to provide a pseudo feeling of manipulating virtual products with the hands.