Title/Summary/Keyword: Finger Pointing

Finger-Pointing Gesture Analysis for Slide Presentation

  • Harika, Maisevli; Setijadi P, Ary; Hindersah, Hilwadi; Sin, Bong-Kee
    • Journal of Korea Multimedia Society, v.19 no.8, pp.1225-1235, 2016
  • This paper presents a method for computer-assisted slide presentation using vision-based gesture recognition. The proposed method consists of a sequence of steps: first detecting a hand in the scene lit by the projector beam, then estimating a smooth trajectory of the hand or pointing finger using a Kalman filter, and finally interfacing with an application system. Additional slide navigation controls include moving back and forth through the pages of the presentation. The method is intended to help speakers give effective presentations through natural, improved interaction with the computer. In particular, finger pointing is believed to be more effective than a laser pointer, since the hand and pointing finger are more visible and can thus better hold the attention of the audience.
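
A minimal sketch of the trajectory-smoothing step described above: a constant-velocity Kalman filter over 2D fingertip positions. The frame rate, noise covariances, and initial state are illustrative assumptions, not values from the paper.

```python
import numpy as np

dt = 1.0 / 30.0                       # assumed frame interval (30 fps camera)
F = np.array([[1, 0, dt, 0],          # state transition for [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], float)
H = np.array([[1, 0, 0, 0],           # we observe only the position (x, y)
              [0, 1, 0, 0]], float)
Q = np.eye(4) * 1e-3                  # process noise (assumed)
R = np.eye(2) * 2.0                   # measurement noise in pixels (assumed)

x = np.zeros(4)                       # initial state
P = np.eye(4) * 100.0                 # initial uncertainty

def kalman_step(z):
    """Predict, then correct with a measured fingertip position z = (px, py)."""
    global x, P
    x_pred = F @ x                                  # predict
    P_pred = F @ P @ F.T + Q
    y = np.asarray(z, float) - H @ x_pred           # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x_pred + K @ y                              # correct
    P = (np.eye(4) - K @ H) @ P_pred
    return x[:2]                                    # smoothed (x, y)

# feed raw detections frame by frame
for z in [(100, 200), (103, 198), (120, 190)]:      # toy measurements
    print(kalman_step(z))
```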

3D Object Location Identification Using Finger Pointing and a Robot System for Tracking an Identified Object

  • Gwak, Dong-Gi; Hwang, Soon-Chul; Ok, Seo-Won; Yim, Jung-Sae; Kim, Dong Hwan
    • Journal of the Korean Society of Manufacturing Technology Engineers, v.24 no.6, pp.703-709, 2015
  • In this work, a robot that grasps and delivers an object in response to a simple finger-pointing command from a person with a hand or arm disability is introduced. In this system, a Leap Motion sensor is used to obtain the finger-motion data of the user, and a Kinect sensor is used to measure the 3D (three-dimensional) position of the desired object. Once the user points at the object, its exact 3D position is determined using image processing and a coordinate transformation between the Leap Motion and Kinect sensors. This information is transmitted to the robot controller, and the robot grasps the target and delivers it to the user successfully.
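
The sensor-to-sensor coordinate transformation can be sketched as below, assuming a pre-calibrated rigid transform T_KL from the Leap Motion frame to the Kinect frame (the values shown are hypothetical) and taking the pointed-at object to be the one closest to the pointing ray; the paper's image-processing step is not reproduced.

```python
import numpy as np

T_KL = np.eye(4)                 # Leap -> Kinect rigid transform (assumed calibrated)
T_KL[:3, 3] = [0.0, -0.4, 0.3]   # hypothetical offset between the two sensors

def to_kinect(p_leap):
    """Map a 3D point from Leap Motion coordinates into Kinect coordinates."""
    p = np.append(np.asarray(p_leap, float), 1.0)   # homogeneous coordinates
    return (T_KL @ p)[:3]

def pointed_object(tip, direction, objects_kinect):
    """Return the object whose Kinect-frame position is closest to the pointing ray."""
    o = to_kinect(tip)
    d = to_kinect(np.asarray(tip) + np.asarray(direction)) - o
    d /= np.linalg.norm(d)
    def ray_dist(q):             # perpendicular distance from point q to the ray
        v = q - o
        return np.linalg.norm(v - np.dot(v, d) * d)
    return min(objects_kinect, key=ray_dist)

# toy usage: two candidate objects already measured by the Kinect
objs = [np.array([0.2, 0.0, 1.0]), np.array([-0.3, 0.1, 1.2])]
print(pointed_object([0.0, 0.0, 0.1], [0.1, 0.0, 0.5], objs))
```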

Infrared Sensitive Camera Based Finger-Friendly Interactive Display System

  • Ghimire, Deepak; Kim, Joon-Cheol; Lee, Kwang-Jae; Lee, Joon-Whoan
    • International Journal of Contents, v.6 no.4, pp.49-56, 2010
  • In this paper, we present a system that enables the user to interact with a large display without touching the screen. Two infrared-sensitive cameras are mounted at the bottom left and bottom right of the display, pointing upwards; the user's fingertip position in the selected region of interest of each camera view is found using the vertical intensity profile of the background-subtracted image. The finger position in the left and right camera images is mapped to display-screen coordinates using pre-determined matrices, calculated by interpolating samples of the user's finger position in images taken while pointing at known coordinates on the display. The screen is then manipulated according to the calculated position and depth of the fingertip with respect to the display. Experimental results demonstrate efficient, robust, and stable human-computer interaction.
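
The pre-determined mapping matrices described above can be illustrated with a least-squares calibration, assuming an affine model from the pair of per-camera finger positions to screen coordinates; the sample data below are invented for the sketch.

```python
import numpy as np

# calibration samples: (u_left, u_right) image x-positions of the fingertip
# recorded while the user points at known screen coordinates (sx, sy)
features = np.array([[120, 530], [400, 300], [610, 80], [350, 420]], float)
screen   = np.array([[0, 0], [960, 540], [1920, 1080], [700, 800]], float)

A = np.hstack([features, np.ones((len(features), 1))])  # affine design matrix
M, *_ = np.linalg.lstsq(A, screen, rcond=None)          # 3x2 mapping matrix

def to_screen(u_left, u_right):
    """Map a fingertip seen at (u_left, u_right) to display coordinates."""
    return np.array([u_left, u_right, 1.0]) @ M

print(to_screen(400, 300))   # roughly (960, 540) for this toy calibration
```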

Cursor Control by Finger Motion Using a Circular Pattern Vector Algorithm

  • 정향영; 신일식; 손영선
    • Journal of the Korean Institute of Intelligent Systems, v.11 no.6, pp.487-490, 2001
  • In this paper, we realize a system that moves a cursor with a finger using the circular pattern vector algorithm, one of the image analysis algorithms. To apply this algorithm, we use the center of the largest circle that fits inside the recognized hand image and find the pointing finger by examining the distance from that center to the outline of the hand. The horizontal position of the cursor on the display is controlled by converting the direction of the pointing finger into plane coordinates. Because only a single camera is set up, the vertical position is resolved only discretely, into upper, middle, and lower regions. To compensate for this discrete vertical movement, the cursor is moved to the target the user wants by expanding the local area to the whole area.
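
A minimal sketch of the palm-center and fingertip search, written with OpenCV as an assumption: the center of the largest circle inside the hand region comes from a distance transform, and the pointing finger is the contour point farthest from that center. The paper's exact circular-pattern-vector formulation may differ.

```python
import cv2
import numpy as np

def fingertip_from_mask(hand_mask):
    """hand_mask: binary uint8 image, 255 where the hand is (OpenCV 4)."""
    # palm center = point with maximum distance to the background,
    # i.e. the center of the largest inscribed circle
    dist = cv2.distanceTransform(hand_mask, cv2.DIST_L2, 5)
    _, radius, _, center = cv2.minMaxLoc(dist)

    # fingertip = hand-contour point farthest from the palm center
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2)
    d = np.linalg.norm(contour - np.array(center), axis=1)
    tip = tuple(contour[np.argmax(d)])

    # in-plane pointing direction, usable to drive the cursor horizontally
    angle = np.degrees(np.arctan2(tip[1] - center[1], tip[0] - center[0]))
    return center, radius, tip, angle
```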

Implementation of an interactive 3D floating image pointing device

  • Shin, Dong-Hak; Kim, Eun-Soo
    • Journal of the Korea Institute of Information and Communication Engineering, v.12 no.8, pp.1481-1487, 2008
  • In this paper, we propose a novel interactive 3D floating-image pointing device for use in 3D environments. The proposed system consists of a 3D floating-image generation system based on a floating lens array and a user interface based on real-time finger detection. A user selects a single image among the floating images, and the interaction is performed by triggering a button event through finger recognition using two cameras. To show the usefulness of the proposed system, we carry out experiments and present preliminary results.
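
Recovering a 3D fingertip position from the two cameras, to compare against the floating-image positions for a button event, can be sketched under an assumed rectified stereo geometry; the focal length, baseline, and touch threshold are illustrative.

```python
import numpy as np

F_PX = 800.0      # focal length in pixels (assumed)
BASELINE = 0.12   # camera separation in meters (assumed)

def triangulate(xl, xr, y):
    """Rectified stereo: pixel columns xl/xr and shared row y -> 3D point.
    Pixel coordinates are assumed to be measured from the principal point."""
    disparity = xl - xr
    Z = F_PX * BASELINE / disparity
    return np.array([xl * Z / F_PX, y * Z / F_PX, Z])

def pressed(fingertip, button_pos, threshold=0.02):
    """A 'button event' fires when the fingertip is within 2 cm (assumed)."""
    return np.linalg.norm(fingertip - button_pos) < threshold

tip = triangulate(40.0, -40.0, 20.0)
# a floating button placed at the fingertip, so the demo reports True
print(tip, pressed(tip, np.array([0.06, 0.03, 1.2])))
```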

Vision-Based Finger Action Recognition by Angle Detection and Contour Analysis

  • Lee, Dae-Ho; Lee, Seung-Gwan
    • ETRI Journal, v.33 no.3, pp.415-422, 2011
  • In this paper, we present a novel vision-based method of recognizing finger actions for use in electronic appliance interfaces. Human skin is first detected using color and consecutive motion information. Then, fingertips are detected by a novel scale-invariant angle detection based on a variable k-cosine. Fingertip tracking is implemented by tracking the detected regions. By analyzing the contour of the tracked fingertip, fingertip parameters such as position, thickness, and direction are calculated. Finger actions such as moving, clicking, and pointing are recognized by analyzing these parameters. Experimental results show that the proposed angle detection correctly detects fingertips and that the recognized actions can be used to interface with electronic appliances.
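
A minimal sketch of the k-cosine angle test: for each contour point, the angle between the vectors to the points k steps behind and ahead is computed, and sharp corners are kept as fingertip candidates. The paper's detector varies k for scale invariance; this sketch uses a fixed k and an assumed angle threshold.

```python
import numpy as np

def k_cosine_fingertips(contour, k=16, max_angle_deg=60.0):
    """contour: (N, 2) array of ordered hand-contour points."""
    n = len(contour)
    tips = []
    for i in range(n):
        a = contour[(i - k) % n] - contour[i]     # backward vector
        b = contour[(i + k) % n] - contour[i]     # forward vector
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
        angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
        if angle < max_angle_deg:                 # sharp corner -> candidate
            tips.append((i, angle))
    # keep only the sharpest candidate in each run of neighboring indices
    return [min(run, key=lambda t: t[1]) for run in _split_runs(tips, k)]

def _split_runs(cands, gap):
    """Group (index, angle) candidates whose indices are within `gap`."""
    runs, cur = [], []
    for c in cands:
        if cur and c[0] - cur[-1][0] > gap:
            runs.append(cur)
            cur = []
        cur.append(c)
    if cur:
        runs.append(cur)
    return runs
```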

Design and Evaluation of a Hand-held Device for Recognizing Mid-air Hand Gestures

  • Seo, Kyeongeun; Cho, Hyeonjoong
    • KIPS Transactions on Software and Data Engineering, v.4 no.2, pp.91-96, 2015
  • We propose AirPincher, a handheld pointing device for recognizing delicate mid-air hand gestures to control a remote display. AirPincher is designed to overcome the disadvantages of the two existing kinds of hand-gesture-aware techniques, glove-based and vision-based: glove-based techniques require the cumbersome step of putting on gloves every time, while vision-based techniques make performance dependent on the distance between the user and the remote display. AirPincher allows a user to hold the device in one hand and generate several delicate finger gestures, which are captured by several sensors embedded in the device at close range. These features help AirPincher avoid the aforementioned disadvantages. We experimentally find an efficient size for the virtual input space and evaluate two types of pointing interfaces with AirPincher on a remote display. Our experiments suggest appropriate configurations for using the proposed device.
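
One way to realize an absolute pointing interface of the kind evaluated above is to map the fingertip position inside the hand-held virtual input space linearly onto the remote display; the box size and display resolution below are assumptions, not the configurations the paper recommends.

```python
import numpy as np

BOX_MIN = np.array([-0.05, -0.05])   # virtual input space bounds, meters (assumed)
BOX_MAX = np.array([ 0.05,  0.05])
DISPLAY = np.array([1920, 1080])     # remote display resolution (assumed)

def to_cursor(finger_xy):
    """Absolute pointing: normalize within the box, clamp, scale to pixels."""
    t = (np.asarray(finger_xy, float) - BOX_MIN) / (BOX_MAX - BOX_MIN)
    return np.clip(t, 0.0, 1.0) * DISPLAY

print(to_cursor([0.0, 0.02]))   # centered horizontally, 70% of the way down
```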

Knitted Data Glove System for Finger Motion Classification

  • Lee, Seulah; Choi, Yuna; Cha, Gwangyeol; Sung, Minchang; Bae, Jihyun; Choi, Youngjin
    • The Journal of Korea Robotics Society, v.15 no.3, pp.240-247, 2020
  • This paper presents a novel knitted data glove system for pattern classification of hand postures. Several experiments were conducted to confirm the performance of the knitted data glove. To identify the better sensor material, gloves were fabricated with stainless-steel yarn and with silver-plated yarn, two representative conductive yarns. The results showed that the signal from the glove made of silver-plated yarn remained more stable than that of the stainless-steel yarn as the measurement distance increased. Pattern classification was then conducted to verify the performance of the glove knitted with the silver-plated yarn. The average classification accuracy reached 100% for every posture except the pointing-finger posture, and the overall classification accuracy was 98.3%. Based on these results, we expect the knitted data glove to be applicable to various robotics fields, including human-machine interfaces.
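
The pattern-classification step can be sketched with a standard classifier, assuming scikit-learn is available; the SVM, the five-channel feature layout (one value per finger), and the toy data are illustrative assumptions rather than the authors' pipeline.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# toy data: 5 sensor channels (one per finger), 3 hand postures
X = np.vstack([rng.normal(loc=m, scale=0.05, size=(40, 5))
               for m in ([0, 0, 0, 0, 0],      # open hand
                         [1, 0, 0, 0, 0],      # pointing finger
                         [1, 1, 1, 1, 1])])    # fist
y = np.repeat(["open", "pointing", "fist"], 40)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)        # train posture classifier
print("accuracy:", clf.score(X_te, y_te))
```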

Stereo-Vision-Based Human-Computer Interaction with Tactile Stimulation

  • Yong, Ho-Joong; Back, Jong-Won; Jang, Tae-Jeong
    • ETRI Journal, v.29 no.3, pp.305-310, 2007
  • If a virtual object in a virtual environment represented by a stereo vision system could be touched by a user with some tactile feeling on his/her fingertip, the sense of reality would be heightened. To create a visual impression as if the user were directly pointing to a desired point on a virtual object with his/her own finger, we need to align virtual space coordinates and physical space coordinates. Also, if there is no tactile feeling when the user touches a virtual object, the virtual object would seem to be a ghost. Therefore, a haptic interface device is required to give some tactile sensation to the user. We have constructed such a human-computer interaction system in the form of a simple virtual reality game using a stereo vision system, a vibro-tactile device module, and two position/orientation sensors.
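
The alignment-and-feedback loop can be sketched as a per-frame update: the tracked fingertip is mapped through a calibrated physical-to-virtual transform, and a vibro-tactile pulse fires while it penetrates a virtual object. The transform, the sphere object, and the vibrate callback are hypothetical.

```python
import numpy as np

T_VP = np.eye(4)                        # physical -> virtual transform (assumed calibrated)
SPHERE_C = np.array([0.0, 0.1, 0.5])    # hypothetical virtual object: sphere center, meters
SPHERE_R = 0.05                         # sphere radius

def to_virtual(p_phys):
    """Map a tracked physical point into virtual-space coordinates."""
    p = np.append(np.asarray(p_phys, float), 1.0)   # homogeneous coordinates
    return (T_VP @ p)[:3]

def update(fingertip_phys, vibrate):
    """Call once per tracked frame; `vibrate` drives the tactile module."""
    p = to_virtual(fingertip_phys)
    touching = np.linalg.norm(p - SPHERE_C) <= SPHERE_R
    vibrate(touching)                   # pulse on while the fingertip is inside
    return touching

# toy usage with a no-op actuator callback
print(update([0.0, 0.1, 0.52], vibrate=lambda on: None))
```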

The Development of an Object-linked Broadcasting System

  • Kanatsugu, Yasuaki; Misu, Toshihiko; Takahashi, Masaki; Gohshi, Seiichi
    • Journal of Broadcast Engineering, v.9 no.2, pp.102-109, 2004
  • We have proposed an Object-linked Broadcasting Service that displays various data related to an onscreen object. In this paper, we describe the structure of the Object-linked Broadcasting System that realizes our proposal and report the new techniques we developed to create it. We carried out experiments to confirm the performance of the system as a whole as well as the execution of each of its component technologies. We confirmed that the performance of the developed system satisfies the proposed specification, which is based on user requirements and current technology.