• Title/Summary/Keyword: Finger gesture

Development of Sensor System for Finger Gesture (수화 인식에 대한 센서 시스템)

  • Lee, Jaehong;Jeong, Eunseok;Kim, DaeEun
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2011.07a / pp.4-5 / 2011
  • Sign language is a language in which people communicate through body motions and finger movements; for this language to be communicated through digital media, its motions must be expressible as meaningful words and syllables. Here we focus on finger movements rather than body or limb motions and examine the sensor system needed for fingerspelling recognition. To recognize a continuous stream of fingerspelled letters and numbers, detecting the segmentation between characters is the most important problem: recognizing the segmentation points is the basis for separating the currently entered pattern from the next one, so that each can be recognized as a distinct fingerspelled letter or number. To develop a method for distinguishing and recognizing finger bending, we analyzed the characteristics of different sign languages, explored various applicable sensors, developed core technology for a sign-language glove, and built a sign-language glove prototype (see the illustrative sketch below).

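Segmentation is described only at a high level here; as a rough illustration, the following Python sketch marks a character boundary whenever glove sensor readings stay nearly constant for a short hold period. The frame format, thresholds, and hold criterion are assumptions, not the paper's method.

```python
import numpy as np

def segment_fingerspelling(frames, hold_frames=10, motion_thresh=0.05):
    """Split a stream of glove sensor frames into candidate characters.

    frames        : (N, S) array, one row of S flex-sensor readings per time step
    hold_frames   : consecutive low-motion frames required to count as a "hold"
    motion_thresh : maximum mean absolute change per frame to count as low motion

    Returns (start, end) index pairs, one per held posture. The hold-based
    criterion is a hypothetical stand-in for the paper's segmentation rule.
    """
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(np.diff(frames, axis=0)).mean(axis=1)  # per-frame motion
    segments, run_start = [], None
    for i, d in enumerate(diffs):
        if d < motion_thresh:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= hold_frames:
                segments.append((run_start, i))
            run_start = None
    if run_start is not None and len(diffs) - run_start >= hold_frames:
        segments.append((run_start, len(diffs)))
    return segments
```

Each detected segment would then be classified as one fingerspelled letter or digit.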

MPEG-U-based Advanced User Interaction Interface Using Hand Posture Recognition

  • Han, Gukhee;Choi, Haechul
    • IEIE Transactions on Smart Processing and Computing / v.5 no.4 / pp.267-273 / 2016
  • Hand posture recognition is an important technique for enabling a natural and familiar interface in the human-computer interaction (HCI) field. This paper introduces a hand posture recognition method using a depth camera. Moreover, the hand posture recognition method is incorporated with the Moving Picture Experts Group Rich Media User Interface (MPEG-U) Advanced User Interaction (AUI) Interface (MPEG-U part 2), which can provide a natural interface on a variety of devices. The proposed method initially detects the positions and lengths of all open fingers, and then recognizes the hand posture from the pose of one or two hands, as well as the number of folded fingers, when a user presents a gesture representing a pattern in the AUI data format specified in MPEG-U part 2. The AUI interface represents a user's hand posture in the compliant MPEG-U schema structure. Experimental results demonstrate the performance of the hand posture recognition system and verify that the AUI interface is compatible with the MPEG-U standard (see the illustrative sketch below).

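The abstract states that a posture is recognized from which fingers are folded; as an assumed illustration only, the Python sketch below maps per-finger open/closed flags to a posture label. The finger order, posture names, and mapping are hypothetical and do not reproduce the MPEG-U AUI schema.

```python
# Hypothetical mapping from per-finger states to a posture label.
# In the paper, the flags would come from depth-camera finger detection and
# the result would then be encoded in the MPEG-U AUI data format (not shown).
FINGERS = ("thumb", "index", "middle", "ring", "pinky")

POSTURES = {
    (True, True, True, True, True):      "open_palm",
    (False, False, False, False, False): "fist",
    (False, True, True, False, False):   "victory",
    (False, True, False, False, False):  "point",
}

def classify_posture(open_flags):
    """open_flags: five booleans, True where the corresponding finger is extended."""
    return POSTURES.get(tuple(open_flags), "unknown")

print(classify_posture((False, True, True, False, False)))  # -> "victory"
```
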
Design of OpenCV based Finger Recognition System using binary processing and histogram graph

  • Baek, Yeong-Tae;Lee, Se-Hoon;Kim, Ji-Seong
    • Journal of the Korea Society of Computer and Information / v.21 no.2 / pp.17-23 / 2016
  • NUI is a motion-based interface: it controls a device using the user's body, without HID devices such as a mouse and keyboard. In this paper, we use a Pi Camera and sensors connected to the small embedded board Raspberry Pi. Using OpenCV algorithms optimized for image recognition and computer vision, we implement an NUI device that is more human-friendly and intuitive than traditional HID equipment. Motion is detected through a comparison operation, and we propose a more advanced recognition system in which motion sensors connected to the Raspberry Pi are fused with the camera-based recognition (see the illustrative sketch below).

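Binary processing for finger recognition commonly means thresholding the image, taking the largest contour as the hand, and counting convexity defects as gaps between fingers; the Python/OpenCV sketch below shows that generic pipeline under those assumptions, not the paper's exact implementation (the Otsu threshold and the 90-degree defect-angle test are assumed).

```python
import cv2
import numpy as np

def count_fingers(gray):
    """Estimate the number of raised fingers in a grayscale hand image."""
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)          # largest blob = hand
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    gaps = 0
    for s, e, f, _ in defects[:, 0]:
        a, b, c = hand[s][0], hand[e][0], hand[f][0]
        v1, v2 = a - c, b - c
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-6)
        # A sharp angle at the defect point suggests a gap between two fingers.
        if np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))) < 90:
            gaps += 1
    return gaps + 1 if gaps else 0
```
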
Cognitive and Emotional Structure of a Robotic Game Player in Turn-based Interaction

  • Yang, Jeong-Yean
    • International Journal of Advanced Smart Convergence / v.4 no.2 / pp.154-162 / 2015
  • This paper focuses on how cognitive and emotional structures affect humans during long-term interaction. We design an interaction around a turn-based game, the Chopstick Game, in which two agents play with numbers using their fingers. While a human and a robot agent alternate turns, the human user engages in playing the game and learning new winning skills from the robot agent. The conventional valence and arousal space is applied to design the emotional interaction. For the robotic system, we implement finger gesture recognition and emotional behaviors designed for a three-dimensional virtual robot. In the experimental tests, the appropriateness of the proposed schemes is verified and the effect of the emotional interaction is discussed (see the illustrative game sketch below).

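The Chopstick Game referenced here is the turn-based finger-counting game; for orientation, the Python sketch below implements the commonly played rules (tapping adds your hand's count to the opponent's hand, and a hand reaching five is eliminated). The paper's exact rule variant and its emotional model are not reproduced.

```python
def tap(attacker, defender, a_idx, d_idx):
    """One Chopsticks move: attacker's hand a_idx taps defender's hand d_idx.

    Hands are two-element lists of finger counts; a hand at five or more
    becomes 0 (dead). These are the commonly played rules, which may differ
    from the variant used in the paper.
    """
    if attacker[a_idx] == 0 or defender[d_idx] == 0:
        raise ValueError("cannot tap with or onto a dead hand")
    defender[d_idx] += attacker[a_idx]
    if defender[d_idx] >= 5:
        defender[d_idx] = 0
    return defender

# Example turn: robot [1, 1] taps the human's first hand.
human, robot = [1, 1], [1, 1]
tap(robot, human, 0, 0)
print(human)  # [2, 1]; the human loses once both hands reach 0
```
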
A Hand Gesture Recognition System using 3D Tracking Volume Restriction Technique (3차원 추적영역 제한 기법을 이용한 손 동작 인식 시스템)

  • Kim, Kyung-Ho;Jung, Da-Un;Lee, Seok-Han;Choi, Jong-Soo
    • Journal of the Institute of Electronics and Information Engineers / v.50 no.6 / pp.201-211 / 2013
  • In this paper, we propose a hand tracking and gesture recognition system. Our system employs a depth capture device to obtain 3D geometric information about the user's bare hand. In particular, we build a flexible tracking volume and restrict the hand tracking area, so that we can avoid diverse problems caused by conventional object detection/tracking systems. The proposed system computes a running average of the hand position, and the tracking volume is actively adjusted according to statistical information computed from the uncertainty of the user's hand motion in 3D space. Once the position of the user's hand is obtained, the system attempts to detect stretched fingers to recognize the finger gesture of the user's hand. To test the proposed framework, we built an NUI system using the proposed technique and verified that our system shows very stable performance even when multiple objects exist simultaneously in a crowded environment, as well as when the scene is temporarily occluded. We also verified that our system maintains a running speed of 24-30 frames per second throughout the experiments (see the illustrative sketch below).

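The abstract says the tracking volume is adjusted from statistics of the hand's motion uncertainty; one plausible reading is a running mean and variance of the 3D hand position, with the volume sized as a multiple of the standard deviation. The Python sketch below follows that reading; the smoothing factor, scale, and minimum size are assumptions.

```python
import numpy as np

class AdaptiveTrackingVolume:
    """Axis-aligned 3D box around the hand, resized from motion statistics.

    Keeps exponentially weighted estimates of the hand position's mean and
    variance and centers the volume on the mean with half-extents
    proportional to the standard deviation (assumed interpretation).
    """
    def __init__(self, alpha=0.2, scale=3.0, min_half=0.05):
        self.alpha, self.scale, self.min_half = alpha, scale, min_half
        self.mean, self.var = None, np.zeros(3)

    def update(self, hand_pos):
        p = np.asarray(hand_pos, dtype=float)
        if self.mean is None:
            self.mean = p.copy()
        else:
            delta = p - self.mean
            self.mean = self.mean + self.alpha * delta
            self.var = (1 - self.alpha) * (self.var + self.alpha * delta ** 2)
        half = np.maximum(self.scale * np.sqrt(self.var), self.min_half)
        return self.mean - half, self.mean + half  # opposite corners of the volume

vol = AdaptiveTrackingVolume()
low, high = vol.update([0.10, 0.25, 0.80])  # hand position in meters from a depth camera
```
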
Motion Plane Estimation for Real-Time Hand Motion Recognition (실시간 손동작 인식을 위한 동작 평면 추정)

  • Jeong, Seung-Dae;Jang, Kyung-Ho;Jung, Soon-Ki
    • The KIPS Transactions: Part B / v.16B no.5 / pp.347-358 / 2009
  • In this paper, we develop a vision-based hand motion recognition system using a camera with two rotational motors. Existing systems were implemented with a range camera or multiple cameras and have a limited working area. In contrast, we use an uncalibrated camera and obtain a wider working area through pan-tilt motion. Given the image sequence provided by the pan-tilt camera, color and pattern information are integrated into a tracking system in order to find the 2D position and direction of the hand. With this pose information, we estimate the 3D motion plane on which the gesture trajectory approximately lies. The 3D trajectory of the moving fingertip is projected onto the motion plane, so that the resolving power for linear gesture patterns is enhanced. We have tested the proposed approach in terms of the accuracy of the trace angle and the dimensions of the working volume (see the illustrative sketch below).

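Estimating a motion plane from a 3D fingertip trajectory and projecting the trajectory onto it can be done with a least-squares fit via SVD; the sketch below shows that generic construction in Python, not the paper's specific estimation procedure.

```python
import numpy as np

def fit_motion_plane(points):
    """Least-squares plane through 3D trajectory points.

    Returns the centroid and the unit normal (direction of least variance).
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]

def project_to_plane(points, centroid, normal):
    """Project 3D points onto the fitted plane, flattening the trajectory."""
    pts = np.asarray(points, dtype=float)
    offsets = (pts - centroid) @ normal
    return pts - np.outer(offsets, normal)

# Example: a roughly planar fingertip path with small out-of-plane noise.
traj = np.array([[0.0, 0.00, 0.01],
                 [1.0, 0.10, -0.02],
                 [2.0, 0.20, 0.00],
                 [3.0, 0.35, 0.02]])
c, n = fit_motion_plane(traj)
flat = project_to_plane(traj, c, n)
```
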
Design and Evaluation of a Hand-held Device for Recognizing Mid-air Hand Gestures (공중 손동작 인식을 위한 핸드 헬드형 기기의 설계 및 평가)

  • Seo, Kyeongeun;Cho, Hyeonjoong
    • KIPS Transactions on Software and Data Engineering / v.4 no.2 / pp.91-96 / 2015
  • We propose AirPincher, a handheld pointing device that recognizes delicate mid-air hand gestures to control a remote display. AirPincher is designed to overcome the disadvantages of the two existing kinds of hand gesture-aware techniques, glove-based and vision-based: glove-based techniques are cumbersome because a glove must be worn every time, and vision-based techniques suffer from performance that depends on the distance between the user and the remote display. AirPincher allows a user to hold the device in one hand and to generate several delicate finger gestures, which are captured by several sensors embedded in AirPincher at close range. These features help AirPincher avoid the aforementioned disadvantages of the existing techniques. We experimentally determine an efficient size for the virtual input space and evaluate two types of pointing interfaces with AirPincher for a remote display. Our experiments suggest appropriate configurations for using the proposed device.

Real-Time Recognition Method of Counting Fingers for Natural User Interface

  • Lee, Doyeob;Shin, Dongkyoo;Shin, Dongil
    • KSII Transactions on Internet and Information Systems (TIIS) / v.10 no.5 / pp.2363-2374 / 2016
  • Communication occurs through verbal elements, which usually involve language, as well as non-verbal elements such as facial expressions, eye contact, and gestures. In particular, among these non-verbal elements, gestures are symbolic representations of physical, vocal, and emotional behaviors. This means that gestures can be signals toward a target or expressions of internal psychological processes, rather than simply movements of the body or hands. Moreover, gestures with such properties have been the focus of much research into new interfaces in the NUI/NUX field. In this paper, we propose a method for detecting the hand region and recognizing the number of raised fingers, based on depth information and geometric features of the hand, for application to an NUI/NUX. The hand region is detected using depth information provided by the Kinect system, and the number of fingers is identified by comparing the distances between the contour and the center of the hand region. The contour is detected using the Suzuki85 algorithm, and fingertips are found at points of locally maximal distance, obtained by comparing the distances of three consecutive contour points from the center point of the hand. The average recognition rate for the number of fingers is 98.6%, and the execution time of the algorithm is 0.065 ms. The method is fast and of low complexity, yet shows a higher recognition rate and faster recognition speed than other methods. As an application example of the proposed method, this paper describes a Secret Door that recognizes a password from the number of fingers held up by a user (see the illustrative sketch below).

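The fingertip test described here compares three consecutive contour points against the hand center and keeps local distance maxima; the Python sketch below illustrates that test on a contour such as one returned by cv2.findContours (which implements the Suzuki85 border-following algorithm). The minimum-distance ratio used to reject non-fingertip peaks is an assumption.

```python
import numpy as np

def count_fingertips(contour, center, min_ratio=0.6):
    """Count fingertips as local maxima of the contour-to-center distance.

    contour   : (N, 2) array of contour points (e.g. squeezed cv2.findContours output)
    center    : (2,) hand center, e.g. from image moments
    min_ratio : a peak must exceed this fraction of the largest distance (assumed filter)
    """
    pts = np.asarray(contour, dtype=float)
    c = np.asarray(center, dtype=float)
    dist = np.linalg.norm(pts - c, axis=1)
    n, peak_thresh = len(dist), min_ratio * dist.max()
    tips = 0
    for i in range(n):
        prev_d, next_d = dist[(i - 1) % n], dist[(i + 1) % n]
        # Local maximum over three consecutive contour points, far enough from the center.
        if dist[i] > prev_d and dist[i] >= next_d and dist[i] > peak_thresh:
            tips += 1
    return tips
```
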
Development of Five Finger type Myoelectric Hand Prosthesis for State Transition-Based Multi-Hand Gestures change (다중 손동작 변환을 위한 상태 전이 기반 5손가락 근전전동의수 개발)

  • Seung-Gi Kim;Sung-Yoon Jung;Beom-ki Hong;Hyun-Jun Shin;Kyoung-Ho Kim;Se-Hoon Park
    • Journal of the Institute of Convergence Signal Processing / v.25 no.2 / pp.67-76 / 2024
  • Various types of assistive devices have been developed for upper limb amputees over the years; myoelectric prostheses, in particular, aim to improve user convenience by enabling a range of hand gestures beyond simple grasping, tailored to the size and shape of objects. In this study, we developed a five-finger myoelectric prosthesis that mimics human hand size and finger movements, using motor and worm-gear mechanisms for stable and independent operation. On this basis, we designed a control system for independent finger control through electromyographic signal input and proposed a state transition-based hand gesture conversion algorithm, selecting eight representative hand gestures and defining the parameters of the conversion conditions. We introduced training and usability evaluation methods, conducted usability assessments with upper limb amputees using dedicated tools, confirmed the potential for commercial application of the algorithm, and observed adaptive capability and high performance through iterative evaluations (see the illustrative sketch below).

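The abstract names a state transition-based gesture conversion algorithm driven by EMG input, but does not list the eight gestures or the transition conditions; the Python sketch below shows a generic table-driven state machine of that kind. The gesture names, trigger events, and transition table are hypothetical placeholders.

```python
# Hypothetical transition table for switching between prosthesis hand gestures.
# The paper's eight gestures and its EMG-derived conversion conditions are not
# given in the abstract, so the states and events below are placeholders.
TRANSITIONS = {
    ("rest", "short_contraction"): "power_grip",
    ("power_grip", "short_contraction"): "rest",
    ("rest", "long_contraction"): "precision_pinch",
    ("precision_pinch", "short_contraction"): "rest",
    ("rest", "double_contraction"): "point",
    ("point", "short_contraction"): "rest",
}

class GestureController:
    def __init__(self, state="rest"):
        self.state = state

    def on_emg_event(self, event):
        """Advance the gesture state if the (state, event) pair is defined."""
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

ctrl = GestureController()
print(ctrl.on_emg_event("short_contraction"))  # -> "power_grip"
print(ctrl.on_emg_event("short_contraction"))  # -> "rest"
```
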
Developed a golf course scorecard App that improved UI/UX based on C/S (C/S 기반의 UI/UX를 개선한 골프장 스코어카드 App 개발)

  • Jung, Chul-Jong
    • Journal of Digital Contents Society / v.19 no.8 / pp.1433-1442 / 2018
  • This study develops and improves EZ Touch, a scorecard application (app) for smartphones and pads that works together with a customer management system (C/S). The research addressed the following questions. First, how should the EZ Touch input method handle score entry on the scorecard? Second, how should the platform of the customer (member) management system (C/S) and the data server system be configured? Third, does the EZ Touch app work organically with the customer management system (C/S)? As a result of this research, the developed EZ Touch enters scores on the scorecard through a finger-gesture input method and, linked with the C/S system, provides a review function, hole information, field coaching based on scores, information management, and differentiated statistics. However, some problems and room for improvement remain in real-time user convenience, which should be addressed in future work. The purpose of the study is to improve the technical competitiveness of the product by developing this gesture-based scorecard application linked with the C/S system.