• Title/Summary/Keyword: 손동작 추적 (hand motion tracking)

Vision and Depth Information based Real-time Hand Interface Method Using Finger Joint Estimation (손가락 마디 추정을 이용한 비전 및 깊이 정보 기반 손 인터페이스 방법)

  • Park, Kiseo;Lee, Daeho;Park, Youngtae
    • Journal of Digital Convergence / v.11 no.7 / pp.157-163 / 2013
  • In this paper, we propose a vision and depth information based real-time hand gesture interface method using finger joint estimation. First, the left and right hand areas are segmented after mapping the visual image onto the depth image, followed by labeling and boundary noise removal. The centroid point and rotation angle of each hand area are then calculated. Afterwards, circles of increasing radius are expanded from the centroid of the hand, and the midpoints of the arcs where each circle crosses the hand boundary are taken as the joint points and end points of the fingers, from which the hand model is recognized. Experimental results show that our method distinguishes fingertips and recognizes various hand gestures quickly and accurately. In experiments on various two-handed poses with hidden fingers, accuracy exceeded 90% and performance exceeded 25 fps. The proposed method can be used as a contactless input interface in HCI, education, and game applications.
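
To make the circle-expansion step concrete, here is a minimal Python sketch (not the authors' code): it samples a binary hand mask along a circle around the hand centroid and returns the midpoints of the arcs lying inside the mask, which serve as candidate joint and fingertip points. The mask, centroid, and sampling resolution are assumed inputs; running it over a sequence of growing radii yields the joint and end points the abstract describes.

```python
import numpy as np

def circle_crossing_midpoints(hand_mask, cx, cy, radius, samples=360):
    """Midpoints of the arcs where a circle of `radius` around (cx, cy)
    lies inside `hand_mask` (candidate finger joint/end points)."""
    h, w = hand_mask.shape
    angles = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
    xs = np.clip((cx + radius * np.cos(angles)).astype(int), 0, w - 1)
    ys = np.clip((cy + radius * np.sin(angles)).astype(int), 0, h - 1)
    inside = hand_mask[ys, xs] > 0          # True where the circle is on the hand
    if inside.all() or not inside.any():    # circle entirely inside or outside
        return []
    # rotate the scan to start on an "outside" sample so no run wraps around
    first_out = int(np.argmax(~inside))
    order = (np.arange(samples) + first_out) % samples
    midpoints, run = [], []
    for idx in order:
        if inside[idx]:
            run.append(idx)
        elif run:                           # run just ended: keep its midpoint
            mid = run[len(run) // 2]
            midpoints.append((int(xs[mid]), int(ys[mid])))
            run = []
    if run:                                 # close a run ending at the last sample
        mid = run[len(run) // 2]
        midpoints.append((int(xs[mid]), int(ys[mid])))
    return midpoints
```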

Design and Implementation of Finger Language Translation System using Raspberry Pi and Leap Motion (라즈베리 파이와 립 모션을 이용한 지화 번역 시스템 설계 및 구현)

  • Jeong, Pil-Seong;Cho, Yang-Hyun
    • Journal of the Korea Institute of Information and Communication Engineering / v.19 no.9 / pp.2006-2013 / 2015
  • It is difficult for deaf people to communicate by voice, so they mostly communicate through speech reading, sign language, writing, and similar means. Sign language is the best way for deaf and hearing people to communicate with each other, but both sides must understand it. In this paper, we design and implement a finger language (fingerspelling) translation system to support communication between deaf and hearing people. We use a Leap Motion, which can track finger and hand gestures, as the input device, and a Raspberry Pi, a low-power single-board computer, to process the input data and translate the finger language. The application is implemented with Node.js and MongoDB, and the client complies with HTML5 so that it runs on any smart device with a web browser.
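
As an illustration of the translation step, the sketch below maps a frame of per-finger "extended" flags to a fingerspelled label. The pattern table and the flag tuple are toy placeholders standing in for Leap Motion frame data; the paper's actual pipeline runs on Node.js with MongoDB, so this is only a shape of the idea in Python.

```python
# Toy lookup: (thumb, index, middle, ring, pinky) extended-flags -> label.
# Entries are illustrative placeholders, not a real fingerspelling alphabet.
FINGER_PATTERNS = {
    (False, True, False, False, False): "1",
    (False, True, True, False, False): "2",
    (True, False, False, False, False): "thumb-up",
}

def translate_frame(extended_flags):
    """Return the label for a 5-tuple of per-finger extended flags, if known."""
    return FINGER_PATTERNS.get(tuple(extended_flags))

# Example: an index-plus-middle frame translates to "2".
print(translate_frame((False, True, True, False, False)))  # -> "2"
```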

Leap Motion Framework for Juggling Motion According to User Motion in Virtual Environment

  • Kim, Jong-Hyun
    • Journal of the Korea Society of Computer and Information / v.26 no.11 / pp.51-57 / 2021
  • In this paper, we propose a new framework that calculates the user's hand motions using a Leap Motion device and uses them to practice and analyze juggling motions as well as arm muscle activity. The proposed method maps the movement of a ball in a virtual environment to the user's hand motions in real time and analyzes the amount of exercise by visualizing the relaxation and contraction of the muscles. The proposed framework consists of three main parts: 1) the user's hand position is tracked with the Leap Motion device; 2) the action pattern of the user throwing the ball, as in juggling, is defined as an event; 3) a parabola-based particle method maps a juggling-shaped movement onto the ball based on the user's hand position. As a result, our framework makes it possible to play a juggling game in real time.
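
A minimal sketch of what a parabola-based ball update could look like (illustrative names, not the paper's code): solve for the initial velocity that carries a ball from a throw point to a catch point under gravity in a given flight time, then evaluate the closed-form parabolic position each frame.

```python
import numpy as np

G = np.array([0.0, -9.81, 0.0])  # gravity (m/s^2), y-up coordinates

def throw_velocity(p0, p1, t_flight):
    """Initial velocity so a ball thrown at p0 lands at p1 after t_flight:
    p1 = p0 + v0*t + 0.5*g*t^2  =>  v0 = (p1 - p0)/t - 0.5*g*t."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    return (p1 - p0) / t_flight - 0.5 * G * t_flight

def ball_position(p0, v0, t):
    """Closed-form parabolic position at time t."""
    return np.asarray(p0, float) + v0 * t + 0.5 * G * t * t

# Example: a 0.8 s throw from the right hand to the left hand; the t=0.4
# sample is the apex of the arc, and t=0.8 lands exactly on the catch point.
v0 = throw_velocity([0.2, 1.0, 0.0], [-0.2, 1.0, 0.0], 0.8)
for t in (0.0, 0.4, 0.8):
    print(t, ball_position([0.2, 1.0, 0.0], v0, t))
```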

Implementation of a Voice and SNS Sharing System Based on Fingerspelling Recognition (지화인식 기반의 음성 및 SNS 공유 시스템 구현)

  • Kang, Jung-Hun;Yang, Dea-Sik;Oh, Min-Seok;Sir, Jung-Wook
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2016.10a / pp.644-646 / 2016
  • It is difficult for deaf people to communicate by voice, so they mostly communicate through speech reading, sign language, writing, and similar means. Sign language is the best way for deaf and hearing people to communicate with each other, but both sides must understand it. In this paper, we design and implement a finger language (fingerspelling) translation system to support communication between deaf and hearing people. We use a Leap Motion, which can track finger and hand gestures, as the input device, and a Raspberry Pi, a low-power single-board computer, to process the input data and translate the finger language. The application is implemented with Node.js and MongoDB, and the client complies with HTML5 so that it runs on any smart device with a web browser.

A Study on the Design and Implementation of a Camera-Based 6DoF Tracking and Pose Estimation System (카메라 기반 6DoF 추적 및 포즈 추정 시스템의 설계 및 구현에 관한 연구)

  • Do-Yoon Jeong;Hee-Ja Jeong;Nam-Ho Kim
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.24 no.5 / pp.53-59 / 2024
  • This study presents the design and implementation of a camera-based 6DoF (6 Degrees of Freedom) tracking and pose estimation system. In particular, we propose a method for accurately estimating the positions and orientations of all of a user's fingers in order to control a 6DoF robotic arm. The system is developed in Python, leveraging the Mediapipe and OpenCV libraries: Mediapipe extracts the keypoints of the fingers in real time, allowing precise recognition of each finger's joint positions, while OpenCV processes the image data collected from the camera to analyze the finger positions, enabling pose estimation. The approach is designed to maintain high accuracy under varying lighting conditions and changes in hand position. The system's performance has been validated through experiments evaluating the accuracy of hand gesture recognition and the control capabilities of the robotic arm; the results demonstrate that the system estimates finger positions in real time, enabling precise movements of the 6DoF robotic arm. This research is expected to contribute to robot control and human-robot interaction and to open up a range of future applications.
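
The Mediapipe/OpenCV keypoint step the abstract describes can be sketched as follows (the camera index and confidence thresholds are illustrative, and the robot-arm mapping is omitted): read camera frames, extract the 21 hand landmarks per detected hand, and report the fingertip positions.

```python
import cv2
import mediapipe as mp

FINGERTIPS = [4, 8, 12, 16, 20]  # Mediapipe landmark ids for the five fingertips

hands = mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.5)
cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Mediapipe expects RGB input; OpenCV captures BGR
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            tips = [(hand.landmark[i].x, hand.landmark[i].y) for i in FINGERTIPS]
            print(tips)  # normalized [0, 1] image coordinates
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
```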