• Title/Summary/Keyword: Leap Motion Controller

Search Results: 12

Design of Realtime MIDI Controller by using Leap Motion (Leap Motion을 이용한 실시간 MIDI Controller의 설계)

  • So, Junseop
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2016.10a
    • /
    • pp.726-728
    • /
    • 2016
  • In this paper, a realtime MIDI controller using Leap Motion is designed. The controller creates a virtual MIDI port on Windows and connects to it, using the MIDI protocol to communicate with a DAW or VST. The position and shape of the hand, detected in real time, are converted into MIDI data, which is transferred to the MIDI port as MIDI CC (Control Change) messages. The purpose of this controller is therefore to support more flexible MIDI input than existing MIDI controllers.

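The conversion the abstract describes — a detected hand position turned into a MIDI CC (Control Change) message — can be sketched as follows. This is a minimal illustration; the value range, function names, and byte layout chosen here are assumptions for the sketch, not details taken from the paper:

```python
def position_to_cc(y_mm, y_min=100.0, y_max=400.0):
    """Map a hand height (mm above the sensor) to a MIDI CC value (0-127)."""
    t = (y_mm - y_min) / (y_max - y_min)
    t = max(0.0, min(1.0, t))  # clamp to the assumed working range
    return round(t * 127)

def cc_message(control, value, channel=0):
    """Build the three raw bytes of a MIDI Control Change message."""
    return bytes([0xB0 | channel, control & 0x7F, value & 0x7F])

# A hand at the bottom of the range produces CC value 0, at the top 127
msg = cc_message(control=1, value=position_to_cc(400.0))
```

In a real setup the bytes would be written to the virtual MIDI port the controller creates, from which the DAW or VST reads them.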

Development of Baseball Game Using Leap Motion Controllers (립 모션 컨트롤러를 이용한 야구 게임 개발)

  • Joo, Hyanghan;Cho, Minsoo;In, SeungKyo;Cho, Kyuwon;Min, Jun-Ki
    • KIISE Transactions on Computing Practices
    • /
    • v.21 no.5
    • /
    • pp.343-350
    • /
    • 2015
  • While many games have been published that use input devices such as a mouse and keyboard, the number of games that recognize human behavior through devices such as the Kinect and Wii has increased. In this paper, we present the development of a baseball game that utilizes a Leap Motion Controller, which accurately recognizes the movement of a user's fingers. Our implemented game consists of characters, a background, and animation; it is an animated game in which users play from a third-person point of view. The major feature of our game is that players can enjoy it using a Leap Motion Controller.

Drone Hand Gesture Control System for DJI Mavic Air (DJI 매빅에이어를 위한 드론 손 제스처 제어 시스템)

  • Hamzah, Mohd Haziq bin;Jung, Jinwoong;Lee, Joohyun;Choo, Hyunseung
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2018.10a
    • /
    • pp.333-334
    • /
    • 2018
  • This is a study on controlling a drone (DJI Mavic Air) with simple hand gestures using a Leap Motion controller. The four components involved are a MacBook, a Leap Motion controller, an Android device, and a DJI Mavic Air, all connected through USB, Bluetooth, and Wi-Fi. The study's main purpose is to show that, by controlling a drone through Leap Motion, amateur drone users can easily learn to fly a drone, and that because a longer control range can be achieved, tasks such as search-and-rescue missions become possible.

Selection of features and hidden Markov model parameters for English word recognition from Leap Motion air-writing trajectories

  • Deval Verma;Himanshu Agarwal;Amrish Kumar Aggarwal
    • ETRI Journal
    • /
    • v.46 no.2
    • /
    • pp.250-262
    • /
    • 2024
  • Air-writing recognition is relevant in areas such as natural human-computer interaction, augmented reality, and virtual reality. A trajectory is the most natural way to represent air writing. We analyze the recognition accuracy of words written in air considering five features, namely, writing direction, curvature, trajectory, orthocenter, and ellipsoid, as well as different parameters of a hidden Markov model classifier. Experiments were performed on two representative datasets, whose sample trajectories were collected using a Leap Motion Controller from a fingertip performing air writing. Dataset D1 contains 840 English words from 21 classes, and dataset D2 contains 1600 English words from 40 classes. A genetic algorithm was combined with a hidden Markov model classifier to obtain the best subset of features. The combination {trajectory, orthocenter, writing direction, curvature} provided the best feature set, achieving recognition accuracies on datasets D1 and D2 of 98.81% and 83.58%, respectively.
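Of the five features the abstract lists, the writing-direction feature is the simplest to illustrate: for each consecutive pair of trajectory points, it is the angle of the segment between them, and the resulting sequence is what a hidden Markov model classifier would consume as an observation stream. A minimal sketch, with the caveat that the function name and the 2D simplification are assumptions (the paper works with Leap Motion fingertip trajectories):

```python
import math

def writing_direction(traj):
    """Angle (radians) of each segment along a trajectory of (x, y) points.

    Each air-written word yields one such feature sequence, which can then
    serve as an observation stream for an HMM classifier.
    """
    return [math.atan2(y2 - y1, x2 - x1)
            for (x1, y1), (x2, y2) in zip(traj, traj[1:])]

# A horizontal stroke yields a constant direction of 0 radians
angles = writing_direction([(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)])
```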

Hand Gesture Recognition with Convolution Neural Networks for Augmented Reality Cognitive Rehabilitation System Based on Leap Motion Controller (립모션 센서 기반 증강현실 인지재활 훈련시스템을 위한 합성곱신경망 손동작 인식)

  • Song, Keun San;Lee, Hyun Ju;Tae, Ki Sik
    • Journal of Biomedical Engineering Research
    • /
    • v.42 no.4
    • /
    • pp.186-192
    • /
    • 2021
  • In this paper, we evaluated the prediction accuracy of a Euler-angle spectrogram classification method using a convolutional neural network (CNN) for hand gesture recognition in an augmented reality (AR) cognitive rehabilitation system based on the Leap Motion Controller (LMC). Hand gesture recognition using a conventional support vector machine (SVM) shows 91.3% accuracy across multiple motions. In this paper, five hand gestures ("Promise", "Bunny", "Close", "Victory", and "Thumb") were selected and each measured 100 times to test the utility of the spectral classification technique. In validation, all five hand gestures were correctly predicted 100% of the time, indicating higher recognition accuracy than the conventional SVM method. This suggests that CNN-based hand-motion recognition is more useful for LMC-based AR cognitive rehabilitation training systems than SVM-based sign-language recognition.

Intuitive Spatial Drawing System based on Hand Interface (손 인터페이스 기반 직관적인 공간 드로잉 시스템)

  • Ko, Ginam;Kim, Serim;Kim, YoungEun;Nam, SangHun
    • Journal of Digital Contents Society
    • /
    • v.18 no.8
    • /
    • pp.1615-1620
    • /
    • 2017
  • The development of Virtual Reality (VR)-related technologies has improved the performance of VR devices and lowered their prices, granting many users easy access to VR technology. VR drawing applications are easy for users to learn and are highly mature, being used for education, performances, and more. In controller-based spatial drawing interfaces, however, the user's drawing interface is constrained by the controller. This study proposes a hand-interaction-based spatial drawing system in which a user who has never used a controller can intuitively use the drawing application, by mounting a Leap Motion sensor on the front of the Head Mounted Display (HMD). The system traces the motion of the user's hands in front of the HMD to draw curved surfaces in virtual environments.

3D Object Location Identification Using Finger Pointing and a Robot System for Tracking an Identified Object (손가락 Pointing에 의한 물체의 3차원 위치정보 인식 및 인식된 물체 추적 로봇 시스템)

  • Gwak, Dong-Gi;Hwang, Soon-Chul;Ok, Seo-Won;Yim, Jung-Sae;Kim, Dong Hwan
    • Journal of the Korean Society of Manufacturing Technology Engineers
    • /
    • v.24 no.6
    • /
    • pp.703-709
    • /
    • 2015
  • In this work, a robot aimed at grasping and delivering an object in response to a simple finger-pointing command from a hand- or arm-handicapped person is introduced. In this robot system, a Leap Motion sensor is utilized to obtain the finger-motion data of the user, and a Kinect sensor is used to measure the 3D (three-dimensional) position of the desired object. Once the object is indicated by the user's finger pointing, its exact 3D position is determined using an image-processing technique and a coordinate transformation between the Leap Motion and Kinect sensors. This information is transmitted to the robot controller, and the robot grasps the target and delivers it to the handicapped person successfully.
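The coordinate transformation between the two sensors that the abstract mentions is, in the usual formulation, a rigid transform p' = Rp + t mapping a point from the Leap Motion frame into the Kinect frame. A minimal sketch; the rotation matrix and translation values below are illustrative placeholders, not calibration data from the paper:

```python
def transform_point(R, t, p):
    """Apply the rigid transform p' = R p + t to a 3D point p."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i]
                 for i in range(3))

# Hypothetical calibration: identity rotation, pure translation between
# the two sensor origins (values in mm, chosen only for illustration)
R_identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t_offset = (0.0, 250.0, -100.0)

kinect_point = transform_point(R_identity, t_offset, (10.0, 20.0, 30.0))
```

In practice R and t would come from calibrating the two sensors against shared reference points.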

Searching Human Motion Data by Sketching 3D Trajectories (3차원 이동 궤적 묘사를 통한 인간 동작 데이터 검색)

  • Lee, Kang Hoon
    • Journal of the Korea Computer Graphics Society
    • /
    • v.19 no.2
    • /
    • pp.1-8
    • /
    • 2013
  • Captured human motion data has been widely utilized for understanding the mechanism of human motion and for synthesizing the animation of virtual characters. Searching for desired motions in given motion data is an important prerequisite for analyzing and editing the selected motions. This paper presents a new method of content-based motion retrieval that needs no additional metadata such as keywords. While existing search methods have focused on skeletal configurations of body pose or planar trajectories of locomotion, our method receives a three-dimensional trajectory as its input query and retrieves a set of motion intervals in which the trajectories of body parts such as the hands, feet, and pelvis are similar to the input trajectory. To allow the user to sketch spatial trajectories intuitively, we used the Leap Motion controller, which can precisely trace finger movements, as the input device for our experiments. We evaluated the effectiveness of our approach with a user study in which users searched for dozens of pre-selected motions in basketball motion data including a variety of moves such as dribbling and shooting.
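The abstract does not name the measure used to compare a sketched query trajectory against recorded body-part trajectories; dynamic time warping (DTW) is one common choice for such trajectory matching, and is sketched here in 1D for brevity (an assumption for illustration, not the paper's stated method):

```python
def dtw(a, b):
    """Dynamic time warping distance between two 1D sequences.

    Aligns the sequences non-linearly in time, so trajectories sketched
    at different speeds can still match closely.
    """
    INF = float('inf')
    n, m = len(a), len(b)
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

# A query sketched slower than the stored motion still aligns cleanly
dist = dtw([1.0, 2.0, 3.0], [1.0, 1.0, 2.0, 3.0, 3.0])
```

For 3D trajectories the per-point cost would be a Euclidean distance rather than an absolute difference.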

Design of Gaming Interaction Control using Gesture Recognition and VR Control in FPS Game (FPS 게임에서 제스처 인식과 VR 컨트롤러를 이용한 게임 상호 작용 제어 설계)

  • Lee, Yong-Hwan;Ahn, Hyochang
    • Journal of the Semiconductor & Display Technology
    • /
    • v.18 no.4
    • /
    • pp.116-119
    • /
    • 2019
  • User interface/experience and realistic game manipulation play an important role in virtual reality first-person-shooter (FPS) games. This paper presents an intuitive hands-free gaming interaction scheme for FPS games based on the user's gesture recognition and a VR controller. We examine the conventional interface for VR FPS interaction and design the player interaction around a head-mounted display with two motion controllers: a Leap Motion to handle low-level physics interaction and a VIVE tracker to control the movement of the player's joints in the VR world. The FPS prototype system shows that the designed interface helps players enjoy an immersive FPS and gives them a new gaming experience.

A Security System using a Movement Pattern drawn with Fingers (손가락으로 그린 움직임 패턴을 이용한 보안 시스템)

  • Han, Juchan;Jeon, Minseong;Cheoi, Kyungjoo
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2016.04a
    • /
    • pp.730-732
    • /
    • 2016
  • In this paper, to strengthen security, the authorized user's password is composed not of numbers but of a simple yet unique finger-movement pattern known only to the user, and a security system applying this scheme is proposed. In the registration stage, the proposed system captures the finger security pattern using a Leap Motion Controller, and it was implemented using OpenGL and OpenCV. Experimental results show that the proposed system accurately rejected incorrect security patterns and achieved a satisfactory correct-recognition rate of 91.4%.