• Title/Summary/Keyword: hand interface

Search Results: 600

Alphabetical Gesture Recognition using HMM (HMM을 이용한 알파벳 제스처 인식)

  • Yoon, Ho-Sub;Soh, Jung;Min, Byung-Woo
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 1998.10c
    • /
    • pp.384-386
    • /
    • 1998
  • The use of hand gestures provides an attractive alternative to cumbersome interface devices for human-computer interaction (HCI). Many methods for hand gesture recognition using visual analysis have been proposed, such as syntactical analysis, neural networks (NN), Hidden Markov Models (HMM), and so on. In our research, an HMM-based method is proposed for alphabetical hand gesture recognition. In the preprocessing stage, the proposed approach consists of three procedures: hand localization, hand tracking, and gesture spotting. The hand localization procedure detects candidate regions on the basis of skin color and motion in an image, using color histogram matching and time-varying edge difference techniques. The hand tracking algorithm finds the centroid of a moving hand region, connects those centroids, and thus produces a trajectory. For gesture spotting and the feature database, the proposed approach uses the mesh feature code as the codebook of the HMM. In our experiments, 1300 alphabetical and 1300 untrained gestures are used for training and testing, respectively. The experimental results demonstrate that the proposed approach yields a high and satisfactory recognition rate for images with different sizes, shapes, and skew angles.

  • PDF
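The entry above classifies gestures by scoring observation sequences of codebook symbols against trained HMMs. A minimal sketch of the forward algorithm that computes such a score is shown below; the toy two-state model and its parameters are hypothetical, not the paper's.

```python
import numpy as np

def forward_log_likelihood(pi, A, B, obs):
    """Score a discrete observation sequence under an HMM.

    pi  : (N,)   initial state probabilities
    A   : (N, N) transition matrix, A[i, j] = P(state j | state i)
    B   : (N, M) emission matrix over M codebook symbols
    obs : sequence of symbol indices (e.g. mesh feature codes)
    """
    alpha = pi * B[:, obs[0]]          # initialize with the first symbol
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate states, then emit
    return np.log(alpha.sum())         # log P(obs | model)

# Toy 2-state model; classification would pick the gesture model
# with the highest log-likelihood for the observed sequence.
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
print(forward_log_likelihood(pi, A, B, [0, 1, 2]))
```

In a full recognizer one HMM is trained per letter and the spotted trajectory's code sequence is scored against each.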

Interactive sound experience interface based on virtual concert hall (가상 콘서트홀 기반의 인터랙티브 음향 체험 인터페이스)

  • Cho, Hye-Seung;Kim, Hyoung-Gook
    • The Journal of the Acoustical Society of Korea
    • /
    • v.36 no.2
    • /
    • pp.130-135
    • /
    • 2017
  • In this paper, we propose an interface for interactive sound experience in a virtual concert hall. The proposed interface consists of two systems, called 'virtual acoustic position' and 'virtual active listening'. To provide these systems, we applied an artificial reverberation algorithm, multi-channel source separation, and head-related transfer functions. The proposed interface was implemented using Unity. The interface presents the virtual concert hall to the user through the Oculus Rift, one of the virtual reality headsets. Moreover, we used Leap Motion as a control device so that users can experience the system with free hands, and users hear the sound of the system through headphones.
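The entry above mentions an artificial reverberation algorithm. A common building block for such algorithms is the feedback comb filter; the sketch below is a generic illustration with made-up parameters, not the paper's implementation.

```python
def comb_reverb(signal, delay, feedback):
    """Feedback comb filter: y[n] = x[n] + feedback * y[n - delay].
    Repeated feedback produces a decaying train of echoes."""
    out = list(signal)
    for n in range(delay, len(out)):
        out[n] += feedback * out[n - delay]
    return out

# An impulse produces an exponentially decaying echo every `delay` samples.
impulse = [1.0] + [0.0] * 9
print(comb_reverb(impulse, delay=3, feedback=0.5))
# → [1.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.25, 0.0, 0.0, 0.125]
```

Schroeder-style reverberators combine several such comb filters in parallel with all-pass filters in series to densify the echo pattern.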

A New Eye Tracking Method as a Smartphone Interface

  • Lee, Eui Chul;Park, Min Woo
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.7 no.4
    • /
    • pp.834-848
    • /
    • 2013
  • To effectively use a smartphone's many functions, various kinds of human-phone interfaces are used, such as touch, voice, and gesture. However, the most important of these, the touch interface, cannot be used by people with hand disabilities or when both hands are busy. Although eye tracking is a superb human-computer interface method, it has not been applied to smartphones because of the small screen size, the frequently changing geometric position between the user's face and the phone screen, and the low resolution of frontal cameras. In this paper, a new eye tracking method is proposed to act as a smartphone user interface. To maximize eye image resolution, a zoom lens and three infrared LEDs are adopted. The proposed method has the following novelties. First, appropriate camera specifications and image resolution are analyzed for a smartphone-based gaze tracking method. Second, facial movement is allowed as long as one eye region is included in the image. Third, the proposed method operates in both landscape and portrait screen modes. Fourth, only two LED reflection positions are used to calculate the gaze position, on the basis of the 2D geometric relation between the reflective rectangle and the screen. Fifth, a prototype mock-up module was made to confirm the feasibility of applying the method to an actual smartphone. Experimental results showed that the gaze estimation error was about 31 pixels at a screen resolution of 480×800, and the average hit ratio on a 5×4 icon grid was 94.6%.
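The entry above computes gaze from the pupil position relative to LED glints on the cornea. A heavily simplified linear version of that geometric idea is sketched below; the assumption that the two glints span the screen rectangle in eye-image space, and all coordinates, are illustrative, not the paper's calibration.

```python
def gaze_to_screen(pupil, glint1, glint2, screen_w=480, screen_h=800):
    """Map a pupil center to screen coordinates using two corneal
    glints as a reference frame (simplified linear model).

    Assumes the two glints are reflections of LEDs at opposite
    screen corners, so they define the screen rectangle in the
    eye image; the pupil's normalized position inside that
    rectangle is scaled to screen pixels.
    """
    gx1, gy1 = glint1
    gx2, gy2 = glint2
    px, py = pupil
    u = (px - gx1) / (gx2 - gx1)   # normalized horizontal position
    v = (py - gy1) / (gy2 - gy1)   # normalized vertical position
    return u * screen_w, v * screen_h

# A pupil halfway between the glints maps to the screen center.
print(gaze_to_screen((15.0, 25.0), (10.0, 20.0), (20.0, 30.0)))
# → (240.0, 400.0)
```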

Control of Grain Size of PZT Thin Film through Seed Layers (Seed Layer를 통한 PZT 박막의 결정립 크기 조절)

  • Kim, Tae-Ho;Kim, Ji-Young;Lee, In-Sup
    • Proceedings of the Korean Institute of Electrical and Electronic Material Engineers Conference
    • /
    • 2000.05b
    • /
    • pp.273-278
    • /
    • 2000
  • In order to study the effects of interface layers between PZT films and electrodes in MFM (Metal-Ferroelectric-Metal) structure capacitors, we fabricated capacitors with the Pt/PZT/interface-layer/Pt/TiO₂/SiO₂/Si structure. PT (PbTiO₃) interface layers were formed by sol-gel deposition, and PbO, ZrO₂, and TiO₂ thin layers were deposited by reactive sputtering. TiO₂ interface layers result in the finest grains of the PZT films, compared to the PbO and ZrO₂ layers. On the other hand, PT interface layers improve the morphology of the PZT films and do not significantly change their ferroelectric properties. It is also observed that seed layers at the middle and top of the PZT films do not significantly affect grain size, but a PT seed layer at the interface between the bottom electrode and the PZT film results in small grain size.

  • PDF

A Force-Reflecting Haptic Interface using Ultrasonic Motors (초음파 모터를 이용한 힘 반영 촉각장치)

  • Shin, Duk;Oh, Geum-Kon;Kim, Young-Dong
    • Journal of the Korean Institute of Illuminating and Electrical Installation Engineers
    • /
    • v.13 no.1
    • /
    • pp.111-118
    • /
    • 1999
  • Throughout this thesis, I describe the design, fabrication, and evaluation of a 3-DOF force-reflecting haptic interface using USMs (ultrasonic motors). This haptic interface allows a human "observer" to explore and interact with a virtual environment through the sense of touch. To effectively display the mechanical impedance of the human hand, we need a haptic device with specific characteristics, such as low inertia, almost zero friction, and very high stiffness. USMs have attracted considerable attention as actuators satisfying these conditions. An observer may grasp the end effector of the device and interact with surfaces and objects created within a virtual environment. The device provides force feedback, allowing users to "feel" objects within the environment. The device works very well, as users are able to detect the edge of a wall, the stiffness of a button, and the puncture. The force-reflecting haptic interface could be suitable as a master for micro-surgery or as an interface to virtual reality training systems.

  • PDF
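The wall-edge and button-stiffness sensations described above are typically rendered with a penalty-based force law: the device pushes back in proportion to how far the user's hand has penetrated a virtual surface. The sketch below shows that standard idea with made-up stiffness and positions; it is not the thesis's controller.

```python
def wall_force(position, wall=0.0, stiffness=500.0):
    """Virtual-wall force law (Hooke's law on penetration depth):
    push back proportionally when inside the wall, zero force in
    free space. Positions in meters, force in newtons."""
    penetration = wall - position      # positive when inside the wall
    return stiffness * penetration if penetration > 0 else 0.0

print(wall_force(-0.002))   # 2 mm inside the wall → restoring force
print(wall_force(0.010))    # free space → no force
```

In a real device this law runs in a high-rate control loop (commonly around 1 kHz) so the wall feels stiff rather than spongy.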

Vision-based Hand Gesture Detection and Tracking System (비전 기반의 손동작 검출 및 추적 시스템)

  • Park Ho-Sik;Bae Cheol-soo
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.30 no.12C
    • /
    • pp.1175-1180
    • /
    • 2005
  • We present a vision-based hand gesture detection and tracking system. Most conventional hand gesture recognition systems use a simple method for hand detection, such as background subtraction under assumed static observation conditions, and those methods are not robust against camera motion, illumination changes, and so on. Therefore, we propose a statistical method to recognize and detect hand regions in images using geometrical structures. In addition, our hand tracking system employs multiple cameras to reduce occlusion problems, and non-synchronous multiple observations enhance system scalability. In our experiments, the proposed method achieved a recognition rate of 99.28%, an improvement of 3.91% over the conventional appearance-based method.
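A common statistical approach to the detection step mentioned above is a per-pixel likelihood-ratio test over color histograms. The sketch below illustrates that generic technique with toy one-dimensional histograms; the bin values are invented and real systems use 2-D hue/saturation or 3-D RGB histograms.

```python
import numpy as np

# Toy normalized histograms over 4 quantized color bins
# (values are illustrative, not learned from data).
skin_hist    = np.array([0.05, 0.10, 0.60, 0.25])
nonskin_hist = np.array([0.40, 0.35, 0.15, 0.10])

def is_skin(bins):
    """Per-pixel likelihood-ratio test: classify a pixel as skin
    iff P(bin | skin) > P(bin | non-skin)."""
    bins = np.asarray(bins)
    return skin_hist[bins] > nonskin_hist[bins]

# Three pixels falling in bins 0, 2, and 3:
print(is_skin([0, 2, 3]))   # → [False  True  True]
```

Connected regions of skin-classified pixels are then filtered by shape to keep only hand-like regions.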

The Center of Hand Detection Using Geometric feature of Hand Image (손 이미지의 기하학적 특징을 이용한 중심 검출)

  • Kim, Min-Ha;Lee, Sang-Geol;Cho, Jae-Hyun;Cha, Eui-Young
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2012.07a
    • /
    • pp.311-313
    • /
    • 2012
  • In this paper, we propose a method for detecting the center of the hand using depth information from an RGBD (Red Green Blue Depth) sensor and geometric features of the hand image. The hand region is detected using the image's depth information and skin-color information. From the geometric information of the detected hand, a convex hull of the hand is formed. The center of the hand is then found using the positions of the vertices of the convex hull. The center of the hand can be used to track the hand's position, count the number of fingers, and so on. Such applications can be incorporated into systems based on human-computer interaction (HCI).

  • PDF
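The steps above (convex hull, then center from the hull vertices) can be sketched as follows; the point set is a toy example, and the hull construction uses Andrew's monotone chain rather than whatever routine the paper used.

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull; returns vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for seq, out in ((pts, lower), (reversed(pts), upper)):
        for p in seq:
            while len(out) >= 2 and cross(out[-2], out[-1], p) <= 0:
                out.pop()
            out.append(p)
    return lower[:-1] + upper[:-1]

def hand_center(points):
    """Center of the hand as the mean of the convex-hull vertices."""
    hull = convex_hull(points)
    n = len(hull)
    return (sum(x for x, _ in hull) / n, sum(y for _, y in hull) / n)

# A square of candidate hand pixels plus one interior point:
# the interior point affects neither the hull nor the center.
pts = [(0, 0), (4, 0), (4, 4), (0, 4), (2, 1)]
print(hand_center(pts))   # → (2.0, 2.0)
```

Using only hull vertices makes the center robust to interior noise in the segmented hand region.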

EEG Signals Measurement and Analysis Method for Brain-Computer Interface (뇌와 컴퓨터의 인터페이스를 위한 뇌파 측정 및 분석 방법)

  • Sim, Kwee-Bo;Yeom, Hong-Gi;Lee, In-Yong
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.18 no.5
    • /
    • pp.605-610
    • /
    • 2008
  • There are many methods for human-computer interfaces. Recently, many researchers have been studying brain signals, not only because disabled people could use a computer by thought alone, without their limbs, but also because it is convenient for the general population. However, such studies are at an early stage. This paper proposes an EEG signal measurement and analysis method for a brain-computer interface. The purpose of this research is to recognize a subject's intention when they imagine moving their arms. EEG signals are recorded during imagined movement of the subject's arms at electrode positions Fp1, Fp2, C3, and C4. We analyzed ERS (event-related synchronization) and ERD (event-related desynchronization), which are detected in the μ and β waves when people move their limbs. The results showed that μ waves decrease and β waves increase in the left brain during imagined movement of the right hand. In contrast, μ waves decrease and β waves increase in the right brain during imagined movement of the left hand.
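ERD/ERS as described above is usually quantified as the relative change in band power (e.g. in the μ band, roughly 8–12 Hz) between a baseline period and the motor-imagery period. The sketch below shows that computation on synthetic signals; the sampling rate, band edges, and test signals are illustrative, not the paper's recordings.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Power in a frequency band via the FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].sum()

def erd_percent(baseline, task, fs, lo=8.0, hi=12.0):
    """ERD/ERS as relative band-power change from baseline:
    negative = desynchronization (ERD), positive = ERS."""
    p0 = band_power(baseline, fs, lo, hi)
    p1 = band_power(task, fs, lo, hi)
    return 100.0 * (p1 - p0) / p0

fs = 250.0
t = np.arange(0, 2.0, 1.0 / fs)
baseline = np.sin(2 * np.pi * 10 * t)        # strong 10 Hz mu rhythm at rest
task = 0.5 * np.sin(2 * np.pi * 10 * t)      # attenuated during imagery
print(round(erd_percent(baseline, task, fs)))   # → -75
```

Halving the amplitude quarters the power, hence the 75% drop: a clear ERD.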

Hierarchical Hand Pose Model for Hand Expression Recognition (손 표현 인식을 위한 계층적 손 자세 모델)

  • Heo, Gyeongyong;Song, Bok Deuk;Kim, Ji-Hong
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.25 no.10
    • /
    • pp.1323-1329
    • /
    • 2021
  • For hand expression recognition, hand pose recognition based on the static shape of the hand and hand gesture recognition based on dynamic hand movement are used together. In this paper, we propose a hierarchical hand pose model based on finger position and shape for hand expression recognition. For hand pose recognition, a finger model representing the finger state and a hand pose model using the finger states are constructed hierarchically, based on the open-source MediaPipe. The finger model is itself hierarchical, built from the bending of a single finger and the touching of two fingers. The proposed model can be used for various applications that transmit information through the hands, and its usefulness was verified by applying it to number recognition in sign language. Beyond sign language recognition, the proposed model is expected to have various applications in computer user interfaces.
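The finger-bending state mentioned above can be derived from hand landmarks such as those produced by MediaPipe Hands. The sketch below shows one plausible test (the joint angle at the PIP joint); the threshold and coordinates are illustrative assumptions, not the paper's model.

```python
import math

def finger_bent(mcp, pip, tip, threshold_deg=120.0):
    """Classify one finger as bent when the angle at the PIP joint
    (between the PIP->MCP and PIP->TIP vectors) falls below a
    threshold. Landmarks are (x, y) pairs, e.g. the MCP, PIP, and
    TIP landmarks of a finger from MediaPipe Hands."""
    v1 = (mcp[0] - pip[0], mcp[1] - pip[1])
    v2 = (tip[0] - pip[0], tip[1] - pip[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    cos_angle = max(-1.0, min(1.0, dot / norm))   # clamp for acos
    return math.degrees(math.acos(cos_angle)) < threshold_deg

# Straight finger: joints nearly collinear (angle ~ 180 degrees).
print(finger_bent((0, 0), (0, 1), (0, 2)))       # → False
# Curled finger: tip folded back toward the palm.
print(finger_bent((0, 0), (0, 1), (0.9, 0.5)))   # → True
```

Combining per-finger bent/straight states with pairwise fingertip-touch tests yields the kind of hierarchical pose code the entry describes.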

NUI/NUX framework based on intuitive hand motion (직관적인 핸드 모션에 기반한 NUI/NUX 프레임워크)

  • Lee, Gwanghyung;Shin, Dongkyoo;Shin, Dongil
    • Journal of Internet Computing and Services
    • /
    • v.15 no.3
    • /
    • pp.11-19
    • /
    • 2014
  • The natural user interface/experience (NUI/NUX) provides a natural motion interface without devices or tools such as mice, keyboards, pens, or markers. Until now, typical motion recognition methods have used markers, receiving the coordinates of each marker as relative data and storing each coordinate value in a database. However, recognizing motion accurately requires more markers, and much time is spent attaching markers and processing the data. Also, because NUI/NUX frameworks have been developed without the most important element, intuitiveness, usability problems arise and users are forced to learn the usage of many different NUI/NUX frameworks. To compensate for this problem, in this paper we avoid markers and implement the framework so that anyone can operate it. We also designed a multi-modal NUI/NUX framework that controls voice, body motion, and facial expression simultaneously, and proposed a new algorithm for mouse operation that recognizes intuitive hand gestures and maps them onto the monitor. We implemented it so that users can perform the "hand mouse" operation easily and intuitively.
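The "hand mouse" mapping described above amounts to projecting a tracked hand position from camera coordinates onto monitor coordinates. A minimal sketch of such a mapping is shown below; the resolutions and the horizontal mirroring are illustrative assumptions, not the paper's algorithm.

```python
def hand_to_monitor(hand_x, hand_y, cam_w=640, cam_h=480,
                    mon_w=1920, mon_h=1080):
    """Map a hand position in camera coordinates onto monitor
    coordinates. The x axis is mirrored because the camera faces
    the user, so moving the hand right moves the cursor right."""
    u = 1.0 - hand_x / cam_w          # mirror the horizontal axis
    v = hand_y / cam_h
    return int(u * mon_w), int(v * mon_h)

print(hand_to_monitor(160, 120))   # → (1440, 270)
```

Real systems typically add smoothing and a dead zone so small tracking jitter does not move the cursor.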