• Title/Summary/Keyword: human tracking

Search Results: 655

Infrared LED Pointer for Interactions in Collaborative Environments (협업 환경에서의 인터랙션을 위한 적외선 LED 포인터)

  • Jin, Yoon-Suk;Lee, Kyu-Hwa;Park, Jun
    • Journal of the HCI Society of Korea
    • /
    • v.2 no.1
    • /
    • pp.57-63
    • /
    • 2007
  • We developed a new pointing device for human-computer interaction in collaborative environments based on a Tiled Display system. We focused mainly on tracking the position of an infrared light source and applying our system to various areas. Beyond the simple functionality of mouse pointing and clicking, the device helps people communicate better with the computer. A strong point of our system is that it can be deployed anywhere a camera can be installed. Because the system processes only infrared light, the computational overhead for LED recognition is very low. Furthermore, by analyzing the user's movement, various actions can be performed with greater convenience. The system was tested for presentation and game control.

  • PDF
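The threshold-and-locate pipeline implied by the abstract can be illustrated with a minimal sketch. This is a hypothetical fragment: the paper's camera interface, threshold value, and output mapping are not specified, and the centroid rule below is only the simplest plausible choice.

```python
import numpy as np

def track_ir_led(frame, threshold=200):
    """Locate the IR LED as the centroid of pixels above a brightness
    threshold. Because the IR-filtered camera sees (almost) only the
    LED, a fixed threshold keeps the per-frame cost very low."""
    ys, xs = np.nonzero(frame >= threshold)
    if xs.size == 0:
        return None  # no LED visible in this frame
    return (xs.mean(), ys.mean())  # (x, y) pointer position in pixels

# Synthetic 8-bit IR frame: dark background with one bright 3x3 spot.
frame = np.zeros((120, 160), dtype=np.uint8)
frame[50:53, 70:73] = 255
print(track_ir_led(frame))  # centroid of the spot: (71.0, 51.0)
```

In a real setup the returned pixel coordinates would still have to be warped into tiled-display coordinates via a homography obtained at installation time.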

Design of Korean eye-typing interfaces based on multilevel input system (단계식 입력 체계를 이용한 시선 추적 기반의 한글 입력 인터페이스 설계)

  • Kim, Hojoong;Woo, Sung-kyung;Lee, Kunwoo
    • Journal of the HCI Society of Korea
    • /
    • v.12 no.4
    • /
    • pp.37-44
    • /
    • 2017
  • Eye-typing is a human-computer input method driven by gaze location data. It is widely used as an input system for people with paralysis because it requires no physical motion other than eye movement. However, no eye-typing interface for Korean characters has been proposed yet, so this research aims to implement an eye-typing interface optimized for Korean. First, design objectives were established based on two features of eye-typing: significant noise and the Midas touch problem. A multilevel input system was introduced to deal with noise, and an area free of input buttons was applied to solve the Midas touch problem. Then, two eye-typing interfaces were designed around the phonology of Korean, in which each syllable is generated from a combination of several phonemes. Named the consonant-vowel integrated interface and the separated interface, the two interfaces input Korean in phases through grouped phonemes. Finally, evaluation consisted of comparative experiments against the conventional Double-Korean keyboard interface and an analysis of gaze flow. The newly designed interfaces showed potential as practical eye-typing tools.
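The multilevel idea (select a phoneme group first, then a phoneme within it) combined with an input-free resting area can be sketched as a dwell-based state machine. The button layout, dwell length, and phoneme groups below are invented for the sketch; the paper's actual parameters differ.

```python
# Hypothetical grouped-phoneme layout (not the paper's actual grouping).
GROUPS = [list("ㄱㄴㄷ"), list("ㅏㅓㅗ"), list("ㅁㅂㅅ")]

def eye_type(gaze_stream, dwell=3):
    """Consume a stream of gazed button indices (None = resting area)
    and emit phonemes. A selection fires when the same button is
    fixated for `dwell` consecutive samples; the interface then
    switches level, so noisy single-sample glances never type."""
    typed, level, group = [], 0, None
    last, count = None, 0
    for target in gaze_stream:
        if target is None:          # input-free area: avoids Midas touch
            last, count = None, 0
            continue
        count = count + 1 if target == last else 1
        last = target
        if count == dwell:
            if level == 0:
                group, level = target, 1          # level 1: group chosen
            else:
                typed.append(GROUPS[group][target])
                level, group = 0, None            # back to group level
            last, count = None, 0
    return "".join(typed)

# Fixate group 0, rest, then fixate phoneme 1 within it: types "ㄴ".
print(eye_type([0, 0, 0, None, 1, 1, 1]))
```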

A Study on Implementation of Ubiquitous Home Mess-Cleanup Robot (유비쿼터스 홈 메스클린업 로봇의 구현에 관한 연구)

  • Cha Hyun-Koo;Kim Seung-Woo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.11 no.12
    • /
    • pp.1011-1019
    • /
    • 2005
  • In this paper, a Ubiquitous Home Mess-Cleanup Robot (UHMR), which has the practical function of automatic mess clean-up, is developed. The vacuum cleaner lightened the burden of household chores, but operating one remained laborious. More recently, cleaning robots were produced to remove the cleaning labour entirely, yet they have not been successful either, because the problem of mess clean-up remains: picking up large trash and arranging newspapers, clothes, etc. A cleaning robot only vacuums dust and small trash; it has no function to arrange objects and take them away before the automatic vacuuming. For this reason, the market for cleaning robots has not yet been established, so a design method and algorithms for a new automatic machine are needed to solve the mess clean-up problem in the home. This requires agile automatic navigation and a novel manipulation system for mess clean-up. The automatic navigation system must be controlled to fully scan the living room, recognize the robot's absolute position and orientation, precisely track the desired path, and distinguish mess objects to clean up from obstacles to simply avoid. The manipulator, which is not needed in a vacuum-cleaning robot, must distinguish large trash to remove from mess objects to arrange, grasp objects according to their shape, and move them to destinations appropriate to each object. We use an RFID system to solve these problems and propose a reading algorithm for RFID tags installed on indoor objects and environments. The system should be intelligent enough that the mess-cleaning task can be performed autonomously in a wide variety of situations and environments, and it also needs entertainment functions for good communication between the human and the UHMR. Finally, the good performance of the designed UHMR is confirmed through mess clean-up and arrangement results.
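The role of the RFID tags can be sketched as a lookup from tag ID to a handling decision: arrange, dispose, or avoid. The tag IDs, categories, and action names below are hypothetical; the paper's actual tag-reading algorithm is not given in the abstract.

```python
# Hypothetical tag registry: tag ID -> (object name, handling category),
# as it might be registered when tags are attached around the home.
TAG_DB = {
    "tag-001": ("newspaper", "arrange"),
    "tag-002": ("sofa", "avoid"),
    "tag-003": ("snack-wrapper", "trash"),
}

ACTIONS = {
    "arrange": "grasp-and-arrange",   # mess object: move to its place
    "trash":   "grasp-and-dispose",   # big trash: take it away
    "avoid":   "plan-around",         # fixed obstacle: navigate around
}

def classify_detection(tag_id):
    """Map a detected RFID tag to the robot's next action; unknown
    tags are conservatively treated as obstacles to avoid."""
    _name, category = TAG_DB.get(tag_id, ("unknown", "avoid"))
    return ACTIONS[category]

print(classify_detection("tag-001"))  # grasp-and-arrange
print(classify_detection("tag-999"))  # plan-around (unknown tag)
```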

Hand Tracking Based Projection Mapping System and Applications (손 위치 트래킹 기반의 프로젝션 매핑 시스템 및 응용)

  • Lee, Cheongun;Park, Sanghun
    • Journal of the Korea Computer Graphics Society
    • /
    • v.22 no.4
    • /
    • pp.1-9
    • /
    • 2016
  • In this paper we present a system that projection-maps images onto a user's moving hand, using a projector as the information delivery medium and a Kinect to recognize hand motion. Most traditional projection mapping techniques project images onto stationary objects, whereas our system provides a new user experience by projecting images onto the center of the moving palm. We explain the development of the system and the production of content as applications on top of it. We propose the hardware organization and the development of an open software architecture based on object-oriented programming. For stable image projection, we describe a calibration method between the projector and the Kinect in three-dimensional space, and a denoising technique that minimizes artifacts from jitter in the Kinect coordinates and unstable hand tremor.
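A common way to realize the denoising step is a simple exponential (one-pole low-pass) filter on the tracked palm coordinates. The paper does not specify its filter, so the smoother below is only an assumed stand-in; `alpha` trades responsiveness against jitter suppression.

```python
class ExpSmoother:
    """Exponential smoothing of a 3-D hand coordinate stream:
    state = alpha * new_sample + (1 - alpha) * state.
    Small alpha suppresses Kinect jitter and hand tremor but adds lag."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None

    def update(self, point):
        if self.state is None:
            self.state = list(point)          # initialize on first sample
        else:
            self.state = [self.alpha * p + (1.0 - self.alpha) * s
                          for p, s in zip(point, self.state)]
        return tuple(self.state)

smoother = ExpSmoother(alpha=0.5)
print(smoother.update((0.0, 0.0, 0.0)))  # (0.0, 0.0, 0.0)
print(smoother.update((2.0, 2.0, 2.0)))  # (1.0, 1.0, 1.0)
```

The smoothed palm center would then be re-projected into the projector's image plane using the projector-Kinect calibration.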

Histogram Based Hand Recognition System for Augmented Reality (증강현실을 위한 히스토그램 기반의 손 인식 시스템)

  • Ko, Min-Su;Yoo, Ji-Sang
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.15 no.7
    • /
    • pp.1564-1572
    • /
    • 2011
  • In this paper, we propose a new histogram-based hand recognition algorithm for augmented reality. A hand recognition system enables useful interaction between a user and a computer. However, vision-based hand gesture recognition is difficult because of viewing-angle dependency and the complexity of human hand shapes. The hand recognition system proposed in this paper is based on features of hand geometry and consists of two steps: hand region extraction from the image captured by a camera, followed by hand gesture recognition. We first extract the hand region by removing the background and using skin color information, then recognize the hand shape by locating hand feature points in a histogram of the extracted hand region. Finally, we build an augmented reality system that controls a 3D object with the recognized hand gesture. Experimental results show that the proposed algorithm achieves more than 91% hand recognition accuracy with low computational cost.
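The two-step pipeline (skin-color segmentation, then a histogram of the hand region) can be sketched as follows. The RGB skin rule and the column-wise histogram are simplified stand-ins for the paper's actual skin model and feature-point detection.

```python
import numpy as np

def skin_mask(rgb):
    """Crude RGB skin-color rule (a stand-in for the paper's model):
    keep pixels that are bright, reddish, and not gray."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

def hand_histogram(mask):
    """Column-wise pixel counts of the segmented hand region; peaks in
    this histogram are candidate feature points such as extended
    fingers."""
    return mask.sum(axis=0)

# Synthetic 6x6 image: black background with a skin-colored block.
img = np.zeros((6, 6, 3), dtype=np.uint8)
img[1:5, 2:4] = (150, 80, 60)
print(hand_histogram(skin_mask(img)))  # [0 0 4 4 0 0]
```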

Real Time Gaze Discrimination for Human Computer Interaction (휴먼 컴퓨터 인터페이스를 위한 실시간 시선 식별)

  • Park Ho sik;Bae Cheol soo
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.30 no.3C
    • /
    • pp.125-132
    • /
    • 2005
  • This paper describes a computer vision system based on active IR illumination for real-time gaze discrimination. Unlike most existing gaze discrimination techniques, which often assume a static head and require a cumbersome per-user calibration process, our system performs robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping need not be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping generalizes to individuals not used in training. To further improve gaze estimation accuracy, we employ a reclassification scheme for classes that tend to be misclassified, which yields a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments involving a gaze-contingent interactive graphic display.
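The GRNN mapping described here is, at its core, a kernel-weighted average: predicted screen coordinates are training screen coordinates weighted by Gaussian similarity in pupil-parameter space. A minimal sketch (the feature vectors, sigma, and calibration data below are invented):

```python
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma=1.0):
    """Generalized regression neural network prediction: a Gaussian-
    kernel-weighted average of the training outputs. No analytical
    form of the pupil-to-screen mapping is assumed."""
    d2 = np.sum((X_train - x) ** 2, axis=1)       # squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))          # kernel weights
    return (w[:, None] * Y_train).sum(axis=0) / w.sum()

# Two calibration samples; a query halfway between them maps halfway.
X = np.array([[0.0], [1.0]])    # pupil parameters (1-D for the sketch)
Y = np.array([[0.0], [10.0]])   # corresponding screen coordinates
print(grnn_predict(X, Y, np.array([0.5])))  # [5.]
```

In the paper's setting the input vector would contain several pupil parameters (position, shape, glint offset), which is what lets head movement be absorbed by the mapping.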

Hand posture recognition robust to rotation using temporal correlation between adjacent frames (인접 프레임의 시간적 상관 관계를 이용한 회전에 강인한 손 모양 인식)

  • Lee, Seong-Il;Min, Hyun-Seok;Shin, Ho-Chul;Lim, Eul-Gyoon;Hwang, Dae-Hwan;Ro, Yong-Man
    • Journal of Korea Multimedia Society
    • /
    • v.13 no.11
    • /
    • pp.1630-1642
    • /
    • 2010
  • Recently, there is an increasing need for Hand Gesture Recognition (HGR) techniques for vision-based interfaces. Since a hand gesture is defined as a consecutive change of hand postures, a Hand Posture Recognition (HPR) algorithm is required. Among the factors that degrade HPR performance, we focus on rotation. To achieve rotation-invariant HPR, we propose a method that exploits the property of video that adjacent frames are highly correlated, considering the environment of HGR. The proposed method introduces the template update of object tracking based on this property, which differs from previous work on still images. To compare our method with previous methods such as template matching, PCA, and LBP, we performed experiments on video containing hand rotation. The accuracy of the proposed method is 22.7%, 14.5%, 10.7%, and 4.3% higher than ordinary template matching, template matching using the KL transform, PCA, and LBP, respectively.
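The template-update idea can be sketched as: match the current template in the frame, then blend the matched patch back into the template so it slowly adapts as the hand rotates. The SSD search and blend rate below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def match_and_update(frame, template, rate=0.2):
    """Exhaustive SSD search for the best-matching patch, followed by
    a template-update step: blending the matched patch into the
    template lets it track gradual in-plane hand rotation, because
    adjacent frames are highly correlated."""
    th, tw = template.shape
    best_ssd, best_pos = None, (0, 0)
    for y in range(frame.shape[0] - th + 1):
        for x in range(frame.shape[1] - tw + 1):
            ssd = float(np.sum((frame[y:y+th, x:x+tw] - template) ** 2))
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_pos = ssd, (y, x)
    y, x = best_pos
    updated = (1.0 - rate) * template + rate * frame[y:y+th, x:x+tw]
    return best_pos, updated

pattern = np.array([[1.0, 2.0], [3.0, 4.0]])
frame = np.zeros((8, 8))
frame[3:5, 2:4] = pattern        # hand posture appears at (row 3, col 2)
pos, new_template = match_and_update(frame, pattern)
print(pos)  # (3, 2)
```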

Sound Source Localization using HRTF database

  • Hwang, Sung-Mok;Park, Young-Jin;Park, Youn-Sik
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2005.06a
    • /
    • pp.751-755
    • /
    • 2005
  • We propose a sound source localization method using the Head-Related Transfer Function (HRTF), to be implemented on a robot platform. In conventional localization methods, the location of a sound source is estimated from the time delays of wave fronts arriving at each microphone in an array in the free field. For a human head this corresponds to the Interaural Time Delay (ITD), the time delay of incoming sound waves between the two ears. Although ITD is an excellent cue for lateral perception on the horizontal plane, confusion often arises when locating a sound from ITD alone, because each sound source and its mirror image about the interaural axis share the same ITD. On the other hand, the HRTFs of a dummy-head microphone system or a robot platform with several microphones contain not only the proper time delays but also phase and magnitude distortions due to diffraction and scattering by the shading object, such as the head and body of the platform. As a result, a set of HRTFs for a given platform provides substantial information about the whereabouts of the source once proper analysis is performed. In this study, we introduce new phase and magnitude criteria to be satisfied by the microphone output signals in order to find the sound source location, in accordance with an HRTF database obtained empirically in an anechoic chamber with the given platform. The suggested method is verified through an experiment in a household environment and its performance is compared against the conventional method.

  • PDF
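The ITD cue discussed above is conventionally estimated as the lag that maximizes the cross-correlation between the two ear signals. A minimal sketch on synthetic impulses (the sign convention depends on the correlation ordering, as noted in the comments):

```python
import numpy as np

def estimate_itd(left, right):
    """Estimate the interaural time delay (in samples) as the lag
    maximizing the cross-correlation of the two ear signals. With
    np.correlate(left, right), a negative lag means the sound reached
    the left microphone first. ITD alone leaves the front-back
    ambiguity that the HRTF phase/magnitude criteria must resolve."""
    corr = np.correlate(left, right, mode="full")
    return int(np.argmax(corr)) - (len(right) - 1)

# Impulse reaches the left mic 2 samples before the right mic.
left = np.zeros(16);  left[5] = 1.0
right = np.zeros(16); right[7] = 1.0
print(estimate_itd(left, right))  # -2
```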

A Development of Home Mess-Cleanup Robot

  • Cha, Hyun-Koo;Jang, Kyung-Jun;Im, Chan-Young;Kim, Seung-Woo
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2005.06a
    • /
    • pp.1612-1616
    • /
    • 2005
  • In this paper, a Home Mess-Cleanup Robot (HMR), which has the practical function of automatic mess clean-up, is developed. The vacuum cleaner lightened the burden of household chores, but operating one remained laborious. More recently, cleaning robots were produced to remove the cleaning labour entirely, yet they have not been successful either, because the problem of mess clean-up remains: picking up large trash and arranging newspapers, clothes, etc. A cleaning robot only vacuums dust and small trash; it has no function to arrange objects and take them away before the automatic vacuuming. For this reason, the market for cleaning robots has not yet been established, so a design method and algorithms for a new automatic machine are needed to solve the mess clean-up problem in the home. This requires agile automatic navigation and a novel manipulation system for mess clean-up. The automatic navigation system must be controlled to fully scan the living room, recognize the robot's absolute position and orientation, precisely track the desired path, and distinguish mess objects to clean up from obstacles to simply avoid. The manipulator, which is not needed in a vacuum-cleaning robot, must distinguish large trash to remove from mess objects to arrange, grasp objects according to their shape, and move them to destinations appropriate to each object. We use an RFID system to solve these problems and propose a reading algorithm for RFID tags installed on indoor objects and environments. The system should be intelligent enough that the mess-cleaning task can be performed autonomously in a wide variety of situations and environments, and it also needs entertainment functions for good communication between the human and the HMR. Finally, the good performance of the designed HMR is confirmed through mess clean-up and arrangement results.

  • PDF

Design and Implementation of Motion-based Interaction in AR Game (증강현실 게임에서의 동작 기반 상호작용 설계 및 구현)

  • Park, Jong-Seung;Jeon, Young-Jun
    • Journal of Korea Game Society
    • /
    • v.9 no.5
    • /
    • pp.105-115
    • /
    • 2009
  • This article proposes a design and implementation methodology for gesture-based interfaces in augmented reality games. Gesture-based augmented reality games are a promising direction for immersive future games driven by human body motion. However, due to the instability of current motion recognition technologies, most previous development processes introduced many ad hoc methods to handle the shortcomings, and the game architectures consequently became highly irregular and inefficient. This article proposes an efficient development methodology for gesture-based augmented reality games by prototyping a table tennis game with a gesture interface. We also verify the applicability of the prototyping mechanism by implementing and demonstrating the augmented reality table tennis game. In the experiments, the implemented prototype stably tracked real rackets, allowing fast movements and interaction without delay.

  • PDF
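One common way to achieve interaction "without delay" under fast racket motion is to extrapolate the tracked position ahead of the measured one. The abstract does not describe the prototype's internals, so the constant-velocity predictor below is purely an illustrative assumption.

```python
def predict_position(prev, curr, latency_frames=2):
    """Constant-velocity extrapolation of a tracked racket position to
    mask end-to-end tracking latency during fast swings (hypothetical;
    not necessarily the paper's actual method)."""
    return tuple(c + (c - p) * latency_frames for p, c in zip(prev, curr))

# Racket moved from (0, 0) to (1, 2) in one frame; predict 2 frames ahead.
print(predict_position((0.0, 0.0), (1.0, 2.0)))  # (3.0, 6.0)
```

The predicted position would then be used for collision tests against the virtual ball, trading a small overshoot on direction changes for responsiveness.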