• Title/Summary/Keyword: Head tracker

Real Time Eye and Gaze Tracking (실시간 눈과 시선 위치 추적)

  • 이영식;배철수
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.8 no.2
    • /
    • pp.477-483
    • /
    • 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real-time gaze tracking for interactive graphic display. Unlike most existing gaze tracking techniques, which often require a static head to work well and a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.
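The GRNN mapping described above is essentially kernel regression from pupil-parameter vectors to calibrated screen coordinates. Below is a minimal sketch of that idea; the feature layout and the smoothing factor sigma are illustrative assumptions, not values from the paper.

    import numpy as np

    class GRNN:
        """Generalized Regression Neural Network: kernel-weighted average of stored targets."""
        def __init__(self, sigma=0.1):
            self.sigma = sigma

        def fit(self, X, Y):
            # X: (N, d) pupil-parameter vectors, Y: (N, 2) calibrated screen coordinates
            self.X = np.asarray(X, dtype=float)
            self.Y = np.asarray(Y, dtype=float)
            return self

        def predict(self, x):
            # Gaussian kernel weight for every stored calibration sample
            d2 = np.sum((self.X - np.asarray(x, dtype=float)) ** 2, axis=1)
            w = np.exp(-d2 / (2.0 * self.sigma ** 2))
            w /= w.sum() + 1e-12
            return w @ self.Y          # weighted average of stored screen points

    # Usage: fit once on calibration pairs, then map each frame's pupil parameters
    # to a screen point without assuming an analytical gaze function.
    # gaze_xy = GRNN(sigma=0.15).fit(X_calib, Y_screen).predict(pupil_params)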

Science and Technology Satellite 2 (STSAT-2) System (과학기술위성 2호 시스템)

  • Lee, Seung-Hun;Park, Jong-Oh;Sim, Eun-Sup
    • Aerospace Engineering and Technology
    • /
    • v.4 no.2
    • /
    • pp.60-64
    • /
    • 2005
  • STSAT-2 will demonstrate the scientific mission (acquisition of the brightness temperature of the Earth at 23.8 GHz and 37 GHz) and spacecraft technologies (laser ranging, frame-type satellite structure, dual-head star tracker, CCD sun sensor, pulsed plasma thruster, etc.). In this paper, the STSAT-2 satellite system is described, including the definition of the system and an overview of the payloads and bus.

Aspects of a head-mounted eye-tracker based on a bidirectional OLED microdisplay

  • Baumgarten, Judith;Schuchert, Tobias;Voth, Sascha;Wartenberg, Philipp;Richter, Bernd;Vogel, Uwe
    • Journal of Information Display
    • /
    • v.13 no.2
    • /
    • pp.67-71
    • /
    • 2012
  • In today's mobile world, small and lightweight information systems are becoming increasingly important. Microdisplays are the base for several near-to-eye display devices, and the addition of an integrated image sensor significantly boosts the range of applications. This paper describes the basic building block of these systems: the bidirectional organic light-emitting diode (OLED) microdisplay. A small and lightweight optic design, an eye-tracking algorithm, and interaction concepts are also presented.

Attitude and Position Estimation of a Helmet Using Stereo Vision (스테레오 영상을 이용한 헬멧의 자세 및 위치 추정)

  • Shin, Ok-Shik;Heo, Se-Jong;Park, Chan-Gook
    • Journal of the Korean Society for Aeronautical & Space Sciences
    • /
    • v.38 no.7
    • /
    • pp.693-701
    • /
    • 2010
  • In this paper, an attitude and position estimation algorithm based on a stereo camera system is proposed for a helmet tracker. The stereo camera system consists of two CCD cameras, a helmet, infrared LEDs, and a frame grabber. Fifteen infrared LEDs serve as the feature points used to determine the attitude and position of the helmet. These features are arranged on the helmet in triangle patterns with different spacings. The vision-based attitude and position algorithm consists of feature segmentation, projective reconstruction, model indexing, and attitude estimation. In this paper, an attitude estimation algorithm using the unit quaternion (UQ) is proposed. The UQ guarantees that the rotation matrix is a unitary (orthonormal) matrix. The performance of the presented algorithm is verified by simulation and experiment.
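As a concrete illustration of the unit-quaternion attitude representation mentioned above, the short sketch below converts a quaternion to its rotation matrix; normalizing the quaternion is what guarantees the matrix stays orthonormal. The names and layout are assumptions for the example, not the authors' code.

    import numpy as np

    def quat_to_rotmat(q):
        """q = [w, x, y, z]; returns the 3x3 rotation matrix of the unit quaternion."""
        w, x, y, z = np.asarray(q, dtype=float) / np.linalg.norm(q)
        return np.array([
            [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
            [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
            [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
        ])

    # R @ R.T equals the identity (up to floating-point error), so the estimated
    # attitude is always a valid rotation even with noisy quaternion components.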

Analysis of Initial Activation and Checkout Results of Attitude Sensor Star Trackers for a LEO Satellite (저궤도 위성의 자세센서 별 추적기 초기 운용 분석)

  • Yim, Jo Ryeong;Choi, Hong-Taek
    • Aerospace Engineering and Technology
    • /
    • v.11 no.2
    • /
    • pp.87-95
    • /
    • 2012
  • This technical paper describes the analysis results of telemetry data from the initial activation of star trackers for an agile, high-accuracy low-Earth-orbit satellite. The satellite was recently launched and is in its launch and early operation phase. It uses two SED36 star trackers manufactured by SODERN. Each star tracker is composed of three parts, an optical head, an electronics box, and a baffle, with the optical head base plate maintained at 20 degC in order to achieve better low-frequency-error performance. This paper presents the initial activation status, requirements and performance, anomaly occurrences, and a noise equivalent angle performance analysis during the mission mode obtained by processing the telemetry data.
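As a rough illustration only (the paper's own procedure is not given here), one common way to gauge a star tracker's noise equivalent angle from telemetry is to detrend each attitude-angle channel and take the standard deviation of the residual:

    import numpy as np

    def nea_arcsec(angle_deg):
        """angle_deg: 1-D telemetry samples of one attitude angle, in degrees."""
        a = np.asarray(angle_deg, dtype=float)
        t = np.arange(a.size)
        trend = np.polyval(np.polyfit(t, a, 1), t)   # remove slow attitude motion
        return np.std(a - trend) * 3600.0            # high-frequency noise, in arcseconds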

Development of electric vehicle maintenance education ability using digital twin technology and VR

  • Lee, Sang-Hyun;Jung, Byeong-Soo
    • International Journal of Advanced Culture Technology
    • /
    • v.8 no.2
    • /
    • pp.58-67
    • /
    • 2020
  • In this paper, a maintenance training manual for electric vehicles (EVs) was produced using digital twin technology and various sensors such as IR-based lighthouse tracking and a head tracker. In addition, sensory content creation technology that provides high immersion to users through digital twin technology and VR was secured by implementing animations and effects suited to EV maintenance situations. The EV maintenance training manual takes the form of a training simulation built with 3D engine programming, real-time creation of 3D objects, minimization of screen obstacles, and selection of specific menus in virtual space. The content was also produced so that users can easily operate it, with automatic output to the head-mounted display (HMD) for EV maintenance and inspection. This technology can enhance user immersion through detailed scenarios for EV maintenance and inspection, 3D display of parts by procedure, and the realization of animations and effects for maintenance situations. The study was helpful for improving the quality of education, reducing safety accidents, and letting trainees naturally learn the correct maintenance process, how to use the equipment, and how to maintain EVs.

3D Facial Landmark Tracking and Facial Expression Recognition

  • Medioni, Gerard;Choi, Jongmoo;Labeau, Matthieu;Leksut, Jatuporn Toy;Meng, Lingchao
    • Journal of information and communication convergence engineering
    • /
    • v.11 no.3
    • /
    • pp.207-215
    • /
    • 2013
  • In this paper, we address the challenging computer vision problem of obtaining a reliable facial expression analysis from a naturally interacting person. We propose a system that combines a 3D generic face model, 3D head tracking, and a 2D tracker to track facial landmarks and recognize expressions. First, we extract facial landmarks from a neutral frontal face, and then we deform a 3D generic face to fit the input face. Next, we use our real-time 3D head tracking module to track a person's head in 3D and predict facial landmark positions in 2D using the projection from the updated 3D face model. Finally, we use the tracked 2D landmarks to update the 3D landmarks. This integrated tracking loop enables efficient tracking of the non-rigid parts of a face in the presence of large 3D head motion. We conducted experiments for facial expression recognition using both frame-based and sequence-based approaches. Our method provides a 75.9% recognition rate on 8 subjects with 7 key expressions. Our approach is a considerable step toward new applications including human-computer interaction, behavioral science, robotics, and games.
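The prediction step in the loop above, projecting the 3D face-model landmarks into the image with the current head pose, can be sketched as follows; the pinhole intrinsics and pose variables are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def project_landmarks(points_3d, R, t, fx=800.0, fy=800.0, cx=320.0, cy=240.0):
        """points_3d: (N, 3) model landmarks; R (3x3), t (3,): head pose; returns (N, 2) pixels."""
        cam = points_3d @ R.T + t                 # face-model frame -> camera frame
        u = fx * cam[:, 0] / cam[:, 2] + cx       # pinhole projection
        v = fy * cam[:, 1] / cam[:, 2] + cy
        return np.stack([u, v], axis=1)

    # Each frame, these predicted 2D positions initialize the 2D tracker; its refined
    # landmarks are then used to update the 3D landmarks, closing the tracking loop.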

HELIUM3D: A Laser-scanning Head-tracked Autostereoscopic Display

  • Brar, Rajwinder Singh;Surman, Phil;Sexton, Ian;Hopf, Klaus
    • Journal of Information Display
    • /
    • v.11 no.3
    • /
    • pp.100-108
    • /
    • 2010
  • A multi-user autostereoscopic display based on laser scanning is described in this paper. It does not require the wearing of special glasses; it can provide 3D to several viewers who have a large degree of freedom of movement; and it requires the display of only a minimum amount of information. The display operates by providing regions in the viewing field, referred to as "exit pupils," which follow the positions of the viewers' eyes under the control of a multi-user head tracker. The display incorporates an RGB laser illumination source that illuminates a light engine. The light directions are controlled by a spatial light modulator, and a front screen assembly incorporates a novel Gabor superlens. Its operating principle is explained in this paper, as is the construction of three iterations of the display. Finally, a method of developing the display into one that is suitable for television applications is described.
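As a toy geometric illustration (not the HELIUM3D optics), steering an exit pupil toward a tracked eye amounts to converting the head tracker's reported eye position into a deflection angle from the screen normal; the viewing geometry below is assumed for the sketch.

    import math

    def exit_pupil_angle_deg(eye_x_m, eye_z_m, screen_x_m=0.0):
        """Horizontal angle from the screen normal at screen_x_m toward the viewer's eye."""
        return math.degrees(math.atan2(eye_x_m - screen_x_m, eye_z_m))

    # An eye 0.3 m to the right of the screen point at a 1.5 m viewing distance:
    # exit_pupil_angle_deg(0.3, 1.5) -> about 11.3 degrees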

Tight Integration of Graphics and Motion in a VR Simulator with a 1-DoF Motion-Chair (1축 운동의자를 이용한 가상현실 시뮬레이터에서 그래픽과 운동의 통합)

  • 이종환;한순흥
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2001.10b
    • /
    • pp.262-264
    • /
    • 2001
  • We developed a motion chair that can reproduce 1-DoF rolling motion and a real-time 3D virtual-reality-based simulator, and studied the integration of graphics and motion. The visuals render a real-time 3D virtual environment, and the motion chair is synchronized to it. Visualization is performed with a head-mounted display and a 3-DoF tracker, and a joystick is used to navigate within the virtual environment. Through the 1-DoF motion chair, the rider can feel roll motion based on an aircraft dynamics model, and a washout filter is applied to maximize the perceived motion within the limited workspace. The developed simulator provides a prototype for general arcade simulator games and can be used for studies such as ride-feel research.
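A washout filter of the kind mentioned above is, in its simplest classical form, a high-pass filter: it passes the onset of roll commanded by the aircraft dynamics model but slowly returns the chair to neutral so the cues fit within the limited travel. A minimal first-order sketch, with an assumed cutoff frequency and sample time:

    import math

    def washout(roll_cmd, dt=0.01, cutoff_hz=0.5):
        """roll_cmd: roll angles from the dynamics model; returns chair roll angles."""
        tau = 1.0 / (2.0 * math.pi * cutoff_hz)
        alpha = tau / (tau + dt)                          # discrete high-pass coefficient
        out, prev_in, prev_out = [], roll_cmd[0], 0.0
        for x in roll_cmd:
            prev_out = alpha * (prev_out + x - prev_in)   # first-order high-pass step
            prev_in = x
            out.append(prev_out)
        return out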

Design of Gaming Interaction Control using Gesture Recognition and VR Control in FPS Game (FPS 게임에서 제스처 인식과 VR 컨트롤러를 이용한 게임 상호 작용 제어 설계)

  • Lee, Yong-Hwan;Ahn, Hyochang
    • Journal of the Semiconductor & Display Technology
    • /
    • v.18 no.4
    • /
    • pp.116-119
    • /
    • 2019
  • User interface/experience and realistic game manipulation play an important role in virtual-reality first-person-shooter (FPS) games. This paper presents an intuitive, hands-free gaming interaction scheme for FPS based on the user's gesture recognition and a VR controller. We focus on the conventional interface of VR FPS interaction and design the player interaction with a head-mounted display and two motion controllers: a Leap Motion to handle low-level physics interaction and a VIVE tracker to control the movement of the player's joints in the VR world. The FPS prototype system shows that the designed interface helps players enjoy an immersive FPS and gives them a new gaming experience.