• Title/Summary/Keyword: Head Tracker

Stereoscopic Eye-tracking System Based on a Moving Parallax Barrier

  • Chae, Ho-Byung; Lee, Gang-Sung; Lee, Seung-Hyun
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2009.01a / pp.189-192 / 2009
  • We present a novel head-tracking system for stereoscopic displays that allows the viewer a high degree of movement. The tracker segments the viewer from background objects using their relative distance: a depth camera generates a key signal for the head-tracking application. A moving parallax barrier is also introduced to overcome a disadvantage of the fixed parallax barrier, which permits viewing only at specific locations.

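The depth-keying idea in this abstract, separating the viewer from the background by relative distance and using the result as a key signal for head tracking, can be sketched roughly as follows. The depth band, the 40-row head window, and the centroid heuristic are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def head_key_signal(depth, near=0.5, far=1.5):
    """Segment the viewer from the background by relative distance.

    depth: HxW array of depths in metres (hypothetical depth-camera frame).
    Returns a binary key signal (mask) and a rough head position, or None
    if no foreground pixels fall inside the depth band.
    """
    mask = (depth > near) & (depth < far)      # viewer sits in this depth band
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return mask, None
    top = ys.min()                             # topmost foreground row ~ head
    band = xs[ys < top + 40]                   # columns within the head band
    head = (int(band.mean()), int(top))        # (x, y) of the head region
    return mask, head
```

The mask is the key signal; the head coordinate would then drive the moving-barrier controller.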

Real Time Eye and Gaze Tracking

  • Min Jin-Kyoung; Cho Hyeon-Seob
    • Proceedings of the KAIS Fall Conference / 2004.11a / pp.234-239 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.

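The GRNN mapping described in this abstract is, in essence, kernel-weighted regression from pupil parameters to screen coordinates, so it needs no analytical form of the mapping. A minimal sketch, assuming a plain radial-basis kernel and hypothetical calibration data:

```python
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma=0.3):
    """Generalized Regression Neural Network (Nadaraya-Watson) estimate.

    X_train: N x d pupil-parameter vectors (e.g. pupil-glint offset, ellipse
             ratio -- the exact features are assumptions here).
    Y_train: N x 2 screen coordinates observed during calibration.
    x:       d-vector of current pupil parameters.
    """
    d2 = np.sum((X_train - x) ** 2, axis=1)   # squared distances to exemplars
    w = np.exp(-d2 / (2.0 * sigma ** 2))      # RBF kernel weights
    return (w[:, None] * Y_train).sum(axis=0) / w.sum()
```

Because the estimate is a weighted average of stored exemplars, adding head-pose terms to the input vector accounts for head movement without changing the model, which is the property the abstract highlights.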

Implementation of Virtual Reality Engine Using Patriot Tracking Device

  • Kim Eun-Ju; Lee Yong-Woog; Song Chang-Geun
    • Proceedings of the Korea Information Processing Society Conference / 2006.05a / pp.143-146 / 2006
  • This study designs and implements a low-cost virtual reality game engine that can be installed on a personal PC. Implementing a virtual reality engine requires attaching the main input/output devices: a tracker, an HMD (Head Mounted Display), a joystick, and a mouse. We designed input/output classes to interface with the virtual reality engine, attached a mouse and a joystick as input devices and an HMD as the output device, and implemented the tracker using Polhemus's commercial Patriot tracker.


Real Time Eye and Gaze Tracking

  • Park Ho Sik; Nam Kee Hwan; Cho Hyeon Seob; Ra Sang Dong; Bae Cheol Soo
    • Proceedings of the IEEK Conference / 2004.08c / pp.857-861 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.


Development of Real-Time Vision-based Eye-tracker System for Head Mounted Display

  • Roh, Eun-Jung; Hong, Jin-Sung; Bang, Hyo-Choong
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.35 no.6 / pp.539-547 / 2007
  • In this paper, the development and testing of a real-time eye-tracker system are discussed. The system tracks a user's gaze point through eye movement by means of vision-based pupil detection, which has the advantage of detecting the exact positions of the user's eyes. An infrared camera and an LED are used to acquire the user's pupil image and to extract the pupil region, which is hard to extract with software alone, from the obtained image. We develop a pupil-tracking algorithm based on a Kalman filter and grab the pupil images using a DSP (Digital Signal Processing) system for real-time image processing. The real-time eye-tracker system tracks the movements of the user's pupils to project their gaze point onto a background image.
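The pupil-tracking step can be illustrated with a standard constant-velocity Kalman filter over the 2-D pupil centre. This is a generic sketch; the paper's state model, frame rate, and noise parameters are not given, so the values below are assumptions:

```python
import numpy as np

class PupilKalman:
    """Constant-velocity Kalman filter for a 2-D pupil centre (a sketch;
    not the paper's exact model or noise settings)."""

    def __init__(self, dt=1 / 30, q=1.0, r=4.0):
        self.x = np.zeros(4)                   # state: [px, py, vx, vy]
        self.P = np.eye(4) * 100.0             # large initial uncertainty
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt       # position advances by velocity
        self.H = np.zeros((2, 4))
        self.H[0, 0] = self.H[1, 1] = 1.0      # we measure position only
        self.Q = np.eye(4) * q                 # process noise
        self.R = np.eye(2) * r                 # measurement noise (pixels^2)

    def step(self, z):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update with the measured pupil centre z = (u, v)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                      # filtered pupil centre
```

Each video frame, the detected pupil centre is fed to `step`, and the smoothed position drives the gaze projection.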

A Time-multiplexed 3D Display Using Steered Exit Pupils

  • Brar, Rajwinder Singh; Surman, Phil; Sexton, Ian; Hopf, Klaus
    • Journal of Information Display / v.11 no.2 / pp.76-83 / 2010
  • This paper presents a multi-user autostereoscopic 3D display system constructed and operated by the authors using the time-multiplexing approach. This prototype has three main advantages over the authors' previous versions: the hardware is simplified, as only one optical array is used to create viewing regions in space; a lenticular multiplexing screen is not necessary, as images can be produced sequentially on a fast 120 Hz LCD at full resolution; and the holographic projector has been replaced with a high-frame-rate digital micromirror device (DMD) projector. The system consists of four major parts: a 120 Hz high-frame-rate DMD projector, a 49-element optical array, a 120 Hz screen assembly, and a multi-user head tracker. The display images for the left and right eyes are produced alternately on a 120 Hz direct-view LCD and are synchronized with the output of the projector, which acts as a backlight for the LCD. The novel steering optics, controlled by the multi-user head tracker, directs the projector output to regions referred to as exit pupils, located at the viewers' eyes. The display can be developed in a "hang-on-the-wall" form.

Improvement of Upload Traffic through Negotiation in P2P-based UCC Broadcasting System

  • Kim, Ji Hoon
    • Journal of Korea Society of Digital Industry and Information Management / v.10 no.3 / pp.171-179 / 2014
  • Among P2P-based multimedia streaming architectures, the multiple-chain architecture has the advantage of adapting simply and rapidly to a dynamically changing network topology, so it is used for UCC broadcasting systems. In a UCC broadcasting system, an ordinary peer attached to a DSLAM becomes the UCC server, unlike a broadcasting system that transfers data from ISP servers. UCC data generated by UCC server peers is therefore transmitted to other peers through the DSLAM, and this transmission consumes the DSLAM's uplink bandwidth. In this paper, I propose an efficient method for managing DSLAM uplink bandwidth through negotiation between the tracker and the UCC server peers or the head peers of a DSLAM: the tracker restricts the bitrate of a UCC server's uplink stream when the used uplink bandwidth of the DSLAM exceeds a certain fraction of its maximum. Numerical analysis and simulation show that the proposed scheme outperforms the general method with respect to DSLAM uplink bandwidth.
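The tracker-side restriction described above, capping a UCC server's uplink bitrate once the DSLAM's used uplink bandwidth passes a threshold, might look like the following sketch. The 80% threshold and the headroom rule are assumptions; the paper's actual negotiation protocol is not given:

```python
def allowed_bitrate(requested_kbps, used_kbps, max_uplink_kbps, threshold=0.8):
    """Tracker-side cap on a UCC server peer's upload stream (illustrative).

    Below `threshold` of the DSLAM's maximum uplink, the requested bitrate
    is granted unchanged; above it, the stream is limited to the remaining
    headroom so the DSLAM uplink is never oversubscribed.
    """
    if used_kbps <= threshold * max_uplink_kbps:
        return requested_kbps                  # plenty of headroom: no cap
    headroom = max(max_uplink_kbps - used_kbps, 0)
    return min(requested_kbps, headroom)
```

In a full system, the tracker would recompute this cap whenever a peer joins or leaves a DSLAM's chain.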

Motion and Structure Estimation Using Fusion of Inertial and Vision Data for Helmet Tracker

  • Heo, Se-Jong; Shin, Ok-Shik; Park, Chan-Gook
    • International Journal of Aeronautical and Space Sciences / v.11 no.1 / pp.31-40 / 2010
  • For weapon cueing and a Head-Mounted Display (HMD), it is essential to continuously estimate the motion of the helmet. The problem of estimating and predicting the position and orientation of the helmet is approached by fusing measurements from inertial sensors and a stereo vision system. The sensor fusion approach in this paper is based on nonlinear filtering, specifically the extended Kalman filter (EKF). To reduce computation time and improve vision-processing performance, we separate structure estimation from motion estimation: the structure estimation tracks features that belong to the helmet model structure in the scene, and the motion estimation filter estimates the position and orientation of the helmet. The algorithm is tested using synthetic and real data, and the results show that the sensor fusion is successful.
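The predict-with-inertial, correct-with-vision structure of such a fusion filter can be shown in one dimension. This scalar sketch only illustrates the Kalman-style alternation between the two sensor streams; the paper's actual filter estimates full 6-DOF position and orientation, and all values below are assumptions:

```python
def fuse_inertial_vision(gyro_rates, vision_meas, dt=0.01, q=1e-4, r=1e-2):
    """One-dimensional sketch of inertial/vision fusion:
    propagate helmet heading with the gyro (predict) and correct it with a
    vision-derived heading whenever one is available (update).

    gyro_rates:  angular rates (rad/s), one per time step.
    vision_meas: vision headings (rad) or None when no frame is available.
    """
    theta, P = 0.0, 1.0                 # heading estimate and its variance
    for w, z in zip(gyro_rates, vision_meas):
        theta += w * dt                 # inertial prediction
        P += q                          # prediction grows the uncertainty
        if z is not None:               # vision update (may arrive at lower rate)
            K = P / (P + r)             # Kalman gain
            theta += K * (z - theta)
            P *= (1.0 - K)
    return theta
```

The same alternation underlies the full EKF: the high-rate inertial data keeps the estimate continuous, while the slower vision measurements bound the drift.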

Real Time Eye and Gaze Tracking

  • Cho, Hyeon-Seob; Kim, Hee-Sook
    • Journal of the Korea Academia-Industrial cooperation Society / v.6 no.2 / pp.195-201 / 2005
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.
