• Title/Summary/Keyword: gaze tracker

Search results: 35 (processing time: 0.029 seconds)

Real Time Eye and Gaze Tracking

  • Min Jin-Kyoung; Cho Hyeon-Seob
    • Proceedings of the KAIS Fall Conference / 2004.11a / pp.234-239 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.

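The GRNN calibration mapping this abstract describes is, in essence, a Nadaraya-Watson kernel regression over stored calibration samples, which is why the mapping needs no analytical form. A minimal sketch in Python; the feature dimensionality, synthetic data, and bandwidth `sigma` are illustrative assumptions, not values from the paper:

```python
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma=0.2):
    """GRNN / Nadaraya-Watson prediction: a Gaussian-kernel-weighted
    average of the stored training targets."""
    d2 = np.sum((X_train - x) ** 2, axis=1)      # squared distances to query
    w = np.exp(-d2 / (2.0 * sigma ** 2))         # kernel weights
    if w.sum() == 0.0:                           # query far from all samples
        return Y_train.mean(axis=0)
    return (w[:, None] * Y_train).sum(axis=0) / w.sum()

# Hypothetical calibration set: 6-D pupil-parameter vectors -> 2-D screen coords.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 6))
W = rng.uniform(-1.0, 1.0, size=(6, 2))
Y = X @ W                                        # synthetic ground-truth mapping
print(grnn_predict(X, Y, X[0]), Y[0])            # prediction vs. true target
```

Because the prediction is just a weighted average over calibration samples, adding head-pose terms to the feature vector folds head movement into the same mapping, which is the property the abstract highlights.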

Evaluation of Gaze Depth Estimation using a Wearable Binocular Eye Tracker and Machine Learning

  • Shin, Choonsung; Lee, Gun; Kim, Youngmin; Hong, Jisoo; Hong, Sung-Hee; Kang, Hoonjong; Lee, Youngho
    • Journal of the Korea Computer Graphics Society / v.24 no.1 / pp.19-26 / 2018
  • In this paper, we propose a gaze depth estimation method based on a binocular eye tracker for virtual reality and augmented reality applications. The proposed method collects a wide range of information about each eye from the eye tracker, such as the pupil center, gaze direction, and inter-pupil distance. It then builds gaze estimation models using a multilayer perceptron, which infers gaze depth from the eye tracking information. Finally, we evaluated the gaze depth estimation method with 13 participants in two ways: performance based on their individual models and performance based on a generalized model. Through the evaluation, we found that the proposed method recognized gaze depth with 90.1% accuracy using the 13 individual models and with 89.7% accuracy using the generalized model covering all participants.
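The core idea, regressing gaze depth from binocular eye features with a multilayer perceptron, can be sketched as below. The network size, the synthetic vergence-based features, and the training scheme are illustrative assumptions, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for eye-tracker features: a vergence angle derived from
# inter-pupil distance and depth (small-angle approximation) plus nuisance
# features, mapped to gaze depth in metres.
n = 512
depth = rng.uniform(0.3, 3.0, size=(n, 1))       # target gaze depths (m)
ipd = 0.063                                      # assumed inter-pupil distance (m)
vergence = ipd / depth                           # vergence angle (rad)
x = np.hstack([vergence, rng.normal(0, 0.01, (n, 3))])
x = (x - x.mean(0)) / x.std(0)                   # standardise features

# One-hidden-layer MLP trained by plain batch gradient descent.
W1 = rng.normal(0, 0.5, (4, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    h = np.tanh(x @ W1 + b1)                     # forward pass
    y = h @ W2 + b2
    err = y - depth
    dh = (err @ W2.T) * (1.0 - h ** 2)           # backprop through tanh
    W2 -= lr * (h.T @ err) / n; b2 -= lr * err.mean(0)
    W1 -= lr * (x.T @ dh) / n;  b1 -= lr * dh.mean(0)

y = np.tanh(x @ W1 + b1) @ W2 + b2               # final predictions
mae = float(np.abs(y - depth).mean())
print(f"mean absolute depth error: {mae:.3f} m")
```

A per-user model is trained on one participant's recordings; the generalized model the abstract evaluates simply pools the training data of all participants.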

Real Time Eye and Gaze Tracking

  • Park Ho Sik; Nam Kee Hwan; Cho Hyeon Seob; Ra Sang Dong; Bae Cheol Soo
    • Proceedings of the IEEK Conference / 2004.08c / pp.857-861 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.


Real Time Eye and Gaze Tracking

  • Cho, Hyeon-Seob; Kim, Hee-Sook
    • Journal of the Korea Academia-Industrial cooperation Society / v.6 no.2 / pp.195-201 / 2005
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.


Real Time Eye and Gaze Tracking

  • 이영식; 배철수
    • Journal of the Korea Institute of Information and Communication Engineering / v.8 no.2 / pp.477-483 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.

Gaze Differences between Expert and Novice Teachers in Science Classes

  • Kim, Won-Jung; Byeon, Jung-Ho; Lee, Il-Sun; Kwon, Yong-Ju
    • Journal of The Korean Association For Science Education / v.32 no.9 / pp.1443-1451 / 2012
  • This study aims to investigate the gaze patterns of two expert and two novice teachers during one hour of lecture-type class. The teachers, recruited from the same middle school, each conducted a class while wearing an eye-tracker. Gaze rate and gaze movement patterns were analyzed. The scene the teachers faced in the classroom was categorized into three zones: the student zone, the material zone, and the non-teaching zone. The student zone was divided into nine areas of interest to examine the gaze distribution within it. Expert teachers showed focused gaze on the student zone, while novice teachers' gaze rate at the non-teaching zone was significantly higher than that of the expert teachers. Within the student zone, expert teachers' gaze spread to the rear areas, whereas novice teachers' gaze remained narrowly in the middle areas. This difference produced distinct eye movement patterns: a T pattern for experts and an I pattern for novices. On the other hand, both groups showed the lowest gaze rate toward the left and right front areas. What changes are required in teachers' gaze behavior, and what must be considered to achieve effective teacher gaze in the classroom setting, are discussed.
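The zone-based gaze-rate analysis this abstract describes can be sketched as follows; the zone rectangles and fixation samples are hypothetical, not the study's data:

```python
# Hypothetical scene zones in normalised coordinates: (x0, y0, x1, y1).
ZONES = {
    "student":      (0.0, 0.3, 1.0, 1.0),
    "material":     (0.0, 0.0, 0.5, 0.3),
    "non_teaching": (0.5, 0.0, 1.0, 0.3),
}

def zone_of(x, y):
    """First zone whose rectangle contains the point (checked in dict order)."""
    for name, (x0, y0, x1, y1) in ZONES.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "non_teaching"                        # fall-through default

def gaze_rates(fixations):
    """fixations: list of (x, y, duration_ms) -> {zone: % of total gaze time}."""
    totals = {name: 0.0 for name in ZONES}
    for x, y, dur in fixations:
        totals[zone_of(x, y)] += dur
    grand = sum(totals.values()) or 1.0
    return {name: 100.0 * t / grand for name, t in totals.items()}

sample = [(0.5, 0.8, 300), (0.2, 0.1, 200), (0.9, 0.1, 100)]
print(gaze_rates(sample))                        # % of gaze time per zone
```

Splitting the student-zone rectangle into a 3x3 grid of areas of interest and applying the same tally yields the within-zone distribution the study compares across teacher groups.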

The Effect of the Indication of Lengths and Angles on Classifying Triangles: Centering on Correct Answer Rate and Eye Movements

  • Yun, Ju Mi; Lee, Kwang-ho; Lee, Jae-Hak
    • Education of Primary School Mathematics / v.20 no.2 / pp.163-175 / 2017
  • The purpose of the study is to identify the effect of length and right-angle indication on understanding of figure concepts when presenting a task of classifying plane figures. We recorded thirty-three 4th-grade students' performance with eye-tracking technology and analyzed the correct answer rate and gaze duration. The findings were as follows. First, the correct answer rate increased and gaze duration decreased when lengths were marked on isosceles and equilateral triangles. Second, the correct answer rate increased and gaze duration decreased when right angles were marked on acute and obtuse triangles. Based on these results, when presenting a plane-figure classification task, indicating the lengths and angles makes it possible to focus on measuring students' understanding of figure concepts rather than their measuring ability.

User's Gaze Analysis for Improving Map Label Readability in Way-finding Situation

  • Moon, Seonggook; Hwang, Chul Sue
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.37 no.5 / pp.343-350 / 2019
  • Map labels are the most recognizable map elements to the human visual system because they are essentially natural language. In this study, an experiment was conducted using an eye-tracker to objectively record and analyze subjects' visual attention to map labels. A primary building object was identified by analyzing visit counts, average visit duration, fixation counts, and average fixation duration of each subject's gaze for areas of interest acquired with the eye-tracker. The unmarked rate of map labels in Google map, Naver map, and Daum map was then calculated. This rate exceeded fifty-one percent in every case, with the lowest rate recorded for Google map. The results of this study are expected to broaden research on the spatial-cognition approach to map labels, which is more helpful to users than the existing body of work on methods of label expression.
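The four AOI metrics this abstract names (visit counts, average visit duration, fixation counts, average fixation duration) can be computed from an ordered fixation stream roughly as follows; the AOI predicate and sample data are hypothetical:

```python
def aoi_metrics(fixations, in_aoi):
    """fixations: ordered list of (x, y, duration_ms); in_aoi: point predicate.
    A 'visit' is a maximal run of consecutive fixations inside the AOI."""
    visits, fix_count, total_dur = 0, 0, 0.0
    visit_durs, cur = [], 0.0
    inside_prev = False
    for x, y, dur in fixations:
        inside = in_aoi(x, y)
        if inside:
            fix_count += 1
            total_dur += dur
            cur += dur
            if not inside_prev:
                visits += 1              # a new visit begins on AOI entry
        elif inside_prev:
            visit_durs.append(cur)       # visit ends on AOI exit
            cur = 0.0
        inside_prev = inside
    if inside_prev:
        visit_durs.append(cur)           # close a visit still open at the end
    return {
        "visit_count": visits,
        "avg_visit_duration": sum(visit_durs) / visits if visits else 0.0,
        "fixation_count": fix_count,
        "avg_fixation_duration": total_dur / fix_count if fix_count else 0.0,
    }

aoi = lambda x, y: 0.4 <= x <= 0.6 and 0.4 <= y <= 0.6   # a building-label AOI
stream = [(0.5, 0.5, 200), (0.55, 0.5, 100), (0.9, 0.9, 150), (0.5, 0.45, 250)]
print(aoi_metrics(stream, aoi))
```

Ranking AOIs by these metrics is one way to identify a "primary" object of visual attention, as the study does for buildings.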

Method for Automatic Switching Screen of OST-HMD using Gaze Depth Estimation

  • Lee, Youngho; Shin, Choonsung
    • Smart Media Journal / v.7 no.1 / pp.31-36 / 2018
  • In this paper, we propose an automatic on/off switching method for an OST-HMD screen using gaze depth estimation. The proposed method uses a multilayer perceptron (MLP) to learn the user's gaze information and the corresponding object distance, and then estimates distance from new gaze input. In the learning phase, eye-related features are obtained using a wearable eye-tracker and entered into the MLP for training and model generation. In the inference step, eye-related features obtained from the eye tracker in real time are input to the MLP to obtain an estimated depth value. Finally, this estimate determines whether to turn the HMD display on or off. A prototype was implemented and experiments were conducted to evaluate the feasibility of the proposed method.
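The final on/off decision can be sketched as a threshold rule with hysteresis, so that a noisy depth estimate does not make the display flicker near the boundary; the threshold values are illustrative assumptions, not from the paper:

```python
# Hypothetical switching thresholds (metres). Using two thresholds gives
# hysteresis: the display state only changes once the estimate clearly
# crosses out of the current regime.
ON_BELOW_M = 1.0      # turn display on when gaze depth drops below this
OFF_ABOVE_M = 1.5     # turn display off when gaze depth rises above this

def update_display(depth_m, display_on):
    """One switching step given the latest estimated gaze depth."""
    if display_on and depth_m > OFF_ABOVE_M:
        return False
    if not display_on and depth_m < ON_BELOW_M:
        return True
    return display_on                     # inside the hysteresis band: hold

state = False
for d in [2.0, 0.8, 1.2, 1.6, 0.9]:      # a stream of depth estimates
    state = update_display(d, state)
    print(f"depth={d:.1f} m -> display {'on' if state else 'off'}")
```

In the sequence above, the 1.2 m estimate falls between the two thresholds, so the display holds its previous (on) state rather than toggling.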

Development of Real-Time Vision-based Eye-tracker System for Head Mounted Display

  • Roh, Eun-Jung; Hong, Jin-Sung; Bang, Hyo-Choong
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.35 no.6 / pp.539-547 / 2007
  • In this paper, the development and testing of a real-time eye-tracker system are discussed. The system tracks a user's gaze point through eye movement by means of vision-based pupil detection, which has the advantage of locating the exact positions of the user's eyes. An infrared camera acquires the user's pupil image, and infrared LED illumination makes it possible to extract the pupil region, which is hard to isolate with software alone, from the obtained image. We developed a pupil-tracking algorithm based on a Kalman filter and grabbed the pupil images with a DSP (Digital Signal Processing) system for real-time image processing. The real-time eye-tracker system tracks the movement of the user's pupils to project their gaze point onto a background image.
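Kalman-filter pupil tracking of the kind this abstract describes is commonly built on a constant-velocity motion model over the detected pupil centre. A minimal sketch; the frame rate, noise covariances, and synthetic measurements are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

dt = 1.0 / 30.0                                   # assumed ~30 fps camera
F = np.array([[1.0, 0.0, dt,  0.0],               # state: [x, y, vx, vy]
              [0.0, 1.0, 0.0, dt ],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
H = np.array([[1.0, 0.0, 0.0, 0.0],               # only (x, y) is measured
              [0.0, 1.0, 0.0, 0.0]])
Q = np.eye(4) * 1e-4                              # process noise covariance
R = np.eye(2) * 4.0                               # measurement noise (px^2)

def kalman_step(x, P, z):
    x = F @ x                                     # predict state forward
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                           # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x = x + K @ (z - H @ x)                       # correct with measurement z
    P = (np.eye(4) - K @ H) @ P
    return x, P

rng = np.random.default_rng(2)
x, P = np.zeros(4), np.eye(4) * 10.0
truth = np.array([100.0, 120.0])                  # stationary pupil centre (px)
for _ in range(100):
    z = truth + rng.normal(0.0, 2.0, 2)           # noisy pupil detections
    x, P = kalman_step(x, P, z)
print(x[:2])                                      # filtered pupil centre
```

The predict step also supplies a search window for the next frame's pupil detection, which is the usual reason for pairing detection with a Kalman filter in real-time trackers.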