• Title/Summary/Keyword: eye gaze tracking


A Study on the Interest of the Eyes Applying Gazing Phenomena - Based on an Eye-tracking Experiment Carried with a Facade as a Medium - (주시현상을 적용한 시선의 관심도 연구 - 파사드를 매개로 한 아이트래킹 실험 중심으로 -)

  • Yeo, Mi;Lee, Chang No
    • Korean Institute of Interior Design Journal / v.23 no.1 / pp.122-131 / 2014
  • This study conducted an eye-tracking experiment using facade images as stimuli and investigated 'the interest of the eyes' arising from people's gazing behavior. Gaze data recorded as visual responses were analyzed to establish a basic theory of 'the interest of the eyes' as a methodological tool, so that consumer interest and attention can be incorporated when planning and designing spaces. Terms used in eye-tracking research and the mechanics of eye movement were reviewed in the literature. Twenty facade images were selected through a case study as experimental stimuli, and thirty subjects (men and women) were recruited for the eye-tracking experiment. After the experiment, five areas were defined in each facade image to measure the level of interest and attention, which was linked to the interest of the eyes. The analysis was based on nine metrics: sequence, entry time, dwell time, hit ratio, revisits, revisitors, average fixation, first fixation, and fixation count. From this gaze analysis, the following conclusion was drawn about the interest level of sight: interest can be interpreted as higher, and as having been felt faster and/or more strongly, with an earlier sequence, shorter entry time, longer total fixation time (ms) within the dwell time, a higher saccade share (%), higher hit ratio, more revisits, more revisitors, longer average fixation, earlier and longer first fixation, and a higher fixation count.
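Several of the nine AOI metrics named in this abstract (entry time, dwell time, revisits, average fixation, fixation count) can be sketched from a fixation log. The `Fixation` record and the sample data below are hypothetical, not taken from the study's actual tooling.

```python
# Sketch: computing AOI gaze metrics from a fixation log.
from dataclasses import dataclass

@dataclass
class Fixation:
    t_start_ms: int   # onset relative to stimulus onset
    dur_ms: int       # fixation duration
    aoi: str          # AOI label the fixation landed in ("" = none)

def aoi_metrics(fixations, aoi):
    hits = [f for f in fixations if f.aoi == aoi]
    if not hits:
        return None
    entry_time = hits[0].t_start_ms              # time until first hit
    dwell = sum(f.dur_ms for f in hits)          # total dwell time
    # a revisit = re-entering the AOI after having left it
    entries, prev_in = 0, False
    for f in fixations:
        now_in = f.aoi == aoi
        if now_in and not prev_in:
            entries += 1
        prev_in = now_in
    return {"entry_time_ms": entry_time,
            "dwell_ms": dwell,
            "avg_fixation_ms": dwell / len(hits),
            "fixation_count": len(hits),
            "revisits": max(0, entries - 1)}     # first entry is not a revisit

log = [Fixation(0, 200, "A"), Fixation(210, 150, "B"),
       Fixation(380, 300, "A"), Fixation(700, 120, "A")]
print(aoi_metrics(log, "A"))
```

By these definitions a shorter entry time and larger dwell, revisit, and fixation-count values all signal higher interest in the AOI, matching the interpretation above.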

Compensation for Fast Head Movements on Non-intrusive Eye Gaze Tracking System Using Kalman Filter (Kalman filter를 이용한 비접촉식 응시점 추정 시스템에서의 빠른 머리 이동의 보정)

  • Kim, Soo-Chan;Yoo, Jae-Ha;Kim, Deok-Won
    • Journal of the Institute of Electronics Engineers of Korea SC / v.44 no.6 / pp.35-41 / 2007
  • We propose an eye gaze tracking system that works under natural head movements. The system consists of one CCD (charge-coupled device) camera and two front-surface mirrors. The mirrors rotate to follow head movements so that the eye stays within the camera's view. However, the mirror controller alone cannot keep up with fast head movements, because the frame rate is only 30 Hz. To overcome this problem, we applied a Kalman filter to predict the next eye position from the current eye image. As a result, our system allowed the subject's head to move 60 cm horizontally and 40 cm vertically, at speeds of about 55 cm/sec and 45 cm/sec, respectively. The spatial gaze resolutions were about 4.5 and 5.0 degrees, respectively, and the gaze estimation accuracy was 92% under natural head movements.
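The prediction step described here is a standard constant-velocity Kalman filter: the filter estimates eye position and velocity, and the one-frame-ahead prediction is what steers the mirrors. The sketch below implements one axis; the noise parameters and 30 Hz frame period are illustrative, not the authors' values.

```python
# Minimal 1-D constant-velocity Kalman filter for predicting the next
# eye position one camera frame ahead. State is [position, velocity].
class ConstVelKalman1D:
    def __init__(self, dt=1.0 / 30, q=50.0, r=4.0):
        self.dt = dt                        # frame period (30 Hz camera)
        self.q = q                          # process-noise intensity
        self.r = r                          # measurement-noise variance
        self.x = [0.0, 0.0]                 # state: [position, velocity]
        self.P = [[1e3, 0.0], [0.0, 1e3]]   # state covariance

    def predict(self):
        dt, q = self.dt, self.q
        x, v = self.x
        self.x = [x + v * dt, v]            # constant-velocity motion model
        (p00, p01), (p10, p11) = self.P
        # P <- F P F^T + Q, with F = [[1, dt], [0, 1]]
        self.P = [[p00 + dt * (p01 + p10) + dt * dt * p11 + q * dt ** 4 / 4,
                   p01 + dt * p11 + q * dt ** 3 / 2],
                  [p10 + dt * p11 + q * dt ** 3 / 2,
                   p11 + q * dt * dt]]
        return self.x[0]                    # predicted position for the mirrors

    def update(self, z):
        # measurement model H = [1, 0]: the camera observes position only
        (p00, p01), (p10, p11) = self.P
        s = p00 + self.r                    # innovation variance
        k0, k1 = p00 / s, p10 / s           # Kalman gain
        y = z - self.x[0]                   # innovation
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        self.P = [[(1 - k0) * p00, (1 - k0) * p01],
                  [p10 - k1 * p00, p11 - k1 * p01]]

kf = ConstVelKalman1D()
for i in range(5):
    kf.predict()
    kf.update(2.0 * i)          # measured eye positions, ~2 px per frame
next_pos = kf.predict()         # one-frame-ahead estimate for mirror control
```

Because the prediction runs a frame ahead of the measurement, the mirror command compensates for the one-frame latency that a purely reactive 30 Hz controller would have.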

A Study on Gaze Tracking Based on Pupil Movement, Corneal Specular Reflections and Kalman Filter (동공 움직임, 각막 반사광 및 Kalman Filter 기반 시선 추적에 관한 연구)

  • Park, Kang-Ryoung;Ko, You-Jin;Lee, Eui-Chul
    • The KIPS Transactions:PartB / v.16B no.3 / pp.203-214 / 2009
  • In this paper, the user's gaze position is computed simply from the 2-D relations between the pupil center and four corneal specular reflections formed by four IR illuminators attached to the corners of a monitor, without modeling the complex 3-D relations among the camera, the monitor, and the pupil coordinates. The objectives are therefore to detect the pupil center and the four corneal specular reflections accurately and to compensate for the error factors that affect gaze accuracy. Our method compensates for the kappa error between the gaze position calculated through the pupil center and the actual gaze vector, using a one-time user calibration performed when the system starts. We also robustly detect the four corneal specular reflections, which are essential for calculating the gaze position, using a Kalman filter that is unaffected by abrupt eye movements. Experimental results showed a gaze detection error of about 1.0 degree even under abrupt eye movement.
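The core 2-D idea can be sketched as follows: the four glints are virtual images of the four monitor-corner illuminators, so the pupil center's position inside the glint quadrilateral maps to a screen position. For simplicity this sketch treats the quadrilateral as an axis-aligned rectangle; the paper's actual mapping and its kappa-angle calibration are more involved.

```python
# Sketch: map the pupil center's position inside the quadrilateral of
# four corneal glints to a monitor coordinate, assuming the glints
# approximately form an axis-aligned rectangle in the eye image.
def gaze_from_glints(pupil, glints, screen_w=1920, screen_h=1080):
    """pupil: (x, y) pupil center; glints: four (x, y) corneal reflections."""
    xs = [g[0] for g in glints]
    ys = [g[1] for g in glints]
    left, right = min(xs), max(xs)
    top, bottom = min(ys), max(ys)
    u = (pupil[0] - left) / (right - left)   # 0..1 across the monitor width
    v = (pupil[1] - top) / (bottom - top)    # 0..1 down the monitor height
    return (u * screen_w, v * screen_h)

glints = [(100, 80), (140, 80), (100, 110), (140, 110)]
print(gaze_from_glints((120, 95), glints))   # pupil centered -> screen center
```

Because the mapping is expressed relative to the glints rather than absolute image coordinates, small head translations shift pupil and glints together and largely cancel out, which is why this family of methods tolerates modest head movement.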

Technology Development for Non-Contact Interface of Multi-Region Classifier based on Context-Aware (상황 인식 기반 다중 영역 분류기 비접촉 인터페이스기술 개발)

  • Jin, Songguo;Rhee, Phill-Kyu
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.20 no.6 / pp.175-182 / 2020
  • Non-contact eye tracking is a nonintrusive human-computer interface that provides hands-free communication for people with severe disabilities, and it is expected to play an important role in non-contact systems in the wake of COVID-19. This paper proposes a novel eye-mouse approach using an eye tracking method based on a context-aware AdaBoost multi-region classifier and an ASSL algorithm. The conventional AdaBoost algorithm cannot provide sufficiently reliable face tracking performance for eye cursor pointing estimation, because it cannot exploit the spatial context relations among facial features. We therefore propose an eye-region-context-based AdaBoost multiple classifier for efficient non-contact gaze tracking and mouse implementation. The proposed method detects, tracks, and aggregates various eye features to estimate the gaze, and adjusts active and semi-supervised learning based on the on-screen cursor. The system has been successfully employed for eye localization and can also be used to detect and track eye features. It moves the computer cursor along the user's gaze; the cursor is tracked in real time with a Kalman filter and post-processed with Gaussian modeling to prevent shaking. Target objects were generated at random positions, and the eye tracking performance was analyzed in real time according to Fitts' law. The results suggest that such non-contact interfaces can be put to practical use.
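The Fitts'-law evaluation mentioned above works by assigning each randomly generated target an index of difficulty ID = log2(D/W + 1) (the Shannon formulation, with D the distance to the target and W its width) and regressing the observed movement time as MT = a + b * ID. A sketch with made-up trial values:

```python
# Sketch: Fitts'-law analysis of pointing trials (distance, width, time).
import math

def index_of_difficulty(d, w):
    return math.log2(d / w + 1)          # Shannon formulation

def fit_fitts(trials):
    """trials: list of (distance, width, movement_time_ms).
    Returns (a, b) from a least-squares fit MT = a + b * ID."""
    ids = [index_of_difficulty(d, w) for d, w, _ in trials]
    mts = [mt for _, _, mt in trials]
    n = len(trials)
    mean_id = sum(ids) / n
    mean_mt = sum(mts) / n
    b = (sum((i - mean_id) * (m - mean_mt) for i, m in zip(ids, mts))
         / sum((i - mean_id) ** 2 for i in ids))
    a = mean_mt - b * mean_id
    return a, b

trials = [(256, 32, 620), (512, 32, 810), (512, 64, 660), (768, 32, 940)]
a, b = fit_fitts(trials)                 # harder targets -> longer times
```

The slope b (ms per bit) characterizes the pointing device: a lower b means the eye mouse handles harder targets with a smaller time penalty.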

Human-Computer Interface using the Eye-Gaze Direction (눈의 응시 방향을 이용한 인간-컴퓨터간의 인터페이스에 관한 연구)

  • Kim, Do-Hyoung;Kim, Jea-Hean;Chung, Myung-Jin
    • Journal of the Institute of Electronics Engineers of Korea CI / v.38 no.6 / pp.46-56 / 2001
  • In this paper we propose an efficient approach to real-time eye-gaze tracking from image sequences and magnetic sensor information. The inputs to the eye-gaze tracking system are images taken by a camera and data from a magnetic sensor. Because the camera and the receiver of the magnetic sensor are stationary with respect to the head, the measured data are sufficient to describe both eye and head movement. Experimental results show the validity of the proposed system for real-time applications and its feasibility as a new computer interface replacing the mouse.
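The fusion idea here is that the magnetic sensor supplies head orientation while the head-mounted camera supplies the eye-in-head gaze direction; composing the two gives the gaze direction in world coordinates. A minimal sketch, with an illustrative yaw/pitch convention that is an assumption, not the paper's:

```python
# Sketch: compose head orientation (magnetic sensor) with the
# eye-in-head gaze vector (camera) to get a world gaze direction.
import math

def rot_yaw_pitch(yaw, pitch):
    """Rotation R = R_yaw(about y) @ R_pitch(about x)."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    return [[cy, sy * sp, sy * cp],
            [0.0, cp, -sp],
            [-sy, cy * sp, cy * cp]]

def world_gaze(head_yaw, head_pitch, eye_dir):
    """eye_dir: unit gaze vector in the head frame (z = straight ahead)."""
    R = rot_yaw_pitch(head_yaw, head_pitch)
    return tuple(sum(R[i][j] * eye_dir[j] for j in range(3)) for i in range(3))

# head turned 30 deg right, eyes looking straight ahead in the head frame
g = world_gaze(math.radians(30), 0.0, (0.0, 0.0, 1.0))
```

Since both sensors ride on the head, neither measurement is disturbed by head translation; only the orientation composition above is needed to recover where the user is looking.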


Pilot Gaze Tracking and ILS Landing Result Analysis using VR HMD based Flight Simulators (VR HMD 시뮬레이터를 활용한 조종사 시선 추적 및 착륙 절차 결과 분석)

  • Jeong, Gu Moon;Lee, Youngjae;Kwag, TaeHo;Lee, Jae-Woo
    • Journal of the Korean Society for Aviation and Aeronautics / v.30 no.1 / pp.44-49 / 2022
  • In this study, pilots holding a commercial pilot license performed precision instrument landing procedures in VR HMD flight simulators. Assuming that the center of the pilot's gaze lies straight ahead, gaze tracking was performed using the 3-DOF head tracking data of the VR HMD worn by the pilot together with 2-D eye tracking. AOIs (Areas of Interest) were then defined over the cockpit instrument panel and the external field of view to analyze how the pilot's gaze was distributed before and after the decision altitude. At the same time, the landing results were analyzed using localizer and G/S data as the pilot's precision instrument landing flight data. As a result, the pilot could be evaluated quantitatively with a VR HMD simulator by combining the gaze tracking with the resulting landing performance.
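The before/after-decision-altitude AOI comparison can be sketched as classifying each gaze sample into a rectangular AOI, splitting the samples at the decision altitude, and comparing dwell proportions. The AOI rectangles and sample format below are hypothetical.

```python
# Sketch: per-phase AOI dwell shares, split at the decision altitude.
AOIS = {
    "instruments": (0.0, 0.5, 1.0, 1.0),   # (x0, y0, x1, y1), normalized view
    "outside":     (0.0, 0.0, 1.0, 0.5),
}

def classify(x, y):
    for name, (x0, y0, x1, y1) in AOIS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return "none"

def dwell_shares(samples, decision_alt):
    """samples: (altitude_ft, gaze_x, gaze_y) per frame.
    Returns the fraction of samples in each AOI, above and below DA."""
    phases = {"above_DA": {}, "below_DA": {}}
    for alt, x, y in samples:
        phase = "above_DA" if alt > decision_alt else "below_DA"
        aoi = classify(x, y)
        phases[phase][aoi] = phases[phase].get(aoi, 0) + 1
    for counts in phases.values():
        total = sum(counts.values()) or 1
        for k in counts:
            counts[k] /= total
    return phases

samples = [(500, 0.5, 0.8), (400, 0.5, 0.7),   # above DA: on instruments
           (150, 0.5, 0.2), (100, 0.4, 0.3)]   # below DA: outside view
print(dwell_shares(samples, decision_alt=200))
```

A shift of dwell share from the instrument AOI toward the outside-view AOI after passing the decision altitude is exactly the kind of pattern this analysis is meant to surface.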

Design and Implementation of Eye-Gaze Estimation Algorithm based on Extraction of Eye Contour and Pupil Region (눈 윤곽선과 눈동자 영역 추출 기반 시선 추정 알고리즘의 설계 및 구현)

  • Yum, Hyosub;Hong, Min;Choi, Yoo-Joo
    • The Journal of Korean Association of Computer Education / v.17 no.2 / pp.107-113 / 2014
  • In this study, we design and implement an eye-gaze estimation system based on extraction of the eye contour and pupil region. To extract these effectively, face candidate regions are extracted first. For face detection, a YCbCr value range for typical Asian skin color was defined from a preliminary study of Asian face images. The largest skin-color region was taken as the face candidate region, and the eye regions were extracted by applying contour and color feature analysis to the upper 50% of that region. Each detected eye region was divided into three segments and the pupil pixels in each segment were counted. The eye gaze was classified into one of three directions, left, center, or right, according to the number of pupil pixels in the three segments. In experiments using 5,616 images of 20 test subjects, the gaze was estimated with about 91% accuracy.
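The three-segment classification step can be sketched directly: split the cropped eye region into left/center/right thirds, count dark (pupil) pixels in each, and report the direction of the segment with the most. The toy binary image and threshold below are illustrative, and the sketch ignores camera mirroring, which would flip left and right.

```python
# Sketch: classify gaze direction from pupil-pixel counts in three
# vertical segments of the cropped eye region.
def gaze_direction(eye_img, pupil_threshold=60):
    """eye_img: 2-D list of grayscale values for the cropped eye region."""
    w = len(eye_img[0])
    thirds = [0, 0, 0]                      # left / center / right counts
    for row in eye_img:
        for x, px in enumerate(row):
            if px < pupil_threshold:        # dark pixel -> pupil
                thirds[min(2, x * 3 // w)] += 1
    return ("left", "center", "right")[max(range(3), key=thirds.__getitem__)]

# toy 3x9 eye crop: dark pixels clustered in the right third
eye = [[200] * 6 + [30] * 3 for _ in range(3)]
print(gaze_direction(eye))
```

Counting pixels per segment rather than fitting the pupil precisely keeps the method cheap, which is consistent with the coarse three-way output the study reports.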


Measuring Visual Attention Processing of Virtual Environment Using Eye-Fixation Information

  • Kim, Jong Ha;Kim, Ju Yeon
    • Architectural research / v.22 no.4 / pp.155-162 / 2020
  • Numerous scholars have explored the modeling, control, and optimization of energy systems in buildings, offering new insights about technology and environments that can advance industry innovation. Eye trackers deliver objective gaze data about visual and attentional processes. Thanks to its flexibility, accuracy, and efficiency, eye tracking now offers control schemes that make it possible to measure rapid eye movement in three-dimensional space (e.g., virtual reality, augmented reality). Because eye movement is an effective modality for digital interaction with a virtual environment, tracking how users scan a visual field and fixate on digital objects can help designers optimize building environments and materials. Although several scholars have conducted virtual reality studies in three-dimensional space, there is no agreed-upon way to analyze the resulting eye tracking data. We conducted eye tracking experiments with objects in three-dimensional space to find an objective way to process quantitative visual data. Applying a 12 × 12 grid framework for eye tracking analysis, we investigated how people gazed at objects in a virtual space while wearing a head-mounted display. The findings provide an empirical base for a standardized protocol for analyzing eye tracking data in virtual environments.
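A grid framework like the 12 × 12 one mentioned above amounts to binning normalized gaze samples into grid cells and using the per-cell counts as a fixation heatmap. The sample coordinates below are made up.

```python
# Sketch: bin normalized gaze samples into a 12 x 12 grid heatmap.
GRID = 12

def grid_heatmap(samples):
    """samples: (x, y) gaze points normalized to [0, 1)."""
    heat = [[0] * GRID for _ in range(GRID)]
    for x, y in samples:
        col = min(GRID - 1, int(x * GRID))   # clamp 1.0 into the last cell
        row = min(GRID - 1, int(y * GRID))
        heat[row][col] += 1
    return heat

samples = [(0.51, 0.52), (0.49, 0.5), (0.95, 0.1)]
heat = grid_heatmap(samples)
```

Fixing the grid resolution is what makes results comparable across participants and scenes, which is the point of proposing it as a standardized protocol.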

Active eye system for tracking a moving object (이동물체 추적을 위한 능동시각 시스템 구축)

  • 백문홍
    • Institute of Control, Robotics and Systems: Conference Proceedings / 1996.10b / pp.257-259 / 1996
  • This paper presents an active eye system for tracking a moving object in 3-D space. A prototype system capable of tracking a moving object was designed and implemented. The mechanical system enables control of a platform carrying a binocular camera pair, as well as control of the vergence angle of each camera by step motors; each camera has two degrees of freedom. Image features of the object are extracted from a complicated environment by zero-disparity filtering (ZDF). From the centroid of the image features, the gaze point on the object is calculated and the vergence angle of each camera is controlled by a step motor. The proposed method is implemented on the prototype and achieves robust operation with fast calculation times.
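The vergence geometry here reduces to pointing each camera's optical axis at the gaze point computed from the feature centroid. A minimal sketch, with an illustrative camera baseline and a flat (planar) geometry that ignores the platform's other degrees of freedom:

```python
# Sketch: vergence angle of each camera needed to fixate a target point,
# for cameras separated by a baseline and looking along +z.
import math

def vergence_angles(target_x, target_z, baseline=0.2):
    """Target at (target_x, target_z) metres in the platform frame;
    cameras at (-baseline/2, 0) and (+baseline/2, 0).
    Returns (left_deg, right_deg) rotation about the vertical axis."""
    left = math.degrees(math.atan2(target_x + baseline / 2, target_z))
    right = math.degrees(math.atan2(target_x - baseline / 2, target_z))
    return left, right

# target straight ahead at 1 m: the cameras toe in symmetrically
l, r = vergence_angles(0.0, 1.0)
```

Commanding the step motors to these angles keeps both optical axes intersecting on the target, which is also what makes the target appear at zero disparity for the ZDF step.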


A New Ergonomic Interface System for the Disabled Person (장애인을 위한 새로운 감성 인터페이스 연구)

  • Heo, Hwan;Lee, Ji-Woo;Lee, Won-Oh;Lee, Eui-Chul;Park, Kang-Ryoung
    • Journal of the Ergonomics Society of Korea / v.30 no.1 / pp.229-235 / 2011
  • Objective: To build a new ergonomic interface system, based on a camera vision system, that helps the handicapped in the home environment. Background: The proposed interface system enables the handicapped to operate consumer electronics. Method: A wearable device that captures the eye image with a near-infrared (NIR) camera and illuminators is used to track the eye gaze position (Heo et al., 2011). A frontal-viewing camera attached to the wearable device recognizes the consumer electronics to be controlled (Heo et al., 2011). The user's eye fatigue is estimated from the eye blink rate, and when it exceeds a predetermined level the system automatically switches from the gaze-based interface to manual selection. Results: The gaze estimation error of the proposed method was 1.98 degrees, with successful recognition of the target object by the frontal-viewing camera (Heo et al., 2011). Conclusion: We built a new ergonomic interface system based on gaze tracking and object recognition. Application: The proposed system can be used to help the handicapped in the home environment.
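The fatigue-based mode switch described in the Method can be sketched as a sliding-window blink counter: blink timestamps are kept for a recent window, and when the rate exceeds a threshold the interface drops from gaze selection to manual selection. The window length and threshold below are illustrative, not the paper's values.

```python
# Sketch: blink-rate fatigue monitor that switches the interface mode.
from collections import deque

class FatigueMonitor:
    def __init__(self, window_s=60.0, max_blinks_per_min=25):
        self.window_s = window_s
        self.max_rate = max_blinks_per_min
        self.blinks = deque()               # timestamps of detected blinks

    def on_blink(self, t):
        self.blinks.append(t)
        # drop blinks that have slid out of the window
        while self.blinks and t - self.blinks[0] > self.window_s:
            self.blinks.popleft()

    def mode(self):
        """'gaze' normally; 'manual' once the blink rate signals fatigue."""
        return "manual" if len(self.blinks) > self.max_rate else "gaze"

m = FatigueMonitor()
for i in range(30):                         # 30 blinks within one minute
    m.on_blink(i * 1.5)
print(m.mode())
```

Falling back to manual selection when fatigue rises protects the user from the misselections that tired, blink-heavy gaze input would otherwise produce.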