• Title/Summary/Keyword: eye gaze tracking

Search Results: 128

A Proposal of Eye-Voice Method based on the Comparative Analysis of Malfunctions on Pointer Click in Gaze Interface for the Upper Limb Disabled (상지장애인을 위한 시선 인터페이스에서 포인터 실행 방법의 오작동 비교 분석을 통한 Eye-Voice 방식의 제안)

  • Park, Joo Hyun;Park, Mi Hyun;Lim, Soon-Bum
    • Journal of Korea Multimedia Society / v.23 no.4 / pp.566-573 / 2020
  • Computers are the most common tool for using the Internet, and a mouse is typically used to select and execute on-screen objects. Eye tracking technology is welcomed as an alternative that helps users who cannot use their hands because of a disability to control a computer. However, the pointer execution methods of existing eye tracking techniques cause frequent malfunctions. In this paper, we therefore developed a gaze tracking interface combined with voice commands to solve the malfunction problem that arises when upper limb disabled users execute computer menus and objects with existing gaze tracking technology. Usability was verified through comparative experiments on the reduction of malfunctions. Hand-impaired users move the pointer with eye tracking and, while browsing the computer screen, utter a voice command such as "okay" for an instant click. Comparative experiments against existing gaze interfaces on the reduction of pointer-execution malfunctions verified that our system, Eye-Voice, reduces the malfunction rate of pointer execution and is effective for upper limb disabled users.
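The division of labor the abstract describes (gaze moves the pointer, a spoken keyword clicks) can be sketched as a minimal event handler. All names, the keyword "okay" aside, are illustrative, not the paper's implementation:

```python
# Minimal sketch of the Eye-Voice idea: gaze samples only move the
# pointer, and a recognized voice keyword triggers the click, so no
# dwell-time or blink heuristic can misfire.

CLICK_KEYWORD = "okay"  # trigger word from the paper's example

class EyeVoicePointer:
    def __init__(self):
        self.position = (0, 0)   # current pointer position (screen px)
        self.clicks = []         # positions where clicks were executed

    def on_gaze(self, x, y):
        """Gaze samples move the pointer; they never click."""
        self.position = (x, y)

    def on_voice(self, transcript):
        """A click fires only when the keyword is heard."""
        if transcript.strip().lower() == CLICK_KEYWORD:
            self.clicks.append(self.position)
            return True
        return False

pointer = EyeVoicePointer()
pointer.on_gaze(120, 340)   # user looks at a menu item
pointer.on_voice("hello")   # unrelated speech: no click
pointer.on_voice("okay")    # keyword: click at (120, 340)
print(pointer.clicks)       # [(120, 340)]
```

In a real system `on_gaze` would be fed by an eye tracker stream and `on_voice` by a speech recognizer; the point of the design is that the two event sources cannot trigger each other's action.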

Real Time Eye and Gaze Tracking (실시간 눈과 시선 위치 추적)

  • Hwang, Suen Ki;Kim, Moon-Hwan;Cha, Sam;Cho, Eun-Seuk;Bae, Cheol-Soo
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.2 no.3 / pp.61-69 / 2009
  • In this paper, we propose a new approach to real-time eye-gaze tracking. Existing methods produce poor results when the user moves the head even slightly, and they require a calibration process for each user. The proposed method combines infrared illumination with Generalized Regression Neural Networks (GRNN). Because GRNN generalizes the gaze mapping function, reliable and accurate eye tracking is possible even under large head movement, the per-user calibration process can be omitted, and gaze tracking works for users who did not participate in training. Experimental results showed an average accuracy of 90% under facial movement and an average of 85% for users not in the training set.

Evaluation of Gaze Depth Estimation using a Wearable Binocular Eye tracker and Machine Learning (착용형 양안 시선추적기와 기계학습을 이용한 시선 초점 거리 추정방법 평가)

  • Shin, Choonsung;Lee, Gun;Kim, Youngmin;Hong, Jisoo;Hong, Sung-Hee;Kang, Hoonjong;Lee, Youngho
    • Journal of the Korea Computer Graphics Society / v.24 no.1 / pp.19-26 / 2018
  • In this paper, we propose a gaze depth estimation method based on a binocular eye tracker for virtual reality and augmented reality applications. The proposed method collects a wide range of information about each eye from the tracker, such as the pupil center, gaze direction, and inter-pupillary distance. It then builds gaze estimation models using a multilayer perceptron that infers gaze depth from the eye tracking information. Finally, we evaluated the method with 13 participants in two ways: performance based on individual models and performance based on a generalized model. The evaluation showed that the proposed method recognized gaze depth with 90.1% accuracy for the 13 individual models and 89.7% accuracy for the generalized model covering all participants.
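The quantity the MLP learns has a simple geometric baseline: with binocular tracking, the two gaze rays converge at the fixated depth, so depth follows from the inter-pupillary distance and the vergence angle. The numbers below are illustrative, not the paper's:

```python
import math

# Geometric vergence baseline for gaze depth: two gaze rays separated
# by the inter-pupillary distance intersect at the fixated depth.

def vergence_depth(ipd_m, vergence_rad):
    """Depth (m) at which two gaze rays ipd_m apart intersect,
    given the vergence angle between them (radians)."""
    return ipd_m / (2.0 * math.tan(vergence_rad / 2.0))

ipd = 0.063                 # typical inter-pupillary distance (m)
angle = math.radians(3.6)   # small vergence angle
print(round(vergence_depth(ipd, angle), 2))  # 1.0 (metres)
```

A learned model such as the paper's MLP can outperform this closed form because measured gaze directions are noisy and eye geometry varies per person; the formula is useful mainly as a sanity check on the features.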

A Human-Robot Interface Using Eye-Gaze Tracking System for People with Motor Disabilities

  • Kim, Do-Hyoung;Kim, Jae-Hean;Yoo, Dong-Hyun;Lee, Young-Jin;Chung, Myung-Jin
    • Transactions on Control, Automation and Systems Engineering / v.3 no.4 / pp.229-235 / 2001
  • Recently, the service sector has been an emerging field of robotic applications. Even though assistive robots play an important role for the disabled and the elderly, these users still struggle to operate robots through conventional interface devices such as joysticks or keyboards. In this paper we propose an efficient computer interface based on a real-time eye-gaze tracking system. The inputs to the proposed system are images taken by a camera and data from a magnetic sensor. The measured data are sufficient to describe eye and head movement because the camera and the receiver of the magnetic sensor are stationary with respect to the head. The proposed system can therefore obtain the eye-gaze direction despite head movement, as long as the distance between the system and the transmitter of the magnetic position sensor is within 2 m. Experimental results show the practical validity of the proposed system and verify its feasibility as a new computer interface for the disabled.

Object Magnification and Voice Command in Gaze Interface for the Upper Limb Disabled (상지장애인을 위한 시선 인터페이스에서의 객체 확대 및 음성 명령 인터페이스 개발)

  • Park, Joo Hyun;Jo, Se-Ran;Lim, Soon-Bum
    • Journal of Korea Multimedia Society / v.24 no.7 / pp.903-912 / 2021
  • Eye tracking research for people with upper limb disabilities has shown promise for device control. In reality, however, eye tracking technology alone is not enough for web interaction. In our previous study on the Eye-Voice interface, a gaze tracking interface supplemented with voice commands was proposed to solve the pointer-execution malfunctions caused by existing gaze tracking interfaces, and a reduction in the pointer malfunction rate was confirmed through comparison experiments with an existing interface. In that process, the difficulty of pointing at small execution objects in the web environment was identified as another important cause of malfunction. In this study, we propose an auto-magnification interface for objects so that people with upper limb disabilities can freely click web content, addressing the difficulty of pointing and execution caused by the high density of execution objects and their arrangement in web pages.

Eye Gaze Tracking System Under Natural Head Movements (머리 움직임이 자유로운 안구 응시 추정 시스템)

  • Matthew, Sked;Qiang, Ji
    • Journal of the Institute of Electronics Engineers of Korea SC / v.41 no.5 / pp.57-64 / 2004
  • We propose an eye gaze tracking system that works under natural head movements. It consists of one narrow-field-of-view CCD camera, two mirrors whose reflection angles are controlled, and active infrared illumination. The mirror angles are computed by geometric and linear-algebraic calculation to keep the pupil images on the optical axis of the camera. Our system allows the subject's head to move 90 cm horizontally and 60 cm vertically, with spatial resolutions of about 6° and 7°, respectively. The frame rate for estimating gaze points is 10-15 frames/sec. As the gaze mapping function, we use hierarchical generalized regression neural networks (H-GRNN) based on a two-pass GRNN. Gaze accuracy was 94% with H-GRNN, a 9% improvement over the 85% of plain GRNN, even when the head or face was slightly rotated. Our system does not offer high spatial gaze resolution, but it allows natural head movements while keeping gaze tracking robust and accurate, and it does not need to be re-calibrated when subjects change.

Real Time Eye and Gaze Tracking (실시간 눈과 시선 위치 추적)

  • Cho, Hyeon-Seob;Kim, Hee-Sook
    • Journal of the Korea Academia-Industrial cooperation Society / v.6 no.2 / pp.195-201 / 2005
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.

Gaze Differences between Expert and Novice Teachers in Science Classes

  • Kim, Won-Jung;Byeon, Jung-Ho;Lee, Il-Sun;Kwon, Yong-Ju
    • Journal of The Korean Association For Science Education / v.32 no.9 / pp.1443-1451 / 2012
  • This study investigates the gaze patterns of two expert and two novice teachers during one hour of lecture-type class. Teachers recruited from the same middle school each conducted a class while wearing an eye tracker, and gaze rate and gaze movement patterns were analyzed. The scene the teachers faced in the classroom was categorized into three zones: a student zone, a material zone, and a non-teaching zone. The student zone was further divided into nine areas of interest to examine the gaze distribution within it. Expert teachers focused their gaze on the student zone, while novice teachers' gaze rate was significantly higher in the non-teaching zone. Within the student zone, expert teachers' gaze spread to the rear areas, but novice teachers' gaze stayed narrowly in the middle areas. This difference produced distinct eye movement patterns: a T pattern for experts and an I pattern for novices. Both groups showed the lowest gaze rates on the front-left and front-right areas. The changes required in teachers' gaze behavior, and what must be considered to make teacher gaze effective in the classroom, are discussed.

A Study on the Priorities of Urban Street Environment Components - Focusing on An Analysis of AOI (Area of Interest) Setup through An Eye-tracking Experiment - (도시가로환경 구성요소의 우선순위에 관한 연구 - 아이트래킹 실험을 통한 관심영역설정 분석을 중심으로 -)

  • Lee, Sun Hwa;Lee, Chang No
    • Korean Institute of Interior Design Journal / v.25 no.1 / pp.73-80 / 2016
  • Streets are the most fundamental components of a city and places that promote diverse human activity. Pedestrians gaze at various street environments; visual attention indicates elements of interest, and these elements should be improved first in street environment improvement projects. This study therefore sets priorities among street environment components by analyzing eye movements from a pedestrian's perspective. Street environment components were classified into road, street facility, building (facade), and sky, and three "Streets of Youth" located in Gwangbok-ro, Seomyeon, and near Busan University in Busan were selected as street environment images. The experiment targeted 30 males and females in their twenties to forties. After setting the angle of sight through a calibration test, an eye-tracking experiment on the three images was conducted, and the subjects then filled in questionnaires. Three conclusions were obtained from the eye-tracking experiment and the survey. First, buildings were the top priority among street environment components, followed by street facilities, roads, and sky. Second, components regarded as important showed an early 'Sequence', many 'Fixation Counts' and 'Visit Counts', a short 'Time to First Fixation', and long 'Fixation Duration' and 'Visit Duration'. Third, after voluntary eye movements, the subjects recognized the objects with the highest and the lowest gaze frequencies.
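The AOI measures this abstract relies on (fixation count, time to first fixation, total fixation duration) are simple aggregations over a fixation log. The AOIs and fixations below are toy values for illustration only:

```python
# Sketch of per-AOI eye-tracking metrics from a fixation log.

AOIS = {
    "building": (0, 0, 50, 50),    # (x1, y1, x2, y2) in screen px
    "road":     (0, 50, 50, 100),
}

# Each fixation: (x, y, onset_ms, duration_ms)
fixations = [
    (10, 10, 0, 200),     # lands in "building"
    (20, 70, 200, 150),   # lands in "road"
    (30, 20, 350, 300),   # lands in "building"
]

def aoi_metrics(aois, fixations):
    m = {n: {"count": 0, "ttff": None, "duration": 0} for n in aois}
    for x, y, onset, dur in fixations:
        for name, (x1, y1, x2, y2) in aois.items():
            if x1 <= x < x2 and y1 <= y < y2:
                m[name]["count"] += 1
                m[name]["duration"] += dur
                if m[name]["ttff"] is None:   # time to first fixation
                    m[name]["ttff"] = onset
    return m

print(aoi_metrics(AOIS, fixations))
# building: count 2, ttff 0 ms, duration 500 ms; road: count 1, ttff 200 ms
```

Ranking AOIs by these aggregates is essentially how the study orders street components by visual priority.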

Development of a Non-contact Input System Based on User's Gaze-Tracking and Analysis of Input Factors

  • Jiyoung LIM;Seonjae LEE;Junbeom KIM;Yunseo KIM;Hae-Duck Joshua JEONG
    • Korean Journal of Artificial Intelligence / v.11 no.1 / pp.9-15 / 2023
  • As mobile devices such as smartphones, tablets, and kiosks become increasingly prevalent, there is growing interest in developing alternative input systems beyond traditional tools such as keyboards and mice. Many people use their own bodies as a pointer to enter simple information on a mobile device. However, body-contact methods have limitations: psychological factors make contact input unstable, especially during a pandemic, and there is a risk of shoulder-surfing attacks. To overcome these limitations, we propose a simple information input system that uses gaze-tracking technology to input passwords and control web surfing through non-contact gaze alone. The proposed system recognizes input when the user stares at a specific location on the screen in real time, using intelligent gaze-tracking technology. We analyze the relationship between the gaze input box, gaze time, and average input time, and report experimental results on varying the size of the gaze input box and the gaze time required to achieve 100% input accuracy. These results demonstrate the system's effectiveness in mitigating the challenges of contact-based input methods and providing a non-contact alternative that is both secure and convenient.
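The input rule described here, where staring inside a box for a fixed gaze time registers the entry, is a classic dwell-time selector. The box layout, timings, and sample stream below are illustrative:

```python
# Sketch of dwell-based gaze input: a box is "entered" once the gaze
# has stayed inside it continuously for gaze_time_ms.

def dwell_select(samples, box, gaze_time_ms):
    """samples: list of (t_ms, x, y) gaze points in time order.
    Returns the time at which the dwell threshold inside `box`
    is first reached, or None if it never is."""
    x1, y1, x2, y2 = box
    start = None
    for t, x, y in samples:
        inside = x1 <= x < x2 and y1 <= y < y2
        if inside:
            if start is None:
                start = t
            if t - start >= gaze_time_ms:
                return t
        else:
            start = None  # leaving the box resets the dwell timer
    return None

samples = [(0, 5, 5), (100, 6, 6), (250, 90, 90), (300, 5, 5), (800, 6, 6)]
print(dwell_select(samples, (0, 0, 10, 10), 500))  # 800: dwell restarts at t=300
```

The paper's trade-off between box size and gaze time maps directly onto the two parameters of this function: a larger box tolerates tracker noise, while a longer `gaze_time_ms` suppresses accidental selections at the cost of input speed.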