• Title/Summary/Keyword: eye-tracking method


Real-Time Eye Detection and Tracking Under Various Light Conditions (다양한 조명하에서 실시간 눈 검출 및 추적)

  • 박호식;박동희;남기환;한준희;나상동;배철수
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2003.10a / pp.227-232 / 2003
  • Non-intrusive methods based on active remote IR illumination for eye tracking are important for many applications of vision-based man-machine interaction. One problem that has plagued these methods is their sensitivity to changes in lighting conditions, which tends to significantly limit their scope of application. In this paper, we present a new real-time eye detection and tracking methodology that works under variable and realistic lighting conditions. Based on combining the bright-pupil effect resulting from IR light with a conventional appearance-based object recognition technique, our method can robustly track eyes even when the pupils are not very bright due to significant external illumination interference. The appearance model is incorporated in both eye detection and tracking via a support vector machine and mean shift tracking. Additional improvement is achieved by modifying the image acquisition apparatus, including the illuminator and the camera.

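The mean shift step used for tracking in the abstract above can be sketched as an intensity-weighted window update: the search window repeatedly moves toward the centroid of the bright-pupil response inside it. This is a minimal generic illustration in NumPy, not the authors' SVM/IR pipeline; the synthetic frame, blob location, and window size are assumptions.

```python
import numpy as np

def mean_shift(image, window, n_iter=20):
    """Shift an (x, y, w, h) window toward the intensity centroid.

    A minimal mean shift step for locking onto a bright-pupil blob;
    a sketch only, not the paper's detector.
    """
    x, y, w, h = window
    for _ in range(n_iter):
        patch = image[y:y + h, x:x + w].astype(float)
        total = patch.sum()
        if total == 0:
            break  # nothing bright inside the window
        ys, xs = np.mgrid[0:h, 0:w]
        cx = (patch * xs).sum() / total  # centroid within the window
        cy = (patch * ys).sum() / total
        nx = max(int(round(x + cx - w / 2)), 0)
        ny = max(int(round(y + cy - h / 2)), 0)
        if (nx, ny) == (x, y):           # converged
            break
        x, y = nx, ny
    return x, y, w, h

# Synthetic frame with a bright "pupil" blob centered near (60, 40)
frame = np.zeros((100, 100))
frame[38:43, 58:63] = 255.0

# Start the search window off-target; mean shift moves onto the blob
win = mean_shift(frame, (45, 25, 20, 20))
```

After a couple of iterations the window is centered on the blob, so the blob center lies inside the final window.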

Implementation of eye-controlled mouse by real-time tracking of the three dimensional eye-gazing point (3차원 시선 추적에 의한 시각 제어 마우스 구현 연구)

  • Kim Jae-Han
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2006.05a / pp.209-212 / 2006
  • This paper presents the design and implementation of an eye-controlled mouse based on real-time tracking of the three-dimensional gazing point. The proposed method is based on three-dimensional processing of eye images in 3D world coordinates. The system hardware consists of two conventional CCD cameras for acquiring stereoscopic images and a computer for processing. The advantages of the proposed algorithm and test results are also described.

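Recovering a 3D point from two CCD cameras, as in the stereoscopic setup above, is a standard linear triangulation problem. The sketch below shows the generic DLT triangulation step under assumed identity-intrinsics projection matrices; it is an illustration of the stereo geometry, not the paper's implementation.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) image
    coordinates of the same point in each view.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null vector of A, homogeneous point
    return X[:3] / X[3]        # dehomogenize

# Two toy cameras: one at the origin, one shifted 1 unit along x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.2, 4.0])   # a gaze point in front of both cameras
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]

X_est = triangulate(P1, P2, x1, x2)
```

With exact correspondences the null vector of the 4x4 system recovers the 3D point exactly.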

A Human-Robot Interface Using Eye-Gaze Tracking System for People with Motor Disabilities

  • Kim, Do-Hyoung;Kim, Jae-Hean;Yoo, Dong-Hyun;Lee, Young-Jin;Chung, Myung-Jin
    • Transactions on Control, Automation and Systems Engineering / v.3 no.4 / pp.229-235 / 2001
  • Recently, the service sector has become an emerging field of robotic applications. Although assistant robots can play an important role for the disabled and the elderly, these users still have difficulty operating such robots with conventional interface devices such as joysticks or keyboards. In this paper we propose an efficient computer interface using a real-time eye-gaze tracking system. The inputs to the proposed system are images taken by a camera and data from a magnetic sensor. The measured data are sufficient to describe the eye and head movement because the camera and the receiver of the magnetic sensor are stationary with respect to the head. Thus the proposed system can obtain the eye-gaze direction despite head movement, as long as the distance between the system and the transmitter of the magnetic position sensor is within 2 m. Experimental results show the practical validity of the proposed system and also verify its feasibility as a new computer interface for the disabled.

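Obtaining the gaze direction despite head movement, as the system above does, amounts to rotating the camera-measured eye direction by the head rotation reported by the magnetic sensor. A one-axis sketch of that composition, with illustrative angles (not the paper's data or calibration):

```python
import numpy as np

def yaw(theta):
    """Rotation matrix about the vertical (y) axis, theta in radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def world_gaze(R_head, gaze_in_head):
    """Head-pose compensation: rotate the eye direction measured in
    head coordinates by the sensed head rotation."""
    return R_head @ gaze_in_head

# Eye looks 10 degrees left relative to the head, while the head is
# turned 10 degrees right: the combined gaze is straight ahead.
eye_dir = yaw(np.radians(10)) @ np.array([0.0, 0.0, 1.0])
g = world_gaze(yaw(np.radians(-10)), eye_dir)
```

The two opposite rotations cancel, which is exactly why head movement does not disturb the reported gaze direction.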

The Relationship between Visual Perception and Emotion from Fear Appeals and Size of Warning Images on Cigarette Packages

  • Hwang, Mi Kyung;Jin, Xin;Zhou, Yi Mou;Kwon, Mahn Woo
    • Journal of Multimedia Information System / v.9 no.2 / pp.137-144 / 2022
  • This research aims to identify the relationship between visual perception and emotion by the types of fear responses elicited by warning images on cigarette packages, as well as the effectiveness of the size of such images, through questionnaires and eye-tracking experiments with twenty university students from colleges based in Busan. The research distinguished and analyzed the warning images as rational appeals and emotional appeals by the degree of fear and disgust, and the results concurred with Maynard's conclusion that people naturally avoid eye contact when presented with a warning image on cigarette packages. Eye avoidance was also strongest for larger (75%) warning images. While previous research mostly adopted self-rated validation methods, this research sought a more objective methodology by adopting both questionnaires and eye-tracking experiments. Through this research, the authors contribute to finding effective warning images for cigarette packages that increase public awareness of the dangers of smoking and discourage smoking. Further research is recommended to explore the effectiveness of explicit images on cigarette packages by type of smoker, such as heavy smokers, normal smokers, and non-smokers.

Head tracking system using image processing (영상처리를 이용한 머리의 움직임 추적 시스템)

  • 박경수;임창주;반영환;장필식
    • Journal of the Ergonomics Society of Korea / v.16 no.3 / pp.1-10 / 1997
  • This paper is concerned with the development and evaluation of a camera calibration method for a real-time head tracking system. Tracking of head movements is important in the design of an eye-controlled human/computer interface and in virtual environments. We propose a video-based head tracking system. A camera was mounted on the subject's head and captured the front view containing eight 3-dimensional reference points (passive retro-reflecting markers) fixed at known positions on a computer monitor. The reference points were captured by an image processing board and used to calculate the 3-dimensional position and orientation of the camera. A suitable camera calibration method for providing accurate extrinsic camera parameters is proposed. The method has three steps. In the first step, the image center is calibrated using the method of varying focal length. In the second step, the focal length and the scale factor are calibrated from the Direct Linear Transformation (DLT) matrix obtained from the known position and orientation of the camera. In the third step, the position and orientation of the camera are calculated from the DLT matrix using the calibrated intrinsic camera parameters. Experimental results showed that the average error of the camera position (3-dimensional) is about 0.53 cm, the angular errors of camera orientation are less than $0.55^{\circ}$, and the data acquisition rate is about 10 Hz. The results of this study can be applied to the tracking of head movements related to eye-controlled human/computer interfaces and virtual environments.

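The DLT step at the heart of the calibration above can be sketched directly: each known 3D reference point and its image projection contributes two rows to a homogeneous linear system, the SVD null vector gives the projection matrix, and the camera position is the right null vector of that matrix. This is the textbook DLT procedure under a simplified camera (identity intrinsics and rotation), not the paper's three-step pipeline with its intrinsic calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth camera: P = [R | -R C] with R = I for brevity
C_true = np.array([0.2, -0.1, -3.0])
P_true = np.hstack([np.eye(3), (-C_true).reshape(3, 1)])

# Eight known 3D reference points (as on the monitor) and their images
X = rng.uniform(-1, 1, size=(8, 3)) + np.array([0.0, 0.0, 5.0])
Xh = np.hstack([X, np.ones((8, 1))])
x = (P_true @ Xh.T).T
x = x[:, :2] / x[:, 2:3]

# DLT: each correspondence yields two rows of the system A p = 0
A = []
for (Xw, Yw, Zw, W), (u, v) in zip(Xh, x):
    A.append([Xw, Yw, Zw, W, 0, 0, 0, 0, -u*Xw, -u*Yw, -u*Zw, -u*W])
    A.append([0, 0, 0, 0, Xw, Yw, Zw, W, -v*Xw, -v*Yw, -v*Zw, -v*W])
_, _, Vt = np.linalg.svd(np.array(A))
P = Vt[-1].reshape(3, 4)       # estimated projection matrix (up to scale)

# The camera position is the right null vector of P
_, _, Vt = np.linalg.svd(P)
C = Vt[-1]
C = C[:3] / C[3]
```

With noise-free correspondences the recovered camera center matches the true one to machine precision.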

Robust pupil detection and gaze tracking under occlusion of eyes

  • Lee, Gyung-Ju;Kim, Jin-Suh;Kim, Gye-Young
    • Journal of the Korea Society of Computer and Information / v.21 no.10 / pp.11-19 / 2016
  • As displays grow larger and take more varied forms, previous gaze tracking methods no longer apply; mounting the gaze tracking camera above the display solves the problems of display size and height. However, this setup cannot use the infrared corneal reflection information that previous methods rely on. In this paper, we propose a pupil detection method that is robust to eye occlusion, and a method for simply calculating the gaze position from the inner eye corner, the pupil center, and face pose information. In the proposed method, the camera switches between wide- and narrow-angle modes according to the person's position to capture frames for gaze tracking. If a face is detected in the field of view (FOV) in wide-angle mode, the camera switches to narrow-angle mode after calculating the face position. The frame captured in narrow-angle mode contains the gaze direction information of a person at a long distance. The gaze calculation consists of a face pose estimation step and a gaze direction calculation step. The face pose is estimated by mapping the feature points of the detected face to a 3D model. To calculate the gaze direction, an ellipse is first fitted to edge segments of the iris boundary; if the pupil is occluded, its position is estimated with a deformable template. Then, using the pupil center, the inner eye corner, and the face pose information, the gaze position on the display is calculated. In experiments, the proposed gaze tracking algorithm removes the constraints imposed by display form and effectively calculates the gaze direction of a person at a long distance using a single camera, as demonstrated at several distances.
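The ellipse-fitting step named in the abstract above can be sketched as an algebraic least-squares conic fit: stack the conic monomials of the iris edge points, take the SVD null vector as the conic coefficients, and read the pupil center off the conic. This is a generic sketch of that step, not the paper's split-and-fit or deformable-template code; the sampled ellipse is synthetic.

```python
import numpy as np

def fit_ellipse_center(pts):
    """Fit the conic a x^2 + b xy + c y^2 + d x + e y + f = 0 to the
    points by algebraic least squares, then return the conic center
    (used here as the pupil center estimate)."""
    x, y = pts[:, 0], pts[:, 1]
    D = np.column_stack([x*x, x*y, y*y, x, y, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(D)
    a, b, c, d, e, f = Vt[-1]          # null vector = conic coefficients
    # The center is where the conic's gradient vanishes
    cx, cy = np.linalg.solve([[2*a, b], [b, 2*c]], [-d, -e])
    return cx, cy

# Synthetic iris boundary: ellipse centered at (30, 20), semi-axes 8 and 5
t = np.linspace(0, 2*np.pi, 40, endpoint=False)
pts = np.column_stack([30 + 8*np.cos(t), 20 + 5*np.sin(t)])
cx, cy = fit_ellipse_center(pts)
```

Because the fit is linear in the conic coefficients, it also works on a partial arc of edge points, which is what makes it usable when the eyelid occludes part of the iris.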

A Distortion Correction Method for the Fish-Eye Lens using Photogrammetric Techniques (사진측량 기법을 사용한 어안렌즈 왜곡보정에 관한 연구)

  • Kang, Jin-A;Park, Jae-Min;Kim, Byung-Guk
    • Proceedings of the Korean Society of Surveying, Geodesy, Photogrammetry, and Cartography Conference / 2007.04a / pp.161-164 / 2007
  • This paper studies the distortion tendency of wide-angle lenses and applies correction techniques suitable to the fish-eye lens using existing photogrammetric methods. After carrying out the calibration of the fish-eye lens, we calculated the correction parameters and then developed a method that converts each original image point to a new, distortion-corrected image point. The objective of the suggested calibration method is to calibrate images from fish-eye lenses, which are widely used in computer vision and control instrumentation. The proposed technique is expected to improve the accuracy of fish-eye lens images in indoor tracking and monitoring. A program that automatically extracts reference cross points was also implemented to improve the efficiency of the lens correction technique. Consequently, this calibration method can be applied as an automated distortion correction method not only to fish-eye lenses but also to general lenses.

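Mapping a distorted image point back to its corrected position, as the method above does after calibration, requires inverting the distortion model. A sketch under one common radial polynomial model, inverted by fixed-point iteration; the model choice and the coefficients are illustrative assumptions, not the paper's calibrated parameters.

```python
import numpy as np

def undistort_radius(r_d, k1, k2, n_iter=20):
    """Invert the radial model r_d = r_u (1 + k1 r_u^2 + k2 r_u^4)
    by fixed-point iteration: repeatedly divide the distorted radius
    by the distortion factor evaluated at the current estimate."""
    r_u = r_d                       # distorted radius as the initial guess
    for _ in range(n_iter):
        r_u = r_d / (1.0 + k1 * r_u**2 + k2 * r_u**4)
    return r_u

k1, k2 = -0.3, 0.05                 # illustrative coefficients only

# Forward-distort a known radius, then recover it
r_u_true = 0.8
r_d = r_u_true * (1 + k1 * r_u_true**2 + k2 * r_u_true**4)
r_u = undistort_radius(r_d, k1, k2)
```

For moderate distortion the iteration contracts quickly, so 20 iterations recover the undistorted radius to well below pixel precision; applying the same correction per pixel yields the remapped, corrected image.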

EEG-based Customized Driving Control Model Design (뇌파를 이용한 맞춤형 주행 제어 모델 설계)

  • Jin-Hee Lee;Jaehyeong Park;Je-Seok Kim;Soon Kwon
    • IEMEK Journal of Embedded Systems and Applications / v.18 no.2 / pp.81-87 / 2023
  • With the development of BCI devices, it is now possible to use EEG control technology to move a robot's arms or legs to help with daily life. In this paper, we propose a customized vehicle control model based on BCI. The model collects BCI-based driver EEG signals, determines the intended command through EEG signal analysis, and then controls the direction of the vehicle based on that determination. Because the EEG signals are noisy, direction control is supplemented with a camera-based eye tracking method to increase the accuracy of the recognized direction. By combining the direction recognized from the EEG signal with the eye tracking result, the vehicle was controlled in five directions: left turn, right turn, forward, backward, and stop. In experimental results, the direction recognition accuracy of the proposed model is about 75% or higher.
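Combining a noisy EEG decision with an eye-tracking decision over the five commands can be sketched as late fusion of two class-probability vectors. The abstract does not specify the combination rule, so the weighted average below, the weight, and the example probabilities are all assumptions.

```python
import numpy as np

COMMANDS = ["left", "right", "forward", "backward", "stop"]

def fuse(p_eeg, p_eye, w_eeg=0.5):
    """Late fusion: weighted average of EEG and eye-tracking class
    probabilities over the five driving commands, then argmax.
    The equal weighting is an assumed choice, not the paper's rule."""
    p = w_eeg * np.asarray(p_eeg) + (1.0 - w_eeg) * np.asarray(p_eye)
    return COMMANDS[int(np.argmax(p))]

# Noisy EEG is ambiguous between "left" and "forward"; eye tracking
# clearly favors "left", so the fused decision resolves to "left".
cmd = fuse([0.35, 0.05, 0.35, 0.05, 0.20],
           [0.70, 0.10, 0.10, 0.05, 0.05])
```

This is the simplest way the eye tracker can "supplement" the EEG channel: whenever the EEG posterior is flat, the sharper eye-tracking posterior dominates the fused decision.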

Product Images Attracting Attention: Eye-tracking Analysis

  • Pavel Shin;Kil-Soo Suh;Hyunjeong Kang
    • Asia Pacific Journal of Information Systems / v.29 no.4 / pp.731-751 / 2019
  • This study examined the impact of various product photo features on the attention of potential consumers in the environment of online apparel retailers. Recently, the way apparel product photos are presented in online shopping stores has changed considerably from the classic product photos of the early days. To investigate whether this shift is effective in attracting consumers' attention, we examined the related theory and verified its effect through laboratory experiments. In particular, experimental data were collected and analyzed using eye tracking technology. According to the results of this study, asymmetric product photos are more attractive than symmetrical ones, a well-emphasized object within a photo is more attractive than a partially emphasized one, smiling faces attract more attention than emotionless or sad ones, and photos with off-center models draw more consumer attention than photos with the model in the center. These results are expected to help design internet shopping stores that attract more of customers' attention.

Recognition Performance of Vestibular-Ocular Reflex Based Vision Tracking System for Mobile Robot (이동 로봇을 위한 전정안반사 기반 비젼 추적 시스템의 인식 성능 평가)

  • Park, Jae-Hong;Bhan, Wook;Choi, Tae-Young;Kwon, Hyun-Il;Cho, Dong-Il;Kim, Kwang-Soo
    • Journal of Institute of Control, Robotics and Systems / v.15 no.5 / pp.496-504 / 2009
  • This paper presents the recognition performance of a VOR (Vestibular-Ocular Reflex) based vision tracking system for mobile robots. The VOR is a reflex eye movement that, during head movements, produces an eye movement in the direction opposite to the head movement, thus keeping the image of an object of interest centered on the retina. We applied this physiological concept to a vision tracking system for high recognition performance in mobile environments. The proposed method was implemented in a vision tracking system consisting of a motion sensor module and an actuation module with a vision sensor. We tested the developed system on an x/y stage and a rate table for linear and angular motion, respectively. The experimental results show that the recognition rates of the VOR-based method are three times those of a conventional non-VOR vision system, mainly because the VOR-based system keeps the line of sight fixed on the object, reducing the blurring of images in a dynamic environment. This suggests that the VOR concept proposed in this paper can be applied efficiently to vision tracking systems for mobile robots.
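The reflex described above reduces, in its simplest one-axis form, to commanding the camera joint with the negated head rotation so the line of sight (head angle plus camera angle) stays constant. A minimal sketch with an assumed unity gain and a made-up head motion trace; the real system uses inertial sensing and an actuated camera module.

```python
import numpy as np

def vor_camera_angles(head_angles, gain=1.0):
    """Vestibular-ocular reflex counter-rotation: command the camera
    opposite to the measured head rotation. gain=1.0 means perfect
    compensation (an assumed, idealized value)."""
    return -gain * np.asarray(head_angles)

# Simulated head yaw over time (degrees); the line of sight is the
# sum of the head angle and the commanded camera angle.
head = np.array([0.0, 5.0, 12.0, 7.0, -3.0])
camera = vor_camera_angles(head)
line_of_sight = head + camera
```

With perfect compensation the line of sight never moves, which is why the tracked object stays centered and motion blur is suppressed.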