• Title/Summary/Keyword: Eye Tracking system


A study on the eye Location for Video-Conferencing Interface (화상 회의 인터페이스를 위한 눈 위치 검출에 관한 연구)

  • Jung, Jo-Nam;Gang, Jang-Mook;Bang, Kee-Chun
    • Journal of Digital Contents Society / v.7 no.1 / pp.67-74 / 2006
  • In current video-conferencing systems, users' face movements are restricted by the fixed camera, which is inconvenient. To solve this problem, tracking of face movements is needed. Tracking the whole face requires much computing time, and the whole face is difficult to define as a single feature; using several feature points in the face is therefore more desirable for tracking face movements efficiently. This paper addresses an effective eye location algorithm, an essential process in an automatic human face tracking system for natural video-conferencing. The location of the eyes is very important information for face tracking, as the eyes have the clearest and simplest attributes in the face. The proposed algorithm is applied to candidate face regions obtained from face region extraction. It is not sensitive to lighting conditions and places no restriction on face size or faces with glasses. The proposed algorithm shows very encouraging results in experiments on video-conferencing environments.


Head tracking system using image processing (영상처리를 이용한 머리의 움직임 추적 시스템)

  • 박경수;임창주;반영환;장필식
    • Journal of the Ergonomics Society of Korea / v.16 no.3 / pp.1-10 / 1997
  • This paper is concerned with the development and evaluation of a camera calibration method for a real-time head tracking system. Tracking of head movements is important in the design of an eye-controlled human/computer interface and in virtual environments. We propose a video-based head tracking system. A camera was mounted on the subject's head and captured the front view containing eight 3-dimensional reference points (passive retro-reflecting markers) fixed at known positions on the computer monitor. The reference points were captured by an image processing board and used to calculate the 3-dimensional position and orientation of the camera. A suitable camera calibration method for providing accurate extrinsic camera parameters is proposed. The method has three steps. In the first step, the image center is calibrated using the method of varying focal length. In the second step, the focal length and the scale factor are calibrated from the Direct Linear Transformation (DLT) matrix obtained from the known position and orientation of the camera. In the third step, the position and orientation of the camera are calculated from the DLT matrix, using the calibrated intrinsic camera parameters. Experimental results showed that the average error of the 3-dimensional camera position is about 0.53 cm, the angular errors of camera orientation are less than 0.55°, and the data acquisition rate is about 10 Hz. The results of this study can be applied to the tracking of head movements for eye-controlled human/computer interfaces and virtual environments.

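The third calibration step above, recovering the camera's position and orientation from the DLT matrix, can be sketched in a few lines of NumPy. This is a generic decomposition of a 3x4 projection matrix P = K[R|t], not the authors' exact procedure; the sign normalization assumes positive focal-length terms in K.

```python
import numpy as np

def camera_pose_from_dlt(P):
    """Recover camera center C, intrinsics K and rotation R from a
    3x4 projection matrix P = K [R | t] (up to the usual QR sign
    ambiguity, resolved by forcing a positive-diagonal K)."""
    M = P[:, :3]                          # M = K @ R
    C = -np.linalg.solve(M, P[:, 3])      # from P @ [C; 1] = 0
    # inv(M) = R^T @ inv(K): QR gives the orthogonal/triangular split
    Q, U = np.linalg.qr(np.linalg.inv(M))
    S = np.diag(np.sign(np.diag(U)))      # force positive diagonal
    R, K = (Q @ S).T, np.linalg.inv(S @ U)
    return C, K / K[2, 2], R
```

Feeding in a synthetic P built from a known K, R, and t returns the original camera pose, which is how such a decomposition is usually sanity-checked.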

Webcam-Based 2D Eye Gaze Estimation System By Means of Binary Deformable Eyeball Templates

  • Kim, Jin-Woo
    • Journal of information and communication convergence engineering / v.8 no.5 / pp.575-580 / 2010
  • Eye gaze as a form of input was primarily developed for users who are unable to use usual interaction devices such as the keyboard and mouse; however, with increasing accuracy and decreasing cost of eye gaze detection, it is likely to become a practical interaction method for able-bodied users in the near future as well. This paper explores a low-cost, robust, rotation- and illumination-independent eye gaze system for gaze-enhanced user interfaces. We introduce two new algorithms for fast, sub-pixel-precise pupil center detection and 2D eye gaze estimation by means of deformable template matching. We propose a deformable angular integral search algorithm based on minimum intensity value to localize the eyeball (iris outer boundary) in grayscale eye-region images; it finds the center of the pupil for use in our second proposed algorithm, which performs 2D eye gaze tracking. First, we detect the eye regions by means of Intel OpenCV AdaBoost Haar cascade classifiers and assign the approximate eyeball size depending on the eye region size. Second, using the DAISMI (Deformable Angular Integral Search by Minimum Intensity) algorithm, the pupil center is detected. Then, using the percentage of black pixels over the eyeball circle area, we convert the image into binary (black and white) for use in the next stage: the DTBGE (Deformable Template Based 2D Gaze Estimation) algorithm. Finally, using DTBGE, initial pupil center coordinates are assigned, and DTBGE creates new pupil center coordinates and estimates the final gaze directions and eyeball size. We have performed extensive experiments and achieved very encouraging results, and we discuss the effectiveness of the proposed method through several experimental results.
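The minimum-intensity idea behind pupil localization can be illustrated with a toy NumPy sketch: the pupil is the darkest blob in the eye region, so the centroid of the darkest pixels approximates its center. This shows only the underlying intuition, not the paper's DAISMI angular integral search, and the 20% intensity threshold is an arbitrary choice for the illustration.

```python
import numpy as np

def pupil_center_by_min_intensity(eye_gray):
    """Approximate the pupil center as the centroid of the darkest
    pixels in a grayscale eye-region image (toy illustration of the
    minimum-intensity criterion, not the DAISMI algorithm itself)."""
    g = eye_gray.astype(float)
    # keep pixels within 20% of the intensity range above the minimum
    thresh = g.min() + 0.2 * (g.max() - g.min())
    ys, xs = np.nonzero(g <= thresh)
    return xs.mean(), ys.mean()
```

In the pipeline described above, the input would be the eye region cropped by the Haar cascade detector; on a synthetic image with one dark disk, the function recovers the disk center.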

Human-Computer Interface using the Eye-Gaze Direction (눈의 응시 방향을 이용한 인간-컴퓨터간의 인터페이스에 관한 연구)

  • Kim, Do-Hyoung;Kim, Jea-Hean;Chung, Myung-Jin
    • Journal of the Institute of Electronics Engineers of Korea CI / v.38 no.6 / pp.46-56 / 2001
  • In this paper we propose an efficient approach for real-time eye-gaze tracking from an image sequence and magnetic sensor information. The inputs to the eye-gaze tracking system are images taken by a camera and data from a magnetic sensor. These measurements are sufficient to describe the eye and head movement, because the camera and the receiver of the magnetic sensor are stationary with respect to the head. Experimental results show the validity of the proposed system for real-time application and its feasibility as a new computer interface replacing the mouse.


Eye Gaze Tracking System Under Natural Head Movements (머리 움직임이 자유로운 안구 응시 추정 시스템)

  • ;Matthew, Sked;Qiang, Ji
    • Journal of the Institute of Electronics Engineers of Korea SC / v.41 no.5 / pp.57-64 / 2004
  • We propose an eye gaze tracking system that works under natural head movements. It consists of one narrow-field-of-view CCD camera, two mirrors whose reflection angles are controlled, and active infrared illumination. The mirror angles were computed by geometric and linear-algebraic calculations to keep the pupil image on the optical axis of the camera. Our system allows the subject's head to move 90cm horizontally and 60cm vertically, and the spatial resolutions were about 6° and 7°, respectively. The frame rate for estimating gaze points was 10~15 frames/sec. As the gaze mapping function, we used hierarchical generalized regression neural networks (H-GRNN) based on a two-pass GRNN. Gaze accuracy was 94% with H-GRNN, a 9% improvement over the 85% of GRNN, even when the head or face was slightly rotated. Our system does not have a high spatial gaze resolution, but it allows natural head movements with robust and accurate gaze tracking. In addition, there is no need to re-calibrate the system when subjects change.

The Effects of Accuracy on Skill Level and Eye-Tracking Type in Golf Putting (숙련도와 시선형태가 골프퍼팅의 정확성에 미치는 영향)

  • Woo, Byung-Hoon;Kim, Chang-Won;Park, Yang-Sun;Lee, Kun-Chun;Lim, Young-Tae
    • Korean Journal of Applied Biomechanics / v.19 no.4 / pp.729-738 / 2009
  • The purpose of this study was to analyze the impact accuracy and kinematic parameters by skill level and eye-tracking type during putting strokes. For comparison, five elite golfers and five novice golfers participated in this study. Three-dimensional kinematic data were collected for each subject while 10 putting trials were performed for each skill level and eye-tracking type. The APAS system was used to compute the impact accuracy and kinematic parameters of the putter heads. The putting stroke was divided into three phases: back swing, downswing, and follow-through. The findings indicated significant differences by skill level in the rate of success. For impact accuracy and the displacement of the putter head, a significant difference was found for skill level, particularly in back swing and follow-through. In addition, the displacement of the putter head had a greater influence on stroke accuracy than velocity did.

Eye-Tracking 3D Display System Using Variable Parallax Barrier and DDC/CI (가변형 패럴랙스 배리어와 DDC 통신을 이용한 시점추적형 3D 디스플레이 시스템)

  • Che, Ho-Byoung;Yoo, Young-Rok;Kim, Jin-Soo;Lee, Sang-Hun;Lee, Seung-Hyun
    • Korean Journal of Optics and Photonics / v.20 no.2 / pp.102-109 / 2009
  • In this paper, we introduce an eye-tracking 3D display system using a variable parallax barrier and DDC communication. The variable parallax barrier is composed of 4 sub-barriers, and a commercially available web camera is used to implement the eye-tracking system. The coordinates of the viewer extracted from the web camera are transferred to the 3D display via DDC/CI communication. The variable barrier attached to the LCD shifts electrically according to the right-eye position to present 3D images. The system is compared experimentally with commercial parallax barrier methods.
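The control logic of such a system reduces to mapping the tracked eye position to one of the four electrically switchable sub-barrier phases. The sketch below is purely illustrative: the real mapping in the paper depends on viewing distance and panel geometry, and the camera resolution here is an assumed value.

```python
def select_sub_barrier(eye_x, image_width=640, n_sub_barriers=4):
    """Map the tracked eye x-coordinate (pixels in the webcam image)
    to one of n_sub_barriers switchable barrier phases by dividing
    the image into equal horizontal bands (illustrative only)."""
    if not 0 <= eye_x < image_width:
        raise ValueError("eye position outside camera image")
    return int(eye_x * n_sub_barriers // image_width)
```

The selected index would then be written to the display over DDC/CI so the barrier phase follows the viewer as they move.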

Distance error of monopulse radar in cross-eye jamming using terrain bounce (지형 바운스를 이용하는 크로스 아이 재밍의 모노펄스 레이다 거리 오차)

  • Lim, Joong-Soo;Chae, Gyoo-Soo
    • Journal of Convergence for Information Technology / v.12 no.4 / pp.9-13 / 2022
  • In this paper, the tracking error of a monopulse radar caused by cross-eye jamming using terrain bounce is analyzed. Cross-eye jamming is a method of generating an error in a radar tracking system by simultaneously transmitting two signals with different phases and amplitudes. When the monopulse radar receives the cross-eye jamming signal generated by terrain bounce, a tracking error occurs in the elevation direction. In the presence of multipath, this signal is a combination of the direct target return and a return seemingly emanating from the target image beneath the terrain surface. Terrain bounce jamming has the advantage of using a single jammer, but the space affecting the jamming is limited by the terrain reflection angle and the degree of scattering of the terrain. This study can be useful for protecting ships from low-altitude missiles or aircraft at sea.

Real-Time Automatic Tracking of Facial Feature (얼굴 특징 실시간 자동 추적)

  • 박호식;배철수
    • Journal of the Korea Institute of Information and Communication Engineering / v.8 no.6 / pp.1182-1187 / 2004
  • Robust, real-time, fully automatic tracking of facial features is required for many computer vision and graphics applications. In this paper, we describe a fully automatic system that tracks eyes and eyebrows in real time. The pupils are tracked using the red-eye effect with an infrared-sensitive camera equipped with infrared LEDs. Templates are used to parameterize the facial features. For each new frame, the pupil coordinates are used to extract cropped images of the eyes and eyebrows. The template parameters are recovered by principal component analysis (PCA) on these extracted images, using a PCA basis constructed during the training phase from example images. The system runs at 30 fps and requires no manual initialization or calibration. The system is shown to work well on sequences with considerable head motion and occlusion.
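The per-frame parameter recovery described above amounts to projecting each cropped image onto a PCA basis learned from training examples. A minimal NumPy sketch, with an arbitrary number of components since the abstract does not specify one:

```python
import numpy as np

def build_pca_basis(examples, n_components=2):
    """Training phase: build a PCA basis from flattened example
    images (one image per row)."""
    X = np.asarray(examples, float)
    mean = X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal axes
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]

def template_parameters(image, mean, basis):
    """Per-frame step: project a cropped eye/eyebrow image onto the
    PCA basis to recover its template parameters."""
    return basis @ (np.asarray(image, float).ravel() - mean)
```

Reconstructing a sample as mean + parameters @ basis shows that the low-dimensional parameters capture the sample when the data truly lie in the subspace.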

A New Ergonomic Interface System for the Disabled Person (장애인을 위한 새로운 감성 인터페이스 연구)

  • Heo, Hwan;Lee, Ji-Woo;Lee, Won-Oh;Lee, Eui-Chul;Park, Kang-Ryoung
    • Journal of the Ergonomics Society of Korea / v.30 no.1 / pp.229-235 / 2011
  • Objective: To build a new ergonomic interface system based on a camera vision system that helps the handicapped in the home environment. Background: The proposed interface system enables the handicapped to operate consumer electronics. Method: A wearable device for capturing the eye image using a near-infrared (NIR) camera and illuminators is proposed for tracking eye gaze position (Heo et al., 2011). A frontal viewing camera attached to the wearable device recognizes the consumer electronics to be controlled (Heo et al., 2011). The amount of the user's eye fatigue is measured from the eye blink rate, and when it exceeds a predetermined level, the proposed system automatically changes the gaze-based interface mode into manual selection mode. Results: The experimental results showed that the gaze estimation error of the proposed method was 1.98 degrees, with successful recognition of the object by the frontal viewing camera (Heo et al., 2011). Conclusion: We built a new ergonomic interface system based on gaze tracking and object recognition. Application: The proposed system can be used to help the handicapped in the home environment.
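The fatigue-triggered fallback from gaze control to manual selection can be sketched as a simple rate check over recent blink timestamps. The 60-second window and the blinks-per-minute threshold below are placeholder values, not the ones used in the paper:

```python
def choose_interface_mode(blink_timestamps, now=None, window_s=60.0,
                          fatigue_blinks_per_min=30):
    """Count blinks inside the trailing window and fall back from
    gaze control to manual selection when the blink rate exceeds the
    fatigue threshold (placeholder values, for illustration only)."""
    if now is None:
        now = max(blink_timestamps, default=0.0)
    recent = [t for t in blink_timestamps if now - window_s < t <= now]
    rate = len(recent) * 60.0 / window_s   # blinks per minute
    return "manual" if rate > fatigue_blinks_per_min else "gaze"
```

In a running system, the timestamp list would be fed by the NIR camera's blink detector and the returned mode would switch the interface between gaze pointing and manual selection.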