• Title/Summary/Keyword: pupil detection.

Search results: 80

Optical Design of a Snapshot Nonmydriatic Fundus-imaging Spectrometer Based on the Eye Model

  • Zhao, Xuehui;Chang, Jun;Zhang, Wenchao;Wang, Dajiang;Chen, Weilin;Cao, Jiajing
    • Current Optics and Photonics
    • /
    • v.6 no.2
    • /
    • pp.151-160
    • /
    • 2022
  • Fundus images can reflect ocular diseases and systemic diseases such as glaucoma, diabetes mellitus, and hypertension, so research on fundus-detection equipment is of great importance. The fundus camera has been widely used as a kind of noninvasive detection equipment, but most existing devices can only obtain two-dimensional (2D) retinal-image information, even though the fundus of the human eye also has spectral characteristics. The fundus contains many pigments, and their different distributions lead to different tissue-penetration depths for light waves, which can reveal the corresponding fundus structures. To obtain richer information and improve the detection capability of such equipment, a snapshot nonmydriatic fundus-imaging spectral system, comprising a fundus-imaging spectrometer and an illumination system, is studied in this paper. The system uses a microlens array to realize snapshot acquisition, so all information is obtained from a single exposure, and it does not require dilating the pupil; hence the operation is simple, which reduces its influence on the subject. The system works in the visible and near-infrared bands (550-800 nm), with a volume less than 400 mm × 120 mm × 75 mm and a spectral resolution better than 6 nm.

Active Facial Tracking for Fatigue Detection (피로 검출을 위한 능동적 얼굴 추적)

  • Kim, Tae-Woo;Kang, Yong-Seok
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.2 no.3
    • /
    • pp.53-60
    • /
    • 2009
  • Vision-based driver-fatigue detection is one of the most promising commercial applications of facial-expression-recognition technology, and facial-feature tracking is its primary technical issue. Current facial-tracking technology faces three challenges: (1) detection failure of some or all features due to varying lighting conditions and head motions; (2) multiple and non-rigid object tracking; and (3) feature occlusion when the head is at oblique angles. In this paper, we propose a new active approach. First, an active IR sensor is used to robustly detect pupils under variable lighting conditions. The detected pupils are then used to predict the head motion. Furthermore, face movement is assumed to be locally smooth, so that each facial feature can be tracked with a Kalman filter. The simultaneous use of the pupil constraint and Kalman filtering greatly increases the prediction accuracy for each feature position. Feature detection is accomplished in the Gabor space in the vicinity of the predicted location. Local graphs consisting of identified features are extracted and used to capture the spatial relationships among detected features. Finally, graph-based reliability propagation is proposed to tackle the occlusion problem and verify the tracking results. The experimental results show the validity of our active approach to real-life facial tracking under variable lighting conditions, head orientations, and facial expressions.
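The per-feature tracker described above (and in the conference version of this paper below) rests on a standard Kalman predict/update cycle under a locally smooth motion assumption. A minimal constant-velocity sketch, with illustrative noise parameters; the paper's actual state model and pupil constraint are not reproduced here:

```python
import numpy as np

def make_cv_kalman(dt=1.0, q=1e-2, r=1.0):
    """Constant-velocity Kalman model for one facial feature.
    State x = [px, py, vx, vy]; measurement z = [px, py]."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)
    Q = q * np.eye(4)   # process noise (locally smooth motion)
    R = r * np.eye(2)   # measurement noise of the feature detector
    return F, H, Q, R

def kalman_step(x, P, z, F, H, Q, R):
    # Predict the feature position from the motion model
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the detected feature position z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Track a feature moving at ~2 px/frame along x
F, H, Q, R = make_cv_kalman()
x, P = np.zeros(4), np.eye(4) * 10.0
for t in range(1, 20):
    z = np.array([2.0 * t, 0.0])   # detection at frame t
    x, P = kalman_step(x, P, z, F, H, Q, R)
print(round(x[2], 1))   # estimated x-velocity converges toward 2.0
```

In the paper the prediction is additionally constrained by the detected pupil positions, which this sketch omits.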

Active Facial Tracking for Fatigue Detection (피로 검출을 위한 능동적 얼굴 추적)

  • 박호식;정연숙;손동주;나상동;배철수
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2004.05b
    • /
    • pp.603-607
    • /
    • 2004
  • Vision-based driver-fatigue detection is one of the most promising commercial applications of facial-expression-recognition technology, and facial-feature tracking is its primary technical issue. Current facial-tracking technology faces three challenges: (1) detection failure of some or all features due to varying lighting conditions and head motions; (2) multiple and non-rigid object tracking; and (3) feature occlusion when the head is at oblique angles. In this paper, we propose a new active approach. First, an active IR sensor is used to robustly detect pupils under variable lighting conditions. The detected pupils are then used to predict the head motion. Furthermore, face movement is assumed to be locally smooth, so that each facial feature can be tracked with a Kalman filter. The simultaneous use of the pupil constraint and Kalman filtering greatly increases the prediction accuracy for each feature position. Feature detection is accomplished in the Gabor space in the vicinity of the predicted location. Local graphs consisting of identified features are extracted and used to capture the spatial relationships among detected features. Finally, graph-based reliability propagation is proposed to tackle the occlusion problem and verify the tracking results. The experimental results show the validity of our active approach to real-life facial tracking under variable lighting conditions, head orientations, and facial expressions.

Real Time Eye and Gaze Tracking (실시간 눈과 시선 위치 추적)

  • Cho, Hyeon-Seob;Kim, Hee-Sook
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.6 no.2
    • /
    • pp.195-201
    • /
    • 2005
  • This paper describes preliminary results we have obtained in developing a computer-vision system based on active IR illumination for real-time gaze tracking for interactive graphic display. Unlike most existing gaze-tracking techniques, which often assume a static head and require a cumbersome calibration process for each person, our gaze tracker performs robust and accurate gaze estimation without per-user calibration and under rather significant head movement. This is made possible by a new gaze-calibration procedure that identifies the mapping from pupil parameters to screen coordinates using Generalized Regression Neural Networks (GRNN). With GRNN the mapping need not be an analytical function, and head movement is explicitly accounted for by the gaze-mapping function. Furthermore, the mapping function can generalize to individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments involving gaze-contingent interactive graphic display.
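A GRNN of the kind named above is essentially Nadaraya-Watson kernel regression: each calibration sample votes for its screen coordinate with a Gaussian weight that decays with distance in feature space. A toy sketch with a made-up linear pupil-to-screen mapping; the feature vector, bandwidth, and calibration data are illustrative, not the paper's:

```python
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma=0.2):
    """GRNN (Nadaraya-Watson) estimate: a Gaussian-weighted average
    of training targets, weighted by distance from the query point."""
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w[:, None] * Y_train).sum(axis=0) / w.sum()

# Hypothetical calibration set: pupil parameters -> screen coordinates.
# Toy ground truth: screen = 100 * pupil displacement.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(1000, 2))   # pupil-displacement features
Y = 100.0 * X                            # screen coordinates
pred = grnn_predict(X, Y, np.array([0.2, -0.3]))
print(pred)   # close to [20, -30]
```

No analytical form of the mapping is ever fitted, which is the property the abstract highlights; prediction for a new user is just this weighted average over the stored calibration samples.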

Resolution Estimation Technique in Gaze Tracking System for HCI (HCI를 위한 시선추적 시스템에서 분해능의 추정기법)

  • Kim, Ki-Bong;Choi, Hyun-Ho
    • Journal of Convergence for Information Technology
    • /
    • v.11 no.1
    • /
    • pp.20-27
    • /
    • 2021
  • Eye tracking is a natural-user-interface (NUI) technology that finds out where the user is gazing. It allows users to enter text or control a GUI, and the user's gaze can further be analyzed for application to commercial advertising. In an eye-tracking system, the allowable range varies depending on the quality of the image and on the user's freedom of movement. Therefore, a method is needed to estimate the accuracy of eye tracking in advance. That accuracy is greatly affected by how the eye-tracking algorithm is implemented, in addition to hardware variables. Accordingly, in this paper we propose a method to estimate how many degrees the gaze changes when the pupil center moves by one pixel, by estimating the maximum possible movement distance of the pupil center in the image.
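In the simplest linear model, the quantity being estimated (degrees of gaze change per pixel of pupil-center movement) reduces to the gaze range divided by the pupil center's maximum travel in pixels. A back-of-the-envelope sketch under that assumption; the numbers are illustrative and the paper's actual derivation is geometric, not linear:

```python
def gaze_resolution_deg_per_px(max_travel_px, gaze_range_deg=90.0):
    """Rough angular resolution of a gaze tracker: if the pupil center
    can travel at most `max_travel_px` pixels over `gaze_range_deg`
    degrees of eye rotation, one pixel of movement corresponds to
    this many degrees (simplified linear model)."""
    return gaze_range_deg / max_travel_px

# e.g. the pupil center spans 120 px over +/-45 degrees of rotation
res = gaze_resolution_deg_per_px(120)
print(round(res, 2))  # 0.75 degrees per pixel
```

This makes the abstract's point concrete: a higher-resolution image (larger maximum travel in pixels) directly yields finer angular resolution.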

Eye Gaze for Human Computer Interaction (눈동자의 움직임을 이용한 휴먼 컴퓨터 인터랙션)

  • 권기문;이정준;박강령;김재희
    • Proceedings of the IEEK Conference
    • /
    • 2003.11b
    • /
    • pp.83-86
    • /
    • 2003
  • This paper proposes a user interface that controls a computer by detecting gaze in a head-mounted display (HMD) environment. The system works as follows: first, the camera in the HMD is calibrated, which determines the geometric relationship between the monitor and the captured image. Second, the pupil center is detected using a center-of-mass algorithm, and the gaze position is mapped onto the computer screen. If the user blinks or stares at a certain position for a while, a message is sent to the computer. Experimental results show that the center-of-mass method is robust against glint effects, with detection errors of 7.1% and 4.85% in the vertical and horizontal directions, respectively. Fine adjustment of the mouse position takes an additional 0.8 s. Blinking is detected successfully 98% of the time, and clicking 94% of the time.
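The center-of-mass step can be sketched directly: threshold the image so only dark pupil pixels remain, then average their coordinates. Because a corneal glint is a small bright spot, it falls out of the dark-pixel mask, which is why the method is robust to glints. The threshold and synthetic image below are illustrative, not the paper's:

```python
import numpy as np

def pupil_center_of_mass(gray, threshold=60):
    """Estimate the pupil center as the center of mass of dark pixels.
    Glints are small bright spots, so they drop out of the mask."""
    ys, xs = np.nonzero(gray < threshold)
    if len(xs) == 0:
        return None
    return xs.mean(), ys.mean()

# Synthetic eye image: bright background, dark pupil disc at (40, 25)
img = np.full((60, 80), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:60, 0:80]
img[(xx - 40) ** 2 + (yy - 25) ** 2 <= 10 ** 2] = 20   # pupil disc
img[24, 39] = 255                                       # specular glint
cx, cy = pupil_center_of_mass(img)
print(round(cx, 1), round(cy, 1))  # 40.0 25.0
```

Removing the single glint pixel shifts the centroid by only a fraction of a pixel, illustrating the robustness the abstract reports.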

A Study on Eye Detection by Using Adaboost for Iris Recognition in Mobile Environments (Adaboost를 이용한 모바일 환경에서의 홍채인식을 위한 눈 검출에 관한 연구)

  • Park, Kang-Ryoung;Park, Sung-Hyo;Cho, Dal-Ho
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.45 no.4
    • /
    • pp.1-11
    • /
    • 2008
  • In this paper, we propose a new eye-detection method based on Adaboost (adaptive boosting). To reduce the false-alarm rate, i.e. the tendency of the conventional Adaboost method to identify non-eye regions as genuine eyes, we also propose post-processing methods that use the corneal specular reflection and an optimized aspect ratio for the eye-detection box. Based on the eye region detected by Adaboost, we apply a double circular edge detector to localize the pupil and iris regions at the same time. Experimental results showed that the accuracy of eye detection was about 98% and the processing time was less than 1 second on a mobile device.
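The corneal-specular-reflection post-check can be sketched as a simple verification test on a candidate eye box: accept only boxes that contain a tiny bright highlight alongside dark iris/pupil pixels. The thresholds and fractions below are illustrative guesses, not the paper's values:

```python
import numpy as np

def has_corneal_glint(gray_box, bright=240, dark=80, max_glint_frac=0.02):
    """Heuristic eye verification: a genuine eye lit by an illuminator
    shows a tiny specular highlight surrounded by dark iris/pupil
    pixels; skin patches show neither."""
    box = np.asarray(gray_box, dtype=np.uint8)
    n = box.size
    n_bright = int((box >= bright).sum())   # glint candidate pixels
    n_dark = int((box <= dark).sum())       # iris/pupil pixels
    return 0 < n_bright <= max_glint_frac * n and n_dark > 0.05 * n

# Candidate with a dark pupil and one glint pixel -> accepted
eye = np.full((20, 20), 120, dtype=np.uint8)
eye[6:14, 6:14] = 30       # dark pupil patch
eye[10, 10] = 255          # specular reflection
print(has_corneal_glint(eye))                                     # True
# Flat skin-like patch with no highlight -> rejected
print(has_corneal_glint(np.full((20, 20), 120, dtype=np.uint8)))  # False
```

A check of this form runs in microseconds per candidate, which is consistent with the sub-second mobile processing time reported.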

Real-Time Eye Detection and Tracking Under Various Light Conditions (다양한 조명하에서 실시간 눈 검출 및 추적)

  • 박호식;박동희;남기환;한준희;나상동;배철수
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2003.10a
    • /
    • pp.227-232
    • /
    • 2003
  • Non-intrusive methods based on active remote IR illumination for eye tracking are important for many applications of vision-based man-machine interaction. One problem that has plagued these methods is their sensitivity to changes in lighting conditions, which tends to significantly limit their scope of application. In this paper, we present a new real-time eye-detection and -tracking methodology that works under variable and realistic lighting conditions. Based on combining the bright-pupil effect produced by IR light with a conventional appearance-based object-recognition technique, our method can robustly track eyes even when the pupils are not very bright due to significant external illumination interference. The appearance model is incorporated in both eye detection and tracking via a support vector machine and mean-shift tracking. Additional improvement is achieved by modifying the image-acquisition apparatus, including the illuminator and the camera.
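The bright-pupil effect this method builds on is commonly exploited by frame differencing: subtracting an off-axis (dark-pupil) IR frame from an on-axis (bright-pupil) frame leaves the pupils as the dominant bright blobs. A synthetic sketch of that first stage; the paper's actual pipeline adds an SVM appearance check and mean-shift tracking, which are not shown:

```python
import numpy as np

def pupil_candidates(bright_frame, dark_frame, thresh=60):
    """Bright-pupil differencing: only the pupils change strongly
    between the on-axis and off-axis IR frames, so thresholding the
    difference yields pupil-candidate pixels."""
    diff = bright_frame.astype(np.int16) - dark_frame.astype(np.int16)
    return diff > thresh

# Synthetic pair: identical scenes except the pupil glows on-axis
dark = np.full((40, 60), 100, dtype=np.uint8)
bright = dark.copy()
bright[18:22, 28:32] = 220          # bright-pupil response
mask = pupil_candidates(bright, dark)
print(int(mask.sum()))              # 16 candidate pixels
```

When external illumination washes out the bright-pupil response, this difference image weakens, which is exactly why the paper adds the appearance-based verification stage.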

A Novel Eyelashes Removal Method for Improving Iris Data Preservation Rate (홍채영역에서의 홍채정보 보존율 향상을 위한 새로운 속눈썹 제거 방법)

  • Kim, Seong-Hoon;Han, Gi-Tae
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.3 no.10
    • /
    • pp.429-440
    • /
    • 2014
  • Iris recognition is a biometric technology that extracts and encodes a unique iris feature from a human eye image and compares it with the various irises stored in the system. Eyelashes in the iris image are an external factor that affects the iris recognition rate: if they are not removed exactly from the iris area, eyelashes may be recognized as iris features, or iris features as eyelashes. These false recognitions cause a substantial loss of iris information. In this paper, to solve this problem, we remove eyelashes with a Gabor filter, a filter used for frequency-feature analysis, and thereby improve the preservation rate of iris information. With a novel method that extracts various features in the iris area by varying the angle, frequency, and Gaussian parameters of the Gabor filter, we can accurately remove eyelashes of various lengths and shapes. As a result, the proposed method improves the iris preservation rate by about 4% over previous methods based on GMM or histogram analysis.
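The three knobs the abstract names (angle, frequency, and Gaussian parameter) are exactly the parameters of a Gabor kernel: a sinusoid at some orientation and wavelength, windowed by a Gaussian. A minimal real-valued kernel sketch; the paper's actual parameter choices for eyelashes of different lengths and shapes are not reproduced:

```python
import numpy as np

def gabor_kernel(size=21, sigma=3.0, theta=0.0, wavelength=6.0):
    """Real Gabor kernel: a cosine wave at angle `theta` with the
    given wavelength, windowed by a Gaussian of width `sigma`.
    Tuning theta/wavelength to the orientation and thickness of
    eyelashes is the idea sketched here."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotate coordinates
    g = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return g * np.cos(2 * np.pi * xr / wavelength)

k = gabor_kernel()
print(k.shape, round(float(k[10, 10]), 2))  # (21, 21) 1.0
```

Convolving the iris image with a bank of such kernels at several angles gives strong responses along eyelash-like oriented structures, which can then be masked out of the iris code.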

Far Distance Face Detection from The Interest Areas Expansion based on User Eye-tracking Information (시선 응시 점 기반의 관심영역 확장을 통한 원 거리 얼굴 검출)

  • Park, Heesun;Hong, Jangpyo;Kim, Sangyeol;Jang, Young-Min;Kim, Cheol-Su;Lee, Minho
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.49 no.9
    • /
    • pp.113-127
    • /
    • 2012
  • Many face-detection methods based on image processing have been proposed. The most widely used is the Adaboost method proposed by Viola and Jones, which uses Haar-like features for image learning, so its detection performance depends on the learned images. It performs well for faces within a certain distance range, but when the subject is far from the camera the face becomes so small that it may not be detected with the pre-learned Haar-like features. In this paper, we propose a far-distance face-detection method that combines the Viola-Jones Adaboost with a saliency map and the user's attention information. The saliency map is used to select candidate face regions in the input image, and faces are finally detected among those candidate regions using the Adaboost detector with Haar-like features learned in advance. The user's eye-tracking information is used to select the regions of interest. When a subject is so far from the camera that the face is hard to detect, we expand the small eye-gaze spot region using linear interpolation and reuse it as the input image, which increases face-detection performance. We confirmed that the proposed model gives better results than the conventional Adaboost in terms of both face-detection performance and computation time.
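The region-expansion step can be sketched as cropping the gaze-spot region and upscaling it by bilinear interpolation so that a small, far-away face becomes large enough for the pre-trained detector. A pure-NumPy bilinear resize under that reading; the ROI, scale factor, and toy image are illustrative, and frameworks provide equivalents (e.g. cv2.resize):

```python
import numpy as np

def expand_roi(image, top, left, h, w, scale=4):
    """Crop an h x w region at (top, left) and upscale it `scale`
    times with bilinear interpolation."""
    roi = image[top:top + h, left:left + w].astype(float)
    H, W = h * scale, w * scale
    ys = np.linspace(0, h - 1, H)
    xs = np.linspace(0, w - 1, W)
    y0 = np.floor(ys).astype(int); x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    a = roi[np.ix_(y0, x0)]; b = roi[np.ix_(y0, x1)]
    c = roi[np.ix_(y1, x0)]; d = roi[np.ix_(y1, x1)]
    return (a * (1 - wy) * (1 - wx) + b * (1 - wy) * wx
            + c * wy * (1 - wx) + d * wy * wx)

img = np.arange(100, dtype=np.uint8).reshape(10, 10)
big = expand_roi(img, 2, 2, 4, 4, scale=4)
print(big.shape)   # (16, 16)
```

The enlarged region is then fed back through the same Adaboost detector, so no retraining of the Haar-like features is needed.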