• Title/Summary/Keyword: pupil detect


A Study on Eyelid and Eyelash Localization for Iris Recognition (홍채 인식에서의 눈꺼풀 및 눈썹 추출 연구)

  • Kang, Byung-Joon;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society / v.8 no.7 / pp.898-905 / 2005
  • Iris recognition identifies a user based on the unique iris muscle pattern, which has the function of dilating and contracting the pupil. Because iris recognition is reported to be more accurate than other biometrics such as face, fingerprint, vein, and speaker recognition, it is widely used in high-security application domains. However, if unnecessary information such as the eyelid and eyelashes is included in the iris region, the iris recognition error increases: when an iris code is generated from a region containing eyelid or eyelashes, the code changes and the error rate rises. To overcome this problem, we propose a method of detecting the eyelid using a pyramid-searching parabolic deformable template. In addition, we detect the eyelashes using an eyelash mask. Experimental results show that the EER (Equal Error Rate) of iris recognition with the proposed algorithm is reduced by as much as 0.3% compared to that without it.

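
The eyelid-detection idea in the abstract above, scoring parabola candidates against image edges and searching over the parameter space, can be sketched as follows. This is a minimal single-level illustration, not the authors' implementation: the function names, the grid search, and the edge-image input are assumptions (the paper's pyramid search would repeat the same scoring at successively finer resolutions).

```python
import numpy as np

def parabola_score(grad, a, h, k):
    """Sum edge (gradient) strength along the parabola y = a*(x-h)^2 + k."""
    H, W = grad.shape
    xs = np.arange(W)
    ys = np.round(a * (xs - h) ** 2 + k).astype(int)
    valid = (ys >= 0) & (ys < H)      # keep only curve points inside the image
    return grad[ys[valid], xs[valid]].sum()

def fit_eyelid_parabola(grad, a_vals, h_vals, k_vals):
    """Pick the parabola whose curve best matches the image edges.
    One coarse grid-search level is shown; a pyramid search would
    refine the grids around the winner at higher resolutions."""
    best_score, best_params = -1.0, None
    for a in a_vals:
        for h in h_vals:
            for k in k_vals:
                s = parabola_score(grad, a, h, k)
                if s > best_score:
                    best_score, best_params = s, (a, h, k)
    return best_params
```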

3D View Controlling by Using Eye Gaze Tracking in First Person Shooting Game (1 인칭 슈팅 게임에서 눈동자 시선 추적에 의한 3차원 화면 조정)

  • Lee, Eui-Chul;Cho, Yong-Joo;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society / v.8 no.10 / pp.1293-1305 / 2005
  • In this paper, we propose a method of controlling the gaze direction of a 3D FPS game character by eye gaze detection from successive images captured by a USB camera attached beneath an HMD. The proposed method is composed of three parts. In the first part, we detect the user's pupil center from the successive input images with a real-time image processing algorithm. In the second part, calibration, the geometric relationship is determined between monitor gaze positions and the detected eye positions gazing at those monitor positions. In the last part, the final gaze position on the HMD monitor is tracked and the 3D view in the game is controlled by the gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands. It can also increase interest and immersion by synchronizing the gaze direction of the game player with that of the game character.

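
The calibration part described above amounts to fitting a mapping from detected pupil positions to monitor coordinates from a few known gaze targets. A minimal sketch using a least-squares affine map; the paper may use a different mapping model, and the function names and calibration-point layout here are assumptions:

```python
import numpy as np

def fit_gaze_mapping(pupil_pts, screen_pts):
    """Least-squares affine map from pupil coordinates to screen
    coordinates, fitted from calibration pairs (the user gazes at
    known monitor positions while the pupil center is detected)."""
    P = np.hstack([np.asarray(pupil_pts, float),
                   np.ones((len(pupil_pts), 1))])
    M, *_ = np.linalg.lstsq(P, np.asarray(screen_pts, float), rcond=None)
    return M  # 3x2 affine matrix

def gaze_to_screen(M, pupil_xy):
    """Map one detected pupil position to a screen coordinate."""
    x, y = pupil_xy
    return np.array([x, y, 1.0]) @ M
```

With four corner calibration points the affine fit is exact; more points make the fit robust to detection noise.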

Wearable Robot System Enabling Gaze Tracking and 3D Position Acquisition for Assisting a Disabled Person with Disabled Limbs (시선위치 추적기법 및 3차원 위치정보 획득이 가능한 사지장애인 보조용 웨어러블 로봇 시스템)

  • Seo, Hyoung Kyu;Kim, Jun Cheol;Jung, Jin Hyung;Kim, Dong Hwan
    • Transactions of the Korean Society of Mechanical Engineers A / v.37 no.10 / pp.1219-1227 / 2013
  • A new type of wearable robot is developed for a person with disabled limbs, that is, a person who cannot intentionally move his/her legs and arms. This robot enables the disabled person to grip an object using eye movements. A gaze tracking algorithm is employed to detect the pupil movements with which the person observes the object to be gripped. Using this 2D gaze-tracking information, the object is identified and the distance to it is measured with a Kinect device installed on the robot's shoulder. Through several coordinate transformations and a matching scheme, the final 3D information about the object in the base frame is clearly identified, and the final position data are transmitted to the DSP-based robot controller, which enables the target object to be gripped successfully.
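
The "several coordinate transformations" step, expressing the Kinect's 3D measurement in the robot base frame, is a chain of homogeneous transforms. A minimal sketch; the frame names and the identity/translation values below are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def kinect_to_base(p_kinect, T_base_shoulder, T_shoulder_kinect):
    """Express a point measured in the Kinect frame in the robot base
    frame by chaining the shoulder-mount and base transforms."""
    p = np.append(np.asarray(p_kinect, float), 1.0)   # homogeneous coords
    return (T_base_shoulder @ T_shoulder_kinect @ p)[:3]
```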

Active Facial Tracking for Fatigue Detection (피로 검출을 위한 능동적 얼굴 추적)

  • Kim, Tae-Woo;Kang, Yong-Seok
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.2 no.3 / pp.53-60 / 2009
  • Vision-based driver fatigue detection is one of the most promising commercial applications of facial expression recognition technology, and facial feature tracking is its primary technical issue. Current facial tracking technology faces three challenges: (1) detection failure of some or all features under varying lighting conditions and head motions; (2) multiple and non-rigid object tracking; and (3) feature occlusion when the head is at oblique angles. In this paper, we propose a new active approach. First, an active IR sensor is used to robustly detect pupils under variable lighting conditions. The detected pupils are then used to predict the head motion. Furthermore, face movement is assumed to be locally smooth so that each facial feature can be tracked with a Kalman filter. The simultaneous use of the pupil constraint and Kalman filtering greatly increases the prediction accuracy for each feature position. Feature detection is accomplished in the Gabor space in the vicinity of the predicted location. Local graphs consisting of the identified features are extracted and used to capture the spatial relationships among the detected features. Finally, a graph-based reliability propagation is proposed to tackle the occlusion problem and verify the tracking results. The experimental results show the validity of our active approach to real-life facial tracking under variable lighting conditions, head orientations, and facial expressions.

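
The locally-smooth-motion assumption above is what makes a Kalman filter suitable for per-feature tracking. A minimal constant-velocity sketch for one 2D feature; the state layout, noise values, and class name are assumptions for illustration, not the authors' exact filter:

```python
import numpy as np

class FeatureKalman:
    """Constant-velocity Kalman filter for one facial feature,
    with state [x, y, vx, vy]: predict where the feature will be,
    then correct with the detected position."""
    def __init__(self, x0, y0, q=1e-2, r=1.0):
        self.x = np.array([x0, y0, 0.0, 0.0])
        self.P = np.eye(4)
        self.F = np.eye(4)                    # transition: pos += vel
        self.F[0, 2] = self.F[1, 3] = 1.0
        self.H = np.eye(2, 4)                 # we observe position only
        self.Q = q * np.eye(4)                # process noise
        self.R = r * np.eye(2)                # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                     # predicted feature position

    def update(self, z):
        y = np.asarray(z, float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

In the papers' pipeline the predicted position would seed the Gabor-space search for each feature, and the detected position would feed back through `update`.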

Active Facial Tracking for Fatigue Detection (피로 검출을 위한 능동적 얼굴 추적)

  • 박호식;정연숙;손동주;나상동;배철수
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2004.05b / pp.603-607 / 2004
  • Vision-based driver fatigue detection is one of the most promising commercial applications of facial expression recognition technology, and facial feature tracking is its primary technical issue. Current facial tracking technology faces three challenges: (1) detection failure of some or all features under varying lighting conditions and head motions; (2) multiple and non-rigid object tracking; and (3) feature occlusion when the head is at oblique angles. In this paper, we propose a new active approach. First, an active IR sensor is used to robustly detect pupils under variable lighting conditions. The detected pupils are then used to predict the head motion. Furthermore, face movement is assumed to be locally smooth so that each facial feature can be tracked with a Kalman filter. The simultaneous use of the pupil constraint and Kalman filtering greatly increases the prediction accuracy for each feature position. Feature detection is accomplished in the Gabor space in the vicinity of the predicted location. Local graphs consisting of the identified features are extracted and used to capture the spatial relationships among the detected features. Finally, a graph-based reliability propagation is proposed to tackle the occlusion problem and verify the tracking results. The experimental results show the validity of our active approach to real-life facial tracking under variable lighting conditions, head orientations, and facial expressions.


Smartphone Addiction Detection Based Emotion Detection Result Using Random Forest (랜덤 포레스트를 이용한 감정인식 결과를 바탕으로 스마트폰 중독군 검출)

  • Lee, Jin-Kyu;Kang, Hyeon-Woo;Kang, Hang-Bong
    • Journal of IKEEE / v.19 no.2 / pp.237-243 / 2015
  • Recently, eight out of ten people in Korea have a smartphone, and the number of smartphone applications has increased, so smartphone addiction has become a social issue. In particular, many people with smartphone addiction cannot control themselves, and sometimes they do not realize that they are addicted. Many studies, mostly surveys such as the S-measure, have been conducted to diagnose smartphone addiction. In this paper, we suggest how to detect smartphone addiction based on ECG and eye gaze. We measure ECG signals with a Shimmer sensor and eye gaze signals with a Smart Eye tracker while the subjects watch emotional videos. We extract features from the S-transform of the ECG, and 12 features from the eye gaze signals (pupil diameter, gaze distance, eye blinking). A classifier trained with Random Forest then detects smartphone addiction from the ECG and eye gaze signals. We compared the detection results with S-measure survey results collected before the test: the method showed 87.89% accuracy with ECG and 60.25% accuracy with eye gaze.
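
The classification step described above, training a Random Forest on a per-subject feature vector and validating against survey labels, can be sketched with scikit-learn. The synthetic data below is a hypothetical stand-in (the paper's real features come from ECG S-transforms and gaze measurements, and its labels from the S-measure survey); only the 12-feature dimensionality follows the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Hypothetical stand-in: 12 features per subject (pupil diameter,
# gaze distance, blink statistics, ...); class 1 = addiction risk
# group according to the S-measure survey.
X = np.vstack([rng.normal(0.0, 1.0, (100, 12)),
               rng.normal(1.5, 1.0, (100, 12))])
y = np.repeat([0, 1], 100)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[::2], y[::2])              # train on half of the subjects
acc = clf.score(X[1::2], y[1::2])    # evaluate on the held-out half
```

In the paper's setting, `acc` would be compared against the S-measure labels gathered before the experiment.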