• Title/Summary/Keyword: Eye Tracking System

Search Results: 172

Analysis of User's Eye Gaze Distribution while Interacting with a Robotic Character (로봇 캐릭터와의 상호작용에서 사용자의 시선 배분 분석)

  • Jang, Seyun;Cho, Hye-Kyung
    • The Journal of Korea Robotics Society
    • /
    • v.14 no.1
    • /
    • pp.74-79
    • /
    • 2019
  • In this paper, we develop a virtual experimental environment to investigate users' eye gaze in human-robot social interaction and verify its potential for further studies. The system consists of a 3D robot character capable of hosting simple interactions with a user, and a gaze processing module that records which body part of the robot character (such as the eyes, mouth, or arms) the user is looking at, regardless of whether the robot is stationary or moving. To verify that the results acquired in this virtual environment align with those of physically existing robots, we performed robot-guided quiz sessions with 120 participants and compared their gaze patterns with those reported in previous works. The results were as follows. First, when interacting with the robot character, the user's gaze pattern showed statistics similar to those of conversation between humans. Second, an animated mouth of the robot character received longer attention than a stationary one. Third, nonverbal interactions such as leakage cues were also effective in the interaction with the robot character, and the correct-answer ratios of the cued groups were higher. Finally, gender differences in users' gaze were observed, especially in the frequency of mutual gaze.
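
A minimal sketch of the gaze-region logging idea this abstract describes: each gaze sample is classified by the robot body part it falls on, and samples are accumulated per region. The region names and rectangle coordinates below are invented for illustration; for a moving robot character, the rectangles would be updated every frame from the character's pose.

```python
def classify_gaze(point, regions):
    """Return the name of the first region containing the gaze point, else 'background'."""
    x, y = point
    for name, (left, top, right, bottom) in regions.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return "background"

def gaze_distribution(samples, regions):
    """Count gaze samples per region (a simple dwell-time proxy)."""
    counts = {}
    for p in samples:
        name = classify_gaze(p, regions)
        counts[name] = counts.get(name, 0) + 1
    return counts

# Hypothetical screen-space rectangles for the robot character's body parts.
ROBOT_REGIONS = {
    "eyes":  (200, 100, 440, 160),
    "mouth": (260, 200, 380, 250),
    "arms":  (100, 300, 540, 420),
}

samples = [(300, 130), (310, 220), (50, 50), (320, 140)]
print(gaze_distribution(samples, ROBOT_REGIONS))
# {'eyes': 2, 'mouth': 1, 'background': 1}
```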

A Tracking of Head Movement for Stereophonic 3-D Sound (스테레오 입체음향을 위한 머리 움직임 추정)

  • Kim Hyun-Tae;Lee Kwang-Eui;Park Jang-Sik
    • Journal of Korea Multimedia Society
    • /
    • v.8 no.11
    • /
    • pp.1421-1431
    • /
    • 2005
  • There are two methods of 3-D sound reproduction: a surround system, such as the 3.1-channel method, and a binaural system using two channels. The binaural system exploits the human sound localization principle based on two ears. Generally, the crosstalk between the channels of a 2-channel loudspeaker system should be canceled to produce natural 3-D sound, and to solve this problem it is necessary to trace the listener's head movement. In this paper, we propose a new algorithm to trace the head movement of a listener accurately. The proposed algorithm is based on the detection of the face and eyes: face detection uses image intensity, and the position of the eyes is detected by mathematical morphology. When the listener's head moves, the length of the borderline between the face region and the eyes changes, and we use this information to track the head movement. Computer simulation results show that head movement is effectively estimated within a ±10 margin of error using the proposed algorithm.


Mapping Studies on Visual Search, Eye Movement, and Eye track by Bibliometric Analysis

  • Rhie, Ye Lim;Lim, Ji Hyoun;Yun, Myung Hwan
    • Journal of the Ergonomics Society of Korea
    • /
    • v.34 no.5
    • /
    • pp.377-399
    • /
    • 2015
  • Objective: The aim of this study is to understand and identify the critical issues in the vision research area using content analysis and network analysis. Background: Vision, the most influential factor in information processing, has been studied across a wide range of areas. As studies on vision are dispersed across a broad area of research and the number of published studies is ever increasing, a bibliometric analysis of the literature would assist researchers in understanding and identifying critical issues in their research. Method: In this study, content and network analysis were applied to the metadata of literature collected using three search keywords: 'visual search', 'eye movement', and 'eye tracking'. Results: Content analysis focuses on extracting meaningful information from text, deducing seven categories of research area: 'stimuli and task', 'condition', 'measures', 'participants', 'eye movement behavior', 'biological system', and 'cognitive process'. Network analysis extracts the relational aspects of research areas, presenting the characteristics of sub-groups identified by a community detection algorithm. Conclusion: Using these methods, studies on vision were quantitatively analyzed, and the results helped clarify the overall relations between concepts and keywords. Application: The results of this study suggest that the use of content and network analysis helps identify not only the trends of specific research areas but also the relational aspects of each research issue while minimizing researchers' bias. Moreover, the investigated structural relationships would help identify interrelated subjects from a macroscopic view.
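
A minimal sketch of the co-occurrence step that underlies the network analysis this abstract describes: per-paper keyword lists are turned into an edge-weighted keyword graph, on top of which a community detection algorithm (as in the paper) would run. The sample records are invented for illustration.

```python
from itertools import combinations
from collections import Counter

def cooccurrence_edges(keyword_lists):
    """Count how often each keyword pair appears together in one record."""
    edges = Counter()
    for keywords in keyword_lists:
        # sorted() gives a canonical (a, b) ordering for each pair
        for a, b in combinations(sorted(set(keywords)), 2):
            edges[(a, b)] += 1
    return edges

# Invented example records standing in for collected paper metadata.
papers = [
    ["eye movement", "visual search", "fixation"],
    ["eye tracking", "visual search"],
    ["eye movement", "visual search"],
]
edges = cooccurrence_edges(papers)
print(edges[("eye movement", "visual search")])  # 2
```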

A Study on Gaze Tracking Based on Pupil Movement, Corneal Specular Reflections and Kalman Filter (동공 움직임, 각막 반사광 및 Kalman Filter 기반 시선 추적에 관한 연구)

  • Park, Kang-Ryoung;Ko, You-Jin;Lee, Eui-Chul
    • The KIPS Transactions: Part B
    • /
    • v.16B no.3
    • /
    • pp.203-214
    • /
    • 2009
  • In this paper, we compute the user's gaze position simply from the 2D relations between the pupil center and four corneal specular reflections formed by four IR illuminators attached to the corners of a monitor, without considering the complex 3D relations among the camera, monitor, and pupil coordinates. The objectives of this paper are therefore to detect the pupil center and the four corneal specular reflections exactly and to compensate for the error factors that affect gaze accuracy. In our method, we compensate for the kappa error between the gaze position calculated through the pupil center and the actual gaze vector, performing a one-time user calibration when the system starts. We also robustly detect the four corneal specular reflections, which are essential for calculating the gaze position, using a Kalman filter, irrespective of abrupt eye movements. Experimental results showed that the gaze detection error was about 1.0 degree even under abrupt eye movements.
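
A sketch of the geometric 2D mapping this abstract describes: since the four glints image the four monitor corners, the pupil center's position inside the glint quadrilateral can be mapped to screen coordinates by inverse bilinear interpolation. The Kalman filtering of glint positions and the kappa/calibration compensation are omitted here, and the numeric values are illustrative only.

```python
import math

def _cross(a, b):
    """2D cross product (z-component)."""
    return a[0] * b[1] - a[1] * b[0]

def invert_bilinear(p, a, b, c, d, eps=1e-9):
    """Map point p inside the quad a-b-c-d (corners at uv (0,0), (1,0),
    (1,1), (0,1)) back to normalized coordinates (u, v)."""
    e = (b[0] - a[0], b[1] - a[1])
    f = (d[0] - a[0], d[1] - a[1])
    g = (a[0] - b[0] + c[0] - d[0], a[1] - b[1] + c[1] - d[1])
    h = (p[0] - a[0], p[1] - a[1])
    k2 = _cross(g, f)
    k1 = _cross(e, f) + _cross(h, g)
    k0 = _cross(h, e)
    if abs(k2) < eps:                        # opposite edges parallel: linear in v
        v = -k0 / k1
    else:
        root = math.sqrt(k1 * k1 - 4.0 * k0 * k2)
        v = (-k1 - root) / (2.0 * k2)
        if not 0.0 <= v <= 1.0:              # choose the root inside the quad
            v = (-k1 + root) / (2.0 * k2)
    denom = e[0] + g[0] * v
    if abs(denom) < eps:                     # degenerate x-direction: use y
        u = (h[1] - f[1] * v) / (e[1] + g[1] * v)
    else:
        u = (h[0] - f[0] * v) / denom
    return u, v

def gaze_on_screen(pupil_center, glints, screen_w, screen_h):
    """glints: the four corneal reflections, ordered to match the monitor
    corners top-left, top-right, bottom-right, bottom-left."""
    u, v = invert_bilinear(pupil_center, *glints)
    return u * screen_w, v * screen_h

# Illustrative numbers only: glints forming a rectangle in the eye image.
glints = [(10.0, 10.0), (30.0, 10.0), (30.0, 20.0), (10.0, 20.0)]
print(gaze_on_screen((20.0, 15.0), glints, 1920, 1080))  # (960.0, 540.0)
```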

Implementation of a Lane Tracking System Using an Autonomous RC Toy Car (자율주행이 가능한 무선 장난감 자동차의 차선 추적 시스템 구현)

  • Ko, Eunsang;Lee, Chang Woo
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.8 no.5
    • /
    • pp.249-254
    • /
    • 2013
  • In this paper, we propose a nonlinear control system for an automatic unmanned vehicle based on an RC (radio-controlled) car, which is usually driven with a remote controller. In the proposed system, the RC car is disassembled and reassembled with several parts enabling it to be controlled by an Android mobile platform over Bluetooth. An Android smartphone is mounted on the RC car and serves as the eye of the car. The proposed system automatically controls the RC car so that it follows a lane drawn on the floor of our laboratory. The RC car can also be controlled manually, using the accelerometer sensor of a smartphone through a Bluetooth module. Our system, which has both manual and automatic modes, consists of several components: a microprocessor unit, a Bluetooth serial interface module, a smartphone, a dual motor controller, and an RC toy car. We are now developing a group driving system in which one car follows the car in front of it, which tracks a lane automatically.
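
A minimal sketch of the automatic mode's steering logic, assuming the smartphone camera yields a lane-center pixel position per frame; the proportional gain, the differential-drive mapping, and all numbers are invented for illustration, not taken from the paper.

```python
def steering_command(lane_center_px, frame_width_px, kp=0.8):
    """Proportional steering: offset of the detected lane center from the
    image center, normalized to [-1, 1], scaled by the gain and clamped."""
    offset = (lane_center_px - frame_width_px / 2) / (frame_width_px / 2)
    return max(-1.0, min(1.0, kp * offset))

def motor_speeds(base_speed, steer):
    """Map the steering value to left/right duty cycles for the dual motor
    controller (differential-drive assumption)."""
    return base_speed * (1.0 + steer), base_speed * (1.0 - steer)

# Lane center detected 80 px right of center in a 320 px wide frame.
steer = steering_command(240, 320)
print(steer)                      # 0.4
print(motor_speeds(100.0, steer)) # left wheel faster -> car turns right
```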

Real-Time Face Tracking Algorithm Robust to Illumination Variations (조명 변화에 강인한 실시간 얼굴 추적 알고리즘)

  • Lee, Yong-Beom;You, Bum-Jae;Lee, Seong-Whan;Kim, Kwang-Bae
    • Proceedings of the KIEE Conference
    • /
    • 2000.07d
    • /
    • pp.3037-3040
    • /
    • 2000
  • Real-time object tracking has emerged as an important component in several application areas, including machine vision, surveillance, human-computer interaction, and image-based control, and various algorithms have been developed over a long period. In many cases, however, they have shown limited results under uncontrolled conditions such as illumination changes or cluttered backgrounds. In this paper, we present a novel, computationally efficient algorithm for tracking a human face robustly under illumination changes and against cluttered backgrounds. Previous algorithms usually define the color model as a 2D membership function in a color space without considering illumination changes. Our new algorithm, in contrast, constructs a 3D color model by analyzing a large set of images acquired under various illumination conditions. The algorithm is applied to a mobile head-eye robot and tested in various uncontrolled environments. It can track a human face at more than 100 frames per second, excluding image acquisition time.


Real-Time Face Detection, Tracking and Tilted Face Image Correction System Using Multi-Color Model and Face Feature (복합 칼라모델과 얼굴 특징자를 이용한 실시간 얼굴 검출 추적과 기울어진 얼굴보정 시스템)

  • Lee Eung-Joo
    • Journal of Korea Multimedia Society
    • /
    • v.9 no.4
    • /
    • pp.470-481
    • /
    • 2006
  • In this paper, we propose a real-time face detection, tracking, and tilted-face correction system using a multi-color model and face feature information. The proposed system detects face candidates using the YCbCr and YIQ color models, detects the face using vertical and horizontal projection, and tracks a person's face using the Hausdorff matching method. It also corrects a tilted face by correcting the tilted eye features. Experiments performed on 110 test images show good performance: the proposed algorithm is robust in detecting and tracking faces in real time under changing external conditions and in recognizing tilted faces. The face detection and tilted-face correction rates were 92.27% and 92.70%, respectively, and the proposed algorithm achieved a 90.0% recognition success rate.

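
A sketch of the YCbCr face-candidate step described in the abstract above, kept in plain Python so it is self-contained. The BT.601 conversion is standard, but the Cb/Cr skin-tone thresholds are commonly cited ranges, not necessarily the paper's exact values (which also combine a YIQ test).

```python
def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 full-range RGB -> YCbCr conversion (inputs 0..255)."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    """Face-candidate test: Cb/Cr inside a typical skin-tone box."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return 77 <= cb <= 127 and 133 <= cr <= 173

print(is_skin(220, 170, 140))  # True  (a typical skin tone)
print(is_skin(30, 90, 200))    # False (sky blue)
```

A face-candidate mask would apply `is_skin` per pixel, after which projection profiles (as in the paper) locate the face box.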

Wearable Robot System Enabling Gaze Tracking and 3D Position Acquisition for Assisting a Disabled Person with Disabled Limbs (시선위치 추적기법 및 3차원 위치정보 획득이 가능한 사지장애인 보조용 웨어러블 로봇 시스템)

  • Seo, Hyoung Kyu;Kim, Jun Cheol;Jung, Jin Hyung;Kim, Dong Hwan
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.37 no.10
    • /
    • pp.1219-1227
    • /
    • 2013
  • A new type of wearable robot is developed for a person with disabled limbs, that is, a person who cannot intentionally move his or her legs and arms. The robot enables the person to grip an object using eye movements. A gaze tracking algorithm detects the pupil movements with which the person observes the object to be gripped. Using this 2D gaze information, the object is identified, and the distance to it is measured with a Kinect device installed on the robot's shoulder. Through several coordinate transformations and a matching scheme, the 3D position of the object with respect to the base frame is clearly identified, and the final position data is transmitted to the DSP-based robot controller, which enables the target object to be gripped successfully.
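
A sketch of the coordinate chain this abstract describes: a 3D point measured in the Kinect (camera) frame is re-expressed in the robot base frame via a homogeneous transform. The shoulder-mount pose below is an invented example, not the paper's calibration.

```python
def mat_vec(T, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    x, y, z = p
    v = (x, y, z, 1.0)
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))

# Hypothetical base<-camera transform: camera mounted 0.3 m above and
# 0.1 m forward of the base, rotated 90 degrees about the z-axis.
T_base_cam = [
    [0.0, -1.0, 0.0, 0.1],
    [1.0,  0.0, 0.0, 0.0],
    [0.0,  0.0, 1.0, 0.3],
    [0.0,  0.0, 0.0, 1.0],
]

object_in_cam = (0.2, 0.0, 0.5)   # 3D point from the Kinect depth data
print(mat_vec(T_base_cam, object_in_cam))
```

Chaining more links (shoulder, arm joints) is just more 4x4 multiplications before the gripper target is sent to the controller.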

Development of a computer mouse by tracking head movements and eyeblink (머리 움직임과 눈 깜박임을 이용한 컴퓨터 마우스 개발)

  • Park, Min-Je;Kang, Shin-Wook;Kim, Soo-Chan
    • Proceedings of the IEEK Conference
    • /
    • 2008.06a
    • /
    • pp.1107-1108
    • /
    • 2008
  • The purpose of this study is to develop a computer mouse that uses head movements and eye blinks to help people with disabilities who cannot move their hands or feet, for example because of a car accident or cerebral apoplexy. The mouse is composed of two gyro sensors and a photo sensor. The gyro sensors detect the horizontal and vertical angular velocities of the head, respectively, and the photo sensor detects eye blinks to perform clicks and double clicks and to reset the head position. With the proposed system, we could control the mouse pointer in real time.

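
A minimal sketch of the head-mouse logic described in the abstract above: the two gyro angular velocities are integrated into pointer motion, and a photo-sensor blink resets the pointer. The sampling period and gain are invented values, not taken from the paper.

```python
class HeadMouse:
    def __init__(self, gain=5.0, dt=0.01):
        self.gain = gain   # pixels per degree of head rotation
        self.dt = dt       # gyro sample period in seconds
        self.x = 0.0
        self.y = 0.0

    def update(self, yaw_rate, pitch_rate):
        """Integrate horizontal/vertical angular velocity (deg/s) into
        pointer displacement."""
        self.x += self.gain * yaw_rate * self.dt
        self.y += self.gain * pitch_rate * self.dt
        return self.x, self.y

    def reset(self):
        """Photo-sensor blink event recenters the pointer, as in the paper."""
        self.x = self.y = 0.0

mouse = HeadMouse()
for _ in range(100):              # 1 s of turning the head at 20 deg/s
    pos = mouse.update(20.0, 0.0)
print(pos)                        # pointer has drifted right, not up/down
```

Click vs. double click would be decided from the timing of successive blink pulses on the photo sensor.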

A Continuous-time Equalizer adopting a Clock Loss Tracking Technique for Digital Display Interface (DDI) (클록 손실 측정 기법을 이용한 DDI용 연속 시간 이퀄라이저)

  • Kim, Kyu-Young;Kim, Gil-Su;Shon, Kwan-Su;Kim, Soo-Won
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.45 no.2
    • /
    • pp.28-33
    • /
    • 2008
  • This paper presents a continuous-time equalizer adopting a clock loss tracking technique for a digital display interface. The technique uses a bottom-hold circuit to detect the loss of the incoming clock. The generated loss signal is fed directly to the equalizer filters, building adaptive feed-forward loops that contribute to the stability of the system. The design was done in a 0.18 µm CMOS technology. Experimental results show that an eye width of at least 0.7 UI is achieved up to -33 dB of channel loss at 1.65 Gbps. The average power consumption of the equalizer is at most 10 mW, a very low value in comparison with previous works, and the effective area is 0.127 mm².