• Title/Summary/Keyword: eye gaze tracking


Robust Eye Region Discrimination and Eye Tracking to the Environmental Changes (환경변화에 강인한 눈 영역 분리 및 안구 추적에 관한 연구)

  • Kim, Byoung-Kyun;Lee, Wang-Heon
    • Journal of the Korea Institute of Information and Communication Engineering / v.18 no.5 / pp.1171-1176 / 2014
  • Eye tracking (ET) is used in human-computer interaction (HCI) to analyze movement status and to determine the gaze direction by tracking the pupil's movement on a human face. ET is now widely used not only in market analysis, which takes advantage of pupil tracking, but also in intention recognition, and it has been studied extensively. Although vision-based ET is convenient from an application point of view, it is not robust to environmental changes such as illumination, geometric rotation, occlusion, and scale. This paper proposes a two-step ET procedure: first, the face and eye regions are discriminated by a Haar classifier, and then the pupils within the discriminated eye regions are tracked by CAMShift and template matching. The usefulness of the proposed algorithm is demonstrated through extensive real experiments under changing illumination, rotation, and scale.
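The abstract above names a standard OpenCV pipeline (Haar-cascade detection followed by CAMShift tracking). The following is a minimal sketch of that general approach, not the authors' code; the cascade files, the use of inverted intensity as the CAMShift back-projection, and all parameters are assumptions for illustration.

```python
import cv2

# Step 1: Haar cascades for face and eye discrimination (OpenCV's bundled models)
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_eye_region(gray):
    """Return the bounding box (x, y, w, h) of the first eye found inside a detected face, or None."""
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        eyes = eye_cascade.detectMultiScale(gray[fy:fy + fh, fx:fx + fw])
        if len(eyes) > 0:
            ex, ey, ew, eh = eyes[0]
            return (int(fx + ex), int(fy + ey), int(ew), int(eh))
    return None

cap = cv2.VideoCapture(0)
track_window = None
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if track_window is None:
        track_window = detect_eye_region(gray)           # step 1: Haar discrimination
        continue
    # Step 2: CAMShift; inverted intensity is a crude stand-in for a pupil probability map
    _, track_window = cv2.CamShift(cv2.bitwise_not(gray), track_window, term_crit)
    x, y, w, h = track_window
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("pupil tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:                       # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```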

Wearable Robot System Enabling Gaze Tracking and 3D Position Acquisition for Assisting a Disabled Person with Disabled Limbs (시선위치 추적기법 및 3차원 위치정보 획득이 가능한 사지장애인 보조용 웨어러블 로봇 시스템)

  • Seo, Hyoung Kyu;Kim, Jun Cheol;Jung, Jin Hyung;Kim, Dong Hwan
    • Transactions of the Korean Society of Mechanical Engineers A / v.37 no.10 / pp.1219-1227 / 2013
  • A new type of wearable robot is developed for a person with disabled limbs, that is, a person who cannot intentionally move his/her legs and arms. This robot enables the disabled person to grip an object using eye movements. A gaze-tracking algorithm is employed to detect the pupil movements with which the person looks at the object to be gripped. Using this 2D gaze-tracking information, the object is identified and the distance to it is measured with a Kinect device installed on the robot's shoulder. Through several coordinate transformations and a matching scheme, the 3D position of the object with respect to the base frame is identified, and the final position data are transmitted to the DSP-controlled robot controller, which enables the target object to be gripped successfully.
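The pipeline above hinges on back-projecting a gaze-selected pixel with a Kinect depth value and transforming it into the robot base frame. The sketch below illustrates that step only; the pinhole intrinsics and the camera-to-base transform are placeholder values, not the paper's calibration.

```python
import numpy as np

# Assumed pinhole intrinsics of the depth camera (fx, fy, cx, cy) -- placeholder values
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5

def pixel_to_camera(u, v, depth_m):
    """Back-project a gaze-selected pixel (u, v) with its depth (metres) into the camera frame."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m, 1.0])                 # homogeneous coordinates

# Placeholder 4x4 homogeneous transform from the shoulder-mounted camera frame to the robot base frame
T_base_cam = np.array([[1.0, 0.0, 0.0, 0.10],
                       [0.0, 1.0, 0.0, 0.25],
                       [0.0, 0.0, 1.0, 0.40],
                       [0.0, 0.0, 0.0, 1.00]])

def object_position_in_base(u, v, depth_m):
    """3D target position in the base frame, i.e. the data sent to the gripper controller."""
    return (T_base_cam @ pixel_to_camera(u, v, depth_m))[:3]

print(object_position_in_base(400, 260, 0.85))            # e.g. pixel (400, 260) at 0.85 m
```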

A Study on the Visual Attention of Sexual Appeal Advertising Image Utilizing Eye Tracking (아이트래킹을 활용한 성적소구광고 이미지의 시각적 주의에 관한 연구)

  • Hwang, Mi-Kyung;Kwon, Mahn-Woo;Lee, Sang-Ho;Kim, Chee-Yong
    • Journal of the Korea Convergence Society / v.11 no.10 / pp.207-212 / 2020
  • This study analyzes soju (Korean alcohol) advertisement images, which are relatively easy to interpret subjectively, among the sexual appeal advertisements that stimulate consumers' curiosity. The images are examined through three AOIs (areas of interest: face, body, and product) and eye tracking, one of the psychophysiological indicators. The analysis reveals that visual attention, i.e. interest in the advertising model, was higher on the face than on the body. Contrary to the prediction that men would be more interested in body shape than women, both men and women showed higher interest in the face than in the body. In addition, recognition and recall of the product were found to be insignificant. This study is significant in examining patterns of visual attention, such as the gaze points and gaze times of male and female consumers, on sexual appeal advertisements. Furthermore, the study is expected to have a positive influence on soju advertisement images by presenting the expression method such images should pursue as well as an appropriate marketing direction.
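AOI-based analyses like the one above typically reduce to classifying each fixation into an AOI and accumulating gaze time per AOI. The sketch below shows that bookkeeping with made-up AOI rectangles and fixation data; it is illustrative only and not taken from the study.

```python
# Hypothetical AOI rectangles (x, y, w, h) in screen pixels for the three areas used in the study
AOIS = {
    "face":    (100, 50, 200, 180),
    "body":    (80, 230, 240, 400),
    "product": (400, 300, 150, 220),
}

def aoi_of(x, y):
    """Return the name of the AOI containing the point (x, y), or None."""
    for name, (ax, ay, aw, ah) in AOIS.items():
        if ax <= x < ax + aw and ay <= y < ay + ah:
            return name
    return None

# Example fixation records: (x, y, duration_ms) -- synthetic data
fixations = [(150, 120, 310), (200, 400, 180), (450, 380, 240), (160, 90, 275)]

dwell = {name: 0 for name in AOIS}
for x, y, dur in fixations:
    hit = aoi_of(x, y)
    if hit is not None:
        dwell[hit] += dur

print(dwell)   # total gaze time (ms) per AOI
```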

A Study on Visual Attention According to Color and Form -Focusing on Eye Tracking Experiment- (색상(Color)과 형태(Form)에 따른 시각적 주의에 관한 연구 -아이트래킹 실험을 중심으로-)

  • Hwang, Mi-Kyung;Kwon, Mahn-Woo;Park, Min-Hee
    • The Journal of the Korea Contents Association / v.19 no.4 / pp.102-110 / 2019
  • Among visual sensibility studies, many have examined color or movement, but few have examined whether static form itself can evoke sensibility. In this study, therefore, visual attention to AOIs (areas of interest) combining color with three basic forms was analyzed using eye tracking, and the results were visualized through heat maps and gaze plots. In addition, a paired t-test was performed on the mean difference between the two groups to verify the statistical significance of each color and form. The experiment showed that chromatic forms attracted more visual attention than achromatic forms, and warm-colored forms had a higher mean than cool-colored forms, indicating greater visual attention. This study is meaningful in suggesting a quantitative method for objectively interpreting design elements that are otherwise easily interpreted subjectively. If further studies and quantitative analysis methods that can identify differences in visual attention across various colors and forms are presented, the results can be used to provide guidelines for basic design.
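For reference, the paired t-test mentioned above compares the same subjects under two conditions (e.g. chromatic vs. achromatic forms). A minimal sketch with made-up per-subject dwell times follows; the numbers are not from the study.

```python
import numpy as np
from scipy import stats

# Synthetic per-subject dwell times (ms) under two conditions
chromatic  = np.array([412, 388, 455, 501, 367, 430])
achromatic = np.array([350, 342, 401, 470, 330, 395])

# Paired (dependent-samples) t-test on the mean difference
t, p = stats.ttest_rel(chromatic, achromatic)
print(f"t = {t:.3f}, p = {p:.4f}")
```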

Relationship of Pupil's Size and Gaze Frequency for Neuro Sports Marketing: Focusing on Sigma Analysis (뉴로 스포츠 마케팅을 위한 동공 확장과 주시빈도 간의 관계: 시그마 분석법을 적용하여)

  • Ko, Eui-Suk;Song, Ki-Hyeon;Cho, Soo-Hyun;Kim, Jong-Ha
    • Science of Emotion and Sensibility / v.20 no.3 / pp.39-48 / 2017
  • To verify the effectiveness of marketing in a basketball stadium, this study used eye-tracking technology, one of several neuromarketing techniques, to measure and analyze gaze frequency and interest when the pupil was dilated. To identify the intervals in which pupil size expanded, samples in the top 2.275% of pupil size (2-sigma data) and the top 0.135% (3-sigma data) were extracted. The overall valid data were analyzed by inflection points according to gaze frequency, and the correlation between the overall valid data and the ranges where pupil size increased significantly was examined. The results showed that the overall valid data and the 2-sigma pupil-size data had the highest correlation, 0.805; the 2-sigma and 3-sigma pupil-size data showed a correlation of 0.781; and the overall valid data and the 3-sigma pupil-size data showed a correlation of 0.683. It is therefore concluded that the intervals where pupil size expanded and the intervals with higher gaze frequency in the eye-tracking data were similar, although the correlation between the significantly expanded pupil-size data and the overall valid data decreased.
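The "sigma analysis" described above flags samples beyond the mean plus two or three standard deviations as dilation intervals and then relates them to gaze behavior. The sketch below illustrates that idea on synthetic signals; it is not the authors' analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)
pupil = rng.normal(4.0, 0.3, 1000)                        # synthetic pupil diameters (mm)
gaze_freq = 5 + 0.8 * pupil + rng.normal(0, 0.5, 1000)    # synthetic gaze-frequency signal

mu, sigma = pupil.mean(), pupil.std()
dilated_2s = pupil > mu + 2 * sigma                       # top ~2.275% of a normal distribution
dilated_3s = pupil > mu + 3 * sigma                       # top ~0.135%

# Pearson correlation between pupil size and gaze frequency over the 2-sigma dilation samples
r = np.corrcoef(pupil[dilated_2s], gaze_freq[dilated_2s])[0, 1]
print(f"{dilated_2s.sum()} samples above 2 sigma, {dilated_3s.sum()} above 3 sigma, r = {r:.3f}")
```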

A Study on the Analysis of Gaze Characteristic of The Villa Savoye - Based on the Difference between Architecture Major Group and Non-Architecture Major Group (빌라 사보아의 건축입면 주시특성 연구 - 건축 전공자와 비전공자의 차이를 중심으로)

  • Cho, Hyeong-Kyu
    • Journal of the Korea Academia-Industrial cooperation Society / v.17 no.11 / pp.724-731 / 2016
  • The purpose of this study was to determine what rules or patterns exist when people gaze at the Villa Savoye, probably Le Corbusier's best-known building, using eye-movement tracking techniques. The Villa Savoye was designed around his emblematic "Five Points": pilotis, functional roofs, free floor plans, long horizontal windows, and freely designed facades. This study examined in an objective manner how the image of the Villa Savoye's facade is formed. A total of 56 test subjects were selected and shown an image of the Villa Savoye, and eye-movement tracking tools recorded where they mostly gazed when viewing the building. In addition, an experiment was carried out to test the differences in visual attention between two groups with different levels of architectural knowledge. To analyze the data, gaze frequency was used as the key indicator. The results showed that the subjects paid more attention to the door and window in general than in previous studies, and the architecture-major group gazed at the experimental image more evenly.

Development of Real-Time Vision-based Eye-tracker System for Head Mounted Display (영상정보를 이용한 HMD용 실시간 아이트랙커 시스템)

  • Roh, Eun-Jung;Hong, Jin-Sung;Bang, Hyo-Choong
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.35 no.6 / pp.539-547 / 2007
  • In this paper, the development and testing of a real-time eye-tracker system are discussed. The system tracks a user's gaze point through eye movements by means of vision-based pupil detection. The vision-based method has the advantage of detecting the exact positions of the user's eyes. An infrared camera and an LED are used, respectively, to acquire the user's pupil image and to extract the pupil region, which is hard to extract with software alone, from the obtained image. We develop a pupil-tracking algorithm with a Kalman filter and grab the pupil images using a DSP (digital signal processing) system for real-time image processing. The real-time eye-tracker system tracks the movements of the user's pupils to project the gaze point onto a background image.
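As a point of reference for the Kalman-filter step above, the sketch below smooths raw pupil-centre detections with a constant-velocity Kalman filter using OpenCV; the noise covariances are arbitrary illustrative values, and this is not the paper's DSP implementation.

```python
import numpy as np
import cv2

# Constant-velocity model: state [x, y, vx, vy], measurement [x, y]
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3      # assumed process noise
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1  # assumed measurement noise

def track(measured_centres):
    """Feed raw pupil-centre detections (x, y) and return the filtered trajectory."""
    filtered = []
    for x, y in measured_centres:
        kf.predict()
        est = kf.correct(np.array([[x], [y]], dtype=np.float32))
        filtered.append((float(est[0, 0]), float(est[1, 0])))
    return filtered

print(track([(100, 120), (102, 121), (105, 124), (103, 122)]))
```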

Real Time Gaze Discrimination for Human Computer Interaction (휴먼 컴퓨터 인터페이스를 위한 실시간 시선 식별)

  • Park Ho sik;Bae Cheol soo
    • The Journal of Korean Institute of Communications and Information Sciences / v.30 no.3C / pp.125-132 / 2005
  • This paper describes a computer vision system based on active IR illumination for real-time gaze discrimination system. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.
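For context, a GRNN of the kind cited above is essentially Nadaraya-Watson kernel regression: every calibration sample contributes to the prediction with a Gaussian weight. The sketch below shows such a mapping from pupil parameters to screen coordinates with hypothetical calibration data; the feature choice and the smoothing parameter are assumptions, not the paper's configuration.

```python
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma=0.5):
    """GRNN (kernel-regression) prediction of screen coordinates for a pupil-parameter vector x."""
    d2 = np.sum((X_train - x) ** 2, axis=1)      # squared distances to all calibration samples
    w = np.exp(-d2 / (2.0 * sigma ** 2))         # Gaussian kernel weights
    return (w[:, None] * Y_train).sum(axis=0) / w.sum()

# Hypothetical calibration set: pupil parameters (e.g. pupil-glint vector, ellipse ratio) -> screen (x, y)
X_train = np.array([[0.10, 0.20, 0.9],
                    [0.40, 0.25, 0.8],
                    [0.70, 0.30, 0.7]])
Y_train = np.array([[100, 300],
                    [640, 320],
                    [1180, 350]])

print(grnn_predict(X_train, Y_train, np.array([0.38, 0.24, 0.8])))
```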

Real Time Gaze Discrimination for Computer Interface (컴퓨터 인터페이스를 위한 실시간 시선 식별)

  • Hwang, Suen-Ki;Kim, Moon-Hwan
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.3 no.1 / pp.38-46 / 2010
  • This paper describes a computer vision system based on active IR illumination for real-time gaze discrimination system. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.


A Study on Real Time Gaze Discrimination System using GRNN (GRNN을 이용한 실시간 시선 식별 시스템에 관한 연구)

  • Lee Young-Sik;Bae Cheol-Soo
    • Journal of the Korea Institute of Information and Communication Engineering / v.9 no.2 / pp.322-329 / 2005
  • This paper describes a computer vision system based on active IR illumination for real-time gaze discrimination system. Unlike most of the existing gaze discrimination techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze discrimination system can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using generalized regression neural networks (GRNNs). With GRNNs, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. To further improve the gaze estimation accuracy, we employ a reclassification scheme that deals with the classes that tend to be misclassified. This leads to a 10% improvement in classification error. The angular gaze accuracy is about 5° horizontally and 8° vertically. The effectiveness of our gaze tracker is demonstrated by experiments that involve gaze-contingent interactive graphic display.