• Title/Summary/Keyword: gaze control

Computer Interface Using Head-Gaze Tracking (응시 위치 추적 기술을 이용한 인터페이스 시스템 개발)

  • 이정준;박강령;김재희
    • Proceedings of the IEEK Conference
    • /
    • 1999.06a
    • /
    • pp.516-519
    • /
    • 1999
  • Gaze detection finds the position on a monitor screen where a user is looking, using image processing and computer vision technology. We developed a computer interface system based on this gaze detection technology, which enables a user to control the computer without using their hands. The system can therefore help the handicapped use a computer, and it is also useful for people whose hands are occupied with another job, especially in factory tasks. For practical use, a command signal analogous to a mouse click is necessary; we used eye winking to send this command signal to the system.

  • PDF
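
The wink-as-click idea in the abstract above can be sketched as a simple temporal filter: a deliberate wink is a closed-eye run longer than a reflexive blink but shorter than simply resting the eyes. The frame encoding and thresholds below are illustrative assumptions, not the paper's implementation.

```python
def detect_wink(eye_open_frames, min_closed=3, max_closed=15):
    """Return frame indices where a deliberate wink ends.

    A wink is a run of 'closed' frames whose length falls between
    min_closed and max_closed; shorter runs are treated as ordinary
    blinks, longer ones as the eye simply being shut.
    """
    events = []
    run = 0
    for i, is_open in enumerate(eye_open_frames):
        if not is_open:
            run += 1
        else:
            if min_closed <= run <= max_closed:
                events.append(i)  # the click fires when the eye reopens
            run = 0
    return events

# 0 = closed, 1 = open: one short blink, then a deliberate wink
frames = [1, 1, 0, 1, 1, 0, 0, 0, 0, 1, 1]
print(detect_wink(frames, min_closed=3))  # -> [9]
```

Debouncing by run length is what separates an intentional command from the roughly 15-20 involuntary blinks per minute a user produces anyway.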

Autonomous Wheelchair System Using Gaze Recognition (시선 인식을 이용한 자율 주행 휠체어 시스템)

  • Kim, Tae-Ui;Lee, Sang-Yoon;Kwon, Kyung-Su;Park, Se-Hyun
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.14 no.4
    • /
    • pp.91-100
    • /
    • 2009
  • In this paper, we propose an autonomous intelligent wheelchair system that recognizes commands through gaze recognition and avoids detected obstacles while driving by sensing distances with range sensors. The user's commands are recognized by a gaze recognizer that uses the centroid of the eye pupil and two reflection points, extracted with a camera fitted with an infrared filter and two infrared LEDs; these are used to control the wheelchair through the user interface. The wheelchair then detects obstacles using 10 ultrasonic sensors and helps avoid collisions with them. The proposed intelligent wheelchair system consists of a gaze recognizer, an autonomous driving module, a sensor control board, and a motor control board. The gaze recognizer interprets the user's commands through the user interface, and the motor control board drives the wheelchair according to the recognized commands. Obstacle information detected by the ultrasonic sensors is transferred to the sensor control board and then to the autonomous driving module, where the obstacles are identified and avoidance commands are generated and passed to the motor control board. The experimental results confirmed that the proposed system can improve the efficiency of obstacle avoidance and provide a convenient user interface.
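
The obstacle-avoidance step described above can be sketched as a rule that maps the ring of ultrasonic readings to a motion command. The sensor layout, thresholds, and command names here are assumptions for illustration; the paper's autonomous driving module is not specified at this level.

```python
def avoidance_command(distances_cm, stop_dist=30, slow_dist=80):
    """Map 10 ultrasonic readings (assumed left-to-right across the
    front of the chair) to a motion command. Thresholds in cm are
    illustrative, not the paper's values."""
    front = distances_cm[3:7]            # the four forward-facing sensors
    left = min(distances_cm[:3])         # leftmost three
    right = min(distances_cm[7:])        # rightmost three
    nearest_front = min(front)
    if nearest_front < stop_dist:
        return "stop"
    if nearest_front < slow_dist:
        # veer toward whichever side has more clearance (ties go left)
        return "veer_right" if right > left else "veer_left"
    return "forward"

print(avoidance_command([90] * 10))  # -> forward
print(avoidance_command([90, 90, 90, 25, 90, 90, 90, 90, 90, 90]))  # -> stop
```

In the paper's architecture this decision would live in the autonomous driving module, with the resulting command sent to the motor control board.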

A Study on the Hangul Input Methodology for Eye-gaze Interface (시선 입력 장치에 의한 한글 입력 시스템 설계에 관한 연구)

  • Seo Han-Sok;Kim Chee-Yong
    • Journal of Digital Contents Society
    • /
    • v.5 no.3
    • /
    • pp.239-244
    • /
    • 2004
  • New developments in IT already affect wide segments of a young and mobile population, and it is evident that applications of information technology can be of equal benefit to the aged and the disabled. 'Eye-Gaze Interface' (EGI) technology was designed for people with paralysis of the upper body, and there is a compelling need for a dedicated Korean-language interface for such systems. The purpose of this study is to research 'barrier-free' software, using a control group of the mobility impaired to assess the Eye-Gaze Interface against more conventional input methods. The EGI in this study uses the Quick Glance system from EyeTech Digital Systems. The study is evaluated on criteria based on the needs of those with specific disabilities and the mobility problems associated with aging. We also intend to explore applications of the Eye-Gaze Interface for English and Japanese devices, based on our study of Hangul phonology.

  • PDF
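
The abstract does not describe its keyboard layout, but any eye-typing system that selects Hangul jamo one at a time must compose them into precomposed syllables, and Unicode defines that composition arithmetically. A minimal sketch of the standard formula:

```python
# Precomposed Hangul syllables are laid out algorithmically in Unicode:
#   code point = 0xAC00 + (lead * 21 + vowel) * 28 + tail
LEADS = 19    # initial consonants (ㄱ, ㄲ, ㄴ, ...)
VOWELS = 21   # medial vowels (ㅏ, ㅐ, ...)
TAILS = 28    # final consonants, index 0 meaning "no final"

def compose(lead, vowel, tail=0):
    """Combine jamo indices into one precomposed Hangul syllable."""
    assert 0 <= lead < LEADS and 0 <= vowel < VOWELS and 0 <= tail < TAILS
    return chr(0xAC00 + (lead * VOWELS + vowel) * TAILS + tail)

print(compose(18, 0, 4))  # lead ㅎ, vowel ㅏ, tail ㄴ -> '한'
print(compose(0, 0))      # lead ㄱ, vowel ㅏ, no tail -> '가'
```

Because composition is deterministic, an eye-gaze keyboard only needs to resolve which jamo the user fixated; syllable formation costs no extra dwell time.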

A Human-Robot Interface Using Eye-Gaze Tracking System for People with Motor Disabilities

  • Kim, Do-Hyoung;Kim, Jae-Hean;Yoo, Dong-Hyun;Lee, Young-Jin;Chung, Myung-Jin
    • Transactions on Control, Automation and Systems Engineering
    • /
    • v.3 no.4
    • /
    • pp.229-235
    • /
    • 2001
  • Recently, the service sector has become an emerging field of robotic applications. Although assistant robots play an important role for the disabled and the elderly, users still have difficulty operating them with conventional interface devices such as joysticks or keyboards. In this paper we propose an efficient computer interface using a real-time eye-gaze tracking system. The inputs to the proposed system are images taken by a camera and data from a magnetic sensor. The measured data are sufficient to describe the eye and head movement because the camera and the receiver of the magnetic sensor are stationary with respect to the head. The proposed system can therefore obtain the eye-gaze direction despite head movement, as long as the distance between the system and the transmitter of the magnetic position sensor is within 2 m. Experimental results show the validity of the proposed system in practical terms and also verify its feasibility as a new computer interface for the disabled.

  • PDF
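
The key idea above — that a head-mounted camera measures the eye in the head frame, so rotating that measurement by the head pose from the magnetic sensor yields a head-movement-invariant gaze direction — can be sketched with two rotations. Axis conventions and the yaw-then-pitch order are assumptions for illustration.

```python
import math

def world_gaze(head_yaw, head_pitch, eye_yaw, eye_pitch):
    """World-frame gaze vector from head pose (magnetic sensor) and
    eye-in-head angles (head-mounted camera). Angles in radians;
    z is forward, x right, y up (illustrative conventions)."""
    # gaze vector expressed in the head frame
    ex = math.cos(eye_pitch) * math.sin(eye_yaw)
    ey = math.sin(eye_pitch)
    ez = math.cos(eye_pitch) * math.cos(eye_yaw)
    # rotate by head yaw (about y) ...
    x1 = math.cos(head_yaw) * ex + math.sin(head_yaw) * ez
    z1 = -math.sin(head_yaw) * ex + math.cos(head_yaw) * ez
    # ... then by head pitch (about x)
    y2 = math.cos(head_pitch) * ey - math.sin(head_pitch) * z1
    z2 = math.sin(head_pitch) * ey + math.cos(head_pitch) * z1
    return x1, y2, z2

# turning the head 0.2 rad right while the eye rotates 0.2 rad left
# leaves the world gaze pointing straight ahead
print(world_gaze(0.2, 0.0, -0.2, 0.0))
```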

Robust Gaze-Fixing of an Active Vision System under Variation of System Parameters (시스템 파라미터의 변동 하에서도 강건한 능동적인 비전의 시선 고정)

  • Han, Youngmo
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.1 no.3
    • /
    • pp.195-200
    • /
    • 2012
  • A camera is steered based on the system parameters of the vision system. However, the parameters at the time of use may differ from those at the time they were measured. To compensate for this problem, this paper proposes a gaze-steering method based on LMI (Linear Matrix Inequality) techniques that is robust to variations in the system parameters of the vision system. Simulation results show that the proposed method produces smaller gaze-tracking error than a contemporary linear method and more stable gaze-tracking error than a contemporary nonlinear method. Moreover, the proposed method is fast enough for real-time processing.
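
The parameter-variation problem the abstract addresses can be seen in a one-parameter toy example: steering the camera with a mis-measured focal length leaves a residual fixation error. This illustrates only the sensitivity being compensated for, not the paper's LMI formulation.

```python
import math

def pan_error(u_px, f_true, f_assumed):
    """Residual angular fixation error (radians) when a pan command
    is computed from an assumed focal length instead of the true one."""
    steered = math.atan(u_px / f_assumed)  # angle actually commanded
    needed = math.atan(u_px / f_true)      # angle that centers the target
    return abs(steered - needed)

# a 5% focal-length error on a target 200 px off-center
err = pan_error(u_px=200, f_true=800, f_assumed=760)
print(math.degrees(err))  # roughly 0.7 degrees of residual error
```

A robust design bounds this error over a whole interval of plausible parameter values rather than trusting one calibrated number.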

Smartphone Addiction Detection Based Emotion Detection Result Using Random Forest (랜덤 포레스트를 이용한 감정인식 결과를 바탕으로 스마트폰 중독군 검출)

  • Lee, Jin-Kyu;Kang, Hyeon-Woo;Kang, Hang-Bong
    • Journal of IKEEE
    • /
    • v.19 no.2
    • /
    • pp.237-243
    • /
    • 2015
  • Recently, eight out of ten people in Korea have smartphones, and the number of smartphone applications has increased greatly, so smartphone addiction has become a social issue. In particular, many people addicted to smartphones cannot control themselves, and some do not even realize that they are addicted. Many studies, mostly surveys such as the S-measure, have been conducted to diagnose smartphone addiction. In this paper, we suggest how to detect smartphone addiction based on ECG and eye-gaze signals. We measure ECG signals with the Shimmer device and eye-gaze signals with the Smart Eye system while subjects watch emotional videos. We extract features from the S-transform of the ECG, and 12 features from the eye-gaze signals (pupil diameter, gaze distance, and eye blinking). A classifier trained with Random Forest then detects smartphone addiction from the ECG and eye-gaze signals. We compared the detection results with S-measure results surveyed before the test, obtaining 87.89% accuracy with ECG and 60.25% accuracy with eye gaze.
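
The 12 gaze features are not enumerated in the abstract, but features of the kind it names (pupil diameter, blinking) are typically simple statistics over a trace. A hedged sketch of a few such features, with made-up sample values:

```python
def gaze_features(pupil_mm, blink_flags):
    """Illustrative features from a pupil-diameter trace (mm) and
    per-frame blink flags. The paper uses 12 gaze features plus
    S-transform ECG features; these three are only examples of the
    general kind."""
    n = len(pupil_mm)
    mean = sum(pupil_mm) / n
    var = sum((d - mean) ** 2 for d in pupil_mm) / n
    blink_rate = sum(blink_flags) / n  # fraction of frames mid-blink
    return {"pupil_mean": mean, "pupil_var": var, "blink_rate": blink_rate}

feats = gaze_features([3.1, 3.3, 3.0, 3.2], [0, 1, 0, 0])
print(feats)
```

Each subject's feature vector would then be labeled with their S-measure result and fed to the Random Forest for training.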

3D View Controlling by Using Eye Gaze Tracking in First Person Shooting Game (1 인칭 슈팅 게임에서 눈동자 시선 추적에 의한 3차원 화면 조정)

  • Lee, Eui-Chul;Cho, Yong-Joo;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society
    • /
    • v.8 no.10
    • /
    • pp.1293-1305
    • /
    • 2005
  • In this paper, we propose a method of manipulating the gaze direction of a 3D FPS game character by detecting eye gaze in successive images captured by a USB camera attached beneath an HMD. The proposed method is composed of three parts. In the first part, we detect the user's pupil center with a real-time image processing algorithm applied to the successive input images. In the second part, calibration, the geometric relationship is determined between the monitor gaze position and the detected eye position gazing at that monitor position. In the last part, the final gaze position on the HMD monitor is tracked, and the 3D view in the game is controlled by the gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands. It can also increase interest and immersion by synchronizing the gaze direction of the game player with that of the game character.

  • PDF
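
The calibration step described above relates detected pupil positions to screen positions. A minimal sketch is a per-axis linear map fitted from two reference fixations; the two-target setup and a purely linear model are simplifying assumptions, and a real system would fit more targets and a richer (e.g. affine or polynomial) model.

```python
def make_calibration(pupil_pts, screen_pts):
    """Build a pupil-to-screen mapping from two calibration fixations
    (e.g. top-left and bottom-right targets). Returns a function that
    maps a pupil position to screen pixel coordinates."""
    (px0, py0), (px1, py1) = pupil_pts
    (sx0, sy0), (sx1, sy1) = screen_pts

    def to_screen(px, py):
        # interpolate each axis independently between the two targets
        sx = sx0 + (px - px0) * (sx1 - sx0) / (px1 - px0)
        sy = sy0 + (py - py0) * (sy1 - sy0) / (py1 - py0)
        return sx, sy

    return to_screen

cal = make_calibration([(120, 80), (200, 140)], [(0, 0), (1280, 1024)])
print(cal(160, 110))  # pupil midway between targets -> screen center
```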

Effect of Gaze Stabilization Exercise with Balance Exercise on Static and Dynamic Balance Function of Healthy Young Adults: A Randomized Controlled Trial

  • Yi Wu;Xing-Han Zhou;Yongbum Jung;Myoung-Kwon Kim
    • Journal of the Korean Society of Physical Medicine
    • /
    • v.19 no.2
    • /
    • pp.1-16
    • /
    • 2024
  • PURPOSE: This study examined the effects of four weeks of gaze stabilization exercises combined with balance training on static and dynamic balance function. METHODS: The study was an assessor-blinded randomized controlled trial conducted at Daegu University in South Korea. Thirty subjects who fulfilled the inclusion criteria were selected and randomly divided into three groups of ten. The first group received balance exercises with gaze-stabilizing exercises (BGG), the second received balance exercises alone (BEG), and the third received gaze-stabilizing exercises alone (GEG). Each group exercised for 40 minutes, three times a week, for four weeks. The subjects completed the following static balance tests: 1) the one-leg standing test (OLS) and 2) the sharpened Romberg test (SRT); and the following dynamic balance tests: 3) the Y-balance test (YBT) and 4) the single-leg stand-squat-stand test (SST). Static and dynamic balance were measured before and after the four weeks to determine the effect of exercise on balance. RESULTS: The static (OLS and SRT) and dynamic (YBT and SST) balance tests showed significant differences across the three groups (p < .05), and the Y-balance score improved significantly (effect size 11.477, p < .05). The change in the BGG group was larger than in the BEG and GEG groups, and its improvement in balance control was the most significant. CONCLUSION: After four weeks of exercise, BGG showed the greatest improvement in static and dynamic balance, suggesting that gaze stabilization exercise combined with balance exercise may benefit healthy young adults.

Gaze Detection by Computing Facial and Eye Movement (얼굴 및 눈동자 움직임에 의한 시선 위치 추적)

  • 박강령
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.41 no.2
    • /
    • pp.79-88
    • /
    • 2004
  • Gaze detection locates, by computer vision, the position on a monitor screen where a user is looking. Gaze detection systems have numerous fields of application: they are applicable to man-machine interfaces that help the handicapped use computers, and to view control in three-dimensional simulation programs. In our work, we implement gaze detection with a computer vision system using a single IR-LED-based camera. To detect the gaze position, we locate facial features, which is performed effectively with the IR-LED-based camera and an SVM (Support Vector Machine). When the user gazes at a position on the monitor, we compute the 3D positions of those features based on 3D rotation and translation estimation and an affine transform. The gaze position determined by facial movement is then computed from the normal vector of the plane defined by the computed 3D feature positions. In addition, we use a trained neural network to detect the gaze position from eye movement. Experimentally, we obtain the facial and eye gaze position on a monitor with an accuracy of about 4.8 cm RMS error between the computed and real positions.
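
The plane-normal step above is the one piece that is fully determined by the abstract: given three (or more) 3D feature points on the face, the face direction follows from a cross product. A minimal sketch using three points:

```python
def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three 3D points, e.g. facial
    feature positions; the face-gaze direction is derived from this
    normal in the approach described above."""
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    # cross product u x v is perpendicular to the plane
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = sum(c * c for c in n) ** 0.5
    return [c / norm for c in n]

# three points in the z = 0 plane -> normal along z
print(plane_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # -> [0.0, 0.0, 1.0]
```

Intersecting the ray along this normal (from a reference point on the face) with the monitor plane yields the facial gaze position, which the eye-movement network then refines.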

Active eye system for tracking a moving object (이동물체 추적을 위한 능동시각 시스템 구축)

  • 백문홍
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 1996.10b
    • /
    • pp.257-259
    • /
    • 1996
  • This paper presents an active eye system for tracking a moving object in 3D space. A prototype able to track a moving object was designed and implemented. The mechanical system enables control of a platform carrying a binocular camera pair, as well as control of each camera's vergence angle by step motor; each camera has two degrees of freedom. The image features of the object are extracted from a complicated environment using zero-disparity filtering (ZDF). From the centroid of the image features, the gaze point on the object is calculated, and the vergence angle of each camera is controlled by step motor. The proposed method is implemented on the prototype and achieves robust and fast computation.

  • PDF
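
The vergence-control step above reduces to geometry: each camera must pan so that its optical axis passes through the computed gaze point. A minimal sketch for two cameras on a common baseline; the coordinate frame and the 0.1 m baseline are assumptions for illustration.

```python
import math

def vergence_angles(target_x, target_z, baseline=0.1):
    """Pan angle (radians) for each camera so both optical axes
    intersect at a target at lateral offset target_x (m) and depth
    target_z (m); baseline is the camera separation (m). Positive
    angles turn toward +x."""
    left = math.atan2(target_x + baseline / 2, target_z)
    right = math.atan2(target_x - baseline / 2, target_z)
    return left, right

l, r = vergence_angles(0.0, 1.0)  # target straight ahead, 1 m away
print(math.degrees(l), math.degrees(r))  # symmetric toe-in of the cameras
```

The step-motor commands would then be these angles quantized to the motors' step resolution; ZDF works precisely because features of the fixated object have zero disparity once both axes intersect it.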