• Title/Summary/Keyword: gaze control

Search results: 62

Robot Control Interface Using Gaze Recognition (시선 인식을 이용한 로봇 인터페이스 개발)

  • Park, Se Hyun
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.7 no.1
    • /
    • pp.33-39
    • /
    • 2012
  • In this paper, we propose a robot control interface using gaze recognition that is not limited by head motion. Most existing gaze recognition methods work well only when the head is fixed, and they additionally require a calibration step for each user. The interface proposed here uses a camera with a built-in infrared filter and two LED light sources to determine which direction the pupils are turned toward, and sends command codes to control the system, so no per-user calibration is needed. The experimental results showed that the proposed interface can control the system accurately by recognizing the user's gaze direction.
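The abstract's pipeline, pupil direction in, command code out, can be sketched as follows. The thresholds, command codes, and the dead-zone idea are illustrative assumptions, not details from the paper:

```python
# Hypothetical sketch of the direction-to-command mapping described in the
# abstract: the pupil offset measured under the two IR LED light sources is
# quantized into discrete gaze directions, each bound to a command code.

COMMANDS = {"center": 0x00, "left": 0x01, "right": 0x02, "up": 0x03, "down": 0x04}

def gaze_to_command(dx, dy, dead_zone=0.15):
    """Map a normalized pupil offset (dx, dy) to a robot command code."""
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return COMMANDS["center"]            # looking straight ahead: no motion
    if abs(dx) >= abs(dy):                   # horizontal gaze dominates
        return COMMANDS["right"] if dx > 0 else COMMANDS["left"]
    return COMMANDS["down"] if dy > 0 else COMMANDS["up"]
```

Because the mapping uses only the relative pupil offset, no per-user calibration constants appear, mirroring the paper's claim.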

Gaze Recognition Interface Development for Smart Wheelchair (지능형 휠체어를 위한 시선 인식 인터페이스 개발)

  • Park, S.H.
    • Journal of rehabilitation welfare engineering & assistive technology
    • /
    • v.5 no.1
    • /
    • pp.103-110
    • /
    • 2011
  • In this paper, we propose a gaze recognition interface for a smart wheelchair. The interface recognizes commands through gaze recognition and avoids detected obstacles while driving by measuring distance with range sensors. The smart wheelchair is composed of a gaze recognition and tracking module, a user interface module, an obstacle detector, a motor control module, and a range sensor module. The interface uses a camera with a built-in infrared filter and two LED light sources to determine which direction the pupils are turned toward, and sends command codes to control the system, so no per-user calibration is needed. The experimental results showed that the proposed interface can control the system accurately by recognizing the user's gaze direction.

Steering Gaze of a Camera in an Active Vision System: Fusion Theme of Computer Vision and Control (능동적인 비전 시스템에서 카메라의 시선 조정: 컴퓨터 비전과 제어의 융합 테마)

  • Han, Young-Mo (한영모)
    • Journal of the Institute of Electronics Engineers of Korea SC
    • /
    • v.41 no.4
    • /
    • pp.39-43
    • /
    • 2004
  • A typical theme of active vision systems is gaze-fixing of a camera. Here, gaze-fixing means steering the orientation of a camera so that a given point on the object always stays at the center of the image. This requires combining a function that analyzes image data with a function that controls the camera's orientation. This paper presents an algorithm for gaze-fixing in which image analysis and orientation control are designed within a single framework. To avoid implementation difficulties and to target real-time applications, the algorithm is designed as a simple closed form that uses no information related to camera calibration or structure estimation.
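The calibration-free, closed-form flavor of such a controller can be illustrated with a simple proportional feedback step; the gain, image size, and function shape here are assumptions for illustration, not the paper's actual algorithm:

```python
# Minimal gaze-fixing sketch: the pixel error of the tracked point from the
# image center is fed back proportionally to the pan/tilt rates. No camera
# calibration or structure estimation is used, only raw pixel coordinates.

def gaze_fix_step(target_px, image_size=(640, 480), gain=0.002):
    """Return (pan_rate, tilt_rate) that drive the target toward the center."""
    cx, cy = image_size[0] / 2, image_size[1] / 2
    ex, ey = target_px[0] - cx, target_px[1] - cy   # pixel error from center
    return -gain * ex, -gain * ey                    # proportional feedback
```

When the target sits at the image center the error vanishes and the camera holds still, which is the fixed point the paper's controller converges to.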

Development of a Non-contact Input System Based on User's Gaze-Tracking and Analysis of Input Factors

  • Jiyoung LIM;Seonjae LEE;Junbeom KIM;Yunseo KIM;Hae-Duck Joshua JEONG
    • Korean Journal of Artificial Intelligence
    • /
    • v.11 no.1
    • /
    • pp.9-15
    • /
    • 2023
  • As mobile devices such as smartphones, tablets, and kiosks become increasingly prevalent, there is growing interest in developing alternative input systems to complement traditional tools such as keyboards and mice. Many people use their own bodies as a pointer to enter simple information on a mobile device. However, body-based methods have limitations: psychological factors make the contact method unreliable, especially during a pandemic, and it is exposed to shoulder-surfing attacks. To overcome these limitations, we propose a simple information input system that uses gaze-tracking technology to input passwords and control web surfing with non-contact gaze alone. The proposed system recognizes an input in real time when the user stares at a specific location on the screen, using intelligent gaze-tracking technology. We analyze the relationship between the gaze input box, gaze time, and average input time, and report experimental results on how varying the size of the gaze input box and the required gaze time affects achieving 100% input accuracy. Through this paper, we demonstrate that our system mitigates the challenges of contact-based input methods and provides a non-contact alternative that is both secure and convenient.
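The "stare at a box for a required gaze time" mechanic is a dwell-time selector. A minimal sketch, assuming made-up box geometry and dwell time (these are exactly the factors the paper varies experimentally):

```python
# Illustrative dwell-time selector: a gaze input box registers an input once
# the gaze point has stayed inside it continuously for the required gaze time.

class DwellSelector:
    def __init__(self, box, dwell_s=1.0):
        self.box = box              # (x, y, w, h) of the gaze input box
        self.dwell_s = dwell_s      # required continuous gaze time in seconds
        self.inside_since = None

    def update(self, gaze_xy, t):
        """Feed a gaze sample at time t; return True when the dwell completes."""
        x, y, w, h = self.box
        inside = x <= gaze_xy[0] <= x + w and y <= gaze_xy[1] <= y + h
        if not inside:
            self.inside_since = None   # leaving the box resets the timer
            return False
        if self.inside_since is None:
            self.inside_since = t
        return t - self.inside_since >= self.dwell_s
```

A larger box or shorter dwell time makes input faster but more error-prone, which is the trade-off the paper's experiments quantify.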

Effects of Gaze Stabilization Exercise and Cognitive Training on Balance and Gait in Subacute Stroke Patients: Randomized Controlled Trial

  • Hye-Ryeon Jang;Ye-Ji Kim;Myoung-Kwon Kim
    • Journal of the Korean Society of Physical Medicine
    • /
    • v.19 no.1
    • /
    • pp.155-164
    • /
    • 2024
  • PURPOSE: The purpose of this study was to evaluate the effects of simultaneously applying gaze stabilization exercise and cognitive training on the balance and gait ability of subacute stroke patients. METHODS: Thirty-five patients diagnosed with stroke within the previous 3-6 months were randomly assigned to an experimental group (n = 18), which received both gaze stabilization exercise and cognitive training, or a control group (n = 17), which received gaze stabilization exercise only. The intervention was performed for 30 minutes per session, three times a week, for a total of 4 weeks. The Berg Balance Scale, Timed Up and Go test, 10-Meter Walking Test, and gait symmetry were evaluated. RESULTS: On the Berg Balance Scale, Timed Up and Go test, 10-Meter Walking Test, and gait symmetry, both the experimental and control groups showed significant differences before and after the intervention, and the gait symmetry evaluation showed significant differences between groups. CONCLUSION: When gaze stabilization exercise and cognitive training were applied simultaneously, the balance and gait ability of subacute stroke patients improved, with a more significant effect on gait ability. Training that applies gaze stabilization exercise and cognitive training simultaneously can be proposed as balance and gait rehabilitation for stroke patients in the future.

Adaptive Zoom-based Gaze Tracking for Enhanced Accuracy and Precision (정확도 및 정밀도 향상을 위한 적응형 확대 기반의 시선 추적 기법)

  • Song, Hyunjoo;Jo, Jaemin;Kim, Bohyoung;Seo, Jinwook
    • KIISE Transactions on Computing Practices
    • /
    • v.21 no.9
    • /
    • pp.610-615
    • /
    • 2015
  • The accuracy and precision of video-based remote gaze trackers are affected by numerous factors (e.g., the head movement of the participant). However, it is challenging to control all influencing factors, and doing so (e.g., using a chin rest to control geometry) can sacrifice the main benefit of gaze trackers, namely the ecological validity of their unobtrusive nature. We propose an adaptive zoom-based gaze tracking technique, ZoomTrack, that addresses this problem by improving the resolution of the gaze tracking results. Our approach magnifies a region of interest (ROI) and retrieves gaze points at a higher resolution under two different zooming modes: only when the gaze reaches the ROI (temporary) or whenever the participant stares at the stimuli (omnipresent). We compared these against a base case without magnification in a user study, and use the results to summarize the advantages and limitations of our technique.
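The core idea, gaze measured on a magnified view resolves finer detail on the underlying stimulus, comes down to inverting the magnification. A sketch, assuming the ROI is magnified in place about its own top-left corner (the function name and 2x factor are illustrative):

```python
# Coordinate bookkeeping behind ROI magnification: a raw gaze point measured
# on the zoomed view is mapped back to the unmagnified stimulus, so tracker
# error inside the ROI is effectively divided by the zoom factor.

def zoomed_gaze_to_stimulus(gaze_xy, roi_origin, zoom=2.0):
    """Invert the magnification: zoomed-view gaze -> stimulus coordinates."""
    gx, gy = gaze_xy
    ox, oy = roi_origin                     # top-left of the magnified ROI
    return (ox + (gx - ox) / zoom, oy + (gy - oy) / zoom)
```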

Gaze Direction Estimation Method Using Support Vector Machines (SVMs) (Support Vector Machines을 이용한 시선 방향 추정방법)

  • Liu, Jing;Woo, Kyung-Haeng;Choi, Won-Ho
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.15 no.4
    • /
    • pp.379-384
    • /
    • 2009
  • Human gaze detection and tracking is an important requirement for HMI (Human-Machine Interface) applications such as human-serving robots. This paper proposes a novel three-dimensional (3D) human gaze estimation method that combines face recognition, orientation estimation, and SVMs (Support Vector Machines). A total of 2,400 images were used, covering a pan orientation range of -90° to 90° and a tilt range of -40° to 70° at 10° intervals. A stereo camera was used to obtain the global coordinate of the center point between the eyes, and Gabor filter banks with horizontal and vertical orientations at 4 scales were used to extract the facial features. The experimental results show that the error rate of the proposed method is much lower than that of Liddell's method.
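A toy reconstruction of the classification stage only: the paper trains SVMs on Gabor-filter features to estimate orientation in 10° bins, but here the Gabor features are replaced with synthetic 2-D features and only three coarse pan classes are kept. The SVM usage itself follows scikit-learn's standard API:

```python
# Synthetic stand-in for the Gabor-feature SVM stage: three well-separated
# feature clusters, one per coarse pan class, classified with an RBF SVM.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
pan_bins = [-90, 0, 90]                        # three coarse pan classes
X = np.vstack([rng.normal(loc=b / 90.0, scale=0.1, size=(50, 2))
               for b in pan_bins])             # stand-in for Gabor features
y = np.repeat(pan_bins, 50)

clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict([[1.0, 1.0]])[0])            # a feature near the +90 cluster
```

The real system would feed in the Gabor filter-bank responses (two orientations, four scales) in place of the synthetic features.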

A Design of the Finite State Machine to Control User's Gaze on a Screen (화면 응시 제어를 위한 유한 상태 기계 설계)

  • Moon, Bong-Hee
    • Journal of the Korea Society of Computer and Information
    • /
    • v.16 no.5
    • /
    • pp.127-134
    • /
    • 2011
  • A finite state machine was designed to control the user's gaze on the screen while the user is monitoring it. It consists of a set of situations in which the pupils are observed and a set of states that decide whether the user is gazing at the screen or sleeping. The states are classified into main states, pre-states, and potential states. The machine uses a situation history, which decides the current state from the continuous previous situations together with the current situation, improving the accuracy of gaze control on the screen. We implemented the machine with data obtained from a pupil detection method and verified the system with monitoring operations. Experiments using data obtained from real images show the advantage of deciding whether a gaze is temporary or long-term.
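The role of the situation history can be sketched with a stripped-down machine. The two states, the boolean situation, and the history length below are illustrative assumptions, not the paper's exact main/pre/potential state design:

```python
# Minimal gaze FSM: per-frame pupil "situations" (detected / not detected)
# drive transitions between gazing and sleeping, and a short situation
# history keeps single-frame noise from flipping the state.

from collections import deque

class GazeFSM:
    def __init__(self, history=3):
        self.state = "GAZING"
        self.recent = deque(maxlen=history)   # situation history

    def step(self, pupil_visible):
        self.recent.append(pupil_visible)
        if len(self.recent) == self.recent.maxlen:
            if all(self.recent):
                self.state = "GAZING"         # sustained pupil detection
            elif not any(self.recent):
                self.state = "SLEEPING"       # sustained absence
        return self.state
```

A mixed history leaves the state unchanged, which is how the history distinguishes a temporary glance away from a long-term change.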

Visual Modeling and Content-based Processing for Video Data Storage and Delivery

  • Hwang Jae-Jeong;Cho Sang-Gyu
    • Journal of information and communication convergence engineering
    • /
    • v.3 no.1
    • /
    • pp.56-61
    • /
    • 2005
  • In this paper, we present a video rate control scheme for storage and delivery in which time-varying viewing interests are controlled by human gaze. To track the gaze, the pupil's movement is detected using a three-step process: detecting the face region, the eye region, and the pupil point. To control bit rates, the quantization parameter (QP) is adjusted by considering the static parameters, the video object priority derived from pupil tracking, the target PSNR, and the weighted distortion value of the coder. As a result, we achieved a human-interfaced visual model and a corresponding region-of-interest rate control system.
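The gaze-driven part of the QP adjustment can be illustrated in isolation; the offsets below and the 0-51 clamp (the H.264 QP range) are assumptions standing in for the paper's full priority/PSNR/distortion model:

```python
# Illustrative gaze-priority QP shift: blocks inside the region that pupil
# tracking marks as high priority get a lower QP (finer quantization, more
# bits), background blocks a higher one, so the ROI receives better quality.

def adjust_qp(base_qp, in_gaze_roi, roi_offset=-4, bg_offset=+3):
    """Shift QP per block by gaze priority, clamped to the 0..51 range."""
    qp = base_qp + (roi_offset if in_gaze_roi else bg_offset)
    return max(0, min(51, qp))
```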

Eye-Gaze Interaction On Computer Screen Evaluation

  • Ponglangka, Wirot;Sutakcom, Udom
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference (제어로봇시스템학회 학술대회논문집)
    • /
    • 2005.06a
    • /
    • pp.84-88
    • /
    • 2005
  • Using the human eye as an input device for computer systems suffers from low resolution when evaluating gaze positions on the screen. We propose a method to determine eye gaze positions on the screen using two-eye displacements as the mapping information; perspective projection is applied to map the displacements to a position on the computer screen. Experiments were performed on 20 persons using a 17-inch monitor with a screen resolution of 1024x768 pixels. The gaze detection error was 3.18 cm (RMS). With the screen divided into 5x8 and 7x10 positions on the 17-inch monitor, the results showed 100% and 96% accuracy, respectively.
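The mapping stage, displacement in, screen position out, amounts to applying a perspective (homographic) transform. A sketch, assuming a 3x3 homography H has already been fitted during calibration; the example matrix in the test is made up:

```python
# Perspective projection of a two-eye displacement onto the screen: apply a
# calibrated 3x3 homography and perform the perspective divide.

def map_displacement(H, d):
    """Apply homography H (3x3 nested lists) to a displacement d = (u, v)."""
    u, v = d
    x = H[0][0]*u + H[0][1]*v + H[0][2]
    y = H[1][0]*u + H[1][1]*v + H[1][2]
    w = H[2][0]*u + H[2][1]*v + H[2][2]
    return (x / w, y / w)                 # perspective divide -> screen pixels
```

In practice H would be estimated from a few calibration points (e.g., the user fixating known screen corners), which is what makes the projective row of H nontrivial.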
