• Title/Abstract/Keyword: eye tracking method

Search results: 94

Gaze Recognition Interface Development for Smart Wheelchair (지능형 휠체어를 위한 시선 인식 인터페이스 개발)

  • Park, S.H.
    • Journal of rehabilitation welfare engineering & assistive technology
    • /
    • Vol.5 No.1
    • /
    • pp.103-110
    • /
    • 2011
  • In this paper, we propose a gaze recognition interface for a smart wheelchair. The gaze recognition interface is a user interface that recognizes commands through gaze recognition and, while driving, avoids detected obstacles by sensing distances with range sensors. The smart wheelchair is composed of a gaze recognition and tracking module, a user interface module, an obstacle detector, a motor control module, and a range sensor module. The interface in this paper uses a camera with a built-in infrared filter and two LED light sources to determine which direction the pupils are turned and sends command codes to control the system; consequently, it requires no per-user calibration step. The results of the experiment showed that the proposed interface can control the system accurately by recognizing the user's gaze direction.
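The abstract does not specify the mapping from pupil direction to wheelchair commands; the following is a minimal sketch of one plausible scheme, assuming the pupil center and the two LED glint centers have already been extracted from the infrared image. The threshold value and the command names are invented for illustration.

```python
def gaze_command(pupil, glint_left, glint_right, threshold=5.0):
    """Map the pupil's offset from the glint midpoint to a command code."""
    # Reference point: midpoint between the two corneal glints, which stays
    # stable under small head movement (one reason no per-user calibration
    # would be needed in such a scheme).
    ref_x = (glint_left[0] + glint_right[0]) / 2.0
    ref_y = (glint_left[1] + glint_right[1]) / 2.0
    dx = pupil[0] - ref_x  # positive: pupil to the right of the reference
    dy = pupil[1] - ref_y  # positive: pupil below the reference
    if abs(dx) < threshold and abs(dy) < threshold:
        return "forward"           # looking straight ahead
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "stop" if dy > 0 else "forward"

print(gaze_command((110, 100), (90, 100), (100, 100)))  # → right
```

The motor control module would then translate each command code into wheel velocities, with the range-sensor module overriding the command when an obstacle is detected.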

Gaze Detection Based on Facial Features and Linear Interpolation on Mobile Devices (모바일 기기에서의 얼굴 특징점 및 선형 보간법 기반 시선 추적)

  • Ko, You-Jin;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society
    • /
    • Vol.12 No.8
    • /
    • pp.1089-1098
    • /
    • 2009
  • Recently, much research on building more comfortable input devices based on gaze detection technology has been performed in human-computer interfaces. Previous studies were conducted in desktop environments with large monitors. With the recent increase in mobile device use, the need for gaze-based interfaces in mobile environments has also grown. In this paper, we study a gaze detection method using a UMPC (Ultra-Mobile PC) and its embedded camera, based on face and facial feature detection by AAM (Active Appearance Model). This paper has the following three original contributions. First, unlike previous research, we propose a method for tracking the user's gaze position on a mobile device with a small screen. Second, we use AAM to detect facial feature points. Third, gaze detection accuracy is not degraded by changes in Z distance, owing to the normalization of input features using features obtained in an initial user calibration stage. Experimental results showed a gaze detection error of 1.77 degrees, which was further reduced by mouse dragging based on additional facial movement.
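The linear interpolation named in the title can be sketched as follows. This is an illustrative reconstruction, not the paper's exact formulation: it assumes the calibration stage recorded the tracked feature's position while the user looked at the top-left and bottom-right screen corners, and that the feature moves roughly linearly with gaze in between.

```python
def gaze_by_linear_interpolation(feature, cal_tl, cal_br, screen_w, screen_h):
    """Interpolate a tracked feature position between the values recorded
    while the user looked at the top-left (cal_tl) and bottom-right (cal_br)
    screen corners, returning a screen coordinate in pixels."""
    fx, fy = feature
    # Fraction of the way from the top-left to the bottom-right reading.
    u = (fx - cal_tl[0]) / (cal_br[0] - cal_tl[0])
    v = (fy - cal_tl[1]) / (cal_br[1] - cal_tl[1])
    # Clamp to the screen and scale to pixels.
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    return (u * screen_w, v * screen_h)

# A feature halfway between the two calibration readings maps to the center.
print(gaze_by_linear_interpolation((30, 20), (20, 10), (40, 30), 1024, 600))
# → (512.0, 300.0)
```

Normalizing the feature coordinates by the inter-feature distances obtained at calibration time is what would keep this mapping stable as the Z distance to the device changes.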


A study on the relationship between gaze guidance and cybersickness using Eyetracking (시선 추적기법을 활용한 시선 유도와 사이버 멀미 관계 연구)

  • Lee, TaeGu;Ahn, ChanJe
    • The Journal of the Convergence on Culture Technology
    • /
    • Vol.8 No.3
    • /
    • pp.167-173
    • /
    • 2022
  • The virtual reality market grows every year, but the cybersickness that occurs in virtual reality has not yet been resolved. In this paper, results were derived through experiments on the relationship between cybersickness and gaze guidance in virtual reality content. Using an eye tracking technique, we identified how gaze movement relates to cybersickness. The experiment was divided into two groups to find out whether gaze guidance affects cybersickness, and the groups were further analyzed by gender to check whether cybersickness differs between men and women. We also measured cybersickness using the SSQ (Simulator Sickness Questionnaire). Through these two methods, we sought to understand the relationship between gaze guidance and cybersickness. The experiment showed that clear gaze guidance concentrated the viewer's gaze and was effective against the cybersickness induced by camera rotation; concentrating the viewer's gaze through gaze-guided direction was thus confirmed to alleviate cybersickness. We hope this result will serve producers who create virtual reality content as a way to alleviate cybersickness.

A Study on Eye Tracking Techniques using Wearable Devices (웨어러블향(向) 시선추적 기법에 관한 연구)

  • Jaehyuck Jang;Jiu Jung;Junghoon Park
    • Smart Media Journal
    • /
    • Vol.12 No.3
    • /
    • pp.19-29
    • /
    • 2023
  • Eye tracking technology is now widespread throughout society and demonstrates strong performance in both precision and convenience, offering a glimpse of interfaces that can be operated without touching a screen. This technology can become a new means of communication for people such as patients with Lou Gehrig's disease, who become paralyzed part by part until they can move only their eyes. Formerly, such patients could do nothing but await death, unable even to communicate with their families. An interface that harnesses the eyes as a means of communication, however difficult to build, can help them. Dedicated eye tracking systems and equipment do exist on the market; nevertheless, obstacles such as operational complexity and high prices of over 12 million won (about $9,300) hinder universal supply and coverage for these patients. Therefore, this paper proposes a wearable eye tracking device that can support minorities and vulnerable people and be acquired inexpensively, studies eye tracking methods to maximize the possibility of future development worldwide, and finally presents a way of designing and developing a cost-reduced eye tracking system based on a high-efficiency wearable device.

Robust Gaze-Fixing of an Active Vision System under Variation of System Parameters (시스템 파라미터의 변동 하에서도 강건한 능동적인 비전의 시선 고정)

  • Han, Youngmo
    • KIPS Transactions on Software and Data Engineering
    • /
    • Vol.1 No.3
    • /
    • pp.195-200
    • /
    • 2012
  • Camera steering is performed based on the system parameters of the vision system. However, the parameters at the time of use may differ from those measured beforehand. As one method to compensate for this problem, this research proposes a gaze-steering method based on LMI (Linear Matrix Inequality) that is robust to variations in the system parameters of the vision system. Simulation results show that the proposed method produces smaller gaze-tracking error than a contemporary linear method and more stable gaze-tracking error than a contemporary nonlinear method. Moreover, the proposed method is fast enough for real-time processing.

Gaze Detection System using Real-time Active Vision Camera (실시간 능동 비전 카메라를 이용한 시선 위치 추적 시스템)

  • Park, Kang-Ryoung
    • Journal of KIISE:Software and Applications
    • /
    • Vol.30 No.12
    • /
    • pp.1228-1238
    • /
    • 2003
  • This paper presents a new and practical computer vision method for detecting the monitor position where the user is looking. In general, the user moves both the face and the eyes to gaze at a certain monitor position. Previous studies use only one wide-view camera that captures the user's whole face; in that case, the image resolution is too low and the fine movements of the user's eyes cannot be detected exactly. So, we implement the gaze detection system with dual cameras (a wide-view and a narrow-view camera). In order to locate the user's eye position accurately, the narrow-view camera provides auto focusing and auto panning/tilting based on the 3D facial feature positions detected by the wide-view camera. In addition, we use dual R-LED illuminators to detect facial features, especially eye features. As experimental results, we implemented a real-time gaze detection system whose gaze position accuracy, measured between computed and actual positions, is about 3.44 cm RMS error.
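The auto panning/tilting step can be illustrated with a small sketch: converting an eye's 3D position, detected by the wide-view camera, into pan and tilt angles for steering the narrow-view camera. The coordinate convention (Z forward, Y downward, angles about the narrow camera's own center) is an assumption for illustration, not taken from the paper.

```python
import math

def pan_tilt_to(point_3d):
    """Pan/tilt angles (degrees) that aim the camera at a 3D point given in
    the camera's frame: X right, Y down, Z forward along the optical axis."""
    x, y, z = point_3d
    pan = math.degrees(math.atan2(x, z))                    # rotate left/right
    tilt = math.degrees(math.atan2(-y, math.hypot(x, z)))   # rotate up/down
    return pan, tilt

# An eye 10 cm to the right at 100 cm depth needs a small rightward pan.
pan, tilt = pan_tilt_to((10.0, 0.0, 100.0))
```

In the described system these angles would be recomputed each frame from the wide-view camera's 3D feature estimate, keeping the eye centered in the narrow-view image.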

A Study on Gaze Tracking Based on Pupil Movement, Corneal Specular Reflections and Kalman Filter (동공 움직임, 각막 반사광 및 Kalman Filter 기반 시선 추적에 관한 연구)

  • Park, Kang-Ryoung;Ko, You-Jin;Lee, Eui-Chul
    • The KIPS Transactions:PartB
    • /
    • Vol.16B No.3
    • /
    • pp.203-214
    • /
    • 2009
  • In this paper, we compute the user's gaze position simply from the 2D relations between the pupil center and four corneal specular reflections formed by four IR illuminators attached to the corners of a monitor, without considering the complex 3D relations among the camera, monitor, and pupil coordinates. Therefore, the objectives of this paper are to detect the pupil center and the four corneal specular reflections exactly and to compensate for the error factors that affect gaze accuracy. In our method, we compensated for the kappa error between the gaze position calculated through the pupil center and the actual gaze vector, performing a one-time user calibration when the system starts. Also, using a Kalman filter, we robustly detected the four corneal specular reflections needed to calculate the gaze position, irrespective of abrupt eye movements. Experimental results showed that the gaze detection error was about 1.0 degrees even under abrupt eye movement.
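The Kalman filter used to keep tracking the specular reflections is not detailed in the abstract; below is a generic scalar constant-velocity Kalman filter, one instance per image axis per glint, as a hedged illustration of how a reflection's coordinate can be predicted across frames when a detection momentarily fails. All noise parameters are invented.

```python
class GlintAxisKalman:
    """Constant-velocity Kalman filter for one image axis of one glint."""

    def __init__(self, pos, q=1e-2, r=1.0):
        self.x, self.v = pos, 0.0            # state: position, velocity
        self.p = [[1.0, 0.0], [0.0, 1.0]]    # state covariance
        self.q, self.r = q, r                # process / measurement noise

    def predict(self, dt=1.0):
        # Motion model: x' = x + v*dt, v' = v; P' = F P F^T + Q.
        self.x += self.v * dt
        p = self.p
        self.p = [[p[0][0] + dt * (p[0][1] + p[1][0]) + dt * dt * p[1][1] + self.q,
                   p[0][1] + dt * p[1][1]],
                  [p[1][0] + dt * p[1][1],
                   p[1][1] + self.q]]
        return self.x

    def update(self, z):
        # Blend the measured glint coordinate with the prediction.
        k0 = self.p[0][0] / (self.p[0][0] + self.r)   # gain on position
        k1 = self.p[1][0] / (self.p[0][0] + self.r)   # gain on velocity
        resid = z - self.x
        self.x += k0 * resid
        self.v += k1 * resid
        p = self.p
        self.p = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
                  [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]
        return self.x

kf = GlintAxisKalman(100.0)
for z in (101.0, 102.0, 103.0):   # a glint drifting one pixel per frame
    kf.predict()
    kf.update(z)
kf.predict()                      # bridge a frame with no detection
print(round(kf.x))                # close to the true next position, 104
```

Running one such filter per axis for each of the four reflections gives a predicted search window each frame, which is what makes the detection robust to abrupt eye movement.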

Digital Library Interface Research Based on EEG, Eye-Tracking, and Artificial Intelligence Technologies: Focusing on the Utilization of Implicit Relevance Feedback (뇌파, 시선추적 및 인공지능 기술에 기반한 디지털 도서관 인터페이스 연구: 암묵적 적합성 피드백 활용을 중심으로)

  • Hyun-Hee Kim;Yong-Ho Kim
    • Journal of the Korean Society for Information Management
    • /
    • Vol.41 No.1
    • /
    • pp.261-282
    • /
    • 2024
  • This study proposed and evaluated electroencephalography (EEG)-based and eye-tracking-based methods to determine relevance by utilizing users' implicit relevance feedback while navigating content in a digital library. For this, EEG/eye-tracking experiments were conducted on 32 participants using video, image, and text data. To assess the usefulness of the proposed methods, deep learning-based artificial intelligence (AI) techniques were used as a competitive benchmark. The evaluation results showed that EEG component-based methods (av_P600 and f_P3b components) demonstrated high classification accuracy in selecting relevant videos and images (faces/emotions). In contrast, AI-based methods, specifically object recognition and natural language processing, showed high classification accuracy for selecting images (objects) and texts (newspaper articles). Finally, guidelines for implementing a digital library interface based on EEG, eye-tracking, and artificial intelligence technologies have been proposed. Specifically, a system model based on implicit relevance feedback has been presented. Moreover, to enhance classification accuracy, methods suitable for each media type have been suggested, including EEG-based, eye-tracking-based, and AI-based approaches.

Gaze Detection System by IR-LED based Camera (적외선 조명 카메라를 이용한 시선 위치 추적 시스템)

  • Park, Kang-Ryoung
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • Vol.29 No.4C
    • /
    • pp.494-504
    • /
    • 2004
  • Research on gaze detection has advanced considerably, with many applications. Most previous studies rely only on image processing algorithms, so they take much processing time and have many constraints. In our work, we implement gaze detection with a computer vision system using a single IR-LED-based camera. To detect the gaze position, we locate facial features, which is performed effectively with the IR-LED-based camera and an SVM (Support Vector Machine). When a user gazes at a monitor position, we compute the 3D positions of those features based on 3D rotation and translation estimation and an affine transform. Finally, the gaze position due to facial movement is computed from the normal vector of the plane determined by the computed 3D feature positions. In addition, we use a trained neural network to detect the gaze position due to eye movement. As experimental results, we obtain the facial and eye gaze position on a monitor, and the accuracy between computed and actual positions is about 4.2 cm RMS error.
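The final geometric step described above, taking the facial gaze direction from the normal vector of the plane through the computed 3D feature positions, reduces to a cross product. A minimal sketch (the three feature points below are invented for illustration):

```python
def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three 3D points, e.g. three facial
    feature positions; the normal serves as the facial gaze direction."""
    # Two edge vectors lying in the plane.
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    # Cross product u x v is perpendicular to both, i.e. the plane normal.
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = sum(c * c for c in n) ** 0.5
    return [c / length for c in n]

# Three frontal-face features in the XY plane give a normal along +Z.
print(plane_normal([0, 0, 0], [6, 0, 0], [3, 4, 0]))  # → [0.0, 0.0, 1.0]
```

Intersecting a ray along this normal with the monitor plane would then yield the facial gaze point, to be refined by the neural network's estimate of the eye movement.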

A Study on Manipulating Method of 3D Game in HMD Environment by using Eye Tracking (HMD(Head Mounted Display)에서 시선 추적을 통한 3차원 게임 조작 방법 연구)

  • Park, Kang-Ryoung;Lee, Eui-Chul
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • Vol.45 No.2
    • /
    • pp.49-64
    • /
    • 2008
  • Recently, much research on building more comfortable input devices based on gaze detection technology has been done in human-computer interfaces. However, system cost becomes high due to complicated hardware, and complicated user calibration procedures make gaze detection systems difficult to use. In this paper, we propose a new gaze detection method based on 2D analysis and a simple user calibration. Our method uses a small USB (Universal Serial Bus) camera attached to an HMD (Head-Mounted Display), a hot mirror, and an IR (Infra-Red) illuminator. Because the HMD moves together with the user's head, we can implement a gaze detection system whose performance is not affected by facial movement. In addition, we apply our gaze detection system to a 3D first-person shooting game: the gaze direction of the game character is controlled by our gaze detection method, and it can target and shoot the enemy character, which can increase the immersion and interest of the game. Experimental results showed that the game and the gaze detection system ran at real-time speed on one desktop computer, and we obtained a gaze detection accuracy of 0.88 degrees. The results also showed that our gaze detection technology could replace the conventional mouse in the 3D first-person shooting game.
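How a detected gaze position might drive the game character's aim is not spelled out in the abstract; one simple possibility is mapping the on-screen gaze point to yaw/pitch offsets for the first-person camera. The screen size and field of view below are assumptions for illustration, not values from the paper.

```python
def gaze_to_aim(gaze_x, gaze_y, screen_w=800, screen_h=600,
                fov_h_deg=90.0, fov_v_deg=60.0):
    """Return (yaw, pitch) in degrees; (0, 0) means the screen center."""
    # Normalized offset from the screen center, each in [-0.5, 0.5].
    nx = gaze_x / screen_w - 0.5
    ny = 0.5 - gaze_y / screen_h   # screen Y grows downward
    # Scale by the camera's field of view (a small-angle approximation).
    yaw = nx * fov_h_deg
    pitch = ny * fov_v_deg
    return yaw, pitch

# Gazing at the upper-right of the screen yields positive yaw and pitch.
print(gaze_to_aim(600, 150))  # → (22.5, 15.0)
```

Feeding these offsets to the game camera each frame makes the crosshair follow the gaze, which is how the gaze point can target an enemy character in place of the mouse.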