• Title/Summary/Keyword: Gaze-based Input


Development of a Non-contact Input System Based on User's Gaze-Tracking and Analysis of Input Factors

  • Jiyoung LIM;Seonjae LEE;Junbeom KIM;Yunseo KIM;Hae-Duck Joshua JEONG
    • Korean Journal of Artificial Intelligence / v.11 no.1 / pp.9-15 / 2023
  • As mobile devices such as smartphones, tablets, and kiosks become increasingly prevalent, there is growing interest in developing alternative input systems beyond traditional tools such as keyboards and mice. Many people use their own bodies as pointers to enter simple information on a mobile device. However, body-based methods have limitations: psychological factors make contact input unappealing, especially during a pandemic, and they are vulnerable to shoulder-surfing attacks. To overcome these limitations, we propose a simple information input system that uses gaze-tracking technology to enter passwords and control web surfing with non-contact gaze alone. Our proposed system recognizes input in real time when the user stares at a specific location on the screen, using intelligent gaze-tracking technology. We analyze the relationship between the gaze input box, gaze time, and average input time, and report experimental results on how varying the size of the gaze input box and the required gaze time affects achieving 100% accuracy in inputting information. Through this paper, we demonstrate the effectiveness of our system in mitigating the challenges of contact-based input methods and providing a non-contact alternative that is both secure and convenient.
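The dwell-based selection this abstract describes, registering an input when the user's gaze stays inside an on-screen box for a set time, can be sketched as follows. This is a minimal illustration, not the paper's implementation; the box layout, dwell threshold, and gaze coordinates are assumed values.

```python
import time

class DwellSelector:
    """Registers a selection when gaze stays inside one box for dwell_time seconds.

    Hypothetical sketch: boxes is {label: (x, y, w, h)}; coordinates and the
    dwell threshold are illustrative, not taken from the paper.
    """
    def __init__(self, boxes, dwell_time=1.0):
        self.boxes = boxes
        self.dwell_time = dwell_time
        self.current = None      # box currently under gaze
        self.enter_time = None   # when gaze entered that box

    def _hit(self, gx, gy):
        # Return the label of the box containing the gaze point, if any.
        for label, (x, y, w, h) in self.boxes.items():
            if x <= gx < x + w and y <= gy < y + h:
                return label
        return None

    def update(self, gx, gy, now=None):
        """Feed one gaze sample; returns a label when a dwell completes."""
        now = time.monotonic() if now is None else now
        label = self._hit(gx, gy)
        if label != self.current:
            # Gaze moved to a different box (or off all boxes): restart the timer.
            self.current, self.enter_time = label, now
            return None
        if label is not None and now - self.enter_time >= self.dwell_time:
            self.enter_time = now  # re-arm so the same key can repeat
            return label
        return None
```

The paper's experiments vary exactly these two parameters, the size of each gaze input box and the dwell (gaze) time, to find the combination that reaches 100% input accuracy.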

A Study of Secure Password Input Method Based on Eye Tracking with Resistance to Shoulder-Surfing Attacks (아이트래킹을 이용한 안전한 패스워드 입력 방법에 관한 연구 - 숄더 서핑 공격 대응을 중심으로)

  • Kim, Seul-gi;Yoo, Sang-bong;Jang, Yun;Kwon, Tae-kyoung
    • Journal of the Korea Institute of Information Security & Cryptology / v.30 no.4 / pp.545-558 / 2020
  • Gaze-based input provides feedback to confirm that typed text is correct as the user types. Many studies have demonstrated that such feedback increases the usability of gaze-based input. However, because the typed text is revealed through the feedback, it can become a target for shoulder-surfing attacks. Appropriate feedback is needed to improve security without compromising the usability that the original feedback provides. In this paper, we propose a new gaze-based input method, FFI (Fake Flickering Interface), that resists shoulder-surfing attacks. Through experiments and questionnaires, we evaluate the usability and security of FFI compared to gaze-based input with the original feedback.

Gaze Detection Based on Facial Features and Linear Interpolation on Mobile Devices (모바일 기기에서의 얼굴 특징점 및 선형 보간법 기반 시선 추적)

  • Ko, You-Jin;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society / v.12 no.8 / pp.1089-1098 / 2009
  • Recently, much research on more comfortable input devices based on gaze-detection technology has been performed in human-computer interfaces. Previous research targeted computer environments with large monitors. With the recent increase in mobile device use, the need for gaze-based interfaces in mobile environments has also grown. In this paper, we study a gaze-detection method using a UMPC (Ultra-Mobile PC) and its embedded camera, based on face and facial feature detection by an AAM (Active Appearance Model). This paper has three novel contributions. First, unlike previous research, we propose a method for tracking the user's gaze position on a mobile device with a small screen. Second, we use an AAM to detect facial feature points. Third, gaze-detection accuracy does not degrade with Z distance, because input features are normalized using features obtained in an initial user calibration stage. Experimental results showed a gaze-detection error of 1.77 degrees, which was further reduced by mouse dragging based on additional facial movement.
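The third contribution, normalizing current features against those captured at calibration so accuracy is independent of the user's distance (Z) from the camera, can be illustrated with a small sketch. Using the inter-ocular distance as the scale reference is an assumption made here for illustration; the paper does not specify which feature distance anchors the normalization.

```python
def interocular(points):
    """Distance between the two eye-center feature points.

    points is a dict like {"left_eye": (x, y), "right_eye": (x, y)};
    the feature names are hypothetical labels for this sketch.
    """
    (lx, ly), (rx, ry) = points["left_eye"], points["right_eye"]
    return ((rx - lx) ** 2 + (ry - ly) ** 2) ** 0.5

def normalize(points, calib_points):
    """Rescale current feature points so their inter-ocular distance matches
    the calibration frame, cancelling the apparent shrink/growth of facial
    features as the user moves away from or toward the camera."""
    scale = interocular(calib_points) / interocular(points)
    return {name: (x * scale, y * scale) for name, (x, y) in points.items()}
```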


A Study on the Hangul Input Methodology for Eye-gaze Interface (시선 입력 장치에 의한 한글 입력 시스템 설계에 관한 연구)

  • Seo Han-Sok;Kim Chee-Yong
    • Journal of Digital Contents Society / v.5 no.3 / pp.239-244 / 2004
  • New developments in IT already impact wide segments of a young and mobile population. It is evident that applications of information technology can be of equal benefit to the aged and the disabled. Eye-gaze interface (EGI) technology was designed for people with paralysis in the upper body, and there is a compelling need for a dedicated Korean-language interface for such systems. The purpose of this study is to research 'barrier-free' software, using a control group of the mobility-impaired to assess the eye-gaze interface against more conventional input methods. The EGI in this study uses the Quick Glance System of Eye Tech Digital Systems. The study is evaluated on criteria based upon the needs of those with specific disabilities and with mobility problems associated with aging. We also intend to explore applications of the eye-gaze interface for English and Japanese devices, based upon our study using Hangul phonology.


A Gaze Tracking based on the Head Pose in Computer Monitor (얼굴 방향에 기반을 둔 컴퓨터 화면 응시점 추적)

  • 오승환;이희영
    • Proceedings of the IEEK Conference / 2002.06c / pp.227-230 / 2002
  • In this paper, we concentrate on the overall direction of gaze based on head pose for human-computer interaction. To determine the gaze direction of a user in an image, it is important to extract facial features accurately. For this, we binarize the input image and search for the two eyes and the mouth using the similarity of each block (aspect ratio, size, and average gray value) and the geometric information of the face in the binarized image. To determine head orientation, we create an imaginary plane on the lines formed by the features of the real face and the pinhole of the camera; we call it the virtual facial plane. The position of the virtual facial plane is estimated from the facial features projected onto the image plane. We find the gaze direction using the surface normal vector of the virtual facial plane. This study, which uses a popular PC camera, will contribute to the practical use of gaze-tracking technology.
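The last step, taking the gaze direction as the surface normal of the virtual facial plane, reduces to a cross product once three facial feature points are known in camera coordinates. The sketch below assumes three such 3D points; how those points are recovered from the projections is the paper's estimation step and is not reproduced here.

```python
def cross(u, v):
    # Standard 3D cross product of two vectors.
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def plane_normal(p0, p1, p2):
    """Unit normal of the plane through three facial feature points
    (e.g. the two eyes and the mouth); this vector gives the overall
    gaze/head direction in camera coordinates."""
    u = tuple(b - a for a, b in zip(p0, p1))
    v = tuple(b - a for a, b in zip(p0, p2))
    n = cross(u, v)
    mag = sum(c * c for c in n) ** 0.5
    return tuple(c / mag for c in n)
```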


Webcam-Based 2D Eye Gaze Estimation System By Means of Binary Deformable Eyeball Templates

  • Kim, Jin-Woo
    • Journal of information and communication convergence engineering / v.8 no.5 / pp.575-580 / 2010
  • Eye gaze as a form of input was primarily developed for users who are unable to use the usual interaction devices such as the keyboard and mouse; however, with increasing accuracy and decreasing cost of eye-gaze detection, it is likely to become a practical interaction method for able-bodied users in the near future as well. This paper explores a low-cost, robust, rotation- and illumination-independent eye-gaze system for gaze-enhanced user interfaces. We introduce two new algorithms for fast, sub-pixel-precise pupil center detection and 2D eye-gaze estimation by means of deformable template matching. First, we detect the eye regions with Intel OpenCV AdaBoost Haar cascade classifiers and assign an approximate eyeball size depending on the eye-region size. Second, using the DAISMI (Deformable Angular Integral Search by Minimum Intensity) algorithm, the pupil center is detected by localizing the eyeball (iris outer boundary) in grayscale eye-region images. Then, using the percentage of black pixels over the eyeball circle area, we binarize the image for the next stage: the DTBGE (Deformable Template Based 2D Gaze Estimation) algorithm. Finally, DTBGE takes the initial pupil-center coordinates, refines them, and estimates the final gaze direction and eyeball size. We have performed extensive experiments and achieved very encouraging results, and we discuss the effectiveness of the proposed method through several experimental results.
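DAISMI itself is specific to this paper, but its core premise, that the pupil is the darkest compact region of the eye crop and its center can be found by a minimum-intensity search, can be approximated by a simple threshold-and-centroid sketch. The threshold value and image representation here are assumptions, not the paper's method.

```python
def pupil_center(gray, threshold=40):
    """Centroid of pixels darker than threshold in a grayscale eye crop.

    A rough stand-in for a minimum-intensity pupil search: gray is a list
    of rows of 0-255 values; returns (x, y) or None if nothing is dark
    enough. The threshold is an illustrative assumption.
    """
    xs, ys = [], []
    for y, row in enumerate(gray):
        for x, v in enumerate(row):
            if v < threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

In a full pipeline this would run inside each eye region returned by the Haar cascade detector, with the search radius bounded by the eyeball size estimated from the region.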

A Study on Gamepad/Gaze based Input Processing for Mobile Platform Virtual Reality Contents (모바일 플랫폼 가상현실 콘텐츠에 적합한 게임패드/시선 기반 입력 처리 기술에 관한 연구)

  • Lee, Jiwon;Kim, Mingyu;Jeon, Changyu;Kim, Jinmo
    • Journal of the Korea Computer Graphics Society / v.22 no.3 / pp.31-41 / 2016
  • This study proposes input-processing techniques suitable for producing mobile-platform-based virtual reality content. First, we produce mobile virtual reality interactive content to be used in the experiments, designed to improve the immersion and interest of users who experience it, with input processing that is easy to control. We then design the input processing in two ways: with a gamepad, which is readily accessible on mobile devices, and directly through the user's gaze on the interface. Through experiments with the proposed input-processing techniques, we analyze their effect on user immersion, whether they generate interest, and whether they make controlling the content convenient. We also verify whether they introduce negative psychological elements such as sickness or fatigue.

An Implementation of Gaze Direction Recognition System using Difference Image Entropy (차영상 엔트로피를 이용한 시선 인식 시스템의 구현)

  • Lee, Kue-Bum;Chung, Dong-Keun;Hong, Kwang-Seok
    • The KIPS Transactions:PartB / v.16B no.2 / pp.93-100 / 2009
  • In this paper, we propose a gaze-direction recognition system based on Difference Image Entropy. The Difference Image Entropy is computed from the histogram levels of the difference image between the current image and reference images (or average images), with histogram positions ranging from -255 to +255 to prevent information loss. There are two methods of Difference-Image-Entropy-based gaze recognition: 1) compute the Difference Image Entropy between an input image and the average image of the 45 images at each gaze location, and recognize the direction of the user's gaze; 2) compute the Difference Image Entropy between an input image and each of the 45 reference images, and recognize the direction of the user's gaze. After collecting images for the four gaze directions, the average image for each direction is created from its 45 images. To evaluate the performance of the proposed system, we conduct a comparison experiment with a PCA-based gaze-direction system. The recognized directions are left-top, right-top, left-bottom, and right-bottom, and the experiments vary whether the 45 reference images or the average image is used for recognition. The experimental results show a recognition rate of 97.00% for Difference Image Entropy and 95.50% for PCA, so the Difference-Image-Entropy-based system is 1.50 percentage points higher than the PCA-based system.
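The entropy computation described here, a histogram over pixel differences in the range -255 to +255 followed by Shannon entropy, can be sketched directly. Selecting the direction whose reference yields the lowest entropy is an assumption about the matching step; the paper only states that entropy is compared against per-direction references.

```python
import math
from collections import Counter

def difference_image_entropy(img_a, img_b):
    """Shannon entropy of the histogram of (img_a - img_b).

    Images are equal-sized flat lists of 0-255 intensities, so the
    differences naturally fall in -255..+255, matching the paper's range.
    """
    diffs = [a - b for a, b in zip(img_a, img_b)]
    hist = Counter(diffs)            # histogram over occupied difference bins
    n = len(diffs)
    return -sum((c / n) * math.log2(c / n) for c in hist.values())

def recognize_direction(img, references):
    """Pick the gaze direction whose reference image yields the lowest
    Difference Image Entropy (assumed decision rule for this sketch)."""
    return min(references, key=lambda d: difference_image_entropy(img, references[d]))
```

An identical pair gives entropy 0 (all differences land in one bin), which is why a close match to a direction's reference or average image scores lowest.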

A Study on Interaction of Gaze-based User Interface in Mobile Virtual Reality Environment (모바일 가상현실 환경에서의 시선기반 사용자 인터페이스 상호 작용에 관한 연구)

  • Kim, Mingyu;Lee, Jiwon;Jeon, Changyu;Kim, Jinmo
    • Journal of the Korea Computer Graphics Society / v.23 no.3 / pp.39-46 / 2017
  • This study proposes a gaze-based user interface to provide user-oriented interaction suitable for virtual reality environments on mobile platforms. For this purpose, mobile-platform-based three-dimensional interactive content is produced to test whether the proposed interface increases user satisfaction through interactions in a mobile virtual reality environment. The gaze-based interface, the most common input method for mobile virtual reality content, is designed around two factors: the field of view and the feedback system. The performance of the proposed gaze-based interface is analyzed by conducting experiments on whether it offers motives for user interest, enhances immersion, differs from existing formats, and provides convenience in operating content.

3D First Person Shooting Game by Using Eye Gaze Tracking (눈동자 시선 추적에 의한 3차원 1인칭 슈팅 게임)

  • Lee, Eui-Chul;Park, Kang-Ryoung
    • The KIPS Transactions:PartB / v.12B no.4 s.100 / pp.465-472 / 2005
  • In this paper, we propose a method of manipulating the gaze direction of a 3D FPS game character by detecting eye gaze from successive images captured by a USB camera attached beneath an HMD. The proposed method is composed of three parts. First, we detect the user's pupil center with a real-time image-processing algorithm over the successive input images. Second, in a calibration step, the geometric relationship between the gaze position on the monitor and the detected pupil center is determined while the user gazes at the monitor plane. Finally, the gaze position on the HMD monitor is tracked, and the 3D view in the game is controlled by that gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands. It can also increase interest and immersion by synchronizing the gaze direction of the game player with the view direction of the game character.
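The calibration step, relating pupil-center coordinates to a gaze position on the monitor, can be sketched as linear interpolation between calibration fixations. The two-point (top-left / bottom-right) scheme and all coordinate values below are simplifying assumptions; the paper's calibration may use more fixation points.

```python
def make_gaze_mapper(pupil_tl, pupil_br, screen_w, screen_h):
    """Build a pupil-to-screen mapping from two calibration fixations.

    pupil_tl / pupil_br are the pupil-center coordinates recorded while the
    user fixated the top-left and bottom-right screen corners (hypothetical
    calibration targets for this sketch).
    """
    def to_screen(px, py):
        # Linearly interpolate between the calibrated extremes.
        sx = (px - pupil_tl[0]) / (pupil_br[0] - pupil_tl[0]) * screen_w
        sy = (py - pupil_tl[1]) / (pupil_br[1] - pupil_tl[1]) * screen_h
        # Clamp so the game cursor stays on the monitor.
        return (min(max(sx, 0.0), screen_w), min(max(sy, 0.0), screen_h))
    return to_screen
```

In the game loop, the mapped screen point would then drive the FPS camera orientation each frame.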