• Title/Summary/Keyword: Iris tracking

Search Results: 21

A Study on Correction and Prevention System of Real-time Forward Head Posture (실시간 거북목 증후군 자세 교정 및 예방 시스템 연구)

  • Woo-Seok Choi;Ji-Mi Choi;Hyun-Min Cho;Jeong-Min Park;Kwang-in Kwak
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.24 no.3
    • /
    • pp.147-156
    • /
    • 2024
  • This paper introduces the design of a forward head (turtle neck) posture correction and prevention system for users who spend long hours on digital devices. The number of forward head posture patients in Korea increased by 13% from 2018 to 2021 and, according to the latest statistics, has not yet improved. Because of the nature of the condition, prevention is more important than treatment. Therefore, this paper designs the system around the camera built into most laptops to increase its accessibility, and uses the Pose Estimation, Face Landmarks Detection, Iris Tracking, and Depth Estimation features of Google MediaPipe so that no custom artificial intelligence models need to be produced and users can easily prevent forward head posture.
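The posture check this abstract describes can be reduced to simple landmark geometry. The sketch below assumes 2D ear and shoulder coordinates are already available (for example from MediaPipe Pose landmarks); the function names and the 40-degree threshold are illustrative assumptions, not values taken from the paper.

```python
import math

def neck_inclination(shoulder, ear):
    """Angle (degrees) between the shoulder->ear segment and the vertical.

    Points are (x, y) in image coordinates (y grows downward). A larger
    angle means the head sits further forward of the shoulders.
    """
    dx = ear[0] - shoulder[0]
    dy = shoulder[1] - ear[1]  # flip so "up" is positive
    return math.degrees(math.atan2(abs(dx), dy))

def is_forward_head(shoulder, ear, threshold_deg=40.0):
    """Flag forward head posture when the neck tilts past a threshold.

    The 40-degree default is an illustrative assumption, not a value
    from the paper.
    """
    return neck_inclination(shoulder, ear) > threshold_deg

# Ear almost directly above the shoulder -> upright posture.
print(is_forward_head((300, 400), (310, 250)))   # small tilt: False
# Ear well forward of the shoulder -> flagged posture.
print(is_forward_head((300, 400), (420, 300)))   # pronounced lean: True
```

In a full system this test would run per frame on the live landmark stream and trigger a notification after the angle stays above the threshold for some time.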

Eye detection on Rotated face using Principal Component Analysis (주성분 분석을 이용한 기울어진 얼굴에서의 눈동자 검출)

  • Choi, Yeon-Seok;Mun, Won-Ho;Cha, Eui-Young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2011.05a
    • /
    • pp.61-64
    • /
    • 2011
  • Many applications, such as human-computer interfaces (HCI), require robust and accurate eye tracking. In this paper, we propose a novel approach to eye tracking on rotated faces using principal component analysis. Intensity information is used in the iris detection process: first, the eye region is selected using principal component analysis; then, the eyes are detected using the intensity of the eye region. The experimental results show good performance in detecting eyes from FERET images that include rotated faces.
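The role PCA plays here, recovering the in-plane rotation of a face from pixel coordinates, has a closed form in 2D: the first principal axis of the covariance matrix. A minimal sketch, assuming the face pixels have already been segmented into a point list; this is not the authors' full pipeline.

```python
import math

def principal_axis_angle(points):
    """Angle (radians) of the first principal component of 2D points.

    For face-pixel coordinates this approximates the in-plane rotation
    of the face, so the eye-region search can run in an upright frame.
    Uses the closed form for the eigenvector of a 2x2 covariance matrix.
    """
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    return 0.5 * math.atan2(2.0 * sxy, sxx - syy)

# Points scattered along a line rotated 30 degrees from the x-axis.
theta = math.radians(30)
pts = [(t * math.cos(theta), t * math.sin(theta)) for t in range(-10, 11)]
print(round(math.degrees(principal_axis_angle(pts)), 1))  # 30.0
```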


A Study on the Vision Sensor System for Tracking the I-Butt Weld Joints (I형 맞대기 용접선 추적용 시각센서 시스템에 관한 연구)

  • Bae, Hee-Soo;Kim, Jae-Woong
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.18 no.9
    • /
    • pp.179-185
    • /
    • 2001
  • In this study, a visual sensor system for tracking the weld seam of I-butt weld joints in GMA welding was constructed. The sensor system consists of a CCD camera, a diode laser with a cylindrical lens, and a band-pass filter to overcome the image degradation caused by spatter and arc light. In order to obtain an enhanced image, the quantitative relationship between laser intensity and iris number was investigated. Throughout the repeated experiments, the shutter speed was set at 1 millisecond to minimize the effect of spatter on the image, and as a result most of the spatter traces in the image were found to be reduced. A region of interest was defined within the entire image, and the gray level of the searched laser line was compared to that of the weld line. The differences between these gray levels locate the position of the weld joint using the central difference method. The results showed that, as long as the weld line was within $\pm 15^{\circ}$ of the longitudinal straight line, the system constructed in this study could track the weld line successfully. Since the processing time was reduced to 0.05 sec, it is expected that the developed method could be adopted for high-speed welding such as laser welding.


A Study on a Vision Sensor System for Tracking the I-Butt Weld Joints

  • Kim Jae-Woong;Bae Hee-Soo
    • Journal of Mechanical Science and Technology
    • /
    • v.19 no.10
    • /
    • pp.1856-1863
    • /
    • 2005
  • In this study, a visual sensor system for tracking the weld seam of I-butt weld joints in GMA welding was constructed. The sensor system consists of a CCD camera, a diode laser with a cylindrical lens, and a band-pass filter to overcome the image degradation caused by spatter and arc light. In order to obtain an enhanced image, the quantitative relationship between laser intensity and iris opening was investigated. Throughout the repeated experiments, the shutter speed was set at 1/1000 second to minimize the effect of spatter on the image, so that an image without spatter traces could be obtained. A region of interest was defined within the entire image, and the gray level of the searched laser stripe was compared to that of the weld line. The differences between these gray levels locate the position of the weld joint using the central difference method. The results showed that, as long as the weld line is within $\pm 15^{\circ}$ of the longitudinal straight line, the system constructed in this study could track the weld line successfully. Since the processing time is no longer than 0.05 sec, it is expected that the developed method could be adopted for high-speed welding such as laser welding.
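The central-difference step described in the two abstracts above can be sketched on a 1D gray-level profile across the laser stripe: the joint gap appears as a dark dip, so its edges are the extreme central differences and the joint sits between them. The profile values and the midpoint rule below are illustrative assumptions, not the papers' exact procedure.

```python
def weld_joint_position(profile):
    """Locate the weld joint in a 1D gray-level profile of the laser stripe.

    The strongest negative central difference marks the left edge of the
    dark gap, the strongest positive one marks the right edge, and the
    joint centre is taken as their midpoint.
    """
    diffs = [(profile[i + 1] - profile[i - 1]) / 2.0
             for i in range(1, len(profile) - 1)]
    left = min(range(len(diffs)), key=lambda i: diffs[i]) + 1
    right = max(range(len(diffs)), key=lambda i: diffs[i]) + 1
    return (left + right) / 2.0

# Bright stripe (gray level 200) with a dark gap (40) at indices 5-8.
profile = [200, 200, 200, 200, 200, 40, 40, 40, 40, 200, 200, 200, 200]
print(weld_joint_position(profile))
```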

Robust pupil detection and gaze tracking under occlusion of eyes

  • Lee, Gyung-Ju;Kim, Jin-Suh;Kim, Gye-Young
    • Journal of the Korea Society of Computer and Information
    • /
    • v.21 no.10
    • /
    • pp.11-19
    • /
    • 2016
  • Displays are becoming larger and more varied in form, so previous gaze-tracking methods no longer apply; mounting the gaze-tracking camera above the display solves the problems caused by display size and height. However, this arrangement cannot use the infrared corneal-reflection information that previous methods rely on. In this paper, we propose a pupil-detection method that is robust to occlusion of the eyes, together with a method that simply calculates the gaze position from the inner eye corner, the pupil center, and the face pose information. In the proposed method, the camera switches between wide-angle and narrow-angle modes according to the person's position when capturing frames for gaze tracking: if a face is detected in the field of view (FOV) in wide-angle mode, the camera switches to narrow-angle mode after calculating the face position. The frames captured in narrow-angle mode contain the gaze-direction information of a person at a long distance. Calculating the gaze direction consists of a face pose estimation step and a gaze-direction calculation step. The face pose is estimated by mapping the feature points of the detected face to a 3D model. To calculate the gaze direction, an ellipse is first fitted to the iris edge information of the pupil, and if the pupil is occluded, its position is estimated with a deformable template; then the gaze position on the display is calculated from the pupil center, the inner eye corner, and the face pose information. In the experiments, the proposed gaze-tracking algorithm overcomes the constraints imposed by display form, effectively calculates the gaze direction of a person at a long distance using a single camera, and is validated at various distances.
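A toy version of the final mapping step, pupil center minus inner eye corner scaled to display coordinates, might look like the following. The linear gains and screen center are invented calibration values, and the paper's head-pose compensation is deliberately omitted from this sketch.

```python
def gaze_point(pupil, corner, gain=(25.0, 25.0), screen_center=(960, 540)):
    """Map the pupil-minus-inner-corner offset to a display coordinate.

    A deliberately simple linear calibration: the paper additionally
    folds in head-pose information, which this sketch omits. `gain`
    (screen pixels per pixel of eye offset) and `screen_center` are
    illustrative values, not from the paper.
    """
    dx = pupil[0] - corner[0]
    dy = pupil[1] - corner[1]
    x = screen_center[0] + gain[0] * dx
    y = screen_center[1] + gain[1] * dy
    return (x, y)

# Pupil 8 px right of and 2 px above the inner corner.
print(gaze_point(pupil=(108, 60), corner=(100, 62)))
```

In practice the gains would come from a short calibration sequence (the user fixates known screen points), and the offset would be rotated by the estimated face pose before mapping.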

Harris Corner Detection for Eyes Detection in Facial Images

  • Navastara, Dini Adni;Koo, Kyung-Mo;Park, Hyun-Jun;Cha, Eui-Young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2013.05a
    • /
    • pp.373-376
    • /
    • 2013
  • Nowadays, eye detection is required and considered the most important step in several applications, such as eye tracking, face identification and recognition, facial expression analysis, and iris detection. This paper presents eye detection in facial images using Harris corner detection. First, Haar-like features are used to detect the face region in an image. To separate the eye region from the whole face region, a projection function is applied. In the last step, Harris corner detection is used to locate the eyes. In the experimental results, the eye locations in both grayscale and color facial images were detected accurately and effectively.
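Harris corner detection itself is standard and can be sketched directly: image gradients by central differences, a 3×3 structure-tensor window, and the response R = det(M) − k·trace(M)². A minimal pure-Python version on a 2D list of gray levels (real pipelines would use an optimized implementation such as OpenCV's):

```python
def harris_response(img, k=0.04):
    """Harris corner response map for a 2D grayscale image (list of lists)."""
    h, w = len(img), len(img[0])
    ix = [[0.0] * w for _ in range(h)]
    iy = [[0.0] * w for _ in range(h)]
    # Gradients by central differences (borders left at zero).
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            ix[y][x] = (img[y][x + 1] - img[y][x - 1]) / 2.0
            iy[y][x] = (img[y + 1][x] - img[y - 1][x]) / 2.0
    r = [[0.0] * w for _ in range(h)]
    # Structure tensor summed over a 3x3 window, then R = det - k*trace^2.
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            sxx = syy = sxy = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    gx, gy = ix[y + dy][x + dx], iy[y + dy][x + dx]
                    sxx += gx * gx
                    syy += gy * gy
                    sxy += gx * gy
            r[y][x] = sxx * syy - sxy * sxy - k * (sxx + syy) ** 2
    return r

# Synthetic image: a bright quadrant whose corner sits at (x=4, y=4).
img = [[1.0 if y >= 4 and x >= 4 else 0.0 for x in range(8)]
       for y in range(8)]
r = harris_response(img)
# Corner responds positively, edges negatively, flat areas are zero.
print(r[4][4] > 0, r[6][4] < 0, r[1][1] == 0.0)
```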


Real-time eye feature tracking using iris model (홍채 모델을 이용한 눈 특징점 실시간 추적)

  • Kim, Do-Hyoung;Chung, Myung-Jin
    • Proceedings of the KIEE Conference
    • /
    • 2000.07d
    • /
    • pp.2717-2719
    • /
    • 2000
  • Among the various behavior modalities that reveal a user's intent, the system we are interested in is one based on detecting human eye movements. If human eye movements can be detected and tracked, the range of applications is very broad. For example, such a system can make computer operation more convenient for ordinary users, and it can serve as a means of communication and information exchange for people with disabilities who cannot use their hands. Moreover, considering that people acquire most of their information visually, it can be applied in many industrial and military fields, such as the monitoring of remote operations. In this paper, to detect and track eye feature points, we set up an iris model and match it to the input image received through the camera: three basic images are extracted from the camera input, matching functions that judge how well the model fits are defined, and an algorithm that aligns the iris model through these functions is proposed and its validity is demonstrated.


Investigation of sunspot substructure using chromospheric bright patches in a merging sunspot

  • Cho, Kyuhyoun
    • The Bulletin of The Korean Astronomical Society
    • /
    • v.45 no.1
    • /
    • pp.44.3-44.3
    • /
    • 2020
  • Sunspot substructure is an important subject for explaining sunspot stability and energy transport. Previous studies suggested two substructure models, the monolithic model and the spaghetti model, but no clear evidence has been found supporting either. To obtain clues about sunspot substructure, IRIS Mg II 2796Å slit-jaw images (SJI) were examined. The Mg II images, formed in the chromosphere, show bright patches inside umbrae that are regarded as an observational signature of upward-propagating slow magnetohydrodynamic (MHD) waves. The slow MHD waves are expected to be generated by convective motion below the photosphere. By tracking the motion of the bright patches, it is possible to estimate the locations of the oscillation centers, which correspond to the occurrence positions of the convection. I investigated the spatial distribution of the oscillation centers in a merging sunspot and found that they are randomly distributed. This implies that the occurrence rate of convective motion inside the sunspot is not much different from that between the two sunspots, and supports the spaghetti model for sunspot substructure.


Detecting and Tracking Vehicles at Local Region by using Segmented Regions Information (분할 영역 정보를 이용한 국부 영역에서 차량 검지 및 추적)

  • Lee, Dae-Ho;Park, Young-Tae
    • Journal of KIISE:Software and Applications
    • /
    • v.34 no.10
    • /
    • pp.929-936
    • /
    • 2007
  • A novel vision-based scheme for extracting traffic parameters in real time is proposed in this paper. Vehicles are detected and tracked within a local region installed by an operator. The local region is divided into segmented regions using edges and frame differences, and the segmented regions are classified into vehicle, road, shadow, and headlight using statistical and geometrical features. Vehicles are detected from the result of this classification. Traffic parameters such as velocity, length, occupancy, and distance are estimated by tracking with template matching in the local region. Because no background image is used, the method can be applied under various conditions of weather, time of day, and location. It performs well, with a 90.16% detection rate across various databases. If the camera direction, angle, and iris are fitted to the operating conditions, we expect it to serve as the core of traffic monitoring systems.
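The frame-difference part of the segmentation can be sketched as a changed-pixel ratio inside the operator-defined local region, which is also how an occupancy-style parameter falls out. The threshold and images below are illustrative, not the paper's values.

```python
def region_occupancy(prev, curr, region, threshold=25):
    """Fraction of pixels in a local region whose frame difference exceeds
    a threshold -- a minimal stand-in for the segmentation step.

    `region` is (x0, y0, x1, y1); images are 2D lists of gray levels.
    The threshold value is an illustrative assumption.
    """
    x0, y0, x1, y1 = region
    changed = total = 0
    for y in range(y0, y1):
        for x in range(x0, x1):
            total += 1
            if abs(curr[y][x] - prev[y][x]) > threshold:
                changed += 1
    return changed / total

prev = [[50] * 10 for _ in range(10)]
curr = [row[:] for row in prev]
for y in range(2, 6):          # a 4x4 "vehicle" enters the region
    for x in range(2, 6):
        curr[y][x] = 200
print(region_occupancy(prev, curr, (0, 0, 10, 10)))
```

The full scheme then classifies each connected changed region (vehicle, shadow, headlight, road) before template-matching-based tracking.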

Webcam-Based 2D Eye Gaze Estimation System By Means of Binary Deformable Eyeball Templates

  • Kim, Jin-Woo
    • Journal of information and communication convergence engineering
    • /
    • v.8 no.5
    • /
    • pp.575-580
    • /
    • 2010
  • Eye gaze as a form of input was primarily developed for users who are unable to use usual interaction devices such as the keyboard and mouse; however, with increasing accuracy and decreasing development cost of eye gaze detection, it is likely to become a practical interaction method for able-bodied users in the near future as well. This paper explores a low-cost, robust, rotation- and illumination-independent eye gaze system for gaze-enhanced user interfaces. We introduce two new algorithms for fast, sub-pixel-precise pupil center detection and 2D eye gaze estimation by means of deformable template matching. We propose a deformable angular integral search algorithm based on minimum intensity to localize the eyeball (iris outer boundary) in grayscale eye-region images; it finds the center of the pupil for use in our second proposed algorithm, which performs 2D eye gaze tracking. First, we detect the eye regions with the Intel OpenCV AdaBoost Haar cascade classifiers and assign an approximate eyeball size depending on the eye-region size. Second, the pupil center is detected using the DAISMI (Deformable Angular Integral Search by Minimum Intensity) algorithm. Then, using the percentage of black pixels over the eyeball circle area, the image is converted to binary (black and white) for use in the next stage, the DTBGE (Deformable Template Based 2D Gaze Estimation) algorithm. Finally, starting from the initial pupil center coordinates, DTBGE produces refined pupil center coordinates and estimates the final gaze direction and eyeball size. We have performed extensive experiments and achieved very encouraging results, and we discuss the effectiveness of the proposed method through several experimental results.
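A simplified reading of the minimum-intensity angular integral search might look like this: for each candidate center, sum gray levels along a fan of rays and keep the darkest candidate, since the pupil is the darkest blob in the eye region. This is only a sketch in the spirit of DAISMI, not the authors' exact formulation (which is deformable and sub-pixel precise).

```python
import math

def darkest_center(img, radius=3, n_rays=8):
    """Pick the candidate whose radial intensity integral is minimal.

    For every interior pixel, sum gray levels along `n_rays` rays of
    length `radius`; the pupil, being the darkest blob, minimises the
    sum. `radius` and `n_rays` are illustrative parameters.
    """
    h, w = len(img), len(img[0])
    best, best_sum = None, float("inf")
    for cy in range(radius, h - radius):
        for cx in range(radius, w - radius):
            total = 0.0
            for k in range(n_rays):
                a = 2.0 * math.pi * k / n_rays
                for r in range(1, radius + 1):
                    py = int(round(cy + r * math.sin(a)))
                    px = int(round(cx + r * math.cos(a)))
                    total += img[py][px]
            if total < best_sum:
                best, best_sum = (cx, cy), total
    return best

# Bright eye region (200) with a dark 3x3 pupil centred at (x=5, y=4).
img = [[200] * 12 for _ in range(10)]
for y in range(3, 6):
    for x in range(4, 7):
        img[y][x] = 10
print(darkest_center(img))
```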