• Title/Summary/Keyword: Head tracking

The Comparisons of 4 Channel Auditory Brainstem Response for Tracking Auditory Neuro-Pathway

  • Woo, Jin-Wan;Lee, Sang-Min;Hong, Sung-Hwa;Sung, Young-Ju;Park, Sook-Kyoung;Lee, Yong-Hee;Kim, In-Young;Kim, Sun-I.
    • Journal of Biomedical Engineering Research / v.25 no.3 / pp.195-200 / 2004
  • The Auditory Brainstem Response (ABR) to click stimulation in guinea pigs was used to examine the auditory neural pathway from the cochlear nucleus to the brain. Using multi-channel active electrodes, the three-dimensional auditory pathway was traced from the cochlea through the brainstem to the inferior colliculus. The results are consistent with the well-known neural pathway. The multi-channel ABR shows that the positions of the ABR generators shift toward the central brain and the contralateral pathway. It is generally agreed that the ABR is generated by several structures along the auditory pathway; this study provides information on where along the pathway each ABR peak is generated.
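
A minimal sketch, assuming a numpy-based pipeline, of how click-locked epochs could be averaged per channel and peak latencies read off. The sampling rate, window length, and array shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def average_abr(recording, stim_onsets, fs=20000, win_ms=10.0):
    """Average stimulus-locked epochs for each channel.

    recording   : (n_channels, n_samples) raw evoked-potential data
    stim_onsets : sample indices of click onsets
    fs          : sampling rate in Hz (assumed value)
    win_ms      : post-stimulus window in milliseconds (assumed value)
    """
    win = int(fs * win_ms / 1e3)
    epochs = np.stack([recording[:, t:t + win]
                       for t in stim_onsets
                       if t + win <= recording.shape[1]])
    return epochs.mean(axis=0)            # (n_channels, win) averaged ABR

def peak_latencies(abr, fs=20000):
    """Latency (ms) of the largest positive peak on each channel."""
    return np.argmax(abr, axis=1) / fs * 1e3
```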

Development of electric vehicle maintenance education ability using digital twin technology and VR

  • Lee, Sang-Hyun;Jung, Byeong-Soo
    • International Journal of Advanced Culture Technology / v.8 no.2 / pp.58-67 / 2020
  • In this paper, a maintenance training manual for electric vehicles (EVs) was produced using digital twin technology and sensors such as IR-based lighthouse tracking and a head tracker. Digital twin technology and VR provide high immersion to users, and sensory content creation technology was secured through animations and effects suited to EV maintenance situations. The training manual takes the form of a simulation built with 3D engine programming, featuring real-time creation of 3D objects, minimization of on-screen obstructions, and selection of specific menus in virtual space. Content that users can operate easily was also produced, with automatic output to a head-mounted display (HMD) for EV maintenance and inspection. This development can enhance user immersion through detailed scenarios for EV maintenance and inspection, step-by-step 3D parts display, and animations and effects for maintenance situations. Through this study, the quality of education was improved, familiarity with safety and the correct maintenance process was gained, and the system proved very helpful for naturally learning how to use the equipment and how to maintain EVs.

Robust Extraction of Heartbeat Signals from Mobile Facial Videos (모바일 얼굴 비디오로부터 심박 신호의 강건한 추출)

  • Lomaliza, Jean-Pierre;Park, Hanhoon
    • Journal of the Institute of Convergence Signal Processing / v.20 no.1 / pp.51-56 / 2019
  • This paper proposes an improved heartbeat signal extraction method for ballistocardiography (BCG)-based heart-rate measurement in mobile environments. First, a handshake-free head motion signal is extracted from a mobile facial video by tracking facial features and background features simultaneously. Then, a novel signal periodicity computation method is proposed to accurately separate the heartbeat signal from the head motion signal. The proposed method robustly extracts heartbeat signals from mobile facial videos and enables more accurate heart-rate measurement (errors reduced by 3-4 bpm) compared to the existing method.
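
As a rough sketch of the shake-cancellation idea, assuming the facial and background feature tracks are already available as arrays, the background motion can be subtracted from the facial motion and the heart rate read from the dominant spectral peak. The 0.75-4 Hz band and frame rate are assumed values, and the paper's own periodicity method is not reproduced here.

```python
import numpy as np

def heart_rate_bpm(face_y, bg_y, fps=30.0, band=(0.75, 4.0)):
    """face_y, bg_y : (n_features, n_frames) vertical feature tracks."""
    # Background features capture handshake; subtracting them leaves head motion.
    motion = face_y.mean(axis=0) - bg_y.mean(axis=0)
    motion = motion - motion.mean()
    spec = np.abs(np.fft.rfft(motion))
    freqs = np.fft.rfftfreq(motion.size, d=1.0 / fps)
    mask = (freqs >= band[0]) & (freqs <= band[1])   # plausible heart-rate band
    f_peak = freqs[mask][np.argmax(spec[mask])]      # dominant periodicity
    return 60.0 * f_peak                             # beats per minute
```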

ATTEST to MUTED - Problems, Answers, and the Evolution of a Multiple Mobile Viewer Autostereoscopic Display

  • Sexton, Ian;Buckley, Edward
    • Proceedings of the Korean Information Display Society Conference / 2007.08b / pp.1807-1810 / 2007
  • The evolution of a multi-viewer autostereoscopic display is described. Development of the display began as part of the EC-funded 'ATTEST' project and continues under another EC project, 'MUTED'. The design of the original display is presented, the limitations of the prototype are described, and the current iteration of the design is presented.

Computer Interface Using Head-Gaze Tracking (응시 위치 추적 기술을 이용한 인터페이스 시스템 개발)

  • 이정준;박강령;김재희
    • Proceedings of the IEEK Conference / 1999.06a / pp.516-519 / 1999
  • Gaze detection finds the position on a monitor screen where a user is looking, using image processing and computer vision technology. We developed a computer interface system using this gaze detection technology. The system enables a user to control the computer without using their hands, so it can help the handicapped use a computer and is also useful when the user's hands are occupied with another job, especially in factory tasks. For practical use, a command signal equivalent to a mouse click is necessary; we used eye winking to give this command signal to the system.
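
The interaction concept, gaze moves the pointer and a wink clicks, can be illustrated with a hypothetical event loop. get_gaze_point, detect_wink, move_cursor, and click are assumed stubs standing in for the paper's vision pipeline and OS calls, not a real API.

```python
import time

def run_gaze_mouse(get_gaze_point, detect_wink, move_cursor, click,
                   wink_frames=3, fps=30):
    """Poll the tracker and translate gaze/winks into pointer events."""
    wink_streak = 0
    while True:
        x, y = get_gaze_point()          # screen coords from the gaze tracker
        move_cursor(x, y)
        # Require several consecutive wink frames to reject ordinary blinks.
        wink_streak = wink_streak + 1 if detect_wink() else 0
        if wink_streak == wink_frames:
            click(x, y)                  # wink held long enough -> click
            wink_streak = 0
        time.sleep(1.0 / fps)
```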

Head tracking system Implementation by using depth camera (뎁스 카메라를 이용한 머리 추적 시스템 구현)

  • Ahn, Yang-Keun;Kwon, Ji-In
    • Proceedings of the Korea Information Processing Society Conference / 2014.04a / pp.1032-1033 / 2014
  • This paper proposes a method for tracking users' heads with a depth camera, regardless of the number of users. The proposed method tracks the head using only depth information, excluding color information, and uses experimentally obtained data to track heads whose depth-image shapes differ from user to user. The proposed method can track heads regardless of the type of camera.
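
A simplified sketch of depth-only multi-user head detection in the spirit of this abstract: segment a depth band into per-user blobs and take each blob's top slice as the head. The thresholds and the connected-component step are my assumptions, not the authors' method.

```python
import numpy as np
from scipy import ndimage

def find_heads(depth, near=500, far=3000, head_rows=40):
    """depth : (H, W) array in millimetres. Returns one (row, col) per user."""
    fg = (depth > near) & (depth < far)          # plausible user distance band
    labels, n = ndimage.label(fg)                # one connected blob per user
    heads = []
    for i in range(1, n + 1):
        rows, cols = np.nonzero(labels == i)
        top = rows.min()                         # highest point of the blob
        sel = rows < top + head_rows             # top slice ~ head region
        heads.append((int(rows[sel].mean()), int(cols[sel].mean())))
    return heads
```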

Head Tracking System Implementation Using a Depth Camera (깊이 카메라를 이용한 머리 추적 시스템 구현)

  • Ahn, Yang-Keun;Jung, Kwnag-Mo
    • Proceedings of the Korea Information Processing Society Conference / 2015.10a / pp.1673-1674 / 2015
  • This paper proposes a method for tracking users' heads with a depth camera, regardless of the number of users. The proposed method tracks the head using only depth information, excluding color information, and uses experimentally obtained data to track heads whose depth-image shapes differ from user to user. It also has the advantage of tracking heads regardless of the type of camera. Experiments were conducted with Microsoft's Kinect for Windows and SoftKinetic's DS311.

MyWorkspace: VR Platform with an Immersive User Interface (MyWorkspace: 몰입형 사용자 인터페이스를 이용한 가상현실 플랫폼)

  • Yoon, Jong-Won;Hong, Jin-Hyuk;Cho, Sung-Bae
    • Proceedings of the HCI Society of Korea Conference / 2009.02a / pp.52-55 / 2009
  • With the recent development of virtual reality, user interfaces for immersive interaction have been actively investigated. Immersive user interfaces improve the efficiency and capability of information processing in virtual environments providing various services, and provide effective interaction in ubiquitous and mobile computing. In this paper, we propose a virtual reality platform, MyWorkspace, which renders a 3D virtual workspace through an immersive user interface. We develop an interface that integrates an optical see-through head-mounted display, a Wii remote controller, and a helmet with infrared LEDs. It estimates the user's gaze direction in terms of horizontal and vertical angles based on a model of head movements. MyWorkspace expands the current monitor-based 2D workspace into a layered 3D workspace and renders the part of the 3D virtual workspace corresponding to the gaze direction. The user can arrange various tasks in the virtual workspace and switch between them by moving his or her head. We also verify the performance of the immersive user interface and its usefulness with a usability test.
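
An illustrative sketch, not the MyWorkspace source, of the task-switching idea: quantising estimated head yaw and pitch into a grid of virtual workspace panes. The grid size and angular ranges are assumed values.

```python
def pane_from_gaze(yaw_deg, pitch_deg, cols=3, rows=2,
                   yaw_range=(-45.0, 45.0), pitch_range=(-20.0, 20.0)):
    """Return (row, col) of the workspace pane the user is facing."""
    def bucket(v, lo, hi, n):
        t = min(max((v - lo) / (hi - lo), 0.0), 1.0)   # clamp to [0, 1]
        return min(int(t * n), n - 1)                  # map to a grid index
    return (bucket(pitch_deg, *pitch_range, rows),
            bucket(yaw_deg, *yaw_range, cols))
```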

Sound Source Localization using HRTF database

  • Hwang, Sung-Mok;Park, Young-Jin;Park, Youn-Sik
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 2005.06a / pp.751-755 / 2005
  • We propose a sound source localization method using the Head-Related Transfer Function (HRTF), to be implemented on a robot platform. In conventional localization methods, the location of a sound source is estimated from the time delays of wave fronts arriving at each microphone in an array in the free field. For a human head this corresponds to the Interaural Time Delay (ITD), which is simply the time delay of incoming sound waves between the two ears. Although the ITD is an excellent cue for lateral perception on the horizontal plane, confusion often arises when locating a sound from the ITD alone, because each sound source and its mirror image about the interaural axis share the same ITD. On the other hand, HRTFs associated with a dummy-head microphone system, or a robot platform with several microphones, contain not only the proper time delays but also phase and magnitude distortions due to diffraction and scattering by the shading object, such as the head and body of the platform. As a result, a set of HRTFs for a given platform provides a substantial amount of information about the whereabouts of the source, once proper analysis is performed. In this study, we introduce new phase and magnitude criteria to be satisfied by the set of output signals from the microphones in order to find the sound source location according to an HRTF database obtained empirically in an anechoic chamber with the given platform. The suggested method is verified through an experiment in a household environment and compared against the conventional method in performance.
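
For contrast with the paper's HRTF criteria, here is a textbook sketch of the conventional ITD baseline: estimate the inter-microphone delay by cross-correlation and convert it to an azimuth under a far-field assumption. Microphone spacing and sampling rate are assumed values, and the front-back ambiguity the abstract mentions remains.

```python
import numpy as np

def itd_azimuth(left, right, fs=44100, mic_dist=0.18, c=343.0):
    """Estimate source azimuth (degrees) from two microphone signals."""
    n = left.size + right.size - 1
    # Cross-correlation via FFT; zero-padding to n makes it linear.
    corr = np.fft.irfft(np.fft.rfft(left, n) *
                        np.conj(np.fft.rfft(right, n)), n)
    corr = np.roll(corr, n // 2)                 # centre zero lag
    delay = (np.argmax(corr) - n // 2) / fs      # ITD in seconds
    s = np.clip(delay * c / mic_dist, -1.0, 1.0)
    return np.degrees(np.arcsin(s))              # front/back ambiguity remains
```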

A Study on Manipulating Method of 3D Game in HMD Environment by using Eye Tracking (HMD(Head Mounted Display)에서 시선 추적을 통한 3차원 게임 조작 방법 연구)

  • Park, Kang-Ryoung;Lee, Eui-Chul
    • Journal of the Institute of Electronics Engineers of Korea SP / v.45 no.2 / pp.49-64 / 2008
  • Recently, much research on more convenient input devices based on gaze detection technology has been done in human-computer interfaces. However, system cost becomes high due to complicated hardware, and gaze detection systems are difficult to use due to complicated user calibration procedures. In this paper, we propose a new gaze detection method based on 2D analysis and a simple user calibration. Our method uses a small USB (Universal Serial Bus) camera attached to an HMD (Head-Mounted Display), a hot mirror, and an IR (infrared) illuminator. Because the HMD moves with the user's facial movement, the gaze detection performance is not affected by facial movement. In addition, we apply our gaze detection system to a 3D first-person shooting game, in which the gaze direction of the game character is controlled by our gaze detection method; it can target the enemy character and shoot, which increases the immersion and interest of the game. Experimental results showed that the game and gaze detection system operate at real-time speed on one desktop computer, with a gaze detection accuracy of 0.88 degrees. The results also suggest that our gaze detection technology could replace the conventional mouse in 3D first-person shooting games.
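
A plausible sketch, my illustration rather than the paper's method, of what a simple user calibration for 2D gaze analysis could look like: fitting an affine map from detected pupil-centre coordinates to screen coordinates by least squares over a few calibration targets.

```python
import numpy as np

def fit_calibration(pupil_pts, screen_pts):
    """pupil_pts, screen_pts : (n, 2) arrays of matched points, n >= 3."""
    A = np.hstack([pupil_pts, np.ones((len(pupil_pts), 1))])  # rows [x y 1]
    M, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)        # (3, 2) affine map
    return M

def gaze_to_screen(M, pupil_xy):
    """Map one pupil-centre measurement to screen coordinates."""
    x, y = pupil_xy
    return np.array([x, y, 1.0]) @ M
```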