• Title/Summary/Keyword: camera interface

Search Result 407

An Efficient Camera Calibration Method for Head Pose Tracking (머리의 자세를 추적하기 위한 효율적인 카메라 보정 방법에 관한 연구)

  • Park, Gyeong-Su;Im, Chang-Ju;Lee, Gyeong-Tae
    • Journal of the Ergonomics Society of Korea
    • /
    • v.19 no.1
    • /
    • pp.77-90
    • /
    • 2000
  • The aim of this study is to develop and evaluate an efficient camera calibration method for vision-based head tracking. Tracking head movements is important in the design of an eye-controlled human/computer interface, and a vision-based head tracking system was proposed to accommodate the user's head movements in such an interface. We propose an efficient camera calibration method for accurately tracking the 3D position and orientation of the user's head, and we evaluate its performance. The experimental error analysis showed that the proposed method provides a more accurate and stable camera pose (i.e., position and orientation) than the conventional direct linear transformation method that has been used for camera calibration. The results of this study can be applied to tracking head movements for eye-controlled human/computer interfaces and virtual reality technology.

  • PDF
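The error analysis described in the abstract amounts to comparing pose errors of two calibration methods over repeated trials. A minimal sketch of such a comparison (all data and names below are illustrative, not from the paper):

```python
import math

def rms_error(estimates, truth):
    """Root-mean-square Euclidean error between estimated and true 3D camera positions."""
    assert len(estimates) == len(truth)
    sq = [sum((e - t) ** 2 for e, t in zip(est, tru))
          for est, tru in zip(estimates, truth)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical trials: the same ground-truth pose estimated by two methods.
truth    = [(0.0, 0.0, 100.0)] * 3
method_a = [(0.1, 0.0, 100.2), (0.0, 0.1, 99.9), (0.2, 0.0, 100.1)]
method_b = [(1.0, 0.5, 101.0), (0.8, 0.4, 99.0), (1.2, 0.6, 101.5)]
more_stable = rms_error(method_a, truth) < rms_error(method_b, truth)
```

The same comparison would be repeated for orientation errors (in degrees) to support a stability claim like the paper's.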

Preliminary Design of Electrical Interface and Software Protocol of MSC (Multi-Spectral Camera) on KOMPSAT-II (다목적실용위성 2호 고해상도 카메라 시스템의 전기적 인터페이스 및 소프트웨어 프로토콜 예비 설계)

  • Heo, Haeng-Pal;Yong, Sang-Soon
    • Proceedings of the Institute of Control, Robotics and Systems Conference
    • /
    • 2000.10a
    • /
    • pp.101-101
    • /
    • 2000
  • MSC (Multispectral Camera), which will be a unique payload on KOMPSAT-II, is designed to collect panchromatic and multi-spectral imagery with a ground sample distance of 1m and a swath width of 15km from a 685km altitude in sun-synchronous orbit. The instrument is designed to have an orbit operation duty cycle of 20% over the mission lifetime of 3 years. The MSC electronics consists of three main subsystems: the PMU (Payload Management Unit), the CEU (Camera Electronics Unit), and the PDTS (Payload Data Transmission Subsystem). The PMU handles all interfaces between the spacecraft and the MSC, and manages the other subsystems by sending commands to them and receiving telemetry from them via a software protocol over an RS-422 interface. The CEU controls the FPA (Focal Plane Assembly), which contains the TDI (Time Delay Integration) CCD (Charge Coupled Device) and its clock drivers. The PMU provides a master clock to synchronize the panchromatic and multispectral cameras. The PDTS performs compression, storage, and encryption of image data and transmits the data to the ground station over X-band.

  • PDF
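The abstract describes a command/telemetry software protocol over RS-422 but gives no frame format. The real MSC protocol is not public; the sketch below assumes a simple hypothetical framing (sync word, command id, length, payload, additive checksum) just to illustrate what such a protocol layer does:

```python
import struct

SYNC = 0xEB90  # hypothetical sync word; the actual MSC framing is not documented here

def build_frame(cmd_id: int, payload: bytes) -> bytes:
    """Frame a command: sync, command id, payload length, payload, 8-bit checksum."""
    body = struct.pack(">HBB", SYNC, cmd_id, len(payload)) + payload
    checksum = sum(body) & 0xFF
    return body + bytes([checksum])

def parse_frame(frame: bytes):
    """Validate the checksum and structure; return (cmd_id, payload) or raise ValueError."""
    if (sum(frame[:-1]) & 0xFF) != frame[-1]:
        raise ValueError("checksum mismatch")
    sync, cmd_id, length = struct.unpack(">HBB", frame[:4])
    if sync != SYNC or length != len(frame) - 5:
        raise ValueError("malformed frame")
    return cmd_id, frame[4:4 + length]
```

Telemetry responses would use the same framing in the opposite direction, with the PMU polling each subsystem in turn.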

Developing a CIS Camera Interface for Embedded Systems (임베디드 시스템에서 CIS 카메라 인터페이스의 구현)

  • Lee, Wan-Su;Oh, Sam-Kwan;Hwang, Hee-Yeung;Roh, Young-Sub
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.8 no.3
    • /
    • pp.513-521
    • /
    • 2007
  • The camera is one of the primary multimedia functions in small mobile terminals. However, implementing low-cost embedded devices has been difficult because many SoCs do not provide a camera interface. This paper therefore presents a method for easily supporting a camera on embedded devices that lack a camera interface. For this purpose, an interface for a CMOS image sensor is implemented, and a method is provided for supporting the CIS (CMOS Image Sensor) in the embedded system by programming the device driver.

  • PDF
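A CIS driver like the one described typically begins by writing an initialization sequence to the sensor's configuration registers over I2C/SCCB. The register addresses and values below are hypothetical (real sensors define their own register maps in the datasheet), and the bus is simulated, so this is only a sketch of the driver's configuration step:

```python
# Hypothetical init sequence; a real driver takes these from the sensor datasheet.
INIT_SEQUENCE = [
    (0x12, 0x80),  # soft reset
    (0x11, 0x01),  # clock prescaler
    (0x40, 0xD0),  # output format (e.g. RGB565)
]

class FakeI2CBus:
    """Stand-in for an I2C/SCCB bus; a real driver would talk to hardware here."""
    def __init__(self):
        self.registers = {}

    def write(self, reg, value):
        self.registers[reg] = value

def init_sensor(bus):
    """Push the configuration sequence to the sensor and return the register state."""
    for reg, value in INIT_SEQUENCE:
        bus.write(reg, value)
    return bus.registers
```

In the embedded driver itself, the equivalent code would run in kernel space and be followed by configuring the SoC side (DMA buffers and the parallel/serial pixel bus) to receive frames.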

Human-Computer Interaction Based Only on Auditory and Visual Information

  • Sha, Hui;Agah, Arvin
    • Transactions on Control, Automation and Systems Engineering
    • /
    • v.2 no.4
    • /
    • pp.285-297
    • /
    • 2000
  • One of the research objectives in the area of multimedia human-computer interaction is the application of artificial intelligence and robotics technologies to the development of computer interfaces. This involves utilizing many forms of media, integrating speech input, natural language, graphics, hand-pointing gestures, and other methods for interactive dialogue. Although current human-computer communication methods include keyboards, mice, and other traditional devices, the two basic ways by which people communicate with each other are voice and gesture. This paper reports on research focusing on the development of an intelligent multimedia interface system modeled on the manner in which people communicate. The work explores interaction between humans and computers based only on the processing of speech (words uttered by the person) and of images (hand-pointing gestures). The purpose of the interface is to control a pan/tilt camera, pointing it at a location specified by the user through spoken words and hand pointing. The system utilizes another, stationary camera to capture images of the user's hand and a microphone to capture the user's words. Upon processing the images and sounds, the system responds by pointing the camera. Initially, the interface uses hand pointing to locate the general position the user is referring to; it then uses the user's voice commands to fine-tune the location and to change the camera's zoom, if requested. The image of the location is captured by the pan/tilt camera and displayed on a color TV monitor. This type of system has applications in tele-conferencing and other remote operations, where the system must respond to the user's commands in a manner similar to how the user would communicate with another person. The advantage of this approach is the elimination of the traditional input devices the user must otherwise utilize to control a pan/tilt camera, replacing them with more "natural" means of interaction. A number of experiments were performed to evaluate the interface system with respect to its accuracy, efficiency, reliability, and limitations.

  • PDF
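The coarse-then-fine aiming loop described above (hand pointing sets the rough direction, voice commands nudge it) can be sketched as follows. The geometry and the command vocabulary are assumptions for illustration, not the paper's actual implementation:

```python
import math

def direction_to_pan_tilt(dx, dy, dz):
    """Convert a 3D pointing-direction vector into pan/tilt angles in degrees."""
    pan = math.degrees(math.atan2(dx, dz))
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return pan, tilt

# Hypothetical voice vocabulary: each word nudges the aim by a fixed step.
VOICE_OFFSETS = {"left": (-2.0, 0.0), "right": (2.0, 0.0),
                 "up": (0.0, 2.0), "down": (0.0, -2.0)}

def refine(pan, tilt, command):
    """Apply one spoken fine-tuning command to the current camera angles."""
    dp, dt = VOICE_OFFSETS.get(command, (0.0, 0.0))
    return pan + dp, tilt + dt
```

In the real system the pointing vector would come from processing the stationary camera's hand images, and the refined angles would be sent to the pan/tilt head.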

Head tracking system using image processing (영상처리를 이용한 머리의 움직임 추적 시스템)

  • Park, Gyeong-Su;Im, Chang-Ju;Ban, Yeong-Hwan;Jang, Pil-Sik
    • Journal of the Ergonomics Society of Korea
    • /
    • v.16 no.3
    • /
    • pp.1-10
    • /
    • 1997
  • This paper is concerned with the development and evaluation of a camera calibration method for a real-time head tracking system. Tracking head movements is important in the design of eye-controlled human/computer interfaces and in virtual environments. We propose a video-based head tracking system in which a camera mounted on the subject's head captures the front view containing eight 3-dimensional reference points (passive retro-reflecting markers) fixed at known positions on a computer monitor. The reference points were captured by an image processing board and used to calculate the 3-dimensional position and orientation of the camera. A camera calibration method for providing accurate extrinsic camera parameters is proposed. The method has three steps. In the first step, the image center is calibrated using the method of varying focal length. In the second step, the focal length and the scale factor are calibrated from the Direct Linear Transformation (DLT) matrix obtained from the known position and orientation of the camera. In the third step, the position and orientation of the camera are calculated from the DLT matrix using the calibrated intrinsic camera parameters. Experimental results showed that the average error of the (3-dimensional) camera positions is about 0.53cm, the angular errors of the camera orientations are less than $0.55^{\circ}$, and the data acquisition rate is about 10Hz. The results of this study can be applied to tracking head movements for eye-controlled human/computer interfaces and virtual environments.

  • PDF
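The third calibration step, recovering the camera position from the DLT matrix, has a standard closed form: writing the 3x4 matrix as P = [M | p4], the camera centre is C = -M⁻¹p4. A self-contained sketch of that step (toy matrices only, not the paper's data):

```python
def camera_center_from_dlt(P):
    """For a 3x4 DLT/projection matrix P = [M | p4], return C = -M^{-1} p4."""
    a, b, c = P[0][:3]
    d, e, f = P[1][:3]
    g, h, i = P[2][:3]
    p4 = [P[0][3], P[1][3], P[2][3]]
    # 3x3 inverse via the adjugate (Cramer's rule).
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    inv = [
        [(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
        [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
        [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det],
    ]
    return [-sum(inv[r][k] * p4[k] for k in range(3)) for r in range(3)]

# Toy case: identity rotation/intrinsics, camera at (0, 0, 5) -> t = (0, 0, -5).
P = [[1.0, 0.0, 0.0,  0.0],
     [0.0, 1.0, 0.0,  0.0],
     [0.0, 0.0, 1.0, -5.0]]
center = camera_center_from_dlt(P)
```

The orientation would then be extracted from M after factoring out the calibrated intrinsic parameters, which is what distinguishes this three-step method from using the raw DLT solution directly.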

Implementation of Gesture Interface for Projected Surfaces

  • Park, Yong-Suk;Park, Se-Ho;Kim, Tae-Gon;Chung, Jong-Moon
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.1
    • /
    • pp.378-390
    • /
    • 2015
  • Image projectors can turn any surface into a display. Integrating a surface projection with a user interface transforms it into an interactive display with many possible applications. Hand gesture interfaces are often used with projector-camera systems. Hand detection through color image processing is affected by the surrounding environment. The lack of illumination and color details greatly influences the detection process and drops the recognition success rate. In addition, there can be interference from the projection system itself due to image projection. In order to overcome these problems, a gesture interface based on depth images is proposed for projected surfaces. In this paper, a depth camera is used for hand recognition and for effectively extracting the area of the hand from the scene. A hand detection and finger tracking method based on depth images is proposed. Based on the proposed method, a touch interface for the projected surface is implemented and evaluated.

A Posture Based Control Interface for Quadrotor Aerial Video System Using Head-Mounted Display (HMD를 이용한 사용자 자세 기반 항공 촬영용 쿼드로터 시스템 제어 인터페이스 개발)

  • Kim, Jaeseung;Jeong, Jong Min;Kim, Han Sol;Hwang, Nam Eung;Choi, Yoon Ho;Park, Jin Bae
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.64 no.7
    • /
    • pp.1056-1063
    • /
    • 2015
  • In this paper, we develop an interface for an aerial photography platform, consisting of a quadrotor and a gimbal, that is controlled with the human body and head posture. As quadrotors have been widely adopted in many industries, such as aerial photography, remote surveillance, and infrastructure maintenance, the demand for aerial video and photography has increased remarkably. Stick-type remote controllers are widely used to control a quadrotor, but they are not an intuitive way of controlling the aerial vehicle and the camera simultaneously. Therefore, a new interface for controlling the aerial photography platform is presented. The presented interface uses the head movement measured by a head-mounted display as the reference for the camera angle, and the body posture measured by Kinect for the attitude of the quadrotor. As the image captured by the camera is simultaneously displayed on the head-mounted display, the user has the sensation of flying and can intuitively control the quadrotor and the camera. Finally, the performance of the developed system is shown to verify the effectiveness and superiority of the presented interface.
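The mapping described above, head angles to gimbal angles and body lean to quadrotor attitude setpoints, can be sketched as below. The limits and gains are hypothetical placeholders; the paper does not state its actual control parameters:

```python
def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def gimbal_command(head_yaw, head_pitch, yaw_limit=90.0, pitch_limit=45.0):
    """Map HMD head angles (degrees) directly onto gimbal angles, within travel limits."""
    return (clamp(head_yaw, -yaw_limit, yaw_limit),
            clamp(head_pitch, -pitch_limit, pitch_limit))

def attitude_command(lean_forward, lean_side, gain=0.5, max_tilt=20.0):
    """Map body-lean angles from a skeleton tracker to quadrotor (roll, pitch) setpoints."""
    pitch = clamp(gain * lean_forward, -max_tilt, max_tilt)
    roll = clamp(gain * lean_side, -max_tilt, max_tilt)
    return roll, pitch
```

Running both mappings each frame, with the camera feed returned to the HMD, closes the loop that gives the user the first-person flying experience the abstract describes.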

A New Ergonomic Interface System for the Disabled Person (장애인을 위한 새로운 감성 인터페이스 연구)

  • Heo, Hwan;Lee, Ji-Woo;Lee, Won-Oh;Lee, Eui-Chul;Park, Kang-Ryoung
    • Journal of the Ergonomics Society of Korea
    • /
    • v.30 no.1
    • /
    • pp.229-235
    • /
    • 2011
  • Objective: To build a new ergonomic interface system, based on a camera vision system, that helps the handicapped in home environments. Background: The proposed interface system enables the handicapped to operate consumer electronics. Method: A wearable device that captures the eye image using a near-infrared (NIR) camera and illuminators is proposed for tracking the eye gaze position (Heo et al., 2011). A frontal-viewing camera attached to the wearable device recognizes the consumer electronics to be controlled (Heo et al., 2011). The user's eye fatigue is measured from the eye blink rate, and if it exceeds a predetermined level, the proposed system automatically changes from the gaze-based interface to manual selection. Results: The experimental results showed that the gaze estimation error of the proposed method was 1.98 degrees, with successful recognition of the object by the frontal-viewing camera (Heo et al., 2011). Conclusion: We built a new ergonomic interface system based on gaze tracking and object recognition. Application: The proposed system can be used to help the handicapped in home environments.
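The fatigue-triggered mode switch described in the Method section reduces to computing a trailing blink rate and comparing it against a threshold. A sketch of that logic (the window size and the threshold of 25 blinks/minute are made-up values; the paper does not give its actual level):

```python
def blink_rate(blink_timestamps, window_s=60.0, now=None):
    """Blinks per minute over the trailing window ending at `now` (seconds)."""
    if now is None:
        now = max(blink_timestamps, default=0.0)
    recent = [t for t in blink_timestamps if now - t <= window_s]
    return len(recent) * (60.0 / window_s)

def select_mode(rate, threshold=25.0):
    """Fall back to manual selection when the blink rate suggests eye fatigue."""
    return "manual" if rate > threshold else "gaze"
```

In the full system, blink events would come from the NIR eye camera, and the mode would be re-evaluated continuously while the user operates the interface.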

A Kinematics Approach to 3D Graphical Interface (3D 그래픽스 인터페이스에 대한 운동학적 접근)

  • Lee, Joo-Haeng;Jang, Tae-Ik;Kim, Myung-Soo;Kim, Mansoo;Chong, Kyung Taek;Lee, Ee Taek
    • Journal of the Korea Computer Graphics Society
    • /
    • v.2 no.2
    • /
    • pp.53-60
    • /
    • 1996
  • In a 3D graphics interface, 3D objects and the virtual camera have many degrees of freedom. We interpret the control of 3D objects and the virtual camera as a problem of kinematics and inverse kinematics. It is well known that extra degrees of freedom introduce various singularities in inverse kinematics. In this paper, we approach 3D graphics interface problems by reducing redundant degrees of freedom so that the control degrees of freedom match the degrees of freedom in the motions of the 3D objects and the virtual camera.

  • PDF
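A familiar instance of the degree-of-freedom reduction described above is an orbiting look-at camera: instead of a full 6-DOF pose, the camera is driven by just three parameters (azimuth, elevation, distance), which matches the control DOF of a mouse-plus-wheel interface and avoids redundant-DOF singularities. A sketch of that parameterization (this is a standard construction used for illustration, not the paper's specific formulation):

```python
import math

def orbit_camera(azimuth, elevation, distance, target=(0.0, 0.0, 0.0)):
    """Place a camera that orbits `target`, driven by 3 parameters instead of 6 DOF."""
    tx, ty, tz = target
    x = tx + distance * math.cos(elevation) * math.sin(azimuth)
    y = ty + distance * math.sin(elevation)
    z = tz + distance * math.cos(elevation) * math.cos(azimuth)
    return (x, y, z)  # the view direction is implied: always toward `target`
```

Because the camera always looks at the target, its orientation is fully determined by its position, so the inverse problem (which parameters produce a desired view?) stays well-posed away from the poles.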

Dynamic Slant Interface Crack Propagation Behavior under Initial Impact Loading (초기 혼합모드 동적 하중을 받는 경사계면균열의 동적 전파거동)

  • Lee, Eok-Seop;Park, Jae-Cheol;Yun, Hae-Ryong
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.18 no.2
    • /
    • pp.146-151
    • /
    • 2001
  • The effects of a slant interface in a hybrid specimen on dynamic crack propagation behavior have been investigated using dynamic photoelasticity. Dynamic photoelasticity, with the aid of a Cranz-Schardin type high-speed camera system, is used to record the dynamic stress field around the dynamically propagating inclined interface crack tip in three-point bending specimens. The dynamic load is applied by a hammer dropped from a height of 0.08m with no initial velocity. The dynamic crack propagation velocities and the dynamic stress fields around the interface crack tips are investigated, and theoretical dynamic isochromatic fringe loops are compared with the experimental results. It is interesting to note that the crack propagation velocity becomes comparable to the Rayleigh wave speed of the softer material of the specimen as the slant angle decreases.

  • PDF
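Crack-tip velocities from a Cranz-Schardin camera are typically obtained by finite differences of the tip positions measured on successive frames. A sketch of that unit bookkeeping (frame data below is made up; note that mm/µs equals km/s, hence the factor of 1000 to get m/s):

```python
def crack_velocities(positions_mm, times_us):
    """Finite-difference crack-tip speeds (m/s) between successive high-speed frames."""
    v = []
    for i in range(1, len(positions_mm)):
        ds = positions_mm[i] - positions_mm[i - 1]  # mm
        dt = times_us[i] - times_us[i - 1]          # microseconds
        v.append(ds / dt * 1000.0)                  # mm/us = km/s -> m/s
    return v
```

Comparing the resulting speeds against the Rayleigh wave speed of the softer material is how the abstract's closing observation would be checked quantitatively.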