• Title/Abstract/Keyword: camera interface

Search results: 407 (processing time 0.03 s)

머리의 자세를 추적하기 위한 효율적인 카메라 보정 방법에 관한 연구 (An Efficient Camera Calibration Method for Head Pose Tracking)

  • 박경수;임창주;이경태
    • 대한인간공학회지 / Vol. 19, No. 1 / pp.77-90 / 2000
  • The aim of this study is to develop and evaluate an efficient camera calibration method for vision-based head tracking. Tracking head movements is important in the design of an eye-controlled human/computer interface, and a vision-based head tracking system was proposed to allow for the user's head movements in such an interface. We proposed an efficient camera calibration method to accurately track the 3D position and orientation of the user's head, and evaluated its performance. The experimental error analysis showed that the proposed method provides a more accurate and stable camera pose (i.e. position and orientation) than the conventional direct linear transformation method that has been used for camera calibration. The results of this study can be applied to tracking head movements for eye-controlled human/computer interfaces and virtual reality technology.

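For orientation, the direct linear transformation that this abstract uses as its baseline estimates the camera's projective mapping linearly from known 3D reference points and their image coordinates; a standard formulation is sketched below (the paper's own refinement of the method is not reproduced here).

```latex
% Standard DLT baseline: a 3D reference point maps to its image point through a
% 3x4 projection matrix P, estimated linearly from point correspondences.
\[
s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix}
  = P \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix},
\qquad
P = \begin{pmatrix}
      p_{11} & p_{12} & p_{13} & p_{14} \\
      p_{21} & p_{22} & p_{23} & p_{24} \\
      p_{31} & p_{32} & p_{33} & p_{34}
    \end{pmatrix}.
\]
Eliminating the unknown scale $s$ gives two linear equations per reference point,
\[
u = \frac{p_{11}X + p_{12}Y + p_{13}Z + p_{14}}{p_{31}X + p_{32}Y + p_{33}Z + p_{34}},
\qquad
v = \frac{p_{21}X + p_{22}Y + p_{23}Z + p_{24}}{p_{31}X + p_{32}Y + p_{33}Z + p_{34}},
\]
so six or more non-coplanar reference points determine $P$ (up to scale) by
linear least squares, and the camera position and orientation follow from $P$.
```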

다목적실용위성 2호 고해상도 카메라 시스템의 전기적 인터페이스 및 소프트웨어 프로토콜 예비 설계 (Preliminary Design of Electric Interface and Software Protocol of MSC(Multi-Spectral Camera) on KOMPSAT-II)

  • 허행팔;용상순
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2000년도 제15차 학술회의논문집 / pp.101-101 / 2000
  • MSC(Multispectral Camera), which will be a unique payload on KOMPSAT-II, is designed to collect panchromatic and multi-spectral imagery with a ground sample distance of 1m and a swath width of 15km at a 685km altitude in a sun-synchronous orbit. The instrument is designed to have an orbit operation duty cycle of 20% over the mission lifetime of 3 years. The MSC electronics consists of three main subsystems: PMU(Payload Management Unit), CEU(Camera Electronics Unit) and PDTS(Payload Data Transmission Subsystem). The PMU handles all interfaces between the spacecraft and the MSC, and manages the other subsystems by sending commands to them and receiving telemetry from them with a software protocol over an RS-422 interface. The CEU controls the FPA(Focal Plane Assembly), which contains the TDI(Time Delay Integration) CCD(Charge Coupled Device) and its clock drivers. The PMU provides a master clock to synchronize the panchromatic and multispectral cameras. The PDTS performs compression, storage and encryption of the image data and transmits it to the ground station through X-band.

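The abstract describes the PMU exchanging commands and telemetry with the CEU and PDTS over RS-422 using a software protocol. The frame layout below is purely illustrative of such a command/telemetry exchange; the field names, sizes, and checksum scheme are assumptions, not the actual KOMPSAT-II MSC protocol.

```c
/* Illustrative sketch only: a generic command/telemetry frame for a point-to-point
 * RS-422 link between a payload management unit and a subsystem. Field names,
 * sizes, and the checksum scheme are assumptions, not the actual MSC protocol. */
#include <stdint.h>
#include <stddef.h>

typedef struct {
    uint8_t sync;         /* frame delimiter, e.g. 0xA5 (assumed)               */
    uint8_t dest;         /* destination subsystem: CEU, PDTS, ... (assumed id) */
    uint8_t type;         /* 0 = command, 1 = telemetry request (assumed)       */
    uint8_t length;       /* number of valid payload bytes                      */
    uint8_t payload[32];  /* command parameters or telemetry values             */
    uint8_t checksum;     /* 8-bit additive checksum over all preceding bytes   */
} msc_frame_t;

/* Compute the 8-bit additive checksum over every byte except the checksum itself. */
static uint8_t frame_checksum(const msc_frame_t *f)
{
    const uint8_t *bytes = (const uint8_t *)f;
    uint8_t sum = 0;
    for (size_t i = 0; i + 1 < sizeof(*f); ++i)
        sum = (uint8_t)(sum + bytes[i]);
    return sum;
}
```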

임베디드 시스템에서 CIS 카메라 인터페이스의 구현 (Developing a CIS Camera Interface for Embedded Systems)

  • 이완수;오삼권;황희융;노영섭
    • 한국산학기술학회논문지 / Vol. 8, No. 3 / pp.513-521 / 2007
  • Recently, the camera has become an essential multimedia function in the market for small mobile devices. However, many SoCs still do not support a camera interface, which causes considerable difficulty when implementing embedded devices at low cost. This paper therefore presents a method for easily supporting a camera in embedded systems whose SoC lacks a camera interface, even though the camera has become an essential feature. To this end, we implemented the interface using a CMOS Image Sensor(CIS) and wrote a device driver, showing a simple way to support a CIS in an embedded system.

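The paper's point is that a CIS can be exposed to applications through a device driver even when the SoC has no dedicated camera interface. The skeleton below is a generic Linux character-driver outline of that idea; the device name, frame size, and the frame-fetch routine are placeholders, not the paper's implementation.

```c
/* Minimal illustrative Linux character-driver skeleton for a CIS exposed without
 * a dedicated camera controller. Device name, frame size, and cis_capture_frame()
 * are placeholders, not the paper's implementation. */
#include <linux/module.h>
#include <linux/fs.h>
#include <linux/uaccess.h>

#define CIS_DEV_NAME "cis_cam"
#define CIS_FRAME_SZ (320 * 240 * 2)  /* e.g. QVGA, 16-bit pixels (assumed) */

static int cis_major;
static u8 frame_buf[CIS_FRAME_SZ];

/* Placeholder: a real driver would toggle the sensor's control lines (or a
 * GPIO/memory-mapped bus) and fill frame_buf with one captured frame. */
static void cis_capture_frame(void) { }

static ssize_t cis_read(struct file *filp, char __user *buf,
                        size_t count, loff_t *ppos)
{
    size_t n = (count < CIS_FRAME_SZ) ? count : CIS_FRAME_SZ;

    cis_capture_frame();
    if (copy_to_user(buf, frame_buf, n))
        return -EFAULT;
    return n;
}

static const struct file_operations cis_fops = {
    .owner = THIS_MODULE,
    .read  = cis_read,
};

static int __init cis_init(void)
{
    cis_major = register_chrdev(0, CIS_DEV_NAME, &cis_fops);
    return (cis_major < 0) ? cis_major : 0;
}

static void __exit cis_exit(void)
{
    unregister_chrdev(cis_major, CIS_DEV_NAME);
}

module_init(cis_init);
module_exit(cis_exit);
MODULE_LICENSE("GPL");
```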

Human-Computer Interaction Based Only on Auditory and Visual Information

  • Sha, Hui;Agah, Arvin
    • Transactions on Control, Automation and Systems Engineering / Vol. 2, No. 4 / pp.285-297 / 2000
  • One of the research objectives in the area of multimedia human-computer interaction is the application of artificial intelligence and robotics technologies to the development of computer interfaces. This involves utilizing many forms of media, integrating speech input, natural language, graphics, hand pointing gestures, and other methods for interactive dialogues. Although current human-computer communication methods include computer keyboards, mice, and other traditional devices, the two basic ways by which people communicate with each other are voice and gesture. This paper reports on research focusing on the development of an intelligent multimedia interface system modeled on the manner in which people communicate. This work explores interaction between humans and computers based only on the processing of speech (words uttered by the person) and the processing of images (hand pointing gestures). The purpose of the interface is to control a pan/tilt camera to point it to a location specified by the user through the utterance of words and the pointing of the hand. The system utilizes another stationary camera to capture images of the user's hand and a microphone to capture the user's words. Upon processing the images and sounds, the system responds by pointing the camera. Initially, the interface uses hand pointing to locate the general position which the user is referring to, and then uses voice commands provided by the user to fine-tune the location and change the zoom of the camera, if requested. The image of the location is captured by the pan/tilt camera and sent to a color TV monitor to be displayed. This type of system has applications in tele-conferencing and other remote operations, where the system must respond to the user's commands in a manner similar to how the user would communicate with another person. The advantage of this approach is the elimination of the traditional input devices that the user must utilize in order to control a pan/tilt camera, replacing them with more "natural" means of interaction. A number of experiments were performed to evaluate the interface system with respect to its accuracy, efficiency, reliability, and limitations.

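As described, the interface resolves a user-indicated location and then drives the pan/tilt camera toward it. The fragment below only illustrates the geometric last step of such a system, converting a target point expressed in the camera base frame into pan and tilt angles; the names and the axis convention are assumptions, not taken from the paper.

```c
/* Illustrative only: convert a target point (x, y, z) in the pan/tilt camera's
 * base frame into pan and tilt angles. The axis convention (z forward, x right,
 * y up) is an assumption, not taken from the paper. */
#include <math.h>
#include <stdio.h>

#define RAD_TO_DEG (180.0 / 3.14159265358979323846)

static void point_to_pan_tilt(double x, double y, double z,
                              double *pan_deg, double *tilt_deg)
{
    *pan_deg  = atan2(x, z) * RAD_TO_DEG;                   /* rotate about the vertical axis */
    *tilt_deg = atan2(y, sqrt(x * x + z * z)) * RAD_TO_DEG; /* elevate toward the point       */
}

int main(void)
{
    double pan, tilt;
    point_to_pan_tilt(0.5, 0.2, 2.0, &pan, &tilt);
    printf("pan = %.1f deg, tilt = %.1f deg\n", pan, tilt);
    return 0;
}
```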

영상처리를 이용한 머리의 움직임 추적 시스템 (Head tracking system using image processing)

  • 박경수;임창주;반영환;장필식
    • 대한인간공학회지 / Vol. 16, No. 3 / pp.1-10 / 1997
  • This paper is concerned with the development and evaluation of a camera calibration method for a real-time head tracking system. Tracking of head movements is important in the design of eye-controlled human/computer interfaces and in the area of virtual environments. We proposed a video-based head tracking system. A camera was mounted on the subject's head and took the front view containing eight 3-dimensional reference points (passive retro-reflecting markers) fixed at known positions on the computer monitor. The reference points were captured by an image processing board and used to calculate the 3-dimensional position and orientation of the camera. A suitable camera calibration method for providing accurate extrinsic camera parameters was proposed. The method has three steps. In the first step, the image center is calibrated using the method of varying focal length. In the second step, the focal length and the scale factor are calibrated from the Direct Linear Transformation (DLT) matrix obtained from the known position and orientation of the camera. In the third step, the position and orientation of the camera are calculated from the DLT matrix using the calibrated intrinsic camera parameters. Experimental results showed that the average error of the camera position (3-dimensional) is about 0.53cm, the angular errors of the camera orientation are less than $0.55^{\circ}$, and the data acquisition rate is about 10Hz. The results of this study can be applied to the tracking of head movements related to eye-controlled human/computer interfaces and virtual environments.

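The third calibration step described here, recovering the camera position and orientation from the DLT matrix, follows the usual factorization of the projection matrix; a generic version is sketched below (the paper's specific handling of the calibrated intrinsics is not reproduced).

```latex
% Generic pose recovery from a projection matrix; the paper's exact
% intrinsic-parameter handling is not reproduced here.
\[
P = K \,[\,R \mid \mathbf{t}\,],
\qquad
K = \begin{pmatrix} f s_x & 0 & u_0 \\ 0 & f & v_0 \\ 0 & 0 & 1 \end{pmatrix},
\]
so with the intrinsics $K$ (image center $(u_0, v_0)$, focal length $f$, scale
factor $s_x$) known from the first two steps,
\[
[\,R \mid \mathbf{t}\,] = K^{-1} P
\quad (\text{after normalizing the scale so that } R \text{ is a rotation}),
\qquad
\mathbf{c} = -R^{\top}\mathbf{t},
\]
where $R$ gives the camera orientation and $\mathbf{c}$ its 3D position relative
to the reference points.
```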

Implementation of Gesture Interface for Projected Surfaces

  • Park, Yong-Suk;Park, Se-Ho;Kim, Tae-Gon;Chung, Jong-Moon
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 9, No. 1 / pp.378-390 / 2015
  • Image projectors can turn any surface into a display. Integrating a surface projection with a user interface transforms it into an interactive display with many possible applications. Hand gesture interfaces are often used with projector-camera systems. Hand detection through color image processing is affected by the surrounding environment. The lack of illumination and color details greatly influences the detection process and drops the recognition success rate. In addition, there can be interference from the projection system itself due to image projection. In order to overcome these problems, a gesture interface based on depth images is proposed for projected surfaces. In this paper, a depth camera is used for hand recognition and for effectively extracting the area of the hand from the scene. A hand detection and finger tracking method based on depth images is proposed. Based on the proposed method, a touch interface for the projected surface is implemented and evaluated.
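
The touch interface described here rests on segmenting the hand from a depth image and deciding when it is close enough to the projected surface to count as contact. The loop below is a minimal sketch of that depth-thresholding idea; the resolution, thresholds, and background-depth model are assumed placeholders, not the paper's method.

```c
/* Minimal sketch of depth-based touch detection over a projected surface.
 * The surface is modeled as a per-pixel background depth map captured with the
 * scene empty; resolution and thresholds are assumed placeholders. */
#include <stdint.h>

#define W 320
#define H 240
#define TOUCH_MIN_MM  5   /* closer than this to the surface counts as contact */
#define TOUCH_MAX_MM 30   /* farther than this is treated as a hovering hand   */

/* Mark pixels where something sits within the touch band above the surface. */
static void detect_touch(const uint16_t surface[H][W],  /* background depth (mm) */
                         const uint16_t frame[H][W],    /* current depth (mm)    */
                         uint8_t touch_mask[H][W])
{
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            int d = (int)surface[y][x] - (int)frame[y][x]; /* height above surface */
            touch_mask[y][x] = (d >= TOUCH_MIN_MM && d <= TOUCH_MAX_MM);
        }
    }
}
```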

HMD를 이용한 사용자 자세 기반 항공 촬영용 쿼드로터 시스템 제어 인터페이스 개발 (A Posture Based Control Interface for Quadrotor Aerial Video System Using Head-Mounted Display)

  • 김재승;정종민;김한솔;황남웅;최윤호;박진배
    • 전기학회논문지 / Vol. 64, No. 7 / pp.1056-1063 / 2015
  • In this paper, we develop an interface for an aerial photography platform, which consists of a quadrotor and a gimbal, using the posture of the human body and head. As quadrotors have been widely adopted in many industries such as aerial photography, remote surveillance, and maintenance of infrastructure, the demand for aerial video and photographs has increased remarkably. Stick-type remote controllers are widely used to control a quadrotor, but this method is not an intuitive way of controlling the aerial vehicle and the camera simultaneously. Therefore, a new interface which controls the aerial photography platform is presented. The presented interface uses the human head movement measured by a head-mounted display as a reference for controlling the camera angle, and the human body posture measured by Kinect for controlling the attitude of the quadrotor. As the image captured by the camera is displayed on the head-mounted display simultaneously, the user can experience the sensation of flying and intuitively control the quadrotor and the camera. Finally, the performance of the developed system is shown to verify the effectiveness and superiority of the presented interface.
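
The interface maps the operator's head orientation (from the HMD) to the gimbal and the torso lean (from Kinect) to the quadrotor attitude set-points. The mapping below is only a schematic of that idea; the gains, limits, and signal names are assumptions rather than the paper's actual control law.

```c
/* Schematic mapping from measured operator posture to set-points, as described
 * in the abstract: head orientation -> gimbal angles, torso lean -> quadrotor
 * attitude. Gains and limits are assumptions, not the paper's control law. */

typedef struct { double yaw, pitch; } gimbal_cmd_t;
typedef struct { double roll, pitch; } attitude_cmd_t;

static double clamp(double v, double lo, double hi)
{
    return v < lo ? lo : (v > hi ? hi : v);
}

/* Head yaw/pitch from the HMD drive the camera gimbal directly. */
static gimbal_cmd_t head_to_gimbal(double head_yaw_deg, double head_pitch_deg)
{
    gimbal_cmd_t c = {
        .yaw   = clamp(head_yaw_deg,  -170.0, 170.0),
        .pitch = clamp(head_pitch_deg, -90.0,  30.0),
    };
    return c;
}

/* Torso lean angles from the skeleton tracker become attitude set-points. */
static attitude_cmd_t lean_to_attitude(double lean_fwd_deg, double lean_side_deg)
{
    const double gain = 0.4;  /* assumed scaling factor */
    attitude_cmd_t c = {
        .pitch = clamp(gain * lean_fwd_deg,  -15.0, 15.0),
        .roll  = clamp(gain * lean_side_deg, -15.0, 15.0),
    };
    return c;
}
```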

장애인을 위한 새로운 감성 인터페이스 연구 (A New Ergonomic Interface System for the Disabled Person)

  • 허환;이지우;이원오;이의철;박강령
    • 대한인간공학회지 / Vol. 30, No. 1 / pp.229-235 / 2011
  • Objective: To make a new ergonomic interface system based on a camera vision system, which helps the handicapped in the home environment. Background: The proposed interface system enables the handicapped to manipulate consumer electronics. Method: A wearable device for capturing the eye image using a near-infrared(NIR) camera and illuminators is proposed for tracking the eye gaze position(Heo et al., 2011). A frontal viewing camera attached to the wearable device can recognize the consumer electronics to be controlled(Heo et al., 2011). The amount of the user's eye fatigue can be measured based on the eye blink rate, and in case the user's fatigue exceeds a predetermined level, the proposed system automatically changes the gaze-based interface mode into manual selection mode. Results: The experimental results showed that the gaze estimation error of the proposed method was 1.98 degrees, with successful recognition of the object by the frontal viewing camera(Heo et al., 2011). Conclusion: We made a new ergonomic interface system based on gaze tracking and object recognition. Application: The proposed system can be used to help the handicapped in the home environment.
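
One element of this system is the automatic fall-back from gaze-based selection to manual selection once the measured eye-blink rate indicates fatigue. The snippet below sketches only that switching rule; the threshold and window length are assumptions, not figures from the paper.

```c
/* Sketch of the fatigue-based mode switch described in the abstract: when the
 * blink rate over a sliding window exceeds a threshold, the interface drops
 * from gaze control to manual selection. Threshold and window are assumed. */

enum interface_mode { MODE_GAZE, MODE_MANUAL };

#define WINDOW_SEC              60.0
#define FATIGUE_BLINKS_PER_MIN  30.0  /* assumed threshold, not from the paper */

static enum interface_mode update_mode(int blinks_in_window)
{
    double blinks_per_min = blinks_in_window * (60.0 / WINDOW_SEC);
    return (blinks_per_min > FATIGUE_BLINKS_PER_MIN) ? MODE_MANUAL : MODE_GAZE;
}
```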

3D 그래픽스 인터페이스에 대한 운동학적 접근 (A Kinematics Approach to 3D Graphical Interface)

  • 이주행;장태익;김명수;김만수;정경택;이의택
    • 한국컴퓨터그래픽스학회논문지 / Vol. 2, No. 2 / pp.53-60 / 1996
  • In a 3D graphics interface, the problem of controlling objects and a virtual camera can be interpreted as a problem of kinematics or inverse kinematics. It is well known that in inverse kinematics, singularities occur frequently when there are many redundant degrees of freedom. This study revisits the problems of 3D graphics interfaces from the viewpoint of reducing redundant degrees of freedom and presents partial solutions.

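For the inverse-kinematics view of camera and object control mentioned here, the relevant relation is the Jacobian between the interface's control velocities and the resulting camera or object motion; redundancy and singularities appear directly in its rank, as sketched below in a generic formulation (not the paper's specific parameterization).

```latex
% Generic inverse-kinematics relation; not the paper's specific parameterization.
\[
\dot{\mathbf{x}} = J(\mathbf{q})\,\dot{\mathbf{q}},
\]
where $\mathbf{q}$ collects the interface's $n$ control degrees of freedom,
$\mathbf{x}$ is the $m$-dimensional pose of the object or virtual camera being
driven, and $J(\mathbf{q})$ is the $m \times n$ Jacobian. For a redundant system
($n > m$) the least-norm solution
\[
\dot{\mathbf{q}} = J^{\top}\!\left(J J^{\top}\right)^{-1}\dot{\mathbf{x}}
\]
is ill-conditioned wherever $J$ loses rank (a singularity), which is why the
paper argues for reducing the redundant degrees of freedom in the interface.
```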

초기 혼합모드 동적 하중을 받는 경사계면균열의 동적 전파거동 (Dynamic Slant Interface Crack Propagation Behavior under Initial Impact Loading)

  • 이억섭;박재철;윤해룡
    • 한국정밀공학회지 / Vol. 18, No. 2 / pp.146-151 / 2001
  • The effects of a slant interface in a hybrid specimen on the dynamic crack propagation behavior have been investigated using dynamic photoelasticity. Dynamic photoelasticity, with the aid of a Cranz-Schardin type high-speed camera system, is utilized to record the dynamic stress field around the dynamically propagating inclined interface crack tip in three-point bending specimens. The dynamic load is applied by a hammer dropped from a height of 0.08m with no initial velocity. The dynamic crack propagation velocities and the dynamic stress fields around the interface crack tips are investigated. Theoretical dynamic isochromatic fringe loops are compared with the experimental results. It is interesting to note that the crack propagation velocity becomes comparable to the Rayleigh wave speed of the softer material of the specimen as the slant angle decreases.

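The closing observation compares the measured crack speed with the Rayleigh wave speed of the softer material; for reference, a common approximation of that speed in terms of the shear wave speed is given below (standard elastodynamics, not a result of the paper).

```latex
% Standard approximation of the Rayleigh wave speed (not a result of the paper).
\[
c_R \;\approx\; \frac{0.862 + 1.14\,\nu}{1 + \nu}\, c_S,
\qquad
c_S = \sqrt{\frac{\mu}{\rho}},
\]
where $\nu$ is Poisson's ratio, $\mu$ the shear modulus, and $\rho$ the density
of the softer constituent; this is the reference speed against which the
measured crack velocities are compared.
```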