• Title/Summary/Keyword: Real-time camera tracking


Mapping of Real-Time 3D object movement

  • Tengis, Tserendondog; Batmunkh, Amar
    • International Journal of Internet, Broadcasting and Communication / v.7 no.2 / pp.1-8 / 2015
  • Tracking an object in 3D space in real time is a significant task in domains ranging from autonomous robots to smart vehicles. Traditional methods rely on dedicated data-acquisition equipment such as radar or laser sensors. Advances in computing have accelerated image processing, allowing three-dimensional stereo vision to be used for localizing and tracking objects in space. This paper describes a system for tracking the three-dimensional motion of an object in real time using color information. Stereo images are created with a pair of simple web cameras, and raw object-position data are collected under realistic, noisy conditions. The system has been tested using OpenCV and Matlab, and the experimental results are presented.
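
As background for the approach described in this abstract, the following is a minimal Python/OpenCV sketch of color-based object localization in a rectified stereo pair; the HSV range, baseline, and focal length are illustrative assumptions, not values from the paper.

```python
# Color-based stereo localization sketch (assumed parameters, not the authors' code).
import cv2
import numpy as np

LOWER_HSV = np.array([100, 120, 70])   # assumed color range of the tracked object
UPPER_HSV = np.array([130, 255, 255])
BASELINE_M = 0.10                      # assumed distance between the two webcams (m)
FOCAL_PX = 700.0                       # assumed focal length in pixels

def centroid_of_color(frame_bgr):
    """Return the (u, v) centroid of the color mask, or None if nothing matches."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    m = cv2.moments(mask)
    if m["m00"] < 1e-3:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def triangulate(u_left, u_right, v_left, cx, cy):
    """Recover (x, y, z) in meters from the horizontal disparity of a rectified pair."""
    disparity = u_left - u_right
    if abs(disparity) < 1e-6:
        return None
    z = FOCAL_PX * BASELINE_M / disparity
    return ((u_left - cx) * z / FOCAL_PX, (v_left - cy) * z / FOCAL_PX, z)

cap_l, cap_r = cv2.VideoCapture(0), cv2.VideoCapture(1)
while True:
    ok_l, frame_l = cap_l.read()
    ok_r, frame_r = cap_r.read()
    if not (ok_l and ok_r):
        break
    cl, cr = centroid_of_color(frame_l), centroid_of_color(frame_r)
    if cl and cr:
        h, w = frame_l.shape[:2]
        pos = triangulate(cl[0], cr[0], cl[1], w / 2, h / 2)
        if pos:
            print("object position (m):", pos)
    if cv2.waitKey(1) == 27:           # Esc to quit
        break
```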

Controller Design for Object Tracking with an Active Camera (능동 카메라 기반의 물체 추적 제어기 설계)

  • Youn, Su-Jin; Choi, Goon-Ho
    • Journal of the Semiconductor & Display Technology / v.10 no.1 / pp.83-89 / 2011
  • In a tracking system with an active camera, it is very difficult to guarantee real-time processing, because the vision system handles large amounts of data at once and introduces processing delay. The reliability of the processed result is also degraded by the slow sampling time and by the uncertainty caused by image processing. In this paper, we analyze the dynamic characteristics of pixels projected onto the image plane and derive a mathematical model of the vision tracking system that includes both the actuating part and the image-processing part. Based on this model, we design a controller that stabilizes the system and enhances tracking performance so that a target can be tracked rapidly. The centroid is used as the position index of the moving object, and the DC motor in the actuating part is controlled to keep the identified centroid at the center of the image plane.
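
To make the control goal in this abstract concrete, here is a minimal Python sketch of a centering controller that maps the pixel error of the centroid to pan/tilt rate commands; the plain proportional law and the gains are illustrative assumptions, not the model-based controller derived in the paper.

```python
# Centroid-centering pan/tilt control sketch (assumed gains and motor interface).
from dataclasses import dataclass

IMAGE_W, IMAGE_H = 640, 480
KP_PAN, KP_TILT = 0.05, 0.05          # assumed proportional gains (deg/s per pixel)

@dataclass
class PanTiltCommand:
    pan_rate_dps: float               # pan velocity command, degrees per second
    tilt_rate_dps: float              # tilt velocity command, degrees per second

def centering_controller(centroid_u, centroid_v):
    """Map the pixel error (centroid vs. image center) to pan/tilt rate commands."""
    err_u = centroid_u - IMAGE_W / 2   # positive: target right of center -> pan right
    err_v = centroid_v - IMAGE_H / 2   # positive: target below center    -> tilt down
    return PanTiltCommand(KP_PAN * err_u, KP_TILT * err_v)

# Example: target detected at (500, 180) -> pan right, tilt up
print(centering_controller(500, 180))  # PanTiltCommand(pan_rate_dps=9.0, tilt_rate_dps=-3.0)
```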

A Real-time Particle Filtering Framework for Robust Camera Tracking in An AR Environment (증강현실 환경에서의 강건한 카메라 추적을 위한 실시간 입자 필터링 기법)

  • Lee, Seok-Han
    • Journal of Digital Contents Society / v.11 no.4 / pp.597-606 / 2010
  • This paper describes a real-time camera tracking framework designed to track a monocular camera in an AR workspace. The Kalman filter is commonly employed for camera tracking; in general, however, the tracking performance of conventional methods is seriously affected by unpredictable situations such as ambiguity in feature detection, occlusion of features, and rapid camera shake. In this paper, a recursive Bayesian sampling framework, also known as the particle filter, is adopted for camera pose estimation. In our system, the camera state is estimated on the basis of a Gaussian distribution without an additional uncertainty model or sample-weight computation. In addition, the camera state is computed directly from new sample particles distributed according to the true posterior of the system state. To verify the proposed system, we conduct several experiments under unstable conditions in desktop AR environments.
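
For reference, the sketch below shows a standard sampling-importance-resampling (SIR) particle filter over a simplified camera state; the paper's method specifically avoids explicit weight computation, so this is only the textbook baseline it departs from, with the state layout, noise levels, and measurement model assumed here.

```python
# Generic SIR particle filter sketch over a (x, y, z, yaw) camera state.
import numpy as np

rng = np.random.default_rng(0)
N_PARTICLES = 500

def predict(particles, motion_std=0.02):
    """Diffuse particles with Gaussian process noise (no motion prior assumed)."""
    return particles + rng.normal(0.0, motion_std, particles.shape)

def weights(particles, measured_pose, meas_std=0.05):
    """Weight particles by the likelihood of an (assumed) direct pose measurement."""
    d2 = np.sum((particles - measured_pose) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / meas_std**2)
    return w / np.sum(w)

def resample(particles, w):
    """Resample particles in proportion to their weights."""
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

particles = rng.normal(0.0, 0.1, (N_PARTICLES, 4))          # initial belief near origin
for measured_pose in [np.array([0.10, 0.00, 0.50, 0.02])]:  # placeholder measurements
    particles = predict(particles)
    particles = resample(particles, weights(particles, measured_pose))
    print("estimated camera state:", particles.mean(axis=0))
```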

An Optimal Implementation of Object Tracking Algorithm for DaVinci Processor-based Smart Camera (다빈치 프로세서 기반 스마트 카메라에서의 객체 추적 알고리즘의 최적 구현)

  • Lee, Byung-Eun; Nguyen, Thanh Binh; Chung, Sun-Tae
    • Proceedings of the Korea Contents Association Conference / 2009.05a / pp.17-22 / 2009
  • DaVinci processors are popular media processors for implementing embedded multimedia applications. They provide a dual-core architecture: an ARM9 core for video I/O, system management, and peripheral handling, and a C64x+ DSP core for efficient digital signal processing. In this paper, we describe our efforts toward an optimal implementation of an object tracking algorithm on the DaVinci-based smart camera being designed and implemented in our laboratory. The smart camera is intended to support object detection, object tracking, object classification, detection of intrusion into surveillance regions, and delivery of detection events to remote clients over IP. The object tracking algorithm is computationally expensive, since it involves several procedures, such as foreground mask extraction, foreground mask correction, connected-component labeling, blob region calculation, and object prediction, that require a large amount of computation time. Thus, unless it is implemented optimally on DaVinci processors, real-time performance of the smart camera cannot be expected.
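
The pipeline stages named in this abstract map directly onto standard building blocks; the following Python/OpenCV sketch illustrates them on a desktop, standing in for the DSP-optimized C implementation (the background-subtractor settings and area threshold are assumptions).

```python
# Foreground extraction -> mask correction -> labeling -> blob regions (sketch).
import cv2

bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

def track_blobs(frame_bgr, min_area=200):
    # 1. Foreground mask extraction
    mask = bg_subtractor.apply(frame_bgr)
    mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]   # drop shadow pixels
    # 2. Foreground mask correction (remove speckle noise, close small holes)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    # 3. Connected-component labeling and blob region calculation
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    blobs = []
    for i in range(1, n):                                        # label 0 = background
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            x, y, w, h = stats[i, :4]
            blobs.append({"bbox": (x, y, w, h), "centroid": tuple(centroids[i])})
    return blobs                                                 # input to object prediction
```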


Sector Based Scanning and Adaptive Active Tracking of Multiple Objects

  • Cho, Shung-Han; Nam, Yun-Young; Hong, Sang-Jin; Cho, We-Duke
    • KSII Transactions on Internet and Information Systems (TIIS) / v.5 no.6 / pp.1166-1191 / 2011
  • This paper presents an adaptive active tracking system with sector-based scanning for a single PTZ camera. Dividing the image into sectors reduces the search space and shortens selection time, so the system can cover many targets. Upon selecting a target, the system estimates the target trajectory to predict the zooming location within the finite time available for camera movement. Advanced estimation techniques based on probabilistic reasoning suffer from unknown object dynamics, and inaccurate estimation compromises the zooming level needed to prevent tracking failure. The proposed system uses simple piecewise estimation over a few frames to cope with fast-moving objects and/or slow camera movements. The target is tracked in multiple steps, and the zooming time for each step is determined by maximizing the zooming level within the expected variation of object velocity and detection. The number of zooming steps is determined adaptively according to target speed. In addition, iterative estimation of the zooming location, compensated for camera movement time, reduces the target prediction error caused by the difference between the speeds of the target and the camera. The effectiveness of the proposed method is validated by simulations and real-time experiments.
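
A minimal sketch of the piecewise prediction idea follows: estimate the target velocity from a few recent frames and predict where to point and zoom once the camera's movement time has elapsed. The frame interval, positions, and movement time are illustrative assumptions.

```python
# Piecewise velocity estimate and zoom-location prediction (assumed parameters).
import numpy as np

FRAME_DT = 1.0 / 30.0          # assumed frame interval (s)

def piecewise_velocity(recent_positions):
    """Average frame-to-frame velocity over a short window of (x, y) positions."""
    p = np.asarray(recent_positions, dtype=float)
    return (p[-1] - p[0]) / ((len(p) - 1) * FRAME_DT)

def predicted_zoom_location(recent_positions, camera_move_time_s):
    """Predict the target location at the moment the PTZ camera finishes moving."""
    v = piecewise_velocity(recent_positions)
    return np.asarray(recent_positions[-1], dtype=float) + v * camera_move_time_s

# Example: target drifting right; the camera needs 0.4 s to reach the new preset
positions = [(100, 200), (104, 200), (108, 201), (112, 201)]
print(predicted_zoom_location(positions, camera_move_time_s=0.4))
```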

A Study on Implementation of Motion Graphics Virtual Camera with AR Core

  • Jung, Jin-Bum; Lee, Jae-Soo; Lee, Seung-Hyun
    • Journal of the Korea Society of Computer and Information / v.27 no.8 / pp.85-90 / 2022
  • In this study, to reduce the time and cost drawbacks of the traditional motion graphics production workflow while reproducing virtual camera movement identical to that of the real camera, a method is proposed for creating a motion graphics virtual camera from real-time tracking data captured on an AR Core-based mobile device. Whereas the conventional approach runs tracking on the video file stored after shooting, the proposed method performs tracking on the AR Core-based mobile device concurrently with shooting, so that tracking success or failure can be determined at the shooting stage. In the experiments, the resulting motion graphics images showed no difference from the conventional method, but the conventional tracking step consumed 6 minutes and 10 seconds for a 300-frame image, whereas the proposed method can omit this step and is therefore far more time-efficient. At a time when interest in image production using virtual and augmented reality is growing and various studies are under way, this work can be applied to virtual camera creation and match moving.

Study on the Real-Time Moving Object Tracking using Fuzzy Controller (퍼지 제어기를 이용한 실시간 이동 물체 추적에 관한 연구)

  • Kim, Gwan-Hyung; Kang, Sung-In; Lee, Jae-Hyun
    • Journal of the Korea Institute of Information and Communication Engineering / v.10 no.1 / pp.191-196 / 2006
  • This paper presents a method for tracking a moving object using a vision system. To track the object in real time, its image must be kept at the origin of the image coordinate axes. Accordingly, a fuzzy control system is investigated for tracking the moving object by controlling a camera module with a pan/tilt mechanism. Because the system is intended to be applied to a mobile robot, an image-processing board for the vision system is designed and implemented, and the fuzzy controller is implemented on a StrongARM board. Experiments show that the proposed fuzzy controller is useful for a real-time moving-object tracking system.
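
To illustrate the kind of controller described here, the sketch below fuzzifies the horizontal pixel error of the target, applies a three-rule table, and defuzzifies to a pan-rate command; the membership functions and rule table are illustrative assumptions, not the controller reported in the paper.

```python
# Three-rule fuzzy pan controller sketch (assumed membership functions and rules).
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_pan_rate(err_px):
    """Map horizontal pixel error to a pan rate (deg/s)."""
    # Fuzzification: negative / near-zero / positive error (for a 640-pixel-wide image)
    mu_neg = tri(err_px, -320, -160, 0)
    mu_zero = tri(err_px, -80, 0, 80)
    mu_pos = tri(err_px, 0, 160, 320)
    # Rule consequents (singletons, deg/s): pan left / hold / pan right
    outs = {-10.0: mu_neg, 0.0: mu_zero, 10.0: mu_pos}
    total = sum(outs.values())
    if total == 0.0:
        return 0.0
    # Defuzzification: weighted average of the singleton consequents
    return sum(rate * mu for rate, mu in outs.items()) / total

print(fuzzy_pan_rate(120))   # positive error -> positive (rightward) pan rate
```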

An Implementation on the Real-Time Moving Object Tracking System Using DSP (DSP를 이용한 실시간 영상추적 시스템 구현)

  • Choi, Jae-Guen; Na, Jong-In; Ahn, Do-Rang; Lee, Dong-Wook
    • Proceedings of the KIEE Conference / 2001.11c / pp.406-408 / 2001
  • In this paper, a video tracker based on a TMS320C31 DSP is designed and implemented. It is intended to work with a PC over the PCI bus and can be used in real-time applications. The DSP board is capable of grabbing image data from a camera, calculating the position of a target, and tracking its movement. The tracking situation can be displayed on a PC monitor, and the measured displacement is fed back to pan and tilt the camera. Experimental results show that the implemented tracker works well in real applications.


Kinematic Method of Camera System for Tracking of a Moving Object

  • Jin, Tae-Seok
    • Journal of information and communication convergence engineering / v.8 no.2 / pp.145-149 / 2010
  • In this paper, we propose a kinematic approach to estimating the motion of a moving object in real time: a new scheme by which a mobile robot tracks and captures a moving object using camera images. The moving object is assumed to be a point object and is projected onto the image plane to form a geometrical constraint equation that provides position data of the object based on the kinematics of the active camera. Uncertainties in the position estimate caused by the point-object assumption are compensated using a Kalman filter. To generate the shortest-time path for capturing the moving object, the object's linear and angular velocities are estimated and utilized. Experimental results of tracking and capturing the target object with the mobile robot are presented.
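
As a reference for the compensation step mentioned in this abstract, here is a minimal constant-velocity Kalman filter sketch over a planar target state; the time step, noise covariances, and measurement sequence are illustrative assumptions rather than the paper's model.

```python
# Constant-velocity Kalman filter sketch for a (x, y, vx, vy) target state.
import numpy as np

DT = 0.1                                            # assumed sample time (s)
F = np.array([[1, 0, DT, 0],                        # constant-velocity motion model
              [0, 1, 0, DT],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],                         # position-only measurements
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)                                # assumed process noise
R = 0.05 * np.eye(2)                                # assumed measurement noise

def kalman_step(x, P, z):
    """One predict/update cycle; z is the measured (x, y) position."""
    x = F @ x                                       # predict state
    P = F @ P @ F.T + Q                             # predict covariance
    y = z - H @ x                                   # innovation
    S = H @ P @ H.T + R                             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P

x, P = np.zeros(4), np.eye(4)
for z in [np.array([0.10, 0.00]), np.array([0.21, 0.01]), np.array([0.30, 0.02])]:
    x, P = kalman_step(x, P, z)
print("estimated position and velocity:", x)        # velocity grows toward the target's motion
```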

Head tracking system using image processing (영상처리를 이용한 머리의 움직임 추적 시스템)

  • 박경수; 임창주; 반영환; 장필식
    • Journal of the Ergonomics Society of Korea / v.16 no.3 / pp.1-10 / 1997
  • This paper concerns the development and evaluation of a camera calibration method for a real-time head tracking system. Tracking head movements is important in the design of eye-controlled human/computer interfaces and in virtual environments. We propose a video-based head tracking system in which a camera mounted on the subject's head captures a front view containing eight 3-dimensional reference points (passive retro-reflecting markers) fixed at known positions on a computer monitor. The reference points are captured by an image-processing board and used to calculate the 3-dimensional position and orientation of the camera. A camera calibration method for providing accurate extrinsic camera parameters is proposed; it has three steps. In the first step, the image center is calibrated using the method of varying focal length. In the second step, the focal length and the scale factor are calibrated from the Direct Linear Transformation (DLT) matrix obtained from the known position and orientation of the camera. In the third step, the position and orientation of the camera are calculated from the DLT matrix using the calibrated intrinsic camera parameters. Experimental results showed that the average error of the estimated 3-dimensional camera position is about 0.53 cm, the angular errors of the camera orientation are less than $0.55^{\circ}$, and the data acquisition rate is about 10 Hz. The results of this study can be applied to tracking head movements for eye-controlled human/computer interfaces and virtual environments.
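
The DLT step in this abstract follows the standard formulation: build a linear system from 3D reference points and their image projections, solve it for the 3x4 projection matrix, and read the camera position off its null space. The sketch below shows that formulation in Python/NumPy; it uses placeholder data rather than the paper's marker layout or its three-step calibration of intrinsic parameters.

```python
# Standard DLT sketch: projection matrix from >= 6 3D-2D correspondences.
import numpy as np

def dlt_projection_matrix(points_3d, points_2d):
    """Solve p2d ~ P @ [p3d, 1] for the 3x4 matrix P (up to scale)."""
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 4)          # smallest right singular vector, reshaped to P

def camera_center(P):
    """Camera position = right null vector of P, dehomogenized."""
    _, _, vt = np.linalg.svd(P)
    c = vt[-1]
    return c[:3] / c[3]
```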
