• Title/Summary/Keyword: multi-camera system

Multi-camera System Calibration with Built-in Relative Orientation Constraints (Part 1) Theoretical Principle

  • Lari, Zahra;Habib, Ayman;Mazaheri, Mehdi;Al-Durgham, Kaleel
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.32 no.3
    • /
    • pp.191-204
    • /
    • 2014
  • In recent years, multi-camera systems have been recognized as an affordable alternative for the collection of 3D spatial data from physical surfaces. The collected data can be applied to different mapping (e.g., mobile mapping and mapping of inaccessible locations) or metrology applications (e.g., industrial, biomedical, and architectural). In order to fully exploit the potential accuracy of these systems and ensure successful manipulation of the involved cameras, a careful system calibration should be performed prior to the data collection procedure. The calibration of a multi-camera system is accomplished when the individual cameras are calibrated and the geometric relationships among the different system components are defined. In this paper, a new single-step approach is introduced for the calibration of a multi-camera system (i.e., individual camera calibration and estimation of the lever-arm and boresight angles among the system components). In this approach, one of the cameras is set as the reference camera and the system mounting parameters are defined relative to that reference camera. The proposed approach is easy to implement and computationally efficient. Its major advantage, when compared to available multi-camera system calibration approaches, is that it can be applied to either directly or indirectly geo-referenced multi-camera systems. The feasibility of the proposed approach is verified through experimental results using real data collected by a newly-developed indirectly geo-referenced multi-camera system.
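
In this single-step formulation, every non-reference camera's exterior orientation follows from the reference camera's pose and the mounting parameters (lever-arm offset and boresight angles). A minimal numpy sketch of that composition is shown below; the function names, the Z-Y-X Euler convention, and the pose conventions are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def rotation_from_boresight(omega, phi, kappa):
    """Rotation matrix from boresight angles (Z-Y-X convention, an assumption)."""
    cx, sx = np.cos(omega), np.sin(omega)
    cy, sy = np.cos(phi), np.sin(phi)
    cz, sz = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def camera_pose_from_reference(R_ref, t_ref, boresight, lever_arm):
    """Exterior orientation of a secondary camera from the reference camera's
    camera-to-world rotation R_ref and position t_ref plus its mounting
    parameters (lever_arm expressed in the reference camera frame)."""
    dR = rotation_from_boresight(*boresight)
    R_cam = R_ref @ dR                   # orientation of the secondary camera
    t_cam = t_ref + R_ref @ lever_arm    # position offset mapped into the world frame
    return R_cam, t_cam
```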

Combined Static and Dynamic Platform Calibration for an Aerial Multi-Camera System

  • Cui, Hong-Xia;Liu, Jia-Qi;Su, Guo-Zhong
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.10 no.6
    • /
    • pp.2689-2708
    • /
    • 2016
  • Multi-camera systems that integrate two or more low-cost digital cameras are adopted to reach higher ground coverage and improve the base-height ratio in low-altitude remote sensing. To guarantee accurate multi-camera integration, the geometric relationship among the cameras must be determined through platform calibration techniques. This paper proposes a combined two-step platform calibration method. In the first step, static platform calibration is conducted based on the stable relative orientation constraint and convergent imaging conditions among the cameras in a static environment. In the second step, a dynamic platform self-calibration approach based not only on tie points but also on straight lines is proposed, in order to correct the small changes in the relative relationship among the cameras during flight. Experiments with the proposed two-step platform calibration method were carried out on terrestrial and aerial images from a multi-camera system combining four consumer-grade digital cameras onboard an unmanned aerial vehicle. The experimental results show that the proposed platform calibration approach is able to compensate for the varying relative relationship during flight, achieving a virtual-image mosaicking accuracy of better than 0.5 pixel. The proposed approach can be extended to calibrating other low-cost multi-camera systems without a rigorous mechanical structure.
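
The static step relies on the fact that the relative orientation between rigidly mounted cameras should stay constant across exposures, while the dynamic step corrects its small in-flight changes. The sketch below illustrates the underlying relative-orientation computation and a simple drift check from per-exposure poses; it is an illustration under assumed pose conventions, not the authors' bundle-adjustment formulation.

```python
import numpy as np

def relative_orientation(R_a, t_a, R_b, t_b):
    """Relative rotation and offset of camera b with respect to camera a,
    given camera-to-world rotations and camera centers (assumed convention)."""
    R_rel = R_a.T @ R_b
    t_rel = R_a.T @ (t_b - t_a)
    return R_rel, t_rel

def mounting_drift_deg(poses_a, poses_b):
    """Largest angular deviation (degrees) of the pairwise relative rotation
    from its value at the first exposure; a large value suggests the static
    calibration no longer holds during flight."""
    rels = [relative_orientation(Ra, ta, Rb, tb)[0]
            for (Ra, ta), (Rb, tb) in zip(poses_a, poses_b)]
    R_first = rels[0]
    angles = []
    for R in rels:
        dR = R_first.T @ R
        angles.append(np.degrees(np.arccos(np.clip((np.trace(dR) - 1) / 2, -1.0, 1.0))))
    return max(angles)
```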

Performance characteristics of a multi-directional underwater CCTV camera system to use in the artificial reef survey (인공어초 조사용 다방향 수중 CCTV 카메라 시스템의 성능 특성)

  • Lee, Dae-Jae
    • Journal of the Korean Society of Fisheries and Ocean Technology
    • /
    • v.47 no.2
    • /
    • pp.146-152
    • /
    • 2011
  • Underwater CCTV camera systems are increasingly replacing the traditional net-based approach to assessing the species, numbers, and aggregation patterns of marine animals distributed around artificial reefs installed in inshore fishing grounds, particularly for biological investigations of the behavior and distribution patterns of target fishes. To meet these needs, we developed a multi-directional underwater CCTV camera system for detecting and tracking marine animals in artificial reef grounds. The targets to be investigated were tracked independently by a camera module directed toward the bottom and four camera modules installed at 90° intervals in the horizontal plane with a 45° inclination in the vertical plane, without overlap of the video frames from the individual modules. Based on the results of several field tests at sea, we believe that the developed multi-directional underwater CCTV camera system will contribute to a better understanding and evaluation of the effect of artificial reefs installed in inshore fishing grounds.
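
The non-overlapping coverage follows from the module layout: one axis pointing straight down and four axes tilted 45° below the horizontal at 90° azimuth intervals. The sketch below reproduces that viewing geometry and checks the smallest angle between module axes; the comparison against a per-module field of view is a hypothetical illustration.

```python
import numpy as np

def module_directions():
    """Unit viewing directions: one module pointing down, four modules at
    45 deg below the horizontal spaced 90 deg apart in azimuth."""
    dirs = [np.array([0.0, 0.0, -1.0])]            # downward-looking module
    elev = np.radians(-45)                          # 45 deg inclination
    for az in np.radians([0, 90, 180, 270]):
        dirs.append(np.array([np.cos(elev) * np.cos(az),
                              np.cos(elev) * np.sin(az),
                              np.sin(elev)]))
    return np.array(dirs)

def min_pairwise_angle(dirs):
    """Smallest angle between any two module axes; frames cannot overlap if
    this exceeds the per-module field of view (a hypothetical figure)."""
    angles = []
    for i in range(len(dirs)):
        for j in range(i + 1, len(dirs)):
            c = np.clip(dirs[i] @ dirs[j], -1.0, 1.0)
            angles.append(np.degrees(np.arccos(c)))
    return min(angles)

print(min_pairwise_angle(module_directions()))  # ~45 degrees for this layout
```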

Multi-camera System Calibration with Built-in Relative Orientation Constraints (Part 2) Automation, Implementation, and Experimental Results

  • Lari, Zahra;Habib, Ayman;Mazaheri, Mehdi;Al-Durgham, Kaleel
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.32 no.3
    • /
    • pp.205-216
    • /
    • 2014
  • Multi-camera systems have been widely used as cost-effective tools for the collection of geospatial data for various applications. In order to fully achieve the potential accuracy of these systems for object space reconstruction, careful system calibration should be carried out prior to data collection. Since the structural integrity of the involved cameras' components and the system mounting parameters cannot be guaranteed over time, a multi-camera system should be frequently calibrated to confirm the stability of the estimated parameters. Therefore, automated techniques are needed to facilitate and speed up the system calibration procedure. The automation of the multi-camera system calibration approach proposed in the first part of this paper is contingent on the automated detection, localization, and identification of the signalized object space targets in the images. In this paper, the automation of the proposed camera calibration procedure through automatic target extraction and labelling approaches is presented. The introduced automated system calibration procedure is then implemented for a newly-developed multi-camera system while considering the optimum configuration for data collection. Experimental results from the implemented system calibration procedure are finally presented to verify the feasibility of the proposed automated procedure. Qualitative and quantitative evaluation of the estimated system calibration parameters from two calibration sessions is also presented to confirm the stability of the cameras' interior orientation and system mounting parameters.
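
The automation hinges on reliably detecting and localizing the signalized targets in each image. A minimal sketch of such a target-extraction step using OpenCV's blob detector is given below; the thresholds and the assumption of dark, circular targets on a bright background are illustrative choices, not the paper's extraction and labelling algorithm.

```python
import cv2
import numpy as np

def extract_targets(image_path):
    """Detect dark, roughly circular signalized targets and return their
    centroids in image coordinates (illustrative parameters only)."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    params = cv2.SimpleBlobDetector_Params()
    params.filterByCircularity = True
    params.minCircularity = 0.8          # keep nearly circular blobs only
    params.filterByArea = True
    params.minArea = 20                  # discard noise-sized blobs
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray)
    return np.array([kp.pt for kp in keypoints])   # (x, y) per detected target
```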

Development of a Multi-View Camera System Prototype (다각사진촬영시스템 프로토타입 개발)

  • Park, Seon-Dong;Seo, Sang-Il;Yoon, Dong-Jin;Shin, Jin-Soo;Lee, Chang-No
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.27 no.2
    • /
    • pp.261-271
    • /
    • 2009
  • Due to the recent rise in demand for three-dimensional geospatial information on urban areas, general interest in aerial multi-view cameras has been increasing. Conventional geospatial information systems depend solely on vertical images, whereas a multi-view camera is capable of taking both vertical and oblique images from multiple directions, making it easier for the user to interpret objects. Through our research we developed a prototype of a multi-view camera system that includes a camera system, GPS/INS, a flight management system, and a control system. We also studied and experimented with the camera viewing angles, the synchronization of image capture, the exposure delay, and the data storage that must be considered in the development of a multi-view camera system.
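
Exposure synchronization is one of the design items listed above. A toy check of the worst-case inter-camera trigger delay from logged timestamps might look like the sketch below; the log format and the example values are assumptions for illustration only.

```python
def max_sync_error(timestamps_per_camera):
    """Largest spread (seconds) among the cameras' trigger times for each
    exposure event; timestamps_per_camera is a list of equally long lists."""
    spreads = []
    for event in zip(*timestamps_per_camera):
        spreads.append(max(event) - min(event))
    return max(spreads)

# Hypothetical trigger logs (seconds) for a vertical head and two oblique heads.
logs = [[0.000, 1.001, 2.002],
        [0.004, 1.003, 2.006],
        [0.002, 1.007, 2.001]]
print(max_sync_error(logs))  # ~0.006, i.e., a worst-case exposure delay of about 6 ms
```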

COLOR CORRECTION METHOD USING GRAY GRADIENT BAR FOR MULTI-VIEW CAMERA SYSTEM

  • Jung, Jae-Il;Ho, Yo-Sung
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2009.01a
    • /
    • pp.1-6
    • /
    • 2009
  • Due to the differing properties of the cameras in a multi-view camera system, the color properties of the captured images can be inconsistent. This inconsistency makes post-processing such as depth estimation, view synthesis, and compression difficult. In this paper, a method to correct the differing color properties of multi-view images is proposed. We use a gray gradient bar shown on a display device to extract the color sensitivity property of each camera and calculate a look-up table based on that sensitivity property. The colors in the target image are converted by a mapping technique that refers to the look-up table. In experimental results, the proposed algorithm shows good subjective quality and reduces the mean absolute error among the color values of the multi-view images by 72% on average.
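
The correction builds a per-channel look-up table from how the target and reference cameras render the same gray gradient bar, then remaps the target image through that table. A compact numpy sketch of this idea follows; the 8-bit assumption and linear interpolation are simplifications, not the authors' exact mapping.

```python
import numpy as np

def build_lut(target_gray_samples, reference_gray_samples):
    """256-entry LUT mapping the target camera's measured gray-bar responses
    (assumed sorted in increasing order) onto the reference camera's responses."""
    levels = np.arange(256)
    mapped = np.interp(levels, target_gray_samples, reference_gray_samples)
    return np.clip(mapped, 0, 255).astype(np.uint8)

def correct_image(image, luts):
    """Apply one LUT per color channel to an HxWx3 uint8 image."""
    out = np.empty_like(image)
    for c in range(3):
        out[..., c] = luts[c][image[..., c]]
    return out
```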

Omni-directional Visual-LiDAR SLAM for Multi-Camera System (다중 카메라 시스템을 위한 전방위 Visual-LiDAR SLAM)

  • Javed, Zeeshan;Kim, Gon-Woo
    • The Journal of Korea Robotics Society
    • /
    • v.17 no.3
    • /
    • pp.353-358
    • /
    • 2022
  • Due to the limited field of view of a pinhole camera, camera pose estimation applications such as visual SLAM lack stability and accuracy. Nowadays, multiple-camera setups and large field-of-view cameras are used to address these issues. However, a multiple-camera system increases the computational complexity of the algorithm. Therefore, for multiple-camera-assisted visual simultaneous localization and mapping (vSLAM), a multi-view tracking algorithm is proposed that can be used to balance the feature budget in tracking and local mapping. The proposed algorithm is based on the PanoSLAM architecture with a panoramic camera model. To avoid the scale issue, a 3D LiDAR is fused with the omnidirectional camera setup. Depth is estimated directly from the 3D LiDAR, and the remaining features are triangulated from pose information. To validate the method, we collected a dataset in an outdoor environment and performed extensive experiments. Accuracy was measured by the absolute trajectory error, which shows comparable robustness in various environments.
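
Fusing the LiDAR with the omnidirectional setup amounts to projecting the 3D points into the panoramic image so that nearby features can take their depth directly instead of being triangulated. A rough numpy sketch with an equirectangular camera model is given below; the camera model and the extrinsics handling are assumptions for illustration.

```python
import numpy as np

def project_to_panorama(points_lidar, R, t, width, height):
    """Project LiDAR points (Nx3) into an equirectangular panorama.
    R, t map LiDAR coordinates into the camera frame (assumed known)."""
    p = (R @ points_lidar.T).T + t
    depth = np.linalg.norm(p, axis=1)
    lon = np.arctan2(p[:, 0], p[:, 2])                  # azimuth
    lat = np.arcsin(np.clip(p[:, 1] / depth, -1, 1))    # elevation
    u = (lon / (2 * np.pi) + 0.5) * width               # column in the panorama
    v = (lat / np.pi + 0.5) * height                    # row in the panorama
    return np.stack([u, v], axis=1), depth
```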

Self-calibration of a Multi-camera System using Factorization Techniques for Realistic Contents Generation (실감 콘텐츠 생성을 위한 분해법 기반 다수 카메라 시스템 자동 보정 알고리즘)

  • Kim, Ki-Young;Woo, Woon-Tack
    • Journal of Broadcast Engineering
    • /
    • v.11 no.4 s.33
    • /
    • pp.495-506
    • /
    • 2006
  • In this paper, we propose a self-calibration method for a multi-camera system using factorization techniques for realistic contents generation. Traditional self-calibration algorithms for multi-camera systems have focused on stereo(-rig) camera systems or multiple-camera systems with a fixed configuration. Thus, there is a need to exploit self-calibration for 3D reconstruction with a mobile multi-camera system and other general applications. For those reasons, we suggest a robust algorithm for generally structured multi-camera systems, including an algorithm for a plane-structured multi-camera system. We explain the theoretical background and practical usage based on a projective factorization and the proposed affine factorization. We show experimental results with simulated data as well as real images. The proposed algorithm can be used for 3D reconstruction and mobile Augmented Reality.
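
Affine factorization recovers camera motion and 3D structure jointly by splitting the centered measurement matrix with a rank-3 SVD. The numpy sketch below shows that core Tomasi-Kanade-style step only; it omits the projective case and the plane-structured variant treated in the paper.

```python
import numpy as np

def affine_factorization(W):
    """Factorize a 2m x n measurement matrix W (m views, n points) into
    affine motion M (2m x 3) and shape S (3 x n), up to an affine ambiguity."""
    centroid = W.mean(axis=1, keepdims=True)
    W0 = W - centroid                           # register each image to its centroid
    U, s, Vt = np.linalg.svd(W0, full_matrices=False)
    M = U[:, :3] * np.sqrt(s[:3])               # motion (camera rows)
    S = np.sqrt(s[:3])[:, None] * Vt[:3]        # shape (3D points)
    return M, S, centroid
```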

Research for development of small format multi -spectral aerial photographing systems (PKNU 3) (소형 다중분광 항공촬영 시스템(PKNU 3호) 개발에 관한 연구)

  • 이은경;최철웅;서영찬;조남춘
    • Proceedings of the Korean Society of Surveying, Geodesy, Photogrammetry, and Cartography Conference
    • /
    • 2004.11a
    • /
    • pp.143-152
    • /
    • 2004
  • Researchers seeking geological and environmental information depend on remote sensing and aerial photographic data from various commercial satellites and aircraft. However, adverse weather conditions as well as equipment expense limit the ability to collect data anywhere and anytime. To allow for better flexibility in geological and environmental data collection, we previously developed a compact, multi-spectral automatic aerial photographic system (PKNU2). This system's multi-spectral camera can record visible (RGB) and near-infrared (NIR) band images (3032 × 2008 pixels). Visible and infrared band images were obtained from each camera respectively and combined into color-infrared composite images for environmental monitoring analysis. However, this did not provide high-quality data; furthermore, the system could not satisfy the 60% stereoscopic overlap requirement because the PKNU2 system produced large image files and storing each one took about 12 seconds. With these results in mind, we have been developing an advanced version of PKNU2 (PKNU3) that consists of a color-infrared spectral camera able to photograph the visible and near-infrared bands simultaneously using a single sensor, a thermal infrared camera, two 40 GB computers to store images, and an MPEG board that can compress and transfer data to the computers in real time, all of which can be mounted on a helicopter platform.
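
The color-infrared composites mentioned above are conventionally built by mapping the NIR, red, and green bands to the R, G, and B display channels. A minimal numpy sketch of that band stacking follows; the per-band min-max normalization is an assumption for illustration.

```python
import numpy as np

def cir_composite(nir, red, green):
    """Stack NIR, red, and green bands into a standard color-infrared
    composite (NIR -> R, red -> G, green -> B), scaled to 8 bits."""
    def to_uint8(band):
        band = band.astype(np.float64)
        lo, hi = band.min(), band.max()
        return (255 * (band - lo) / (hi - lo + 1e-9)).astype(np.uint8)
    return np.dstack([to_uint8(nir), to_uint8(red), to_uint8(green)])
```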

Novel Calibration Method for the Multi-Camera Measurement System

  • Wang, Xinlei
    • Journal of the Optical Society of Korea
    • /
    • v.18 no.6
    • /
    • pp.746-752
    • /
    • 2014
  • In a multi-camera measurement system, the determination of the external parameters is one of the vital tasks, referred to as the calibration of the system. In this paper, a new geometrical calibration method based on the theory of the vanishing line is proposed. Using a planar target with three equally spaced parallel lines, the normal vector of the target plane can be determined easily in every camera coordinate system of the measurement system. By moving the target to more than two different positions, the rotation matrix can be determined from the related theory, i.e., the expression of the same vector in different coordinate systems. Moreover, the translation matrix can be derived from the known distance between the adjacent parallel lines. The main factors affecting the calibration are analyzed. Simulations show that the proposed method achieves robustness and accuracy. Experimental results show that the calibration can reach an accuracy of 1.25 mm at a range of about 0.5 m. Furthermore, this calibration method can also be used for auto-calibration of the multi-camera measurement system, since parallel-line features exist widely.
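
The rotation between two camera frames can be recovered from the same plane normals expressed in both frames, i.e., the expression of the same vector in different coordinate systems. The sketch below uses the generic SVD-based Kabsch/Procrustes alignment for that step; it is an illustration, not necessarily the paper's derivation.

```python
import numpy as np

def rotation_between_frames(normals_a, normals_b):
    """Least-squares rotation R such that R @ n_a approximates n_b for each
    pair of unit normals (Nx3 arrays), via the Kabsch/Procrustes construction."""
    H = normals_a.T @ normals_b
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so that det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T
```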