• Title/Summary/Keyword: camera calibration


Accurate Camera Self-Calibration based on Image Quality Assessment

  • Fayyaz, Rabia; Rhee, Eun Joo
    • Journal of Information Technology Applications and Management / v.25 no.2 / pp.41-52 / 2018
  • This paper presents a method for accurate camera self-calibration based on SIFT feature detection and image quality assessment. We perform image quality assessment to select high-quality images for the camera self-calibration process. High-quality images are defined as those that contain little or no blur and have the maximum contrast among images captured within a short period. The image quality assessment comprises blur detection and contrast assessment. Blur detection is based on statistical analysis of the energy and standard deviation of the high-frequency components of the images, obtained using the Discrete Cosine Transform. Contrast assessment measures the contrast of images captured within a short period and selects the highest-contrast ones. Experimental results show little or no distortion in the perspective view of the images; the suggested method achieves a camera self-calibration accuracy of approximately 93%.
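The DCT-based blur test described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the energy-fraction score and the `cutoff` separating low from high frequencies are assumed details.

```python
import numpy as np

def dct2(block):
    """2D DCT-II of a square block via a separable orthonormal transform matrix."""
    N = block.shape[0]
    k = np.arange(N)[:, None]
    n = np.arange(N)[None, :]
    D = np.cos(np.pi * (2 * n + 1) * k / (2 * N)) * np.sqrt(2 / N)
    D[0, :] = np.sqrt(1 / N)
    return D @ block @ D.T

def blur_score(img, cutoff=8):
    """Fraction of spectral energy outside the low-frequency DCT corner.
    Blurred images have suppressed high frequencies, hence a lower score."""
    C = dct2(img.astype(float))
    total = np.sum(C ** 2)
    low = np.sum(C[:cutoff, :cutoff] ** 2)
    return (total - low) / total
```

Comparing the score of an image against a smoothed copy of itself gives a simple relative blur check.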

A study on the transformation of EO parameters using Boresight calibration (Boresight calibration을 이용한 외부표정요소 산출에 관한 연구)

  • 박수영; 윤여상; 김준철; 정주권; 주영은
    • Proceedings of the Korean Society of Surveying, Geodesy, Photogrammetry, and Cartography Conference / 2003.10a / pp.129-134 / 2003
  • A Mobile Mapping System requires system calibration of its multiple sensors. System calibration is defined as the determination of the spatial and rotational offsets between the sensors. In particular, the EO (exterior orientation) parameters from GPS/INS require knowledge of the calibration with respect to the camera frame. The calibration parameters must be determined with the highest achievable accuracy in order to obtain 3D coordinates of points in stereo CCD images. This study applies Boresight calibration to calibrate between the GPS/INS and the camera, and evaluates the performance of the calibration.
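The role of the offsets described above can be sketched with a toy example. The rotation angles, lever-arm values, and frame conventions below are all hypothetical; the point is only that a fixed boresight rotation and lever arm, once calibrated, convert GPS/INS attitude and position into camera EO parameters.

```python
import numpy as np

def rot_z(a):
    """Rotation about the z-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical values: INS-derived body attitude and a fixed boresight misalignment.
R_ins = rot_z(np.radians(30))        # body-to-mapping-frame rotation from GPS/INS
R_boresight = rot_z(np.radians(2))   # camera-to-body rotation from boresight calibration
R_cam = R_ins @ R_boresight          # camera-to-mapping-frame rotation (rotational EO)

# The lever arm (spatial offset) moves the GPS/INS position to the camera center.
lever_arm_body = np.array([0.5, 0.1, -0.2])   # assumed offsets in the body frame, metres
p_ins = np.array([100.0, 200.0, 50.0])
p_cam = p_ins + R_ins @ lever_arm_body        # positional EO of the camera
```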


Parameter Calibration of Laser Scan Camera for Measuring the Impact Point of Arrow (화살 탄착점 측정을 위한 레이저 스캔 카메라 파라미터 보정)

  • Baek, Gyeong-Dong; Cheon, Seong-Pyo; Lee, In-Seong; Kim, Sung-Shin
    • Journal of the Korean Society of Manufacturing Technology Engineers / v.21 no.1 / pp.76-84 / 2012
  • This paper presents a measurement system for an arrow's point of impact using a laser scan camera, and describes the image calibration method. Calibration of the distorted image divides primarily into explicit and implicit methods. The explicit method works directly with the camera's optical properties and its parameter adjustment functionality, while the implicit method relies on a calibration plate that assumes relations between image pixels and target positions. To find the relation between image and target positions in the implicit method, we propose a performance-criteria-based polynomial model that overcomes some limitations of conventional image calibration models, such as the over-fitting problem. The proposed method is verified against 2D arrow positions captured by a SICK Ranger-D50 laser scan camera.
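The implicit idea, fitting a polynomial from pixel coordinates to plate positions while guarding against over-fitting, can be sketched in one dimension. This is a stand-in illustration, not the paper's model: the held-out-error criterion, the synthetic cubic distortion, and all parameter values are assumptions.

```python
import numpy as np

def fit_pixel_to_position(px, pos, max_degree=8):
    """Fit pixel -> position polynomials of increasing degree and keep the
    one with the lowest error on held-out plate points (over-fitting guard)."""
    s = np.max(np.abs(px))          # scale pixels toward [-1, 1] for conditioning
    x = px / s
    idx = np.arange(len(px))
    train, hold = idx[::2], idx[1::2]
    best = None
    for d in range(1, max_degree + 1):
        c = np.polyfit(x[train], pos[train], d)
        err = np.mean((np.polyval(c, x[hold]) - pos[hold]) ** 2)
        if best is None or err < best[0]:
            best = (err, c)
    _, c = best
    return lambda p: np.polyval(c, np.asarray(p) / s)

# Synthetic calibration plate: linear mapping plus a mild cubic distortion term.
px = np.linspace(0.0, 640.0, 33)
pos = 0.5 * px + 1e-7 * (px - 320.0) ** 3
model = fit_pixel_to_position(px, pos)
```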

3D reconstruction method without projective distortion from un-calibrated images (비교정 영상으로부터 왜곡을 제거한 3 차원 재구성방법)

  • Kim, Hyung-Ryul; Kim, Ho-Cul; Oh, Jang-Suk; Ku, Ja-Min; Kim, Min-Gi
    • Proceedings of the IEEK Conference / 2005.11a / pp.391-394 / 2005
  • In this paper, we present an approach that can reconstruct 3D metric models from uncalibrated images acquired by a freely moving camera system. If nothing is known of the calibration of either camera, nor of the arrangement of one camera with respect to the other, then the projective reconstruction will exhibit projective distortion, expressed by an arbitrary projective transformation. This distortion is removed by upgrading the reconstruction from projective to metric through self-calibration. Self-calibration is the process of determining the internal camera parameters directly from multiple uncalibrated images; it requires no information about the camera matrices or the scene geometry, and avoids the onerous task of calibrating cameras with special calibration objects. The root of the method is a uniquely fixed conic in 3D space, the absolute quadric, which can be identified from the images; once it is identified, the metric geometry can be computed. We compare the reconstruction from calibrated images with the result of the self-calibration method.
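The absolute-quadric constraint the abstract refers to has a standard formulation (given here in the usual textbook notation, which may differ from the paper's):

```latex
% Self-calibration via the absolute dual quadric: for each camera i with
% projection matrix P_i, the dual image of the absolute conic satisfies
\[
  \omega^{*}_{i} \;=\; K_i K_i^{\top} \;\sim\; P_i \,\Omega^{*}_{\infty}\, P_i^{\top},
  \qquad i = 1,\dots,m .
\]
% \Omega^{*}_{\infty} is the 4x4 rank-3 absolute dual quadric, fixed under
% similarity transformations. Assumptions on K_i (zero skew, unit aspect
% ratio, known principal point) turn this into equations on
% \Omega^{*}_{\infty}; decomposing
% \Omega^{*}_{\infty} = H \,\mathrm{diag}(1,1,1,0)\, H^{\top}
% yields the homography H that upgrades the projective reconstruction to metric.
```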


A Study on the Estimation of Camera Calibration Parameters using the Corresponding Points Method (점 대응 기법을 이용한 카메라의 교정 파라미터 추정에 관한 연구)

  • Choi, Seong-Gu; Go, Hyun-Min; Rho, Do-Hwan
    • The Transactions of the Korean Institute of Electrical Engineers D / v.50 no.4 / pp.161-167 / 2001
  • Camera calibration is a very important problem in 3D measurement with a vision system. This paper proposes a simple method for camera calibration that uses the principle of vanishing points and the concept of corresponding points extracted from parallel line pairs. Conventional methods require four reference points in one frame, but the proposed method needs only two reference points to estimate the vanishing points, from which the camera parameters (focal length, camera attitude, and position) are calculated. Our experiments show the validity and usability of the method: the absolute error of attitude and position is within $10^{-2}$.
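One well-known vanishing-point relation gives the flavor of such methods: with square pixels and a principal point assumed at the image center, the vanishing points of two orthogonal scene directions determine the focal length. This sketch uses that textbook relation, not necessarily this paper's exact formulation, and the camera values are synthetic.

```python
import numpy as np

def focal_from_vanishing_points(v1, v2, c):
    """Focal length (pixels) from vanishing points of two orthogonal scene
    directions, assuming square pixels and principal point c:
    (v1 - c) . (v2 - c) + f^2 = 0."""
    d = np.dot(np.asarray(v1) - c, np.asarray(v2) - c)
    if d >= 0:
        raise ValueError("vanishing points inconsistent with orthogonal directions")
    return np.sqrt(-d)

# Synthetic check with a known camera.
f_true, c = 800.0, np.array([320.0, 240.0])

def vanish(direction):
    """Image of the point at infinity along `direction` (camera coordinates)."""
    x, y, z = direction
    return c + f_true * np.array([x, y]) / z

v1 = vanish((1.0, 0.0, 0.5))
v2 = vanish((-0.5, 0.0, 1.0))   # orthogonal to the first direction
f_est = focal_from_vanishing_points(v1, v2, c)
```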


Camera Calibration Parameter Estimation using the Projection Variations of Line Widths (선폭들의 투영변화율을 이용한 카메라 교정 파라메터 추정)

  • Jeong, Jun-Ik; Moon, Sung-Young; Rho, Do-Hwan
    • Proceedings of the KIEE Conference / 2003.07d / pp.2372-2374 / 2003
  • For 3D vision measurement, camera calibration is necessary to calculate the parameters accurately. Camera calibration has developed broadly into two categories: the first establishes reference points in space, and the second uses a grid-type frame and statistical methods. The former has difficulty in setting up the reference points, and the latter has low accuracy. In this paper we present an algorithm for camera calibration using the perspective ratio of a grid-type frame with different line widths. It can easily estimate camera calibration parameters such as focal length, scale factor, pose, orientation, and distance; radial lens distortion, however, is not modeled. The advantage of this algorithm is that it can estimate the distance of the object, which makes the proposed calibration method applicable to distance estimation in dynamic environments such as autonomous navigation. To validate the proposed method, we set up experiments with a frame on a rotator at distances of 1, 2, 3, and 4 m from the camera, rotating the frame from -60 to 60 degrees. Both computer simulation and real data have been used to test the proposed method, with very good results. We investigated the distance error as affected by the scale factor and the different line widths, and experimentally found an average scale factor that yields the least distance error for each image. This advances camera calibration one more step from static environments toward real-world uses such as autonomous land vehicles.
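The distance-from-line-width idea rests on the basic pinhole relation, sketched below with made-up numbers. The paper's actual algorithm uses the perspective ratio between several line widths; this shows only the underlying projection formula.

```python
# Pinhole projection: a stripe of physical width W at distance Z appears
# w = f * W / Z pixels wide, so a known line width on the grid frame
# yields the distance directly. f is the focal length in pixels.
def projected_width_px(f_px, width_m, dist_m):
    return f_px * width_m / dist_m

def distance_from_line_width(f_px, width_m, width_px):
    return f_px * width_m / width_px
```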


An Easy Camera-Projector Calibration Technique for Structured Light 3-D Reconstruction (구조광 방식 3차원 복원을 위한 간편한 프로젝터-카메라 보정 기술)

  • Park, Soon-Yong; Park, Go-Gwang; Zhang, Lei
    • The KIPS Transactions: Part B / v.17B no.3 / pp.215-226 / 2010
  • The structured-light 3D reconstruction technique uses a coded pattern to find correspondences between the camera image and the projector image. To calculate the 3D coordinates of the correspondences, it is necessary to calibrate the camera and the projector, and the calibration results affect the accuracy of the 3D reconstruction. Conventional camera-projector calibration techniques commonly require either expensive hardware rigs or complex algorithms. In this paper, we propose an easy camera-projector calibration technique that needs no hardware rig or complex algorithm, and thus enhances the efficiency of structured-light 3D reconstruction. We present two camera-projector systems to show the calibration results. Error analysis on the two systems is done based on the projection errors of the camera and the projector, and on the 3D reconstruction of world reference points.
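A common choice for the coded pattern mentioned above is Gray-code stripes: each projector column is shown as a bit sequence over successive frames, and a camera pixel's observed bits identify its corresponding column. The sketch below illustrates that encoding and decoding; the abstract does not say which coding this paper uses, so this is an assumed example.

```python
# Gray-code structured light: each projector column gets a bit pattern
# shown over successive frames; the observed bit sequence at a camera
# pixel identifies the projector column it corresponds to.
def binary_to_gray(b):
    return b ^ (b >> 1)

def gray_to_binary(g):
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def pattern_bits(column, n_bits):
    """Bit projected for `column` in each of the n_bits frames (MSB first)."""
    g = binary_to_gray(column)
    return [(g >> (n_bits - 1 - i)) & 1 for i in range(n_bits)]

def decode_bits(bits):
    """Recover the projector column from the bits observed at one camera pixel."""
    g = 0
    for b in bits:
        g = (g << 1) | b
    return gray_to_binary(g)
```

Gray codes are preferred over plain binary because adjacent columns differ in only one bit, so a decoding error at a stripe boundary shifts the result by at most one column.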

A Study on Vision-based Calibration Method for Bin Picking Robots for Semiconductor Automation (반도체 자동화를 위한 빈피킹 로봇의 비전 기반 캘리브레이션 방법에 관한 연구)

  • Kyo Mun Ku; Ki Hyun Kim; Hyo Yung Kim; Jae Hong Shim
    • Journal of the Semiconductor & Display Technology / v.22 no.1 / pp.72-77 / 2023
  • In many manufacturing settings, including the semiconductor industry, products are completed by producing and assembling various components. Sorting and classifying randomly mixed parts takes a lot of time and labor, and recently many efforts have been made to have robots select and assemble the correct parts from mixed ones. Automating this sorting is difficult, since the objects and the positions and attitudes of the robots and cameras in 3D space must all be known; previously, robots grasped only objects in specific positions, or people sorted the items directly. For a robot to pick up random objects in 3D space, bin picking technology is required, and realizing it demands knowledge of the coordinate-system relations between the robot, the grasping target, and the camera. Calibration is necessary to establish these relations so that the robot can grasp an object recognized by the camera. The 3D restoration needed for bin picking is difficult because the depth value cannot be recovered from a 2D image alone. In this paper, we propose using the depth information of an RGB-D camera for the Z value in the rotation and translation transforms used in calibration. We perform camera calibration for accurate coordinate-system conversion of objects in 2D images, and then calibrate between the robot and the camera. We prove the effectiveness of the proposed method through accuracy evaluations of the camera calibration and of the robot-camera calibration.
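The depth-for-Z substitution described above amounts to back-projecting a pixel with its measured RGB-D depth into camera coordinates, then mapping the point through a calibrated robot-camera transform. A minimal sketch, with all intrinsics and the transform assumed rather than taken from the paper:

```python
import numpy as np

def pixel_to_camera(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with measured depth (metres) into camera
    coordinates using pinhole intrinsics fx, fy, cx, cy (from calibration)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def camera_to_robot(p_cam, R, t):
    """Map a camera-frame point into the robot base frame using the
    calibrated robot-camera rotation R and translation t."""
    return R @ p_cam + t
```

In a bin-picking pipeline, the recognized object's pixel plus its depth value would pass through these two steps to give the grasp target in robot coordinates.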


A Camera Calibration Algorithm for an Ill-Conditioned Case (악조건하의 카메라 교정을 위한 알고리즘)

  • Lee, Jung-Hwa; Lee, Moon-Kyu
    • Journal of the Korean Society for Precision Engineering / v.16 no.2 s.95 / pp.164-175 / 1999
  • If the camera plane is nearly parallel to the calibration board on which the objects are defined, most existing calibration approaches, such as Tsai's radial-alignment-constraint method, cannot be applied. Recently, for such an ill-conditioned case, Zhuang & Wu suggested a linear two-stage calibration algorithm assuming that the exact values of the focal length and scale factor are known a priori. In this paper, we develop an iterative two-stage algorithm that starts with an initial guess of these two parameters and, in the first stage, determines the values of the others using Zhuang & Wu's method. In the second stage, the two parameters are locally optimized. This process is repeated until no further improvement can be expected. A performance comparison between Zhuang & Wu's method and our algorithm shows the superiority of ours. Also included are computational results on how the distribution and the number of calibration points affect calibration performance.
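The alternating structure of such an algorithm can be sketched abstractly: stage 1 (the closed-form solve for the remaining parameters) is folded into the objective, and stage 2 locally adjusts the focal length and scale factor until no improvement remains. This is a generic coordinate-search sketch on a synthetic objective, not the paper's algorithm.

```python
def refine(objective, f, s, step=0.1, tol=1e-9, max_iter=200):
    """Locally optimize (f, s) by coordinate search with step halving.
    `objective(f, s)` stands in for: run the closed-form stage with (f, s)
    fixed, then return the resulting calibration residual."""
    best = objective(f, s)
    for _ in range(max_iter):
        improved = False
        for df, ds in ((step, 0), (-step, 0), (0, step), (0, -step)):
            val = objective(f + df, s + ds)
            if val < best - tol:
                f, s, best, improved = f + df, s + ds, val, True
        if not improved:
            step /= 2                 # refine the search once a scale is exhausted
            if step < 1e-6:
                break
    return f, s
```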


Calibration and accuracy evaluation of airborne digital camera images (항공기용 디지털 영상에 대한 검정(Calibration) 및 정확도 평가)

  • 이승헌; 위광재; 이강원; 이홍술
    • Proceedings of the Korean Society of Surveying, Geodesy, Photogrammetry, and Cartography Conference / 2004.04a / pp.183-195 / 2004
  • Photogrammetry is one of the most important sources for GIS applications. Nowadays color photos are used and the camera is integrated with GPS/INS sensors, but the photos are still taken with an analogue camera and scanned into digital images. For convenient and accurate image applications, especially in 3D, airborne digital camera images are essential. In this paper, the digital image calibration process with GPS/INS and its accuracy evaluation are presented.
