Automatic Registration of High Resolution Satellite Images using Local Properties of Tie Points

  • Han, You-Kyung (School of Civil and Environmental System Engineering, Seoul National University) ;
  • Byun, Young-Gi (School of Civil and Environmental System Engineering, Seoul National University) ;
  • Choi, Jae-Wan (School of Civil and Environmental System Engineering, Seoul National University) ;
  • Han, Dong-Yeob (Department of Civil and Environmental Engineering, College of Engineering, Chonnam National University) ;
  • Kim, Yong-Il (School of Civil and Environmental System Engineering, Seoul National University)
  • Received : 2010.05.18
  • Accepted : 2010.06.03
  • Published : 2010.06.30

Abstract

In this paper, we propose an automatic image-to-image registration method for high resolution satellite images that uses local properties of tie points to improve registration accuracy. The spatial distance between interest points of the reference and sensed images, extracted by the Scale Invariant Feature Transform (SIFT), is used in addition to descriptor similarity when extracting tie points. Coefficients of an affine transform between the images are first estimated from invariant-descriptor-based matching, and the interest points of the sensed image are transformed into the reference coordinate system using these coefficients. The spatial distance between the transformed interest points of the sensed image and the interest points of the reference image is then computed for a secondary matching step. Finally, a piecewise linear function is constructed from the matched tie points to register the high resolution images automatically. Compared with the standard SIFT-based method, the proposed method extracts a larger number of tie points that are well distributed across the whole image.
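The secondary, spatial-distance matching stage described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function name, the 3-pixel threshold, and the synthetic points are assumptions for illustration, and in practice the affine coefficients (A, t) would come from a least-squares fit to the initial SIFT descriptor matches.

```python
import numpy as np

def secondary_match(ref_pts, sensed_pts, A, t, max_dist=3.0):
    # Transform sensed interest points into the reference frame with the
    # affine estimate (x' = A x + t), then pair each transformed point with
    # its nearest reference interest point if the spatial distance between
    # them is below max_dist pixels.
    warped = sensed_pts @ A.T + t
    pairs = []
    for j, p in enumerate(warped):
        d = np.linalg.norm(ref_pts - p, axis=1)
        i = int(np.argmin(d))
        if d[i] <= max_dist:
            pairs.append((i, j))
    return pairs

# Synthetic check: the sensed points are the reference points seen through
# a known affine transform, so every point should be recovered as a tie point.
rng = np.random.default_rng(0)
ref = rng.uniform(0, 100, size=(20, 2))
A = np.array([[1.02, 0.05], [-0.04, 0.98]])
t = np.array([5.0, -3.0])
sensed = (ref - t) @ np.linalg.inv(A).T  # invert x' = A x + t
print(len(secondary_match(ref, sensed, A, t)))  # -> 20
```

With real imagery, `ref_pts` and `sensed_pts` would be SIFT keypoint locations; a one-to-one assignment (rather than independent nearest-neighbour queries) would additionally guard against many-to-one matches.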

Keywords

References

  1. Han, D., Kim, D., Lee, J., and Kim, Y. (2006), Automatic registration of medium and low resolution satellite images using the SIFT technique, Proceedings of the Korean Society for GeoSpatial Information System Conference, Korean Society for GeoSpatial Information System, Vol. 11, pp. 311-316. (in Korean)
  2. Arevalo, V. and González, J. (2008), An experimental evaluation of non-rigid registration techniques on Quickbird satellite imagery, International Journal of Remote Sensing, Vol. 29, No. 2, pp. 513-527. https://doi.org/10.1080/01431160701241910
  3. Chen, H. M., Arora, M. K., and Varshney, P. K. (2003), Mutual information-based image registration for remote sensing data, International Journal of Remote Sensing, Vol. 24, No. 18, pp. 3701-3706. https://doi.org/10.1080/0143116031000117047
  4. Goshtasby, A. (1986), Piecewise linear mapping functions for image registration, Pattern Recognition, Vol. 19, No. 6, pp. 459-466. https://doi.org/10.1016/0031-3203(86)90044-0
  5. Habib, A., Ghanma, M., Morgan, M., and Al-Ruzouq, R. (2005), Photogrammetric and LIDAR data registration using linear features, Photogrammetric Engineering & Remote Sensing, Vol. 71, No. 6, pp. 699-707. https://doi.org/10.14358/PERS.71.6.699
  6. Hong, G. and Zhang, Y. (2008), Wavelet-based image registration technique for high-resolution remote sensing images, Computers & Geosciences, Vol. 34, pp. 1708-1720. https://doi.org/10.1016/j.cageo.2008.03.005
  7. Kim, T. and Im, Y. (2003), Automatic satellite image registration by combination of matching and random sample consensus, IEEE Transactions on Geoscience and Remote Sensing, Vol. 41, No. 5, pp. 1111-1117. https://doi.org/10.1109/TGRS.2003.811994
  8. Lowe, D. G. (2004), Distinctive image features from scale-invariant keypoints, International Journal of Computer Vision, Vol. 60, No. 2, pp. 91-110.
  9. Rignot, E., Kwok, R., Curlander, J. C., and Pang, S. S. (1991), Automated multisensor registration: requirements and techniques, Photogrammetric Engineering and Remote Sensing, Vol. 57, No. 8, pp. 1029-1038.
  10. Xiong, Z. and Zhang, Y. (2009), A novel interest-point-matching algorithm for high-resolution satellite images, IEEE Transactions on Geoscience and Remote Sensing, Vol. 47, No. 12, pp. 4189-4200. https://doi.org/10.1109/TGRS.2009.2023794
  11. Yu, L., Zhang, D., and Holden, E. (2008), A fast and fully automatic registration approach based on point features for multi-source remote-sensing images, Computers & Geosciences, Vol. 34, pp. 838-848. https://doi.org/10.1016/j.cageo.2007.10.005
  12. Zagorchev, L. and Goshtasby, A. (2006), A comparative study of transformation functions for nonrigid image registration, IEEE Transactions on Image Processing, Vol. 15, No. 3, pp. 529-538. https://doi.org/10.1109/TIP.2005.863114
  13. Zitova, B. and Flusser, J. (2003), Image registration methods: a survey, Image and Vision Computing, Vol. 21, No. 11, pp. 977-1000. https://doi.org/10.1016/S0262-8856(03)00137-9