Co-registration Between PAN and MS Bands Using Sensor Modeling and Image Matching


  • Lee, Chang No (Dept. of Civil Engineering, Seoul National University of Science and Technology)
  • Oh, Jae Hong (Dept. of Civil Engineering, Korea Maritime and Ocean University)
  • Received : 2021.01.22
  • Accepted : 2021.02.16
  • Published : 2021.02.28


High-resolution satellites such as Kompsat-3 and CAS-500 carry optical cameras in which MS (Multispectral) and PAN (Panchromatic) CCD (Charge Coupled Device) sensors are installed with a fixed offset. The offset between the CCD sensors produces a geometric discrepancy between the MS and PAN images because a ground target is imaged at slightly different times by the MS and PAN sensors. For a precise pan-sharpening process, we propose a co-registration procedure consisting of physical sensor modeling and image matching: the physical sensor model provides the initial co-registration, and image matching is then carried out for further refinement. An experiment with Kompsat-3 images reduced the geometric discrepancy between the MS and PAN images to the 0.2-pixel level (RMSE, Root Mean Square Error).
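The two-stage procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the physical sensor model is reduced here to a hypothetical scale-plus-offset seed transform (the paper uses a rigorous sensor model), and the refinement step is shown as a normalized cross-correlation (NCC) search around the seed position.

```python
import numpy as np

def seed_transform(ms_rc, scale=4.0, offset=(0.0, 0.0)):
    """Initial MS -> PAN mapping standing in for the physical sensor model.
    Simplified to a scale (PAN GSD is ~1/4 of MS) plus a constant offset
    from the CCD line separation; both values here are hypothetical."""
    r, c = ms_rc
    return (scale * r + offset[0], scale * c + offset[1])

def ncc(a, b):
    """Normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / d) if d > 0 else 0.0

def refine(pan, tmpl, seed_rc, search=3):
    """Refine the seed position by searching a small window in the PAN
    image for the location that best correlates with the MS template."""
    h, w = tmpl.shape
    r0, c0 = int(round(seed_rc[0])), int(round(seed_rc[1]))
    best, best_rc = -2.0, (r0, c0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = r0 + dr, c0 + dc
            win = pan[r:r + h, c:c + w]
            if win.shape != tmpl.shape:
                continue  # window fell outside the image
            s = ncc(win, tmpl)
            if s > best:
                best, best_rc = s, (r, c)
    return best_rc, best
```

In practice such matching would be run over a grid of well-textured points (e.g., corner features), and the resulting refined correspondences used to estimate the final co-registration transform.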

High-resolution land-observation satellites such as Kompsat-3 and CAS-500 generally carry a camera combining an MS (Multispectral) CCD (Charge Coupled Device) sensor, which acquires imagery in the visible and near-infrared bands, with a PAN (Panchromatic) CCD sensor, which acquires imagery at four times the spatial resolution of the MS bands. Because the PAN and MS CCD lines are installed at a fixed separation within the camera, the satellite images a ground target at slightly different times as it passes along its orbit, and the target's position therefore differs between the two images. Pan-sharpening of PAN and MS imagery requires precise co-registration between the PAN and MS images; in this study, co-registration was performed by combining a sensor-modeling technique with image matching. Initial co-registration was carried out through mutual sensor modeling of the PAN and MS images, and its accuracy was then improved through image matching, achieving a precision of about 0.2 pixels RMSE (Root Mean Square Error).

