Vision-based Sensor Fusion of a Remotely Operated Vehicle for Underwater Structure Diagnosis

수중 구조물 진단용 원격 조종 로봇의 자세 제어를 위한 비전 기반 센서 융합 (Korean title: Vision-based Sensor Fusion for the Attitude Control of a Remotely Operated Robot for Underwater Structure Diagnosis)

  • Lee, Jae-Min (Control and Robot Engineering, Chungbuk National University) ;
  • Kim, Gon-Woo (School of Electronics Engineering, Chungbuk National University)
  • Received : 2014.11.15
  • Accepted : 2015.02.25
  • Published : 2015.04.01

Abstract

Underwater robots generally outperform human divers under underwater constraints such as high pressure and limited light. To properly diagnose structures in an underwater environment using a remotely operated vehicle (ROV), it is important that the vehicle autonomously maintain its own position and orientation, so that no additional control effort is required from the operator. In this paper, we propose an efficient method to assist the operator of an ROV for underwater structure diagnosis against various disturbances. A conventional AHRS-based heading estimation system does not work well near a ferromagnetic structure, where the hard-iron effect corrupts the magnetometer measurements. To overcome this drawback, we propose a sensor fusion algorithm that combines a camera and an AHRS for estimating the pose of the ROV. However, image information in the underwater environment is often unreliable and blurred by turbidity or suspended solids. Thus, we suggest an efficient method for fusing the vision sensor and the AHRS using the amount of blur in the image as the fusion criterion. To evaluate the amount of blur, we adopt two methods: one quantifies the high-frequency components using power spectral density analysis of the 2D discrete Fourier transform of the image, and the other identifies the blur parameter by cepstrum analysis. We evaluate the robustness of the visual odometry and of the blur estimation methods under changes of illumination and distance. The experiments verify that the cepstrum-based blur estimation method shows the better performance.
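The two blur measures described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the function names and the radial cutoff value are our own assumptions:

```python
import numpy as np

def high_freq_energy_ratio(img, cutoff=0.25):
    # Power spectral density of the 2D DFT; a sharp image keeps
    # proportionally more of its energy at high spatial frequencies.
    F = np.fft.fftshift(np.fft.fft2(img))
    psd = np.abs(F) ** 2
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    # Normalized radial frequency coordinate, 0 at DC.
    r = np.hypot((yy - h / 2) / (h / 2), (xx - w / 2) / (w / 2))
    return psd[r > cutoff].sum() / psd.sum()

def cepstrum(img):
    # 2D real cepstrum: inverse FFT of the log magnitude spectrum.
    # Zeros of a motion/defocus blur kernel appear as periodic ridges
    # whose spacing identifies the blur parameter.
    F = np.fft.fft2(img)
    return np.real(np.fft.ifft2(np.log(np.abs(F) + 1e-12)))

# Illustration: a 5x5 box blur lowers the high-frequency energy ratio.
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
blurred = np.mean([np.roll(np.roll(sharp, dy, 0), dx, 1)
                   for dy in range(-2, 3) for dx in range(-2, 3)], axis=0)
assert high_freq_energy_ratio(sharp) > high_freq_energy_ratio(blurred)
```

In the fusion scheme the paper describes, such a blur score would gate how much weight the visual-odometry estimate receives relative to the AHRS; the specific cutoff and weighting are not given by the abstract.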

Keywords

References

  1. J. N. Seo and J. R. Choi, "Autonomy levels for underwater vehicles and development trends," Journal of Electronics Engineering (in Korean), vol. 38, no. 7, pp. 20-29, 2011.
  2. P. Lee, et al., "System design and development of a deep-sea unmanned underwater vehicle 'HEMIRE' for oceanographic research," The 16th International Offshore and Polar Engineering Conference, vol. 2, 2006.
  3. J. H. Li, et al., "Development of P-SURO II hybrid AUV and its experimental study," MTS/IEEE OCEANS-Bergen, pp. 1-6, 2013.
  4. J. C. Kinsey, R. M. Eustice, and L. L. Whitcomb, "A survey of underwater vehicle navigation: Recent advances and new challenges," IFAC Conference on Manoeuvring and Control of Marine Craft, 2006.
  5. K. Vickery, "Acoustic positioning systems. A practical overview of current systems," The Workshop on Autonomous Underwater Vehicles, pp. 5-17, 1998.
  6. N. Gracias and S. V. Jose, "Underwater video mosaics as visual navigation maps," Computer Vision and Image Understanding, vol. 79, no. 1, pp. 66-91, 2000. https://doi.org/10.1006/cviu.2000.0848
  7. A. Huster, E. W. Frew, and S. M. Rock, "Relative position estimation for AUVs by fusing bearing and inertial rate sensor measurements," MTS/IEEE OCEANS, vol. 3, pp. 1863-1870, 2002.
  8. D. Nister, O. Naroditsky, and J. Bergen, "Visual odometry," The Computer Vision and Pattern Recognition, vol. 1, pp. 652-659, 2004.
  9. A. Geiger, J. Ziegler, and C. Stiller, "Stereoscan: Dense 3d reconstruction in real-time," Intelligent Vehicles Symposium, pp. 963-968, 2011.
  10. D. Scaramuzza, F. Fraundorfer, and R. Siegwart, "Real-time monocular visual odometry for on-road vehicles with 1-point RANSAC," IEEE International Conference on Robotics and Automation, pp. 4293-4299, 2009.
  11. R. C. Gonzalez and R. E. Woods, Digital Image Processing, 2nd ed., Prentice Hall, 2002.
  12. R. Rom, "On the cepstrum of two-dimensional functions (Corresp.)," IEEE Transactions on Information Theory, vol. 21, no. 2, pp. 214-217, 1975. https://doi.org/10.1109/TIT.1975.1055353
  13. M. Cannon, "Blind deconvolution of spatially invariant image blurs with phase," IEEE Transactions on Acoustics, Speech and Signal Processing, vol. 24, no. 1, pp. 58-63, 1976. https://doi.org/10.1109/TASSP.1976.1162770
  14. S. Wu, Z. Lu, E. P. Ong, and W. Lin, "Blind image blur identification in cepstrum domain," 16th International Conference on Computer Communications and Networks, pp. 1166-1171, 2007.
  15. J. M. Lee and G. W. Kim, "Vision based sensor fusion algorithms for underwater ROV," Proc. of ICROS (Institute of Control, Robotics and Systems) 2014 Daejeon-Chungchung Branch Conference (in Korean), pp. 51-54, 2014.
  16. D. H. Kim, D. H. Lee, H. Myeong, and H. T. Choi, "Vision-based localization for AUVs using weighted template matching in a structured environment," Journal of Institute of Control, Robotics and Systems (in Korean), vol. 19, no. 8, pp. 667-675, 2013. https://doi.org/10.5302/J.ICROS.2013.13.9012
  17. J. H. Li, et al., "Development of P-SURO II hybrid autonomous underwater vehicle and its experimental studies," Journal of Institute of Control, Robotics and Systems (in Korean), vol. 19, no. 9, pp. 813-821, 2013. https://doi.org/10.5302/J.ICROS.2013.13.9027