Real-Time Camera Tracking for Markerless Augmented Reality

  • Received : 2011.03.17
  • Accepted : 2011.06.15
  • Published : 2011.07.31

Abstract

We propose a real-time camera tracking algorithm for an augmented reality (AR) system for TV broadcasting. Tracking is initialized by detecting the target object with the SURF (speeded-up robust features) algorithm, and a multi-scale approach is used to keep the camera tracking stable in real time. Normalized cross correlation (NCC) is used to find patch correspondences, so that the tracker copes with unknown and changing lighting conditions. Because broadcast production uses a camera with a zoom lens, the focal length must be estimated online: the focal length, an intrinsic parameter, is included in the objective function together with the extrinsic pose parameters (camera rotation and translation) and optimized jointly. Experimental results show that the focal length of the camera is accurately estimated by the proposed online calibration procedure.
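To illustrate why NCC is the natural choice under unknown, changing lighting: subtracting each patch's mean and dividing by its standard deviation makes the score invariant to per-patch gain and offset changes in brightness. The sketch below is a minimal NumPy implementation of NCC patch matching over a search window; it is an illustrative reconstruction, not the paper's actual code, and the function names and window-bound conventions are our own.

```python
import numpy as np

def ncc(patch, candidate):
    """Normalized cross correlation between two equally sized patches.
    Mean subtraction and std normalization make the score invariant to
    affine brightness changes (gain and offset)."""
    a = patch.astype(np.float64).ravel()
    b = candidate.astype(np.float64).ravel()
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return 0.0  # a flat patch carries no correlation information
    return float(np.dot(a, b) / denom)

def match_patch(template, image, search_tl, search_br):
    """Slide the template over a (row, col)-bounded search window and
    return the top-left location of the best NCC score."""
    th, tw = template.shape
    best_score, best_loc = -1.0, None
    for y in range(search_tl[0], search_br[0] - th + 1):
        for x in range(search_tl[1], search_br[1] - tw + 1):
            s = ncc(template, image[y:y + th, x:x + tw])
            if s > best_score:
                best_score, best_loc = s, (y, x)
    return best_loc, best_score
```

Because the score is gain/offset invariant, a patch re-lit at twice the brightness plus a constant still matches with a score of 1.0; a real-time tracker would restrict the search window to the neighborhood predicted by the previous frame's pose, and a multi-scale (pyramid) version would run this coarse-to-fine.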
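The online calibration step can be sketched as a bundle-adjustment-style refinement: the reprojection error is minimized over a parameter vector that stacks the focal length with the rotation and translation, so the zoom is estimated jointly with the pose. The code below is a simplified sketch using SciPy, assuming square pixels and a principal point at the image origin; the parameterization and function names are our own assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(params, pts3d):
    """Pinhole projection. params = [f, rx, ry, rz, tx, ty, tz]:
    focal length plus a rotation vector and translation (extrinsic pose)."""
    f = params[0]
    R = Rotation.from_rotvec(params[1:4]).as_matrix()
    t = params[4:7]
    cam = pts3d @ R.T + t                 # world -> camera coordinates
    return f * cam[:, :2] / cam[:, 2:3]   # perspective divide

def residuals(params, pts3d, pts2d):
    """Per-point reprojection error, flattened for the solver."""
    return (project(params, pts3d) - pts2d).ravel()

def refine(pts3d, pts2d, params0):
    """Jointly refine focal length and pose by minimizing reprojection
    error, mirroring the online calibration of a zoom-lens camera."""
    return least_squares(residuals, params0, args=(pts3d, pts2d)).x
```

In practice this nonlinear least-squares problem is solved with a Levenberg-Marquardt-style method each frame, initialized from the previous frame's estimate, so a zoom change shows up as a smooth drift in the recovered focal length rather than a tracking failure.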
