A Real-time Augmented Reality System using Hand Geometric Characteristics based on Computer Vision


  • Hee-Sun Choi (Graduate School of Advanced Imaging Science, Chung-Ang University) ;
  • Da-Un Jung (Graduate School of Advanced Imaging Science, Chung-Ang University) ;
  • Jong-Soo Choi (Graduate School of Advanced Imaging Science, Chung-Ang University)
  • Received : 2011.06.29
  • Accepted : 2012.01.06
  • Published : 2012.03.31

Abstract

In this paper, we propose an AR (augmented reality) system based on computer vision that uses the user's bare hand. Detecting and tracking accurate feature points is essential for registering a virtual object on the real input image. Marker-based AR systems are stable, but they cannot register the virtual object on an acquired image once the marker leaves the camera's field of view, and they tend to constrain how users can manipulate the virtual object. In contrast, our system detects fingertips as fiducial features using an adaptive ellipse-fitting method that considers the geometric characteristics of the hand. It registers the virtual object stably by tracking fingertip movement with a shortest-distance criterion measured from the palm center. Experiments show that fingertip detection achieves an accuracy above 82.0%, while fingertip ordering and tracking show errors of only about 1.8% and 2.0%, respectively. We also show that, by acquiring the camera projection matrix effectively, the proposed system augments virtual objects stably enough to replace marker-based systems.
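The shortest-distance tracking described in the abstract can be sketched roughly as follows. This is an illustrative simplification under our own assumptions (the function names and the angular-ordering heuristic are not from the paper): each fingertip from the previous frame is matched to the nearest candidate in the current frame, and fingertips are ordered by their angle around the palm center so each finger keeps a consistent index.

```python
import math

def order_fingertips(fingertips, palm_center):
    """Order fingertip candidates by angle around the palm center,
    so the same finger keeps the same index across frames
    (a simplified stand-in for the paper's geometric ordering)."""
    cx, cy = palm_center
    return sorted(fingertips, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))

def track_fingertips(prev, current):
    """Greedily match each previous fingertip to the nearest current
    candidate -- the shortest-distance criterion from the abstract."""
    matched = []
    candidates = list(current)
    for p in prev:
        best = min(candidates, key=lambda c: math.dist(p, c))
        matched.append(best)
        candidates.remove(best)
    return matched
```

In the actual system, the candidates would come from the adaptive ellipse-fitting step on the segmented hand region, and the palm center from a distance transform of the hand mask; the greedy nearest-neighbor matching here is only a minimal approximation of that pipeline.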
