Fingertip Extraction and Hand Motion Recognition Method for Augmented Reality Applications

증강현실 응용을 위한 손 끝점 추출과 손 동작 인식 기법

  • Jung-Jin Lee (School of Digital Media, Catholic University of Korea) ;
  • Jong-Ho Kim (Department of Computer Engineering, Seokyeong University) ;
  • Tae-Young Kim (Department of Computer Engineering, Seokyeong University)
  • Received : 2009.07.07
  • Accepted : 2009.11.18
  • Published : 2010.02.28

Abstract

In this paper, we propose a fingertip extraction and hand motion recognition method for augmented reality applications. First, the input image is converted from RGB to HSV color space, and the hand region is segmented by double thresholding of the H and S values, region growing, and connected component analysis. Next, the end points of the index finger and thumb are extracted using morphological operations and subtraction, for use in a virtual keyboard and mouse interface. Finally, the angle between the index fingertip and the thumb tip with respect to the palm's center of mass is calculated to detect contact between the two fingers, which implements a mouse-button click. Experimental results on various input images show that our method segments the hand and fingertips and recognizes hand movements quickly and accurately. The proposed method can serve as an input interface for augmented reality applications.
