Analysis of Face Direction and Hand Gestures for Recognition of Human Motion


  • 김성은 (Virtual Reality Research Center, Electronics and Telecommunications Research Institute)
  • 조강현 (Electrical, Electronics and Information Systems, University of Ulsan)
  • 전희성 (Electrical, Electronics and Information Systems, University of Ulsan)
  • 최원호 (Electrical, Electronics and Information Systems, University of Ulsan)
  • 박경섭 (Electrical, Electronics and Information Systems, University of Ulsan)
  • Published: 2001.04.01

Abstract

In this paper, we describe methods for analyzing human gestures. A human interface (HI) system for gesture analysis captures image sequences of an operator's continuous behavior with CCD cameras and then extracts the head and hand regions. Since gestures are performed with the operator's head and hands, we extract these skin regions and compute their geometrical information in order to analyze the gestures. Head motion is analyzed by estimating the face direction. We model the head as an ellipsoid in 3D coordinates and locate facial features such as the eyes, nose, and mouth on its surface; given the center of these feature points, the angle of that center on the ellipsoid gives the direction of the face. The region obtained from preprocessing may include the arm as well as the hand, so to extract only the hand we find the wrist line that divides the hand and arm regions. After isolating the hand region by the wrist line, we model it as an ellipse for the analysis of hand data, and the finger part is represented as a long, narrow shape. From this model we extract hand information such as size, position, and shape.
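The ellipse model of the hand region described above can be sketched with second-order central moments of the region's pixel coordinates: the eigenvalues of the 2x2 covariance matrix give the axis lengths and the eigenvector orientation gives the ellipse angle. This is an illustrative sketch, not the authors' implementation; the function name and the toy region are assumptions.

```python
import math

def ellipse_from_region(pixels):
    """Illustrative (not the paper's code): fit an ellipse (centroid,
    axis lengths, orientation) to a set of (x, y) pixel coordinates
    using second-order central moments."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    # Central second moments of the region
    mxx = sum((x - cx) ** 2 for x, _ in pixels) / n
    myy = sum((y - cy) ** 2 for _, y in pixels) / n
    mxy = sum((x - cx) * (y - cy) for x, y in pixels) / n
    # Eigenvalues of the covariance matrix: variances along the axes
    common = math.sqrt(((mxx - myy) / 2) ** 2 + mxy ** 2)
    l1 = (mxx + myy) / 2 + common  # major-axis variance
    l2 = (mxx + myy) / 2 - common  # minor-axis variance
    # Orientation of the major axis, in radians
    angle = 0.5 * math.atan2(2 * mxy, mxx - myy)
    return (cx, cy), 2 * math.sqrt(l1), 2 * math.sqrt(l2), angle

# Toy "hand region": an elongated axis-aligned blob of pixels
region = [(x, y) for x in range(40) for y in range(10)]
center, major, minor, theta = ellipse_from_region(region)
```

A long, narrow finger region would show up in this model as a large ratio between the major and minor axes, which is one way the "long and narrow shape" mentioned above can be quantified.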

Keywords

References

  1. K. H. Jo, Y. Kuno, and Y. Shirai, 'Manipulative hand gesture recognition using task knowledge for human computer interaction,' Proc. 3rd IEEE International Conference on Automatic Face and Gesture Recognition, pp. 468-473, 1998 https://doi.org/10.1109/AFGR.1998.670992
  2. Pattie Maes, Bruce Blumberg, Trevor Darrell, and Alex Pentland, 'The alive system: Full-body interaction with animated autonomous agents,' ACM Multimedia Systems, vol. 5, pp. 105-112, 1997 https://doi.org/10.1007/s005300050046
  3. Francis K. H. Quek, 'Unencumbered gestural interaction,' IEEE Multimedia, vol. 3, no. 4, pp. 36-47, Winter, 1996 https://doi.org/10.1109/93.556459
  4. Christopher Wren, Ali Azarbayejani, Trevor Darrell, and Alex Pentland, 'Pfinder: Real-Time tracking of the human body,' IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 19, no. 7, pp. 780-785, July, 1997 https://doi.org/10.1109/34.598236
  5. Ming-Hsuan Yang and Narendra Ahuja, 'Extraction and classification of visual motion patterns for hand gesture recognition,' In Proceedings of the IEEE CVPR, pp. 892-897, Santa Barbara, 1998 https://doi.org/10.1109/CVPR.1998.698710
  6. Maylor K. Leung and Yee-Hong Yang, 'First sight: A human body outline labeling system,' IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 17, no. 4, April, 1995 https://doi.org/10.1109/34.385981