Motion Plane Estimation for Real-Time Hand Motion Recognition
(실시간 손동작 인식을 위한 동작 평면 추정)

  • Seung-Dae Jung (School of Electrical Engineering and Computer Science, Kyungpook National University) ;
  • Kyung-Ho Jang (School of Electrical Engineering and Computer Science, Kyungpook National University) ;
  • Soon-Ki Jung (Department of Computer Engineering, Kyungpook National University)
  • Published: 2009.10.31

Abstract

Hand gesture recognition has been studied for a long time, but most systems rely on expensive depth cameras or analyze images from multiple cameras, making them costly and leaving an extremely limited working space. In this paper, aimed at the remote control of home appliances, we propose a system that enlarges the working space with two rotational motors and recognizes hand gestures efficiently with an inexpensive commodity camera. The system estimates a 3D trajectory from the camera's pose information and the 2D fingertip position in the image, and projects this trajectory onto a motion plane to recover a meaningful linear motion pattern. We also test the developed system and define the working region within which it achieves the accuracy required for the given purpose.
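The pipeline above turns the camera's pan-tilt pose and the 2D fingertip position in the image into a 3D trajectory. The abstract does not give the paper's camera model, so the following is only a minimal sketch of the back-projection step, assuming a pinhole camera on a pan-tilt unit; the intrinsics and function names are hypothetical.

```python
# Illustrative sketch only: the paper's exact camera model is not given in the
# abstract. This assumes a pinhole camera on a pan-tilt unit and shows how a 2D
# fingertip detection plus the motor angles could be turned into a 3D viewing
# ray in a fixed world frame. All parameter values are hypothetical.
import numpy as np

def rotation_pan_tilt(pan_rad, tilt_rad):
    """World-from-camera rotation for a pan (yaw) followed by a tilt (pitch)."""
    cp, sp = np.cos(pan_rad), np.sin(pan_rad)
    ct, st = np.cos(tilt_rad), np.sin(tilt_rad)
    r_pan = np.array([[cp, 0.0, sp],
                      [0.0, 1.0, 0.0],
                      [-sp, 0.0, cp]])
    r_tilt = np.array([[1.0, 0.0, 0.0],
                       [0.0, ct, -st],
                       [0.0, st, ct]])
    return r_pan @ r_tilt

def fingertip_ray(u, v, pan_rad, tilt_rad, fx, fy, cx, cy):
    """Back-project pixel (u, v) into a unit direction in the world frame."""
    d_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])  # camera-frame ray
    d_world = rotation_pan_tilt(pan_rad, tilt_rad) @ d_cam
    return d_world / np.linalg.norm(d_world)

# Hypothetical intrinsics and a fingertip detected at image position (400, 260)
ray = fingertip_ray(400, 260, np.deg2rad(15), np.deg2rad(-5),
                    fx=800, fy=800, cx=320, cy=240)
print(ray)
```

Combining such rays across frames, or pairing them with an assumed distance to the hand, is one way the 2D detections could be lifted to the 3D trajectory that is later projected onto the motion plane.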

In this paper, we develop a vision-based hand motion recognition system using a camera mounted on two rotational motors. Existing systems were implemented with a range camera or multiple cameras, are costly, and have a limited working area. In contrast, we use a single uncalibrated camera and obtain a wider working area through pan-tilt motion. Given the image sequence provided by the pan-tilt camera, color and pattern information are integrated into a tracking system to find the 2D position and direction of the hand. From this pose information, we estimate the 3D motion plane on which the gesture trajectory approximately lies. The 3D trajectory of the moving fingertip is projected onto the motion plane, so that the resolving power for linear gesture patterns is enhanced. We have tested the proposed approach in terms of the accuracy of the trace angle and the dimensions of the working volume.
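The central step described here is projecting the 3D fingertip trajectory onto an estimated motion plane so that linear gesture patterns can be resolved more reliably. The sketch below shows one common way to realize this, a least-squares plane fit via SVD followed by projection into 2D plane coordinates; the paper's actual estimation procedure is not detailed in the abstract and may differ, and the trajectory data is synthetic.

```python
# Illustrative sketch only: least-squares motion-plane fit and projection of a
# 3D trajectory into 2D plane coordinates. The trajectory below is synthetic
# and lies roughly in a tilted plane with small out-of-plane noise.
import numpy as np

def fit_motion_plane(points):
    """Least-squares plane through 3D points: returns (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]  # direction of least variance = plane normal
    return centroid, normal

def project_to_plane(points, centroid, normal):
    """Project 3D points onto the plane and express them in 2D plane coordinates."""
    offsets = points - centroid
    in_plane = offsets - np.outer(offsets @ normal, normal)  # drop normal component
    # Build an orthonormal basis (u, v) spanning the plane.
    u = np.cross(normal, [0.0, 0.0, 1.0])
    if np.linalg.norm(u) < 1e-6:          # normal parallel to z: pick another axis
        u = np.cross(normal, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(normal, u)
    return np.column_stack((in_plane @ u, in_plane @ v))

# Synthetic fingertip trajectory lying roughly in the plane z = 0.3x + 0.2
t = np.linspace(0.0, 2.0 * np.pi, 60)
traj = np.column_stack((np.cos(t), np.sin(t), 0.3 * np.cos(t) + 0.2))
traj += 0.01 * np.random.randn(*traj.shape)

c, n = fit_motion_plane(traj)
pattern_2d = project_to_plane(traj, c, n)
print(pattern_2d[:3])
```

Once the trajectory is expressed in 2D plane coordinates, linear gestures reduce to nearly straight segments, which is consistent with the enhanced resolving power described above.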
