A Directional Perception System based on Human Detection for Public Guide Robots


  • 도태용 (Department of Control and Instrumentation Engineering, Hanbat National University)
  • 백정현 (KMC Robotics Co., Ltd.)
  • Received : 2010.02.08
  • Accepted : 2010.03.13
  • Published : 2010.05.01

Abstract

Most public guide robots installed in public spots such as exhibition halls and department store lobbies have poor capability to distinguish the users who require services. To provide suitable services, a public guide robot should have a human detection system that makes it possible to infer customers' intentions from their direction of movement. In this paper, a DPS (Directional Perception System) is realized based on face detection technology. In particular, to capture human movement efficiently and reduce computation time, a human detection technique using the face rectangle, which is obtained from the detected human face, is developed. The DPS determines which customer needs the services of the public guide robot by examining the size and direction of the face rectangle. With the DPS, guide services can be provided with greater satisfaction and reliability, and power efficiency also improves, because the public guide robot serves only those users who explicitly express their intention of wanting service. Finally, the feasibility of the proposed DPS is verified through several experiments.
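The abstract describes inferring a customer's movement (and hence intention) from how the detected face rectangle changes over consecutive frames. A minimal sketch of that kind of decision rule is given below; the function names, thresholds, and the simple area/centre heuristics are illustrative assumptions, not values or algorithms taken from the paper:

```python
# Sketch of a directional-perception rule of the kind the abstract
# describes: given face rectangles (x, y, w, h) from two consecutive
# frames, a growing rectangle suggests the person is approaching the
# robot, and horizontal drift of its centre suggests lateral movement.
# All thresholds here are hypothetical, chosen only for illustration.

def perceive_direction(prev, curr, grow_thresh=1.05, drift_thresh=5):
    """Classify movement from two face rectangles (x, y, w, h)."""
    px, _, pw, ph = prev
    cx, _, cw, ch = curr
    prev_area = pw * ph
    curr_area = cw * ch

    # A noticeably larger face rectangle means the face is closer.
    if curr_area > prev_area * grow_thresh:
        return "approaching"
    if curr_area * grow_thresh < prev_area:
        return "leaving"

    # Otherwise compare horizontal centres to estimate lateral motion.
    drift = (cx + cw / 2) - (px + pw / 2)
    if drift > drift_thresh:
        return "moving right"
    if drift < -drift_thresh:
        return "moving left"
    return "stationary"

def wants_service(prev, curr, min_area=4000):
    """Serve only a user who approaches and is close enough (large face)."""
    _, _, w, h = curr
    return perceive_direction(prev, curr) == "approaching" and w * h >= min_area
```

In a real system the rectangles would come from a face detector such as an OpenCV Haar cascade, and the per-frame decisions would be smoothed over several frames before triggering a service behaviour.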

