Vision-based Human-Robot Motion Transfer in Tangible Meeting Space

A Study on Vision Sensor-Based Human-to-Robot Motion Transfer in a Tangible Meeting Space

  • 최유경 (KIST / Yonsei University, Academy-Research Cooperation Program) ;
  • 나성권 (Cognitive Robotics Research Center, Korea Institute of Science and Technology) ;
  • 김수환 (Cognitive Robotics Research Center, Korea Institute of Science and Technology) ;
  • 김창환 (Cognitive Robotics Research Center, Korea Institute of Science and Technology) ;
  • 박성기 (Cognitive Robotics Research Center, Korea Institute of Science and Technology)
  • Published: 2007.06.30

Abstract

This paper deals with a tangible interface system that introduces a robot as a remote avatar. It focuses on a new method that makes a robot imitate human arm motions captured in a remote space. Our method is functionally divided into two parts: capturing human motion and adapting it to the robot. In the capturing part, we propose a modified metaball potential function for real-time performance and high accuracy. In the adapting part, we suggest a geometric scaling method that resolves the structural differences between a human and a robot. Using this method, we implemented a tangible interface and evaluated its speed and accuracy.
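The abstract only names the two techniques, so the sketch below is not the paper's method. It uses stand-ins to illustrate the general idea: the classic Wyvill soft-object potential in place of the paper's modified metaball potential, and a simple limb-length ratio scaling in place of its geometric scaling method. All function names, joint coordinates, and segment lengths are illustrative assumptions.

```python
import numpy as np

def wyvill_potential(points, center, radius):
    """Classic Wyvill soft-object potential: 1 at the metaball center,
    falling smoothly to 0 at distance >= radius. Stand-in for the paper's
    modified potential used to fit a body model to captured vision data."""
    d2 = np.sum((np.asarray(points) - np.asarray(center)) ** 2, axis=-1)
    t = np.clip(d2 / (radius ** 2), 0.0, 1.0)  # normalized squared distance
    return 1.0 - (22.0 / 9.0) * t + (17.0 / 9.0) * t**2 - (4.0 / 9.0) * t**3

def retarget_arm(human_joints, human_upper, human_fore, robot_upper, robot_fore):
    """Naive geometric scaling: scale the human shoulder->elbow and
    elbow->wrist vectors by the ratio of robot to human segment lengths,
    so targets stay within the robot arm's proportions. Positions are
    returned in the robot shoulder frame."""
    shoulder, elbow, wrist = (np.asarray(j, dtype=float) for j in human_joints)
    upper_vec = (elbow - shoulder) * (robot_upper / human_upper)
    fore_vec = (wrist - elbow) * (robot_fore / human_fore)
    robot_elbow = upper_vec
    robot_wrist = upper_vec + fore_vec
    return robot_elbow, robot_wrist

if __name__ == "__main__":
    # Hypothetical captured arm joints in metres, shoulder at the origin.
    shoulder = np.array([0.0, 0.0, 0.0])
    elbow = np.array([0.0, -0.28, 0.10])
    wrist = np.array([0.0, -0.50, 0.25])

    # Field value of a metaball centered on the upper arm at a sample point.
    sample = np.array([[0.0, -0.15, 0.05]])
    print(wyvill_potential(sample, center=(shoulder + elbow) / 2, radius=0.2))

    # Retarget to a robot arm with shorter segments (lengths are assumptions).
    print(retarget_arm((shoulder, elbow, wrist),
                       human_upper=0.30, human_fore=0.28,
                       robot_upper=0.22, robot_fore=0.20))
```

In practice the captured elbow and wrist positions would be scaled this way every frame before being handed to the robot's inverse kinematics, which is the role the paper assigns to its geometric scaling step.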

Keywords