
Associative Motion Generation for Humanoid Robot Reflecting Human Body Movement

  • Wakabayashi, Akinori (Dept. of Computer Science and Engineering, Graduate School of Engineering, Nagoya Institute of Technology) ;
  • Motomura, Satona (Dept. of Computer Science and Engineering, Graduate School of Engineering, Nagoya Institute of Technology) ;
  • Kato, Shohei (Dept. of Computer Science and Engineering, Graduate School of Engineering, Nagoya Institute of Technology)
  • Received : 2012.02.28
  • Accepted : 2012.06.20
  • Published : 2012.06.25

Abstract

This paper proposes an intuitive real-time robot control system driven by human body movement. Methods have recently been developed that generate humanoid robot motion by reflecting human body movement measured with a motion capture system. However, existing approaches to controlling a robot through body movement require detailed structural information about the robot, such as its degrees of freedom, ranges of motion, and form, in order to compute inverse kinematics. In earlier work, we proposed Associative Motion Generation, a motion generation method for humanoid robots that does not require such detailed structural information. The associative motion generation system is composed of two neural networks, a nonlinear principal component analysis network and a Jordan recurrent neural network, and an associative motion is generated in three steps. First, the system learns the correspondence between indications and motions from training data. Second, associative values are extracted from an unfamiliar indication by nonlinear principal component analysis in order to associate a new motion with it. Last, the robot generates the new motion by feeding the associative values through the Jordan recurrent neural network. In this paper, we build on Associative Motion Generation to propose a real-time humanoid robot control system that enables a user to control robot motion intuitively through body movement. Through task-processing and subjective evaluation experiments, we confirmed the usability and favorable affective evaluation of the proposed system.
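
To make the three-step pipeline described above concrete, the sketch below shows one minimal way the two networks could fit together, written in Python with NumPy. An autoassociative (bottleneck) network performs nonlinear principal component analysis, compressing an indication vector into low-dimensional associative values, and a Jordan-style recurrent network unrolls a joint-angle sequence from those values using output feedback. All layer sizes, dimensions, the plain SGD training loop, and the toy data are illustrative assumptions; training of the Jordan network is omitted, and none of this reflects the authors' actual implementation.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# ---- Step 1-2: nonlinear PCA via an autoassociative bottleneck network ----
# Maps an "indication" vector (e.g. captured body-movement features) through a
# narrow bottleneck; the bottleneck activations serve as the associative values.
class NonlinearPCA:
    def __init__(self, n_in, n_hidden, n_bottleneck, lr=0.1):
        self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.W2 = rng.normal(0, 0.1, (n_hidden, n_bottleneck))
        self.W3 = rng.normal(0, 0.1, (n_bottleneck, n_hidden))
        self.W4 = rng.normal(0, 0.1, (n_hidden, n_in))
        self.lr = lr

    def forward(self, x):
        h1 = sigmoid(x @ self.W1)
        z = sigmoid(h1 @ self.W2)          # associative values (bottleneck)
        h2 = sigmoid(z @ self.W3)
        out = h2 @ self.W4                 # linear reconstruction of the input
        return h1, z, h2, out

    def train_step(self, x):
        # One backpropagation step on the reconstruction error ||out - x||^2.
        h1, z, h2, out = self.forward(x)
        d_out = out - x
        d_h2 = (d_out @ self.W4.T) * h2 * (1 - h2)
        d_z = (d_h2 @ self.W3.T) * z * (1 - z)
        d_h1 = (d_z @ self.W2.T) * h1 * (1 - h1)
        self.W4 -= self.lr * np.outer(h2, d_out)
        self.W3 -= self.lr * np.outer(z, d_h2)
        self.W2 -= self.lr * np.outer(h1, d_z)
        self.W1 -= self.lr * np.outer(x, d_h1)

    def associative_values(self, x):
        return self.forward(x)[1]

# ---- Step 3: Jordan recurrent network ----
# Generates a joint-angle sequence; the previous output is fed back as context,
# and the associative values condition every time step.
class JordanRNN:
    def __init__(self, n_assoc, n_hidden, n_joints):
        self.Wa = rng.normal(0, 0.1, (n_assoc, n_hidden))    # associative values -> hidden
        self.Wc = rng.normal(0, 0.1, (n_joints, n_hidden))   # context (previous output) -> hidden
        self.Wo = rng.normal(0, 0.1, (n_hidden, n_joints))   # hidden -> joint angles

    def generate(self, assoc, steps):
        context = np.zeros(self.Wo.shape[1])   # previous joint-angle output
        motion = []
        for _ in range(steps):
            h = sigmoid(assoc @ self.Wa + context @ self.Wc)
            y = np.tanh(h @ self.Wo)           # joint angles scaled to [-1, 1]
            motion.append(y)
            context = y                        # Jordan-style output feedback
        return np.array(motion)

# ---- Putting the steps together on toy data ----
indications = rng.normal(size=(50, 12))        # hypothetical body-movement feature vectors
npca = NonlinearPCA(n_in=12, n_hidden=8, n_bottleneck=3)
for epoch in range(200):                       # learn the indication representation
    for x in indications:
        npca.train_step(x)

new_indication = rng.normal(size=12)           # an unfamiliar indication
assoc = npca.associative_values(new_indication)
robot = JordanRNN(n_assoc=3, n_hidden=10, n_joints=6)
motion = robot.generate(assoc, steps=30)       # 30 frames of 6 joint angles
print(motion.shape)                            # (30, 6)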

Keywords

