Real Time Motion Processing for Autonomous Navigation

  • Kolodko, J. (Intelligent Control Systems Laboratory, Griffith University)
  • Vlacic, L. (Intelligent Control Systems Laboratory, Griffith University)
  • Published: 2003.03.01

Abstract

An overview of our approach to autonomous navigation is presented, showing how motion information can be integrated into existing navigation schemes. Particular attention is given to our short-range motion estimation scheme, which utilises a number of unique assumptions about the nature of the visual environment, allowing a direct fusion of visual and range information. Graduated non-convexity is used to solve the resulting non-convex minimisation problem. Experimental results demonstrate the advantages of our fusion technique.
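The abstract's mention of graduated non-convexity (GNC) refers to solving a non-convex robust cost by minimising a sequence of progressively less-convex approximations, each warm-started from the previous solution. The sketch below is not the paper's motion-estimation formulation; it is a minimal illustration of the GNC idea on a toy problem (robustly estimating a constant from outlier-contaminated data), assuming a Geman-McClure penalty whose control parameter `mu` is annealed from a near-quadratic (convex) regime down to the strongly non-convex one:

```python
import numpy as np

def gnc_fit_constant(data, mus=(1e4, 1e2, 1e0, 1e-2), steps=200, lr=0.1):
    """Estimate a scalar x minimising the robust cost
        sum_i mu * r_i^2 / (mu + r_i^2),   r_i = x - data_i.
    For large mu the cost is nearly quadratic (convex); shrinking mu
    recovers the non-convex robust cost -- the GNC continuation idea."""
    x = float(np.mean(data))  # start from the convex least-squares solution
    for mu in mus:            # anneal from convex towards non-convex
        for _ in range(steps):
            r = x - data
            # gradient of the Geman-McClure cost: d/dx = 2*mu^2*r/(mu+r^2)^2
            g = np.sum(2.0 * mu**2 * r / (mu + r**2) ** 2)
            x -= lr * g / len(data)
    return x

rng = np.random.default_rng(0)
inliers = rng.normal(5.0, 0.1, 40)    # true value is 5.0
outliers = rng.normal(50.0, 1.0, 10)  # gross outliers
data = np.concatenate([inliers, outliers])
print(gnc_fit_constant(data))  # close to 5.0; the plain mean is pulled to ~14
```

Minimising the final non-convex cost directly from a poor initial guess can stall in a local minimum near the contaminated mean; the annealing schedule over `mus` (an assumption here, not taken from the paper) is what lets the estimate escape towards the inlier consensus.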
