Improvement of odometry for omnidirectional vehicle using optical flow information

Keiji Nagatani, S. Tachibana, M. Sofue, Y. Tanaka

Research output: Contribution to conference (paper, peer-reviewed)

49 Citations (Scopus)

Abstract

Our research goal is to realize robust navigation in indoor and outdoor environments for an autonomous vehicle. An omnidirectional vehicle driven by four Mecanum wheels was chosen as our research platform. A Mecanum wheel has 16 rollers around its circumference, tilted 45 degrees against the direction of wheel rotation, so the vehicle can move omnidirectionally by controlling the four wheels independently. However, it has a disadvantage in odometry because of wheel slippage. In particular, when the robot moves laterally, the same wheel rotations produce different traveling distances depending on the friction of the ground surface. To cope with this problem, we estimate the robot's position by detecting optical flow in ground images using a vision sensor (visual dead-reckoning). This estimation method is less accurate than odometry, but it is independent of the friction of the ground surface. Therefore, the estimated vehicle position can be improved by fusing odometry and visual dead-reckoning based on a maximum likelihood technique. This paper describes an odometry method and a visual dead-reckoning method for the omnidirectional vehicle, and a fusion technique to improve the estimated position of the vehicle. Finally, experimental results support the above technique.
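The two estimates and their fusion can be illustrated with a short sketch. The Python code below is not the authors' implementation: the wheel geometry (R, LX, LY), sampling period, and noise variances are hypothetical placeholders; the odometry step uses the standard 45-degree-roller Mecanum forward kinematics; and the fusion step is the inverse-variance weighted average that maximum-likelihood estimation reduces to under an assumption of independent Gaussian noise.

```python
import numpy as np

# Hypothetical geometry and noise parameters (not taken from the paper).
R = 0.05              # wheel radius [m]
LX, LY = 0.20, 0.15   # half wheelbase / half track [m]
DT = 0.02             # sampling period [s]

def mecanum_odometry(wheel_rates):
    """Body-frame velocity (vx, vy, omega) from four Mecanum wheel angular
    rates [rad/s], using the standard 45-degree-roller forward-kinematics
    model (wheel order: front-left, front-right, rear-left, rear-right)."""
    w1, w2, w3, w4 = wheel_rates
    k = LX + LY
    vx = R / 4.0 * ( w1 + w2 + w3 + w4)
    vy = R / 4.0 * (-w1 + w2 + w3 - w4)
    wz = R / (4.0 * k) * (-w1 + w2 - w3 + w4)
    return np.array([vx, vy, wz])

def fuse_ml(odo, vis, var_odo, var_vis):
    """Maximum-likelihood fusion of two displacement estimates under
    independent Gaussian noise: inverse-variance weighted average."""
    w_o = 1.0 / var_odo
    w_v = 1.0 / var_vis
    return (w_o * odo + w_v * vis) / (w_o + w_v)

# One example step: wheel-based and optical-flow-based displacement estimates.
d_odo = mecanum_odometry([10.0, 10.3, 9.8, 10.1])[:2] * DT  # from encoders
d_vis = np.array([0.0099, 0.0012])   # from ground-image optical flow (assumed)
var_odo = np.array([1e-6, 4e-5])     # lateral odometry degraded by slippage
var_vis = np.array([9e-6, 9e-6])     # flow noise assumed direction-independent
print(fuse_ml(d_odo, d_vis, var_odo, var_vis))
```

The weighting reflects the paper's motivation: during lateral motion the odometry variance grows because of slippage, so the fused estimate leans on the optical-flow measurement, while during forward motion the more precise encoder-based estimate dominates.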

Original language: English
Pages: 468-473
Number of pages: 6
Publication status: Published - 2000 Dec 1
Event: 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems - Takamatsu, Japan
Duration: 2000 Oct 31 - 2000 Nov 5


ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
