Collision detection VLSI processor for intelligent vehicles based on a hierarchical obstacle representation

Masanori Hariyama, Michitaka Kameyama

Research output: Contribution to conference › Paper › peer-review

Abstract

To avoid traffic accidents, possible collisions between a vehicle and obstacles must be detected at high speed. Increasing the accuracy of the obstacle representation increases the number of discrete points used to represent obstacles, and hence the number of collision checks over those points. We propose a hierarchical collision detection algorithm to reduce the time complexity. First, collision checks are performed with a coarse representation of the obstacles. Only if a collision exists at this level are checks performed with a fine representation; otherwise, the fine-level checks can be skipped entirely. Since the vehicle pixel information is predetermined and does not change, a high-performance ROM-type CAM is employed to perform the matching operation in parallel. A parallel and pipelined architecture for high-speed coordinate transformations, based on matrix multiplications, is also proposed. The performance of the proposed VLSI processor is several tens of times higher than that of an equivalent special-purpose processor that does not use the hierarchical representation.
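The coarse-to-fine strategy described above can be sketched in software as follows. This is a minimal illustration only, not the paper's VLSI implementation; the grids, the `scale` parameter, and all function names are hypothetical, with obstacles modeled as sets of occupied cells.

```python
# Hedged sketch of hierarchical (coarse-to-fine) collision detection.
# All names and data below are illustrative, not from the paper.

def collides(points, occupied):
    """Return True if any discrete point falls in an occupied cell."""
    return any(p in occupied for p in points)

def hierarchical_collision(vehicle_points, coarse_obstacles, fine_obstacles, scale):
    # Step 1: check against the coarse obstacle representation,
    # where each coarse cell covers a scale-by-scale block of fine cells.
    coarse_points = {(x // scale, y // scale) for (x, y) in vehicle_points}
    if not collides(coarse_points, coarse_obstacles):
        return False  # no coarse-level collision, so fine checks are skipped
    # Step 2: only on a coarse-level hit, run the expensive fine-level check.
    return collides(vehicle_points, fine_obstacles)

# Example: one fine obstacle cell at (5, 5); coarse cells are 4x4 blocks.
fine = {(5, 5)}
coarse = {(x // 4, y // 4) for (x, y) in fine}  # {(1, 1)}
print(hierarchical_collision({(5, 5)}, coarse, fine, 4))  # True
print(hierarchical_collision({(0, 0)}, coarse, fine, 4))  # False
```

The saving comes from the early exit: when the coarse check misses, none of the fine-level points are examined, which mirrors the paper's reduction in the number of collision checks.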

Original language: English
Pages: 830-834
Number of pages: 5
Publication status: Published - 1997 Dec 1
Event: Proceedings of the 1997 IEEE Conference on Intelligent Transportation Systems, ITSC - Boston, MA, USA
Duration: 1997 Nov 9 - 1997 Nov 12

Other

Other: Proceedings of the 1997 IEEE Conference on Intelligent Transportation Systems, ITSC
City: Boston, MA, USA
Period: 97/11/9 - 97/11/12

ASJC Scopus subject areas

  • Automotive Engineering
  • Mechanical Engineering
  • Computer Science Applications

