A sequential online 3D reconstruction system using dense stereo matching

Sosuke Yamao, Mamoru Miura, Shuji Sakai, Koichi Ito, Takafumi Aoki

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

This paper proposes a sequential online 3D reconstruction system using dense stereo matching for non-expert users, which sequentially reconstructs accurate and dense 3D point clouds whenever a new image is captured. The proposed system is based on a novel processing pipeline for sequential online 3D reconstruction with two key techniques: (i) camera parameter estimation by Structure from Motion (SfM) and (ii) dense stereo correspondence matching using Phase-Only Correlation (POC). The user can confirm the reconstruction result and add supplementary images to the system in order to reconstruct a complete 3D model as needed. Through a set of experiments, the proposed system exhibits efficient performance in terms of reconstruction accuracy and computation time compared with a conventional system.
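As background for technique (ii), the sketch below illustrates the core idea of Phase-Only Correlation in its simplest form: the cross-power spectrum of two images is normalized to unit magnitude so that only phase information remains, and its inverse FFT yields a sharp peak at the displacement between the images. This is a minimal illustration of the general POC principle, not the authors' implementation (which performs sub-pixel correspondence matching along epipolar lines); the function name and test images are hypothetical.

```python
import numpy as np

def phase_only_correlation(f, g):
    """Estimate the integer translational shift between two same-size
    grayscale images using Phase-Only Correlation (POC).

    Normalizing the cross-power spectrum discards amplitude and keeps
    only phase; its inverse FFT is a delta-like peak at the shift.
    """
    F = np.fft.fft2(f)
    G = np.fft.fft2(g)
    cross = F * np.conj(G)
    cross /= np.abs(cross) + 1e-12        # keep phase only
    poc = np.real(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(poc), poc.shape)
    # wrap peak coordinates into signed shifts
    shift = tuple(p if p <= s // 2 else p - s
                  for p, s in zip(peak, poc.shape))
    return shift, poc[peak]

# Example: circularly shift a random image by (3, 5) and recover the shift.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, shift=(3, 5), axis=(0, 1))
shift, peak_value = phase_only_correlation(shifted, img)
```

For an exact circular shift the POC peak value is close to 1.0; in practice the peak height also serves as a confidence measure for the match.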

Original language: English
Title of host publication: Proceedings - 2015 IEEE Winter Conference on Applications of Computer Vision, WACV 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 341-348
Number of pages: 8
ISBN (Electronic): 9781479966820
DOIs
Publication status: Published - 2015 Feb 19
Event: 2015 15th IEEE Winter Conference on Applications of Computer Vision, WACV 2015 - Waikoloa, United States
Duration: 2015 Jan 5 – 2015 Jan 9

Publication series

Name: Proceedings - 2015 IEEE Winter Conference on Applications of Computer Vision, WACV 2015

Other

Other: 2015 15th IEEE Winter Conference on Applications of Computer Vision, WACV 2015
Country: United States
City: Waikoloa
Period: 15/1/5 – 15/1/9

ASJC Scopus subject areas

  • Computer Science Applications
  • Computer Vision and Pattern Recognition

