Recognizing surface qualities from natural images based on learning to rank

Takashi Abe, Takayuki Okatani, Koichiro Deguchi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

14 Citations (Scopus)

Abstract

This paper proposes a method for estimating quantitative values of attributes associated with the surface qualities of an object, such as glossiness and transparency, from its image. Our approach is to learn functions that compute such attribute values from an input image, using training data given in the form of relative information: each training sample indicates, for a pair of images, which of the two is greater in terms of the target attribute. The functions are learned based on learning to rank. This approach enables us to deal with natural images, which previous works cannot handle because they rely on CG-synthesized images for both training and testing. We created data sets from the Flickr Material Database for four attributes (glossiness, transparency, smoothness, and coldness) and learned the functions representing the values of these attributes. We present experimental results showing that the learned functions perform very promisingly in estimating the attribute values.
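The abstract does not spell out the model, but the pairwise training signal it describes is the setting that RankSVM-style learning to rank addresses. Below is a minimal, hypothetical sketch of that idea: a linear scorer trained with a pairwise hinge loss on synthetic features. The feature dimensionality, the synthetic data, and all hyperparameters are illustrative assumptions, not the authors' actual model or features.

```python
# Illustrative sketch only: a linear pairwise ranker in the spirit of RankSVM.
# `feat_dim`, the synthetic data, and the hyperparameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
feat_dim = 16  # assumed dimensionality of an image feature vector

# Synthetic training pairs: (a, b) means image a is rated greater than
# image b in the target attribute (e.g., glossiness).
true_w = rng.normal(size=feat_dim)        # hidden "ground truth" scorer
feats = rng.normal(size=(200, feat_dim))  # stand-in image features
scores = feats @ true_w
pairs = []
for _ in range(500):
    i, j = rng.integers(0, len(feats), size=2)
    if scores[i] > scores[j]:
        pairs.append((i, j))
    elif scores[j] > scores[i]:
        pairs.append((j, i))

def train_ranker(feats, pairs, lr=0.01, reg=1e-3, epochs=100):
    """Minimize the pairwise hinge loss max(0, 1 - w.(x_a - x_b))
    over all pairs (a, b) where a is ranked above b."""
    w = np.zeros(feats.shape[1])
    for _ in range(epochs):
        for a, b in pairs:
            diff = feats[a] - feats[b]
            if 1.0 - w @ diff > 0.0:        # margin violated
                w += lr * (diff - reg * w)  # hinge subgradient step
            else:
                w -= lr * reg * w           # L2 regularization only
    return w

w = train_ranker(feats, pairs)
# Fraction of training pairs the learned scorer orders correctly.
acc = np.mean([(feats[a] - feats[b]) @ w > 0 for a, b in pairs])
print(f"pairwise accuracy: {acc:.2f}")
```

Each comparison constrains only the ordering of the scorer's outputs, not their absolute values, which is why this kind of objective can be trained from the relative annotations the abstract describes.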

Original language: English
Title of host publication: ICPR 2012 - 21st International Conference on Pattern Recognition
Pages: 3712-3715
Number of pages: 4
Publication status: Published - 2012 Dec 1
Event: 21st International Conference on Pattern Recognition, ICPR 2012 - Tsukuba, Japan
Duration: 2012 Nov 11 - 2012 Nov 15

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651

Other

Other: 21st International Conference on Pattern Recognition, ICPR 2012
Country: Japan
City: Tsukuba
Period: 12/11/11 - 12/11/15

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
