TY - GEN

T1 - Fast encoding method for vector quantization based on subvector technique with a modified data structure

AU - Pan, Zhibin

AU - Kotani, Koji

AU - Ohmi, Tadahiro

PY - 2004/12/1

Y1 - 2004/12/1

N2 - The encoding process of vector quantization (VQ) is a time bottleneck in its practical applications. To speed up VQ encoding, the Euclidean distance can first be estimated with a lighter computation in order to reject a candidate codeword. Such an estimation requires appropriate features of a vector. By using the well-known statistical features of the sum and the variance of a k-dimensional vector, and furthermore of its two corresponding (k/2)-dimensional subvectors, the Euclidean distance can easily be estimated so as to reject most unlikely codewords for a given input vector, as proposed in [2]-[5]. Because computing the variance of a k-dimensional vector online is computationally expensive, a new feature based on the variances of the two subvectors is constructed in this paper for estimating the Euclidean distance. Meanwhile, a modified, more memory-efficient data structure is proposed for storing all features of a vector, reducing the extra memory requirement compared to the latest previous work [5]. Experimental results confirm that the proposed method is more search-efficient.

AB - The encoding process of vector quantization (VQ) is a time bottleneck in its practical applications. To speed up VQ encoding, the Euclidean distance can first be estimated with a lighter computation in order to reject a candidate codeword. Such an estimation requires appropriate features of a vector. By using the well-known statistical features of the sum and the variance of a k-dimensional vector, and furthermore of its two corresponding (k/2)-dimensional subvectors, the Euclidean distance can easily be estimated so as to reject most unlikely codewords for a given input vector, as proposed in [2]-[5]. Because computing the variance of a k-dimensional vector online is computationally expensive, a new feature based on the variances of the two subvectors is constructed in this paper for estimating the Euclidean distance. Meanwhile, a modified, more memory-efficient data structure is proposed for storing all features of a vector, reducing the extra memory requirement compared to the latest previous work [5]. Experimental results confirm that the proposed method is more search-efficient.

UR - http://www.scopus.com/inward/record.url?scp=21444455736&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=21444455736&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:21444455736

SN - 0780386396

T3 - Proceedings of 2004 International Symposium on Intelligent Signal Processing and Communication Systems, ISPACS 2004

SP - 570

EP - 573

BT - Proceedings of 2004 International Symposium on Intelligent Signal Processing and Communication Systems, ISPACS 2004

A2 - Ko, S.J.

T2 - Proceedings of 2004 International Symposium on Intelligent Signal Processing and Communication Systems, ISPACS 2004

Y2 - 18 November 2004 through 19 November 2004

ER -