Stacked Product Quantization For Nearest Neighbor Search

Posted on: 2018-07-09
Degree: Master
Type: Thesis
Country: China
Candidate: J Wang
Full Text: PDF
GTID: 2348330512477226
Subject: Computer Science and Technology
Abstract/Summary:
In recent years, with the development of big data, the Internet of Things, and mobile services, high-dimensional data such as images and video have grown exponentially. Retrieving target data from these large high-dimensional datasets is time-consuming and inefficient. To address this problem, nearest neighbor (NN) search in high-dimensional space has become a fundamental paradigm in many applications such as image retrieval, machine learning, and data mining. Product quantization (PQ) has proved to be one of the most effective methods for scalable high-dimensional nearest neighbor search, and many quantization algorithms have been proposed to improve on it. Optimized product quantization (OPQ) introduces a rotation matrix to address the imbalance among the subspaces of PQ, additive quantization (AQ) relaxes PQ's assumption that subspaces are independent, and stacked quantization (SQ) was proposed to further reduce quantization error. In this thesis, we propose a new quantization method, stacked product quantization (SPQ), for nearest neighbor search. SPQ combines the advantages of SQ and PQ: compared with PQ it achieves lower quantization error, and compared with SQ it consumes less time and memory. The core idea of our method is as follows. First, we subdivide each high-dimensional vector into multiple subspaces and quantize each subspace independently. Second, we subtract each sub-vector's quantized approximation from the sub-vector to obtain a residual vector. Third, we quantize the residual vector and again subtract its quantized approximation from it. We then repeat this three-step process, and the final codebook is the Cartesian product of the sub-codebooks. In the experiments, we evaluate SPQ on the SIFT1M and ConvNet-128 datasets. The results show that SPQ achieves low quantization error and high recall.
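The subdivide-quantize-subtract loop described in the abstract can be sketched in a few lines of code. The sketch below assumes k-means sub-codebooks; the function names train_spq and encode_spq, the number of subspaces M, the number of stages T, and the codebook size K are illustrative choices, not the thesis's actual implementation.

# Minimal sketch of the stacked product quantization (SPQ) idea: split each
# vector into M subspaces, quantize each with k-means, subtract the quantized
# approximation to form residuals, and repeat for T stages.
# Assumes the dimensionality d is divisible by M and the data are real-valued.
import numpy as np
from sklearn.cluster import KMeans

def train_spq(X, M=4, T=2, K=256):
    """Train T stacked layers of product quantizers on data X of shape (n, d)."""
    n, d = X.shape
    sub_dim = d // M
    residual = X.astype(np.float64)          # work on a float copy
    codebooks = []                           # codebooks[t][m]: stage t, subspace m
    for t in range(T):
        stage_books = []
        for m in range(M):
            sub = residual[:, m * sub_dim:(m + 1) * sub_dim]   # view into residual
            km = KMeans(n_clusters=K, n_init=4).fit(sub)
            stage_books.append(km.cluster_centers_)
            # Subtract the quantized approximation; the next stage sees residuals
            sub -= km.cluster_centers_[km.labels_]
        codebooks.append(stage_books)
    return codebooks

def encode_spq(X, codebooks):
    """Encode X as one codeword index per (stage, subspace)."""
    n, d = X.shape
    T, M = len(codebooks), len(codebooks[0])
    sub_dim = d // M
    residual = X.astype(np.float64)
    codes = np.empty((n, T, M), dtype=np.int32)
    for t in range(T):
        for m in range(M):
            sub = residual[:, m * sub_dim:(m + 1) * sub_dim]
            # Assign each sub-vector to its nearest centroid in this sub-codebook
            dists = ((sub[:, None, :] - codebooks[t][m][None, :, :]) ** 2).sum(-1)
            codes[:, t, m] = dists.argmin(1)
            sub -= codebooks[t][m][codes[:, t, m]]
    return codes

In this sketch, a vector is approximated by the concatenation over subspaces of the sum of its selected centroids across stages; the extra stages refine the residual left by the first PQ layer, which is what yields the lower quantization error relative to plain PQ.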
Keywords/Search Tags: Nearest Neighbor Search, High Dimensions, PQ, K-means