
Discrete Hashing Learning

Posted on: 2021-12-21
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Q Y Jiang
Full Text: PDF
GTID: 1488306500466644
Subject: Computer Science and Technology
Abstract/Summary:
Hashing has been widely used in many large-scale retrieval applications. The goal of hashing is to learn a hash function that maps data from the original feature space to binary hash codes while preserving the similarity structure as much as possible. Optimizing a hashing model is difficult because the binary codes are defined over a discrete space. Some hashing methods simply discard the binary constraint during training. However, discarding the binary constraint makes the model deviate from the original goal of hashing, so the retrieval accuracy of these methods deteriorates. Discrete hashing learning, in contrast, learns the binary hash codes directly during training, and can therefore achieve higher retrieval accuracy than methods that discard the binary constraint. This thesis studies discrete hashing learning in four application scenarios: non-deep single-modal hashing, deep single-modal hashing, non-deep multi-modal hashing, and deep multi-modal hashing. (A minimal illustrative sketch of the common hashing-based retrieval pipeline is given after the contribution list.) The contributions of this thesis are outlined as follows:

· In the non-deep single-modal scenario, graph hashing is one of the most important families of methods. However, existing discrete graph hashing methods cannot use the whole graph similarity for training, which hurts retrieval accuracy, and their training is inefficient. To fully exploit the graph similarity, this thesis proposes a graph hashing method called scalable graph hashing with feature transformation (SGH). SGH computes the whole graph similarity implicitly, so it can use the whole graph similarity during training while keeping linear time complexity. SGH further adopts a bitwise discrete optimization algorithm to learn the binary hash codes. Experiments demonstrate that SGH outperforms existing discrete graph hashing methods in retrieval accuracy and that its training is more efficient (the implicit-similarity idea is sketched after this list).

· In the deep single-modal scenario, existing methods suffer from two issues. First, they cannot use the supervised information to guide binary code learning and deep feature learning simultaneously and directly. Second, they are symmetric hashing methods whose training is inefficient. This thesis proposes a deep hashing method called deep discrete supervised hashing (DDSH), the first deep single-modal hashing method that uses pairwise supervised information to directly guide both binary code learning and deep feature learning, so that the two learning procedures can give feedback to each other during training. To address the training inefficiency of deep single-modal hashing, this thesis also proposes asymmetric deep supervised hashing (ADSH), which formulates the hashing problem asymmetrically and designs an efficient learning algorithm (an asymmetric objective of this kind is sketched after this list). Experiments demonstrate that DDSH achieves higher retrieval accuracy than existing deep single-modal hashing methods, that ADSH achieves higher retrieval accuracy within the shortest training time when compared with symmetric deep single-modal hashing methods other than DDSH, and that ADSH trains more efficiently than DDSH.
· In the non-deep multi-modal scenario, the computational complexity of existing pairwise non-deep cross-modal discrete hashing methods is quadratic in the training set size, so they can only train on a sampled subset when computational resources are limited, and their training is inefficient. This thesis proposes a cross-modal hashing method called discrete latent factor model based cross-modal hashing (DLFH). DLFH designs a discrete learning algorithm that is provably convergent for learning the binary hash codes, together with a stochastic sampling strategy that improves training efficiency. Experiments demonstrate that DLFH achieves higher retrieval accuracy than existing non-deep cross-modal hashing methods and that its training is more efficient than that of existing discrete cross-modal hashing methods.

· In the deep multi-modal scenario, this thesis introduces deep feature learning into cross-modal hashing for the first time and proposes deep cross-modal hashing (DCMH), the first cross-modal hashing method that integrates binary code learning and deep feature learning into an end-to-end framework. This thesis also proposes deep discrete latent factor model for cross-modal hashing (DDLFH), which seamlessly integrates the discrete learning ability of DLFH and the feature learning ability of deep learning into a single framework. Experiments demonstrate that DCMH achieves higher accuracy than non-deep cross-modal hashing methods, and that DDLFH achieves higher retrieval accuracy than both existing non-deep and deep cross-modal hashing methods (a cross-modal pairwise objective of this kind is sketched after this list).
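To make the stated goal of hashing concrete, the following is a minimal, illustrative sketch of the hashing-based retrieval pipeline shared by all of the methods above. It uses a random linear projection as a stand-in for a learned hash function (an assumption made only for illustration, not the approach of this thesis): features are mapped to binary codes, and database items are then ranked by Hamming distance to the query code.

import numpy as np

def random_hash_projection(dim, n_bits, seed=0):
    # Stand-in for a learned hash function: a random linear projection
    # followed by the sign function (LSH-style, for illustration only).
    rng = np.random.default_rng(seed)
    return rng.standard_normal((dim, n_bits))

def encode(X, W):
    # Map real-valued features (n x dim) to binary codes in {-1, +1}
    # (assuming no projected value is exactly zero).
    return np.sign(X @ W)

def hamming_rank(query_code, db_codes):
    # For codes in {-1, +1}: Hamming distance = (n_bits - inner product) / 2.
    dists = (db_codes.shape[1] - db_codes @ query_code) / 2
    return np.argsort(dists)

# Toy usage: 1000 database items, 128-d features, 32-bit codes.
X_db = np.random.randn(1000, 128)
x_query = np.random.randn(128)
W = random_hash_projection(128, 32)
B_db = encode(X_db, W)
b_query = np.sign(x_query @ W)
top10 = hamming_rank(b_query, B_db)[:10]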
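The key idea that lets SGH use the whole graph similarity with linear training complexity can be illustrated as follows. This is a hedged sketch of the general "implicit similarity" trick rather than SGH's exact feature transformation: if the n x n similarity matrix is (approximately) the product of an n x d transformed-feature matrix with its transpose, with d much smaller than n, then the products with the code matrix needed during optimization can be computed without ever materializing the n x n matrix.

import numpy as np

def similarity_times_codes(Phi, B):
    # Computes (Phi @ Phi.T) @ B as Phi @ (Phi.T @ B), i.e. in O(n*d*c) time,
    # so the full n x n graph similarity is only ever used implicitly.
    return Phi @ (Phi.T @ B)

n, d, c = 100000, 300, 32
Phi = np.random.randn(n, d) / np.sqrt(d)   # hypothetical transformed features
B = np.sign(np.random.randn(n, c))         # current binary codes
SB = similarity_times_codes(Phi, B)        # n x c result, linear in n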
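The asymmetric formulation adopted by ADSH can be sketched as follows; the shapes, names, and the tanh stand-in for the network output are illustrative assumptions, and ADSH's full algorithm (alternating updates, regularization terms) is omitted. The point is that real-valued outputs for a small set of sampled query points are fit, through inner products, to binary codes learned directly for the whole database, so only query-database pairs enter the loss instead of all database-database pairs.

import numpy as np

def asymmetric_loss(U, V, S):
    # U: m x c real-valued (tanh-activated) outputs for m sampled query points
    # V: n x c binary database codes in {-1, +1}, learned directly
    # S: m x n pairwise supervised similarity in {-1, +1}
    # Fit the inner products U @ V.T to the scaled similarity c * S.
    c = U.shape[1]
    return np.sum((U @ V.T - c * S) ** 2)

m, n, c = 200, 5000, 32
U = np.tanh(np.random.randn(m, c))   # stand-in for deep network outputs
V = np.sign(np.random.randn(n, c))   # database codes being optimized
S = np.sign(np.random.randn(m, n))   # toy pairwise labels
print(asymmetric_loss(U, V, S))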
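Finally, a pairwise cross-modal objective of the kind used by DLFH- and DCMH-style methods can be sketched as follows. The exact likelihood and scaling factors differ from paper to paper, so this is an assumption-laden illustration rather than the thesis's precise objective: codes (or network outputs) from the two modalities are pushed so that their inner product is large for similar cross-modal pairs and small for dissimilar ones.

import numpy as np

def cross_modal_nll(F, G, S):
    # F: n x c codes or outputs from the image modality
    # G: n x c codes or outputs from the text modality
    # S: n x n cross-modal similarity labels in {0, 1}
    Theta = 0.5 * F @ G.T
    # Negative log-likelihood of S under a sigmoid likelihood of Theta;
    # np.logaddexp(0, Theta) is a numerically stable log(1 + exp(Theta)).
    return -np.sum(S * Theta - np.logaddexp(0.0, Theta))

n, c = 500, 16
F = np.sign(np.random.randn(n, c))
G = np.sign(np.random.randn(n, c))
S = (np.random.rand(n, n) < 0.1).astype(float)
print(cross_modal_nll(F, G, S))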
Keywords/Search Tags: Hashing Learning, Large-scale Data Retrieval, Discrete Optimization, Discrete Hashing