
Extensions Of Depths To The Reproducing Kernel Hilbert Space And The Tensor Space

Posted on: 2010-02-02
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y G Hu
Full Text: PDF
GTID: 1100360305973620
Subject: Applied Mathematics
Abstract/Summary:
The depth of a point in multidimensional space measures the centrality of that point with respect to a multivariate distribution or a given data cloud. It provides a new way of ordering data and has been used successfully in robust estimation, discriminant analysis and many other fields. Classical data depth suffers from two limitations: the convexity of its contours, which prevents it from dealing effectively with non-convex data sets, and its restriction to vector-valued data, which prevents it from handling data in tensor form. Addressing these two limitations, the main contributions of this dissertation are as follows.

First, to address the convexity problem, we extend data depths into the reproducing kernel Hilbert space (RKHS) so that the depth contours automatically adapt to the shape of the data set.
- Extension of the classical depth functions into the RKHS. Taking the spatial rank depth and the L2 depth as examples, we discuss the main properties of the kernel-mapped depths (a sketch of such a kernel-mapped depth follows this abstract). In addition, through the extension of the Mahalanobis depth, we give a general method for extending depths that cannot be kernelized directly.
- Construction of a new generalized depth function, the minimal sphere depth. Compared with the classical depth functions, it not only gives a more reasonable depth interpretation for non-convex data sets but is also much easier to compute.

Second, we extend data depths into tensor space in order to process data in tensor form efficiently.
- To overcome the heavy computation of the traditional projection depth, we redefine it as the Rayleigh projection depth, which reduces the computation to a Rayleigh quotient.
- The tensor-based projection depth (TPD) is proposed. We present an algorithm for second-order TPD, prove its convergence, and discuss TPD for higher-order tensors. Classification experiments show that TPD performs much better than the projection depth (PD) for data in natural tensor form, and is not noticeably inferior to PD even for data in natural vector form.

Third, three applications based on the kernel-mapped depths and SVM are presented.
- To adaptively reduce the influence of outliers on classification, a robust SVM is proposed by weighting the regularization term of the objective function with the kernel-mapped depth.
- A method is presented to estimate the potential support vectors (SVs) in advance by constructing a ratio depth. It not only reduces the computation by removing redundant data while preserving accuracy, but also improves SVM performance through a revised kernel method that uses the interface information.
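The following is a minimal sketch of the kernel-mapped spatial rank depth idea mentioned in the first contribution: the spatial depth is evaluated in the RKHS induced by a kernel, with every feature-space norm and inner product obtained through the kernel trick. It assumes a Gaussian RBF kernel and the standard spatial depth definition; the function names (rbf_kernel, kernelized_spatial_depth), the bandwidth gamma and the constant eps are illustrative choices, not the dissertation's own implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and the rows of B."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def kernelized_spatial_depth(x, X, kernel=rbf_kernel, eps=1e-12):
    """
    Spatial (rank) depth of point x with respect to sample X, evaluated in the
    RKHS induced by `kernel`.  Feature-space quantities are computed via
        <phi(x)-phi(y), phi(x)-phi(z)> = k(x,x) - k(x,z) - k(y,x) + k(y,z).
    """
    x = np.atleast_2d(x)
    n = X.shape[0]
    kxx = kernel(x, x)[0, 0]        # k(x, x)
    kxX = kernel(x, X)[0]           # k(x, y_i), shape (n,)
    KXX = kernel(X, X)              # k(y_i, y_j), shape (n, n)

    # ||phi(x) - phi(y_i)|| for every sample point (eps guards against x == y_i)
    d = np.sqrt(np.maximum(kxx - 2.0 * kxX + np.diag(KXX), 0.0)) + eps

    # Gram matrix of the unit vectors (phi(x) - phi(y_i)) / ||phi(x) - phi(y_i)||
    G = (kxx - kxX[None, :] - kxX[:, None] + KXX) / np.outer(d, d)

    # Depth = 1 - norm of the average unit vector (the spatial rank in the RKHS)
    avg_norm = np.sqrt(max(G.sum(), 0.0)) / n
    return 1.0 - avg_norm
```

For example, `kernelized_spatial_depth(np.zeros(2), np.random.randn(200, 2))` returns a value close to 1 for this central point and smaller values for points far from the data cloud; because all computations go through the kernel matrix, the depth contours follow the shape of the data rather than being forced to be convex.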
Keywords/Search Tags: data depth, kernel method, statistical depth, tensor depth, kernel mapped depth