
The Analysis Of The DRT's Computation Complexity And The Research On Compound DRT Based On DCT

Posted on: 2007-05-13    Degree: Master    Type: Thesis
Country: China    Candidate: Y G Hu    Full Text: PDF
GTID: 2120360215469897    Subject: Mathematics
Abstract/Summary:
This thesis is devoted to the study of a compound dimensionality reduction method. Dimensionality reduction is a foundation of high-dimensional data analysis, yet although existing methods perform well on data of moderately high dimensionality, their computational complexity makes them impractical for data of very high dimensionality. The research therefore focuses on four aspects:

1. Survey of existing methods. We analyze the strengths and weaknesses of the typical methods in detail, as well as the essential relationships among the linear dimensionality reduction methods such as PCA, PP, ICA, and LDA. The underlying ideas of the nonlinear methods, including MDS, Isomap, LLE, and Laplacian Eigenmaps, are also discussed. For very high-dimensional data, we study in detail how each algorithm's computational complexity grows with the dimensionality.

2. The compound method. Applying the DCT to high-dimensional data concentrates most of each sample's energy in the leading coefficients without altering the data's topological structure. Exploiting these two properties, strong energy compaction and distance preservation, this thesis uses the DCT as a preprocessing step for dimensionality reduction (a sketch of this step appears after this abstract). The preprocessing is useful in two respects. First, it discards most of the dimensions of the high-dimensional data, which reduces the amount of computation. Second, it can repair data structure corrupted by measurement error or noise and thereby improve the recognition ability of the subsequent dimensionality reduction method.

3. Selection of the truncation dimensionality based on a distortion measure. Building on the theory of the Stress function in MDS, we propose a method for choosing the optimal truncation dimensionality: the threshold on the distortion measure is estimated from the inflection point of the distortion curve.

4. Proof that the compound method introduces less distortion than down-sampling. The traditional way to handle extremely high-dimensional data is to down-sample it to a lower dimensionality. We prove that the distortion produced by the compound method is smaller than that produced by down-sampling.

The experiments in this thesis show that the compound method reduces the computational complexity of the dimensionality reduction process while changing the original data structure very little; moreover, the higher the dimensionality, the greater the gain.
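To make aspects 2 and 3 concrete, the following Python sketch (not the thesis code; synthetic data and standard NumPy/SciPy calls only) applies an orthonormal DCT to each sample, keeps only the leading coefficients, and evaluates a Stress-style distortion measure for several candidate truncation dimensionalities.

```python
# Minimal sketch of the compound preprocessing described above (assumed, not
# the thesis implementation): row-wise orthonormal DCT, truncation to the
# leading k coefficients, and a Stress-style distortion curve over k.
import numpy as np
from scipy.fft import dct
from scipy.spatial.distance import pdist


def dct_truncate(X, k):
    """Row-wise orthonormal DCT, keeping the first k coefficients."""
    return dct(X, norm="ortho", axis=1)[:, :k]


def stress(X, Y):
    """Kruskal-style Stress between original and reduced pairwise distances."""
    d_orig, d_new = pdist(X), pdist(Y)
    return np.sqrt(np.sum((d_orig - d_new) ** 2) / np.sum(d_orig ** 2))


rng = np.random.default_rng(0)
# Toy "very high-dimensional" data: random walks, whose energy the DCT
# concentrates in the low-frequency (leading) coefficients.
X = np.cumsum(rng.standard_normal((50, 4096)), axis=1)

# Distortion curve over candidate truncation dimensionalities; the point
# where it flattens out would serve as the intercepted dimensionality.
for k in (8, 32, 128, 512, 2048):
    print(f"k = {k:4d}   stress = {stress(X, dct_truncate(X, k)):.4f}")
```

On data whose energy sits in the low frequencies, the printed curve drops quickly and then flattens; the knee of that curve plays the role of the inflection-based threshold described in aspect 3, and the truncated output would then be fed to an ordinary dimensionality reduction method.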
Keywords/Search Tags: compound dimensionality reduction, high-dimensional data, single dimensionality reduction method, DCT