
The Convergence Of Ojam And A Modified Oja-Xu Learning Algorithm

Posted on: 2013-10-04
Degree: Master
Type: Thesis
Country: China
Candidate: B Wang
Full Text: PDF
GTID: 2248330371497172
Subject: Computational Mathematics
Abstract/Summary:
Minor component analysis (MCA) is an important statistical method with wide applications in curve fitting, computer graphics, and total least squares. The data have the smallest variance in the direction of the minor component. From the given input signal data one forms the associated (covariance) matrix, and MCA extracts the minor component by computing the eigenvector corresponding to the smallest eigenvalue of that matrix. Because practical problems often involve large data spaces, extracting the minor component of high-dimensional data is a challenge. Artificial neural networks offer high error tolerance, self-adaptability, and parallel processing capability, and are therefore well suited to minor component extraction from high-dimensional signals.

Oja first proposed an MCA learning algorithm; in view of its importance in various fields, many MCA neural network learning algorithms have since been proposed to extract the minor component better. MCA neural networks adopt a linear neuron: the input signal is the network's input, and the output is the weighted sum of the input with the weight vector. By adjusting the weights, the weight vector converges, to the given precision, to the direction of the eigenvector associated with the smallest eigenvalue of the associated matrix. According to existing results, the deterministic discrete time (DDT) method preserves the discrete behavior of the algorithm, and the learning rate can be a positive constant. The Oja and Oja-Xu MCA algorithms adopt this kind of method, and their convergence conditions have been obtained and proved.

This thesis studies the Ojam MCA learning algorithm and the Oja-Xu MCA learning algorithm. For the former, the literature [29] gave a sufficient condition for convergence: η < 0.5/λ1. This thesis proposes another sufficient condition for convergence: η ≤ 2/(λ1 + λ(?)).
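The DDT-style iteration described above can be sketched as follows. This is an illustrative Python example, not the thesis's exact algorithm: the data, dimensions, and learning rate are assumptions, and the weight vector is renormalized every step in the spirit of the normalized variants discussed later.

```python
import numpy as np

# Hypothetical data: a 5-D signal with distinct variances per direction,
# so the minor component (smallest-eigenvalue direction) is well separated.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5)) @ np.diag([3.0, 2.5, 2.0, 1.5, 0.5])
C = X.T @ X / len(X)          # associated (sample covariance) matrix

w = rng.normal(size=5)
w /= np.linalg.norm(w)
eta = 0.05                    # constant learning rate (DDT setting)
for _ in range(2000):
    # Sign-flipped Oja rule in DDT form; renormalizing each step
    # keeps ||w|| = 1 and drives w toward the minor direction.
    w = w - eta * (C @ w - (w @ C @ w) * w)
    w /= np.linalg.norm(w)

# The weight vector should align with the smallest-eigenvalue eigenvector.
vals, vecs = np.linalg.eigh(C)   # eigenvalues in ascending order
alignment = abs(w @ vecs[:, 0])  # close to 1 when converged
```

The eigendecomposition at the end is only a check; the point of the neural iteration is precisely to avoid computing it for high-dimensional data.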
For the latter, we improve the normalized Oja-Xu MCA algorithm by combining fixed-interval normalization with an additional momentum term, which improves both the convergence speed and the accuracy.

The thesis is organized as follows. Chapter 1 gives a brief introduction to artificial neural networks and MCA learning algorithms. Chapter 2 is concerned with the further study of the convergence of the Ojam MCA learning algorithm. Chapter 3 makes some improvements to the Oja-Xu MCA learning algorithm. Finally, a brief conclusion is given.
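The idea of a momentum term plus fixed-interval normalization can be sketched as below. This is a Python illustration of the general scheme under assumed parameter values (eta, alpha, and the interval T are not taken from the thesis), not the exact improved Oja-Xu algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 4)) @ np.diag([3.0, 2.0, 1.5, 0.5])
C = X.T @ X / len(X)           # associated (sample covariance) matrix

w = rng.normal(size=4)
w /= np.linalg.norm(w)
delta = np.zeros_like(w)
eta, alpha, T = 0.05, 0.5, 10  # assumed step size, momentum factor, interval
for k in range(3000):
    grad = C @ w - (w @ C @ w) * w
    delta = -eta * grad + alpha * delta  # momentum reuses the previous step
    w = w + delta
    if (k + 1) % T == 0:                 # fixed-interval normalization
        n = np.linalg.norm(w)
        w, delta = w / n, delta / n      # rescale both consistently

vals, vecs = np.linalg.eigh(C)
w /= np.linalg.norm(w)
alignment = abs(w @ vecs[:, 0])          # close to 1 when converged
```

Normalizing only every T steps, rather than every step, saves per-iteration cost, while the momentum term accelerates convergence along persistent descent directions.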
Keywords/Search Tags: Minor Component Analysis (MCA), Ojam, Oja-Xu, Neural network, Convergence, Learning rate, Monotonicity