The Convergence Of EXIN And Feng’s MCA Learning Algorithm

Posted on: 2014-02-20
Degree: Master
Type: Thesis
Country: China
Candidate: L J You
Full Text: PDF
GTID: 2230330395999842
Subject: Computational Mathematics
Abstract/Summary:
Minor Component Analysis (MCA), a method of multivariate statistics, is applied in many fields, such as data analysis, graphics and image processing, and curve/surface fitting. The purpose of MCA is to find a direction in which the variance of the projected data is minimized. The traditional approach, which extracts minor components from the covariance (autocorrelation) matrix through eigenvalue decomposition or singular value decomposition, is computationally complex and causes many difficulties when applied in engineering. Artificial neural networks, by contrast, have strong self-organization, robustness, and parallel processing ability, and are therefore well suited to online extraction of components from high-dimensional signals.

Historically, Oja was the first to propose a neural network method for extracting the first principal component. Building on Oja's work, many MCA algorithms were proposed according to anti-Hebbian learning rules. Since an MCA algorithm is a stochastic discrete-time algorithm, an important question is whether it converges to the minor component. There are usually two research methods: Deterministic Continuous Time (DCT) and Deterministic Discrete Time (DDT). According to stochastic approximation theory, the learning rate must approach zero when the DCT method is used to analyze the convergence of MCA; however, this condition is difficult to satisfy in practice. The DDT method, in contrast, preserves the discrete-time nature of the algorithm and does not require the learning rate to approach zero.

In this paper we study the EXIN MCA and Feng's MCA neural network learning algorithms via their DDT systems. Reference [20] discussed the convergence of the EXIN MCA algorithm with a variable learning rate; here we generalize that work to the case of a constant learning rate. Reference [21] proposed that the convergence condition of Feng's MCA is η < min{0.5, λn/λ1} on the invariant set S0. We not only give a wider range, η < min{0.5, 2λn/(λ1−λn)}, for the selection of the learning rate, but also carry out further analysis of the invariant set.

This paper is organized as follows: the first chapter introduces the basic knowledge of neural networks and some learning algorithms associated with MCA neural networks; the second chapter gives the convergence condition of the EXIN MCA learning algorithm, together with the relevant theoretical proofs and numerical experiments; the third chapter analyzes the convergence condition of Feng's MCA learning algorithm and the invariant set.
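To make the eigen-decomposition view and the DDT iteration concrete, the sketch below (Python with NumPy) builds an autocorrelation matrix with a known spectrum, computes the minor component the traditional way via eigenvalue decomposition, and then runs a constant-learning-rate DDT iteration of an EXIN-style update (gradient descent on the Rayleigh quotient). The update rule, the learning rate eta = 0.05, the iteration count, and the test spectrum are illustrative assumptions for this sketch, not the exact algorithms or learning-rate bounds analyzed in this thesis.

    import numpy as np

    rng = np.random.default_rng(0)

    # Build a symmetric positive-definite "autocorrelation" matrix R with a known spectrum.
    n = 5
    eigvals = np.array([5.0, 4.0, 3.0, 2.0, 0.5])       # lambda_1 > ... > lambda_n
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    R = Q @ np.diag(eigvals) @ Q.T

    # Traditional route: eigenvalue decomposition; the minor component is the
    # eigenvector of the smallest eigenvalue (eigh sorts eigenvalues ascending).
    w_minor = np.linalg.eigh(R)[1][:, 0]

    # DDT iteration with a constant learning rate eta (an assumed, illustrative value).
    # Each step is gradient descent on the Rayleigh quotient r(w) = w'Rw / w'w,
    # which drives w toward the minor eigenvector direction.
    eta = 0.05
    w = rng.standard_normal(n)
    for _ in range(2000):
        nrm2 = w @ w
        r = (w @ R @ w) / nrm2                           # Rayleigh quotient
        w = w - eta / nrm2 * (R @ w - r * w)             # EXIN-style minor-component step

    # Compare directions (the sign of an eigenvector is arbitrary).
    w_hat = w / np.linalg.norm(w)
    print("alignment |cos angle|:", abs(w_hat @ w_minor))            # close to 1.0
    print("Rayleigh quotient:", (w @ R @ w) / (w @ w), "vs lambda_n =", eigvals[-1])

Running the sketch, the DDT iterate aligns with the eigenvector of the smallest eigenvalue and its Rayleigh quotient approaches λn, which is the qualitative behavior the convergence conditions in this thesis are meant to guarantee.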
Keywords/Search Tags: Minor Component Analysis (MCA), Artificial Neural Network, Deterministic Discrete Time (DDT), Eigenvalue, Eigenvector