
Principal And Minor Component Analysis And Stability Analysis

Posted on: 2005-01-04    Degree: Master    Type: Thesis
Country: China    Candidate: C F Zhang    Full Text: PDF
GTID: 2208360125964297    Subject: Operational Research and Cybernetics
Abstract/Summary:
Our goal is to find conditions that guarantee the stability of neural networks. These conditions are weaker than earlier ones and make network design easier. In general, they allow us to verify whether a network system is stable; in other words, we can build stable systems on the basis of these stability criteria.

We mainly use mathematical tools to study the network models, for example the variation-of-constants formula, inequality techniques, the Gronwall-Bellman inequality, classical Liapunov functions, and Dini derivatives. The differential equation of the network model is solved by reducing it to a simple one-dimensional equation. The resulting solution of the neural network is represented in terms of the eigenvalues and eigenvectors of the weight matrix, and its asymptotic stability behavior is then analyzed.

Sometimes the differential equation is too complex to admit an analytic expression for the solution. In that case we use simple methods based on the equation itself to determine the asymptotic behavior and the stability of the trivial solution. We obtain stability conditions from the continuous dependence of the solution on the initial value and from the existence and uniqueness of solutions. Under these conditions, we do not need to solve the differential equation; the signs of the eigenvalues of the weight matrix suffice to verify the asymptotic behavior and the stability of the trivial solution.

For another transformation of the network model, we use local linear approximation. Because a nonlinear equation has the same local stability properties as its linearization, we only need to analyze the linear equation. After discussing the existence of a Liapunov function, we construct a suitable V function and differentiate it along the trajectories of the model. We show that the output converges to the eigenvector corresponding to the largest eigenvalue of the weight matrix, thereby realizing feature extraction. By changing the sign of the weight matrix, we show that the final output vector is the eigenvector corresponding to the smallest eigenvalue of the weight matrix.

In the final section, we analyze the stability of cellular neural networks and prove the existence of an equilibrium point. By means of Liapunov functions we obtain several sufficient conditions that guarantee the uniform asymptotic stability of the equilibrium point.
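The convergence claims above (output converging to the eigenvector of the largest eigenvalue, and to that of the smallest eigenvalue after a sign change) can be illustrated numerically. The following Python sketch is not the thesis's exact network model; it integrates a generic Rayleigh-quotient flow dw/dt = s(Cw - (w'Cw)w) on the unit sphere, where the symmetric test matrix, step size, and iteration count are illustrative assumptions.

    # Minimal sketch (assumed generic flow, not the thesis's model):
    # sign = +1 drives w toward the eigenvector of the largest eigenvalue of C,
    # sign = -1 toward the eigenvector of the smallest eigenvalue.
    import numpy as np

    def component_flow(C, sign=+1.0, steps=20000, dt=1e-3, seed=0):
        """Euler-integrate dw/dt = sign * (C w - (w^T C w) w) from a random start."""
        rng = np.random.default_rng(seed)
        w = rng.standard_normal(C.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(steps):
            w = w + dt * sign * (C @ w - (w @ C @ w) * w)
            w /= np.linalg.norm(w)   # re-normalize to stay on the unit sphere
        return w

    if __name__ == "__main__":
        # Symmetric positive-definite test matrix standing in for the weight matrix.
        A = np.array([[4.0, 1.0, 0.0],
                      [1.0, 3.0, 1.0],
                      [0.0, 1.0, 2.0]])
        w_max = component_flow(A, sign=+1.0)   # principal component direction
        w_min = component_flow(A, sign=-1.0)   # minor component direction
        vals, vecs = np.linalg.eigh(A)         # eigenvalues in ascending order
        print("cosine with largest-eigenvalue eigenvector :", abs(w_max @ vecs[:, -1]))
        print("cosine with smallest-eigenvalue eigenvector:", abs(w_min @ vecs[:, 0]))

Both printed cosines approach 1, showing that flipping the sign of the flow switches the limit from the principal to the minor eigenvector direction, in the spirit of the PCA/MCA result described above.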
Keywords/Search Tags: neural networks, stability, PCA and MCA, Liapunov function