
Relationship Between Persistent Excitation Condition and RBF Network Structures with Application to Performance Analysis of Deterministic Learning

Posted on: 2018-01-07
Degree: Master
Type: Thesis
Country: China
Candidate: T J Zheng
GTID: 2348330536978565
Subject: Control theory and control engineering
Abstract:
Based on the notion of persistent excitation (PE), a deterministic learning theory has recently been proposed for Radial Basis Function (RBF) network-based identification of nonlinear systems. In this thesis, we study the relationship between the PE levels (the level of excitation and the upper bound of excitation), the structure of the RBF network, and the performance of deterministic learning. Specifically, given a state trajectory generated by a nonlinear dynamical system, we investigate how to construct the RBF network so as to guarantee sufficient PE levels (especially a sufficient level of excitation) for deterministic learning.

First, we derive explicit formulas relating the PE levels to the RBF network structure, and prove that both PE levels increase with the separation distance of the neural centers, though at different rates: the upper bound of excitation always grows in direct proportion to the separation distance, whereas the growth rate of the level of excitation depends on the chosen excitation distance. Consequently, increasing the density of the neural centers to improve the approximation capability lowers the level of excitation. Moreover, the level of excitation decreases with the excitation distance, so activating more neural centers near the system trajectory to improve the approximation capability lowers the level of excitation as well.

Second, as a practical illustration, the results on PE levels are applied to analyze the convergence properties of deterministic learning. By combining the derived formulas with known expressions for deterministic learning performance, we prove that the convergence rate of deterministic learning decreases with the density of the neural centers, and that there always exists an optimal convergence rate corresponding to the separation distance, which can be attained by choosing the design parameters according to that distance. Regarding convergence accuracy, we prove that a finite, definite number of centers placed along the system trajectory achieves the same performance as globally distributed centers. In addition, when adjusting the separation distance (equivalently, the density of the neural centers), a trade-off arises between a relatively high level of excitation and good approximation capability of the RBF network, so reducing the separation distance (increasing the center density) does not always yield better convergence accuracy.

Finally, simulation studies on Rössler systems and on the Mansoux model, which is based on the Beihang low-speed axial compressor platform, illustrate these conclusions.
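For reference, the PE condition underlying these results can be stated in its standard form; the two constants below are the PE levels discussed above (the notation here is generic and may differ slightly from the thesis):

$$ \alpha_1 I_N \;\le\; \int_{t_0}^{t_0+T} S(\tau)\, S(\tau)^{\mathsf T}\, d\tau \;\le\; \alpha_2 I_N \qquad \text{for all } t_0 \ge 0, $$

where $S(t)\in\mathbb{R}^N$ is the RBF regressor vector evaluated along the trajectory, $T>0$ is the excitation window, $\alpha_1$ is the level of excitation, and $\alpha_2$ is the upper bound of excitation.

The dependence of the PE levels on the network structure can also be probed numerically. The sketch below is a minimal, hypothetical illustration (not the thesis's code): it evaluates a Gaussian RBF regressor along a closed trajectory and estimates $\alpha_1$ and $\alpha_2$ as the extreme eigenvalues of the integrated Gram matrix, for several separation distances of a lattice of centers. The trajectory, lattice extent, and the choice of tying the RBF width to the separation distance are all assumptions made for the example.

```python
import numpy as np

def gaussian_rbf(x, centers, width):
    """Gaussian RBF regressor vector S(x) for a given set of centers."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / width**2)

def pe_levels(traj, centers, width, dt):
    """Estimate (alpha1, alpha2) as the extreme eigenvalues of the
    integrated Gram matrix  int S(x(t)) S(x(t))^T dt  over one pass
    of the trajectory."""
    gram = np.zeros((len(centers), len(centers)))
    for x in traj:
        s = gaussian_rbf(x, centers, width)
        gram += np.outer(s, s) * dt
    eigs = np.linalg.eigvalsh(gram)  # eigenvalues in ascending order
    return eigs[0], eigs[-1]

# Hypothetical setup: unit-circle trajectory, lattice of centers whose
# separation distance (and hence density) is varied.
dt = 0.01
t = np.arange(0.0, 2.0 * np.pi, dt)
traj = np.stack([np.cos(t), np.sin(t)], axis=1)

for sep in (1.0, 0.5, 0.25):  # separation distance of the neural centers
    grid = np.arange(-1.5, 1.5 + 1e-9, sep)
    centers = np.array([[cx, cy] for cx in grid for cy in grid])
    a1, a2 = pe_levels(traj, centers, width=sep, dt=dt)  # width tied to sep
    print(f"sep={sep:.2f}  N={len(centers):4d}  "
          f"alpha1={a1:.3e}  alpha2={a2:.3e}")
```

Printing the two extreme eigenvalues for each separation distance lets one compare how the level of excitation and the upper bound of excitation change as the center lattice is densified, mirroring the kind of relationship the thesis analyzes.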
Keywords/Search Tags: neural network identification, persistent excitation (PE) condition, neural network structure, level of excitation, deterministic learning