
On The Precision And Sparsification Of Nonlinear Kernel Filtering Algorithms

Posted on: 2019-05-10    Degree: Master    Type: Thesis
Country: China    Candidate: L J Dang    Full Text: PDF
GTID: 2428330566980096    Subject: Signal and Information Processing
Abstract/Summary:
Kernel adaptive filters (KAFs) implement nonlinear filtering through function approximation with kernel methods. In a kernel method, inner products in the input space are evaluated efficiently as kernel functions in a reproducing kernel Hilbert space (RKHS). Recent work develops KAFs along three lines: the optimization criterion, the filter structure, and the control of network size. The optimization criteria mainly include the minimum mean square error (MMSE) and the maximum correntropy criterion (MCC): MMSE-based methods are optimal under Gaussian noise, while MCC-based methods combat large outliers under non-Gaussian noise. Filter structures are mainly feedforward or feedback; a feedback structure injects historical information into the traditional KAF to improve filtering accuracy. However, the network of a KAF grows linearly with the length of the training data, which incurs large computational and storage costs. Sample sparsification and structure sparsification are used to curb this growth. Sample sparsification applies a threshold criterion to limit a dynamically sized network, trading some filtering accuracy for lower space and time complexity. Structure sparsification instead approximates the kernel function with a fixed-size network, likewise relieving the growth of space and time complexity. Based on these three aspects, this thesis proposes two methods for improving filtering accuracy and two methods for controlling network size.

(1) First method for improving accuracy. A novel feedback structure based on multiple single-delay outputs is introduced into kernel recursive maximum correntropy (KRMC) to obtain higher accuracy and robustness, yielding KRMC with multiple feedback (KRMC-MF). To further reduce time complexity, a simplified feedback structure based on one single-delay output is also presented, constructing linear recurrent kernel online learning under the maximum correntropy criterion (LRKOL-MCC). KRMC-MF and LRKOL-MCC exploit historical information and the MCC to improve filtering accuracy and robustness.

(2) Second method for improving accuracy. Since the design of feedback is nontrivial and time-consuming, a kernel online learning algorithm with scale adaptation (KOL-SA) is proposed to reduce computational complexity and improve filtering accuracy simultaneously. In KOL-SA, the direction vector and the scale of the coefficient vector are updated by gradient descent. Compared with feedback-based KAFs, KOL-SA achieves desirable filtering accuracy at lower computational complexity.

(3) First method for curbing the growth of network size. For traditional KAFs with a growing network, random Fourier filters (RFFs) control the network structure by mapping the input into a random Fourier feature space, where the network output is computed from the transformed input with a fixed-size weight vector. Applying the MCC to RFFs yields a novel robust random Fourier filter under maximum correntropy (RFFMC). To further improve the convergence rate and the robustness to large outliers, a class of batch MCC-based algorithms is proposed by choosing fixed-size input vectors, yielding the random-batch random Fourier filter under maximum correntropy (RB-RFFMC). RFFMC and RB-RFFMC improve the robustness of nonlinear filters within a fixed-size network structure.

(4) Second method for curbing the growth of network size. Although RFFs achieve excellent filtering performance, the random Fourier features are data independent; when the sample data are strongly correlated, RFF performance degrades. The Nyström method, which randomly selects columns of the kernel matrix to approximate that matrix, offers good approximation and generalization ability. To combat large outliers efficiently, kernel recursive maximum correntropy with Nyström approximation (KRMC-NA) is proposed, achieving desirable filtering performance with a fixed and efficient filter structure.
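To illustrate the linearly growing network that motivates the sparsification methods above, here is a minimal sketch of kernel least mean squares (KLMS), a standard KAF; the step size, kernel width, and toy data are arbitrary choices for illustration, not the thesis's settings:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel: an inner product evaluated in the RKHS."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def klms_predict(centers, alphas, x, sigma=1.0):
    """Filter output as a kernel expansion over all stored centers."""
    return sum(a * gaussian_kernel(c, x, sigma) for c, a in zip(centers, alphas))

def klms_train(X, d, eta=0.5, sigma=1.0):
    """Kernel LMS: every training sample adds one kernel unit, so the
    network grows linearly with the data -- the growth that sample and
    structure sparsification aim to curb."""
    centers, alphas = [], []
    for x, target in zip(X, d):
        err = target - klms_predict(centers, alphas, x, sigma)
        centers.append(x)
        alphas.append(eta * err)  # coefficient of the newly added unit
    return centers, alphas

# Toy example: learn y = sin(x) from 200 samples
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
d = np.sin(X[:, 0])
centers, alphas = klms_train(X, d)
print(len(centers))  # network size equals the number of samples
```

Sample sparsification would insert a novelty test before `centers.append(x)`; structure sparsification would replace the growing expansion with a fixed-size approximation such as random Fourier features.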
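The robustness of the MCC over the MMSE can be seen directly in the gradient of each cost with respect to the error; this small comparison (kernel width chosen arbitrarily) shows why one outlier dominates an MMSE update but barely moves an MCC update:

```python
import numpy as np

def mmse_grad(e):
    """MMSE: the update grows linearly with the error, so a single
    large outlier can dominate the adaptation."""
    return e

def mcc_grad(e, sigma=1.0):
    """MCC: the Gaussian factor exp(-e^2 / 2 sigma^2) shrinks the
    update for large |e|, which is what makes correntropy robust
    to non-Gaussian outliers."""
    return np.exp(-e ** 2 / (2 * sigma ** 2)) * e

small, outlier = 0.1, 10.0
print(mmse_grad(outlier) / mmse_grad(small))  # outlier has 100x influence
print(mcc_grad(outlier) / mcc_grad(small))    # outlier is almost ignored
```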
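The fixed-size random Fourier feature map used by RFF-type filters can be sketched as follows; the feature dimension and kernel width are illustrative, and the map shown is the standard cosine construction for the Gaussian kernel, not the thesis's specific filter:

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_map(X, W, b):
    """z(x) = sqrt(2/D) * cos(W x + b): a D-dimensional feature map
    whose inner products approximate the Gaussian kernel, giving a
    filter with a fixed-size weight vector."""
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

d_in, D, sigma = 3, 4000, 1.0
W = rng.normal(0.0, 1.0 / sigma, size=(D, d_in))  # samples from the kernel's spectral density
b = rng.uniform(0.0, 2 * np.pi, size=D)

x, y = rng.normal(size=d_in), rng.normal(size=d_in)
exact = np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))
z1 = rff_map(x[None, :], W, b)[0]
z2 = rff_map(y[None, :], W, b)[0]
approx = z1 @ z2
print(exact, approx)  # close for large D
```

Because `W` and `b` are drawn independently of the data, the features are data independent, which is the weakness the Nyström-based method (4) addresses.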
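The column-sampling idea behind the Nyström approximation can be sketched as below; the data, landmark count, and kernel width are illustrative assumptions, and only the generic approximation is shown, not KRMC-NA itself:

```python
import numpy as np

def gaussian_gram(A, B, sigma=1.0):
    """Gaussian kernel matrix between row-sample sets A and B."""
    sq = np.sum(A ** 2, 1)[:, None] + np.sum(B ** 2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma ** 2))

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
m = 50                                    # number of randomly chosen columns (landmarks)
idx = rng.choice(len(X), size=m, replace=False)

C = gaussian_gram(X, X[idx])              # n x m block of the kernel matrix
W = C[idx]                                # m x m block among the landmarks
K_approx = C @ np.linalg.pinv(W) @ C.T    # rank-m Nystrom approximation
K_exact = gaussian_gram(X, X)

rel_err = np.linalg.norm(K_exact - K_approx) / np.linalg.norm(K_exact)
print(rel_err)  # small relative error at a fraction of the cost
```

Unlike random Fourier features, the sampled columns come from the data itself, which is why the Nyström route adapts better to correlated samples.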
Keywords/Search Tags: Maximum correntropy criterion, feedback, scale adaptation, random Fourier feature, Nyström approximation