
Research On Incremental Twin Support Vector Regression

Posted on: 2022-10-25    Degree: Master    Type: Thesis
Country: China    Candidate: J Cao    Full Text: PDF
GTID: 2518306527984329    Subject: Control Science and Engineering
Abstract/Summary:
Twin support vector regression (TSVR) is a machine learning algorithm for solving regression problems. Because TSVR only needs to solve a pair of smaller quadratic programming problems, it trains more efficiently than standard support vector regression, and it has therefore become a research hotspot in machine learning. However, most existing training algorithms are offline and cannot efficiently handle incremental learning. This thesis aims to improve the training efficiency of TSVR variants in incremental scenarios and to develop the corresponding incremental learning algorithms. The results are summarized as follows:

To address the problem that the constructed kernel matrix may not approximate the original kernel matrix well, an incremental reduced least squares twin support vector regression is proposed. First, to reduce the correlation among the column vectors of the kernel matrix, the algorithm uses a reduced method to screen support vectors from the samples and takes them as the columns of the kernel matrix. The resulting kernel matrix better approximates the original one, which ensures the sparsity of the solution. Then, the inverse matrix is updated incrementally via the block matrix inversion lemma, which further shortens the training time. The results show that the proposed algorithm obtains sparse solutions and that, compared with state-of-the-art algorithms, its generalization performance is closer to that of the offline algorithm.

To address the problem that Lagrangian ε-twin support vector regression (LETSVR) cannot efficiently update the model in incremental scenarios, an incremental Lagrangian ε-twin support vector regression (ILETSVR) based on the semismooth Newton method is proposed. By using matrix inversion lemmas to incrementally update the second-order gradient matrix, ILETSVR lowers the time complexity of the matrix inversion and expedites the training of the
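The incremental update of the inverse matrix described above can be sketched with the block matrix inversion lemma: when a symmetric matrix grows by one row and column (one new sample), its inverse can be refreshed from the previous inverse in O(n²) instead of recomputing it in O(n³). This is a minimal illustration of the lemma itself, not the thesis's exact update; the function name and interface are hypothetical.

```python
import numpy as np

def incremental_inverse(A_inv, b, d):
    """Update the inverse of a symmetric matrix that grows by one
    row/column, via the block matrix inversion lemma.

    A_inv : inverse of the existing n x n matrix A
    b     : new border column, shape (n,)
    d     : new diagonal entry (scalar)
    Returns the inverse of the bordered matrix [[A, b], [b.T, d]].
    """
    e = A_inv @ b                  # A^{-1} b
    s = d - b @ e                  # Schur complement (scalar)
    n = A_inv.shape[0]
    M_inv = np.empty((n + 1, n + 1))
    M_inv[:n, :n] = A_inv + np.outer(e, e) / s
    M_inv[:n, n] = -e / s
    M_inv[n, :n] = -e / s
    M_inv[n, n] = 1.0 / s
    return M_inv
```

The Schur complement `s` is nonzero whenever the bordered matrix is positive definite, which holds for the regularized kernel matrices used in least squares TSVR.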
model. However, when solving nonlinear regression problems, the training speed of ILETSVR drops dramatically as the kernel matrix grows. Therefore, an incremental reduced Lagrangian ε-twin support vector regression (IRLETSVR) is proposed. Building on ILETSVR, IRLETSVR introduces a reduced method to restrict the size of the inverse matrix, at the cost of slightly lower prediction accuracy. The results show that ILETSVR effectively addresses linear regression in incremental scenarios and achieves the same generalization performance as the offline algorithm, while IRLETSVR greatly accelerates the training of nonlinear regression models in incremental scenarios, obtains sparse solutions, and attains generalization performance close to that of the offline algorithm.

To address the problem that existing training algorithms for ε-twin support vector regression cannot efficiently handle incremental learning in the linear case, an accurate incremental ε-twin support vector regression (AIETSVR) is proposed. First, by computing the Lagrange multiplier of the new sample and adjusting the Lagrange multipliers of the boundary samples, the influence of the new sample's quadratic loss on the existing samples is minimized. Consequently, most existing samples still satisfy the Karush-Kuhn-Tucker (KKT) conditions, and a valid initial state is obtained. Then, any remaining exceptional Lagrange multipliers are adjusted step by step. Finally, the feasibility and finite convergence of AIETSVR are analyzed theoretically. The results show that AIETSVR obtains accurate solutions and has a clear advantage in shortening training time.
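The KKT bookkeeping underlying the accurate incremental scheme can be illustrated with a generic check for an ε-insensitive dual problem: after a new sample arrives, each existing sample is classified by whether its multiplier and residual still satisfy the optimality conditions, and only the violators need adjustment. The function name, the three-way classification, and the tolerance are illustrative assumptions, not the thesis's exact formulation.

```python
def kkt_status(f_x, y, alpha, C, eps, tol=1e-6):
    """Classify one sample by the KKT conditions of a box-constrained
    epsilon-insensitive dual (multiplier alpha in [0, C]).

    Returns 'interior'  (inside the eps-tube, alpha must be 0),
            'boundary'  (conditions satisfied with alpha > 0),
            'violating' (KKT broken; multiplier needs adjustment).
    """
    r = f_x - y                        # current residual
    if abs(alpha) < tol:               # non-support vector
        return 'interior' if abs(r) <= eps + tol else 'violating'
    if abs(alpha) < C - tol:           # unbounded support vector
        return 'boundary' if abs(abs(r) - eps) <= tol else 'violating'
    # bounded support vector (alpha = C): residual must leave the tube
    return 'boundary' if abs(r) >= eps - tol else 'violating'
```

In an incremental pass, samples reported as 'violating' are exactly those whose multipliers the algorithm must re-optimize to restore a valid state.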
Keywords/Search Tags: twin support vector regression, incremental learning, online learning, reduced method, semismooth Newton method