
Research On Model And Algorithm Of Robust Support Tensor Machine

Posted on: 2014-01-04
Degree: Master
Type: Thesis
Country: China
Candidate: L J Tan
Full Text: PDF
GTID: 2268330401458879
Subject: Computational Mathematics
Abstract/Summary:
With the development of computer and Internet technology, every area of society has accumulated large amounts of data. Useful information extracted through data analysis and data mining can better serve social production and everyday life. Most traditional data analysis and data mining methods are based on vector spaces, so complex data structures such as images and videos, which are naturally tensor data, are often converted directly into vectors before analysis. This direct conversion has potential shortcomings: first, the original data structure and its correlation information may be destroyed; second, it may suffer from the curse of dimensionality and the small-sample-size problem.

Machine learning is an important area of data mining. As one of the most important classification methods in machine learning, the support vector machine performs well on small-sample, nonlinear, and high-dimensional problems and avoids the local-minima problem, so it has been widely used. If support vector machine theory and methods can be extended to tensor learning, this will greatly improve the ability to handle multi-modal tensor data in pattern recognition and image classification.

Based on the idea of truncating the least squares loss function, this thesis proposes a robust support tensor machine (RSTM) model that can handle tensor data containing noise and outliers. The model is an extension of our robust least squares support vector machine (RLS-SVM). Because a vector is a first-order tensor, the RLS-SVM can be unified into the RSTM. The thesis gives a detailed account of how the RLS-SVM model is formulated and extended to tensor space, ultimately yielding a unified RSTM model.

Because the RSTM model introduces a truncation parameter, the resulting optimization problem becomes non-smooth and non-convex, so it cannot be solved by conventional methods. We put forward a feasible algorithm: first, the loss function is smoothed with a smoothing technique; second, using the concave-convex procedure (CCCP), solving the concave-convex optimization problem is transformed into iteratively solving a series of convex optimization problems; finally, each convex subproblem is solved by Newton's iterative method.

We verified the robustness of the RLS-SVM with vector regression and classification experiments. Finally, we carried out a second-order tensor experiment, whose results show that the RSTM outperforms the RLS-SVM in both effectiveness and efficiency; before running the RLS-SVM model, the tensor data had to be converted into vector space. We also verified the robustness of the RSTM by comparing its results with those of the LS-STM.
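To illustrate the algorithmic idea described above (truncated least squares loss, smoothing, and a CCCP outer loop over convex subproblems), the following is a minimal sketch, not the thesis' RSTM/RLS-SVM formulation: a plain ridge-regression stand-in is used for the model, the truncation level t, smoothing width mu, regularization lam, ridge warm start, and closed-form subproblem solve are all illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the thesis' formulation):
# minimize sum_i min(r_i**2, t) + lam*||w||**2 with a CCCP-style outer loop.
import numpy as np


def truncated_sq_loss(r, t):
    """min(r**2, t): quadratic near zero, flat beyond sqrt(t), hence robust."""
    return np.minimum(r ** 2, t)


def cccp_fit(X, y, t=1.0, mu=0.05, lam=1e-2, outer=30):
    """Robust ridge fit with residuals r_i = y_i - X_i w.

    DC split: min(r**2, t) = r**2 - max(r**2 - t, 0).  Each outer CCCP step
    linearizes the (smoothed) concave part -max(r**2 - t, 0) at the current w,
    leaving a convex ridge-type subproblem solved here in closed form.
    """
    n, d = X.shape
    I = np.eye(d)
    # Warm start from ordinary ridge regression.
    w = np.linalg.solve(X.T @ X + lam * I, X.T @ y)
    for _ in range(outer):
        r = y - X @ w
        # Smoothed indicator of "residual beyond the truncation level":
        # ~1 for outliers, ~0 for inliers (sigmoid smoothing of the step).
        sig = 1.0 / (1.0 + np.exp(-(r ** 2 - t) / mu))
        # CCCP step: with the concave part linearized at the current w, the
        # subproblem is ridge regression on outlier-corrected targets.
        w = np.linalg.solve(X.T @ X + lam * I, X.T @ (y - sig * r))
    return w


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    w_true = np.array([1.0, -2.0, 0.5])
    y = X @ w_true + 0.1 * rng.normal(size=200)
    y[:10] += 20.0                        # inject gross outliers
    print("robust fit:", cccp_fit(X, y))  # stays close to w_true
```

In the thesis the convex subproblems are solved with Newton's iterative method and the model lives in tensor space (RSTM) rather than being a ridge regression; the closed-form solve above only stands in for that inner solver to keep the sketch short.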
Keywords/Search Tags: Robust Support Tensor Machine, Robust Least Squares Support Vector Machine, Truncation of Loss Function, Concave-Convex Procedure