The Support Vector Machine (SVM) is a general learning approach based on statistical learning theory. Owing to its excellent generalization capability, it has found practical application in many areas such as pattern recognition, regression and prediction, and density estimation. When applied to regression and prediction, SVM is often called the support vector regression machine (SVR). In general, the sample data in regression analysis often contain noise; therefore, how to determine the optimal parameters so that SVR becomes as robust as possible is an important subject worth studying. The main aim of this dissertation is to study the theoretical relationships between SVR's parameters and Laplacian and uniform noisy inputs, respectively.

Firstly, this dissertation addresses the robustness of Huber-SVR. Focusing on the parameter-choice problem of Huber-SVR with Laplacian and uniform noisy inputs, respectively, and working within the Bayesian framework, we derive the first relationship: for the best robustness, the parameter μ in Huber-SVR maintains an approximately linear relationship with the standard deviation σ of the Laplacian and uniform input noise.

Secondly, the robustness of r-SVR is studied. Focusing on the parameter-choice problem of r-SVR with Laplacian and uniform noisy inputs, respectively, and again based on the Bayesian framework, we derive the second relationship: for the best robustness, the parameter r in norm-r SVR maintains an approximately inversely linear relationship with the standard deviation σ of the Laplacian and uniform input noise. Our experimental results confirm both of the above claims.

Finally, Huber-SVR and r-SVR are applied to regression on real stock-market data, respectively, and the results confirm the above two conclusions.
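The kind of parameter-selection experiment described above can be sketched numerically. The following is a minimal illustration, not the dissertation's method: a plain Huber-loss linear regression fitted by gradient descent stands in for the full Huber-SVR, and every function name, constant, and grid value below is an assumption chosen for demonstration. It selects the Huber threshold μ by validating against noise-free targets, for training data corrupted by Laplacian noise of a given standard deviation σ.

```python
import numpy as np

# Illustrative sketch only: Huber-loss linear regression by gradient descent
# stands in for Huber-SVR; all constants are assumptions for demonstration.

rng = np.random.default_rng(0)

def huber_grad(r, mu):
    """Derivative of the Huber loss w.r.t. the residual r:
    linear (r) for |r| <= mu, clipped to mu*sign(r) outside."""
    return np.where(np.abs(r) <= mu, r, mu * np.sign(r))

def fit_huber(x, y, mu, lr=0.01, steps=2000):
    """Fit y ~ w*x + b by minimizing the mean Huber loss with threshold mu."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        r = w * x + b - y
        g = huber_grad(r, mu)
        w -= lr * np.mean(g * x)
        b -= lr * np.mean(g)
    return w, b

def best_mu(sigma, mus, n=400):
    """Pick the mu from the grid `mus` minimizing error on noise-free test
    targets, with training noise ~ Laplace of standard deviation sigma
    (Laplace scale parameter = sigma / sqrt(2))."""
    x = rng.uniform(-2.0, 2.0, n)
    y = 1.5 * x + 0.5 + rng.laplace(0.0, sigma / np.sqrt(2.0), n)
    xt = rng.uniform(-2.0, 2.0, n)
    yt = 1.5 * xt + 0.5
    errs = [np.mean((w * xt + b - yt) ** 2)
            for w, b in (fit_huber(x, y, mu) for mu in mus)]
    return float(mus[int(np.argmin(errs))])

if __name__ == "__main__":
    grid = np.linspace(0.2, 3.0, 8)
    for sigma in (0.5, 1.0, 2.0):
        print(f"sigma = {sigma:.1f} -> selected mu = {best_mu(sigma, grid):.2f}")
```

Under the dissertation's first claim, one would expect the selected μ to grow roughly linearly with σ in such an experiment, though a toy run of this size only suggests, rather than establishes, the trend.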