
Eugenic Evolutionary Optimization And Statistical Learning Modeling

Posted on: 2004-11-28    Degree: Doctor    Type: Dissertation
Country: China    Candidate: X F Song    Full Text: PDF
GTID: 1118360122971415    Subject: Control theory and control engineering
Abstract/Summary:
In many fields, such as chemical engineering and biomedical engineering, the internal mechanisms of the objects under study can hardly be identified. However, observation data, which reflect how the values of the dependent variables change with the corresponding values of the independent variables, can be obtained through experiments and then used to model the object, that is, to describe the relationships between the dependent and independent variables. Such a model serves as another way of representing the internal mechanism. Modeling by learning from observation data is therefore one of the basic and important tasks facing researchers. Modeling abstracts the qualitative relationships among the variables involved and is an important means of recognizing and describing the behavior of the object under study. Modeling methods draw on many techniques, such as optimization, statistics, artificial intelligence, pattern recognition, machine learning, and neural networks. The main contributions of this dissertation are as follows:

(1) A eugenic evolution strategy was proposed to improve the search efficiency of the conventional simple genetic algorithm (SGA). The eugenic evolution genetic algorithm (EGA) collects population information along the evolution of successive generations and constructs a deterministic optimization algorithm, which is embedded in the evolution process at an appropriate stage to speed up the local search. Among the candidate deterministic search methods, the Powell method was found to be feasible to integrate with the genetic algorithm. In addition, an adaptive variation factor was proposed to maintain population diversity, and a novel crossover rule was introduced to widen the distribution space of the offspring; both measures effectively improved the SGA. Two typical examples indicated the good performance of the proposed method. Finally, the EGA was successfully applied to the nonlinear parameter estimation of a mathematical model of Heavy Oil Thermal Cracking.

(2) The performance of the support vector machine (SVM) for classification was analyzed. SVM classification was found to be more sensitive to noise than other methods; in particular, the kernel function and its parameter, together with the penalty factor C, are the main factors affecting the classification performance of SVM. Correlative Component Analysis (CCA) was used to eliminate multicollinearity and noise in the original sample data before classification with SVM. To improve the classification performance of SVM and obtain the optimal discriminant function, the EGA proposed in this work was used to optimize the SVM parameters, including the correlative components (CCs), the penalty factor C, and the kernel width factor. Finally, a typical two-class example, the classification of two kinds of natural spearmint essence, was employed to verify the effectiveness of the proposed CCA-SVM approach. The resulting classification accuracy was much better than that obtained by SVM alone or by Correlative Component Analysis - Self-Organizing Map (CCA-SOM) networks.
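To make the parameter-search idea behind CCA-SVM concrete, the following is a minimal Python sketch, not the dissertation's EGA or CCA implementation: it uses PCA as a simple stand-in for correlative component analysis and a small real-coded genetic algorithm to search the penalty factor C and kernel width gamma by cross-validated accuracy. The synthetic dataset, parameter ranges, population size, and crossover/mutation settings are all illustrative assumptions.

```python
# Illustrative sketch only: a tiny real-coded GA tuning SVM hyperparameters
# after a PCA pre-projection (stand-in for correlative component analysis).
# Dataset, search ranges, and GA settings are assumptions, not values from
# the dissertation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           random_state=0)

def fitness(log_C, log_gamma, n_components=5):
    """Cross-validated accuracy of PCA + RBF-SVM for one parameter setting."""
    model = make_pipeline(StandardScaler(),
                          PCA(n_components=n_components),
                          SVC(C=10.0 ** log_C, gamma=10.0 ** log_gamma))
    return cross_val_score(model, X, y, cv=5).mean()

# Real-coded GA over (log10 C, log10 gamma) in [-2, 3] x [-4, 1].
bounds = np.array([[-2.0, 3.0], [-4.0, 1.0]])
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(16, 2))
for generation in range(10):
    scores = np.array([fitness(*ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:8]]    # keep the best half
    children = []
    while len(children) < 8:
        a, b = parents[rng.integers(8, size=2)]
        w = rng.uniform(-0.25, 1.25)               # blend crossover widens spread
        child = w * a + (1.0 - w) * b
        child += rng.normal(0.0, 0.1, size=2)      # small mutation for diversity
        children.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
    pop = np.vstack([parents, np.array(children)])

best = pop[np.argmax([fitness(*ind) for ind in pop])]
print("best log10(C), log10(gamma):", best)
```

The blend crossover deliberately samples outside the segment between the two parents, which is one simple way to widen the distribution of offspring in the spirit of the crossover modification described above.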
(3) The performance of the support vector machine (SVM) for regression estimation was studied. The insensitive factor ε, the penalty factor C, and the kernel function together with its parameter were found to be the main factors affecting the performance of SVM regression estimation, and the method for determining these parameters remains a critical unsolved problem. Cross-validation is commonly used in practice to determine the SVM parameters, but it is usually expensive in computing time. A novel adaptive support vector machine (A-SVM) was therefore proposed to determine the optimal parameters adaptively, and algorithms for adaptively tuning the SVM parameters were worked out. The A-SVM was successfully applied to model the Delayed Coking process. Compared with the Radial Basis Function Neural Network - Partial Least Squares Regression (RBFN-PLSR) method, A-SVM was superior in both fitting accuracy and prediction performance. The proposed al...
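The abstract does not describe the A-SVM algorithm itself, so the following is only a minimal sketch of the underlying idea: treating the SVR parameters C, gamma, and ε as continuous variables and tuning them by direct search against held-out error instead of an exhaustive cross-validation grid. The Powell method is used here purely to echo the deterministic local-search idea in contribution (1); the synthetic dataset, starting point, and search ranges are assumptions.

```python
# Illustrative sketch only: tuning SVR hyperparameters (C, gamma, epsilon)
# by direct Powell search on validation error, as a cheap alternative to an
# exhaustive cross-validation grid. This is NOT the dissertation's A-SVM
# algorithm; dataset, ranges, and starting point are assumptions.
import numpy as np
from scipy.optimize import minimize
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(300, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, size=300)

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)
scaler = StandardScaler().fit(X_train)
X_train, X_val = scaler.transform(X_train), scaler.transform(X_val)

def val_error(theta):
    """Validation MSE as a function of log10-scaled SVR parameters."""
    log_C, log_gamma, log_eps = theta
    model = SVR(C=10.0 ** log_C, gamma=10.0 ** log_gamma,
                epsilon=10.0 ** log_eps)
    model.fit(X_train, y_train)
    return np.mean((model.predict(X_val) - y_val) ** 2)

# Start from generic defaults (C=1, gamma=1, epsilon=0.1) and let Powell refine.
theta0 = np.array([0.0, 0.0, -1.0])
result = minimize(val_error, theta0, method="Powell",
                  bounds=[(-2, 3), (-3, 2), (-3, 0)])
log_C, log_gamma, log_eps = result.x
print(f"C={10**log_C:.3g}, gamma={10**log_gamma:.3g}, epsilon={10**log_eps:.3g}")
print("validation MSE:", result.fun)
```

Each evaluation fits a single SVR model, so the cost scales with the number of search steps rather than with the size of a full parameter grid times the number of cross-validation folds.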
Keywords/Search Tags: modeling, eugenic evolution optimization, genetic algorithms, statistical learning, support vector machine, regression, pattern classification, adjustable structure, adaptively tuned parameters