
Research On The Optimization Methods Based On Particle Swarm Optimization

Posted on: 2010-05-24    Degree: Doctor    Type: Dissertation
Country: China    Candidate: F Q Yang    Full Text: PDF
GTID: 1118360272997310    Subject: Computer software and theory

Abstract/Summary:
Optimization is an important subdiscipline of mathematics. An optimization problem is the problem of finding the best solution among all feasible solutions. In mathematics, the term optimization refers to the study of problems in which one seeks to minimize or maximize a real function by systematically choosing the values of real or integer variables from within an allowed set, where the allowed set is often specified by constraints in the form of equalities or inequalities. Classical optimization methods include analytical methods and iterative methods. Analytical methods derive systems of equations and inequalities from the necessary optimality conditions, using mathematical techniques such as differential calculus, the calculus of variations and Lagrange's method of multipliers, and then solve these systems to obtain the solution. Iterative methods exploit properties of the objective function and approach the solution through successive refinements starting from an initial guess.

Classical methods require the optimization problem to satisfy strict mathematical properties. The optimization problems encountered in many application fields are usually very complicated: they may have more than one local optimum, and their objective functions may be nonconvex, nondifferentiable, or even impossible to express analytically. Classical optimization methods are not adequate for solving such problems.

Evolutionary computation, as a new intelligent optimization technique, is widely applied in many fields of science and engineering. By simulating natural phenomena, evolutionary computation is superior to classical optimization methods in global search ability, in the capacity to handle complicated problems, and in general applicability. Particle swarm optimization (PSO) is a swarm intelligence technique that simulates the behavior of bird flocking and forms a new branch of evolutionary computation. Since PSO was proposed, it has progressively attracted considerable attention from researchers in the optimization field owing to its fast convergence, simple principle, easy implementation, and small number of parameters to be tuned. This thesis therefore studies Hidden Markov Model (HMM) optimization problems, fuzzy clustering problems, K-Harmonic Means (KHM) clustering problems and the Flexible Job-shop Scheduling Problem (FJSP) on the basis of the PSO algorithm, and proposes several corresponding algorithms to solve these optimization problems. The main contributions and research achievements of this thesis are as follows.

1. Summarizing the research on the PSO algorithm. Firstly, the author describes the PSO procedure and analyzes the effect of its parameters on performance. Secondly, the author introduces population topologies, several PSO variants and the main theoretical results. Finally, the author systematically analyzes the open questions in PSO research.
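The abstract does not restate the PSO update rules it refers to. As a point of reference only, the sketch below shows the canonical global-best PSO with an inertia weight; the parameter values, the bound handling and the sphere test function are illustrative assumptions, not the settings analyzed in the thesis.

```python
import random

def pso(objective, dim, bounds, n_particles=30, iters=200,
        w=0.729, c1=1.49445, c2=1.49445):
    """Minimize `objective` with the canonical global-best PSO update."""
    lo, hi = bounds
    # Random initial positions, zero initial velocities.
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in x]                      # personal best positions
    pbest_val = [objective(p) for p in x]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive + social components.
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (gbest[d] - x[i][d]))
                # Position update, clamped to the search bounds.
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))
            val = objective(x[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = x[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = x[i][:], val
    return gbest, gbest_val

# Example: minimize the sphere function (an illustrative test problem).
best, best_val = pso(lambda p: sum(t * t for t in p), dim=5, bounds=(-10.0, 10.0))
```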
2. Studying HMM optimization based on the PSO algorithm. As a statistical model, the HMM has been widely used in speech signal processing and pattern recognition, and estimating its model parameters is one of the important problems in this area. The Baum-Welch (BW) algorithm is a popular estimation method because of its reliability and efficiency; however, it is easily trapped in local optima because it relies on a gradient-descent-style local search. To overcome this shortcoming of the BW algorithm, we propose a new optimization method, PSOBW, that combines the PSO algorithm with the BW algorithm. The sentence correct rate, the word correct rate and the word accuracy rate, the criteria used to evaluate speech recognition results in HTK Toolkit 3.3, are used to measure the performance of the algorithms. Experiments on data from the Census (AN4) database show that PSOBW not only overcomes the slow convergence of the PSO algorithm but also helps the BW algorithm escape from local optima. In addition, we compare PSOBW with GABW, an optimization method based on the genetic algorithm and the BW algorithm; the experimental results indicate that PSOBW is superior to GABW in recognition performance.

3. Studying the fuzzy clustering technique based on the PSO algorithm. One of the most widely used fuzzy clustering models is fuzzy c-means (FCM). FCM is in essence a local search technique that seeks the optimum by hill climbing and is therefore prone to converge to a local optimum. A hybrid data clustering algorithm called HPSOFCM, which combines the PSO algorithm with the FCM algorithm and makes full use of the merits of both, is proposed in this research. In addition, we compare HPSOFCM with HDEFCM, a fuzzy clustering method that combines the differential evolution algorithm with the FCM algorithm. The performances of HPSOFCM and HDEFCM are compared with that of FCM on two artificial data sets and four real data sets, according to the FCM objective function value, the Xie-Beni index and the runtime. Although FCM is the fastest, it obtains neither the best objective function values nor the best Xie-Beni indices. HDEFCM always obtains the best Xie-Beni indices and better objective function values, while HPSOFCM always obtains the best objective function values and is faster than HDEFCM. Both HPSOFCM and HDEFCM help the FCM algorithm escape from local optima.

4. Studying the K-harmonic means clustering technique based on the PSO algorithm. K-harmonic means (KHM) clustering is a center-based iterative clustering algorithm that alleviates the initialization problem through a built-in boosting function, but it also easily runs into local optima. To address this shortcoming of the KHM algorithm, a hybrid data clustering algorithm (PSOKHM) that combines the PSO algorithm with the KHM algorithm and takes advantage of both is proposed in this research. The performances of PSOKHM and PSO are compared with that of KHM on two artificial data sets and five real data sets, according to the objective function value, the F-measure and the runtime. The experimental results indicate the superiority of PSOKHM: it not only helps KHM clustering escape from local optima but also overcomes the slow convergence of the PSO algorithm.
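As background for the clustering criteria optimized above, the sketch below evaluates the standard FCM and KHM objective functions for a given set of centers. The fuzziness exponent m, the KHM power p and the toy data are illustrative defaults and assumptions; this is a reference sketch of the textbook objectives, not the thesis's implementation.

```python
import math

def _dist(a, b):
    """Euclidean distance between two points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def fcm_objective(data, centers, m=2.0, eps=1e-10):
    """Standard fuzzy c-means objective: sum_i sum_j u_ij^m * d_ij^2,
    with memberships u_ij computed from the current centers."""
    total = 0.0
    for x in data:
        d = [max(_dist(x, c), eps) for c in centers]
        for dj in d:
            # Membership of x in cluster j given the current centers.
            u = 1.0 / sum((dj / dk) ** (2.0 / (m - 1.0)) for dk in d)
            total += (u ** m) * dj ** 2
    return total

def khm_objective(data, centers, p=3.5, eps=1e-10):
    """K-harmonic means objective: for each point, K divided by the sum of
    1/d_ij^p over all centers (a harmonic-style average of distances)."""
    k = len(centers)
    return sum(k / sum(1.0 / (max(_dist(x, c), eps) ** p) for c in centers)
               for x in data)

# Toy usage with two obvious clusters and two candidate centers.
data = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9)]
centers = [(0.1, 0.05), (5.05, 4.95)]
print(fcm_objective(data, centers), khm_objective(data, centers))
```

In a hybrid scheme of the kind described above, objectives like these serve as the particle fitness, so lower values indicate better candidate center sets.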
5. Studying a method for solving the Flexible Job-shop Scheduling Problem (FJSP) based on the PSO algorithm. FJSP extends the Job-shop Scheduling Problem (JSP), in which every operation of a job is allocated to a single fixed machine for processing, and FJSP is harder than JSP. A novel discrete PSO algorithm for solving the FJSP, which uses two vectors to represent a particle, is proposed in this research. We make use of the crossover and mutation operators of the genetic algorithm to redefine the motion equations of a particle, and design a new crossover operator and a new mutation operator for this particular problem. Experiments on two benchmark problems show that the proposed algorithm is comparable with the AL+CGA method and superior to the Temporal Decomposition method, the classic genetic algorithm and the AL method with respect to makespan.
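The abstract describes the particle only at a high level (two vectors per particle). Purely as an illustration of such a two-vector encoding, the sketch below pairs a machine-assignment vector with an operation-sequence vector and decodes them into a semi-active schedule to obtain the makespan. The data structures, the decoding rule and the small instance are assumptions for illustration, not the representation or benchmarks used in the thesis.

```python
# Jobs given as lists of operations; each operation maps candidate machine -> processing time.
# (The instance below is a small made-up example.)
jobs = [
    [{0: 3, 1: 5}, {1: 2, 2: 4}],          # job 0: two operations
    [{0: 4, 2: 3}, {0: 2, 1: 6}, {2: 3}],  # job 1: three operations
]

def makespan(assignment, sequence, jobs):
    """Decode a two-vector particle into a semi-active schedule.

    assignment[(j, o)] -> machine chosen for operation o of job j
    sequence           -> job indices; the k-th occurrence of job j schedules
                          its k-th operation (a permutation-with-repetition encoding)
    """
    machine_free = {}              # machine -> time at which it becomes free
    job_free = [0] * len(jobs)     # job -> completion time of its last scheduled operation
    next_op = [0] * len(jobs)      # job -> index of its next unscheduled operation
    end = 0
    for j in sequence:
        o = next_op[j]
        m = assignment[(j, o)]
        start = max(job_free[j], machine_free.get(m, 0))
        finish = start + jobs[j][o][m]      # processing time on the chosen machine
        machine_free[m] = job_free[j] = finish
        next_op[j] += 1
        end = max(end, finish)
    return end

# One candidate particle: machine choices plus an operation ordering.
assignment = {(0, 0): 0, (0, 1): 1, (1, 0): 2, (1, 1): 0, (1, 2): 2}
sequence = [1, 0, 1, 0, 1]   # job 1 op0, job 0 op0, job 1 op1, job 0 op1, job 1 op2
print(makespan(assignment, sequence, jobs))   # makespan of this particle
```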
In the final part of the thesis, the author summarizes the research work as a whole and discusses directions for future research.

Keywords/Search Tags: Optimization, Particle Swarm Optimization, Hidden Markov Model, Fuzzy c-means clustering, K-harmonic means clustering, Differential evolution algorithm, Flexible Job-shop Scheduling Problem