Optimization problems are everywhere in real life. As society develops, the problems that need to be optimized have become increasingly complicated, and traditional optimization methods are far from meeting the needs of modern society; swarm intelligence optimization algorithms came into being to address this gap. The Grey Wolf Optimizer (GWO) is a novel swarm intelligence optimization algorithm that mainly simulates the social hierarchy and hunting behavior of grey wolves in nature. GWO has a simple structure and is easy to implement, so it has been widely and successfully applied in many fields. However, GWO still shows insufficient global search ability when solving complex optimization problems.

Clustering is an unsupervised machine learning technique applied in many fields such as pattern recognition and image processing. K-Means clustering and fuzzy C-Means (FCM) clustering are two classical clustering algorithms. Their principles are simple and easy to implement, but they are sensitive to the initial cluster centers and easily fall into local optima. Clustering optimization is a complex optimization problem, and many scholars have used swarm intelligence optimization algorithms to solve it. GWO is novel, has obvious advantages, and has the potential to solve the clustering optimization problem; however, related studies are still few, so GWO leaves a large research space in clustering optimization applications.

Aiming at the shortcomings of GWO in solving complex optimization problems, this paper proposes three improved GWO algorithms and applies them to clustering optimization problems. The main research contents are as follows:

(1) Aiming at the insufficient global search ability of GWO on complex optimization problems, the differential mutation operator of the differential evolution algorithm is embedded into GWO, and a Grey Wolf Optimizer based on opposition learning and differential mutation (ODGWO) is proposed. The algorithm is improved in three ways: first, a max-min opposition learning strategy is proposed; second, a dynamic random differential mutation operator is proposed; third, the algorithm combines a one-dimensional update operation with a full-dimensional one. Experimental results on function optimization and FCM clustering optimization show that ODGWO has better optimization performance.

(2) Aiming at the poor optimization performance of GWO on complex optimization problems, the empty_nest operator of the cuckoo search algorithm is embedded, and a Grey Wolf Optimizer based on Global-best Opposition-learning and the Empty_nest operator (GOEGWO) is proposed. Two major improvements are made: first, a grey wolf searcher based on the global best and opposition learning is proposed; second, an empty_nest operator with random opposition learning is proposed. Experimental results on function optimization and FCM clustering optimization show that GOEGWO balances exploration and exploitation and effectively solves the clustering optimization problem.

(3) To retain the advantages of GWO while overcoming its shortcomings, the artificial bee colony algorithm is hybridized with GWO, and a Hybrid GWO with Artificial bee colony (HGWOA) is proposed. The hybrid better realizes the complementarity between the two algorithms and makes full use of their respective strengths. Experimental results on function optimization and K-Means clustering optimization show that HGWOA has better optimization ability and clustering performance.
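All three improved variants above build on the canonical GWO position update guided by the alpha, beta, and delta wolves. The abstract does not give the formulas of the new operators (max-min opposition learning, dynamic random differential mutation, the empty_nest operator, and so on), so the following minimal Python sketch only illustrates the baseline GWO search that they modify; the function and parameter names (gwo_minimize, n_wolves, max_iter) are illustrative rather than taken from the thesis.

import numpy as np

def gwo_minimize(f, dim, bounds, n_wolves=30, max_iter=200, seed=0):
    # Canonical GWO (Mirjalili et al., 2014): the three best wolves guide the pack.
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_wolves, dim))      # wolf positions
    for t in range(max_iter):
        fitness = np.apply_along_axis(f, 1, X)
        order = np.argsort(fitness)
        alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
        a = 2 - 2 * t / max_iter                       # control parameter decreasing from 2 to 0
        for i in range(n_wolves):
            X_new = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2 * a * r1 - a                     # large |A|: explore, small |A|: exploit
                C = 2 * r2
                D = np.abs(C * leader - X[i])          # distance to this leader
                X_new += (leader - A * D) / 3.0        # average of the three guided moves
            X[i] = np.clip(X_new, lo, hi)
    fitness = np.apply_along_axis(f, 1, X)
    best = X[np.argmin(fitness)]
    return best, f(best)

# Example: minimize the sphere function in 10 dimensions.
best_x, best_f = gwo_minimize(lambda x: np.sum(x**2), dim=10, bounds=(-10, 10))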
(4) A theoretical analysis of the improved GWO is carried out, taking HGWOA as an example for the stability analysis. First-order stability is proved by rigorous mathematical derivation, and then second-order stability is proved, which shows that HGWOA is stable and convergent.

This paper follows the research route of progressing from operator embedding to algorithm fusion and from basic improvement to complex improvement, namely: embedding a new operator (basic improvement), then mixing operators and changing the algorithm structure (deep improvement), and finally hybridizing two optimization algorithms (complex improvement), advancing the research step by step. Three improved algorithms are proposed, all of which mainly address one defect of GWO, namely its insufficient global search ability on complex optimization problems. Finally, HGWOA is taken as an example to prove stability and convergence, yielding the theoretical results of this work. The three improved GWOs are applied to clustering optimization. A large number of simulation experiments show that, compared with GWO and other state-of-the-art algorithms, the three improved algorithms proposed in this paper have better optimization performance and can better solve the clustering optimization problem.
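To make the link between the improved GWOs and clustering optimization concrete, the sketch below uses the common encoding in which each candidate solution is a flattened set of k cluster centers and the fitness is the K-Means sum of squared errors; the FCM case would use the membership-weighted objective instead. This is a minimal illustration under that assumption, not the exact fitness formulation of the thesis; kmeans_sse and dataset.txt are placeholder names.

import numpy as np

def kmeans_sse(flat_centers, data, k):
    # Fitness for clustering optimization: a candidate solution encodes k cluster
    # centers as one flat vector; the value is the sum of squared distances from
    # every data point to its nearest center (the K-Means objective, to be minimized).
    centers = flat_centers.reshape(k, data.shape[1])
    d2 = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return d2.min(axis=1).sum()

# Hypothetical usage with the GWO sketch given earlier: search over k * n_features
# center coordinates, bounded by the range of the data.
# data = np.loadtxt("dataset.txt"); k = 3
# best, sse = gwo_minimize(lambda c: kmeans_sse(c, data, k),
#                          dim=k * data.shape[1],
#                          bounds=(data.min(), data.max()))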