In engineering and scientific research, many optimization problems involve a large number of decision variables, which makes finding the global optimum difficult. Solving such large-scale problems with evolutionary algorithms typically requires a large number of fitness evaluations. When fitness evaluation is expensive, that is, when a single evaluation takes a long time, evolutionary algorithms become impractical. Research has shown that surrogate-assisted evolutionary algorithms (SAEAs) can obtain good solutions to expensive problems within limited computational budgets. However, owing to the "curse of dimensionality", it is difficult to collect enough data to train an adequate surrogate model for expensive large-scale problems. Inspired by the idea of "divide and conquer", we propose to first divide a large-scale optimization problem into several low-dimensional sub-problems using the random grouping technique. A surrogate is then trained for each sub-problem to assist its optimization, and the individuals to be evaluated with the real fitness function are selected based on the sub-problem estimates, so as to achieve good optimization performance under limited computational resources. The two algorithms proposed in this thesis are as follows:

(1) An expensive large-scale optimization algorithm based on random grouping and surrogate-ensemble-assisted optimization. The problem is decomposed into several sub-problems by the random grouping strategy, and an ensemble surrogate model is built for each sub-problem to assist its optimization. The next parent population for the large-scale problem is generated by composing the optimized sub-problem populations. Subsequently, a solution selected by the infill criterion replaces the values of the corresponding dimensions of the current best solution, yielding a new solution. The fitness of
this solution is then computed with the real expensive fitness function and stored in the dataset used to train the ensemble surrogates. In addition, a diversity preservation mechanism is proposed to prevent the population from being trapped in local regions. Finally, the algorithm is tested on the CEC'2013 benchmark problems. Experimental results verify its effectiveness in solving expensive large-scale optimization problems.

(2) An expensive large-scale optimization algorithm based on a model switching strategy. The algorithm uses a switching strategy to build an appropriate surrogate model for each sub-problem. After the sub-problem optimization, candidate solutions are selected by a dynamic infill criterion, evaluated with the real expensive fitness function, and stored in the dataset for surrogate training. In addition, a forced escape mechanism is proposed to maintain population diversity. Finally, the algorithm is tested on the CEC'2013 benchmark. Experimental results show that it effectively solves expensive large-scale optimization problems under limited computational resources.
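As an illustrative sketch only (the function name and signature are hypothetical, not taken from the thesis), the random grouping step shared by both algorithms can be written as a random permutation of the dimension indices followed by an even split into disjoint sub-problems:

```python
import random

def random_grouping(dim, n_groups, seed=None):
    """Randomly partition the dimension indices 0..dim-1 into
    n_groups disjoint index sets, one per low-dimensional sub-problem."""
    rng = random.Random(seed)
    indices = list(range(dim))
    rng.shuffle(indices)                      # random permutation of dimensions
    size = dim // n_groups
    groups = [indices[i * size:(i + 1) * size] for i in range(n_groups)]
    # distribute any leftover dimensions over the first groups
    for j, idx in enumerate(indices[n_groups * size:]):
        groups[j].append(idx)
    return groups
```

Because the permutation is redrawn in each cycle, interacting variables that land in different groups in one cycle have a chance of being grouped together in a later one, which is the usual motivation for random grouping in cooperative coevolution.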
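The composition step described above, where a selected sub-problem candidate overwrites the corresponding dimensions of the current best solution before real evaluation, can be sketched as follows (names are hypothetical; this is not the thesis implementation):

```python
def compose_solution(best, sub_candidate, group):
    """Form a full-dimensional solution by overwriting the dimensions
    listed in `group` of the current best solution with the values of a
    sub-problem candidate; the result is then evaluated with the real
    expensive fitness function and added to the surrogate training set."""
    new = list(best)
    for pos, idx in enumerate(group):
        new[idx] = sub_candidate[pos]
    return new
```

For example, with a 5-dimensional best solution and a 2-dimensional candidate for the sub-problem covering dimensions 1 and 3, only those two positions change while the rest are inherited from the current best.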