
Training WGAN Based On Genetic Algorithm And Truncated Aggregate SGD

Posted on: 2021-01-28  Degree: Master  Type: Thesis
Country: China  Candidate: Y J Lei  Full Text: PDF
GTID: 2428330626960402  Subject: Computational Mathematics
Abstract/Summary:
Generative Adversarial Networks (GANs) are a class of unsupervised generative models consisting of a generative network and a discriminant network. GANs often suffer from problems such as mode collapse, vanishing gradients, and exploding gradients, which make training difficult. WGAN, a variant of GAN, greatly reduces the training difficulty by improving the distance measure between the real data distribution and the model distribution. Generally, Stochastic Gradient Descent (SGD) or one of its variants is used to alternately train the discriminator and the generator of WGAN. However, alternating training may fail to converge, and the mode may collapse when the discriminator falls into a local optimum.

To improve the training stability of WGAN, this paper regards the training process of WGAN as the process of solving a semi-infinite minimax problem, and proposes an algorithm for training WGAN based on the genetic algorithm and the truncated aggregate SGD. The semi-infinite minimax problem is a typical non-smooth optimization problem that is often difficult to solve directly because of the infinite number of component functions. We use genetic-algorithm evolution to obtain the final-generation population of the max problem, and combine different discretization strategies to transform the semi-infinite minimax problem into a series of related finite minimax problems. The aggregate function uniformly and smoothly approximates the maximum-value function of the finite minimax problem, thereby smoothing the non-smooth problem. The truncated aggregate SGD algorithm is then used to solve the finite minimax problems and obtain a solution of the original problem.

We validate the effectiveness of the algorithm on the MNIST handwritten digit dataset. The experimental results show that both the GA-TASGD algorithm with novelty search and the GA-TASGD algorithm based on Boltzmann selection generate images of good quality. This indicates that applying the ideas of algorithms for solving semi-infinite minimax problems to training WGAN is feasible, which provides a new research direction for GAN training methods.
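The aggregate function referred to above is conventionally the log-sum-exp (exponential penalty) smoothing of the finite max. As a minimal NumPy sketch of that idea only (the function name, the smoothing parameter `p`, and the max-shift for numerical stability are illustrative choices, not the thesis's exact formulation):

```python
import numpy as np

def aggregate_max(f_values, p):
    """Smooth approximation of max_i f_i(x) via the aggregate
    (log-sum-exp) function: (1/p) * log(sum_i exp(p * f_i)).
    Larger p gives a tighter approximation; the error is bounded
    by log(n)/p for n component functions."""
    f = np.asarray(f_values, dtype=float)
    fmax = f.max()  # subtract the max before exponentiating for stability
    return fmax + np.log(np.exp(p * (f - fmax)).sum()) / p

# The smooth surrogate sits just above the true max of the components.
vals = [0.3, 1.2, 0.9]
print(aggregate_max(vals, p=100.0))  # slightly above max(vals) = 1.2
```

Because the surrogate is differentiable in the component values, an SGD-type method can be applied to the smoothed finite minimax problem, which is the role the truncated aggregate SGD plays in the proposed algorithm.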
Keywords/Search Tags: WGAN, Stochastic Gradient Descent, Genetic Algorithm, Semi-Infinite Minimax Problem, Aggregate Function