
Fractal Neural Net Based On Rational Fractal Theorem And Its Application

Posted on: 2021-04-26  Degree: Master  Type: Thesis
Country: China  Candidate: S L Hu  Full Text: PDF
GTID: 2428330602981385  Subject: Applied Mathematics
Abstract/Summary:
At present, convolutional neural nets (CNNs) are widely used in computer vision because of their ability to extract features from images. The usual strategy for enhancing a CNN's recognition capability is to increase its depth. However, a network becomes hard to train if it is too deep. On the one hand, gradient explosion and gradient vanishing are more likely in a deep network, which can cause training to fail or drive the network to a poor local optimum; on the other hand, as training iterations accumulate, the mean activations of neurons in a deep network tend to drift away from zero, saturating some of these activations and reducing training efficiency. Existing solutions to these problems still have limitations. Thus, how to enhance the recognition capability of a CNN without further increasing its depth is a problem worth solving.

For this purpose, a new network structure, named the fractal neural net, is proposed based on the rational fractal theorem. It is further modularized into fractal blocks so that it can be embedded in other networks. Experiments show that embedding a fractal block into a CNN enhances its recognition capability without increasing its depth; fractal blocks therefore provide a new way to optimize the performance of CNNs.

Fractal blocks strengthen the recognition capability of CNNs because of the powerful data-fitting ability of fractal neural nets. The essence of training a neural net is fitting a data model, and in computer-aided design, rational fractal interpolation functions are known to fit data well. By introducing the fractal function generating theorem into network design, fractal neural nets can efficiently fit different types of data with a concise structure. In this thesis, we prove that fractal neural nets can fit not only fractal datasets but also smooth datasets. This property makes fractal nets more powerful than networks of similar structure using common activations. However, fractal nets place strict requirements on the form of their input data, which limits their range of application. To overcome this limitation, fractal blocks are proposed: they modularize fractal nets and can be embedded directly into other network structures to improve overall performance.

In the experimental part of the thesis, the performance of fractal blocks is verified in several practical cases. First, fractal blocks are used to fit 2D data, including fractal data, real stock data, and data generated by smooth functions, and the results are compared with those of networks with similar structures; the comparisons show the efficiency and stability of fractal blocks. Second, fractal blocks are used to dehaze fog-degraded images, demonstrating their ability to handle high-dimensional datasets. Finally, we embed a fractal block into a convolutional neural net and find that it enhances the CNN's recognition capability, which shows the practical value of fractal blocks.
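The data-fitting claim above rests on fractal interpolation. As a minimal sketch of the underlying idea, the code below builds the classical affine fractal interpolation function via an iterated function system and samples its graph with a chaos game; this is the simpler affine construction, not the rational construction developed in the thesis, and the helper names `fif_maps` and `fif_chaos_game` are illustrative, not from the thesis.

```python
import numpy as np

def fif_maps(x, y, s):
    """Affine IFS maps w_i(u, v) = (a_i*u + e_i, c_i*u + s_i*v + f_i)
    whose attractor is the graph of a fractal interpolation function
    through the knots (x_i, y_i); s holds vertical scalings, |s_i| < 1."""
    a = (x[1:] - x[:-1]) / (x[-1] - x[0])  # horizontal contractions
    e = x[:-1] - a * x[0]
    c = (y[1:] - y[:-1] - s * (y[-1] - y[0])) / (x[-1] - x[0])
    f = y[:-1] - c * x[0] - s * y[0]
    return a, e, c, f

def fif_chaos_game(x, y, s, n_pts=5000, seed=0):
    """Sample points on the FIF graph by randomly iterating the maps."""
    a, e, c, f = fif_maps(x, y, s)
    rng = np.random.default_rng(seed)
    u, v = x[0], y[0]
    pts = np.empty((n_pts, 2))
    for k in range(n_pts):
        i = rng.integers(len(a))
        u, v = a[i] * u + e[i], c[i] * u + s[i] * v + f[i]
        pts[k] = u, v
    return pts
```

With all vertical scalings set to zero the construction reduces to ordinary piecewise linear interpolation, while nonzero scalings add self-similar detail at every scale; this tunable roughness is what makes fractal interpolants attractive for fitting both fractal and smooth data.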
Keywords/Search Tags:Fractal function, neural net, data fitting, network acceleration