
Exploring Generalization Of Neural Network Via Diversity

Posted on: 2019-11-28
Degree: Master
Type: Thesis
Country: China
Candidate: P Q Zhang
Full Text: PDF
GTID: 2428330626452409
Subject: Computer technology
Abstract/Summary:
Neural networks are commonly used as classification models for a wide variety of tasks. Studying the generalization performance of neural networks is one of the central problems in deep learning. Although neural network (NN) models have achieved excellent performance on many tasks, understanding their generalization capacity remains a challenge. In this paper, inspired by the diversity of individual learners in ensemble learning, we propose a new metric, diversity, to evaluate the generalization capacity of NN models. The definition of diversity depends on two factors: uncorrelation and equality. Uncorrelation encourages the model to learn different features, while equality ensures that different units each play a significant role in the model. We derive a diversity-based generalization bound for NNs and prove that the diversity of a model is crucial for reducing its generalization error. Furthermore, we prove that diversity is associated with two statistics of the parameters, the orthogonality and uniformity of the weight matrices, which indicates that the generalization of NNs can be improved by increasing orthogonality and uniformity. To verify these theoretical results, we propose an orthogonal-uniformity network model (OUNN), which guarantees the orthogonality and uniformity of the network. We conducted experiments on the MNIST, SVHN, and CIFAR datasets. The results on MNIST and CIFAR validate the relationship between diversity and the model's generalization performance, and the performance of OUNN on SVHN and CIFAR illustrates the effectiveness of diversity and orthogonal-uniformity.
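The abstract does not give the exact definitions of the orthogonality and uniformity statistics, but a common way to encode such properties as trainable penalties is sketched below in PyTorch: orthogonality is approximated here by the deviation of the row-wise Gram matrix of a weight matrix W from the identity, and uniformity by the variance of the row norms. Both choices, and the weighting coefficient, are illustrative assumptions, not the thesis' actual OUNN construction.

```python
import torch

def orthogonality_penalty(W: torch.Tensor) -> torch.Tensor:
    # Normalize each row (one unit's incoming weights) to unit length,
    # then measure how far the Gram matrix of rows is from the identity:
    # ||R R^T - I||_F^2 is zero iff the rows are mutually orthogonal.
    rows = W / W.norm(dim=1, keepdim=True).clamp_min(1e-12)
    gram = rows @ rows.T
    eye = torch.eye(W.shape[0], device=W.device, dtype=W.dtype)
    return (gram - eye).pow(2).sum()

def uniformity_penalty(W: torch.Tensor) -> torch.Tensor:
    # Variance of the row norms: zero when every unit's weight vector has
    # the same magnitude, i.e. all units play an equal role in the layer.
    norms = W.norm(dim=1)
    return ((norms - norms.mean()) ** 2).mean()

# Toy usage: in training, such penalties would be added to the task loss.
W = torch.randn(64, 128, requires_grad=True)
reg = orthogonality_penalty(W) + 0.1 * uniformity_penalty(W)
reg.backward()  # gradients flow into W like any other loss term
print(f"orthogonality penalty: {orthogonality_penalty(W).item():.4f}")
print(f"uniformity penalty:    {uniformity_penalty(W).item():.4f}")
```

In practice, penalties like these would be summed over all layers and added to the classification loss with a small coefficient, so that minimizing the total loss pushes each weight matrix toward orthogonal rows of equal magnitude.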
Keywords/Search Tags: Neural Network, Generalization Performance, Diversity, Orthogonality, Uniformity