Rademacher complexity is one of the methods for describing the complexity of a hypothesis space. It can be used as a tool to verify and improve the generalization ability of neural networks and plays an important role in the theoretical study of neural networks, while labeled data directly affects the performance of a neural network. In practice, unlabeled data is easy to obtain, whereas labeled data is often difficult to collect and time-consuming to annotate. Semi-supervised learning, which learns from a large amount of unlabeled data together with limited labeled data, is therefore better suited to real-world applications and has become a new direction in deep learning in recent years. In transfer learning, a large number of experiments show that when the image attributes and task objectives of the target domain differ from those of the source domain, dropout is often applied to the fully connected layers, because the fully connected layers of a pre-trained source-domain model achieve higher accuracy after being transferred to the target domain. This paper studies the Rademacher complexity of a specific neural network, U-Net, and the application of fully connected neural networks based on Rademacher complexity regularization and the dropout algorithm to semi-supervised classification. The following work is accomplished:

(1) Because of the specific structure of U-Net, its Rademacher complexity cannot be computed directly, so we use the covering number of the network to bound the Rademacher complexity of its hypothesis space (a standard form of this bound is sketched below). The skip connections of U-Net are handled in the same way as the residual connections of ResNet: we treat each skip connection as a nonlinear function and combine it with the function on the stem. This simplifies the U-Net model and makes the computation of its Rademacher complexity easier. When computing the covering number of the network, we use the parameters of the weight matrices, which also shows that the performance of a neural network is closely related to its weight matrices. The computation of the Rademacher complexity of U-Net benefits the interpretability of the model and provides some theory to help improve it.

(2) In the transfer-learning setting, MarginGAN is used as the generative adversarial network, and its classifier is replaced by a fully connected network to perform the semi-supervised classification task. At the same time, a Rademacher regularization term is added to the loss function of the classifier (a minimal sketch of such a loss is given below). We find that the effect of our model is more pronounced when labeled data is scarce. The experiments also demonstrate the effectiveness of applying the Rademacher regularization term to semi-supervised classification, which improves the universality of the regularization term and enhances the generalization ability of the fully connected network.
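To make the covering-number route in (1) concrete, one standard way to pass from a covering number to the empirical Rademacher complexity is Dudley's entropy integral. The abstract does not state which bound or constants the thesis uses, so the version below (with commonly quoted constants) is only an illustrative form, not the thesis's own derivation:

\[
\hat{\mathfrak{R}}_S(\mathcal{F}) \;\le\; \inf_{\alpha > 0}\left( 4\alpha \;+\; \frac{12}{\sqrt{n}} \int_{\alpha}^{D} \sqrt{\log \mathcal{N}\big(\mathcal{F},\varepsilon,\|\cdot\|_{L_2(S)}\big)}\;\mathrm{d}\varepsilon \right),
\]

where \(S\) is a sample of size \(n\), \(\mathcal{N}(\mathcal{F},\varepsilon,\|\cdot\|_{L_2(S)})\) is the covering number of the hypothesis space restricted to \(S\), and \(D\) is its \(L_2(S)\) diameter. For fully connected or U-Net-type hypothesis classes, \(\log \mathcal{N}\) is typically controlled by norms of the weight matrices, which is why such a bound ties generalization directly to the weights; a skip connection of the form \(x \mapsto x + g(x)\) can be covered jointly with the stem, in the spirit of ResNet analyses.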
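The abstract also does not give the exact form of the Rademacher regularization term in (2). The PyTorch-style sketch below therefore only illustrates one plausible choice, a norm-based surrogate for a Rademacher complexity bound (the product of Frobenius norms of the fully connected weight matrices) added to the supervised classifier loss; FCClassifier, rademacher_surrogate, and the weight lam are hypothetical names introduced here, not the thesis's implementation.

import torch
import torch.nn as nn

class FCClassifier(nn.Module):
    """Fully connected classifier with dropout, standing in for the MarginGAN classifier."""
    def __init__(self, in_dim=784, hidden=256, n_classes=10, p_drop=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)

def rademacher_surrogate(model):
    """Assumed norm-based surrogate for a Rademacher complexity bound:
    product of the Frobenius norms of the fully connected weight matrices."""
    prod = torch.tensor(1.0, device=next(model.parameters()).device)
    for m in model.modules():
        if isinstance(m, nn.Linear):
            prod = prod * m.weight.norm(p='fro')
    return prod

# Classifier loss on the (few) labelled samples: cross-entropy plus
# lam times the Rademacher-style regular term (lam is a hypothetical weight).
model = FCClassifier()
criterion = nn.CrossEntropyLoss()
lam = 1e-4
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
loss = criterion(model(x), y) + lam * rademacher_surrogate(model)
loss.backward()

Penalizing weight-matrix norms is a common way to keep the capacity terms in Rademacher-type bounds small; combined with dropout on the fully connected layers, it targets exactly the quantities that such bounds say govern generalization.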