
Image Semantic Segmentation Based On Adversarial Learning And Knowledge Distillation

Posted on: 2023-01-14
Degree: Master
Type: Thesis
Country: China
Candidate: Z T Yuan
Full Text: PDF
GTID: 2568307103485634
Subject: Control Engineering
Abstract/Summary:
Semantic segmentation is an important computer vision task. Its fundamental purpose is to partition a scene into different image regions and assign the pixels of each region to a specific class. The technology is widely used in fields such as Geographic Information Systems (GIS), autonomous driving, and medicine, and the deep learning boom has further accelerated its development. Enhancing the representational ability of network features, reducing the loss of image detail, and lowering model computation costs have become the focus of semantic segmentation research. However, most state-of-the-art semantic segmentation methods require substantial computing resources. Knowledge distillation, an effective model compression method, can reduce this computational burden.

Because of the inherent structural gap between the teacher network and the student network in knowledge distillation, the teacher's features must undergo dimensionality reduction to match the student's dimensions, and useful information in the teacher network is lost in this process. In addition, because the teacher network is usually over-parameterized, it inevitably contains useless and redundant information. To address these problems, a knowledge refinement module is proposed to reduce the information loss during dimensionality reduction; the spatial structure knowledge, channel distribution knowledge, and long-range dependency knowledge in the refined teacher network are then extracted and transferred to the student network. A conditional generative adversarial network framework is further used to transfer high-dimensional semantic feature knowledge from the teacher network to the student network. We improve the traditional self-attention discriminator so that it captures more accurate features and transfers correct knowledge to the student network, bringing the feature distribution of the student network closer to that of the teacher network and improving the student network's segmentation performance.

In addition, semantic segmentation models usually need sufficient labeled data to achieve high segmentation performance. Because of the initial performance gap between the teacher and student networks and the limited efficiency of knowledge distillation, the student network requires a long training time and many labeled samples to approach the teacher's performance. When labeled training samples are scarce, it is difficult to obtain a compact, high-performance model through knowledge distillation; the main reason is the limited efficiency of knowledge transfer, since with insufficient samples the teacher network can pass only a little knowledge to the student. To reduce the computing resources and labeled samples consumed when training a semantic segmentation model, a knowledge distillation method based on the generative adversarial network structure, progressive growth knowledge distillation (PGKD), is proposed. PGKD uses the teacher's knowledge to train each part of the student network step by step, which greatly reduces the difficulty of the student network's learning and achieves knowledge distillation with few samples; the feature maps of the student and teacher networks are aligned in multiple stages. We also propose a related-feature knowledge distillation module to transfer class-prototype knowledge from the teacher network to the student network. The proposed method obtains a high-performance student network model using only a small number of samples.
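To make the distillation objective concrete, the following is a minimal sketch of the generic temperature-scaled distillation loss of Hinton et al. (2015), not the thesis's specific refinement or adversarial modules; all function names, the temperature value, and the toy logits are illustrative assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, T)   # soft teacher targets
    q = softmax(student_logits, T)   # soft student predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return float((T ** 2) * kl.mean())

# Toy per-pixel logits: 4 pixels, 3 classes (illustrative values only).
teacher = np.array([[2.0, 0.5, -1.0],
                    [0.1, 1.5,  0.3],
                    [-0.5, 0.2, 2.2],
                    [1.0, 1.0,  1.0]])
student = teacher + 0.5 * np.random.default_rng(0).normal(size=teacher.shape)
print(distillation_loss(student, teacher))
```

In dense prediction, this loss would be averaged over every pixel's class distribution rather than over a handful of rows as here.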
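The multi-stage feature alignment underlying PGKD can be illustrated with a minimal sketch, assuming a simple mean-squared-error alignment between same-shaped intermediate feature maps; the stage count, feature shapes, and function names are assumptions for illustration, not the thesis's actual architecture or loss.

```python
import numpy as np

def stage_alignment_loss(student_feats, teacher_feats):
    # Mean-squared error between a student and a teacher feature map
    # of the same shape (C, H, W) at a single network stage.
    return float(np.mean((student_feats - teacher_feats) ** 2))

def progressive_distillation(student_stages, teacher_stages):
    # Align the student's stages to the teacher's in order, one stage
    # at a time, mirroring the "progressive growth" idea: the student
    # learns each part of the network step by step rather than all at once.
    losses = []
    for k, (s, t) in enumerate(zip(student_stages, teacher_stages)):
        losses.append((k, stage_alignment_loss(s, t)))
    return losses

rng = np.random.default_rng(0)
teacher = [rng.normal(size=(8, 4, 4)) for _ in range(3)]        # 3 stages
student = [t + 0.1 * rng.normal(size=t.shape) for t in teacher]  # noisy copy
for k, loss in progressive_distillation(student, teacher):
    print(f"stage {k}: alignment loss = {loss:.4f}")
```

In actual training each stage's loss would drive gradient updates on the student's parameters before moving to the next stage; here the sketch only reports the per-stage alignment error.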
Keywords/Search Tags: Semantic segmentation, Deep learning, Knowledge distillation, Adversarial learning