
Research On Few-shot Image Classification Algorithm Based On Optimization Meta-learning

Posted on: 2022-06-01
Degree: Master
Type: Thesis
Country: China
Candidate: H D Tian
Full Text: PDF
GTID: 2518306533494674
Subject: Electronic information
Abstract/Summary:
Optimization-based meta-learning is a class of machine learning algorithms that aims to learn, from a given set of data/tasks, a model initialization that is sensitive to new tasks. The merit of such algorithms is that, when a new dataset or task is given, the learned initialization can reach good generalization performance on the query set of the new task with only a few gradient descent steps. From the perspective of generalization, however, traditional optimization-based meta-learning has two main deficiencies: (1) because the data are scarce, overfitting caused by the over-parameterized network hurts the generalization performance of the algorithm; (2) since optimization-based meta-learning is essentially a bi-level optimization problem, the update of the meta-parameters depends strongly on the learned task-specific parameters, yet the rugged loss surface makes searching for the optimal task-specific parameters difficult, and task-specific parameters with a strong inductive bias also affect the meta-parameter update negatively. This dissertation therefore studies optimization-based meta-learning algorithms from the perspective of few-shot image classification; through theoretical analysis of the classical algorithm, it aims to improve both the learning framework and its generalization performance. Two optimization-based meta-learning algorithms are proposed, each using a different strategy to address the deficiencies of the current meta-learning framework. The main research results are as follows.

First, we propose an optimization-based meta-learning algorithm built on network pruning. The algorithm incorporates unstructured network pruning into a first-order optimization-based meta-learning algorithm and prunes unimportant weights during the sparse phase, so that the overfitting caused by the over-parameterized network is alleviated. Theoretical analysis and empirical results show that this effectively improves the generalization performance of the optimization-based meta-learning framework.

Second, we propose Meta LAB (Meta-Learning with Adaptive Biased Regularization). Inspired by the merits of biased regularization, Meta LAB learns a more adaptive biased regularizer controlled by a modulus hyperparameter. To this end, a hyperparameter optimization problem is embedded into the original first-order meta-learning algorithm, so that the meta-parameters and the modulus are updated simultaneously and the modulus converges to its optimal value. By controlling the regularization modulus, the algorithm searches for the optimal task-specific parameters within a constrained neighbourhood of the meta-parameters, and it exploits the property that biased regularization makes the loss function more convex to obtain a better solution of the subproblem. Both efficiency and generalization performance are improved as a result.
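For concreteness, the following is a minimal sketch of how unstructured magnitude pruning can be combined with a first-order (Reptile-style) meta-update, in the spirit of the first contribution. The pruning criterion, model handling, and all names and hyperparameters (magnitude_prune_, prune_ratio, inner_lr, meta_lr) are illustrative assumptions, not the thesis's actual implementation.

```python
# Minimal sketch: first-order meta-learning (Reptile-style outer update)
# combined with unstructured magnitude pruning of the adapted weights.
import copy
import torch
import torch.nn as nn

def magnitude_prune_(model, prune_ratio=0.5):
    """Zero out the smallest-magnitude weights in each conv/linear layer."""
    for module in model.modules():
        if isinstance(module, (nn.Conv2d, nn.Linear)):
            w = module.weight.data
            k = int(prune_ratio * w.numel())
            if k > 0:
                threshold = w.abs().flatten().kthvalue(k).values
                w.mul_((w.abs() > threshold).float())

def inner_adapt(meta_model, support_x, support_y, steps=5, inner_lr=1e-2):
    """Clone the meta-model, take a few SGD steps on the support set,
    then prune unimportant weights (the 'sparse phase')."""
    task_model = copy.deepcopy(meta_model)
    opt = torch.optim.SGD(task_model.parameters(), lr=inner_lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(task_model(support_x), support_y).backward()
        opt.step()
    magnitude_prune_(task_model)
    return task_model

def reptile_outer_step(meta_model, task_model, meta_lr=0.1):
    """First-order outer update: move meta-parameters toward the adapted ones."""
    with torch.no_grad():
        for p_meta, p_task in zip(meta_model.parameters(),
                                  task_model.parameters()):
            p_meta.add_(meta_lr * (p_task - p_meta))
```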
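Likewise, here is a simplified sketch of the Meta LAB idea for a linear classifier: the inner loop minimizes the task loss plus a biased regularizer that pulls the task-specific parameters toward the meta-parameters, and the regularization modulus is updated jointly with the meta-parameters from the query loss. Whereas the thesis embeds the hyperparameter optimization in a first-order algorithm, this sketch unrolls the inner loop with autograd for clarity; all names and hyperparameters are hypothetical.

```python
# Simplified Meta LAB-style sketch for a linear classifier (illustrative only).
import torch
import torch.nn.functional as F

def metalab_task_step(meta_W, meta_b, log_modulus, support, query,
                      inner_steps=5, inner_lr=1e-2, outer_lr=1e-3):
    xs, ys = support              # support set: features [n, d], labels [n]
    xq, yq = query                # query set used for the outer update
    modulus = log_modulus.exp()   # log-space parameterization keeps it positive

    # Inner loop: biased-regularized adaptation; keep the graph for hypergradients.
    W, b = meta_W, meta_b
    for _ in range(inner_steps):
        inner_loss = F.cross_entropy(xs @ W + b, ys)
        inner_loss = inner_loss + 0.5 * modulus * (
            (W - meta_W).pow(2).sum() + (b - meta_b).pow(2).sum())
        gW, gb = torch.autograd.grad(inner_loss, (W, b), create_graph=True)
        W, b = W - inner_lr * gW, b - inner_lr * gb

    # Outer objective: query loss of the adapted task-specific parameters.
    query_loss = F.cross_entropy(xq @ W + b, yq)
    g_W, g_b, g_mod = torch.autograd.grad(query_loss,
                                          (meta_W, meta_b, log_modulus))

    # Joint update of the meta-parameters and the regularization modulus.
    with torch.no_grad():
        meta_W -= outer_lr * g_W
        meta_b -= outer_lr * g_b
        log_modulus -= outer_lr * g_mod
    return query_loss.item()

# Example setup (illustrative): 5-way classification on 64-dim features.
# meta_W = torch.zeros(64, 5, requires_grad=True)
# meta_b = torch.zeros(5, requires_grad=True)
# log_modulus = torch.zeros((), requires_grad=True)
```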
Keywords/Search Tags: Deep Learning, Few-Shot Image Classification, Optimization-based Meta-Learning, Network Pruning, Generalization Theory