
Accelerating Gradient Algorithm Synthesis Optimization Objective Function

Posted on: 2014-09-18    Degree: Master    Type: Thesis
Country: China    Candidate: S J Yuan    Full Text: PDF
GTID: 2260330428959329    Subject: Computational Mathematics
Abstract/Summary:
In recent years, several advances in non-smooth convex optimization have been based on developing different models of optimization problems: by exploiting the particular structure of the objective function, one can design optimization methods that perform very well. In this paper, the composite objective function refers to f(x)+||x||1, where f(x) is smooth and convex with Lipschitz continuous gradient. Exploiting this structure, we introduce an accelerated gradient method for minimizing f(x)+||x||1. The method is based on the primal gradient method; compared with the primal gradient method, it improves the convergence rate from O(1/k) to O(1/k^2), where k is the iteration counter, while keeping the complexity of each iteration essentially unchanged. Compared with the accelerated scheme given by Yu. Nesterov, it performs better in running time, though slightly worse in convergence rate, because each iteration of Nesterov's accelerated scheme is more complex and requires more computational effort. Computational experiments are given to illustrate this.
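The accelerated gradient scheme described above can be sketched in code. The following is a minimal FISTA-style illustration, not the thesis's exact method: it assumes the common quadratic choice f(x) = (1/2)||Ax - b||^2 (the abstract does not fix a specific f), uses soft-thresholding as the proximal operator of the L1 term, and uses the standard momentum update that yields the O(1/k^2) rate.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: componentwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accelerated_gradient(A, b, lam, iters=500):
    """Accelerated proximal gradient (FISTA-style sketch) for
    min_x 0.5*||A x - b||^2 + lam*||x||_1.
    Assumed setup for illustration; the thesis's scheme may differ in detail."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of grad f: ||A||_2^2
    x = np.zeros(A.shape[1])
    y = x.copy()                         # extrapolated point
    t = 1.0                              # momentum parameter
    for _ in range(iters):
        grad = A.T @ (A @ y - b)         # gradient of the smooth part at y
        x_next = soft_threshold(y - grad / L, lam / L)   # proximal gradient step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x) # momentum extrapolation
        x, t = x_next, t_next
    return x
```

Each iteration costs essentially one gradient evaluation plus a cheap componentwise shrinkage, which is why acceleration keeps the per-iteration complexity of the primal gradient method while improving the rate from O(1/k) to O(1/k^2).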
Keywords/Search Tags: Non-smooth optimization, Composite objective function, Gradient method, L1-norm, Structural optimization