
Smoothing Technique And Theory For Solving Nondifferentiable Optimization Problem

Posted on:2004-12-30Degree:MasterType:Thesis
Country:ChinaCandidate:C F WangFull Text:PDF
GTID:2120360122475629Subject:Computational Mathematics
Abstract/Summary:PDF Full Text Request
Firstly, we solve a special nondifferentiable optimization problem arising from a numerical application. After transforming the original problem, we seek parametric smoothing approximations of the nonsmooth function involved. Analyzing the fundamental maximum function, we derive four types of smoothing approximations for it via convolution, each obtained by twice integrating a kernel function. We then extend these constructions to smoothing approximations of the general maximum function and, finally, propose parametric smoothing approximations of the original problem. Error estimates and convergence results are established, showing that the new approach is efficient and correct; consequently, any suitable smooth optimization technique can be applied to this class of nondifferentiable optimization problems.

Secondly, a new generalized gradient is introduced to take advantage of the given directional derivative. We define this generalized gradient, discuss its properties, prove a mean-value theorem, and obtain optimality conditions.
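As an illustration of the general idea (not necessarily the thesis's exact construction), a classical parametric smoothing approximation of the plus function max(x, 0) obtained by convolution with a sigmoid density is p(x, μ) = μ ln(1 + exp(x/μ)), with uniform error at most μ ln 2; the same log-sum-exp device smooths a finite maximum of m values with error at most μ ln m. The sketch below checks these error bounds numerically:

```python
import math

def plus(x):
    """The nonsmooth plus function max(x, 0)."""
    return max(x, 0.0)

def smooth_plus(x, mu):
    """Smoothing approximation p(x, mu) = mu*ln(1 + exp(x/mu)),
    written in a numerically stable form; error <= mu*ln(2)."""
    return max(x, 0.0) + mu * math.log1p(math.exp(-abs(x) / mu))

def smooth_max(vals, mu):
    """Log-sum-exp smoothing of max(vals); error <= mu*ln(len(vals))."""
    m = max(vals)
    return m + mu * math.log(sum(math.exp((v - m) / mu) for v in vals))

# Uniform error bound and convergence as the parameter mu -> 0.
for mu in (1.0, 0.1, 0.01):
    for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
        gap = smooth_plus(x, mu) - plus(x)
        assert 0.0 <= gap <= mu * math.log(2) + 1e-12
    vals = [math.sin(1.0), 0.3, -0.7]
    gap = smooth_max(vals, mu) - max(vals)
    assert 0.0 <= gap <= mu * math.log(len(vals)) + 1e-12
```

Since p(·, μ) is smooth for every μ > 0 and converges uniformly to the maximum function as μ → 0, standard smooth optimization methods can be applied to the approximating problems.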
Keywords/Search Tags:Nondifferentiable Optimization, Parametric Smoothing Approximations, Convolution, Maximum Function, Convergence, Generalized Gradient, Directional Derivative, Optimality Condition