Firstly, we solve a special nondifferentiable optimization problem that arises from a numerical application. We first transform the original problem and then construct parametric smoothing approximations of the resulting nonsmooth function. We analyze the fundamental maximum function and, by twice integrating a kernel function, obtain four types of smoothing approximations for it through convolution (a worked instance of this construction is sketched below). We then extend the construction to smoothing approximations of the general maximum function and finally propose parametric smoothing approximations of the original problem, together with an error analysis and convergence results. The new approach to the original problem is correct and efficient, so we are free to choose any good smooth optimization technique to deal with this kind of nondifferentiable optimization problem.

Secondly, a new generalized gradient is introduced in order to take advantage of the given directional derivative. We give the definition of this new generalized gradient, discuss its properties, and prove a mean-value theorem; at the same time, we obtain optimality conditions.
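The abstract does not reproduce the construction itself; the following is a minimal sketch of the standard convolution smoothing of the fundamental (plus) function $\max(0,x)$, assuming a probability density $d$ with $\int_{\mathbb{R}} |t|\, d(t)\, dt < \infty$ (the specific kernel below is our illustrative choice; the four kernels used in the thesis may differ). For a smoothing parameter $\beta > 0$, set
$$
p(x,\beta) = \int_{-\infty}^{\infty} \max(0,\, x - t)\, \beta\, d(\beta t)\, dt ,
$$
which amounts to integrating the scaled kernel twice, since $\partial^{2} p / \partial x^{2} = \beta\, d(\beta x)$. For example, the sigmoid density $d(t) = e^{-t} / (1 + e^{-t})^{2}$ yields the well-known neural-network smoothing function
$$
p(x,\beta) = x + \frac{1}{\beta} \ln\!\left(1 + e^{-\beta x}\right),
\qquad
0 \le p(x,\beta) - \max(0,x) \le \frac{\ln 2}{\beta},
$$
so $p(\cdot,\beta)$ converges uniformly to $\max(0,\cdot)$ as $\beta \to \infty$, which is the pattern of error bound and convergence result described above.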
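The new generalized gradient itself is not defined in the abstract; as a point of reference only, Clarke's classical construction shows how a generalized gradient can be built from a directional derivative (the thesis replaces the derivative below with its given one). For a locally Lipschitz $f$, with
$$
f^{\circ}(x; v) = \limsup_{y \to x,\ t \downarrow 0} \frac{f(y + t v) - f(y)}{t},
\qquad
\partial f(x) = \left\{\, \xi \in \mathbb{R}^{n} : f^{\circ}(x; v) \ge \langle \xi, v \rangle \ \text{for all } v \in \mathbb{R}^{n} \,\right\},
$$
one obtains a mean-value theorem, $f(b) - f(a) \in \langle \partial f(c),\, b - a \rangle$ for some $c$ strictly between $a$ and $b$, and the optimality condition $0 \in \partial f(x^{*})$ at any local minimizer $x^{*}$; the thesis establishes analogues of these results for its new generalized gradient.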