Recently, as sparse optimization has found wide use in fields such as machine learning, statistical regression, and computer vision, the study of sparse optimization problems has become increasingly important. This article addresses the estimation of regression coefficients in multiple linear regression, exploiting the robustness of the Huber loss and the near-unbiasedness of folded concave penalties, and considers a sparse optimization model based on the Huber loss with linear inequality constraints. First, this paper presents three sparse optimization models: the original problem, a relaxation based on Capped-L1 regularization, and an unconstrained problem obtained by penalizing the constraints. It also introduces optimality conditions for nonsmooth optimization problems and the definitions of several types of stationary points. Using the lower-bound property of the directional stationary points of the penalty model, the equivalence of the global optimal solutions of the three models is established under certain conditions. Second, the paper proposes a smoothing penalty algorithm, derives the properties of the smoothing penalty function, computes the closed-form solution of the proximal operator under certain conditions, and proves the convergence of the algorithm. Finally, linear regression fitting and sparse signal recovery experiments were carried out. The regression experiments demonstrate the robustness of the Huber loss relative to least squares and compare the rate of decrease of the loss function when the steepest descent method is applied to the Huber and least squares losses. The sparse signal recovery experiments verify the effectiveness of the algorithm across different problem dimensions and its stability in high-dimensional signal recovery.
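The two building blocks named above can be made concrete with their standard definitions. As a minimal sketch (the precise model, parameter choices, and constraint handling are developed later in the paper; the names `huber_loss`, `capped_l1`, `delta`, `lam`, and `theta` are illustrative, not from the source), the Huber loss is quadratic for small residuals and linear for large ones, which is what makes it robust to outliers, while the Capped-L1 penalty stops growing once a coefficient exceeds the cap, which is the source of its (near-)unbiasedness for large coefficients:

```python
import numpy as np

def huber_loss(r, delta=1.0):
    """Elementwise Huber loss: 0.5*r^2 when |r| <= delta,
    and delta*(|r| - 0.5*delta) otherwise (linear growth,
    so a single large outlier contributes far less than
    under the squared loss)."""
    r = np.asarray(r, dtype=float)
    small = np.abs(r) <= delta
    return np.where(small, 0.5 * r**2, delta * (np.abs(r) - 0.5 * delta))

def capped_l1(x, lam=0.1, theta=1.0):
    """Capped-L1 penalty: lam * sum_i min(|x_i|, theta).
    Unlike the L1 norm, coefficients larger than theta are
    not penalized further, reducing estimation bias."""
    x = np.asarray(x, dtype=float)
    return lam * np.minimum(np.abs(x), theta).sum()
```

For a residual of 10 with `delta=1`, the Huber loss is 9.5 while the squared loss is 50, illustrating the down-weighting of outliers that the regression experiments exploit.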