Structured and large-scale convex optimization problems have wide applications in signal processing, image processing, compressed sensing, multi-task learning, and so on. Many problems in signal processing, image recovery, matrix completion, and machine learning can be formulated as convex optimization problems.

In this paper, the objective function of the convex optimization problem is the sum of a smooth function and a non-smooth function. On the one hand, the proximal gradient method overcomes the difficulties caused by the non-smoothness of the objective function, which traditional methods cannot handle. On the other hand, treating the smooth and non-smooth parts of the objective function separately yields strong results. The convex optimization problems studied in this paper apply well to practical problems such as variable selection and the sparse group Lasso.

First, we study the proximal splitting method. Although every step is itself an optimization problem, an analytical solution is derived. We also carry out a convergence analysis of the algorithm, covering both global convergence and linear convergence.
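
To illustrate the separate treatment of the smooth and non-smooth parts, here is a minimal sketch of a proximal gradient iteration applied to the Lasso problem min_x (1/2)||Ax - b||^2 + lam*||x||_1. The function names, the fixed step size 1/L, and the choice of Lasso as the example are our own illustrative assumptions, not the specific algorithm of the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (closed-form soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam, step=None, iters=500):
    """Sketch: minimize (1/2)||Ax - b||^2 + lam * ||x||_1 by proximal gradient.

    The smooth least-squares part is handled by an ordinary gradient step;
    the non-smooth l1 part is handled by its proximal operator, which has
    an analytical solution (soft-thresholding).
    """
    if step is None:
        # 1/L, where L = ||A||_2^2 is the Lipschitz constant of the smooth gradient
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                          # gradient of smooth part
        x = soft_threshold(x - step * grad, step * lam)   # prox of non-smooth part
    return x
```

Each iteration is itself a small optimization problem (the proximal subproblem), but for the l1 norm it admits the closed-form soft-thresholding solution, which is what makes the splitting practical.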