Smoothing gradient descent algorithm for the composite sparse optimization
Composite sparsity generalizes standard sparsity by considering sparsity on a linear transformation of the variables. In this paper, we study the composite sparse optimization problem consisting of minimizing the sum of a nondifferentiable loss function and the $\ell_0$ penalty term of a matrix times the coefficient vector. First, we consider an exact continuous relaxation problem with a capped-$\ell_1$ penalty that has the same optimal solution as the primal problem. Specifically, we introduce the lifted stationary points of the relaxation problem and then establish the equivalence of the original and relaxed problems. Second, we propose a smoothing gradient descent (SGD) algorithm for the continuous relaxation problem, which solves the subproblem inexactly since the objective function is inseparable. We show that if the sequence generated by the SGD algorithm has an accumulation point, then it is a lifted stationary point. Finally, we present several computational examples to illustrate the efficiency of the algorithm.
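In the notation suggested by the abstract (the symbols below are assumptions for illustration; the paper itself may use different letters), the composite sparse problem and its capped-$\ell_1$ relaxation can be written as:

```latex
\min_{x \in \mathbb{R}^n} \; f(x) + \lambda \,\|Ax\|_0
\qquad \text{relaxed to} \qquad
\min_{x \in \mathbb{R}^n} \; f(x) + \lambda \sum_{i=1}^{m} \phi_\nu\!\big((Ax)_i\big),
\qquad
\phi_\nu(t) = \min\!\Big(1, \tfrac{|t|}{\nu}\Big),
```

where $f$ is the nondifferentiable loss, $A$ is the transformation matrix, and $\phi_\nu$ is the capped-$\ell_1$ function whose relaxation is exact for suitably small $\nu > 0$.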
Main Authors: | Wei Yang, Lili Pan, Jinhui Wan |
---|---|
Format: | Article |
Language: | English |
Published: | AIMS Press, 2024-11-01 |
Series: | AIMS Mathematics |
Subjects: | nonsmooth convex regression; cardinality penalty; gradient descent; smoothing method |
Online Access: | https://www.aimspress.com/article/doi/10.3934/math.20241594 |
_version_ | 1832590726240665600 |
---|---|
author | Wei Yang; Lili Pan; Jinhui Wan |
author_facet | Wei Yang; Lili Pan; Jinhui Wan |
author_sort | Wei Yang |
collection | DOAJ |
description | Composite sparsity generalizes standard sparsity by considering sparsity on a linear transformation of the variables. In this paper, we study the composite sparse optimization problem consisting of minimizing the sum of a nondifferentiable loss function and the $\ell_0$ penalty term of a matrix times the coefficient vector. First, we consider an exact continuous relaxation problem with a capped-$\ell_1$ penalty that has the same optimal solution as the primal problem. Specifically, we introduce the lifted stationary points of the relaxation problem and then establish the equivalence of the original and relaxed problems. Second, we propose a smoothing gradient descent (SGD) algorithm for the continuous relaxation problem, which solves the subproblem inexactly since the objective function is inseparable. We show that if the sequence generated by the SGD algorithm has an accumulation point, then it is a lifted stationary point. Finally, we present several computational examples to illustrate the efficiency of the algorithm. |
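As an illustration of the smoothing idea described in the abstract (this is a minimal sketch, not the authors' SGD algorithm; the Huber-type smoothing function, the first-difference matrix, and all parameter values are assumptions), one can smooth both the $\ell_1$-type loss and the capped-$\ell_1$ penalty, tie the stepsize to the smoothing parameter, and shrink that parameter across iterations:

```python
import numpy as np

def huber_abs(t, mu):
    # Smooth surrogate for |t|: quadratic on [-mu, mu], shifted |t| outside
    a = np.abs(t)
    return np.where(a > mu, a - mu / 2.0, t * t / (2.0 * mu))

def huber_abs_grad(t, mu):
    return np.where(np.abs(t) > mu, np.sign(t), t / mu)

def objective(x, B, b, A, lam, nu, mu):
    # Smoothed l1 loss plus smoothed capped-l1 penalty on A @ x
    loss = huber_abs(B @ x - b, mu).sum()
    pen = lam * np.minimum(huber_abs(A @ x, mu) / nu, 1.0).sum()
    return loss + pen

def gradient(x, B, b, A, lam, nu, mu):
    g = B.T @ huber_abs_grad(B @ x - b, mu)
    z = A @ x
    active = huber_abs(z, mu) < nu          # entries where the cap is not reached
    g_pen = np.where(active, huber_abs_grad(z, mu) / nu, 0.0)
    return g + lam * (A.T @ g_pen)

def smoothing_gd(B, b, A, lam=0.5, nu=0.1, mu0=1.0, mu_min=1e-3, iters=2000):
    # The smoothed gradient is Lipschitz with constant at most L / mu,
    # so the stepsize mu / L keeps plain gradient descent stable
    L = np.linalg.norm(B, 2) ** 2 + (lam / nu) * np.linalg.norm(A, 2) ** 2
    x = np.zeros(B.shape[1])
    mu = mu0
    for _ in range(iters):
        x = x - (mu / L) * gradient(x, B, b, A, lam, nu, mu)
        mu = max(mu_min, 0.995 * mu)        # gradually shrink the smoothing
    return x

rng = np.random.default_rng(0)
x_true = np.zeros(20)
x_true[:5] = 1.0                            # piecewise-constant signal: A @ x_true is sparse
A = np.eye(20) - np.eye(20, k=1)            # first-difference matrix as the transformation
B = rng.standard_normal((40, 20))
b = B @ x_true
x_hat = smoothing_gd(B, b, A)
```

The composite structure shows up in the choice of `A`: the signal itself is dense, but its first differences are sparse, which is what the capped-$\ell_1$ penalty on `A @ x` promotes.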
format | Article |
id | doaj-art-1a423b6417c842b0a03b578d5b33e60e |
institution | Kabale University |
issn | 2473-6988 |
language | English |
publishDate | 2024-11-01 |
publisher | AIMS Press |
record_format | Article |
series | AIMS Mathematics |
spelling | doaj-art-1a423b6417c842b0a03b578d5b33e60e | 2025-01-23T07:53:24Z | eng | AIMS Press | AIMS Mathematics | 2473-6988 | 2024-11-01 | Vol. 9, Iss. 12, pp. 33401-33422 | doi:10.3934/math.20241594 | Smoothing gradient descent algorithm for the composite sparse optimization | Wei Yang; Lili Pan; Jinhui Wan (Department of Mathematics, Shandong University of Technology, Zibo 255049, China) | https://www.aimspress.com/article/doi/10.3934/math.20241594 | nonsmooth convex regression; cardinality penalty; gradient descent; smoothing method |
title | Smoothing gradient descent algorithm for the composite sparse optimization |
topic | nonsmooth convex regression; cardinality penalty; gradient descent; smoothing method |
url | https://www.aimspress.com/article/doi/10.3934/math.20241594 |