A multiobjective continuation method to compute the regularization path of deep neural networks
Sparsity is a highly desired feature in deep neural networks (DNNs), since it ensures numerical efficiency and improves both interpretability (due to the smaller number of relevant features) and robustness. For linear models, it is well known that there exists a regularization path connecting the sparse...
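The regularization path for linear models that the abstract alludes to can be illustrated with the classic lasso case. As a minimal sketch (not the paper's multiobjective continuation method), assume an orthonormal design, where the lasso solution reduces to soft-thresholding of the least-squares coefficients and the path is piecewise linear in the regularization strength:

```python
import numpy as np

def soft_threshold(b, lam):
    """Closed-form lasso solution for an orthonormal design:
    argmin_w 0.5*(w - b)**2 + lam*|w|  =>  soft-thresholding of b."""
    return np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)

# Least-squares coefficients of a toy problem (hypothetical values)
b = np.array([3.0, -1.5, 0.5])

# Sweeping the regularization strength traces the path: coefficients
# shrink linearly and drop to exactly zero one by one as lam grows,
# connecting the dense least-squares solution to the all-zero one.
for lam in [0.0, 0.6, 1.6, 3.5]:
    print(lam, soft_threshold(b, lam))
```

At `lam = 0` the dense least-squares solution is recovered; by `lam = 3.5` every coefficient is exactly zero, which is the sparse endpoint of the path.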
Main Authors: Augustina Chidinma Amakor, Konstantin Sonntag, Sebastian Peitz
Format: Article
Language: English
Published: Elsevier, 2025-03-01
Series: Machine Learning with Applications
Online Access: http://www.sciencedirect.com/science/article/pii/S2666827025000088
Similar Items
- Re-Calibrating Network by Refining Initial Features Through Generative Gradient Regularization
  by: Naim Reza, et al.
  Published: (2025-01-01)
- Remarks on μ″-measurable sets: regularity, σ-smoothness, and measurability
  by: Carman Vlad
  Published: (1999-01-01)
- A strain based Lipschitz regularization for materials undergoing damage
  by: Kamasamudram, Vasudevan, et al.
  Published: (2023-03-01)
- On optimality conditions and duality for multiobjective fractional optimization problem with vanishing constraints
  by: Haijun Wang, et al.
  Published: (2024-08-01)
- Large-scale multiobjective competitive swarm optimizer algorithm based on regional multidirectional search
  by: Xuenan Zhang, et al.
  Published: (2024-11-01)