2. An Accelerated Successive Convex Approximation Scheme With Exact Step Sizes for L1-Regression
Published 2025-01-01. "…We demonstrate this scheme by devising three related accelerated algorithms with provable convergence. The first introduces an additional descent step along the past optimization trajectory in the variable update that is inspired by Nesterov's accelerated gradient method and uses a closed-form step size. …"
Article
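The descent step described in this abstract builds on Nesterov's accelerated gradient method, which can be sketched as follows. This is a generic illustration on a convex quadratic, not the paper's algorithm; the function `nag`, the test problem, and the fixed step size are all assumptions for the example.

```python
import numpy as np

def nag(grad, x0, step, momentum=0.9, iters=200):
    """Minimize a smooth convex function with Nesterov's accelerated gradient.

    The gradient is evaluated at a look-ahead point x + momentum * v,
    which is what distinguishes NAG from classical heavy-ball momentum.
    """
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(iters):
        lookahead = x + momentum * v          # peek ahead along the velocity
        v = momentum * v - step * grad(lookahead)
        x = x + v
    return x

# Illustrative quadratic f(x) = 0.5 x^T A x - b^T x (not from the paper)
A = np.array([[3.0, 0.2], [0.2, 2.0]])
b = np.array([1.0, -1.0])
x_star = np.linalg.solve(A, b)                # exact minimizer for reference
x_hat = nag(lambda x: A @ x - b, np.zeros(2), step=0.1)
```

The paper's contribution, per the abstract, is a closed-form (exact) step size rather than the fixed `step` used in this generic sketch.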
3. Machine Learning and Deep Learning Optimization Algorithms for Unconstrained Convex Optimization Problem
Published 2025-01-01. "…It contrasts traditional methods like Gradient Descent (GD) and Nesterov Accelerated Gradient (NAG) with modern techniques such as Adaptive Moment Estimation (Adam), Long Short-Term Memory (LSTM) and Multilayer Perceptron (MLP). …"
Article
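The GD-versus-Adam contrast mentioned in this abstract can be illustrated with a minimal sketch on a convex quadratic; the helper names, test problem, and hyperparameters (common Adam defaults) are assumptions for this example, not values from the article.

```python
import numpy as np

def gd(grad, x0, step, iters):
    """Plain gradient descent with a fixed step size."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def adam(grad, x0, step=0.01, beta1=0.9, beta2=0.999, eps=1e-8, iters=500):
    """Adam: per-coordinate steps scaled by running moment estimates."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)                      # first-moment (mean) estimate
    v = np.zeros_like(x)                      # second-moment estimate
    for t in range(1, iters + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)          # bias correction
        v_hat = v / (1 - beta2 ** t)
        x = x - step * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Illustrative quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 0.2], [0.2, 2.0]])
b = np.array([1.0, -1.0])
x_star = np.linalg.solve(A, b)                # exact minimizer for reference
x_gd = gd(lambda x: A @ x - b, np.zeros(2), step=0.1, iters=500)
x_adam = adam(lambda x: A @ x - b, np.zeros(2))
```

One design point the comparison highlights: GD with a well-chosen fixed step converges tightly on this problem, while Adam's adaptive per-coordinate scaling keeps taking near-constant-size steps close to the optimum, so with a fixed learning rate it typically hovers in a small neighborhood rather than converging to machine precision.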