Leveraging Simplex Gradient Variance and Bias Reduction for Black-Box Optimization of Noisy and Costly Functions


Bibliographic Details
Main Authors: Mircea-Bogdan Radac, Titus Nicolae
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10843234/
Description
Summary: Gradient variance errors in gradient-based search methods are largely mitigated by momentum; gradient bias errors, however, may prevent a numerical search from reaching the true optimum. We investigate the reduction in both bias and variance errors achieved by the simplex gradient estimated from noisy function measurements, compared with the finite-differences gradient, when both are used in black-box optimization methods. Regardless of the simplex orientation, while reducing the gradient bias error owed to factors such as truncation, numerical error, or measurement noise, we claim and verify that, under relaxed assumptions about the underlying function's differentiability, the gradient estimated by the simplex method has at most half the variance of the finite-differences gradient. The findings are validated on two comprehensive and representative case studies: the minimization of a nonlinear feedback control system's cost function, and the tuning of hyperparameters in a deep machine learning classification problem. We conclude that for practical black-box optimization problems of up to medium size, with unknown variable domains and expensive noisy function measurements, a simplex gradient-based search is an attractive option.
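The variance claim in the summary can be illustrated with a minimal Monte Carlo sketch (not the authors' code; the test function, step size, and noise level below are illustrative assumptions). On a noisy linear function, where both estimators are unbiased, a centered regular-simplex least-squares gradient can be compared against a forward finite-differences gradient built from steps of the same magnitude; for dimension d the theoretical per-component noise variances are d*sigma^2/((d+1)*h^2) and 2*sigma^2/h^2 respectively, i.e. a ratio of d/(2(d+1)), which is below one half.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.1                  # measurement-noise std dev (assumed)
h = 0.5                      # step size / simplex radius (assumed)
a = np.array([1.5, -2.0])    # true gradient of the linear test function

def f_noisy(x):
    """Linear test function observed through additive Gaussian noise."""
    return a @ x + rng.normal(0.0, sigma)

x0 = np.array([0.3, -0.7])

# Centered regular simplex in R^2: three vertices on a circle of radius h.
angles = 2 * np.pi * np.arange(3) / 3
V = h * np.column_stack([np.cos(angles), np.sin(angles)])  # 3 x 2 offsets

def fd_gradient(x):
    """Forward finite differences: (f(x + h e_i) - f(x)) / h."""
    f0 = f_noisy(x)
    return np.array([(f_noisy(x + h * e) - f0) / h for e in np.eye(2)])

def simplex_gradient(x):
    """Least-squares linear fit f(x+v) ~ g.v + c over the simplex vertices."""
    fv = np.array([f_noisy(x + v) for v in V])
    A = np.hstack([V, np.ones((3, 1))])  # design matrix [offsets | 1]
    sol, *_ = np.linalg.lstsq(A, fv, rcond=None)
    return sol[:2]

trials = 20000
fd = np.array([fd_gradient(x0) for _ in range(trials)])
sx = np.array([simplex_gradient(x0) for _ in range(trials)])

var_fd = fd.var(axis=0).mean()  # theory: 2*sigma^2/h^2        = 0.08
var_sx = sx.var(axis=0).mean()  # theory: 2*sigma^2/(3*h^2)    = 0.0267
print(f"FD variance      ~ {var_fd:.5f}")
print(f"simplex variance ~ {var_sx:.5f}")
```

Under these assumptions the empirical simplex-gradient variance comes out near one third of the finite-differences variance, consistent with the "at most half" bound stated in the summary; on a nonlinear function the simplex fit additionally averages out part of the truncation bias, which is the other effect the article studies.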
ISSN:2169-3536