A Jackson-type estimate in terms of the \(\tau\)-modulus for neural network operators in \(L^{p}\)-spaces


Bibliographic Details
Main Authors: Lorenzo Boccali, Danilo Costarelli, Gianluca Vinti
Format: Article
Language: English
Published: Tuncer Acar 2024-08-01
Series: Modern Mathematical Methods
Subjects:
Online Access: https://modernmathmeth.com/index.php/pub/article/view/42
Description
Summary: In this paper, we study the order of approximation with respect to the \(L^{p}\)-norm for the (shallow) neural network (NN) operators. We establish a Jackson-type estimate for this family of discrete approximation operators using the averaged modulus of smoothness introduced by Sendov and Popov, also known as the \(\tau\)-modulus, in the case of bounded and measurable functions on the interval \([-1,1]\). The results proved here improve those given by Costarelli (J. Approx. Theory 294:105944, 2023), yielding a sharper rate of approximation. In order to provide quantitative estimates in this context, we first establish an estimate for functions belonging to Sobolev spaces. In the case \(1 < p < +\infty\), a crucial role is played by the so-called Hardy-Littlewood maximal function. The case \(p=1\) is covered for density functions with compact support.
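For the reader's convenience, we recall the standard Sendov–Popov definition of the averaged modulus (this is the usual textbook form, not reproduced from the article itself): for a bounded measurable function \(f\) on \([a,b]\), one first takes the local modulus of smoothness of order \(k\) and then its \(L^{p}\)-norm.

```latex
% Local modulus of smoothness of order k of f at the point x
\omega_k(f, x; \delta) = \sup\left\{ \left|\Delta_h^k f(t)\right| :
  t,\, t + kh \in \left[x - \tfrac{k\delta}{2},\, x + \tfrac{k\delta}{2}\right] \cap [a,b] \right\}

% Averaged modulus of smoothness (tau-modulus): the L^p-norm of the local modulus
\tau_k(f, \delta)_p = \left\| \omega_k(f, \cdot\,; \delta) \right\|_{L^p[a,b]},
  \qquad 1 \le p < +\infty
```

Since \(\omega_k(f, x; \delta)\) is measured in the \(L^{p}\)-norm rather than uniformly, \(\tau_k\) is well suited to quantitative estimates for discrete sampling-type operators acting on merely bounded and measurable functions, which is the setting of the article.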
ISSN:3023-5294