Exploration on Robustness of Exponentially Global Stability of Recurrent Neural Networks with Neutral Terms and Generalized Piecewise Constant Arguments


Bibliographic Details
Main Authors: Wenxiao Si, Tao Xie, Biwen Li
Format: Article
Language:English
Published: Wiley 2021-01-01
Series:Discrete Dynamics in Nature and Society
Online Access:http://dx.doi.org/10.1155/2021/9941881
Description
Summary:Motivated by the interference that piecewise constant arguments (PCAs) and neutral terms (NTs) introduce into the original system, and by their significant applications in signal transmission, we explore the robustness of the exponentially global stability (EGS) of recurrent neural networks (RNNs) with PCAs and NTs (NPRNNs). The central question is: how large an interval length of the PCAs, and how wide a range of the NTs, can an NPRNN tolerate while remaining exponentially stable? We therefore derive two key indicators: the maximum interval length of the PCAs and the admissible bound on the NT compression coefficient under which the NPRNN remains exponentially stable. Moreover, we prove that if the interval length of the PCAs and the bound on the NT compression coefficient are both below the values given herein, the perturbed NPRNN retains global exponential stability. Finally, two numerical examples verify the effectiveness of the derived results.
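The record does not reproduce the model equations. As a point of orientation only, a generic neutral-type RNN with a piecewise constant argument, of the kind commonly studied in this literature, can be sketched as follows; every symbol here is an assumption for illustration, not taken from the article itself:

```latex
% Sketch of a generic neutral-type RNN with a piecewise constant argument.
% x(t): state vector;  D: self-feedback matrix;  A, B: connection weights;
% E: neutral-term coefficient (its norm plays the role of the NT
% compression coefficient);  f, g: activation functions;
% \gamma(t): piecewise constant argument, constant on intervals of
% length at most h (the PCA interval length).
\frac{d}{dt}\bigl[x(t) - E\,x(t-\tau)\bigr]
  = -D\,x(t) + A\,f\bigl(x(t)\bigr) + B\,g\bigl(x(\gamma(t))\bigr)
```

In this reading, the two indicators in the abstract correspond to an upper bound on the interval length h of \gamma(t) and an upper bound on \|E\|: if both stay below the thresholds derived in the paper, exponential stability of the undisturbed RNN carries over to the NPRNN.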
ISSN:1026-0226
1607-887X