Global Robust Exponential Dissipativity for Interval Recurrent Neural Networks with Infinity Distributed Delays

Bibliographic Details
Main Authors: Xiaohong Wang, Huan Qi
Format: Article
Language: English
Published: Wiley 2013-01-01
Series: Abstract and Applied Analysis
Online Access: http://dx.doi.org/10.1155/2013/585709
Summary: This paper is concerned with the robust dissipativity problem for interval recurrent neural networks (IRNNs) with general activation functions, continuous time-varying delays, and infinite distributed time delays. By employing a new differential inequality, constructing two different kinds of Lyapunov functions, and dropping the requirement that the activation functions be bounded, monotonic, and differentiable, several sufficient conditions are established that guarantee global robust exponential dissipativity for the addressed IRNNs in terms of linear matrix inequalities (LMIs), which can be easily checked with the LMI Control Toolbox in MATLAB. Furthermore, specific estimates of the positive invariant and globally exponentially attractive sets of the addressed system are also derived. Compared with previous results in the literature, the results obtained in this paper improve and extend earlier global dissipativity conclusions. Finally, two numerical examples are provided to demonstrate the effectiveness of the proposed results.
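The summary notes that the dissipativity conditions are expressed as LMIs and verified numerically (the paper itself uses MATLAB's LMI Control Toolbox). The paper's actual LMIs are not reproduced in this record, so the Python sketch below, built on cvxpy with hypothetical matrices A and Q and a generic Lyapunov-type condition, only illustrates the kind of feasibility check such conditions require; it is not the paper's criterion.

import cvxpy as cp
import numpy as np

# Illustrative sketch only: check feasibility of a generic Lyapunov-type LMI
#   A^T P + P A + Q <= -eps*I,   P >= eps*I,
# for hypothetical data A, Q. This mimics the role of a feasibility solver
# (e.g., MATLAB's LMI Control Toolbox) for LMI-based dissipativity conditions.

n = 3
A = np.array([[-4.0, 0.5, 0.2],
              [0.3, -5.0, 0.1],
              [0.2, 0.4, -6.0]])   # hypothetical system matrix
Q = np.eye(n)                      # hypothetical positive semidefinite term
eps = 1e-6                         # margin enforcing strict definiteness

P = cp.Variable((n, n), symmetric=True)
constraints = [
    P >> eps * np.eye(n),                        # P positive definite
    A.T @ P + P @ A + Q << -eps * np.eye(n),     # Lyapunov-type LMI
]
prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
prob.solve(solver=cp.SCS)

print("LMI feasible:", prob.status == cp.OPTIMAL)

If the problem is feasible, the returned P plays the same role as the Lyapunov matrices appearing in LMI-based dissipativity criteria; infeasibility simply means this sketched condition does not certify anything for the chosen A and Q.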
ISSN: 1085-3375, 1687-0409