Improved Stability Criteria of Static Recurrent Neural Networks with a Time-Varying Delay

Bibliographic Details
Main Authors: Lei Ding, Hong-Bing Zeng, Wei Wang, Fei Yu
Format: Article
Language: English
Published: Wiley 2014-01-01
Series: The Scientific World Journal
Online Access: http://dx.doi.org/10.1155/2014/391282
Description
Summary: This paper investigates the stability of static recurrent neural networks (SRNNs) with a time-varying delay. Based on the complete delay-decomposing approach and the quadratic separation framework, a novel Lyapunov-Krasovskii functional is constructed. By employing a reciprocally convex technique to account for the relationship between the time-varying delay and its varying interval, improved delay-dependent stability conditions are derived in terms of linear matrix inequalities (LMIs). Finally, a numerical example is provided to demonstrate the merits and effectiveness of the proposed method.
ISSN: 2356-6140, 1537-744X
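
Note: the record gives only the abstract. As a minimal sketch, delay-dependent criteria of this kind are typically stated for the standard static neural network model with a time-varying delay; the symbols below (A, W, J, h, mu, k_i) are assumptions for illustration and are not taken from this record:

% Standard SRNN model with a time-varying delay (assumed setup, not from this record)
\begin{aligned}
  \dot{x}(t) &= -A x(t) + f\bigl(W x(t - \tau(t)) + J\bigr), \\
  0 &\le \tau(t) \le h, \qquad \dot{\tau}(t) \le \mu, \\
  0 &\le \frac{f_i(a) - f_i(b)}{a - b} \le k_i, \quad a \ne b, \ i = 1, \dots, n,
\end{aligned}

where A is a positive diagonal matrix, W is the connection weight matrix, J is a constant input, and the activation functions f_i satisfy the sector condition above. Stability conditions of the kind described in the abstract are then feasibility tests on LMIs built from a Lyapunov-Krasovskii functional over this model class.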