Improved Stability Criteria of Static Recurrent Neural Networks with a Time-Varying Delay
| Field | Value |
| --- | --- |
| Main Authors | |
| Format | Article |
| Language | English |
| Published | Wiley, 2014-01-01 |
| Series | The Scientific World Journal |
| Online Access | http://dx.doi.org/10.1155/2014/391282 |
| ISSN | 2356-6140, 1537-744X |

Summary: This paper investigates the stability of static recurrent neural networks (SRNNs) with a time-varying delay. Based on a complete delay-decomposition approach and the quadratic separation framework, a novel Lyapunov-Krasovskii functional is constructed. By employing a reciprocally convex technique to exploit the relationship between the time-varying delay and its variation interval, improved delay-dependent stability conditions are derived in terms of linear matrix inequalities (LMIs). Finally, a numerical example is provided to demonstrate the merits and effectiveness of the proposed methods.
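The conditions summarized above are expressed as LMIs, which are verified numerically with a semidefinite-programming solver. As a rough illustration of that workflow only (not the paper's improved criteria), the sketch below checks the classical delay-independent Lyapunov-Krasovskii LMI for a linear delayed system x'(t) = A x(t) + A_d x(t - tau) using Python with cvxpy; the system matrices, the SCS solver choice, and the LMI itself are illustrative assumptions rather than anything taken from the paper.

```python
import numpy as np
import cvxpy as cp

# Hypothetical linear delayed system x'(t) = A x(t) + Ad x(t - tau)
# (illustrative matrices only; not the paper's numerical example)
A = np.array([[-2.0, 0.0],
              [0.0, -2.0]])
Ad = np.array([[0.5, 0.0],
               [0.0, 0.5]])
n = A.shape[0]

# Decision variables of the Lyapunov-Krasovskii functional
# V(x_t) = x(t)' P x(t) + integral over [t - tau, t] of x(s)' Q x(s) ds
P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)

eps = 1e-6
# Classical delay-independent stability LMI:
# [[A'P + P A + Q, P Ad], [Ad' P, -Q]] < 0 with P > 0, Q > 0
lmi = cp.bmat([[A.T @ P + P @ A + Q, P @ Ad],
               [Ad.T @ P, -Q]])

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               lmi << -eps * np.eye(2 * n)]

# Pure feasibility problem: any feasible (P, Q) certifies stability
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve(solver=cp.SCS)
print("LMI feasible, stability certified:", problem.status == cp.OPTIMAL)
```

The paper's delay-dependent criteria replace this single LMI with larger block matrices produced by delay decomposition and reciprocally convex bounding, but the numerical verification step is the same kind of semidefinite feasibility problem.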