Neural Networks 102 (2018) 1–9
Stability analysis for discrete-time stochastic memristive neural networks with both leakage and probabilistic delays✩

Hongjian Liu a,b, Zidong Wang c, Bo Shen a,*, Tingwen Huang d, Fuad E. Alsaadi e

a School of Information Science and Technology, Donghua University, Shanghai 200051, China
b School of Mathematics and Physics, Anhui Polytechnic University, Wuhu 241000, China
c Department of Computer Science, Brunel University London, Uxbridge, Middlesex, UB8 3PH, United Kingdom
d Texas A&M University at Qatar, Doha 23874, Qatar
e Department of Electrical and Computer Engineering, Faculty of Engineering, King Abdulaziz University, Jeddah 21589, Saudi Arabia
Article info
Article history:
Received 20 September 2017
Received in revised form 5 December 2017
Accepted 2 February 2018
Available online 15 February 2018
Keywords:
Discrete-time networks
Memristive neural networks
Stochastic neural networks
Exponential stability
Leakage delays
Probabilistic time-varying delays
Abstract
This paper is concerned with the global exponential stability problem for a class of discrete-time stochastic memristive neural networks (DSMNNs) with both leakage delays and probabilistic time-varying delays. For the probabilistic delays, a sequence of Bernoulli distributed random variables is utilized to determine within which intervals the time-varying delays fall at a given time instant. The sector-bounded activation function is considered in the addressed DSMNN. By taking into account the state-dependent characteristics of the network parameters and choosing an appropriate Lyapunov–Krasovskii functional, some sufficient conditions are established under which the underlying DSMNN is globally exponentially stable in the mean square. The derived conditions depend on both the leakage and the probabilistic delays, and are therefore less conservative than traditional delay-independent criteria. A simulation example is given to show the effectiveness of the proposed stability criterion.
© 2018 Elsevier Ltd. All rights reserved.
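The interval-switching delay mechanism described in the abstract can be illustrated with a small numerical sketch. All parameter values below (the probability p0 and the interval bounds) are hypothetical and not taken from the paper; the sketch only shows the idea that a Bernoulli variable decides, at each time instant, which of two intervals the time-varying delay falls into.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

# Hypothetical parameters (not from the paper): with probability p0 the
# delay lies in [d_min, d_mid]; otherwise it lies in (d_mid, d_max].
p0, d_min, d_mid, d_max = 0.7, 1, 3, 6

def sample_delay():
    # Bernoulli variable alpha(k): 1 -> first interval, 0 -> second interval.
    alpha = rng.random() < p0
    if alpha:
        return int(rng.integers(d_min, d_mid + 1))   # delay in [1, 3]
    return int(rng.integers(d_mid + 1, d_max + 1))   # delay in [4, 6]

delays = [sample_delay() for _ in range(10000)]
frac_first = np.mean([d <= d_mid for d in delays])
print(f"fraction of delays in the first interval: {frac_first:.2f}")
```

Over many samples, the empirical fraction of delays in the first interval approaches p0, which is how the Bernoulli sequence encodes the occurrence probability of each delay interval in the stability analysis.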
1. Introduction
For decades, it has been generally recognized that recurrent neural networks (RNNs) are capable of self-organization, self-learning, nonlinear function approximation and fault tolerance. In fact, RNNs have been successfully applied in a variety of practical domains including, but not limited to, signal processing, control engineering, pattern recognition, image processing and combinatorial optimization. These applications depend heavily on the dynamic behaviors of the RNNs. Accordingly, the dynamics analysis problem for RNNs has become a hot research topic attracting ever-increasing interest, and a great many excellent
results have been reported in the literature, see e.g. Liang, Gong, and Huang (2016), Liu, Wang, and Liu (2006), Liu, Wang, and Liu (2008), Shen, Wang, and Qiao (2017), Wang, Liu, and Liu (2005), Zhang, He, Jiang, Lin, and Wu (2017), Zhang, Tang, Wong, and Miao (2015) and the references therein. In particular, the global stability of RNNs is arguably the most desirable dynamic property; it has attracted a great deal of research attention and plays a vitally important role in practical settings such as optimization problems (Zhang, Wang, & Liu, 2014).

✩ This work was supported in part by the National Natural Science Foundation of China under Grants 61503001, 61525305 and 61473076, the Program for Professor of Special Appointment (Eastern Scholar) at Shanghai Institutions of Higher Learning, the Natural Science Foundation of Universities in Anhui Province under Grants KJ2015A088 and TSKJ2015B17, the Teacher Scholarship Fund of Anhui Polytechnic University 2017, the Royal Society of the UK, and the Alexander von Humboldt Foundation of Germany.
* Corresponding author.
E-mail addresses: Zidong.Wang@brunel.ac.uk (Z. Wang), Bo.Shen@dhu.edu.cn (B. Shen).
Since the first announcement from the HP Lab on the experimental prototyping of the memristor, memristive devices have been widely investigated for their potential applications in non-volatile memories, logic devices, neuromorphic devices, and neuromorphic self-organized computation and learning; see Adamatzky and Chua (2013) and Strukov, Snider, Stewart, and Williams (2008) for more details. On the other hand, it is well known that NNs can be implemented by very-large-scale integration and, in such implementations, it is natural to replace the resistors with memristors in order to exploit the aforementioned advantages of memristors. This gives rise to a new kind of neural networks, namely, memristive neural networks (MNNs). Indeed, in the past few years, MNNs have already been used in application areas such as brain emulation, combinatorial optimization and knowledge acquisition, see e.g. Pedretti et al. (2017) and Pershin and Di Ventra (2010), where the dynamical behaviors (especially the global stability) of the MNNs play a critically important role in the success of the MNN applications. In this regard, along with the
https://doi.org/10.1016/j.neunet.2018.02.003
0893-6080/© 2018 Elsevier Ltd. All rights reserved.