Physics Letters A 376 (2012) 412–417
Learning-induced pattern classification in a chaotic neural network
Yang Li a,b, Ping Zhu a, Xiaoping Xie a, Guoguang He a,∗, Kazuyuki Aihara c

a Department of Physics, Faculty of Science, Zhejiang University, Hangzhou 310027, PR China
b Key Laboratory of Infrared Imaging Materials and Detectors, Shanghai Institute of Technical Physics, Chinese Academy of Sciences, Shanghai 200083, PR China
c Institute of Industrial Science, The University of Tokyo, Tokyo 153-8505, Japan

∗ Corresponding author. E-mail address: gghe@zju.edu.cn (G. He).
Article history:
Received 1 April 2011
Received in revised form 21 September 2011
Accepted 25 October 2011
Available online 23 November 2011
Communicated by A.R. Bishop
Keywords:
Pattern classification
Learning rule
Passive forgetting
Chaotic neural networks
In this Letter, we propose a Hebbian learning rule with passive forgetting (HLRPF) for use in a chaotic neural network (CNN). We then define indices based on the Euclidean distance to investigate the evolution of the weights in a simplified way. Numerical simulations demonstrate that, under suitable external stimulations, the CNN with the proposed HLRPF acts as a fuzzy-like pattern classifier that performs much better than an ordinary CNN. The results imply a relationship between learning and recognition.
© 2011 Elsevier B.V. All rights reserved.
1. Introduction
In the past three decades, various artificial neural network models have been proposed and investigated because of their potential application to flexible intelligent information processing [1–3]. Along with the discovery of chaotic behavior in biological systems, artificial neural network models that exhibit chaotic dynamics have been drawing increasing attention [4–7]. One such chaotic neural network (CNN) model was proposed [4] on the basis of electrophysiological experiments on squid giant axons [8,9]. Both the network and its single constituent neuron take on chaotic dynamics when suitable parameter values are chosen [4,10]. Further, intensive studies have demonstrated that the CNN model is a promising approach to tasks such as memory retrieval, pattern recognition, combinatorial optimization, and multistable perception [10–17].
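For concreteness, the single chaotic neuron underlying this model [4] can be written in the common discrete-time form

y(t+1) = k\,y(t) - \alpha f\bigl(y(t)\bigr) + a, \qquad x(t+1) = f\bigl(y(t+1)\bigr), \qquad f(y) = \frac{1}{1 + e^{-y/\varepsilon}},

where y is the internal state, x the continuous output, k the decay parameter, \alpha the refractory scaling, a a bias term including any external input, and \varepsilon the steepness of the sigmoid output function. The symbols follow common usage in the CNN literature and are quoted here only as background; for suitable parameter values this map exhibits chaotic responses consistent with the squid-axon experiments cited above.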
In application to associative memory, the synaptic weights between individual neurons in an ordinary CNN are usually determined by a symmetric auto-associative matrix [10]. Several basal patterns are embedded in the constant synaptic weights, and the output of the CNN wanders chaotically among all these stored patterns. Furthermore, the widely known phenomena of synaptic plasticity provide powerful models for studies on CNNs with changeable synapses [18–20]. Generally speaking, these studies fall into two categories. One analyzes successive learning strategies that enable
a CNN to gain new memories, whereas the other considers the mechanism of dynamic synapses with no memory update but an improvement in the network's information processing ability. For example, Watanabe et al. [18] proposed a fully local algorithm by which the CNN can detect and learn an unknown pattern automatically; Wang et al. [20] incorporated synaptic depression into the original CNN and found that the refined model becomes a peculiar pattern recognition system. However, the learning-induced function of pattern recognition or classification in a CNN has not been widely reported.
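For background, the symmetric auto-associative matrix mentioned above is conventionally built from P binary basal patterns x^p = (x_1^p, \ldots, x_N^p), with x_i^p \in \{0, 1\}, by the Hebbian prescription used in [10]:

w_{ij} = \frac{1}{P} \sum_{p=1}^{P} \bigl(2x_i^p - 1\bigr)\bigl(2x_j^p - 1\bigr),

which is symmetric (w_{ij} = w_{ji}) and embeds all basal patterns into one constant weight matrix; the chaotic itinerancy of the network output among the stored patterns then follows from the neuron dynamics rather than from any change in the weights.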
The main aim of this Letter is to show that learning in a CNN can enhance its pattern classification ability. We propose a variant Hebbian learning rule with passive forgetting for weight updating, and define indices for the weight projection to investigate the evolution of the weights. Thereafter, we carry out numerical simulations to investigate the retrieval properties of the proposed model. In Section 2, the CNN model and its associative memory are briefly discussed. The proposed learning rule and observation method are presented in Section 3, followed by the simulation results in Section 4. The conclusion and discussion are given in Section 5.
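As a preview of Section 3, a Hebbian rule with passive forgetting generically combines exponential decay of the weights with a Hebbian correlation term. A minimal sketch, with illustrative rate constants \beta and \gamma that are not necessarily those of the HLRPF defined later, is

w_{ij}(t+1) = (1 - \beta)\, w_{ij}(t) + \gamma\, x_i(t)\, x_j(t), \qquad 0 < \beta < 1,

where \beta sets how fast old correlations are passively forgotten and \gamma scales the strength of newly learned correlations.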
2. The chaotic neural network model
A chaotic neuron model was derived with a continuous output function and relative refractoriness to model the responses of a biological neuron. Using this kind of chaotic neuron, a CNN is constructed by considering the spatiotemporal summation of both the external inputs and the feedback inputs [4].
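In the standard formulation [4,10], the dynamics of the i-th neuron in an N-neuron CNN separate the feedback and refractory internal states (the notation below is the common one and may differ slightly from that used later in this Letter):

\eta_i(t+1) = k_f\, \eta_i(t) + \sum_{j=1}^{N} w_{ij}\, x_j(t),
\zeta_i(t+1) = k_r\, \zeta_i(t) - \alpha\, x_i(t) + a_i,
x_i(t+1) = f\bigl(\eta_i(t+1) + \zeta_i(t+1)\bigr),

where \eta_i and \zeta_i are the internal states for the feedback inputs and the refractoriness, k_f and k_r the corresponding decay parameters, w_{ij} the synaptic weight from neuron j to neuron i, a_i a term comprising the threshold and any external input, and f the sigmoid output function introduced in Section 1.

The structure of the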