Adaptive Quick Learning for Associative Memory
Tomoshige Yoshihara and Masafumi Hagiwara
Keio University, Yokohama, Japan 223-8522
SUMMARY
Bidirectional associative memory (BAM) is a form
of heteroassociative memory that can recall and restore
patterns. Being a Hebbian learning-based memory, it has
the problem of very low capacity. A training paradigm
called the Pseudo-Relaxation Learning Algorithm
(PRLAB) greatly increases the memory capacity of BAM.
The Quick Learning training algorithm, which combines
Hebbian learning with BAM, has further increased the storage
capacity and robustness to noisy inputs while greatly reducing
the number of iterations. In these learning algorithms,
however, if a solution domain does not exist for the training
set, learning of the connection weights will not converge and
recall of the training patterns is not guaranteed. This paper
proposes a new method of solving this problem, in which
training patterns are multimodalized by attaching random
numbers to them if it is estimated that learning is not
converging. Thus even if there is a contradiction in the
simultaneous inequalities used in the training patterns, con-
vergence is artificially forced and correct recall becomes
possible. Simulations indicate the effectiveness of the new
method in both the presence and absence of untrainable
patterns. © 2000 Scripta Technica, Syst Comp Jpn, 32(1):
53–61, 2001
Key words: bidirectional associative memory
(BAM); pseudo-relaxation learning algorithm (PRLAB);
Hebbian learning; untrainable patterns; multimodalization;
forced convergence.
1. Introduction
Memory is the most important function of the brain,
and therefore understanding its mechanism is very
important. To study it, many investigations have been
carried out, based not only on physiological and
psychological approaches but also on engineering
approaches [1–5]. Among them, studies based on neural
models have been reported [1–13]. Bidirectional Associative
Memory (BAM) has a simple structure inspired by the brain:
it consists of two layers connected by symmetric synaptic
interconnections. It behaves as a heteroassociative memory
that can store and recall pattern pairs.
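The bidirectional recall just described can be sketched in code. The following is a generic illustration of correlation-based (Hebbian) BAM encoding and recall, not code from the paper; the layer sizes and example patterns are arbitrary assumptions chosen for demonstration:

```python
# Minimal correlation-based BAM sketch (illustrative assumptions only).
import numpy as np

def train_hebbian(pairs):
    """Build the weight matrix W = sum_k y_k x_k^T from bipolar (+1/-1) pairs."""
    n = len(pairs[0][0])  # size of the X layer
    m = len(pairs[0][1])  # size of the Y layer
    W = np.zeros((m, n))
    for x, y in pairs:
        W += np.outer(y, x)  # Hebbian (outer-product) correlation term
    return W

def recall(W, x, iters=10):
    """Bidirectional recall: alternate y = sgn(W x) and x = sgn(W^T y)
    until the X-layer pattern stops changing."""
    for _ in range(iters):
        y = np.where(W @ x >= 0, 1, -1)
        x_new = np.where(W.T @ y >= 0, 1, -1)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x, y

# Two arbitrary bipolar training pairs (hypothetical example data).
pairs = [
    (np.array([1, -1, 1, -1]), np.array([1, 1, -1])),
    (np.array([-1, -1, 1, 1]), np.array([-1, 1, 1])),
]
W = train_hebbian(pairs)
x_out, y_out = recall(W, pairs[0][0])  # recalls the associated Y pattern
```

With so few stored pairs the correlation rule recalls them correctly; as the text notes, capacity degrades quickly as more pairs are stored, which motivates PRLAB and Quick Learning.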
However, like most associative memories, BAM is
based on Hebbian learning and suffers from very low memory
capacity. To improve the storage capacity of BAM, several
enhancements have been proposed [2–13]. The multiple
training technique [7] can improve the storage capacity of
BAM but does not guarantee recall of all training patterns.
The method of Wang and colleagues [8], on the other hand,
guarantees recall of all training patterns, but the storage
capacity is still low.
Kawabata and colleagues [13] proposed an algorithm to
improve the storage capacity and processing time
through multimodalization of the training vectors. How-
ever, since correlation-based learning is used, the storage
capacity is still low.
A new training paradigm called Pseudo-Relaxation
Learning Algorithm for BAM (PRLAB) was proposed by
Systems and Computers in Japan, Vol. 32, No. 1, 2001
Translated from Denshi Joho Tsushin Gakkai Ronbunshi, Vol. J82-D-II, No. 1, January 1999, pp. 109–116
Contract grant sponsor: Japan Society for the Promotion of Science;
contract grant #JSPS-RFTF 96 I 00102.