G.I. Parisi, R. Kemker, J.L. Part et al. / Neural Networks 113 (2019) 54–71 57
deprivation of visual input began prior to ten weeks of age, while
no changes were observed in adult animals. Additional experiments
showed that neural patterns of cortical organization can be driven
by external environmental factors at least for a period early in
development (Hubel & Wiesel, 1962, 1970; Hubel, Wiesel, & LeVay,
1977).
The most well-known theory describing the mechanisms of
synaptic plasticity for the adaptation of neurons to external stimuli
was first proposed by Hebb (1949), postulating that when one neu-
ron drives the activity of another neuron, the connection between
them is strengthened. More specifically, Hebb's rule states that
the repeated and persistent stimulation of the postsynaptic cell
by the presynaptic cell leads to increased synaptic efficacy.
Throughout the process of development, neural systems stabilize
to shape optimal functional patterns of neural connectivity. The
simplest form of Hebbian plasticity considers a synaptic strength
w that is updated by the product of the presynaptic activity x and
the postsynaptic activity y:
∆w = x · y · η, (1)
where η is a given learning rate. However, Hebbian plasticity alone
is unstable and leads to runaway neural activity, thus requiring
compensatory mechanisms to stabilize the learning process (Abbott & Nelson, 2000; Bienenstock, Cooper, & Munro, 1982). Stability
in Hebbian systems is typically achieved by augmenting Hebbian
plasticity with additional constraints such as upper limits on the
individual synaptic weights or average neural activity (Miller &
MacKay, 1994; Song, Miller, & Abbott, 2000). Homeostatic mech-
anisms of plasticity include synaptic scaling and meta-plasticity
which directly affect synaptic strengths (Davis, 2006; Turrigiano,
2011). Without loss of generality, homeostatic plasticity can be
viewed as a modulatory effect or feedback control signal that
regulates the unstable dynamics of Hebbian plasticity (see Fig. 1a).
The feedback controller directly affects synaptic strength on the
basis of the observed neural activity and must be fast in relation to
the timescale of the unstable system (Åström & Murray, 2010). In
its simplest form, modulated Hebbian plasticity can be modelled
by introducing an additional modulatory signal m to Eq. (1) such
that the synaptic update is given by
∆w = m · x · y · η. (2)
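Under the preceding definitions, a minimal NumPy sketch can contrast the runaway dynamics of the plain Hebbian rule in Eq. (1) with a simple homeostatic constraint; the upper limit w_max and all numerical values are illustrative assumptions, not part of the original formulation:

```python
import numpy as np

def hebbian_update(w, x, y, eta=0.01, w_max=None):
    """Eq. (1): Δw = x · y · η, optionally clipped to an upper limit
    w_max as a simple stand-in for a homeostatic constraint."""
    w = w + x * y * eta
    return w if w_max is None else np.minimum(w, w_max)

# With persistently correlated pre- and post-synaptic activity, the
# unconstrained rule grows the weight without bound, while the
# constrained version saturates at w_max.
w_free, w_capped = 0.5, 0.5
for _ in range(1000):
    w_free = hebbian_update(w_free, x=1.0, y=1.0)
    w_capped = hebbian_update(w_capped, x=1.0, y=1.0, w_max=1.0)
# w_free keeps growing; w_capped saturates at the upper limit
```

In practice, biologically motivated stabilizers such as synaptic scaling or activity-dependent normalization play the role of the hard clip used here.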
Modulatory feedback in Hebbian neural networks has received
increasing attention, with different approaches proposing biologi-
cally plausible learning through modulatory loops (Grant, Tanner,
& Itti, 2017; Soltoggio et al., 2017). For a critical review of the
temporal aspects of Hebbian and homeostatic plasticity, we refer
the reader to Zenke, Gerstner et al. (2017).
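The modulatory gating in Eq. (2) can be sketched in the same style; here m stands for a generic feedback or reward-like signal, and all numbers are illustrative:

```python
def modulated_hebbian_update(w, x, y, m, eta=0.01):
    """Eq. (2): Δw = m · x · y · η. The modulatory signal m gates the
    correlational term: m = 0 blocks the update, m = 1 recovers plain
    Hebbian plasticity, and m < 0 weakens the synapse."""
    return w + m * x * y * eta

w = 0.5
w_blocked  = modulated_hebbian_update(w, x=1.0, y=1.0, m=0.0)
w_hebbian  = modulated_hebbian_update(w, x=1.0, y=1.0, m=1.0)
w_weakened = modulated_hebbian_update(w, x=1.0, y=1.0, m=-1.0)
```

In a feedback-control reading (Fig. 1a), m would itself be computed from the observed neural activity, on a faster timescale than the weight dynamics it regulates.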
Evidence on cortical function has shown that neural activity in
multiple brain areas results from the combination of bottom-up
sensory drive, top-down feedback, and prior knowledge and ex-
pectations (Heeger, 2017). In this setting, complex neurodynamic
behaviour can emerge from the dense interaction of hierarchically
arranged neural circuits in a self-organized manner (Tani, 2016).
Input-driven self-organization plays a crucial role in the brain
(Nelson, 2000), with topographic maps being a common feature
of the cortex for processing sensory input (Willshaw & von der
Malsburg, 1976). Different models of neural self-organization have
been proposed that resemble the dynamics of basic biological
findings on Hebbian-like learning and plasticity (Fritzke, 1992;
Kohonen, 1982; Marsland, Shapiro, & Nehmzow, 2002; Martinetz,
Berkovich, & Schulten, 1993), demonstrating that neural map or-
ganization results from unsupervised, statistical learning with
nonlinear approximations of the input distribution. To stabilize the
unsupervised learning process, neural network self-organization
can be complemented with top-down feedback such as task-
relevant signals that modulate the intrinsic map plasticity (Parisi
et al., 2018; Soltoggio et al., 2017). In a hierarchical processing
regime, neural detectors have increasingly large spatio-temporal
receptive fields to encode information over larger spatial and tem-
poral scales (Hasson, Yang, Vallines, Heeger, & Rubin, 2008; Taylor,
Hobbs, Burroni, & Siegelmann, 2015). Thus, higher-level layers
can provide the top-down context for modulating the bottom-
up sensory drive in lower-level layers. For instance, bottom-up
processing is responsible for encoding the co-occurrence statis-
tics of the environment while error-driven signals modulate this
feedforward process according to top-down, task-specific factors
(Murray et al., 2016). Together, these models contribute to a better
understanding of the underlying neural mechanisms for the devel-
opment of hierarchical cortical organization.
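The input-driven self-organization described above can be illustrated with a minimal competitive-learning sketch in the spirit of Kohonen (1982); the neighbourhood cooperation of a full self-organizing map is omitted for brevity, and all sizes, rates, and distributions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
prototypes = rng.normal(size=(4, 2))   # 4 neural units with 2-D weight vectors

def som_step(prototypes, x, eta=0.1):
    # Competition: find the best-matching unit (closest prototype).
    bmu = np.argmin(np.linalg.norm(prototypes - x, axis=1))
    # Adaptation: move the winner toward the input (Hebbian-like update).
    prototypes[bmu] += eta * (x - prototypes[bmu])
    return bmu

# Driven by samples from the input distribution, the prototypes drift
# toward its statistics, forming a nonlinear approximation of the input.
data = rng.normal(loc=[2.0, 2.0], scale=0.3, size=(500, 2))
for x in data:
    som_step(prototypes, x)
```

Task-relevant top-down signals, as in the models cited above, would enter such a sketch by modulating the learning rate eta or gating which updates are applied.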
2.3. The complementary learning systems
The brain learns and memorizes. The former task is characterized by the extraction of the statistical structure of perceived events, with the aim of generalizing to novel situations. The latter, conversely, requires the collection of separate, episodic-like events. Consequently, the brain must comprise mechanisms to generalize across experiences while concurrently retaining episodic memories.
Sophisticated cognitive functions rely on canonical neural cir-
cuits replicated across multiple areas (Douglas, Koch, Mahowald,
Martin, & Suarez, 1995). However, although there are shared struc-
tural properties, different brain areas operate at multiple timescales
and learning rates, thus differing significantly from each other
in a functional way (Benna & Fusi, 2016; Fusi, Drew, & Abbott,
2005). A prominent example is the complementary contribution
of the neocortex and the hippocampus in learning and memory
consolidation (McClelland et al., 1995; O’Reilly, 2004; O’Reilly &
Norman, 2002). The complementary learning systems (CLS) the-
ory (McClelland et al., 1995) holds that the hippocampal system
exhibits short-term adaptation and allows for the rapid learning
of novel information which will, in turn, be played back over
time to the neocortical system for its long-term retention (see
Fig. 1b). More specifically, the hippocampus employs a rapid learn-
ing rate and encodes sparse representations of events to mini-
mize interference. Conversely, the neocortex is characterized by
a slow learning rate and builds overlapping representations of
the learned knowledge. Therefore, the interplay of hippocampal
and neocortical functionality is crucial to concurrently learn reg-
ularities (statistics of the environment) and specifics (episodic
memories). Both brain areas are known to learn via Hebbian and
error-driven mechanisms (O’Reilly & Rudy, 2000). In the neocortex,
feedback signals will yield task-relevant representations while, in
the case of the hippocampus, error-driven modulation can switch
its functionality between pattern discrimination and completion for
recalling information (O’Reilly, 2004).
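The CLS division of labour can be caricatured in a short sketch: a fast, instance-based store memorizes episodes immediately, and replay gradually consolidates their statistics into a slow learner. The class names, learning rates, and replay schedule are illustrative assumptions, not part of the theory:

```python
import random

class FastEpisodicStore:
    """Hippocampus-like: rapid, sparse, instance-based storage."""
    def __init__(self):
        self.episodes = []

    def memorize(self, event):
        self.episodes.append(event)   # one-shot encoding of specifics

class SlowStatisticalLearner:
    """Neocortex-like: slow learning rate, overlapping representations."""
    def __init__(self, lr=0.01):
        self.lr = lr
        self.estimate = 0.0

    def consolidate(self, event):
        # Small, interleaved update toward the replayed event.
        self.estimate += self.lr * (event - self.estimate)

hippocampus, neocortex = FastEpisodicStore(), SlowStatisticalLearner()
for event in [1.0, 3.0, 2.0]:
    hippocampus.memorize(event)       # rapid learning of novel information

# Replay stored episodes over time into the slow system; its estimate
# converges toward the statistics (here, the mean) of the experiences.
random.seed(0)
for _ in range(2000):
    neocortex.consolidate(random.choice(hippocampus.episodes))
```

The sketch captures only the rate dichotomy; the hippocampal pattern-separation and pattern-completion dynamics discussed above are not modelled.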
Studies show that adult neurogenesis contributes to the forma-
tion of new memories (Altman, 1963; Cameron, Woolley, McEwen,
& Gould, 1993; Eriksson et al., 1998; Gage, 2000). It has been
debated whether human adults grow significant numbers of new
neurons. Recent research has suggested that hippocampal neuro-
genesis drops sharply in children to undetectable levels in adult-
hood (Sorrells et al., 2018). On the other hand, other studies suggest
that hippocampal neurogenesis sustains human-specific cogni-
tive function throughout life (Boldrini, Fulmore, Tartt, Simeon, &
Pavlova, 2018). During neurogenesis, the hippocampus’ dentate
gyrus uses new neural units to quickly assimilate and immediately
recall new information (Altman, 1963; Eriksson et al., 1998). During initial memory formation, the new neural progenitor cells exhibit high levels of plasticity; as time progresses, this plasticity decreases, making the new memory more stable (Deng, Aimone, & Gage, 2010). In addition to neurogenesis, neurophysiological