
PHYSICAL REVIEW E 68, 031910 共2003兲
Synapse efficiency diverges due to synaptic pruning following overgrowth
Kazushi Mimura,1,* Tomoyuki Kimoto,2 and Masato Okada3
1 Department of Electrical Engineering, Kobe City College of Technology, Gakuenhigashi-machi 8-3, Nishi-ku, Kobe, Hyogo 651-2194, Japan
2 Oita National College of Technology, Maki 1666, Oita-shi, Oita 870-0152, Japan
3 Laboratory for Mathematical Neuroscience, RIKEN Brain Science Institute, Saitama 351-0198, Japan;
"Intelligent Cooperation and Control," PRESTO, JST, c/o RIKEN BSI, Saitama 351-0198, Japan;
and ERATO Kawato Dynamic Brain Project, 2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
(Received 23 July 2002; revised manuscript received 26 March 2003; published 22 September 2003)
In the development of the brain, it is known that synapses are pruned following overgrowth. This pruning following overgrowth seems to be a universal phenomenon that occurs in almost all areas: visual cortex, motor area, association area, and so on. It has been shown numerically that the synapse efficiency is increased by systematic deletion. We discuss the synapse efficiency in order to evaluate the effect of pruning following overgrowth, and analytically show that the synapse efficiency diverges as $O(|\ln c|)$ in the limit where the connecting rate $c$ is extremely small. Under a fixed synapse number criterion, there exists an optimal connecting rate that maximizes the memory performance.
DOI: 10.1103/PhysRevE.68.031910
PACS number(s): 87.10.+e, 89.70.+c, 05.90.+m
I. INTRODUCTION
In this paper, we analytically discuss synapse efficiency to
evaluate effects of pruning following overgrowth during
brain development, within the framework of autocorrelation-type associative memory.
Because this pruning following overgrowth seems to be a
universal phenomenon that occurs in almost all areas—visual
cortex, motor area, association area, and so on [1–10], we discuss the meaning of its function from a universal viewpoint rather than in terms of particular properties in each
area. Of course, to discuss this phenomenon as a universal
property of a neural network model, we need to choose an
appropriate model.
Artificial neural network models are roughly classified
into two types: feed forward models and recurrent models.
Various learning rules are applied to the architectures of
these models, and correlation learning corresponding to the
Hebb rule can be considered a prototype of other learning rules. For instance, correlation learning can be regarded
as a first-order approximation of the orthogonal projection
matrix, because the orthogonal projection matrix can be expanded in correlation matrices [11]. In this respect, we can
naturally regard a correlation-type associative memory
model as one prototype of the neural network models of the
brain. For example, Amit et al. discussed the function of the
column of anterior ventral temporal cortex by means of a
model based on the correlation-type associative memory model [12,13]. Also, Sompolinsky discussed the effect of dilution. He assumed that the capacity is proportional to the number of remaining bonds, and pointed out that the synapse efficiency of a diluted network, which is the storage capacity per connecting rate, is higher than that of a fully connected network [14].
Chechik et al. discussed the significance of the function of
the pruning following overgrowth on the basis of a
*Electronic address: [email protected]
correlation-type associative memory model [15]. They pointed out that the memory performance, which is the number of stored patterns per synapse, is maximized by a systematic deletion that cuts lightly weighted synapses. However, while it is qualitatively obvious that the synapse efficiency and the memory performance are increased by systematic deletion, the increase in the synapse efficiency also needs to be evaluated quantitatively.
In this paper, we quantitatively compare the effectiveness of systematic deletion with that of random deletion on the basis of an autocorrelation-type associative memory model. In this model, each neuron is connected to the other neurons with a proportion $c$, where $c$ is called the connecting rate. Systematic deletion can be considered a kind of nonlinear correlation learning [16]. In the limit where the number of neurons $N$ is extremely large, it is known that random deletion and nonlinear correlation learning can be transformed into correlation learning with synaptic noise [14,16]. These two types of deletion, systematic and random, are thus strongly related to multiplicative synaptic noise. First, we investigate the dependence of the storage capacity on the multiplicative synaptic noise. In the limit where the multiplicative synaptic noise is extremely large, we show that the storage capacity is inversely proportional to the variance of the multiplicative synaptic noise. From this result, we analytically derive that the synapse efficiency in the case of systematic deletion diverges as $O(|\ln c|)$ in the limit where the connecting rate $c$ is extremely small. We also show that the synapse efficiency in the case of systematic deletion becomes $2|\ln c|$ times as large as that of random deletion.
In addition to such a fixed neuron number criterion as the synapse efficiency, a fixed synapse number criterion can be discussed. At the synaptic growth stage, it is natural to assume that metabolic energy resources are restricted. When metabolic energy resources are limited, it is also important to discuss the effect of synaptic pruning under a limited synapse number. Under this criterion, there exists an optimal connecting rate that maximizes the memory performance. These optimal connecting rates are in agreement with the computer simulation results given by Chechik et al. [15].
©2003 The American Physical Society
II. MODEL
Sompolinsky discussed the effects of synaptic noise and nonlinear synapses by means of the replica method [14]. However, symmetry of the synaptic connections, $J_{ij} = J_{ji}$, is required in the replica method, since the existence of a Lyapunov function is necessary. Therefore, in the Sompolinsky theory the symmetry of the synaptic noise had to be assumed. To avoid this problem, Okada et al. discussed additive synaptic noise, multiplicative synaptic noise, random synaptic deletion, and nonlinear synapses by means of the self-consistent signal-to-noise analysis (SCSNA) [16]. They showed that additive synaptic noise, random synaptic deletion, and nonlinear synapses can be transformed into multiplicative synaptic noise.
Here, we discuss the synchronous dynamics

$$x_i = F\left(\sum_{j\ne i}^{N} J_{ij}x_j + h\right), \qquad (1)$$

where $F$ is the response function, $x_i$ is the output activity of neuron $i$, and $-h$ is the threshold of each neuron. Every component $\xi_i^\mu$ of a memorized pattern $\xi^\mu$ is an independent random variable,

$$\mathrm{Prob}[\xi_i^\mu = \pm 1] = \frac{1 \pm a}{2}, \qquad (2)$$

and the generated patterns are called sparse patterns with bias $a$ ($-1 < a < 1$). We have determined that the firing rate of states in the retrieval phase is the same for each memorized pattern [17,18]. In this case, the threshold $-h$ can be determined from

$$a = \frac{1}{N}\sum_{i=1}^{N}\mathrm{sgn}\left(\sum_{j\ne i}^{N} J_{ij}x_j + h\right). \qquad (3)$$

The firing rate becomes $f = (1+a)/2$ at the bias $a$.
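The pattern statistics of Eq. (2) are easy to check numerically. The following is our own illustrative sketch (the values of N, p, a, and the random seed are arbitrary choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

a = -0.8                 # bias of the sparse patterns; f = (1 + a)/2 = 0.1
N, p = 1000, 50          # network size and number of patterns (illustrative)

# Each component xi_i^mu is +1 with probability (1 + a)/2 and -1 otherwise, Eq. (2).
xi = np.where(rng.random((p, N)) < (1 + a) / 2, 1, -1)

# The empirical firing rate of the patterns should be close to f = (1 + a)/2.
f_emp = (xi == 1).mean()
print(f_emp)  # close to 0.1
```

This matches the firing rate f = 0.1 used in the simulations reported below.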
Additive synaptic noise, multiplicative synaptic noise, random synaptic deletion, and nonlinear synapses can be introduced through the synaptic connections in the following manner.

In the case of additive synaptic noise, the synaptic connections are constituted as

$$J_{ij} = \frac{J}{N(1-a^2)}\sum_{\mu=1}^{\alpha N}(\xi_i^\mu - a)(\xi_j^\mu - a) + \delta_{ij}, \qquad (4)$$

where $\delta_{ij}$ is the additive synaptic noise. The symmetric additive synaptic noises $\delta_{ij}$ and $\delta_{ji}$ are generated according to the probability

$$\delta_{ij} \sim N\!\left(0,\ \frac{\delta_A^2}{N}\right), \qquad \delta_{ij} = \delta_{ji}, \qquad (5)$$

where $\delta_A^2$ is the absolute strength of the additive synaptic noise. The parameter $\delta_A$ is assumed to be $O(1)$; this means that the synaptic connection $J_{ij}$ is $O(1/\sqrt{N})$. It is useful to define the parameter $\Delta_A$ as

$$\Delta_A \equiv \frac{\delta_A}{J/(1-a^2)}, \qquad (6)$$

which measures the relative strength of the noise; we call the parameter $\Delta_A^2$ the variance of the additive synaptic noise. Therefore, we define the probability to generate the additive synaptic noise $\delta_{ij}$ as

$$\delta_{ij} \sim N\!\left(0,\ \frac{J^2}{N(1-a^2)^2}\,\Delta_A^2\right), \qquad \delta_{ij} = \delta_{ji}. \qquad (7)$$

In the case of multiplicative synaptic noise, the synaptic connections are constituted as

$$J_{ij} = \frac{1+\varepsilon_{ij}}{N(1-a^2)}\sum_{\mu=1}^{\alpha N}(\xi_i^\mu - a)(\xi_j^\mu - a), \qquad (8)$$

where $\varepsilon_{ij}$ is the multiplicative synaptic noise. The symmetric multiplicative noises $\varepsilon_{ij}$ and $\varepsilon_{ji}$ are generated according to the probability

$$\varepsilon_{ij} \sim N(0, \Delta_M^2), \qquad \varepsilon_{ij} = \varepsilon_{ji}, \qquad (9)$$

where $\Delta_M^2$ is the variance of the multiplicative synaptic noise.

In the model of random synaptic deletion, the synaptic connections are constituted as

$$J_{ij} = \frac{c_{ij}}{Nc(1-a^2)}\sum_{\mu=1}^{\alpha N}(\xi_i^\mu - a)(\xi_j^\mu - a), \qquad (10)$$

where $c_{ij}$ is a cut coefficient. A synapse that is cut is represented by the cut coefficient $c_{ij} = 0$. In the case of symmetric random deletion, the cut coefficients $c_{ij}$ and $c_{ji}$ are generated according to the probability

$$\mathrm{Prob}[c_{ij}=1] = 1 - \mathrm{Prob}[c_{ij}=0] = c, \qquad c_{ij} = c_{ji}, \qquad (11)$$

where $c$ is the connecting rate.

In the model of nonlinear synapses [27], the synaptic connections are constituted as

$$J_{ij} = \frac{\sqrt{p}}{N}\,f(T_{ij}), \qquad (12)$$

$$T_{ij} = \frac{1}{\sqrt{p}(1-a^2)}\sum_{\mu=1}^{p}(\xi_i^\mu - a)(\xi_j^\mu - a) \sim N(0,1), \qquad (13)$$

where $p = \alpha N$. The nonlinear synapse is introduced by applying the nonlinear function $f(x)$ to the conventional Hebbian connection matrix $T_{ij}$.

Systematic deletion by nonlinear synapse

Chechik et al. pointed out that the memory performance, which is the number of stored patterns per synapse, is maximized by a systematic deletion that cuts lightly weighted synapses [12,13,15]. Such a systematic deletion can be represented by the nonlinear function $f(x)$ of a nonlinear synapse. In accordance with Chechik et al., we discuss three types of nonlinear functions (Figs. 1–3).

FIG. 1. Clipped synapse.

FIG. 3. Compressed deletion.

Figure 1 shows the clipped modification, which is generally discussed as

$$f_1(z,t) = \begin{cases} \mathrm{sgn}(z), & |z| > t \\ 0, & \text{otherwise}. \end{cases} \qquad (14)$$
Chechik et al. also obtained the nonlinear functions shown in Figs. 2 and 3 by applying the following optimization principle [15,19]. In order to evaluate the effect of synaptic pruning on the network's retrieval performance, Chechik et al. study its effect on the signal-to-noise ratio (S/N) of the internal field $h_i \equiv \sum_{j\ne i} J_{ij}x_j + h$ [15]. The S/N is calculated by analyzing the moments of the internal field and is given as

$$R_{S/N} = \frac{E[h_i \mid \xi_i^\mu = 1] - E[h_i \mid \xi_i^\mu = -1]}{\sqrt{V[h_i \mid \xi_i^\mu]}} \propto \frac{E[zf(z)]}{\sqrt{E[f(z)^2]}} = \frac{\displaystyle\int_{-\infty}^{\infty} Dz\, z f(z)}{\sqrt{\displaystyle\int_{-\infty}^{\infty} Dz\, f(z)^2}} \equiv \rho\big(f(z), z\big), \qquad (15)$$

where $z$ has the standard normal distribution, i.e., $E[z^2] = V[z] = 1$, and $Dz$ is the Gaussian measure defined as

$$Dz = \frac{dz}{\sqrt{2\pi}}\exp\!\left(-\frac{z^2}{2}\right), \qquad (16)$$

and $E$, $V$ denote the operators that calculate the expectation and the variance with respect to the random variables $\xi_i^\mu$ ($i = 1,\ldots,N$, $\mu = 1,\ldots,\alpha N$). The function $\rho(f(z),z)$ denotes the correlation coefficient. Chechik et al. considered piecewise linear functions such as the following nonlinear functions. In order to find the best nonlinear function $f(z)$, we should maximize $\rho(f(z),z)$, which is invariant to scaling. Namely, the best nonlinear function $f(z)$ is obtained by maximizing $E[zf(z)]$ under the condition that $E[f(z)^2]$ is constant. Let $\gamma$ be the Lagrange multiplier; then it is sufficient to solve

$$\int_{-\infty}^{\infty} Dz\, zf(z) - \gamma\left(\int_{-\infty}^{\infty} Dz\, f(z)^2 - c_0\right) \to \max, \qquad (17)$$

for some constant $c_0$. Since the synaptic connection before the action of the nonlinear function, $T_{ij}$, obeys the Gaussian distribution $N(0,1)$, Eq. (13) is averaged over all of the synaptic connections.

Thus, the nonlinear function shown in Fig. 2,

$$f_2(z,t) = \begin{cases} z, & |z| > t \\ 0, & \text{otherwise}, \end{cases} \qquad (18)$$

is also obtained. The deletion by this nonlinear function is called minimal value deletion. Similarly, by adding the condition that the total strength of the synaptic connections, $\int Dz\,|f(z)|$, is constant, the nonlinear function

$$f_3(z,t) = \begin{cases} z - \mathrm{sgn}(z)\,t, & |z| > t \\ 0, & \text{otherwise} \end{cases} = f_2(z,t) - t f_1(z,t) \qquad (19)$$

is obtained. The deletion by this nonlinear function is called compressed deletion. We discuss systematic deletion by using these three types of nonlinear functions $f_1(z,t)$, $f_2(z,t)$, and $f_3(z,t)$ given by Chechik et al. [15].
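The three pruning functions and the S/N criterion of Eq. (15) can be illustrated with a small Monte Carlo sketch of our own (the threshold t and the sample size are arbitrary choices). All three rules prune exactly the synapses with |z| ≤ t, so they share the same connecting rate, while f2 attains the largest correlation ρ, consistent with Eq. (18) being the optimum of Eq. (17):

```python
import numpy as np
from math import erf, sqrt

def f1(z, t):
    """Clipped synapse, Eq. (14): keep only the sign of strong synapses."""
    return np.where(np.abs(z) > t, np.sign(z), 0.0)

def f2(z, t):
    """Minimal value deletion, Eq. (18): keep strong synapses unchanged."""
    return np.where(np.abs(z) > t, z, 0.0)

def f3(z, t):
    """Compressed deletion, Eq. (19): f2 - t*f1, shrink survivors toward zero."""
    return f2(z, t) - t * f1(z, t)

rng = np.random.default_rng(1)
z = rng.standard_normal(1_000_000)   # T_ij ~ N(0,1), Eq. (13)
t = 1.0

# All three rules prune the synapses with |z| <= t, so they share the
# connecting rate c = 1 - erf(t/sqrt(2)).
c_theory = 1.0 - erf(t / sqrt(2.0))
c_emp = (np.abs(z) > t).mean()

def rho(f):
    """Monte Carlo estimate of the S/N correlation rho(f, z) of Eq. (15)."""
    fz = f(z, t)
    return (z * fz).mean() / np.sqrt((fz ** 2).mean())

print(c_theory, c_emp)            # both about 0.317 for t = 1
print(rho(f2), rho(f1), rho(f3))  # f2 gives the largest rho
```

The ordering ρ(f2) > ρ(f1) > ρ(f3) reflects the fact that minimal value deletion is the unconstrained optimum, while the clipped and compressed rules satisfy additional constraints.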
III. RESULTS
FIG. 2. Minimal value deletion.
In this section, the results concerning the multiplicative synaptic noise, the random deletion, and the nonlinear synapse are shown, in the limit where the effect of these deletions is extremely large.
The SCSNA starts from the fixed-point equations for the dynamics of an N-neuron network, Eq. (1). The results of the SCSNA for the symmetric additive synaptic noise are summarized by the following order-parameter equations (see Appendix A):
$$m = \frac{1}{1-a^2}\int Dz\,\langle(\xi-a)\,Y(z;\xi)\rangle_\xi, \qquad (20)$$

$$q = \int Dz\,\langle Y(z;\xi)^2\rangle_\xi, \qquad (21)$$

$$U = \frac{1}{\sigma}\int Dz\, z\,\langle Y(z;\xi)\rangle_\xi, \qquad (22)$$

$$\sigma^2 = \frac{\alpha J^2 q}{(1-JU)^2} + \frac{J^2}{(1-a^2)^2}\,\Delta_A^2\, q, \qquad (23)$$

$$Y(z;\xi) = F\left(J(\xi-a)m + \sigma z + h + \left[\frac{\alpha J^2 U}{1-JU} + \frac{J^2}{(1-a^2)^2}\,\Delta_A^2\,U\right]Y(z;\xi)\right), \qquad (24)$$

where $\langle\cdots\rangle_\xi$ implies averaging over the target pattern and $m$ is the overlap between the first memory pattern $\xi^1$ and the equilibrium state $x$, defined as

$$m = \frac{1}{N(1-a^2)}\sum_{i=1}^{N}(\xi_i^1 - a)\,x_i; \qquad (25)$$

note that generality is kept even if the overlap is defined by only the first memory pattern. Here, $q$ is the Edwards-Anderson order parameter, $U$ is a kind of susceptibility, which measures the sensitivity of the neuron output with respect to an external input, $Y(z;\xi)$ is an effective response function, and $\sigma^2$ is the variance of the noise. We set the output function $F(x) = \mathrm{sgn}(x) - a$ in the following sections; that is, $F(x) = 1-a$ when $x \ge 0$ and $F(x) = -1-a$ otherwise.

FIG. 4. Overlaps in the random deletion network. The curves represent the theoretical results. The dots represent simulation results with $N = 3000$ and $f = 0.1$ for the connecting rates $c = 0.1$, $c = 0.3$, and $c = 1.0$.

According to Okada et al. [16], the symmetric additive synaptic noise, the symmetric random deletion, and the nonlinear synapse can be transformed into the symmetric multiplicative synaptic noise as follows (see Appendix B): the additive synaptic noise gives

$$\Delta_M^2 = \frac{\Delta_A^2}{\alpha(1-a^2)^2}, \qquad (26)$$

the random deletion gives

$$\Delta_M^2 = \frac{1-c}{c}, \qquad (27)$$

and the nonlinear synapse gives

$$\Delta_M^2 = \frac{\tilde{J}^2}{J^2} - 1, \qquad (28)$$

where $J$ and $\tilde{J}^2$ are

$$J = \int Dz\, z f(z), \qquad (29)$$

$$\tilde{J}^2 = \int Dz\, f(z)^2. \qquad (30)$$

In the following sections, symmetry of the additive synaptic noise, the multiplicative synaptic noise, and the random deletion is assumed. The storage capacity can be obtained by solving the order-parameter equations.

Figure 4 shows $m(\alpha)$ curves in the random deletion network with the number of neurons $N = 3000$ and the firing rate $f = 0.1$ for the connecting rates $c = 0.1$, $c = 0.3$, and $c = 1.0$. It can be confirmed from Fig. 4 that the theoretical results of the SCSNA are in good agreement with the computer simulation results. Since it is known that theoretical results obtained by means of the SCSNA are generally in good agreement with computer simulations for various models that include synaptic noise, we treat only the results obtained by means of the SCSNA [16,20–26].

Through the relationships of Eqs. (26)–(30), the symmetric additive synaptic noise, the symmetric random deletion, and the nonlinear synapse can be discussed in terms of the symmetric multiplicative synaptic noise. Therefore, first of all, we deal with the multiplicative synaptic noise.

A. Multiplicative synaptic noise

Figure 5 shows the dependence of the storage capacity on the multiplicative synaptic noise. As is clear from Fig. 5, the storage capacity $\alpha_c$ is inversely proportional to the variance of the multiplicative synaptic noise $\Delta_M^2$ when the multiplicative synaptic noise is extremely large. The storage capacity $\alpha_c$ asymptotically approaches
FIG. 5. Dependence of the storage capacity $\alpha_c$ on the multiplicative synaptic noise $\Delta_M^2$ at the firing rate $f = 0.5$. Comparison of the asymptote and the results from the SCSNA.
$$\alpha_c = \frac{2}{\pi\Delta_M^2} \qquad (31)$$

(see Appendix C). In the sparse limit where the firing rate is extremely small, it is known that the storage capacity becomes $\alpha_c \simeq 1/(f|\ln f|)$ [18,20,28–30].
Figure 5 shows the results from the SCSNA and the asymptote at the firing rate $f = 0.5$. Figure 6 shows the results from the SCSNA at various firing rates. It can be confirmed from Fig. 6 that the order of the asymptote, $O(1/\Delta_M^2)$, does not depend on the firing rate.
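To make the transformation rules of Eqs. (27)–(30) concrete, the equivalent multiplicative noise of the clipped synapse can be estimated by Monte Carlo and compared with random deletion at the same connecting rate. This is our own sketch (t = 1 and the sample size are arbitrary); it illustrates quantitatively why systematic deletion outperforms random deletion:

```python
import numpy as np

rng = np.random.default_rng(2)
z = rng.standard_normal(2_000_000)   # T_ij ~ N(0,1), Eq. (13)
t = 1.0

# Clipped synapse f1 of Eq. (14).
fz = np.where(np.abs(z) > t, np.sign(z), 0.0)

J  = (z * fz).mean()           # Eq. (29)
J2 = (fz ** 2).mean()          # Eq. (30)
dm2_clip = J2 / J ** 2 - 1.0   # Eq. (28): equivalent multiplicative noise

c = (np.abs(z) > t).mean()     # fraction of surviving synapses
dm2_rand = (1.0 - c) / c       # Eq. (27): random deletion at the same c

print(c, dm2_clip, dm2_rand)   # systematic pruning gives far smaller noise
```

At the same connecting rate c ≈ 0.32, the clipped synapse is equivalent to a much smaller noise variance (≈ 0.35) than random deletion (≈ 2.15), which through Eq. (31) translates directly into a larger storage capacity.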
FIG. 7. Dependence of the synapse efficiency $S_{eff}$ on the random deletion with the connecting rate $c$ at the firing rate $f = 0.5$. Comparison of the asymptote and the results from the SCSNA.
B. Random deletion

Next, we discuss the asymptote for the random deletion. The random deletion with the connecting rate $c$ can be transformed into the multiplicative synaptic noise by Eq. (27). Hence, in the limit where the connecting rate $c$ is extremely small, the storage capacity becomes

$$\alpha_c = \frac{2}{\pi\Delta_M^2} = \frac{2c}{\pi(1-c)} \to \frac{2}{\pi}\,c, \qquad (32)$$

according to the asymptote of the multiplicative synaptic noise in Eq. (31). For the random deletion, the synapse efficiency $S_{eff}$, which is the storage capacity per connecting rate [14,16], i.e., the storage capacity per input of one neuron, defined as

$$S_{eff} \equiv \frac{\alpha_c}{c}, \qquad (33)$$

approaches a constant value,

$$S_{eff} = \frac{\alpha_c}{c} = \frac{2}{\pi}, \qquad (34)$$

according to Eq. (32) in the limit where the connecting rate $c$ is extremely small.

Figure 7 shows the result from the SCSNA and the asymptote at the firing rate $f = 0.5$. Figure 8 shows the results from the SCSNA at various firing rates. It can be confirmed from Fig. 8 that the order of the asymptote, $O(1)$ with respect to $c$, does not depend on the firing rate.

FIG. 6. Dependence of the storage capacity $\alpha_c$ on the multiplicative synaptic noise $\Delta_M^2$. It can be confirmed that the order of the asymptote does not depend on the firing rate.

FIG. 8. Dependence of the synapse efficiency $S_{eff}$ on the random deletion with the connecting rate $c$ at various firing rates. It can be confirmed that the order of the asymptote does not depend on the firing rate.
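The chain from Eq. (27) through Eq. (31) to Eqs. (33) and (34) can be traced in a few lines. This is our own sketch; note that the asymptote (31) is only accurate for large $\Delta_M^2$, i.e., small c:

```python
from math import pi

def s_eff_random(c):
    # Eqs. (27), (31), (33): S_eff = alpha_c / c with Delta_M^2 = (1 - c)/c.
    delta_M2 = (1.0 - c) / c
    alpha_c = 2.0 / (pi * delta_M2)      # asymptote, Eq. (31)
    return alpha_c / c

for c in (0.1, 0.01, 0.001):
    print(c, s_eff_random(c))            # tends to 2/pi ~ 0.6366 as c -> 0
```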
C. Systematic deletion

1. Clipped synapse

Synapses within the range $-t < z < t$ are pruned by the nonlinear function of Eq. (14). The connecting rate $c$ of the synaptic deletion in Eq. (13) is given by

$$c = \int_{\{z\,\mid\, f_1(z,t)\ne 0\}} Dz = 1 - \mathrm{erf}\!\left(\frac{t}{\sqrt{2}}\right) \to \sqrt{\frac{2}{\pi}}\,t^{-1}\exp\!\left(-\frac{t^2}{2}\right), \qquad t\to\infty, \qquad (35)$$

since the synaptic connection $T_{ij}$ before the action of the nonlinear function of Eq. (13) obeys the Gaussian distribution $N(0,1)$. Next, $J$ and $\tilde{J}^2$ of Eqs. (29) and (30) become

$$J = 2\int_t^\infty Dz\, z\,\mathrm{sgn}(z) = \sqrt{\frac{2}{\pi}}\exp\!\left(-\frac{t^2}{2}\right) \to tc, \qquad t\to\infty, \qquad (36)$$

$$\tilde{J}^2 = 2\int_t^\infty Dz = 1 - \mathrm{erf}\!\left(\frac{t}{\sqrt{2}}\right) = c. \qquad (37)$$

Hence, the equivalent multiplicative synaptic noise $\Delta_M^2$ is obtained as

$$\Delta_M^2 = \frac{\tilde{J}^2}{J^2} - 1 \to \frac{1}{t^2 c}, \qquad t\to\infty. \qquad (38)$$

The relationship between the pruning range $t$ and the connecting rate $c$,

$$t^2 = -2\ln c, \qquad (39)$$

is obtained by taking the logarithm of Eq. (35) in the $t\to\infty$ limit. Therefore, in the limit where the equivalent connecting rate $c$ is extremely small, the storage capacity $\alpha_c$ can be obtained,

$$\alpha_c = -\frac{4}{\pi}\,c\ln c, \qquad (40)$$

through Eqs. (31), (38), and (39). The synapse efficiency becomes

$$S_{eff} = \frac{\alpha_c}{c} = -\frac{4}{\pi}\ln c. \qquad (41)$$

Figure 9 shows the results from the SCSNA and the asymptote at the firing rate $f = 0.5$. Figure 10 shows the results from the SCSNA at various firing rates. It can be confirmed from Fig. 10 that the order of the asymptote, $O(\ln c)$, does not depend on the firing rate.

FIG. 9. Dependence of the synapse efficiency $S_{eff}$ with the clipped synapse on the connecting rate $c$ at $f = 0.5$. Comparison of the results from the SCSNA and the asymptote.

FIG. 10. Dependence of the synapse efficiency $S_{eff}$ with the clipped synapse on the firing rate $f$. It can be confirmed that the order of the asymptote does not depend on the firing rate.

2. Minimal value deletion

In a similar way, the equivalent multiplicative synaptic noise $\Delta_M^2$ of the systematic deletion of Eq. (18) is obtained as follows:

$$\Delta_M^2 = \frac{\tilde{J}^2}{J^2} - 1 \to \frac{1}{t^2 c}, \qquad t\to\infty, \qquad (42)$$

where the connecting rate $c$ and the $J$, $\tilde{J}^2$ of Eqs. (29) and (30) are

$$c = \int_{\{z\,\mid\, f_2(z,t)\ne 0\}} Dz = 1 - \mathrm{erf}\!\left(\frac{t}{\sqrt{2}}\right) \to \sqrt{\frac{2}{\pi}}\,t^{-1}\exp\!\left(-\frac{t^2}{2}\right), \qquad t\to\infty, \qquad (43)$$

$$J = \sqrt{\frac{2}{\pi}}\,t\exp\!\left(-\frac{t^2}{2}\right) + 1 - \mathrm{erf}\!\left(\frac{t}{\sqrt{2}}\right) \to \sqrt{\frac{2}{\pi}}\,t\exp\!\left(-\frac{t^2}{2}\right), \qquad t\to\infty, \qquad (44)$$

$$\tilde{J}^2 = J, \qquad (45)$$

respectively. Hence, in the limit where the equivalent connecting rate $c$ is extremely small, the storage capacity $\alpha_c$ and the synapse efficiency $S_{eff}$ can be obtained through Eqs. (31), (42), and (39) as follows:

$$\alpha_c = -\frac{4}{\pi}\,c\ln c, \qquad (46)$$

$$S_{eff} = -\frac{4}{\pi}\ln c, \qquad (47)$$

respectively. Figure 11 shows the results from the SCSNA and the asymptote at the firing rate $f = 0.5$. Figure 12 shows the results from the SCSNA at various firing rates. It can be confirmed from Fig. 12 that the order of the asymptote does not depend on the firing rate.

FIG. 11. Dependence of the synapse efficiency $S_{eff}$ with the minimal value deletion on the connecting rate $c$ at $f = 0.5$. Comparison of the results from the SCSNA and the asymptote.

FIG. 12. Dependence of the synapse efficiency $S_{eff}$ with the minimal value deletion on the firing rate $f$. It can be confirmed that the order of the asymptote does not depend on the firing rate.

3. Compressed deletion

FIG. 13. Dependence of the synapse efficiency $S_{eff}$ with the compressed deletion on the connecting rate $c$ at $f = 0.5$. Comparison of the results from the SCSNA and the asymptote.

Again, in a similar way, the equivalent multiplicative synaptic noise $\Delta_M^2$ of the systematic deletion of Eq. (19) is given by

$$\Delta_M^2 = \frac{\tilde{J}^2}{J^2} - 1 \to \frac{2}{t^2 c}, \qquad t\to\infty, \qquad (48)$$

where the connecting rate $c$ and the $J$, $\tilde{J}^2$ of Eqs. (29) and (30) are

$$c = \int_{\{z\,\mid\, f_3(z,t)\ne 0\}} Dz = 1 - \mathrm{erf}\!\left(\frac{t}{\sqrt{2}}\right) \to \sqrt{\frac{2}{\pi}}\,t^{-1}\exp\!\left(-\frac{t^2}{2}\right), \qquad t\to\infty, \qquad (49)$$

$$J = c, \qquad (50)$$

$$\tilde{J}^2 = \int_{-\infty}^{\infty} Dz\, f_2(z,t)^2 + t^2\int_{-\infty}^{\infty} Dz\, f_1(z,t)^2 - 2t\int_{-\infty}^{\infty} Dz\, f_1(z,t)f_2(z,t)$$
$$= \left[\sqrt{\frac{2}{\pi}}\,t\exp\!\left(-\frac{t^2}{2}\right) + 1 - \mathrm{erf}\!\left(\frac{t}{\sqrt{2}}\right)\right] + t^2\left[1 - \mathrm{erf}\!\left(\frac{t}{\sqrt{2}}\right)\right] - 2t\,\sqrt{\frac{2}{\pi}}\exp\!\left(-\frac{t^2}{2}\right) \to \frac{2c}{t^2}, \qquad t\to\infty, \qquad (51)$$

respectively. Here, we use the asymptotic expansion of the error function,

$$\mathrm{erf}(x) = 1 - \frac{e^{-x^2}}{\sqrt{\pi}}\left(\frac{1}{x} - \frac{1}{2x^3} + \frac{3}{4x^5} - O(x^{-7})\right), \qquad (52)$$
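The logarithmic relationship (39) between the pruning range and the connecting rate can be checked by inverting Eq. (35) numerically. This is our own sketch (the bisection bounds are arbitrary); because the corrections decay only logarithmically, $t^2$ approaches $-2\ln c$ quite slowly:

```python
from math import erf, log, sqrt

def c_of_t(t):
    # Eq. (35): connecting rate of the clipped synapse with pruning range t.
    return 1.0 - erf(t / sqrt(2.0))

def t_of_c(c, lo=0.0, hi=40.0):
    # Invert Eq. (35) by bisection (c_of_t decreases monotonically in t).
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if c_of_t(mid) > c:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

ratios = []
for c in (1e-4, 1e-6, 1e-8):
    t = t_of_c(c)
    ratios.append(t * t / (-2.0 * log(c)))   # Eq. (39) predicts ratio -> 1
    print(c, t * t, -2.0 * log(c))

# Convergence is only logarithmic: the ratio approaches 1 slowly from below.
print(ratios)
```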
TABLE I. Asymptote of the storage capacity by the random deletion and by the systematic deletion at the firing rate $f = 0.5$.

Types of deletion | Storage capacity (asymptote)
Random deletion | $(2/\pi)c$
Systematic deletion: clipped synapse | $(4/\pi)c|\ln c|$
Minimal value deletion | $(4/\pi)c|\ln c|$
Compressed deletion | $(2/\pi)c|\ln c|$

FIG. 14. Dependence of the synapse efficiency $S_{eff}$ with the compressed deletion on the firing rate $f$. It can be confirmed that the order of the asymptote does not depend on the firing rate.
for $x \gg 1$. In order for the first term and the second term of Eq. (51) to be of the same order, the asymptotic expansion has been taken to $O(t^{-3})$ and $O(t^{-5})$, respectively. The equivalent multiplicative synaptic noise in this case of systematic deletion becomes double that of the clipped synapse of Eq. (38) and of the minimal value deletion of Eq. (42). Therefore, in the limit where the equivalent connecting rate $c$ is extremely small, the storage capacity $\alpha_c$ and the synapse efficiency $S_{eff}$ can be obtained through Eqs. (31), (48), and (39) as follows:

$$\alpha_c = -\frac{2}{\pi}\,c\ln c, \qquad (53)$$

$$S_{eff} = -\frac{2}{\pi}\ln c, \qquad (54)$$

respectively. Figure 13 shows the results from the SCSNA and the asymptote at the firing rate $f = 0.5$. Figure 14 shows the results from the SCSNA at various firing rates. It can be confirmed from Fig. 14 that the order of the asymptote, $O(\ln c)$, does not depend on the firing rate.
Figure 15 shows the dependence of the synapse efficiency $S_{eff}$ on the connecting rate $c$ obtained by means of the SCSNA. Table I shows the asymptotes of the storage capacity with the random deletion (rd) and the systematic deletion (sd). Hence, taking minimal value deletion as the simplest form of systematic deletion, we find that the synapse efficiency (Table II) in the case of systematic deletion becomes

$$\frac{S_{eff}^{(sd)}}{S_{eff}^{(rd)}} = \frac{(4/\pi)|\ln c|}{2/\pi} = 2|\ln c|; \qquad (55)$$

thus we have shown analytically that the synapse efficiency in the case of systematic deletion diverges as $O(|\ln c|)$ in the limit where the connecting rate $c$ is extremely small, and that it becomes $2|\ln c|$ times as large as that of random deletion.
IV. THE MEMORY PERFORMANCE UNDER LIMITED METABOLIC ENERGY RESOURCES

Up to the preceding section, we have discussed the effect of synaptic pruning by evaluating the synapse efficiency, which is the memory capacity normalized by the connecting rate $c$. When the connecting rate is $c$, the synapse number per neuron decreases to $cN$. Therefore, the synapse efficiency means the capacity per input of one neuron. In this discussion of the synapse efficiency, the synapse number decreases when the connecting rate is small.

In addition to such a fixed neuron number criterion, a fixed synapse number criterion can be discussed.

FIG. 15. Comparison of the synapse efficiency with the random deletion and that with the systematic deletion at the firing rate $f = 0.5$.

TABLE II. Asymptote of the synapse efficiency by the random deletion and the systematic deletion at the firing rate $f = 0.5$.

Types of deletion | Synapse efficiency (asymptote)
Random deletion | $(2/\pi)$
Systematic deletion: clipped synapse | $(4/\pi)|\ln c|$
Minimal value deletion | $(4/\pi)|\ln c|$
Compressed deletion | $(2/\pi)|\ln c|$
Figure 16 shows the dependence of the memory performance on the connecting rate for the three types of pruning. It is confirmed that for each deletion there is an optimal connecting rate that maximizes the memory performance. The optimal connecting rates of the clipped synapse, the minimal value deletion, and the compressed deletion are $c = 0.036$, $0.038$, and $0.084$, respectively. This interesting fact may imply that the memory performance is improved without heavy pruning. These optimal values agree with the computer simulation results given by Chechik et al. [15].
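A rough version of this optimum can already be seen from the small-c asymptote alone; the following is our own sketch, not the paper's computation. Maximizing the asymptotic memory performance $-(4/\pi)\sqrt{c}\ln c$ of the clipped synapse gives $\ln c = -2$, i.e., $c = e^{-2} \approx 0.135$. This crude estimate lies above the optima $c = 0.036$–$0.084$ found from the full order-parameter equations, since the asymptote is valid only at small c, but it shows how a maximum arises from the competition between $\sqrt{c}$ and $|\ln c|$:

```python
import numpy as np

# Memory performance under the small-c asymptote of the clipped synapse,
# alpha_c = -(4/pi) c ln c (Eq. 40), normalized by sqrt(c) (Sec. IV):
# perf(c) = -(4/pi) sqrt(c) ln c.  Setting d(perf)/dc = 0 gives ln c = -2.
c = np.linspace(1e-4, 0.999, 100_000)
perf = -(4.0 / np.pi) * np.sqrt(c) * np.log(c)
c_opt = c[np.argmax(perf)]
print(c_opt)   # close to exp(-2) ~ 0.135
```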
FIG. 16. Memory performance $\alpha_c/\sqrt{c}$ of networks with different numbers of neurons but the same total number of synapses, as a function of the connecting rate $c$, in the case of the clipped synapse (solid line), the minimal value deletion (dotted line), and the compressed deletion (dashed line).

At the synaptic growth stage, it is natural to assume that metabolic energy resources are restricted. When metabolic energy resources are limited, it is also important to discuss the effect of synaptic pruning under a limited synapse number. Chechik et al. discussed the number of memorized patterns per synapse under a fixed synapse number criterion [15]. They pointed out the existence of an optimal connecting rate under the fixed synapse number criterion and suggested the following explanation of synaptic pruning: synaptic pruning following overgrowth can improve the performance of a network with limited synaptic resources.

The synapse number is $N^2$ in the fully connected case $c = 1$. We consider a larger network with $M$ ($> N$) neurons. The synapse number in the larger network with the connecting rate $c$ becomes $cM^2$. We can introduce the fixed synapse number criterion by considering a larger network which has $M$ neurons, i.e.,

$$N^2 = cM^2, \qquad (56)$$
V. CONCLUSION

We have analytically discussed the synapse efficiency, within the framework of the autocorrelation-type associative memory, in order to evaluate the effect of the pruning following overgrowth. Although Chechik et al. pointed out that the synapse efficiency is increased by systematic deletion, this is qualitatively obvious, and the increase in the synapse efficiency should also be discussed quantitatively. In the limit where the multiplicative synaptic noise is extremely large, the storage capacity $\alpha_c$ varies inversely with the variance of the multiplicative synaptic noise $\Delta_M^2$. From this result, we analytically obtained that the synapse efficiency in the case of systematic deletion diverges as $O(|\ln c|)$ in the limit where the connecting rate $c$ is extremely small.

On the other hand, it is natural to assume that metabolic energy resources are restricted at the synaptic growth stage. When metabolic energy resources are limited, i.e., the synapse number is limited, there exists an optimal connecting rate that maximizes the memory performance. These optimal values are in agreement with the results given by Chechik et al. [15].

In correlation learning, which can be considered a prototype of other learning rules, various properties can be analyzed quantitatively. The asymptote of the synapse efficiency in a model with another learning rule can be discussed in a similar way. As future work, we plan to discuss these properties further while taking into account related physiological knowledge.
so that the larger network has the same number of synapses at the connecting rate $c$. The memorized pattern number per synapse becomes

$$\frac{p_c}{cM^2} = \frac{p_c}{N^2} = \frac{\alpha_c M}{N^2} = \frac{\alpha_c}{\sqrt{c}\,N}, \qquad (57)$$

where the critical memorized pattern number $p_c$ of the larger network is

$$p_c = \alpha_c M. \qquad (58)$$

We define the coefficient $\alpha_c/\sqrt{c}$ as the memory performance, and we discuss the effect of synaptic pruning in terms of it. Under limited metabolic energy resources, the optimal strategy is the maximization of the memory performance. Chechik et al. showed the existence of an optimal connecting rate which maximizes the memory performance [15]. The memory performance can be calculated by normalizing the capacity, which is given by solving the order-parameter equations, by $\sqrt{c}$ (Fig. 16).
ACKNOWLEDGMENTS
This work was partially supported by Grants-in-Aid for Scientific Research on Priority Areas No. 14084212, for Scientific Research (C) No. 14580438, for Encouragement of Young Scientists (B) No. 14780309, and for Encouragement of Young Scientists (B) No. 15700141 from the Ministry of Education, Culture, Sports, Science and Technology of Japan.
APPENDIX A: SCSNA FOR ADDITIVE SYNAPTIC NOISE

Derivations of the order-parameter equations (20)–(24) are given here. The SCSNA starts from the fixed-point equations for the dynamics of the N-neuron network shown as Eq. (1). The random memory patterns are generated according to the probability distribution of Eq. (2). The synaptic connections are given by Eq. (4). The asymmetric additive synaptic noises \delta_{ij} and \delta_{ji} are independently generated according to the probability distribution of Eq. (7). Moreover, we can analyze a more general case, where \delta_{ij} and \delta_{ji} have an arbitrary correlation such that

\mathrm{Cov}[\delta_{ij},\delta_{ji}] = k_\delta \frac{J^2}{N(1-a^2)^2}\Delta_A^2, \quad -1 \le k_\delta \le 1. \tag{A1}

In this general case, the symmetric and the asymmetric additive synaptic noise correspond to k_\delta = 1 and k_\delta = 0, respectively. Here, we assume that the probability distribution of the additive synaptic noise is the normal distribution

\delta_{ij} \sim N\!\left(0, \frac{J^2}{N(1-a^2)^2}\Delta_A^2\right).

However, any probability distribution with the same mean and variance can be treated in the same way as in the following discussion, by the central limit theorem. Defining the loading rate as \alpha = p/N, we can write the local field h_i for neuron i as

h_i \equiv \sum_{j\ne i} J_{ij} x_j = J \sum_{\mu=1}^{\alpha N} (\xi_i^\mu - a) m_\mu + \sum_{j\ne i} \delta_{ij} x_j - J\alpha x_i, \tag{A2}

where m_\mu is the overlap between the stored pattern \xi^\mu and the equilibrium state x, defined by

m_\mu = \frac{1}{N(1-a^2)} \sum_{i=1}^{N} (\xi_i^\mu - a) x_i. \tag{A3}

The second term of Eq. (A2), which includes x_j = F(h_j + h), depends on \delta_{ji}. The \delta_{ji} dependence of x_j is extracted as

x_j = x_j^{(\delta_{ji})} + \delta_{ji} x_i x_j'^{(\delta_{ji})}, \tag{A4}

where

x_j^{(\delta_{ji})} = F(h_j - \delta_{ji} x_i), \tag{A5}

x_j'^{(\delta_{ji})} = F'(h_j - \delta_{ji} x_i). \tag{A6}

Substituting Eq. (A4) into Eq. (A2), the local field h_i can be expressed as

h_i = J \sum_{\mu=1}^{\alpha N} (\xi_i^\mu - a) m_\mu + \sum_{j\ne i} \delta_{ij} x_j^{(\delta_{ji})} + \sum_{j\ne i} \delta_{ij}\delta_{ji} x_i x_j'^{(\delta_{ji})} - J\alpha x_i. \tag{A7}

We assume that Eq. (A7) and x_i = F(h_i + h) can be solved by using the effective response function \tilde{F}(u). Let \xi^1 be the target pattern to be retrieved. Therefore, we can assume that m_1 = O(1) and m_\mu = O(1/\sqrt{N}), \mu > 1. Then we can use the Taylor expansion to obtain

x_i^{(\mu)} = \tilde{F}\!\left( J \sum_{\nu\ne\mu} (\xi_i^\nu - a) m_\nu + \sum_{j\ne i} \delta_{ij} x_j^{(\delta_{ji})} \right), \tag{A8}

x_i = \tilde{F}\!\left( J \sum_{\mu=1}^{\alpha N} (\xi_i^\mu - a) m_\mu + \sum_{j\ne i} \delta_{ij} x_j^{(\delta_{ji})} \right) \simeq x_i^{(\mu)} + J(\xi_i^\mu - a) m_\mu x_i'^{(\mu)}. \tag{A9}

The overlap becomes

m_\mu = \frac{1}{N(1-a^2)(1-JU)} \sum_{i=1}^{N} (\xi_i^\mu - a) x_i^{(\mu)} \tag{A10}

by substituting Eq. (A9) into the overlap defined by Eq. (A3), where

x_i'^{(\mu)} = \tilde{F}'\!\left( J \sum_{\nu\ne\mu} (\xi_i^\nu - a) m_\nu + \sum_{j\ne i} \delta_{ij} x_j^{(\delta_{ji})} \right), \tag{A11}

U = \frac{1}{N} \sum_{i=1}^{N} x_i'^{(\mu)}. \tag{A12}

Equations (A7) and (A9) give the following expression for the local field:

h_i = J(\xi_i^1 - a) m_1 + \left[ \frac{\alpha J^2}{1-JU} + k_\delta \frac{J^2}{(1-a^2)^2}\Delta_A^2 \right] U x_i + \frac{J}{N(1-a^2)(1-JU)} \sum_{j\ne i} \sum_{\mu=2}^{\alpha N} (\xi_i^\mu - a)(\xi_j^\mu - a) x_j^{(\mu)} + \sum_{j\ne i} \delta_{ij} x_j^{(\delta_{ji})}. \tag{A13}

Note that the second term in Eq. (A13) denotes the effective self-coupling term. The third and the last terms are summations of uncorrelated random variables, with mean 0 and variances

\frac{J^2}{N^2(1-a^2)^2(1-JU)^2} \sum_{j\ne i} \sum_{\mu=2}^{\alpha N} (\xi_i^\mu - a)^2 (\xi_j^\mu - a)^2 (x_j^{(\mu)})^2 = \frac{\alpha J^2}{(1-JU)^2} q, \tag{A14}

\sum_{j\ne i} \delta_{ij}^2 (x_j^{(\delta_{ji})})^2 = \frac{J^2}{(1-a^2)^2}\Delta_A^2 q, \tag{A15}

respectively. The cross term of these terms has vanished. Thus, we finally obtain
h_i = J(\xi_i^1 - a) m_1 + \sigma z_i + \left[ \frac{\alpha J^2}{1-JU} + k_\delta \frac{J^2}{(1-a^2)^2}\Delta_A^2 \right] U x_i, \tag{A16}

\sigma^2 = \frac{\alpha J^2}{(1-JU)^2} q + \frac{J^2}{(1-a^2)^2}\Delta_A^2 q, \tag{A17}

from Eqs. (A14) and (A15), where z_i \sim N(0,1). Equation (23) is given by Eq. (A17). Finally, after rewriting \xi_i^1 \to \xi, m_1 \to m, z_i \to z, and x_i \to Y(z;\xi), the results of the SCSNA for the additive synaptic noise are summarized by the order-parameter equations of Eqs. (20)–(22) as

m = \frac{1}{1-a^2} \int Dz \, \langle (\xi - a) Y(z;\xi) \rangle_\xi,

q = \int Dz \, \langle Y(z;\xi)^2 \rangle_\xi,

U = \frac{1}{\sigma} \int Dz \, z \langle Y(z;\xi) \rangle_\xi,

where the effective response function Y(z;\xi) becomes

Y(z;\xi) = F\!\left( J(\xi - a) m + \sigma z + h + \left[ \frac{\alpha J^2}{1-JU} + k_\delta \frac{J^2}{(1-a^2)^2}\Delta_A^2 \right] U Y(z;\xi) \right). \tag{A18}

The effective response function of Eq. (24) can be obtained by substituting k_\delta = 1 into Eq. (A18).

APPENDIX B: EQUIVALENCE AMONG THREE TYPES OF NOISE

The multiplicative synaptic noise, the random synaptic deletion, and the nonlinear synapse can be discussed in a manner similar to that of Appendix A.

1. Multiplicative synaptic noise

The derivation of the equivalent noise, Eq. (26), is given here. We can proceed in a manner similar to the analysis of the additive synaptic noise. The synaptic connections are given by Eq. (8). The asymmetric multiplicative synaptic noises \varepsilon_{ij} and \varepsilon_{ji} are independently generated according to the probability distribution of Eq. (9). We analyze a more general case, where \varepsilon_{ij} and \varepsilon_{ji} have an arbitrary correlation such that

\mathrm{Cov}[\varepsilon_{ij},\varepsilon_{ji}] = k_\varepsilon \Delta_M^2, \quad -1 \le k_\varepsilon \le 1. \tag{B1}

In this general case, the symmetric and the asymmetric multiplicative synaptic noise correspond to k_\varepsilon = 1 and k_\varepsilon = 0, respectively. Here, we assume that the probability distribution of the multiplicative synaptic noise is the normal distribution \varepsilon_{ij} \sim N(0, \Delta_M^2). The local field h_i for neuron i becomes

h_i = \sum_{\mu=1}^{\alpha N} (\xi_i^\mu - a) m_\mu + \frac{1}{N(1-a^2)} \sum_{\mu=1}^{\alpha N} \sum_{j\ne i} \varepsilon_{ij} (\xi_i^\mu - a)(\xi_j^\mu - a) x_j - \alpha x_i, \tag{B2}

where m_\mu is the overlap defined by Eq. (A3). The second term of Eq. (B2), which includes x_j = F(h_j + h), depends on \varepsilon_{ji}. The \varepsilon_{ij} and \xi_j^\mu dependences of x_j are extracted as

x_j = x_j^{(\mu)(\varepsilon_{ji})} + h_j^{\{\mu,\varepsilon_{ji}\}} x_j'^{(\mu)(\varepsilon_{ji})}, \tag{B3}

where

h_j^{\{\mu,\varepsilon_{ji}\}} = (\xi_j^\mu - a) m_\mu + \frac{1}{N(1-a^2)} \sum_{k\ne j} \varepsilon_{jk} (\xi_j^\mu - a)(\xi_k^\mu - a) x_k + \frac{\varepsilon_{ji}}{N(1-a^2)} \sum_{\nu\ne\mu} (\xi_j^\nu - a)(\xi_i^\nu - a) x_i + \frac{\varepsilon_{ji}}{N(1-a^2)} (\xi_j^\mu - a)(\xi_i^\mu - a) x_i, \tag{B4}

x_j^{(\mu)(\varepsilon_{ji})} = F(h_j - h_j^{\{\mu,\varepsilon_{ji}\}}), \tag{B5}

x_j'^{(\mu)(\varepsilon_{ji})} = F'(h_j - h_j^{\{\mu,\varepsilon_{ji}\}}). \tag{B6}

We assume that Eq. (B2) and x_i = F(h_i + h) can be solved by using the effective response function \tilde{F}(u) as

x_i = \tilde{F}\!\left( \sum_{\mu=1}^{p} (\xi_i^\mu - a) m_\mu + \frac{1}{N(1-a^2)} \sum_{\mu=1}^{\alpha N} \sum_{j\ne i} \varepsilon_{ij} (\xi_i^\mu - a)(\xi_j^\mu - a) x_j^{(\mu)(\varepsilon_{ji})} \right). \tag{B7}

Let \xi^1 be the target pattern. We substitute Eq. (B7) into the overlap defined by Eq. (A3) and expand the resultant expression by (\xi_i^\mu - a) m_\mu (\mu > 1), which has the order of O(1/\sqrt{N}). This leads to

m_\mu = \frac{1}{N(1-a^2)(1-U)} \sum_{i=1}^{N} (\xi_i^\mu - a) x_i^{(\mu)}, \tag{B8}
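In the special case a = 0, J = 1, F = sgn used later in Appendix C, the order-parameter equations of Eqs. (20)–(23) reduce to m = erf(m/(√2 σ)), q = 1, U = √(2/π) exp(−m²/2σ²)/σ, σ² = αq/(1−U)² + Δ_A²q, and can be solved numerically. The following sketch (our illustration, not the authors' code) finds the retrieval solution by damped fixed-point iteration.

```python
import math

def solve_order_params(alpha, Delta_A, iters=2000, damp=0.5):
    """Damped fixed-point iteration for the a = 0, J = 1, F = sgn special case
    of the order-parameter equations:
        m = erf(m / (sqrt(2) sigma)),  q = 1,
        U = sqrt(2/pi) exp(-m^2 / (2 sigma^2)) / sigma,
        sigma^2 = alpha q / (1 - U)^2 + Delta_A^2 q."""
    m, U, sigma = 1.0, 0.0, math.sqrt(alpha + Delta_A ** 2)
    for _ in range(iters):
        sigma = math.sqrt(alpha / (1.0 - U) ** 2 + Delta_A ** 2)
        m_new = math.erf(m / (math.sqrt(2.0) * sigma))
        U_new = math.sqrt(2.0 / math.pi) * math.exp(-m ** 2 / (2.0 * sigma ** 2)) / sigma
        m = damp * m + (1.0 - damp) * m_new       # damped updates for stability
        U = damp * U + (1.0 - damp) * U_new
    return m, U, sigma

m, U, sigma = solve_order_params(alpha=0.05, Delta_A=0.0)
print(m, U)   # retrieval solution: m close to 1 at low loading
```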
where

x_i^{(\mu)} = \tilde{F}\!\left( \sum_{\nu\ne\mu} (\xi_i^\nu - a) m_\nu + \frac{1}{N(1-a^2)} \sum_{\nu\ne\mu} \sum_{j\ne i} \varepsilon_{ij} (\xi_i^\nu - a)(\xi_j^\nu - a) x_j^{(\nu)(\varepsilon_{ji})} \right), \tag{B9}

and U is defined in a way similar to Eq. (A12) in the case of the additive synaptic noise. Equations (B2), (B3), and (B8) give

h_i = (\xi_i^1 - a) m_1 + \alpha \left[ \frac{1}{1-U} + k_\varepsilon \Delta_M^2 \right] U x_i + \frac{1}{N(1-a^2)(1-U)} \sum_{j\ne i} \sum_{\mu=2}^{\alpha N} (\xi_i^\mu - a)(\xi_j^\mu - a) x_j^{(\mu)} + \frac{1}{N(1-a^2)} \sum_{\mu=1}^{\alpha N} \sum_{j\ne i} \varepsilon_{ij} (\xi_i^\mu - a)(\xi_j^\mu - a) x_j^{(\mu)(\varepsilon_{ji})}. \tag{B10}

The third and last terms can be regarded as the noise terms. The variance of the noise terms becomes

\sigma^2 = \frac{\alpha q}{(1-U)^2} + \alpha \Delta_M^2 q. \tag{B11}

Thus, after rewriting \xi_i^1 \to \xi and m_1 \to m, we obtain the effective response function

Y(z;\xi) = F\!\left( (\xi - a) m + \sigma z + h + \alpha \left[ \frac{1}{1-U} + k_\varepsilon \Delta_M^2 \right] U Y(z;\xi) \right). \tag{B12}

Finally, the equivalence between the multiplicative synaptic noise and the additive synaptic noise is obtained as follows:

J = 1, \tag{B13}

\Delta_A^2 = \alpha (1-a^2)^2 \Delta_M^2, \tag{B14}

k_\delta = k_\varepsilon, \tag{B15}

by comparing Eqs. (B11) and (B12) to Eqs. (A17) and (A18).

2. Random deletion

The derivation of the equivalent noise, Eq. (27), is given here. Random deletion has effects similar to those of the multiplicative synaptic noise, so we analyze it in a similar way. The synaptic connections are given by Eq. (10). The asymmetric cut coefficients are independently generated according to the probability distribution of Eq. (11). We analyze a more general case, where c_{ij} and c_{ji} have an arbitrary correlation such that

\mathrm{Cov}[c_{ij},c_{ji}] = k_c \mathrm{Var}[c_{ij}], \quad -1 \le k_c \le 1, \tag{B16}

\mathrm{Var}[c_{ij}] = E[(c_{ij})^2] - (E[c_{ij}])^2 = c(1-c). \tag{B17}

In this general case, the symmetric and the asymmetric random deletion correspond to k_c = 1 and k_c = 0, respectively. According to an analysis similar to that of the multiplicative synaptic noise, the local field becomes

h_i = (\xi_i^1 - a) m_1 + \alpha \left[ \frac{1}{1-U} + \frac{k_c(1-c)}{c} \right] U x_i + \frac{1}{N(1-a^2)(1-U)} \sum_{j\ne i} \sum_{\mu=2}^{\alpha N} (\xi_i^\mu - a)(\xi_j^\mu - a) x_j^{(\mu)} + \frac{1}{Nc(1-a^2)} \sum_{\mu=1}^{\alpha N} \sum_{j\ne i} (c_{ij} - c)(\xi_i^\mu - a)(\xi_j^\mu - a) x_j^{(\mu)(c_{ji})}, \tag{B18}

where

x_j^{(\mu)(c_{ji})} = F(h_j - h_j^{\{\mu,c_{ji}\}}), \tag{B19}

x_i^{(\mu)} = \tilde{F}\!\left( \sum_{\nu\ne\mu} (\xi_i^\nu - a) m_\nu + \frac{1}{Nc(1-a^2)} \sum_{\nu\ne\mu} \sum_{j\ne i} (c_{ij} - c)(\xi_i^\nu - a)(\xi_j^\nu - a) x_j^{(\nu)(c_{ji})} \right), \tag{B20}

h_j^{\{\mu,c_{ji}\}} = (\xi_j^\mu - a) m_\mu + \frac{1}{Nc(1-a^2)} \sum_{k\ne j} (c_{jk} - c)(\xi_j^\mu - a)(\xi_k^\mu - a) x_k + \frac{c_{ji} - c}{Nc(1-a^2)} \sum_{\nu\ne\mu} (\xi_j^\nu - a)(\xi_i^\nu - a) x_i + \frac{c_{ji} - c}{Nc(1-a^2)} (\xi_j^\mu - a)(\xi_i^\mu - a) x_i, \tag{B21}

and U is defined by Eq. (A12) similarly. The variance of the noise terms is given by

\sigma^2 = \frac{\alpha q}{(1-U)^2} + \alpha \frac{1-c}{c} q. \tag{B22}

Thus, after rewriting \xi_i^1 \to \xi and m_1 \to m, the effective response function becomes

Y(z;\xi) = F\!\left( (\xi - a) m + \sigma z + h + \alpha \left[ \frac{1}{1-U} + \frac{k_c(1-c)}{c} \right] U Y(z;\xi) \right). \tag{B23}

Finally, the equivalence between random deletion and the additive synaptic noise is obtained as follows:
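The deletion–noise equivalence above has a direct elementary reading: a Bernoulli cut coefficient c_{ij} with mean c, written as c_{ij} = c(1 + \varepsilon_{ij}), yields a multiplicative noise \varepsilon_{ij} = (c_{ij} - c)/c of variance (1-c)/c, which is the equivalent \Delta_M^2 of Eq. (27). A quick numerical confirmation (our sketch, not from the paper):

```python
import numpy as np

# Random deletion as multiplicative noise: a Bernoulli(c) cut coefficient
# written as c_ij = c (1 + eps_ij) gives eps_ij = (c_ij - c) / c with
# variance c(1 - c)/c^2 = (1 - c)/c, the equivalent Delta_M^2 of Eq. (27).
rng = np.random.default_rng(1)
c = 0.3
c_ij = rng.binomial(1, c, size=2_000_000).astype(float)
eps = (c_ij - c) / c
print(eps.var(), (1 - c) / c)   # both close to 7/3
```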
J = 1, \tag{B24}

\Delta_A^2 = \alpha (1-a^2)^2 \frac{1-c}{c}, \tag{B25}

k_\delta = k_c, \tag{B26}

by comparing Eqs. (B22) and (B23) to Eqs. (A17) and (A18). Substituting Eq. (B25) into Eq. (B14), we obtain the equivalence of Eq. (27).

3. Nonlinear synapse

The derivation of the equivalent noise, Eq. (28), is given here. The effect of the nonlinear synapse can be separated into a signal part and a noise part. The noise part can be regarded as additive synaptic noise.

The systematic deletion of synaptic connections can be achieved by introducing synaptic noise with an appropriate nonlinear function f(x) [14]. Note that T_{ij} obeys the normal distribution N(0,1) for p = \alpha N \to \infty. According to the naive S/N analysis [16], we can write the connections as

J_{ij} = \frac{\sqrt{p}}{N} f(T_{ij}) = \frac{J}{N(1-a^2)} \sum_{\mu=1}^{p} (\xi_i^\mu - a)(\xi_j^\mu - a) + \frac{\sqrt{p}}{N} [f(T_{ij}) - J T_{ij}], \tag{B27}

where

T_{ij} = \frac{1}{\sqrt{p}(1-a^2)} \sum_{\mu=1}^{p} (\xi_i^\mu - a)(\xi_j^\mu - a). \tag{B28}

The following derivation suggests that the residual overlap m_\mu for the first term in Eq. (B27) is enhanced by a factor of 1/(1-JU), while any enhancement of the last part is canceled because of the subtraction. It also implies that the last part corresponds to the synaptic noise. For the SCSNA of the nonlinear synapse, we can proceed in a manner similar to the analysis of the additive synaptic noise. We obtain the local field

h_i = J(\xi_i^1 - a) m_1 + \alpha \left[ \frac{J^2}{1-JU} + (\tilde{J}^2 - J^2) \right] U x_i + \frac{J}{N(1-a^2)(1-JU)} \sum_{j\ne i} \sum_{\mu=2}^{\alpha N} (\xi_i^\mu - a)(\xi_j^\mu - a) x_j^{(\mu)} + \frac{\sqrt{p}}{N} \sum_{j\ne i} [f(T_{ij}) - J T_{ij}] x_j^{(T_{ji})}, \tag{B29}

where

x_i^{(\mu)} = \tilde{F}\!\left( J \sum_{\nu\ne\mu} (\xi_i^\nu - a) m_\nu + \frac{\sqrt{p}}{N} \sum_{j\ne i} [f(T_{ij}^{(\mu)}) - J T_{ij}^{(\mu)}] x_j^{(T_{ji})} \right), \tag{B30}

T_{ij}^{(\mu)} = \frac{1}{\sqrt{p}(1-a^2)} \sum_{\nu\ne\mu} (\xi_i^\nu - a)(\xi_j^\nu - a), \tag{B31}

x_j^{(T_{ji})} = F\!\left( h_j - \frac{\sqrt{p}}{N} [f(T_{ji}) - J T_{ji}] x_i \right), \tag{B32}

and U is defined by Eq. (A12) similarly. The variance of the noise terms is given by

\sigma^2 = \frac{\alpha J^2}{(1-JU)^2} q + \alpha (\tilde{J}^2 - J^2) q. \tag{B33}

Thus, after rewriting \xi_i^1 \to \xi and m_1 \to m, the effective response function becomes

Y(z;\xi) = F\!\left( J(\xi - a) m + \sigma z + h + \alpha \left[ \frac{J^2}{1-JU} + (\tilde{J}^2 - J^2) \right] U Y(z;\xi) \right). \tag{B34}

Finally, the equivalence between the nonlinear synapse and the additive synaptic noise is obtained as follows:

\Delta_A^2 = \alpha (1-a^2)^2 \left( \frac{\tilde{J}^2}{J^2} - 1 \right), \tag{B35}

where

J = \int Dx \, x f(x), \quad \tilde{J}^2 = \int Dx \, f(x)^2, \tag{B36}

by comparing Eqs. (B33) and (B34) to Eqs. (A17) and (A18). Substituting Eq. (B35) into Eq. (B14), we obtain the equivalence of Eq. (28).

APPENDIX C: ASYMPTOTE FOR LARGE MULTIPLICATIVE SYNAPTIC NOISE

The derivation of the asymptote of the storage capacity for large multiplicative synaptic noise \Delta_M is given here. In Eqs. (20)–(22), let a = 0, J = 1, and F(x) = \mathrm{sgn}(x); then the order-parameter equations become

m = \mathrm{erf}\!\left( \frac{m}{\sqrt{2}\sigma} \right), \tag{C1}

q = 1, \tag{C2}

U = \frac{1}{\sigma} \sqrt{\frac{2}{\pi}} \exp\!\left( -\frac{m^2}{2\sigma^2} \right). \tag{C3}

Since the threshold becomes h = 0, the effective response function of Eq. (24) and the variance of the noise become
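The Gaussian integrals J = ∫Dx x f(x) and J̃² = ∫Dx f(x)² that enter the nonlinear-synapse equivalence are easy to evaluate for any concrete nonlinearity. The sketch below is our illustration; the clipping choice f(x) = sgn(x) is an assumption made here for the example, not fixed by the appendix. It estimates J and J̃² by Monte Carlo, giving J̃²/J² − 1 = π/2 − 1 as the equivalent Δ_A²/[α(1−a²)²].

```python
import numpy as np

# Monte Carlo estimate of the Gaussian integrals J = int Dx x f(x) and
# J~^2 = int Dx f(x)^2 for the illustrative clipping nonlinearity
# f(x) = sgn(x) (our example choice, not fixed by the appendix).
rng = np.random.default_rng(2)
xs = rng.standard_normal(4_000_000)

f = np.sign
J = np.mean(xs * f(xs))           # E|x| = sqrt(2/pi) for f = sgn
J2_tilde = np.mean(f(xs) ** 2)    # = 1 for f = sgn
ratio = J2_tilde / J ** 2 - 1     # equivalent Delta_A^2 / (alpha (1 - a^2)^2)
print(J, ratio)                   # ratio close to pi/2 - 1
```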
Y(z;\xi) = \mathrm{sgn}(\xi m + \sigma z), \tag{C4}

\sigma^2 = \frac{\alpha}{(1-U)^2} + \Delta_A^2, \tag{C5}

respectively, where the error function \mathrm{erf}(x) is defined as

\mathrm{erf}(x) = \frac{2}{\sqrt{\pi}} \int_0^x e^{-u^2} du. \tag{C6}

The slope of the right-hand side (rhs) of Eq. (C1) is given by

\frac{d}{dm} \mathrm{erf}\!\left( \frac{m}{\sqrt{2}\sigma} \right) = \frac{1}{\sigma} \sqrt{\frac{2}{\pi}} \exp\!\left( -\frac{m^2}{2\sigma^2} \right). \tag{C7}

Equation (C1) has nontrivial solutions m \ne 0 within the range where the slope of the rhs of Eq. (C7) at m = 0 is greater than 1. Therefore, the critical value of the noise \sigma_c^2 is given by

\sigma_c^2 = 2/\pi. \tag{C8}

This shows that a retrieval phase exists only for \sigma < \sigma_c. We define the parameter \tau (<1) as

\tau = \sigma/\sigma_c \tag{C9}

to solve for m as a function of \sigma in the vicinity of this critical value \sigma_c. The critical value of the additive synaptic noise is discussed in the case of \tau \simeq 1. The overlap m shows a first-order phase transition when \Delta_A is small, but the transition becomes second order in the large-\Delta_A region. The overlap satisfies m \ll 1 when \tau \simeq 1 and \Delta_A is sufficiently large; therefore the nontrivial solution for m is given as

m \simeq \sigma_c \tau \sqrt{6(1-\tau)} \tag{C10}

by Taylor expansion including terms up to the third order. Substituting Eq. (C10) into Eq. (C3), U becomes

U \simeq \frac{1}{\tau} \left( 1 - \frac{m^2}{2\sigma^2} \right) + O(m^4) = 3 - 2\tau^{-1}. \tag{C11}

From Eq. (26), the variance of the multiplicative synaptic noise \Delta_M^2 is related to the variance of the additive synaptic noise \Delta_A^2 as

\Delta_A^2 = \alpha \Delta_M^2 \tag{C12}

when the bias is a = 0. Therefore, substituting Eqs. (C11) and (C12) into Eq. (C5), the variance of the noise \sigma is given as

\sigma^2 = \frac{\alpha \tau^2}{4(1-\tau)^2} + \alpha \Delta_M^2. \tag{C13}

The loading rate \alpha becomes

\alpha = \frac{8 \tau^2 (1-\tau)^2}{\pi [\tau^2 + 4 \Delta_M^2 (1-\tau)^2]}. \tag{C14}

When the overlap is small enough, i.e., m \ll 1, the order-parameter equations of Eqs. (C1)–(C5) reduce to Eq. (C14). Solving Eq. (C14) for fixed values of \alpha and \Delta_M, we obtain the parameter \tau. Substituting \tau into Eq. (C10), we can obtain the overlap m for given \alpha and \Delta_M. It is easily confirmed that \tau increases with \alpha for a fixed value of \Delta_M. This means that the maximal value of \tau for which Eq. (C14) holds corresponds to the maximum value of \alpha, that is, the storage capacity \alpha_c. The critical value \tau_c is equal to the value which maximizes the loading rate of Eq. (C14) and becomes

\tau_c = \frac{(2\Delta_M)^{2/3}}{1 + (2\Delta_M)^{2/3}} \simeq 1 - (2\Delta_M)^{-2/3} \tag{C15}

in the large-\Delta_M limit. Therefore, substituting Eq. (C15) into Eq. (C14), we obtain Eq. (31) as follows:

\alpha_c = \frac{2}{\pi \Delta_M^2}. \tag{C16}
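The asymptote can be checked numerically; the sketch below is added here for illustration and is not from the paper. Maximizing the loading rate of Eq. (C14) over \tau on a fine grid reproduces \tau_c of Eq. (C15) and approaches \alpha_c = 2/(\pi\Delta_M^2) of Eq. (C16) to leading order for large \Delta_M.

```python
import numpy as np

# Numerical check of the large-Delta_M asymptote: maximize the loading rate
# of Eq. (C14), alpha(tau) = 8 tau^2 (1-tau)^2 / (pi [tau^2 + 4 Delta_M^2 (1-tau)^2]),
# over tau and compare with tau_c of Eq. (C15) and alpha_c = 2/(pi Delta_M^2).
Delta_M = 50.0
tau = np.linspace(1e-4, 1.0 - 1e-6, 2_000_000)
alpha = 8 * tau ** 2 * (1 - tau) ** 2 / (np.pi * (tau ** 2 + 4 * Delta_M ** 2 * (1 - tau) ** 2))

i = int(np.argmax(alpha))
tau_c = (2 * Delta_M) ** (2 / 3) / (1 + (2 * Delta_M) ** (2 / 3))
alpha_c = 2 / (np.pi * Delta_M ** 2)
print(tau[i], tau_c)      # the grid maximizer matches Eq. (C15)
print(alpha[i], alpha_c)  # same leading order for large Delta_M
```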
[1] P.R. Huttenlocher, Brain Res. 163, 195 (1979).
[2] P.R. Huttenlocher, C. de Courten, L.J. Garey, and H. Van der Loos, Neurosci. Lett. 33, 247 (1982).
[3] J.P. Bourgeois and P. Rakic, J. Neurosci. 13, 2801 (1993).
[4] J. Takacs and J. Hamori, J. Neurosci. Res. 38, 515 (1994).
[5] G.M. Innocenti, Trends Neurosci. 18, 397 (1995).
[6] M.F. Eckenhoff and P. Rakic, Dev. Brain Res. 64, 129 (1991).
[7] P. Rakic, J.P. Bourgeois, and P.S. Goldman-Rakic, Prog. Brain Res. 102, 227 (1994).
[8] M.P. Stryker, J. Neurosci. 6, 2117 (1986).
[9] A.W. Roe, S.L. Pallas, J.O. Hahm, and M. Sur, Science 250, 818 (1990).
[10] J.R. Wolff, R. Laskawi, W.B. Spatz, and M. Missler, Behav. Brain Res. 66, 13 (1995).
[11] H. Yanai and S. Amari, IEEE Trans. Neural Netw. 7, 803 (1996).
[12] D.J. Amit, N. Brunel, and M.V. Tsodyks, J. Neurosci. 14, 6435 (1994).
[13] M. Griniasty, M.V. Tsodyks, and D.J. Amit, Neural Comput. 5, 1 (1993).
[14] H. Sompolinsky, Phys. Rev. A 34, 2571 (1986).
[15] G. Chechik, I. Meilijson, and E. Ruppin, Neural Comput. 10, 1759 (1998).
[16] M. Okada, T. Fukai, and M. Shiino, Phys. Rev. E 57, 2095 (1998).
[17] D.J. Amit, H. Gutfreund, and H. Sompolinsky, Phys. Rev. A 35, 2293 (1987).
[18] S. Amari, Neural Networks 2, 451 (1989).
[19] I. Meilijson and E. Ruppin, Biol. Cybern. 74, 479 (1996).
[20] M. Okada, Neural Networks 9, 1429 (1996).
[21] M. Shiino and T. Fukai, J. Phys. A 25, L375 (1992).
[22] M. Shiino and T. Fukai, Phys. Rev. E 48, 867 (1993).
[23] M. Okada, Neural Networks 8, 833 (1995).
[24] K. Mimura, M. Okada, and K. Kurata, IEICE Trans. Inf. Syst. 8, 928 (1998).
[25] K. Mimura, M. Okada, and K. Kurata, IEICE Trans. Inf. Syst. 11, 1298 (1998).
[26] T. Kimoto and M. Okada, Biol. Cybern. 85, 319 (2001).
[27] H. Sompolinsky, in Heidelberg Colloquium on Glassy Dynamics, edited by J.L. van Hemmen and I. Morgenstern (Springer-Verlag, Berlin, 1987), p. 485.
[28] M.V. Tsodyks and M.V. Feigelman, Europhys. Lett. 6, 101 (1988).
[29] J. Buhmann, R. Divko, and K. Schulten, Phys. Rev. A 39, 2689 (1989).
[30] C.J. Perez-Vicente and D.J. Amit, J. Phys. A 22, 559 (1989).