
Course  : H0434/Jaringan Syaraf Tiruan
Year    : 2005
Version : 1

Session 17
HOPFIELD NETWORK
Learning Outcomes
By the end of this session, students are expected to be able to:
• Explain the concept of the Hopfield network
Material Outline
• Hopfield Model
• Lyapunov Function
Hopfield Model
Equations of Operation
$$C \frac{dn_i(t)}{dt} = \sum_{j=1}^{S} T_{ij}\, a_j(t) - \frac{n_i(t)}{R_i} + I_i, \qquad T_{ij} = \frac{1}{R_{ij}}$$

n_i - input voltage to the ith amplifier
a_i - output voltage of the ith amplifier
C   - amplifier input capacitance
I_i - fixed input current to the ith amplifier

$$\frac{1}{R_i} = \frac{1}{\rho} + \sum_{j=1}^{S} \frac{1}{R_{ij}}$$

$$n_i = f^{-1}(a_i), \qquad a_i = f(n_i)$$
Network Format
$$R_i C \frac{dn_i(t)}{dt} = \sum_{j=1}^{S} R_i T_{ij}\, a_j(t) - n_i(t) + R_i I_i$$

Define:

$$\epsilon = R_i C, \qquad w_{ij} = R_i T_{ij}, \qquad b_i = R_i I_i$$

$$\epsilon \frac{dn_i(t)}{dt} = -n_i(t) + \sum_{j=1}^{S} w_{ij}\, a_j(t) + b_i$$

Vector Form:

$$\epsilon \frac{d\mathbf{n}(t)}{dt} = -\mathbf{n}(t) + \mathbf{W}\mathbf{a}(t) + \mathbf{b}, \qquad \mathbf{a}(t) = \mathbf{f}(\mathbf{n}(t))$$
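The vector form lends itself to a direct numerical sketch. The Python/NumPy code below (not from the original slides) integrates eps * dn/dt = -n + W f(n) + b with forward Euler; the two-neuron weight matrix, the tanh amplifier standing in for f, and the initial condition are illustrative assumptions.

```python
import numpy as np

def simulate_hopfield(W, b, n0, f, eps=1.0, dt=0.01, steps=2000):
    """Forward-Euler integration of the vector form  eps * dn/dt = -n + W f(n) + b."""
    n = n0.astype(float).copy()
    for _ in range(steps):
        a = f(n)                              # amplifier outputs a(t) = f(n(t))
        n = n + (dt / eps) * (-n + W @ a + b)
    return f(n)                               # final network output

# Illustrative two-neuron instance (assumed values); tanh with gain 1.4 stands in for the amplifier.
W = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.zeros(2)
a_final = simulate_hopfield(W, b, n0=np.array([0.3, -0.1]), f=lambda n: np.tanh(1.4 * n))
print(a_final)                                # the output settles near an equilibrium of the circuit
```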
Hopfield Network
Lyapunov Function

$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T\mathbf{W}\mathbf{a} + \sum_{i=1}^{S}\left[\int_{0}^{a_i} f^{-1}(u)\,du\right] - \mathbf{b}^T\mathbf{a}$$
Individual Derivatives
First Term:
$$\frac{d}{dt}\left[-\frac{1}{2}\mathbf{a}^T\mathbf{W}\mathbf{a}\right] = -\frac{1}{2}\frac{d}{dt}\left[\mathbf{a}^T\mathbf{W}\mathbf{a}\right] = -(\mathbf{W}\mathbf{a})^T\frac{d\mathbf{a}}{dt} = -\mathbf{a}^T\mathbf{W}\frac{d\mathbf{a}}{dt}$$
Second Term:

$$\frac{d}{dt}\left[\int_{0}^{a_i} f^{-1}(u)\,du\right] = \frac{d}{da_i}\left[\int_{0}^{a_i} f^{-1}(u)\,du\right]\frac{da_i}{dt} = f^{-1}(a_i)\frac{da_i}{dt} = n_i\frac{da_i}{dt}$$

$$\frac{d}{dt}\left[\sum_{i=1}^{S}\int_{0}^{a_i} f^{-1}(u)\,du\right] = \mathbf{n}^T\frac{d\mathbf{a}}{dt}$$
Third Term:

$$\frac{d}{dt}\left[-\mathbf{b}^T\mathbf{a}\right] = -\mathbf{b}^T\frac{d\mathbf{a}}{dt}$$
Complete Lyapunov Derivative

$$\frac{d}{dt}V(\mathbf{a}) = -\mathbf{a}^T\mathbf{W}\frac{d\mathbf{a}}{dt} + \mathbf{n}^T\frac{d\mathbf{a}}{dt} - \mathbf{b}^T\frac{d\mathbf{a}}{dt} = \left[-\mathbf{a}^T\mathbf{W} + \mathbf{n}^T - \mathbf{b}^T\right]\frac{d\mathbf{a}}{dt}$$

From the system equations we know:

$$-\mathbf{a}^T\mathbf{W} + \mathbf{n}^T - \mathbf{b}^T = -\epsilon\left[\frac{d\mathbf{n}(t)}{dt}\right]^T$$

So the derivative can be written:

$$\frac{d}{dt}V(\mathbf{a}) = -\epsilon\left[\frac{d\mathbf{n}(t)}{dt}\right]^T\frac{d\mathbf{a}}{dt} = -\epsilon\sum_{i=1}^{S}\left(\frac{dn_i}{dt}\right)\left(\frac{da_i}{dt}\right) = -\epsilon\sum_{i=1}^{S}\left[\frac{d}{da_i}f^{-1}(a_i)\right]\left(\frac{da_i}{dt}\right)^2$$

If

$$\frac{d}{da_i}f^{-1}(a_i) > 0$$

then

$$\frac{d}{dt}V(\mathbf{a}) \le 0$$
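As a sanity check on this derivative formula, here is a small Python/NumPy sketch (not from the original slides) that compares dV/dt = -eps * sum_i (dn_i/dt)(da_i/dt) with a finite-difference estimate of dV/dt along a short trajectory. It assumes a tanh amplifier, a = tanh(n), so that f^{-1}(a) = arctanh(a) and the integral of f^{-1} from 0 to a is a*arctanh(a) + 0.5*ln(1 - a^2); the weights and initial condition are also assumed.

```python
import numpy as np

eps, dt = 1.0, 1e-3
W = np.array([[0.0, 1.0], [1.0, 0.0]])        # assumed symmetric weight matrix
b = np.zeros(2)

def V(a):
    """Lyapunov function for a = tanh(n): integral of arctanh is a*arctanh(a) + 0.5*ln(1-a^2)."""
    return -0.5 * a @ W @ a + np.sum(a * np.arctanh(a) + 0.5 * np.log(1.0 - a**2)) - b @ a

n = np.array([0.4, -0.1])                     # assumed initial condition
for _ in range(5):
    a = np.tanh(n)
    dn = (-n + W @ a + b) / eps               # dn/dt from the network equation
    da = (1.0 - a**2) * dn                    # chain rule: da/dt = f'(n) * dn/dt
    formula = -eps * dn @ da                  # -eps * sum_i (dn_i/dt)(da_i/dt)
    numeric = (V(np.tanh(n + dt * dn)) - V(a)) / dt   # finite-difference estimate of dV/dt
    print(formula, numeric)                   # the two agree, and both are negative here
    n += dt * dn
```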
Invariant Sets
$$Z = \left\{\mathbf{a} : \frac{dV(\mathbf{a})}{dt} = 0,\ \mathbf{a} \text{ in the closure of } G\right\}$$

$$\frac{d}{dt}V(\mathbf{a}) = -\epsilon\sum_{i=1}^{S}\left[\frac{d}{da_i}f^{-1}(a_i)\right]\left(\frac{da_i}{dt}\right)^2$$

This will be zero only if the neuron outputs are not changing:

$$\frac{d\mathbf{a}}{dt} = \mathbf{0}$$

Therefore, the system energy is not changing only at the equilibrium points of the circuit. Thus, all points in Z are potential attractors:

$$L = Z$$
Example

$$n = f^{-1}(a) = \frac{2}{\gamma\pi}\tan\left(\frac{\pi a}{2}\right), \qquad a = f(n) = \frac{2}{\pi}\tan^{-1}\left(\frac{\gamma\pi n}{2}\right)$$

$$T_{1,2} = T_{2,1} = 1, \qquad R_{1,2} = R_{2,1} = 1, \qquad \mathbf{W} = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$$

$$\epsilon = R_i C = 1, \qquad \gamma = 1.4, \qquad I_1 = I_2 = 0, \qquad \mathbf{b} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$
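As a quick illustration, here is a minimal Python/NumPy sketch (not part of the slides) of this transfer-function pair for gamma = 1.4, confirming that the two expressions are inverses; the test point is arbitrary.

```python
import numpy as np

gamma = 1.4

def f(n):
    """a = f(n) = (2/pi) * arctan(gamma*pi*n / 2)"""
    return (2.0 / np.pi) * np.arctan(gamma * np.pi * n / 2.0)

def f_inv(a):
    """n = f^{-1}(a) = (2/(gamma*pi)) * tan(pi*a / 2)"""
    return (2.0 / (gamma * np.pi)) * np.tan(np.pi * a / 2.0)

n = 0.7                                   # arbitrary test point
print(np.isclose(f_inv(f(n)), n))         # True: f and f_inv are inverses
```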
Example Lyapunov Function

$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T\mathbf{W}\mathbf{a} + \sum_{i=1}^{S}\left[\int_{0}^{a_i} f^{-1}(u)\,du\right] - \mathbf{b}^T\mathbf{a}$$

$$-\frac{1}{2}\mathbf{a}^T\mathbf{W}\mathbf{a} = -\frac{1}{2}\begin{bmatrix} a_1 & a_2 \end{bmatrix}\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}\begin{bmatrix} a_1 \\ a_2 \end{bmatrix} = -a_1 a_2$$

$$\int_{0}^{a_i} f^{-1}(u)\,du = \frac{2}{\gamma\pi}\int_{0}^{a_i}\tan\left(\frac{\pi u}{2}\right)du = \frac{2}{\gamma\pi}\left[-\frac{2}{\pi}\log\cos\left(\frac{\pi u}{2}\right)\right]_{0}^{a_i} = -\frac{4}{\gamma\pi^2}\log\left[\cos\left(\frac{\pi a_i}{2}\right)\right]$$

$$V(\mathbf{a}) = -a_1 a_2 - \frac{4}{1.4\pi^2}\left[\log\cos\left(\frac{\pi a_1}{2}\right) + \log\cos\left(\frac{\pi a_2}{2}\right)\right]$$
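One way to check the integration step above is to compare the closed-form V(a) with a version that evaluates the integral of f^{-1} numerically. The sketch below (an illustration, assuming SciPy is available) does this with scipy.integrate.quad at an arbitrary test point.

```python
import numpy as np
from scipy.integrate import quad

gamma = 1.4

def f_inv(u):
    return (2.0 / (gamma * np.pi)) * np.tan(np.pi * u / 2.0)

def V_closed(a1, a2):
    """Closed-form V(a) derived above (W = [[0,1],[1,0]], b = 0)."""
    logs = np.log(np.cos(np.pi * a1 / 2.0)) + np.log(np.cos(np.pi * a2 / 2.0))
    return -a1 * a2 - 4.0 / (gamma * np.pi**2) * logs

def V_numeric(a1, a2):
    """Same V(a), but with the integral of f^{-1} evaluated numerically."""
    integral = quad(f_inv, 0.0, a1)[0] + quad(f_inv, 0.0, a2)[0]
    return -a1 * a2 + integral

print(V_closed(0.5, -0.3), V_numeric(0.5, -0.3))   # the two values agree
```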
Example Network Equations
$$\frac{d\mathbf{n}}{dt} = -\mathbf{n} + \mathbf{W}\mathbf{f}(\mathbf{n}) = -\mathbf{n} + \mathbf{W}\mathbf{a}$$

$$dn_1/dt = a_2 - n_1, \qquad dn_2/dt = a_1 - n_2$$

$$a_1 = \frac{2}{\pi}\tan^{-1}\left(\frac{1.4\pi}{2} n_1\right), \qquad a_2 = \frac{2}{\pi}\tan^{-1}\left(\frac{1.4\pi}{2} n_2\right)$$
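These two coupled equations integrate easily with forward Euler. The sketch below (assumed step size, duration, and initial condition; not from the slides) runs the example network and checks that the Lyapunov function from the previous slide never increases along the trajectory.

```python
import numpy as np

gamma, eps, dt = 1.4, 1.0, 0.01
W = np.array([[0.0, 1.0], [1.0, 0.0]])

f = lambda n: (2.0 / np.pi) * np.arctan(gamma * np.pi * n / 2.0)

def V(a):
    """Example Lyapunov function: V(a) = -a1*a2 - 4/(gamma*pi^2) * sum(log cos(pi*a/2))."""
    return -a[0] * a[1] - 4.0 / (gamma * np.pi**2) * np.sum(np.log(np.cos(np.pi * a / 2.0)))

n = np.array([0.5, -0.2])                 # assumed initial condition
values = []
for _ in range(1000):                     # integrate out to t = 10
    a = f(n)
    values.append(V(a))
    n += (dt / eps) * (-n + W @ a)        # dn1/dt = a2 - n1, dn2/dt = a1 - n2  (b = 0)
print(f(n))                               # the outputs approach one of the attractors
print(np.max(np.diff(values)))            # about <= 0: V never increases along the trajectory
```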
Lyapunov Function and Trajectory

[Figure: surface plot of V(a) and contour plot over the (a1, a2) plane, with a network trajectory overlaid.]
Time Response

[Figure: time response of the outputs a1 and a2, and of V(a), for 0 ≤ t ≤ 10.]
Convergence to a Saddle Point

[Figure: trajectory in the (a1, a2) plane converging to a saddle point of V(a).]
Hopfield Attractors
The potential attractors of the Hopfield network satisfy:
$$\frac{d\mathbf{a}}{dt} = \mathbf{0}$$

How are these points related to the minima of V(a)? The minima must satisfy:

$$\nabla V(\mathbf{a}) = \begin{bmatrix} \dfrac{\partial V}{\partial a_1} & \dfrac{\partial V}{\partial a_2} & \cdots & \dfrac{\partial V}{\partial a_S} \end{bmatrix}^T = \mathbf{0}$$

where the Lyapunov function is given by:

$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T\mathbf{W}\mathbf{a} + \sum_{i=1}^{S}\left[\int_{0}^{a_i} f^{-1}(u)\,du\right] - \mathbf{b}^T\mathbf{a}$$
Hopfield Attractors
Using previous results, we can show that:
$$\nabla V(\mathbf{a}) = -\mathbf{W}\mathbf{a} + \mathbf{n} - \mathbf{b} = -\epsilon\frac{d\mathbf{n}(t)}{dt}$$

The ith element of the gradient is therefore:

$$\frac{\partial V(\mathbf{a})}{\partial a_i} = -\epsilon\frac{dn_i}{dt} = -\epsilon\frac{d}{dt}f^{-1}(a_i) = -\epsilon\left[\frac{d}{da_i}f^{-1}(a_i)\right]\frac{da_i}{dt}$$

Since the transfer function and its inverse are monotonic increasing:

$$\frac{d}{da_i}f^{-1}(a_i) > 0$$

All points for which

$$\frac{d\mathbf{a}(t)}{dt} = \mathbf{0}$$

will also satisfy

$$\nabla V(\mathbf{a}) = \mathbf{0}$$

Therefore all attractors will be stationary points of V(a).
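To see this numerically for the two-neuron example used earlier, the sketch below (assumed step size, duration, and initial condition) relaxes the circuit to an equilibrium with forward Euler and then evaluates grad V(a) = -Wa + n - b at the resulting output.

```python
import numpy as np

gamma = 1.4
W = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.zeros(2)

f     = lambda n: (2.0 / np.pi) * np.arctan(gamma * np.pi * n / 2.0)
f_inv = lambda a: (2.0 / (gamma * np.pi)) * np.tan(np.pi * a / 2.0)

def grad_V(a):
    """Gradient of the Lyapunov function: grad V(a) = -W a + n - b, with n = f^{-1}(a)."""
    return -W @ a + f_inv(a) - b

n = np.array([0.6, 0.4])                  # assumed initial condition
for _ in range(5000):                     # relax out to t = 50
    n += 0.01 * (-n + W @ f(n) + b)

a_eq = f(n)
print(a_eq)                               # an attractor of the network
print(grad_V(a_eq))                       # numerically zero: the attractor is a stationary point of V
```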
Effect of Gain
$$a = f(n) = \frac{2}{\pi}\tan^{-1}\left(\frac{\gamma\pi n}{2}\right)$$

[Figure: a = f(n) plotted for -5 ≤ n ≤ 5 with gains γ = 0.14, γ = 1.4, and γ = 14.]
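A small sketch (illustrative, Python/NumPy assumed) that tabulates this characteristic for the three gains plotted above; higher gain makes the amplifier steeper near the origin and more step-like overall.

```python
import numpy as np

def f(n, gamma):
    """Amplifier characteristic a = (2/pi) * arctan(gamma*pi*n / 2)."""
    return (2.0 / np.pi) * np.arctan(gamma * np.pi * n / 2.0)

n = np.linspace(-5.0, 5.0, 11)
for gamma in (0.14, 1.4, 14.0):
    print(gamma, np.round(f(n, gamma), 3))    # higher gain: output saturates closer to +/- 1
```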
Lyapunov Function

$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T\mathbf{W}\mathbf{a} + \sum_{i=1}^{S}\left[\int_{0}^{a_i} f^{-1}(u)\,du\right] - \mathbf{b}^T\mathbf{a}$$

$$f^{-1}(u) = \frac{2}{\gamma\pi}\tan\left(\frac{\pi u}{2}\right), \qquad \int_{0}^{a_i} f^{-1}(u)\,du = -\frac{4}{\gamma\pi^2}\log\cos\left(\frac{\pi a_i}{2}\right)$$

[Figure: the integral term -(4/(γπ²)) log cos(πa/2) plotted for -1 ≤ a ≤ 1 with γ = 0.14, γ = 1.4, and γ = 14; the term flattens as the gain increases.]
High Gain Lyapunov Function
As γ → ∞, the Lyapunov function reduces to:

$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T\mathbf{W}\mathbf{a} - \mathbf{b}^T\mathbf{a}$$

The high-gain Lyapunov function is quadratic:

$$V(\mathbf{a}) = -\frac{1}{2}\mathbf{a}^T\mathbf{W}\mathbf{a} - \mathbf{b}^T\mathbf{a} = \frac{1}{2}\mathbf{a}^T\mathbf{A}\mathbf{a} + \mathbf{d}^T\mathbf{a} + c$$

where

$$\nabla^2 V(\mathbf{a}) = \mathbf{A} = -\mathbf{W}, \qquad \mathbf{d} = -\mathbf{b}, \qquad c = 0$$
Example
$$\nabla^2 V(\mathbf{a}) = -\mathbf{W} = \begin{bmatrix} 0 & -1 \\ -1 & 0 \end{bmatrix}$$

$$\nabla^2 V(\mathbf{a}) - \lambda\mathbf{I} = \begin{bmatrix} -\lambda & -1 \\ -1 & -\lambda \end{bmatrix}, \qquad \left|\nabla^2 V(\mathbf{a}) - \lambda\mathbf{I}\right| = \lambda^2 - 1 = (\lambda + 1)(\lambda - 1)$$

$$\lambda_1 = -1, \quad \mathbf{z}_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}; \qquad \lambda_2 = 1, \quad \mathbf{z}_2 = \begin{bmatrix} -1 \\ 1 \end{bmatrix}$$

[Figure: surface and contour plots of the high-gain Lyapunov function V(a) over the (a1, a2) plane.]
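The same eigen-analysis can be reproduced in a few lines of NumPy (an illustrative sketch, not part of the slides). The Hessian -W has one negative and one positive eigenvalue, so the origin is a saddle of the high-gain Lyapunov function, which is lowest along the z1 = [1 1]^T direction.

```python
import numpy as np

W = np.array([[0.0, 1.0], [1.0, 0.0]])
A = -W                                      # Hessian of the high-gain Lyapunov function

eigvals, eigvecs = np.linalg.eigh(A)        # eigh: ascending eigenvalues for the symmetric matrix A
print(eigvals)                              # [-1.  1.]  -> indefinite Hessian: a saddle at the origin
print(eigvecs)                              # columns are eigenvectors (proportional to z1 and z2)

def V_high_gain(a):
    """High-gain limit V(a) = -1/2 a^T W a   (b = 0)."""
    return -0.5 * a @ W @ a

print(V_high_gain(np.array([1.0, 1.0])),    # -1.0: V is low along the z1 direction
      V_high_gain(np.array([-1.0, 1.0])))   #  1.0: V is high along the z2 direction
```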