
Course: H0434/Jaringan Syaraf Tiruan (Artificial Neural Networks)
Year: 2005
Version: 1

Session 7
INSTAR AND OUTSTAR NETWORKS
Learning Outcomes
By the end of this session, students are expected to be able to:
• Relate the instar and outstar rules to associative networks
Outline
• Instar network architecture
• Outstar network architecture
• Learning rule
INSTAR
• RECOGNITION NETWORK
INSTAR OPERATION
a = hardlim Wp + b = hardlim 1w T p + b
The instar will be active when
T
1w p
 –b
or
1w
Tp
=
1w
p cos q  – b
For normalized vectors, the largest inner product occurs when the
angle between the weight vector and the input vector is zero -- the
input vector is equal to the weight vector.
The rows of a weight matrix represent patterns
to be recognized.
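As a quick numerical illustration (not taken from the slides), the activation condition can be checked directly; the weight vector, bias, and input below are made-up values for the sketch:

```python
import numpy as np

def hardlim(n):
    """Hard-limit transfer function: 1 if n >= 0, else 0."""
    return 1.0 if n >= 0 else 0.0

# Hypothetical normalized prototype stored as the instar's weight row
w1 = np.array([1.0, -1.0, -1.0]) / np.sqrt(3)
b = -0.75                       # assumed bias; sets how close p must be to w1

p = np.array([1.0, -1.0, -1.0]) / np.sqrt(3)   # input equal to the stored pattern
a = hardlim(w1 @ p + b)         # inner product is largest when p equals w1
print(a)                        # -> 1.0, the instar fires
```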
INSTAR RULE
Hebb rule with decay:

$w_{ij}(q) = w_{ij}(q-1) + \alpha \, a_i(q) \, p_j(q) - \gamma \, w_{ij}(q-1)$

Modify it so that learning and forgetting occur only when the neuron is active, giving the instar rule:

$w_{ij}(q) = w_{ij}(q-1) + \alpha \, a_i(q) \, p_j(q) - \gamma \, a_i(q) \, w_{ij}(q-1)$

or, setting the decay rate $\gamma$ equal to the learning rate $\alpha$,

$w_{ij}(q) = w_{ij}(q-1) + \alpha \, a_i(q) \bigl( p_j(q) - w_{ij}(q-1) \bigr)$

Vector form:

${}_{i}\mathbf{w}(q) = {}_{i}\mathbf{w}(q-1) + \alpha \, a_i(q) \bigl( \mathbf{p}(q) - {}_{i}\mathbf{w}(q-1) \bigr)$
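The update in vector form is easy to sketch in code; this is a minimal illustration of the instar rule as stated above (the function name and the choice of α are mine, not from the slides):

```python
import numpy as np

def instar_update(w, p, a_i, alpha=1.0):
    """Instar rule: move the row weight vector w toward the input p,
    scaled by the learning rate and by the neuron's output a_i."""
    return w + alpha * a_i * (p - w)

w = np.zeros(3)
p = np.array([1.0, -1.0, -1.0])
w = instar_update(w, p, a_i=1.0)   # active neuron: w moves toward p
print(w)                           # -> [ 1. -1. -1.]
w = instar_update(w, p, a_i=0.0)   # inactive neuron: w is left unchanged
print(w)                           # -> [ 1. -1. -1.]
```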
GRAPHICAL REPRESENTATION
For the case where the instar is active ($a_i = 1$):

${}_{i}\mathbf{w}(q) = {}_{i}\mathbf{w}(q-1) + \alpha \bigl( \mathbf{p}(q) - {}_{i}\mathbf{w}(q-1) \bigr)$

or

${}_{i}\mathbf{w}(q) = (1 - \alpha) \, {}_{i}\mathbf{w}(q-1) + \alpha \, \mathbf{p}(q)$

For the case where the instar is inactive ($a_i = 0$):

${}_{i}\mathbf{w}(q) = {}_{i}\mathbf{w}(q-1)$
EXAMPLE

[Figure: a piece of fruit on a conveyor is examined by a camera (sight) and by a set of measurement sensors; both feed the network, whose output answers "Orange?"]

$p^{0} = \begin{cases} 1 & \text{orange detected visually} \\ 0 & \text{orange not detected} \end{cases}$

$\mathbf{p} = \begin{bmatrix} \text{shape} \\ \text{texture} \\ \text{weight} \end{bmatrix}$
TRAINING

$\mathbf{W}(0) = {}_{1}\mathbf{w}^{T}(0) = \begin{bmatrix} 0 & 0 & 0 \end{bmatrix}$

Training sequence:

$\left\{ p^{0}(1) = 0,\ \mathbf{p}(1) = \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix} \right\},\quad \left\{ p^{0}(2) = 1,\ \mathbf{p}(2) = \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix} \right\},\ \ldots$

First iteration ($\alpha = 1$):

$a(1) = \mathrm{hardlim}\bigl( w^{0} p^{0}(1) + \mathbf{W}\mathbf{p}(1) - 2 \bigr) = \mathrm{hardlim}\left( 3 \cdot 0 + \begin{bmatrix} 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix} - 2 \right) = 0 \quad \text{(no response)}$

${}_{1}\mathbf{w}(1) = {}_{1}\mathbf{w}(0) + a(1) \bigl( \mathbf{p}(1) - {}_{1}\mathbf{w}(0) \bigr) = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} + 0 \left( \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix} - \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \right) = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$
Further Training

$a(2) = \mathrm{hardlim}\bigl( w^{0} p^{0}(2) + \mathbf{W}\mathbf{p}(2) - 2 \bigr) = \mathrm{hardlim}\left( 3 \cdot 1 + \begin{bmatrix} 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix} - 2 \right) = 1 \quad \text{(orange)}$

${}_{1}\mathbf{w}(2) = {}_{1}\mathbf{w}(1) + a(2) \bigl( \mathbf{p}(2) - {}_{1}\mathbf{w}(1) \bigr) = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} + 1 \left( \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix} - \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \right) = \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix}$

$a(3) = \mathrm{hardlim}\bigl( w^{0} p^{0}(3) + \mathbf{W}\mathbf{p}(3) - 2 \bigr) = \mathrm{hardlim}\left( 3 \cdot 0 + \begin{bmatrix} 1 & -1 & -1 \end{bmatrix} \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix} - 2 \right) = 1 \quad \text{(orange)}$

${}_{1}\mathbf{w}(3) = {}_{1}\mathbf{w}(2) + a(3) \bigl( \mathbf{p}(3) - {}_{1}\mathbf{w}(2) \bigr) = \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix} + 1 \left( \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix} - \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix} \right) = \begin{bmatrix} 1 \\ -1 \\ -1 \end{bmatrix}$

Orange will now be detected if either set of sensors works.
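The three iterations above can be reproduced with a short script; this is only a sketch of the slide's hand computation (variable names are mine), using the instar rule with α = 1:

```python
import numpy as np

def hardlim(n):
    return 1.0 if n >= 0 else 0.0

w0, b = 3.0, -2.0                       # fixed weight for the visual input, bias
w = np.zeros(3)                         # adaptive weights for shape, texture, weight
p_orange = np.array([1.0, -1.0, -1.0])
alpha = 1.0

# Training sequence: (visual detection p0, measurement vector p)
sequence = [(0.0, p_orange), (1.0, p_orange), (0.0, p_orange)]

for q, (p0, p) in enumerate(sequence, start=1):
    a = hardlim(w0 * p0 + w @ p + b)    # instar output
    w = w + alpha * a * (p - w)         # instar rule
    print(f"q={q}: a={a}, w={w}")
# After q=2 the weights equal the orange pattern, so at q=3 the
# measurements alone (p0 = 0) are enough to detect the orange.
```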
Kohonen Rule
1 w q
= 1w q – 1 +  p q – 1w  q – 1 
for i  X q
Learning occurs when the neuron’s index i is a member of
the set X(q). We will see in Chapter 14 that this can be used
to train all neurons in a given neighborhood.
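A rough sketch of how the Kohonen rule could be applied to a full weight matrix; here the active set X(q) is simply passed in as a list of row indices, since the neighborhood logic itself is left to the later chapter:

```python
import numpy as np

def kohonen_update(W, p, active_rows, alpha=0.5):
    """Move the weight rows whose indices are in active_rows toward the input p."""
    W = W.copy()
    for i in active_rows:
        W[i] = W[i] + alpha * (p - W[i])
    return W

W = np.zeros((4, 3))                           # 4 neurons, 3 inputs (made-up sizes)
p = np.array([1.0, -1.0, -1.0])
W = kohonen_update(W, p, active_rows=[1, 2])   # e.g. a winner plus one neighbor
print(W)                                       # only rows 1 and 2 have moved toward p
```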
OUTSTAR
• RECALL NETWORK
Outstar Operation
Suppose we want the outstar to recall a certain pattern a*
whenever the input p = 1 is presented to the network. Let
$\mathbf{W} = \mathbf{a}^{*}$

Then, when $p = 1$,

$\mathbf{a} = \mathrm{satlins}(\mathbf{W} p) = \mathrm{satlins}(\mathbf{a}^{*} \cdot 1) = \mathbf{a}^{*}$

and the pattern is correctly recalled.
The columns of a weight matrix represent patterns
to be recalled.
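A minimal sketch of outstar recall with a single scalar input, assuming a made-up pattern a*:

```python
import numpy as np

def satlins(n):
    """Symmetric saturating linear transfer function: clip to [-1, 1]."""
    return np.clip(n, -1.0, 1.0)

a_star = np.array([-1.0, -1.0, 1.0])   # pattern to be recalled (assumed values)
W = a_star.reshape(-1, 1)              # store a* as the single column of W

p = np.array([1.0])                    # stimulus present
print(satlins(W @ p))                  # -> [-1. -1.  1.], i.e. a* is recalled
```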
Outstar Rule
For the instar rule we made the weight decay term of the Hebb
rule proportional to the output of the network. For the outstar
rule we make the weight decay term proportional to the input of
the network.
$w_{ij}(q) = w_{ij}(q-1) + \alpha \, a_i(q) \, p_j(q) - \gamma \, p_j(q) \, w_{ij}(q-1)$

If we make the decay rate $\gamma$ equal to the learning rate $\alpha$,

$w_{ij}(q) = w_{ij}(q-1) + \alpha \bigl( a_i(q) - w_{ij}(q-1) \bigr) p_j(q)$

Vector form (for column $\mathbf{w}_j$ of $\mathbf{W}$):

$\mathbf{w}_{j}(q) = \mathbf{w}_{j}(q-1) + \alpha \bigl( \mathbf{a}(q) - \mathbf{w}_{j}(q-1) \bigr) p_j(q)$
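A minimal sketch of the outstar update for one column of W, mirroring the instar sketch earlier (names and α are mine, not from the slides):

```python
import numpy as np

def outstar_update(w_col, a, p_j, alpha=1.0):
    """Outstar rule: move the column weight vector w_col toward the output a,
    scaled by the learning rate and by the input component p_j."""
    return w_col + alpha * (a - w_col) * p_j

w = np.zeros(3)
a = np.array([-1.0, -1.0, 1.0])
w = outstar_update(w, a, p_j=1.0)   # stimulus present: the column learns a
print(w)                            # -> [-1. -1.  1.]
w = outstar_update(w, a, p_j=0.0)   # stimulus absent: the column is unchanged
print(w)                            # -> [-1. -1.  1.]
```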
Example - Pineapple Recall
Definitions

$\mathbf{a} = \mathrm{satlins}(\mathbf{W}^{0}\mathbf{p}^{0} + \mathbf{W} p)$

$\mathbf{W}^{0} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$

$\mathbf{p}^{0} = \begin{bmatrix} \text{shape} \\ \text{texture} \\ \text{weight} \end{bmatrix}, \qquad \mathbf{p}^{0}_{\text{pineapple}} = \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix}$

$p = \begin{cases} 1 & \text{if a pineapple can be seen} \\ 0 & \text{otherwise} \end{cases}$
Iteration 1

Training sequence ($\alpha = 1$):

$\left\{ \mathbf{p}^{0}(1) = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix},\ p(1) = 1 \right\},\quad \left\{ \mathbf{p}^{0}(2) = \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix},\ p(2) = 1 \right\},\ \ldots$

$\mathbf{a}(1) = \mathrm{satlins}\left( \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} 1 \right) = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \quad \text{(no response)}$

$\mathbf{w}_{1}(1) = \mathbf{w}_{1}(0) + \bigl( \mathbf{a}(1) - \mathbf{w}_{1}(0) \bigr) p(1) = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} + \left( \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} - \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \right) 1 = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$
Convergence

$\mathbf{a}(2) = \mathrm{satlins}\left( \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} + \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} 1 \right) = \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} \quad \text{(measurements given)}$

$\mathbf{w}_{1}(2) = \mathbf{w}_{1}(1) + \bigl( \mathbf{a}(2) - \mathbf{w}_{1}(1) \bigr) p(2) = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} + \left( \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} - \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \right) 1 = \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix}$

$\mathbf{a}(3) = \mathrm{satlins}\left( \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} + \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} 1 \right) = \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} \quad \text{(measurements recalled)}$

$\mathbf{w}_{1}(3) = \mathbf{w}_{1}(2) + \bigl( \mathbf{a}(3) - \mathbf{w}_{1}(2) \bigr) p(3) = \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} + \left( \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} - \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} \right) 1 = \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix}$

The pineapple measurements are now recalled whenever the pineapple is seen, even if the measurement sensors give no reading.
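The pineapple example can likewise be replayed in a few lines; this sketch follows the slide's hand computation with α = 1 (variable names are mine):

```python
import numpy as np

def satlins(n):
    return np.clip(n, -1.0, 1.0)

W0 = np.eye(3)                           # fixed identity weights for the measurements
w = np.zeros(3)                          # adaptive column driven by the "sight" input
pineapple = np.array([-1.0, -1.0, 1.0])
alpha = 1.0

# Training sequence: (measurement vector p0, sight p)
sequence = [(np.zeros(3), 1.0), (pineapple, 1.0), (np.zeros(3), 1.0)]

for q, (p0, p) in enumerate(sequence, start=1):
    a = satlins(W0 @ p0 + w * p)         # outstar output
    w = w + alpha * (a - w) * p          # outstar rule
    print(f"q={q}: a={a}, w={w}")
# By q=3 the sight of the pineapple alone recalls the stored measurements.
```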