
Nanjing University of Science & Technology
Pattern Recognition:
Statistical and Neural
Lonnie C. Ludeman
Lecture 18
Oct 21, 2005
Lecture 18 Topics
1. Example – Generalized Linear Discriminant Function
2. Weight Space
3. Potential Function Approach – 2-Class Case
4. Potential Function Example – 2-Class Case
5. Potential Function Algorithm – M-Class Case
Classes Not Linearly Separable
[Scatter plot in the (x1, x2) plane: samples from C1 and samples from C2 that cannot be separated by a single straight line.]
Q. How can we find decision boundaries?
Answers:
(1) Use Generalized Linear Discriminant functions
(2) Use Nonlinear Discriminant Functions
Example: Generalized Linear Discriminant Functions
[Scatter plot in the (x1, x2) plane of the given samples from C1 and C2.]
Given Samples from 2 Classes
Find a generalized linear discriminant
function that separates the classes
Solution:
d(x) = w1 f1(x) + w2 f2(x) + w3 f3(x) + w4 f4(x) + w5 f5(x) + w6 f6(x)
     = wᵀ f(x)    (linear in the f space)

where the fi(x) are nonlinear functions of x, so d(x) is nonlinear in the original pattern space.
Use the Perceptron Algorithm in the f space
(the iterations follow)
[Table tracing the perceptron iterations: iteration number, sample presented, current weights, and action taken. Iterations continue until a full pass through the samples requires no correction, then stop, yielding d(x).]
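The f-space perceptron iterations above can be sketched in code. This is a minimal illustration, assuming a quadratic feature map and toy sample values; neither is the lecture's actual data:

```python
import numpy as np

def f(x):
    # Assumed quadratic feature map: f(x) = [x1^2, x2^2, x1*x2, x1, x2, 1]
    x1, x2 = x
    return np.array([x1 * x1, x2 * x2, x1 * x2, x1, x2, 1.0])

def perceptron_in_f_space(samples, labels, rho=1.0, max_passes=100):
    """Fixed-increment perceptron run on the mapped patterns f(x).
    labels are +1 for C1 and -1 for C2."""
    w = np.zeros(6)
    for _ in range(max_passes):
        errors = 0
        for x, y in zip(samples, labels):
            if y * (w @ f(x)) <= 0:       # misclassified (or on the boundary)
                w = w + rho * y * f(x)    # move w toward the correct side
                errors += 1
        if errors == 0:
            break                         # a full error-free pass: converged
    return w

# Assumed toy data: C1 samples near the origin, C2 samples farther out
X = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (2.0, 0.0), (0.0, 2.0), (-2.0, 0.0)]
y = [1, 1, 1, -1, -1, -1]
w = perceptron_in_f_space(X, y)
d = lambda x: w @ f(x)   # discriminant in the original pattern space
```

Because these samples are linearly separable in the f space, the fixed-increment perceptron is guaranteed to terminate with every sample on the correct side of wᵀf(x) = 0.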
The iterations yield the discriminant function d(x). Setting d(x) = 0 gives the decision boundary; putting d(x) = 0 in standard form, the decision boundary is an ellipse.
Decision Boundary in Original Pattern Space
[Scatter plot of the C1 and C2 samples in the (x1, x2) plane with the elliptical boundary d(x) = 0.]
Weight Space
To separate two pattern classes C1 and C2 by a hyperplane we must satisfy the following conditions:

wᵀx > 0 for all x from C1
wᵀx < 0 for all x from C2

where wᵀx = 0 specifies the boundary between the classes. But we know that wᵀx = xᵀw.
Thus we can rewrite the inequalities in the weight space, with the samples serving as the coefficients. Each inequality gives a hyperplane boundary in the weight space, and any weight vector on the positive side of that hyperplane satisfies the inequality.
In the Weight Space
View of the Perceptron algorithm in the weight space
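To make the weight-space picture concrete, here is a small numeric sketch (the sample values are assumptions): each sample becomes one row of an inequality system, and a weight vector is a solution only if it lies on the positive side of every sample's hyperplane.

```python
import numpy as np

# Assumed augmented samples [x1, x2, 1]; rows from C2 are multiplied by -1
# so that every condition reads  x^T w > 0  in the weight space.
A = np.array([
    [ 1.0,  1.0,  1.0],   # from C1
    [ 2.0,  0.5,  1.0],   # from C1
    [-3.0, -3.0, -1.0],   # from C2, negated
    [-0.5, -4.0, -1.0],   # from C2, negated
])

def in_solution_region(w):
    # w is acceptable iff it lies on the positive side of every
    # sample hyperplane x^T w = 0 in the weight space
    return bool(np.all(A @ w > 0))

ok  = in_solution_region(np.array([-1.0, -1.0, 4.0]))  # satisfies all rows
bad = in_solution_region(np.array([ 1.0,  1.0, 0.0]))  # violates a C2 row
```

The perceptron algorithm can be viewed as walking through this weight space: each correction moves w across one violated hyperplane toward the common solution region.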
Potential Function Approach – Motivated by electromagnetic theory
Given samples x from two classes C1 and C2, let S1 be the set of samples from C1 (marked +) and S2 the set of samples from C2 (marked −).
[Sketch of the sample space showing the + samples of C1 and the − samples of C2.]
Define the Total Potential Function

K(x) = Σ_{xk ∈ S1} K(x, xk) − Σ_{xk ∈ S2} K(x, xk)

where K(x, xk) is the potential function. The decision boundary is K(x) = 0.
Choices for Potential functions K(x, xk)
Graphs of Potential functions
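As a sketch of the two-class decision rule, assuming one common potential-function choice, an exponential K(x, xk) = exp(−α‖x − xk‖²) (the slides list several alternatives), and assumed toy samples:

```python
import numpy as np

def K(x, xk, alpha=1.0):
    # Assumed potential-function choice: an exponential that
    # decays with squared distance from the sample xk
    return np.exp(-alpha * np.sum((np.asarray(x) - np.asarray(xk)) ** 2))

def total_potential(x, S1, S2):
    # K(x) = sum over C1 samples minus sum over C2 samples
    return sum(K(x, xk) for xk in S1) - sum(K(x, xk) for xk in S2)

def classify(x, S1, S2):
    # Decision boundary is K(x) = 0: positive side -> C1, negative -> C2
    return 'C1' if total_potential(x, S1, S2) > 0 else 'C2'

S1 = [(0.0, 0.0), (1.0, 0.0)]   # assumed samples from C1
S2 = [(3.0, 3.0), (4.0, 3.0)]   # assumed samples from C2
```

Each C1 sample contributes a positive "charge" and each C2 sample a negative one, so the sign of the total potential at x determines the class.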
Example – Using Potential functions
Given the following Patterns from two classes
Find a nonlinear discriminant function, using potential functions, that separates the classes
Plot of Samples from the two classes
Trace of Iterations
[Table tracing the potential-function updates at each iteration.] The algorithm converged in 1.75 passes through the data, giving the final discriminant function K_FINAL(x).
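The iteration trace above comes from a cumulative correction procedure: the running potential K(x) starts at zero and a sample's potential term is added (C1) or subtracted (C2) only when that sample is misclassified. A sketch of this, with an assumed exponential potential and assumed toy samples (not the lecture's data):

```python
import numpy as np

def potential(x, xk, alpha=1.0):
    # Assumed exponential potential-function choice
    return np.exp(-alpha * np.sum((np.asarray(x) - np.asarray(xk)) ** 2))

def train_potential_function(samples, labels, max_passes=20):
    """Cumulative potential-function training, 2-class case.
    labels: +1 for C1, -1 for C2.  Returns a list of (sign, sample)
    terms; K(x) is the signed sum of potentials over these terms."""
    terms = []
    K = lambda x: sum(s * potential(x, xk) for s, xk in terms)
    for _ in range(max_passes):
        corrections = 0
        for x, y in zip(samples, labels):
            # Correct K only when x is misclassified (or K(x) = 0)
            if y * K(x) <= 0:
                terms.append((y, x))
                corrections += 1
        if corrections == 0:
            break        # a full error-free pass: algorithm converged
    return terms

samples = [(0.0, 0.0), (1.0, 0.0), (3.0, 3.0), (4.0, 3.0)]
labels  = [1, 1, -1, -1]
terms = train_potential_function(samples, labels)
K_final = lambda x: sum(s * potential(x, xk) for s, xk in terms)
```

Convergence partway through a pass is why a trace can report a fractional number of passes, such as the 1.75 passes quoted above.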
Potential Function Algorithm for K Classes
Reference (3): Tou and Gonzalez
Flow Chart for Potential Function Method: M-Class
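For the M-class case, one cumulative potential function can be maintained per class and a pattern assigned to the class whose total potential at x is largest. A simplified sketch of that decision rule (argmax only, omitting the iterative correction loop of the flow chart; the potential choice and data are assumptions):

```python
import numpy as np

def potential(x, xk, alpha=1.0):
    # Assumed exponential potential (same form as the 2-class case)
    return np.exp(-alpha * np.sum((np.asarray(x) - np.asarray(xk)) ** 2))

def classify_m_class(x, class_samples):
    """class_samples: dict mapping class label -> list of training samples.
    Assign x to the class with the largest total potential at x."""
    scores = {c: sum(potential(x, xk) for xk in xs)
              for c, xs in class_samples.items()}
    return max(scores, key=scores.get)

data = {                       # assumed 3-class toy data
    'C1': [(0.0, 0.0), (0.5, 0.0)],
    'C2': [(3.0, 0.0), (3.5, 0.0)],
    'C3': [(0.0, 3.0), (0.0, 3.5)],
}
```

In the full algorithm of the flow chart, the per-class potentials are built up sample by sample, with corrections applied whenever a training pattern's own class does not achieve the largest potential.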
Summary
1. Example – Generalized Linear Discriminant Function
2. Weight Space
3. Potential Function Approach – 2-Class Case
4. Potential Function Example – 2-Class Case
5. Potential Function Algorithm – M-Class Case
End of Lecture 18