Classification as Non-Symbolic
Learning
CPSC 386 Artificial Intelligence
Ellen Walker
Hiram College
Symbolic vs. Non-Symbolic Learning
• If you “open the system up” after it has learned, can the knowledge be easily expressed?
• Symbolic learning uses accessible internal representations
• Non-symbolic learning uses inaccessible internal representations
Classification
• Given a set of examples (x, y), where x is the input (a vector of feature values) and y is the classification
• Learn a function y = f(x) that
– Returns correct results for all (x, y) pairs in the training set of examples
– Generalizes well -- returns correct results for x values not in the training set
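As a minimal sketch of these two requirements, consider a 1-nearest-neighbor classifier: it is trivially correct on its training set, and we hope it generalizes to nearby unseen inputs. The training pairs below are hypothetical, chosen only to illustrate (x, y) data.

```python
def nearest_neighbor(train, x):
    """Return the label y of the training example whose input is closest to x."""
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(train, key=lambda pair: sq_dist(pair[0], x))[1]

# Hypothetical (x, y) pairs: x is a feature vector, y a class label.
train = [((1.0, 2.0), "A"), ((5.0, 6.0), "B"), ((1.5, 1.8), "A")]

# Returns correct results for every (x, y) pair in the training set...
assert all(nearest_neighbor(train, x) == y for x, y in train)

# ...and we hope it generalizes to x values not in the training set.
print(nearest_neighbor(train, (1.2, 2.1)))  # → A
```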
Discrete vs. Continuous Features
• Color
– A set of named values?
– Numeric Red, Green, Blue codes?
• Size
– Large vs. small?
– Numeric volume?
Graphing Two Features
[Scatter plot of examples, with Weight on one axis and Height on the other]
Linear Separation
• The equation of a line is a ‘rule’ that separates classes.
• If Ax + By > C, the point is “above the line”; if Ax + By = C, it is “on the line”; if Ax + By < C, it is “below the line”
• We can implement these “lines” as rules to create a decision tree as before
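The line-as-rule test can be sketched directly; the coefficients below are arbitrary example values, not taken from the slides.

```python
def side_of_line(A, B, C, x, y):
    """Classify point (x, y) by which side of the line Ax + By = C it is on."""
    value = A * x + B * y
    if value > C:
        return "above"
    elif value < C:
        return "below"
    return "on"

# Example line: x + y = 10
assert side_of_line(1, 1, 10, 8, 8) == "above"   # 16 > 10
assert side_of_line(1, 1, 10, 2, 3) == "below"   #  5 < 10
assert side_of_line(1, 1, 10, 4, 6) == "on"      # 10 == 10
```

A decision-tree node can use exactly this three-way test as its branching rule.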
What Remains?
• Where do I draw the lines?
– We’ll see one way when we look at perceptrons
• Can I always find appropriate lines?
– Yes, and no – again we will see
• What if there are more variables?
– Draw planes in 3D, hyperplanes in 4D, etc.
• Where do the features come from?
– A very good question – we’ll look at one way to get them automatically, soon
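The side-of-line test from the previous slide extends unchanged to any number of variables: a hyperplane w·x = c splits n-dimensional feature space into two half-spaces. The weights and threshold below are arbitrary illustration values.

```python
def side_of_hyperplane(w, c, x):
    """Classify point x by which side of the hyperplane w·x = c it is on."""
    value = sum(wi * xi for wi, xi in zip(w, x))
    if value > c:
        return "above"
    elif value < c:
        return "below"
    return "on"

# 3D example: the plane x + 2y + 3z = 6
assert side_of_hyperplane((1, 2, 3), 6, (1, 1, 1)) == "on"     # 1 + 2 + 3 = 6
assert side_of_hyperplane((1, 2, 3), 6, (2, 2, 2)) == "above"  # 12 > 6
assert side_of_hyperplane((1, 2, 3), 6, (0, 0, 0)) == "below"  # 0 < 6
```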