CS 8520: Artificial Intelligence
Resolution and Bayes’ Examples
Paula Matuszek
Spring, 2010
CSC 8520 Spring 2010. Paula Matuszek
To resolve a pair of clauses that are entirely disjunctions
(ORs) of terms, find a term that appears as a positive
term in one clause and as a negative term in the other, and produce a
new clause containing all of the terms of both except that
pair, which cancel each other. That new clause is the resolvent,
and showing all possible resolutions means finding every
such resolvent.
The idea is that if we assume that both of the following
statements are true:
dog ∨ cat
¬ dog ∨ horse
then
(dog ∨ cat) ∧ (¬ dog ∨ horse) ⇒ (cat ∨ horse).
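The binary resolution rule above can be sketched in a few lines of Python. The encoding here is my own, not from the slides: a clause is a frozenset of literal strings, with "~" marking negation.

```python
# A clause is a frozenset of literals; a literal is a plain string,
# prefixed with "~" for negation (e.g. "dog", "~dog").
def negate(lit):
    """Return the complementary literal."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Yield every resolvent of two clauses: for each literal of c1
    whose complement appears in c2, union the remaining literals."""
    for lit in c1:
        if negate(lit) in c2:
            yield frozenset((c1 - {lit}) | (c2 - {negate(lit)}))

# (dog ∨ cat) and (¬dog ∨ horse) resolve on dog,
# producing the single resolvent {cat, horse}.
print(list(resolve(frozenset({"dog", "cat"}),
                   frozenset({"~dog", "horse"}))))
```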
Truth table for the resolution example (the final column is the
implication ((dog ∨ cat) ∧ (¬dog ∨ horse)) ⇒ (cat ∨ horse)):

dog | ¬dog | cat | horse | dog∨cat | ¬dog∨horse | (dog∨cat)∧(¬dog∨horse) | cat∨horse | implication
 T  |  F   |  T  |   T   |    T    |     T      |           T            |     T     |      T
 T  |  F   |  T  |   F   |    T    |     F      |           F            |     T     |      T
 T  |  F   |  F  |   T   |    T    |     T      |           T            |     T     |      T
 T  |  F   |  F  |   F   |    T    |     F      |           F            |     F     |      T
 F  |  T   |  T  |   T   |    T    |     T      |           T            |     T     |      T
 F  |  T   |  T  |   F   |    T    |     T      |           T            |     T     |      T
 F  |  T   |  F  |   T   |    F    |     T      |           F            |     T     |      T
 F  |  T   |  F  |   F   |    F    |     T      |           F            |     F     |      T

The implication is true in every row, so the resolution step is valid.
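The truth table can also be checked mechanically. A small sketch that enumerates all eight assignments of dog, cat, and horse:

```python
from itertools import product

def implies(p, q):
    """Material implication: p ⇒ q."""
    return (not p) or q

# Check the resolution step under every assignment of dog, cat, horse.
ok = all(
    implies((dog or cat) and ((not dog) or horse), cat or horse)
    for dog, cat, horse in product([True, False], repeat=3)
)
print(ok)  # True: the implication holds in every row
```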
Midterm question: Show all the possible resolutions for the following pairs of
clauses:
delicious
¬ delicious ∨ anchovies
The only term we can resolve on is delicious, so the resulting clause is
anchovies
delicious ∨ anchovies
¬ delicious ∨ ¬ anchovies
We can resolve on either delicious or anchovies (but not both at once) so we can get
either
delicious ∨ ¬ delicious (which is trivially true)
anchovies ∨ ¬ anchovies (ditto)
¬X∨Y
X∨¬Y∨Z
We can resolve on X or on Y, giving us either
Y ∨ ¬Y ∨ Z or
X ∨ ¬X ∨ Z (also both trivially true)
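All three pairs above can be checked with a self-contained sketch. Again the "~" negation encoding is my own choosing, not the slides’ notation:

```python
def negate(lit):
    """Return the complementary literal ("X" <-> "~X")."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolvents(c1, c2):
    """All resolvents of two clauses given as sets of literal strings."""
    return [(c1 - {l}) | (c2 - {negate(l)}) for l in c1 if negate(l) in c2]

print(resolvents({"delicious"}, {"~delicious", "anchovies"}))
# one resolvent: {anchovies}
print(resolvents({"delicious", "anchovies"}, {"~delicious", "~anchovies"}))
# two resolvents, each containing a complementary pair (trivially true)
print(resolvents({"~X", "Y"}, {"X", "~Y", "Z"}))
# two resolvents: {Y, ~Y, Z} and {X, ~X, Z}, both trivially true
```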
Bayes’ Theorem
• Bayes’ Theorem, as Eric discussed, is a way of
determining the conditional probability of A given
B (written A|B). In other words, if we know B
has happened, how likely is A?
• In order to compute the conditional probability of
A|B we need to know three other probabilities,
called the priors:
– The probability of A (without any other information)
– The probability of B (without any other information)
– The conditional probability of B given A
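These three priors plug straight into Bayes’ theorem, p(A|B) = p(B|A) · p(A) / p(B). As a one-line sketch (the function name and parameter names are my own):

```python
def bayes(p_a, p_b, p_b_given_a):
    """Bayes' theorem: p(A|B) = p(B|A) * p(A) / p(B)."""
    return p_b_given_a * p_a / p_b
```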
Bayes’ Theorem
Figure from http://phaedrusdeinus.org/your-own-bayes/slides/bayes.png
Bayes Example
• Your child has a rash. How likely is it that he has
chicken pox? In other words, what is
p(chicken pox | rash)?
– Chicken pox is going around your child’s class:
p(chicken pox) = .3.
– Overall in your child’s class about half of the children
have a rash from something: p(rash) = .5
– Almost all kids with chicken pox get a rash:
– p(rash | chicken pox) = .99.
• So p(chicken pox | rash) = (.99 x .3)/.5 = .594
• In other words, there’s about a 60% chance that if
your child has a rash, it’s chicken pox.
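Plugging the slide’s numbers into the formula, as a quick arithmetic check:

```python
# Numbers from the slide: p(pox) = .3, p(rash) = .5, p(rash|pox) = .99.
p_pox, p_rash, p_rash_given_pox = 0.3, 0.5, 0.99
p_pox_given_rash = p_rash_given_pox * p_pox / p_rash
print(round(p_pox_given_rash, 3))  # 0.594
```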
Thinking about Bayes’
• What this is basically saying is:
– Two things make the probability of A|B higher:
• A high overall (or prior) probability of A (there is a lot of chicken pox around)
• A high prior probability of B|A (usually children with chicken pox get a rash)
– One thing makes the probability of A|B lower:
• A high prior probability of B (there are a lot of rashes around)
• In other words, p(B) is giving us a way to normalize the other
probabilities, so that we don’t decide that chicken pox is likely
based on a rash if everyone has a rash.
• Consider what the values would be if instead of “rash” we
looked at “Chicken pox | drinks milk”. Such a high proportion
of children drink milk that drinking milk really has no predictive
value for determining whether a child has chicken pox.
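With made-up numbers for the milk case (not from the slides): when nearly every child drinks milk whether or not they have chicken pox, p(milk|pox) ≈ p(milk), the evidence cancels, and the posterior collapses back to the prior:

```python
# Hypothetical numbers: p(milk|pox) and p(milk) are both high and equal,
# so drinking milk carries no information about chicken pox.
p_pox, p_milk, p_milk_given_pox = 0.3, 0.95, 0.95
p_pox_given_milk = p_milk_given_pox * p_pox / p_milk
print(round(p_pox_given_milk, 3))  # 0.3, the same as the prior p(pox)
```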