Logical Agents
CSC 411: AI
Fall 2013
NC State University
Outline
Knowledge-based agents
Wumpus world
Logic in general – models and entailment
Propositional (Boolean) logic
Equivalence, validity, satisfiability
Inference rules and theorem proving
• Forward chaining
• Backward chaining
• Resolution
Knowledge bases
Inference engine ← Domain-independent algorithms
Knowledge base ← Domain-specific content
A knowledge base is a set of sentences in a formal
language.
The declarative approach to building an agent (or other
system):
• TELL it what it needs to know.
• Then it can ASK itself what to do.
Knowledge level versus implementation level.
A simple knowledge-based agent
KB-AGENT(percept)
    static: t = 0
    TELL(KB, MAKE-PERCEPT-SENTENCE(percept, t))
    action = ASK(KB, MAKE-ACTION-QUERY(t))
    TELL(KB, MAKE-ACTION-SENTENCE(action, t))
    t = t + 1
    return action
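The skeleton above can be made concrete. Below is a minimal Python sketch (not from the slides) of the same loop; the KnowledgeBase class, the sentence constructors, and the stubbed ask method are illustrative placeholders for whatever inference machinery the agent actually uses.

class KnowledgeBase:
    def __init__(self):
        self.sentences = []

    def tell(self, sentence):
        self.sentences.append(sentence)

    def ask(self, query):
        # Stub: a real KB would run an inference procedure over its sentences.
        return "Forward"

def make_percept_sentence(percept, t):
    return ("Percept", percept, t)

def make_action_query(t):
    return ("ActionQuery", t)

def make_action_sentence(action, t):
    return ("Action", action, t)

def make_kb_agent(kb):
    t = 0
    def agent(percept):
        nonlocal t
        kb.tell(make_percept_sentence(percept, t))   # TELL the percept
        action = kb.ask(make_action_query(t))        # ASK what to do
        kb.tell(make_action_sentence(action, t))     # TELL the chosen action
        t += 1
        return action
    return agent

agent = make_kb_agent(KnowledgeBase())
print(agent({"stench": False, "breeze": True, "glitter": False}))   # Forward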
A simple knowledge-based agent
The agent must be able to
• Represent states, actions, etc.
• Incorporate new percepts
• Update internal representations of the world
• Deduce hidden properties of the world
• Deduce appropriate actions
Wumpus world PEAS description
Performance measure:
    Gold: +1000
    Death: −1000
    Taking a step: −1
    Using an arrow: −10
[Figure: the 4×4 Wumpus world grid, with pits, breezes, stenches, the gold, and START at square [1,1].]
Wumpus world PEAS description
Environment:
Smelly when adjacent to the Wumpus.
Breezy when adjacent to a pit.
Glitter iff the gold is in the same square.
Shooting kills the Wumpus if you are facing it.
Shooting uses up the only arrow.
Grabbing picks up the gold if in the same square.
Releasing drops the gold in the same square.
Wumpus world PEAS description
Sensors:
Stench, Breeze, Glitter,
Bump, Scream
Actuators:
Left turn, Right turn, Forward, Grab, Release, Shoot
Wumpus world characterization
Fully Observable? No, only local perception.
Deterministic? Yes, outcomes are exactly specified.
Episodic? No, sequential at the level of actions.
Static? Yes, the Wumpus and pits do not move.
Discrete? Yes.
Single-agent? Yes, the Wumpus is essentially a natural
feature.
Exploring a Wumpus world
[Figures: a step-by-step exploration of the Wumpus world, marking squares as safe or possibly dangerous as percepts arrive.]
Logic in general
Logics are formal languages for representing information
such that conclusions can be drawn.
Syntax defines the sentences in the language; semantics
define the “meaning” (i.e., truth values) of sentences.
The language of arithmetic:
x + 2 ≥ y is a sentence
x2 + y > () is not a sentence
x + 2 ≥ y is true in a world where x = 7, y = 1
x + 2 ≥ y is false in a world where x = 0, y = 6
Entailment
The knowledge base KB entails sentence α if and only if
α is true in all worlds where KB is true: KB |= α.
Said differently, a set of premises KB logically entails a
conclusion α if and only if every interpretation that
satisfies the premises also satisfies the conclusion.
{p} |= (p ∨ q)
{p, q} |= (p ∧ q)
Entailment is a relationship between sentences (i.e.
syntax), based on semantics.
Models
Logicians typically think in terms of models, formally
structured worlds with respect to which truth can be
evaluated.
We can think of a model as a possible world.
In the semantics of arithmetic, x + y = 4 is true in a world
where x is 2 and y is 2, but false in a world where x is 1
and y is 10.
Models
We say m is a model of a sentence α if α is true in m.
M(α) is the set of all models of α.
KB |= α iff M(KB) ⊆ M(α).
{p, q} |= p because the set of worlds in which p is true is a superset of the set of worlds in which both p and q are true.
Entailment in the Wumpus world
You detect nothing in [1,1].
You move right.
You detect a breeze in [2,1].
Consider the possible models for KB, assuming only pits matter: 3 Boolean choices (pits in [1,2], [2,2], [3,1]) → 2^3 = 8 possible models.
Wumpus models
[Figures: the eight possible models for pits in [1,2], [2,2], and [3,1], with the models of KB outlined.]
KB = Wumpus-world rules + observations.
α1 = “[1,2] is safe.” Does KB |= α1 ? Yes: α1 is true in every model of KB.
α2 = “[2,2] is safe.” Does KB |= α2 ? No: in some models of KB there is a pit in [2,2].
Inference
KB ⊢i α means that sentence α can be derived from KB by procedure i.
Soundness: i is sound if whenever KB ⊢i α, it is also true that KB |= α.
Completeness: i is complete if whenever KB |= α, it is also true that KB ⊢i α.
Propositional logic: Syntax
The proposition symbols P1, P2, etc. are sentences.
If S is a sentence, ¬S is a sentence (negation).
If S1 and S2 are sentences:
    S1 ∧ S2 is a sentence (conjunction).
    S1 ∨ S2 is a sentence (disjunction).
    S1 ⇒ S2 is a sentence (implication).
    S1 ⇔ S2 is a sentence (biconditional).
Propositional logic: Semantics
Each model specifies true or false for each proposition
symbol.
P1,2 P2,2 P3,1
False True False
With these symbols, 8 possible models can be enumerated
automatically.
Propositional logic: Semantics
Rules for evaluating truth with respect to a model m:
¬S is true iff S is false.
S1 ∧ S2 is true iff S1 is true and S2 is true.
S1 ∨ S2 is true iff S1 is true or S2 is true.
S1 ⇒ S2 is true iff S1 is false or S2 is true.
S1 ⇔ S2 is true iff S1 ⇒ S2 is true and S2 ⇒ S1 is true.
Propositional logic: Semantics
A simple recursive process can evaluate an arbitrary
sentence:
¬P1,2 ∧ (P2,2 ∨ P3,1 ) = not false ∧ ( true ∨ false )
= true ∧ ( true ∨ false )
= true ∧ true
= true
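This recursive evaluation is easy to implement directly. The following is a minimal Python sketch (not part of the slides), assuming sentences are nested tuples such as ("and", ("not", "P12"), ("or", "P22", "P31")) and a model is a dict mapping symbol names to booleans; the operator names are an arbitrary choice for this illustration.

def pl_true(sentence, model):
    if isinstance(sentence, str):               # proposition symbol
        return model[sentence]
    op, *args = sentence
    if op == "not":
        return not pl_true(args[0], model)
    if op == "and":
        return pl_true(args[0], model) and pl_true(args[1], model)
    if op == "or":
        return pl_true(args[0], model) or pl_true(args[1], model)
    if op == "implies":
        return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == "iff":
        return pl_true(args[0], model) == pl_true(args[1], model)
    raise ValueError(f"unknown operator {op!r}")

# The worked example above: ¬P1,2 ∧ (P2,2 ∨ P3,1) in the model shown earlier.
model = {"P12": False, "P22": True, "P31": False}
print(pl_true(("and", ("not", "P12"), ("or", "P22", "P31")), model))  # True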
Truth tables for connectives
P      Q      ¬P     P ∧ Q   P ∨ Q   P ⇒ Q   P ⇔ Q
False  False  True   False   False   True    True
False  True   True   False   True    True    False
True   False  False  False   True    False   False
True   True   False  True    True    True    True
Wumpus world sentences
Let Pi,j be true if there is a pit in [i, j].
Let Bi,j be true if there is a breeze in [i, j].
Suppose we know ¬P1,1 (there is no pit in the start square), B1,2, and B2,1.
We also know that pits cause breezes in adjacent squares.
Then
B1,2 ⇔ (P1,1 ∨ P2,2 ∨ P1,3 ).
B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1 ).
Truth tables for inference
[Table: enumeration of the models over B1,1, B2,1, P1,1, P1,2, P2,1, P2,2, P3,1. In every row where KB is true, α1 is also true, so KB |= α1.]
Inference by enumeration
TT-ENTAILS?(KB, α)
    symbols = a list of the proposition symbols in KB and α
    return TT-CHECK-ALL(KB, α, symbols, {})

TT-CHECK-ALL(KB, α, symbols, model)
    if EMPTY?(symbols)
        if PL-TRUE?(KB, model)
            return PL-TRUE?(α, model)
        else return True
    else
        p = FIRST(symbols)
        rest = REST(symbols)
        return TT-CHECK-ALL(KB, α, rest, EXTEND(p, True, model))
           and TT-CHECK-ALL(KB, α, rest, EXTEND(p, False, model))
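A runnable version of this enumeration, as a minimal Python sketch: it uses the tuple-based sentence representation and the recursive evaluator from the earlier sketch (repeated here so the block is self-contained); the helper names are illustrative, not a fixed API.

def pl_true(sentence, model):
    if isinstance(sentence, str):
        return model[sentence]
    op, *args = sentence
    if op == "not":     return not pl_true(args[0], model)
    if op == "and":     return all(pl_true(a, model) for a in args)
    if op == "or":      return any(pl_true(a, model) for a in args)
    if op == "implies": return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == "iff":     return pl_true(args[0], model) == pl_true(args[1], model)
    raise ValueError(op)

def symbols_in(sentence):
    if isinstance(sentence, str):
        return {sentence}
    return set().union(*(symbols_in(a) for a in sentence[1:]))

def tt_entails(kb, alpha):
    """Return True iff kb |= alpha, by enumerating all 2^n models."""
    syms = sorted(symbols_in(kb) | symbols_in(alpha))
    def check_all(syms, model):
        if not syms:
            # Only models of the KB matter; in those, alpha must hold.
            return pl_true(alpha, model) if pl_true(kb, model) else True
        p, rest = syms[0], syms[1:]
        return (check_all(rest, {**model, p: True}) and
                check_all(rest, {**model, p: False}))
    return check_all(syms, {})

# {p} |= (p ∨ q), but {p} does not entail (p ∧ q).
print(tt_entails("p", ("or", "p", "q")))    # True
print(tt_entails("p", ("and", "p", "q")))   # False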
Inference by enumeration
Depth-first enumeration of all models is sound and
complete.
For n symbols, the time complexity is O(2^n); the space complexity is O(n).
Logical equivalence
Two sentences are logically equivalent iff they are true in the same models:
α ≡ β iff α |= β and β |= α.
Logical equivalence
(α ∧ β) ≡ (β ∧ α)                             Commutativity of ∧
(α ∨ β) ≡ (β ∨ α)                             Commutativity of ∨
((α ∧ β) ∧ γ) ≡ (α ∧ (β ∧ γ))                 Associativity of ∧
((α ∨ β) ∨ γ) ≡ (α ∨ (β ∨ γ))                 Associativity of ∨
¬(¬α) ≡ α                                     Double negation
(α ⇒ β) ≡ (¬β ⇒ ¬α)                           Contraposition
¬(α ∧ β) ≡ (¬α ∨ ¬β)                          de Morgan
¬(α ∨ β) ≡ (¬α ∧ ¬β)                          de Morgan
(α ∧ (β ∨ γ)) ≡ ((α ∧ β) ∨ (α ∧ γ))           Distributivity of ∧ over ∨
(α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ))           Distributivity of ∨ over ∧
Validity
A sentence is valid if it is true in all models, e.g.,
True
A ∨ ¬A
A⇒A
(A ∧ (A ⇒ B)) ⇒ B
Validity is connected to inference via the Deduction
Theorem:
KB |= α iff (KB ⇒ α) is valid.
Satisfiability
A sentence is satisfiable if it is true in some model.
A sentence is unsatisfiable if it is true in no model.
Satisfiability is also connected to inference:
KB |= α iff (KB ∧ ¬α) is unsatisfiable.
Proof methods: Inference rules
One kind of proof method involves the application of
inference rules.
• Rules give sound generation of new sentences from
old.
• A proof is a sequence of inference rule applications.
(We can use rules as operators in search.)
• This typically requires transformation of sentences
into some normal form.
Proof methods: Model checking
Another approach is model checking, e.g.,
• Truth table enumeration (as we’ve seen).
• Smarter backtracking (potentially more efficient).
• Heuristic search or optimization in model space
(sound but incomplete).
Exercise: Bike riding
“You often ride your bike, but only on days when the
weather is nice. Whenever you ride, you wear a helmet.
Wearing your helmet always messes up your hair beyond
easy repair, but you like to be presentable: you never
arrive at work with messy hair, and you never go out on
dinner dates with messy hair.”
What propositions should we define for this story?
Exercise: Bike riding
“You often ride your bike, but only on days when the
weather is nice. Whenever you ride, you wear a helmet.
Wearing your helmet always messes up your hair beyond
easy repair, but you like to be presentable: you never
arrive at work with messy hair, and you never go out on
dinner dates with messy hair.”
One possible set: NiceOut, BikeRide, Helmet, MessyHair,
Work, DinnerDate.
Develop sentences to represent this story.
Exercise: Bike riding
“You call me on your cell phone and tell me you’re out on
a dinner date. If I wonder whether you rode your bike,
should I ask you or figure it out on my own?”
How can I express this as a sentence in logic?
Resolution
We start with Conjunctive Normal Form (CNF).
Call a logical variable or its complement a literal.
A, B, ¬A, ¬B
Call a disjunction of literals (including a literal standing
alone) a clause.
A, (B ∨ ¬A)
A sentence consisting of a conjunction of clauses is in
CNF.
Resolution
The resolution rule for CNF:
    a1 ∨ ... ∨ ak        b1 ∨ ... ∨ bn
    --------------------------------------------------------------------
    a1 ∨ ... ∨ aj−1 ∨ aj+1 ∨ ... ∨ ak ∨ b1 ∨ ... ∨ bm−1 ∨ bm+1 ∨ ... ∨ bn
where aj and bm are complementary literals.
Resolution is sound and complete for propositional logic.
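As a small illustration (not from the slides), the rule can be coded directly over clauses represented as frozensets of literal strings, where "~" marks negation; this representation is an assumption made for the sketch.

def complement(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(ci, cj):
    """Return the set of all clauses obtainable by resolving ci with cj."""
    resolvents = set()
    for lit in ci:
        if complement(lit) in cj:
            resolvents.add(frozenset((ci - {lit}) | (cj - {complement(lit)})))
    return resolvents

# P1,3 ∨ P2,2 resolved with ¬P2,2 yields the unit clause P1,3.
print(resolve(frozenset({"P13", "P22"}), frozenset({"~P22"})))  # {frozenset({'P13'})}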
Resolution in Wumpus world
    P1,3 ∨ P2,2        ¬P2,2
    -------------------------
    P1,3
Conversion to CNF
B1,1 ⇔ (P1,2 ∨ P2,1 )
Eliminate ⇔, replacing α ⇔ β with (α ⇒ β) ∧ (β ⇒ α).
(B1,1 ⇒ (P1,2 ∨ P2,1 )) ∧ ((P1,2 ∨ P2,1 ) ⇒ B1,1 )
Eliminate ⇒, replacing α ⇒ β with ¬α ∨ β.
(¬B1,1 ∨ P1,2 ∨ P2,1 ) ∧ (¬(P1,2 ∨ P2,1 ) ∨ B1,1 )
...
Conversion to CNF
...
(¬B1,1 ∨ P1,2 ∨ P2,1 ) ∧ (¬(P1,2 ∨ P2,1 ) ∨ B1,1 )
Move ¬ inwards, using de Morgan’s rule:
(¬B1,1 ∨ P1,2 ∨ P2,1 ) ∧ ((¬P1,2 ∧ ¬P2,1 ) ∨ B1,1 )
Apply the distributivity law (∨ over ∧) and flatten:
(¬B1,1 ∨ P1,2 ∨ P2,1 ) ∧ (¬P1,2 ∨ B1,1 ) ∧ (¬P2,1 ∨ B1,1 )
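These steps can be mechanized. The following Python sketch (an illustration, not a production converter) applies the same transformations to the tuple-based sentence representation used in the earlier sketches: eliminate ⇔ and ⇒, push ¬ inward with de Morgan and double negation, then distribute ∨ over ∧.

def to_cnf(s):
    if isinstance(s, str):
        return s
    op, *args = s
    if op == "iff":
        a, b = args
        return to_cnf(("and", ("implies", a, b), ("implies", b, a)))
    if op == "implies":
        a, b = args
        return to_cnf(("or", ("not", a), b))
    if op == "not":
        a = args[0]
        if isinstance(a, str):
            return s                          # literal: nothing more to do
        inner_op, *inner = a
        if inner_op == "not":                 # double negation
            return to_cnf(inner[0])
        if inner_op == "and":                 # de Morgan
            return to_cnf(("or", ("not", inner[0]), ("not", inner[1])))
        if inner_op == "or":                  # de Morgan
            return to_cnf(("and", ("not", inner[0]), ("not", inner[1])))
        return to_cnf(("not", to_cnf(a)))     # eliminate ⇒ / ⇔ inside first
    if op == "and":
        return ("and", to_cnf(args[0]), to_cnf(args[1]))
    if op == "or":
        a, b = to_cnf(args[0]), to_cnf(args[1])
        if isinstance(a, tuple) and a[0] == "and":   # distribute ∨ over ∧
            return to_cnf(("and", ("or", a[1], b), ("or", a[2], b)))
        if isinstance(b, tuple) and b[0] == "and":
            return to_cnf(("and", ("or", a, b[1]), ("or", a, b[2])))
        return ("or", a, b)
    raise ValueError(op)

# B1,1 ⇔ (P1,2 ∨ P2,1) becomes the three clauses above (as nested and/or tuples).
print(to_cnf(("iff", "B11", ("or", "P12", "P21"))))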
Exercise: Bike riding
In CNF:
    ¬NiceOut ⇒ ¬BikeRide            becomes    NiceOut ∨ ¬BikeRide
    BikeRide ⇒ Helmet               becomes    Helmet ∨ ¬BikeRide
    Helmet ⇒ MessyHair              becomes    ¬Helmet ∨ MessyHair
    ¬(MessyHair ∧ DinnerDate)       becomes    ¬DinnerDate ∨ ¬MessyHair
    ¬(MessyHair ∧ Work)             becomes    ¬Work ∨ ¬MessyHair
Exercise: Bike riding
A truth table for enumeration:

NiceOut   BikeRide   ¬NiceOut ⇒ ¬BikeRide   NiceOut ∨ ¬BikeRide
T         T          T                      T
T         F          T                      T
F         T          F                      F
F         F          T                      T
Resolution algorithm
The strategy is proof by contradiction.
To prove α, we show that there exist no models in which
KB ∧ ¬α is true.
That is, we prove KB ∧ ¬α is unsatisfiable.
Resolution algorithm
PL-RESOLUTION(KB, α)
    clauses = the set of clauses in the CNF representation of KB ∧ ¬α
    new = the empty set
    while True
        for each pair Ci, Cj in clauses
            resolvents = PL-RESOLVE(Ci, Cj)
            if resolvents contains the empty clause
                return True
            new = new ∪ resolvents
        if new ⊆ clauses
            return False
        clauses = clauses ∪ new
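A minimal Python sketch of this refutation loop, assuming the KB and the negated query are already in CNF as frozensets of literal strings (the resolve helper from the earlier sketch is repeated so the block stands alone); the bike-riding clauses reproduce the exercise from these slides.

from itertools import combinations

def complement(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(ci, cj):
    out = set()
    for lit in ci:
        if complement(lit) in cj:
            out.add(frozenset((ci - {lit}) | (cj - {complement(lit)})))
    return out

def pl_resolution(clauses):
    """Return True if the clause set is unsatisfiable (the empty clause is derivable)."""
    clauses = set(clauses)
    new = set()
    while True:
        for ci, cj in combinations(clauses, 2):
            for resolvent in resolve(ci, cj):
                if not resolvent:            # empty clause: contradiction found
                    return True
                new.add(resolvent)
        if new <= clauses:                   # closure reached without the empty clause
            return False
        clauses |= new

# The bike-riding query: the KB plus the negation of ¬BikeRide (i.e. BikeRide).
kb = [frozenset({"NiceOut", "~BikeRide"}),
      frozenset({"Helmet", "~BikeRide"}),
      frozenset({"~Helmet", "MessyHair"}),
      frozenset({"~DinnerDate", "~MessyHair"}),
      frozenset({"~Work", "~MessyHair"}),
      frozenset({"DinnerDate"})]
negated_query = frozenset({"BikeRide"})
print(pl_resolution(kb + [negated_query]))   # True, so KB |= ¬BikeRide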
Exercise: Bike riding
“You call me on your cell phone and tell me you’re out on
a dinner date. . . ”
NiceOut ∨ ¬BikeRide
Helmet ∨ ¬BikeRide
¬Helmet ∨ MessyHair
¬DinnerDate ∨ ¬MessyHair
¬Work ∨ ¬MessyHair
DinnerDate
I’ll guess you didn’t ride your bike (¬BikeRide). Can I
prove it?
Exercise: Bike riding
Recall. . .
    a1 ∨ ... ∨ ak        b1 ∨ ... ∨ bn
    --------------------------------------------------------------------
    a1 ∨ ... ∨ aj−1 ∨ aj+1 ∨ ... ∨ ak ∨ b1 ∨ ... ∨ bm−1 ∨ bm+1 ∨ ... ∨ bn
Prove ¬BikeRide by resolution.
To start: Negate ¬BikeRide. . .
Exercise: Bike riding
Resolving the clauses:
    Helmet ∨ ¬BikeRide,   BikeRide          ⊢   Helmet
    ¬Helmet ∨ MessyHair,   Helmet           ⊢   MessyHair
    ¬DinnerDate ∨ ¬MessyHair,   MessyHair   ⊢   ¬DinnerDate
    ¬DinnerDate,   DinnerDate               ⊢   the empty clause
Deriving the empty clause shows KB ∧ BikeRide is unsatisfiable, so KB |= ¬BikeRide: you did not ride your bike.
Forward and backward chaining
Horn Form: KB is a conjunction of Horn clauses.
A Horn clause is an implication whose premise is a conjunction of positive literals and whose conclusion is a single positive literal (a bare fact also counts):
C, (B ⇒ A), (C ∧ D ⇒ B)
Modus Ponens (for Horn Form) is complete for Horn
KBs.
    α1, ..., αn        α1 ∧ ... ∧ αn ⇒ β
    -----------------------------------
    β
Forward chaining
    P ⇒ Q
    L ∧ M ⇒ P
    B ∧ L ⇒ M
    A ∧ P ⇒ L
    A ∧ B ⇒ L
    A
    B
[Figure: the AND-OR graph for this knowledge base.]
Forward chaining
Fire any rule whose premises are satisfied in the KB.
Add its conclusion to the KB.
Continue until the query is found.
Forward chaining algorithm
PL-FC-ENTAILS?(KB, q)
    count = a table, where count[c] is the number of symbols in clause c’s premise
    inferred = a table, where inferred[s] is initially False
    agenda = a list of the symbols known to be true in KB
    while not EMPTY?(agenda)
        p = POP(agenda)
        if p is q, return True
        if inferred[p] is False
            inferred[p] = True
            for each clause c such that p ∈ c.premise
                count[c] = count[c] − 1
                if count[c] is 0
                    PUSH(c.conclusion, agenda)
    return False
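A minimal Python sketch of the same procedure, assuming Horn clauses are given as (premises, conclusion) pairs and facts as bare symbols; the rules below are the running example traced on the following slides.

def pl_fc_entails(rules, facts, query):
    count = {i: len(prem) for i, (prem, _) in enumerate(rules)}
    inferred = set()
    agenda = list(facts)
    while agenda:
        p = agenda.pop(0)
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (prem, concl) in enumerate(rules):
            if p in prem:
                count[i] -= 1
                if count[i] == 0:            # all premises proved: fire the rule
                    agenda.append(concl)
    return False

rules = [({"P"}, "Q"),
         ({"L", "M"}, "P"),
         ({"B", "L"}, "M"),
         ({"A", "P"}, "L"),
         ({"A", "B"}, "L")]
print(pl_fc_entails(rules, ["A", "B"], "Q"))   # True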
Forward chaining example
Agenda: A, B.
P⇒Q:1
L∧M ⇒P: 2
B∧L⇒M : 2
A∧P⇒L: 2
A∧B⇒L: 2
Pop A.
Forward chaining example
After processing A.
Agenda: B.
P⇒Q:1
L∧M ⇒P: 2
B∧L⇒M : 2
A∧P⇒L: 1
A∧B⇒L: 1
Pop B.
Forward chaining example
After processing B.
P⇒Q:1
L∧M ⇒P: 2
B∧L⇒M : 1
A∧P⇒L: 1
A∧B⇒L: 0
Agenda: L.
Pop L.
Forward chaining example
After processing L.
P⇒Q:1
L∧M ⇒P: 1
B∧L⇒M : 0
A∧P⇒L: 1
A∧B⇒L: 0
Agenda: M.
Pop M.
Forward chaining example
After processing M.
P⇒Q:1
L∧M ⇒P: 0
B∧L⇒M : 0
A∧P⇒L: 1
A∧B⇒L: 0
Agenda: P.
Pop P.
Forward chaining example
After processing P.
P⇒Q:0
L∧M ⇒P: 0
B∧L⇒M : 0
A∧P⇒L: 0
A∧B⇒L: 0
Agenda: Q, L.
Pop Q.
Proof of completeness
FC derives every atomic sentence that is entailed by KB.
FC reaches a fixed point where no new atomic sentences can be derived.
Consider the final state as a model m, assigning True to every inferred symbol and False to the rest.
Every clause a1 ∧ ... ∧ ak ⇒ b in the original KB is true in m: if all of a1, ..., ak are true in m, they were inferred, so b was inferred as well.
Therefore m is a model of KB.
If KB |= q, then q is true in every model of KB, including m, so q was inferred by FC.
Backward chaining
Work backward from query q.
To prove q by BC,
• Check if q is known already, or
• Prove by BC all premises of some rule concluding q.
Avoid loops by checking if a new subgoal is already on
the goal stack.
Avoid repeated work by checking if a new subgoal has
already been proved true or has already failed.
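A minimal Python sketch of this goal-directed procedure, using the same (premises, conclusion) rule representation as the forward-chaining sketch; the goal stack implements the loop check described above. It is an illustration, not the only way to organize BC.

def pl_bc_entails(rules, facts, query, stack=frozenset()):
    if query in facts:
        return True
    if query in stack:                      # already trying to prove this goal
        return False
    for prem, concl in rules:
        if concl == query and all(
                pl_bc_entails(rules, facts, p, stack | {query}) for p in prem):
            return True
    return False

rules = [({"P"}, "Q"),
         ({"L", "M"}, "P"),
         ({"B", "L"}, "M"),
         ({"A", "P"}, "L"),
         ({"A", "B"}, "L")]
print(pl_bc_entails(rules, {"A", "B"}, "Q"))   # True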
Backward chaining example
[Figures: backward chaining on the same AND-OR graph, working back from the query Q to the known facts A and B.]
Forward vs backward chaining
FC is data-driven, automatic processing, e.g., object
recognition, routine decisions.
It may do lots of work that is irrelevant to the goal.
BC is goal-driven, appropriate for problem solving, e.g., Where are my keys? How do I get a CS degree?
The complexity of BC can be much less than linear in the size of the KB.
Efficient propositional inference
Two families of efficient algorithms for propositional
inference are
• Complete backtracking search algorithms, such as
DPLL, and
• Incomplete local search algorithms, such as
WalkSAT.
The DPLL algorithm: Concepts
A clause is true if any literal is true.
A sentence is false if any clause is false.
A unit clause contains only one literal.
A pure symbol always appears in the same positive or
negative form in all clauses.
(A ∨ ¬B), (¬B ∨ ¬C), (C ∨ A)
A and B are pure; C is not.
The DPLL algorithm: Heuristics
Early termination: a clause is true as soon as any one of its literals is true, and the sentence is false as soon as any one of its clauses is false.
Unit clause heuristic: assign the single remaining literal of a unit clause so that it becomes true.
Pure symbol heuristic: assign pure symbols so that their literals become true.
The DPLL algorithm
DPLL-SATISFIABLE?(s)
    clauses = the set of clauses in the CNF representation of s
    symbols = the proposition symbols in s
    return DPLL(clauses, symbols, {})
The DPLL algorithm
DPLL(clauses, symbols, model)
    if ALL-TRUE(clauses, model) return True
    if SOME-FALSE(clauses, model) return False
    p, value = FIND-PURE-SYMBOL(clauses, symbols, model)
    if p found
        return DPLL(clauses, symbols \ p, [p = value | model])
    p, value = FIND-UNIT-CLAUSE(clauses, model)
    if p found
        return DPLL(clauses, symbols \ p, [p = value | model])
    p = FIRST(symbols)
    return DPLL(clauses, symbols \ p, [p = True | model]) or
           DPLL(clauses, symbols \ p, [p = False | model])
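A minimal Python sketch of DPLL over clauses represented as sets of literal strings ("A", "~A"); the pure-symbol and unit-clause heuristics mirror the pseudocode above, but the helper names and representation are assumptions made for this illustration.

def complement(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def lit_value(lit, model):
    sym = lit.lstrip("~")
    if sym not in model:
        return None
    return model[sym] if not lit.startswith("~") else not model[sym]

def dpll(clauses, symbols, model):
    values = [[lit_value(l, model) for l in c] for c in clauses]
    if all(any(v is True for v in vals) for vals in values):
        return True                          # every clause already satisfied
    if any(all(v is False for v in vals) for vals in values):
        return False                         # some clause already falsified
    # Pure symbol heuristic (over clauses not yet satisfied).
    literals = {l for c, vals in zip(clauses, values) if not any(v is True for v in vals)
                for l in c if lit_value(l, model) is None}
    for lit in literals:
        if complement(lit) not in literals:
            sym, val = lit.lstrip("~"), not lit.startswith("~")
            return dpll(clauses, symbols - {sym}, {**model, sym: val})
    # Unit clause heuristic.
    for c, vals in zip(clauses, values):
        unknown = [l for l, v in zip(c, vals) if v is None]
        if len(unknown) == 1 and not any(v is True for v in vals):
            sym, val = unknown[0].lstrip("~"), not unknown[0].startswith("~")
            return dpll(clauses, symbols - {sym}, {**model, sym: val})
    # Otherwise branch on an arbitrary remaining symbol.
    p = next(iter(symbols))
    rest = symbols - {p}
    return (dpll(clauses, rest, {**model, p: True}) or
            dpll(clauses, rest, {**model, p: False}))

def dpll_satisfiable(clauses):
    symbols = {l.lstrip("~") for c in clauses for l in c}
    return dpll([list(c) for c in clauses], symbols, {})

# The pure-symbol example above: (A ∨ ¬B), (¬B ∨ ¬C), (C ∨ A).
clauses = [frozenset({"A", "~B"}), frozenset({"~B", "~C"}), frozenset({"C", "A"})]
print(dpll_satisfiable(clauses))   # True (e.g. A = True, B = False)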
The WalkSAT algorithm
WalkSat is an incomplete local search algorithm.
Its evaluation function is the min-conflict heuristic of
minimizing the number of unsatisfied clauses.
It balances between greediness and randomness.
The WalkSAT algorithm
WALKSAT(clauses, p, max-flips)
    model = a random assignment of True/False to the symbols in clauses
    for j = 1 to max-flips
        if SATISFIES?(model, clauses)
            return model
        clause = CHOOSE-RANDOM-FALSE-CLAUSE(clauses, model)
        if SAMPLE(0, 1) < p
            model = FLIP(RANDOM-ELEMENT(clause), model)
        else
            symbol = the symbol in clause whose flip maximizes the number of satisfied clauses
            model = FLIP(symbol, model)
    return Failure
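A minimal Python sketch of the same loop, using the clause representation from the DPLL sketch; p is the random-walk probability and max_flips bounds the search, and both the helper names and the example clauses are illustrative.

import random

def lit_true(lit, model):
    return model[lit.lstrip("~")] != lit.startswith("~")

def clause_true(clause, model):
    return any(lit_true(l, model) for l in clause)

def walksat(clauses, p=0.5, max_flips=10_000):
    symbols = {l.lstrip("~") for c in clauses for l in c}
    model = {s: random.choice([True, False]) for s in symbols}
    for _ in range(max_flips):
        unsatisfied = [c for c in clauses if not clause_true(c, model)]
        if not unsatisfied:
            return model                          # found a satisfying assignment
        clause = random.choice(unsatisfied)
        if random.random() < p:
            sym = random.choice(list(clause)).lstrip("~")   # random walk step
        else:
            # Greedy step: flip the symbol that maximizes the number of satisfied clauses.
            def satisfied_after_flip(s):
                flipped = {**model, s: not model[s]}
                return sum(clause_true(c, flipped) for c in clauses)
            sym = max((l.lstrip("~") for l in clause), key=satisfied_after_flip)
        model[sym] = not model[sym]
    return None                                   # failure: give up after max_flips

clauses = [frozenset({"A", "~B"}), frozenset({"~B", "~C"}), frozenset({"C", "A"})]
print(walksat(clauses))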
Hard satisfiability problems
Consider random 3-CNF sentences.
(¬D ∨ ¬B ∨ C) ∧ (B ∨ ¬A ∨ ¬C) ∧ (¬C ∨ ¬B ∨ E) ∧ (E ∨ ¬D ∨ B) ∧ (B ∨ E ∨ ¬C)
Let
m = number of clauses and
n = number of symbols.
Hard problems seem to cluster near a critical point,
m/n = 4.3.
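A small Python sketch (not from the slides) of how such random 3-CNF instances are typically generated: each clause picks three distinct symbols and negates each with probability 1/2, so the ratio m/n can be dialed toward the hard region.

import random

def random_3cnf(m, n):
    symbols = [f"X{i}" for i in range(1, n + 1)]
    clauses = []
    for _ in range(m):
        chosen = random.sample(symbols, 3)          # three distinct symbols
        clauses.append(frozenset(s if random.random() < 0.5 else "~" + s
                                 for s in chosen))
    return clauses

# Near the critical ratio m/n ≈ 4.3, e.g. n = 50 symbols and m = 215 clauses.
print(random_3cnf(215, 50)[:2])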
Inference in the Wumpus world
¬P1,1
¬W1,1
Bx,y ⇔ (Px,y+1 ∨ Px,y−1 ∨ Px+1,y ∨ Px−1,y )
Sx,y ⇔ (Wx,y+1 ∨ Wx,y−1 ∨ Wx+1,y ∨ Wx−1,y )
W1,1 ∨ W1,2 ∨ . . . ∨ W4,4
¬W1,1 ∨ W1,2
¬W1,1 ∨ W1,3
...
A propositional Wumpus world agent
PL-WUMPUS-AGENT(percept)
    static: KB, x, y, orientation, visited, action, plan
    UPDATE(x, y, orientation, visited, action)
    if percept.stench
        TELL(KB, Stench_x,y)
    else TELL(KB, ¬Stench_x,y)
    if percept.breeze
        TELL(KB, Breeze_x,y)
    else TELL(KB, ¬Breeze_x,y)
    if percept.glitter
        return action = Grab
    ...
A propositional Wumpus world agent
PL-WUMPUS-AGENT(percept)
    ...
    else if not EMPTY?(plan)
        return action = POP(plan)
    else for each j, k on FRINGE(visited)
        if ASK(KB, ¬P_j,k ∧ ¬W_j,k) is True or
           ASK(KB, P_j,k ∨ W_j,k) is False
            plan = PLAN-PATH(x, y, orientation, j, k, visited)
            return action = POP(plan)
    else return action = CHOOSE-RANDOM-MOVE()
Expressiveness limitation of
propositional logic
KB contains “physics” sentences for every single square.
For every time t and every x, y location Lx,y ,
Lx,y (t) ∧ FacingRight(t) ∧ Forward(t) ⇒ Lx+1,y (t + 1),
etc.
Rapid proliferation of clauses.
Summary
Logical agents apply inference to a knowledge base to
derive new information and make decisions.
• Syntax: Formal structure of sentences.
• Semantics: Truth of sentences with respect to models.
• Entailment: Necessary truth of one sentence given
another.
• Inference: Deriving sentences from other sentences.
• Soundness: Derivations produce only entailed
sentences.
• Completeness: Derivations can produce all entailed
sentences.
Last observations
An agent, as in Wumpus world, needs to represent partial
and negated information, reason by cases, etc.
Resolution is complete for propositional logic.
Forward and backward chaining run in linear time; they
are complete for Horn clauses.
Propositional logic lacks expressive power.