Heuristic Search Methods

HEURISTIC SEARCH
Heuristics:
• Rules for choosing the branches in a state space that are most likely to lead to an acceptable problem solution.
• Rules that provide guidance in decision making.
• Heuristics often improve decision making, but not always. Shopping example: choosing the shortest queue in a supermarket does not necessarily mean that you will get out of the market earlier.
Heuristics are used when:
• the available information has inherent ambiguity, or
• computational costs are high.
Finding Heuristics
Tic-Tac-Toe: which move should X choose?
[Figure: three candidate board states, each with a single X placed on a different square.]
Heuristic: calculate the number of winning lines each state leaves open and move to the state with the most winning lines (see the sketch below).
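A minimal Python sketch of one way to count these winning lines; the board encoding, the LINES table and the function name are illustrative assumptions, not from the slides:

    # A board is a list of 9 cells ('X', 'O' or None), read row by row.
    LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

    def winning_lines(board, player='X'):
        # Lines that pass through one of the player's marks and contain no
        # opponent mark, i.e. lines the player could still complete.
        opponent = 'O' if player == 'X' else 'X'
        return sum(1 for line in LINES
                   if any(board[i] == player for i in line)
                   and all(board[i] != opponent for i in line))

    # On an otherwise empty board: centre = 4 lines, corner = 3, edge = 2.
    for square in (4, 0, 1):
        board = [None] * 9
        board[square] = 'X'
        print(square, winning_lines(board))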
Calculating winning lines
[Figure: the three candidate states, evaluated by the heuristic as 3 winning lines, 4 winning lines, and 3 winning lines respectively.]
Always choose the state with the maximum heuristic value: a maximizing heuristic.
Heuristic: choose the city with the minimum distance.
Always choose the city with the minimum heuristic value: a minimizing heuristic (h_i < h_{i-1}).

Finding Heuristic Functions
8-puzzle
• The average solution cost is about 22 steps (branching factor ≈ 3).
• Exhaustive search to depth 22 visits about 3^22 ≈ 3.1 × 10^10 states.
• A good heuristic function can reduce the search considerably.
CAN YOU THINK OF A HEURISTIC?
8 Puzzle


Two commonly used heuristics:
• h1 = the number of misplaced tiles. For the example start state s, h1(s) = 8.
• h2 = the sum of the distances of the tiles from their goal positions (Manhattan distance). For the same state, h2(s) = 3+1+2+2+2+3+3+2 = 18.
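A minimal Python sketch of both heuristics; the tuple encoding of a state and the particular goal ordering are illustrative assumptions, not taken from the slides:

    GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)   # a state is a 9-tuple, 0 = blank

    def h1(state, goal=GOAL):
        # Number of misplaced tiles (the blank is not counted).
        return sum(1 for s, g in zip(state, goal) if s != 0 and s != g)

    def h2(state, goal=GOAL):
        # Sum of Manhattan distances of each tile from its goal square.
        total = 0
        for idx, tile in enumerate(state):
            if tile == 0:
                continue
            goal_idx = goal.index(tile)
            total += abs(idx // 3 - goal_idx // 3) + abs(idx % 3 - goal_idx % 3)
        return total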
Heuristics: Quality

• Admissibility: the heuristic function should never overestimate the actual cost to the goal.
• Effective branching factor: a heuristic with a lower effective branching factor is better.
• More informed heuristics: a heuristic that has a higher value is more informed than the others.
Algorithms for Heuristic Search
Heuristic search algorithms covered:
• Hill climbing
• Best-first search
• A* algorithm
Hill Climbing:
Proceed to a node only if it is better than the current node.
When is a node better? Apply the heuristic to compare the nodes.
Simplified algorithm:
1. Start with current-state (cs) = initial state.
2. Until cs = goal-state, or there is no change in cs, do:
 (a) Get the successors of cs and use the EVALUATION FUNCTION to assign a score to each successor.
 (b) If one of the successors has a better score than cs, then set the new state to be the successor with the best score.
Navigation Problem
Choose the closest city to travel to next (a minimizing heuristic: h_i < h_{i-1}).
[Map figures omitted.]
Hill climbing can GET STUCK: there is no backtracking.
Hill Climbing: example
[Search tree; a node label such as A:10 gives the node name and its heuristic evaluation. A:10 has successors B:5 and C:3; B has successors D:4 and E:2; E leads to the goal G:0; C has successor F:6.]
• From A, compare B with C.
• C is better (3 < 5), so move to C.
• C's only successor F:6 is worse than C:3, so the search gets stuck at C and never reaches the goal/solution G.
Hill-climbing search

“It is a loop that continually moves in the direction of increasing value (for a maximizing heuristic), or in the direction of decreasing value (for a minimizing heuristic).”
• It terminates when a peak is reached.
• Hill climbing does not look ahead beyond the immediate neighbours of the current state.
• Hill climbing chooses randomly among the set of best successors, if there is more than one.
• Hill climbing is also known as greedy local search.
Hill-climbing search: Algo
function HILL-CLIMBING(problem) returns a state that is a local maximum
 inputs: problem, a problem
 local variables: current, a node; neighbor, a node
 current ← MAKE-NODE(INITIAL-STATE[problem])
 loop do
  neighbor ← a highest-valued successor of current
  if VALUE[neighbor] ≤ VALUE[current] then return STATE[current]
  current ← neighbor
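A minimal Python sketch of the same loop for a maximizing heuristic; the successors and value callbacks are assumed, problem-specific helpers that the slides do not define:

    def hill_climbing(initial_state, successors, value):
        # Greedy local search: repeatedly move to the best-valued successor;
        # stop when no successor improves on the current state.
        current = initial_state
        while True:
            succs = successors(current)
            if not succs:
                return current
            best = max(succs, key=value)
            if value(best) <= value(current):
                return current        # a peak (possibly only a local maximum)
            current = best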
Hill-climbing example



• 8-queens problem (complete-state formulation).
• Successor function: move a single queen to another square in the same column.
• Heuristic function h(n): the number of pairs of queens that are attacking each other, directly or indirectly (a code sketch follows).
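A sketch of this heuristic in Python; encoding a state as a tuple rows, where rows[c] is the row of the queen in column c, is an assumption made for illustration:

    from itertools import combinations

    def attacking_pairs(rows):
        # h(n): pairs of queens that attack each other, directly or
        # indirectly (same row, or same diagonal).
        pairs = 0
        for (c1, r1), (c2, r2) in combinations(enumerate(rows), 2):
            if r1 == r2 or abs(r1 - r2) == abs(c1 - c2):
                pairs += 1
        return pairs

    print(attacking_pairs((0, 4, 7, 5, 2, 6, 1, 3)))   # 0: a solved board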
Hill-climbing example
[Figure omitted.]
(a) A state with h = 17, showing the h-value for each possible successor.
(b) A local minimum in the 8-queens state space (h = 1).
Drawback
• Ridge: a sequence of local maxima that is difficult for hill climbing to navigate.
• Plateau: an area of the state space where the evaluation function is flat.
• Hill climbing GETS STUCK 86% OF THE TIME.
Hill-climbing variations



• Stochastic hill-climbing: random selection among the uphill moves; the selection probability can vary with the steepness of the uphill move.
• First-choice hill-climbing: stochastic hill climbing that generates successors randomly until one better than the current state is found.
• Random-restart hill-climbing: tries to avoid getting stuck in local maxima/minima by restarting from different initial states.
Reading Assignment on Simulated Annealing
Study the methods for escaping the local-minima problem, especially simulated annealing.
You can consult the textbooks (no need to turn in any report).
Best-First Search




• It exploits the state description to estimate how promising each search node is.
• An evaluation function f maps each search node N to a positive real number f(N).
• Traditionally, the smaller f(N), the more promising N.
• Best-first search sorts the nodes in increasing order of f (a random order is assumed among nodes with equal values of f).
• “Best” refers only to the value of f, not to the quality of the actual path; best-first search does not generate optimal paths in general.
Summary: Best-first search

• General approach: best-first search, in which a node is selected for expansion based on an evaluation function f(n); choose the node that appears best according to the heuristic value.
• Idea: the evaluation function measures distance to the goal.
• Implementation: a queue sorted in decreasing order of desirability.
• Special cases: greedy search, A* search.
Evaluation function



• Same heuristic evaluation as hill climbing:
 f(n) = h(n) = estimated cost of the cheapest path from node n to a goal node.
• If n is a goal node, then h(n) = f(n) = 0.
Best First Search Method
Algorithm:
1. Start with agenda (priority queue) = [initial-state].
2. While the agenda is not empty do:
 (a) Remove the best node from the agenda.
 (b) If it is the goal node, return with success; otherwise find its successors.
 (c) Assign each successor a score using the evaluation function and add the scored nodes to the agenda (see the sketch below).
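A minimal Python sketch of this agenda-based loop using heapq; the successors, h and is_goal callbacks are assumed helpers, not given in the slides:

    import heapq
    from itertools import count

    def best_first_search(start, successors, h, is_goal):
        # Expand the node with the smallest evaluation f(n) = h(n) first.
        tie = count()                              # breaks ties between equal scores
        agenda = [(h(start), next(tie), start)]    # the priority-queue agenda
        visited = set()
        while agenda:
            _, _, node = heapq.heappop(agenda)     # (a) remove the best node
            if is_goal(node):                      # (b) goal test
                return node
            if node in visited:
                continue
            visited.add(node)
            for succ in successors(node):          # (c) score successors, add to agenda
                heapq.heappush(agenda, (h(succ), next(tie), succ))
        return None                                # agenda exhausted: failure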
[Figure omitted: the example search tree (A:10; B:5 and C:3; D:4, E:2 and F:6; G:0 = solution), annotated with the routes that breadth-first, depth-first and hill-climbing search would follow.]
Best First Search Method (trace on the example tree)
1. Open [A:10]; closed []. Evaluate A:10.
2. Open [C:3, B:5]; closed [A:10]. Evaluate C:3.
3. Open [B:5, F:6]; closed [C:3, A:10]. Evaluate B:5.
4. Open [E:2, D:4, F:6]; closed [C:3, B:5, A:10]. Evaluate E:2.
5. Open [G:0, D:4, F:6]; closed [E:2, C:3, B:5, A:10]. Evaluate G:0: the solution / goal is reached.
Comments: Best First Search Method
• If the evaluation function is good, best-first search may drastically cut the amount of search that would otherwise be required.
• The first move selected may not be the best one.
• If the evaluation function is very expensive to compute, the benefits may be outweighed by the cost of assigning the scores.
Romania with step costs in km



• hSLD = straight-line distance heuristic.
• hSLD cannot be computed from the problem description itself.
• In this example f(n) = h(n): expand the node that is closest to the goal. This is greedy best-first search.
Greedy search example
Open = [Arad:366]
• Assume that we want to use greedy search to solve the problem of travelling from Arad to Bucharest.
• The initial state = Arad (h = 366).
Greedy search example
Open = [Sibiu:253, Timisoara:329, Zerind:374]
• The first expansion step produces: Sibiu (253), Timisoara (329) and Zerind (374).
• Greedy best-first search will select Sibiu.
Greedy search example
Open = [Fagaras:176, Rimnicu Vilcea:193, Arad:366, Oradea:380]
• If Sibiu is expanded we get: Arad (366), Fagaras (176), Oradea (380) and Rimnicu Vilcea (193).
• Greedy best-first search will select: Fagaras.
Greedy search example
Open = [Bucharest:0, …]: goal achieved
• If Fagaras is expanded we get: Sibiu (253) and Bucharest (0).
• Goal reached!
• Yet the path is not optimal (compare Arad, Sibiu, Rimnicu Vilcea, Pitesti).
Greedy search, evaluation

• Completeness: NO (it behaves like DF-search).
• A check on repeated states is needed.
• Example: with Oradea as the GOAL and Iasi as the start state, what would the path be? Iasi to Neamt to Iasi and so on …
8-Puzzle
f(N) = h(N) = number of misplaced tiles
[Search-tree figure omitted: the h-value of each expanded node.]
Total nodes expanded: 16.
8-Puzzle
f(N) = h(N) = Σ (distances of the tiles from their goal positions)
[Search-tree figure omitted.]
Total nodes expanded: 12 (a saving of 25%).
Robot Navigation
Robot Navigation
f(N) = h(N), with h(N) = Manhattan distance to the goal
[Figure omitted: the grid of h values (Manhattan distances to the goal) over the workspace.]
Robot Navigation
f(N) = h(N), with h(N) = Manhattan distance to the goal
[Figure omitted: the path followed by greedy search over the grid of h values.]
The resulting path is not optimal at all.
Greedy search, evaluation


• Completeness: NO (like DF-search).
• Time complexity: O(b^m), the worst case of DF-search (where m is the maximum depth of the search space). A good heuristic can give a dramatic improvement.

Greedy search, evaluation



• Completeness: NO (like DF-search).
• Time complexity: O(b^m).
• Space complexity: O(b^m); it keeps all nodes in memory.


Greedy search, evaluation




• Completeness: NO (like DF-search).
• Time complexity: O(b^m).
• Space complexity: O(b^m).
• Optimality: NO (same as DF-search).

Robot Navigation Other Examples
[Figure omitted: robot at position (xN, yN), goal at (xg, yg).]
Robot Navigation Other Examples
[Figure omitted: robot at (xN, yN), goal at (xg, yg).]
h1(N) = √((xN - xg)^2 + (yN - yg)^2)   (Euclidean distance)
h2(N) = |xN - xg| + |yN - yg|   (Manhattan distance)
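Both distance heuristics as a small Python sketch; positions are assumed to be (x, y) tuples:

    import math

    def h_euclidean(n, goal):
        # h1(N): straight-line distance from (xN, yN) to (xg, yg).
        (xn, yn), (xg, yg) = n, goal
        return math.hypot(xn - xg, yn - yg)

    def h_manhattan(n, goal):
        # h2(N): Manhattan distance from (xN, yN) to (xg, yg).
        (xn, yn), (xg, yg) = n, goal
        return abs(xn - xg) + abs(yn - yg)

    print(h_euclidean((0, 0), (3, 4)), h_manhattan((0, 0), (3, 4)))   # 5.0 7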
A* Algorithm
Problems with best-first search:
• It reduces the estimated cost to the goal, but
• it is neither optimal nor complete.
• Uniform-cost search, by contrast, uses only the path cost.
Path cost example
[Figure: A:10 has successors B:8 (edge cost 2) and C:9 (edge cost 2); B leads to D:6 (cost 5), D to E:4 (cost 3) and E to the goal F:0 (cost 4); C leads to G:3 (cost 3) and G to the goal F:0 (cost 2).]
Cost1 = 7 (2 + 3 + 2), the path through C and G.
Cost2 = 14 (2 + 5 + 3 + 4), the path through B, D and E.
Path found by hill climbing: A B D E F. Path found by best first: A B D E F.
Is A B D E F the optimal / shortest path? NO.
A* Search
Evaluation function:
 f(n) = g(n) + h(n)
 (path cost to node n + heuristic cost at n)
Constraints:
 h(n) ≤ h*(n)  (admissibility: studied earlier)
 g(n) ≥ g*(n)  (coverage)
Coverage: g(n) ≥ g*(n)
[Figure omitted: g(n) plotted against g*(n); if the coverage condition is violated, the goal will never be reached.]
Observations
h      g      Remarks
h*     -      Immediate convergence: A* converges to the goal (no search)
0      0      Random search
0      1      Breadth-first search
>= h*  -      No convergence
<= h*  -      Admissible search
-      <= g*  No convergence
Example of A*
Path (P1) found by best first / hill climbing: A B D E F, cost P1 = 14 (not optimal).
For the A* algorithm:
 f(A) = 0 + 10 = 10
 f(B) = 2 + 8 = 10, f(C) = 2 + 9 = 11: expand B
 f(D) = (2+5) + 6 = 13, f(C) = 11: expand C
 f(G) = (2+3) + 3 = 8, f(D) = 13: expand G
 f(F) = (2+3+2) + 0 = 7: GOAL achieved
Path A C G F: cost P2 = 7 (optimal).
Path admissibility: cost P2 < cost P1, hence P2 is the admissible path.
Explanation
For the A* algorithm, path (P2):
Start from A: should we move from A to B or from A to C? The step costs are the SAME (2), so compare the total estimated costs:
 f(B) = h(B) + g(B) = 8 + 2 = 10
 f(C) = h(C) + g(C) = 9 + 2 = 11
hence moving through B looks better.
Next consider moving to D: the path cost to D is 2 + 5 = 7 and its heuristic cost is 6, so the total is 7 + 6 = 13.
On the other side, if we move through C to G, the path cost is 2 + 3 = 5 and the heuristic cost is 3, giving a total of 5 + 3 = 8, which is much better than moving through state D. So we now change path and move through G.
Explanation (continued)
The total path cost via G is 2 + 3 + 2 = 7, while the total path cost via D is 2 + 5 + 3 + 4 = 14.
From G we move to the goal node F. Hence moving through G is much better and gives the optimal path.
For A*, never throw away unexpanded nodes:
• always compare paths through expanded and unexpanded nodes;
• avoid expanding paths that are already expensive.
A* search



• Best-known form of best-first search.
• Idea: avoid expanding paths that are already expensive.
• Evaluation function f(n) = g(n) + h(n), where
 g(n) = the cost (so far) to reach the node,
 h(n) = the estimated cost to get from the node to the goal,
 f(n) = the estimated total cost of the path through n to the goal.
A minimal code sketch follows.
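A Python sketch of A* on an explicit graph; the adjacency-dictionary encoding, the callback names and the small test graph (taken from the earlier path-cost example) are illustrative choices, not an implementation given in the slides:

    import heapq
    from itertools import count

    def a_star(start, goal, neighbors, h):
        # Expand the open node with the smallest f = g + h.
        # neighbors(n) yields (successor, step_cost); h(n) is the heuristic.
        tie = count()
        open_heap = [(h(start), next(tie), 0, start)]   # entries: (f, tie, g, node)
        best_g = {start: 0}
        parent = {start: None}
        while open_heap:
            _, _, g, node = heapq.heappop(open_heap)
            if g > best_g[node]:
                continue                                # stale queue entry
            if node == goal:                            # rebuild the path via parents
                path = []
                while node is not None:
                    path.append(node)
                    node = parent[node]
                return path[::-1], g
            for succ, cost in neighbors(node):
                g2 = g + cost
                if g2 < best_g.get(succ, float('inf')):
                    best_g[succ] = g2
                    parent[succ] = node
                    heapq.heappush(open_heap, (g2 + h(succ), next(tie), g2, succ))
        return None, float('inf')

    edges = {'A': [('B', 2), ('C', 2)], 'B': [('D', 5)], 'C': [('G', 3)],
             'D': [('E', 3)], 'E': [('F', 4)], 'G': [('F', 2)], 'F': []}
    hvals = {'A': 10, 'B': 8, 'C': 9, 'D': 6, 'E': 4, 'G': 3, 'F': 0}
    print(a_star('A', 'F', lambda n: edges[n], hvals.get))   # (['A', 'C', 'G', 'F'], 7)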
A* search

• A* search uses an admissible heuristic.
• A heuristic is admissible if it never overestimates the cost to reach the goal, i.e. it is optimistic.
Formally:
1. h(n) ≤ h*(n), where h*(n) is the true cost from n.
2. h(n) ≥ 0, so h(G) = 0 for any goal G.
e.g. hSLD(n) never overestimates the actual road distance.
Romania example
A* search example

• Starting at Arad:
 f(Arad) = c(Arad, Arad) + h(Arad) = 0 + 366 = 366
A* search example

• Expand Arad and determine f(n) for each successor:
 f(Sibiu) = c(Arad, Sibiu) + h(Sibiu) = 140 + 253 = 393
 f(Timisoara) = c(Arad, Timisoara) + h(Timisoara) = 118 + 329 = 447
 f(Zerind) = c(Arad, Zerind) + h(Zerind) = 75 + 374 = 449
• Best choice is Sibiu.
A* search example

• Previous paths:
 f(Timisoara) = c(Arad, Timisoara) + h(Timisoara) = 118 + 329 = 447
 f(Zerind) = c(Arad, Zerind) + h(Zerind) = 75 + 374 = 449
• Expand Sibiu and determine f(n) for each successor:
 f(Arad) = c(Sibiu, Arad) + h(Arad) = 280 + 366 = 646
 f(Fagaras) = c(Sibiu, Fagaras) + h(Fagaras) = 239 + 176 = 415
 f(Oradea) = c(Sibiu, Oradea) + h(Oradea) = 291 + 380 = 671
 f(Rimnicu Vilcea) = c(Sibiu, Rimnicu Vilcea) + h(Rimnicu Vilcea) = 220 + 193 = 413
• Best choice is Rimnicu Vilcea.
A* search example






• Frontier so far: f(Timisoara) = 447, f(Zerind) = 449, f(Arad) = 646, f(Fagaras) = 415, f(Oradea) = 671.
• Expand Rimnicu Vilcea and determine f(n) for each successor:
 f(Craiova) = c(Rimnicu Vilcea, Craiova) + h(Craiova) = 366 + 160 = 526
 f(Pitesti) = c(Rimnicu Vilcea, Pitesti) + h(Pitesti) = 317 + 100 = 417
 f(Sibiu) = c(Rimnicu Vilcea, Sibiu) + h(Sibiu) = 300 + 253 = 553
• Best choice is Fagaras (f = 415).
A* search example
• Frontier so far: f(Timisoara) = 447, f(Zerind) = 449, f(Arad) = 646, f(Oradea) = 671, f(Craiova) = 526, f(Pitesti) = 417, f(Sibiu via Rimnicu Vilcea) = 553.
• Expand Fagaras and determine f(n) for each successor:
 f(Sibiu) = c(Fagaras, Sibiu) + h(Sibiu) = 338 + 253 = 591
 f(Bucharest) = c(Fagaras, Bucharest) + h(Bucharest) = 450 + 0 = 450
• Best choice is Pitesti (f = 417)!
A* search example

• Expand Pitesti and determine f(n) for each successor:
 f(Bucharest) = c(Pitesti, Bucharest) + h(Bucharest) = 418 + 0 = 418
• Best choice is Bucharest!
• This is the optimal solution (but only if h(n) is admissible).
• Note the f values along the optimal path!
Optimality of A* (standard proof)
• Suppose a suboptimal goal G2 is in the queue.
• Let n be an unexpanded node on a shortest path to the optimal goal G.
 f(G2) = g(G2)   since h(G2) = 0
     > g(G)    since G2 is suboptimal
     ≥ f(n)    since h is admissible
• Since f(G2) > f(n), A* will never select G2 for expansion.
BUT … graph search

• Graph search discards new paths to a repeated state, so the previous proof breaks down.
• Solutions:
 - add extra bookkeeping, i.e. remove the more expensive of the two paths;
 - ensure that the optimal path to any repeated state is always followed first.
• The latter places an extra requirement on h(n): consistency (monotonicity).
Consistency

• A heuristic is consistent if
 h(n) ≤ c(n, a, n') + h(n')
• If h is consistent, we have
 f(n') = g(n') + h(n')
     = g(n) + c(n, a, n') + h(n')
     ≥ g(n) + h(n)
     = f(n)
 i.e. f(n) is non-decreasing along any path.

Optimality of A* (more useful)
• A* expands nodes in order of increasing f value.
• Contours can be drawn in the state space (uniform-cost search adds circles).
• f-contours are added gradually:
 1) nodes with f(n) < C*;
 2) some nodes on the goal contour (f(n) = C*).
• Contour i contains all nodes with f = f_i, where f_i < f_{i+1}.
A* search, evaluation

• Completeness: YES, since bands of increasing f are added (unless there are infinitely many nodes with f < f(G)).
A* search, evaluation


• Completeness: YES.
• Time complexity: the number of nodes expanded is still exponential in the length of the solution.
A* search, evaluation



• Completeness: YES.
• Time complexity: exponential in the path length.
• Space complexity: it keeps all generated nodes in memory, hence space rather than time is the major problem.
A* search, evaluation




• Completeness: YES.
• Time complexity: exponential in the path length.
• Space complexity: all generated nodes are stored.
• Optimality: YES.
 - A* cannot expand f_{i+1} until f_i is finished.
 - A* expands all nodes with f(n) < C*.
 - A* expands some nodes with f(n) = C*.
 - A* expands no nodes with f(n) > C*.
• A* is also optimally efficient (not counting ties).
Quality of Heuristics


Admissibility
Effective Branching factor
Admissible heuristics


• A heuristic h(n) is admissible if for every node n, h(n) ≤ h*(n), where h*(n) is the true cost to reach the goal state from n.
• An admissible heuristic never overestimates the cost to reach the goal, i.e. it is optimistic.
Navigation Problem
Shows Actual Road Distances
Heuristic: straight-line (aerial) distance between each city and the goal city.
[Map figure omitted: actual road distances.]
Is the new heuristic admissible?
Heuristic: straight-line (aerial) distance between each city and the goal city.
Is the new heuristic admissible? Check hSLD(n) ≤ h*(n).
Consider n = Sibiu:
 hSLD(Sibiu) = 253
 actual road cost through Pitesti = 80 + 97 + 101 = 278
 actual road cost through Fagaras = 99 + 211 = 310
 so h*(Sibiu) = 278.
hSLD ≤ h*: admissible (it never overestimates the actual road distance).

Which heuristic is admissible?
• h1 = the number of misplaced tiles; h1(s) = 8.
• h2 = the sum of the distances of the tiles from their goal positions (Manhattan distance); h2(s) = 3+1+2+2+2+3+3+2 = 18.
Answer: BOTH.



New Heuristic: Permutation Inversions
Let the goal be the standard 15-puzzle ordering (tiles 1-15 read left to right and top to bottom, blank last).
Let nI be the number of tiles J < I that appear after tile I (reading the state from left to right and top to bottom).
 h3 = n2 + n3 + … + n15 + (row number of the empty tile)
For the example state 1 2 3 4 / 5 10 7 8 / 9 6 11 12 / 13 14 15 _ :
 n2 = 0, n3 = 0, n4 = 0, n5 = 0, n6 = 0, n7 = 1, n8 = 1, n9 = 1, n10 = 4,
 n11 = 0, n12 = 0, n13 = 0, n14 = 0, n15 = 0
 h3 = 7 + 4 = 11
Is h3 admissible? (A code sketch of h3 follows.)
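A Python sketch of h3 for the 15-puzzle; the 16-tuple state encoding (read left to right, top to bottom, with 0 for the blank) and the standard ascending goal are illustrative assumptions, not fixed by the slides:

    def h3(state, width=4):
        # Permutation-inversion heuristic: for each tile I, count the tiles
        # J < I that appear after it; then add the (1-based) row of the blank.
        tiles = [t for t in state if t != 0]              # scan order, blank removed
        inversions = sum(1 for i, ti in enumerate(tiles)
                         for tj in tiles[i + 1:] if tj < ti)
        blank_row = state.index(0) // width + 1
        return inversions + blank_row

    example = (1, 2, 3, 4, 5, 10, 7, 8, 9, 6, 11, 12, 13, 14, 15, 0)
    print(h3(example))   # 7 + 4 = 11, as computed above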
New Heuristic: Permutation Inversions
Is h3 admissible? For the example state, h3 = 7 + 4 = 11.
h* = the actual number of moves required to reach the goal.
If h3 ≤ h* then h3 is admissible, otherwise it is not.
[Figure: the example state 1 2 3 4 / 5 10 7 8 / 9 6 11 12 / 13 14 15 _ and the goal state 1 2 3 4 / 5 6 7 8 / 9 10 11 12 / 13 14 15 _ .]
Find out yourself.
New Heuristic: Permutation Inversions
STATE(N):    1 2 3 / 4 5 6 / 7 8 _
STATE(Goal): 1 2 _ / 4 5 3 / 7 8 6
Is h3 admissible here?
 h3 = 6 + 1 = 7 (6: blocks 3 and 6 require 2 jumps each; 1: the empty block is in row 1)
 h* = 2 (actual moves)
So h3 is not admissible.
Example
STATE(N):   5 8 _ / 4 2 1 / 7 3 6
GOAL STATE: 1 2 3 / x x x / x x x   (x = don't care)
The goal contains partial information: only the sequence in the first row is important.
Finding admissible heuristics

Relaxed Problem:
Admissible heuristics can be derived from the exact solution cost of a relaxed version of the problem:
• Relaxed 8-puzzle for h1: a tile can move anywhere. As a result, h1(n) gives the length of the shortest solution of the relaxed problem.
• Relaxed 8-puzzle for h2: a tile can move to any adjacent square. As a result, h2(n) gives the length of the shortest solution of the relaxed problem.
The optimal solution cost of a relaxed problem is no greater than the optimal solution cost of the real problem.


Relaxed Problem: Example
• Derive the heuristic by solving relaxed problems at each node.
• In the 8-puzzle, the sum of the distances of each tile to its goal position (h2) corresponds to solving 8 simple single-tile problems.
[Figure: the example state 5 8 _ / 4 2 1 / 7 3 6 and the goal 1 2 3 / 4 5 6 / 7 8 _ .]
• It ignores negative interactions among tiles.
• Store the solution pattern in the database.
Relaxed Problem: Example
[Figure: the same state and goal, decomposed into single-tile subproblems for tiles 8, 5, 6, …; each subproblem places one tile on the board together with its goal square.]
h = h8 + h5 + h6 + …

Complex: Relaxed Problem
Consider two more complex relaxed problems:
[Figure: the example state and goal, and two relaxed subproblems: one involving only tiles 1-4 and one involving only tiles 5-8.]
h = d1234 + d5678   [disjoint pattern heuristic]
These distances could have been pre-computed and stored in a database.
Relaxed Problem
Several order-of-magnitude speedups for the 15- and 24-puzzle problems have been obtained through the application of relaxed-problem heuristics.
Finding admissible heuristics

Find an admissible heuristic through experience:
• solve lots of puzzles;
• an inductive learning algorithm can then be employed to predict costs for new states that arise during search.
Summary: Admissible Heuristic

Defining and solving sub-problems:
• Admissible heuristics can also be derived from the solution cost of a sub-problem of a given problem.
• This cost is a lower bound on the cost of the real problem.
• Pattern databases store the exact solution for every possible sub-problem instance.
• The complete heuristic is constructed using the patterns in the DB.
Learning through experience.
More on Heuristic functions

8-puzzle (recap)
• The average solution cost is about 22 steps (branching factor ≈ 3).
• Exhaustive search to depth 22 visits about 3^22 ≈ 3.1 × 10^10 states.
• A good heuristic function can reduce the search considerably.
CAN YOU THINK OF A HEURISTIC?
8 Puzzle


Two commonly used heuristics (recap):
• h1 = the number of misplaced tiles; h1(s) = 8.
• h2 = the sum of the distances of the tiles from their goal positions (Manhattan distance); h2(s) = 3+1+2+2+2+3+3+2 = 18.
Heuristic quality

• Effective branching factor b*: the branching factor that a uniform tree of depth d would need in order to contain N + 1 nodes, where N is the number of nodes the search expanded:
 N + 1 = 1 + b* + (b*)^2 + … + (b*)^d
• A good heuristic should have b* as low as possible; a good value of b* is 1.
• This measure is fairly constant for sufficiently hard problems, so it can provide a good guide to the heuristic's overall usefulness. A small numeric sketch follows.
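A small Python sketch (an assumed helper, not from the slides) that solves N + 1 = 1 + b* + (b*)^2 + … + (b*)^d for b* by bisection:

    def effective_branching_factor(N, d, tol=1e-6):
        # Solve N + 1 = 1 + b + b^2 + ... + b^d for b, using bisection.
        def total(b):
            return sum(b ** i for i in range(d + 1))   # 1 + b + ... + b^d
        lo, hi = 1.0, max(2.0, float(N))               # b* lies in [1, N] for N > d
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if total(mid) < N + 1:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    # Example: a search that expands N = 52 nodes for a solution at depth d = 5
    print(round(effective_branching_factor(52, 5), 2))   # about 1.92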


Heuristic quality and dominance
• The effective branching factor of h2 is lower than that of h1.
• If h2(n) ≥ h1(n) for all n, then h2 dominates h1 and is better for search (compare the values of h2 and h1, e.g. 18 vs 8).