AI - Lect (Solving Problem - 3)
Search
◆ Informed (heuristic) Search strategies
◆ best-first search
◆ search with heuristics
◆ memory-bounded search
◆ iterative improvement search
◆ Constraint Satisfaction
◆ Important Concepts and Terms
◆ Chapter Summary
Search 1
Informed (heuristic) Search
◆ Relies on additional knowledge about the problem or domain
◆ frequently expressed through heuristics (“rules of thumb”)
◆ Used to distinguish more promising paths towards a goal
◆ may be misled, depending on the quality of the heuristic
◆ In general, performs much better than uninformed search
◆ but frequently still exponential in time and space for realistic problems
Best-First Search (BFS)
◆ Relies on an evaluation function that gives an indication of how useful it would be to expand a node
◆ family of search methods with various evaluation functions
◆ usually gives an estimate of the distance to the goal
◆ often referred to as heuristics in this context
Greedy Best-First Search
◆ Minimizes the estimated cost to a goal
◆ expand the node that seems to be closest to a goal
◆ utilizes a heuristic function as evaluation function
❖ f(n) = h(n) = estimated cost from the current node to a goal
❖ heuristic functions are problem-specific
❖ often straight-line distance for route-finding and similar problems
◆ often better than depth-first search in practice, although worst-case time complexity is the same and space complexity is worse
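The strategy above can be sketched in a few lines: always pop the fringe node with the smallest h(n). The grid graph, straight-line-distance heuristic, and function names below are illustrative assumptions, not from the slides.

```python
import heapq

def greedy_best_first(start, goal, neighbors, h):
    """Greedy best-first search: always expand the fringe node with the
    smallest heuristic value h(n) (estimated distance to the goal)."""
    fringe = [(h(start), start, [start])]   # (h-value, node, path)
    visited = set()
    while fringe:
        _, node, path = heapq.heappop(fringe)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for succ in neighbors(node):
            if succ not in visited:
                heapq.heappush(fringe, (h(succ), succ, path + [succ]))
    return None

# Hypothetical toy map: nodes are (x, y) grid points; the heuristic is
# straight-line distance to the goal, as the slide suggests.
graph = {
    (0, 0): [(1, 0), (0, 1)],
    (1, 0): [(2, 0)],
    (0, 1): [(0, 2)],
    (2, 0): [(2, 1)],
    (0, 2): [],
    (2, 1): [(2, 2)],
    (2, 2): [],
}
goal = (2, 2)
h = lambda n: ((n[0] - goal[0]) ** 2 + (n[1] - goal[1]) ** 2) ** 0.5
print(greedy_best_first((0, 0), goal, lambda n: graph[n], h))
```

Note that the search commits to whichever neighbor looks closest; with a misleading heuristic it may wander into the dead-end branch first, exactly as the slide warns.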
Greedy Best-First Search Snapshot
[Figure: tree snapshot of greedy best-first search; node legend: Initial, Visited, Fringe, Current, Visible, Goal; nodes are annotated with their heuristic values]
Greedy Search
Figure 4.2
A* Search
◆ Combines greedy and uniform-cost search to find the (estimated) cheapest path through the current node
Uniform Cost Search – Example (Rev.)
[Graph: edge costs S–A = 1, A–G = 10, S–B = 5, B–G = 5, S–G = 15]
• BFS will find the path S, A, G, with a cost of 11, but S, B, G is cheaper with a cost of 10
• Uniform Cost Search will find the cheaper solution (S, B, G). It also generates S, A, G, but will not return it, as it is not at the head of the queue
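The example can be run directly. A minimal uniform-cost sketch over the slide's graph (function and variable names are my own):

```python
import heapq

# The example graph from the slide: step costs on each edge.
graph = {
    "S": [("A", 1), ("B", 5), ("G", 15)],
    "A": [("G", 10)],
    "B": [("G", 5)],
    "G": [],
}

def uniform_cost(start, goal):
    """Uniform-cost search: expand the fringe node with the lowest
    path cost g(n); the first goal popped off the queue is optimal."""
    fringe = [(0, start, [start])]          # (g-value, node, path)
    best_g = {}
    while fringe:
        g, node, path = heapq.heappop(fringe)
        if node == goal:
            return path, g
        if node in best_g and best_g[node] <= g:
            continue                         # stale, costlier entry
        best_g[node] = g
        for succ, cost in graph[node]:
            heapq.heappush(fringe, (g + cost, succ, path + [succ]))
    return None, float("inf")

print(uniform_cost("S", "G"))   # (['S', 'B', 'G'], 10)
```

The queue does generate S, A, G with cost 11, but the cost-10 entry for S, B, G sits ahead of it, so the cheaper path is returned first, matching the slide.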
Uniform-Cost Snapshot
[Figure: tree snapshot of uniform-cost search; node legend: Initial, Visited, Fringe, Current, Visible, Goal; edges are annotated with their costs]
Fringe: [27(10), 4(11), 25(12), 26(12), 14(13), 24(13), 20(14), 15(16), 21(18)] + [22(16), 23(15)]
A* Search
◆ Combines greedy and uniform-cost search to find the (estimated) cheapest path through the current node
◆ Evaluation Function: f(n) = g(n) + h(n)
❖ g(n): actual path cost from the start node to n (uniform-cost component)
❖ h(n): estimated cost from n to a goal (greedy component)
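To make f(n) = g(n) + h(n) concrete, here is a minimal A* sketch (names and graph encoding are my own). With h ≡ 0 it degenerates to uniform-cost search, so on the earlier example graph it again finds S, B, G with cost 10.

```python
import heapq

def a_star(start, goal, neighbors, h):
    """A* search: expand the fringe node with the lowest
    f(n) = g(n) + h(n), combining uniform-cost (g) and greedy (h)."""
    fringe = [(h(start), 0, start, [start])]   # (f, g, node, path)
    best_g = {start: 0}
    while fringe:
        f, g, node, path = heapq.heappop(fringe)
        if node == goal:
            return path, g
        for succ, cost in neighbors(node):
            g2 = g + cost
            if succ not in best_g or g2 < best_g[succ]:
                best_g[succ] = g2
                heapq.heappush(fringe, (g2 + h(succ), g2, succ, path + [succ]))
    return None, float("inf")

# Reusing the uniform-cost example graph with the trivial heuristic h = 0.
graph = {"S": [("A", 1), ("B", 5), ("G", 15)],
         "A": [("G", 10)], "B": [("G", 5)], "G": []}
print(a_star("S", "G", lambda n: graph[n], lambda n: 0))
```

A sharper (still admissible) h would push promising nodes forward in the queue without changing the returned path.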
A* Search
◆ Consistent heuristics
❖ a heuristic h(n) is consistent (or monotonic) if, for every node n and every successor n' of n generated by an action a:
❖ h(n) ≤ c(n, a, n') + h(n')
◆ heuristics must be admissible
❖ never overestimate the cost to reach the goal (like the straight-line-distance heuristic)
◆ very good search method, but with complexity problems
A* Snapshot
[Figure: tree snapshot of A*; node legend: Initial, Visited, Fringe, Current, Visible, Goal; edges are annotated with costs, nodes with heuristic values and f-costs]
Figure 4.3
A* Properties
◆ The value of f never decreases along any path starting from the initial node (also known as monotonicity of the function)
◆ almost all admissible heuristics show monotonicity
A* Snapshot with Contour f=11
[Figure: A* tree snapshot with the contour f = 11 marked; node legend: Initial, Visited, Fringe, Current, Visible, Goal; edges annotated with costs, nodes with heuristic values and f-costs]
A* Snapshot with Contour f=13
[Figure: A* tree snapshot with the contour f = 13 marked; node legend: Initial, Visited, Fringe, Current, Visible, Goal; edges annotated with costs, nodes with heuristic values and f-costs]
Optimality of A*
◆ A* will find the optimal solution (given an admissible heuristic)
◆ the first solution found is the optimal one
◆ A* is optimally efficient
◆ no other algorithm is guaranteed to expand fewer nodes than A*
◆ A* is not always “the best” algorithm
◆ optimality refers to the expansion of nodes
◆ it generates and keeps all nodes in memory
Complexity of A*
◆ The number of nodes within the goal contour search space is still exponential
◆ with respect to the length of the solution
◆ better than other algorithms, but still problematic
◆ Frequently, space complexity is more severe than time complexity
◆ A* keeps all generated nodes in memory
Memory-Bounded Search
◆ search algorithms that try to conserve memory
◆ most are modifications of A*
◆ iterative deepening A* (IDA*)
◆ Recursive Best-First Search (RBFS)
◆ simplified memory-bounded A* (SMA*)
Iterative Deepening A* (IDA*)
◆ Explores paths within a given contour (f-cost limit) in a depth-first manner
◆ this saves memory space because depth-first keeps only the current path in memory
❖ but it results in repeated computation of earlier contours since it doesn’t remember its history
◆ was the “best” search algorithm for many practical problems for some time
◆ does have problems with difficult domains
❖ contours differ only slightly between states
❖ algorithm frequently switches back and forth
❖ similar to disk thrashing in (old) operating systems
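The contour-bounded depth-first idea can be sketched as follows (helper names are mine, and the toy graph and h ≡ 0 reuse the earlier uniform-cost example): each pass cuts off any path whose f-cost exceeds the current bound, then the bound grows to the smallest f-cost that exceeded it.

```python
def ida_star(start, goal, neighbors, h):
    """IDA*: repeated depth-first searches, each bounded by an f-cost
    limit (contour); the limit grows to the smallest f that exceeded it."""
    def dfs(node, g, bound, path):
        f = g + h(node)
        if f > bound:
            return f, None             # cut off; report the overshoot
        if node == goal:
            return f, path
        minimum = float("inf")
        for succ, cost in neighbors(node):
            if succ in path:           # avoid cycles on the current path
                continue
            t, found = dfs(succ, g + cost, bound, path + [succ])
            if found is not None:
                return t, found
            minimum = min(minimum, t)
        return minimum, None

    bound = h(start)
    while True:
        t, found = dfs(start, 0, bound, [start])
        if found is not None:
            return found
        if t == float("inf"):
            return None                # search space exhausted
        bound = t                      # next contour

graph = {"S": [("A", 1), ("B", 5), ("G", 15)],
         "A": [("G", 10)], "B": [("G", 5)], "G": []}
print(ida_star("S", "G", lambda n: graph[n], lambda n: 0))
```

Only the current path is kept in memory; the price is that each new contour re-explores everything inside the previous one, which is the re-computation the slide mentions.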
Recursive Best-First Search
◆ similar to best-first search, but with lower space requirements
◆ O(bd) instead of O(b^m)
◆ it keeps track of the best alternative to the current path
◆ the best f-value of the paths explored so far from predecessors of the current node
◆ if it needs to re-explore parts of the search space, it knows the best candidate path
◆ still may lead to multiple re-explorations
Simplified Memory-Bounded A* (SMA*)
◆ uses all available memory for the search
◆ drops nodes from the queue when it runs out of space
❖ those with the highest f-costs
◆ avoids re-computation of already explored areas
❖ keeps information about the best path of a “forgotten” subtree in its ancestor
◆ complete if there is enough memory for the shortest solution path
◆ often better than A* and IDA*
❖ but some problems are still too tough
❖ trade-off between time and space requirements
Heuristics for Searching
◆ For many tasks, a good heuristic is the key to finding a solution
◆ prune the search space
◆ move towards the goal
8-Puzzle Heuristics
◆ Level of difficulty
◆ around 20 steps for a typical solution
◆ branching factor is about 3
◆ 9!/2 = 181,440 different reachable states
❖ distinct arrangements of 9 squares
◆ Candidates for heuristic functions
◆ number of tiles in the wrong position
◆ sum of distances of the tiles from their goal position
❖ city block or Manhattan distance
◆ Generation of heuristics
◆ possible from formal specifications
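The two candidate heuristics are easy to state as code. The sketch below assumes a goal layout with the blank in the top-left corner (the slides do not fix one); a state is a tuple listing the tiles row by row, with 0 for the blank.

```python
# Two classic admissible heuristics for the 8-puzzle.
GOAL = (0, 1, 2, 3, 4, 5, 6, 7, 8)   # assumed goal layout, blank first

def misplaced_tiles(state):
    """h1: number of tiles (excluding the blank) out of position."""
    return sum(1 for i, tile in enumerate(state)
               if tile != 0 and tile != GOAL[i])

def manhattan(state):
    """h2: sum of city-block (Manhattan) distances of each tile
    from its goal position; h2 is never smaller than h1."""
    total = 0
    for i, tile in enumerate(state):
        if tile == 0:
            continue
        goal_i = GOAL.index(tile)
        total += abs(i // 3 - goal_i // 3) + abs(i % 3 - goal_i % 3)
    return total

state = (1, 0, 2, 3, 4, 5, 6, 7, 8)   # blank and tile 1 swapped
print(misplaced_tiles(state), manhattan(state))
```

Both never overestimate the true number of moves, so either can drive A*; Manhattan distance dominates misplaced tiles and typically expands far fewer nodes.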
Effective Branching Factor
◆ if a search generates N nodes to find a solution at depth d, b* is the branching factor a uniform tree of depth d would need to contain N + 1 nodes:
N + 1 = 1 + b* + (b*)² + … + (b*)^d
Effective Branching Factor
Example: N = 6; d = 2; b* = 2 (since 1 + 2 + 2² = 7 = N + 1)
Example: N = 2; d = 2; b* = 1 (since 1 + 1 + 1² = 3 = N + 1)
Effective Branching Factor
Good heuristics have a b* close to 1.
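For general d the defining polynomial has no closed-form solution, so b* is usually found numerically. A small sketch using bisection (the helper name is my own):

```python
def effective_branching_factor(n, d, tol=1e-9):
    """Solve N + 1 = 1 + b* + (b*)^2 + ... + (b*)^d for b* by bisection."""
    def total(b):
        return sum(b ** i for i in range(d + 1))   # 1 + b + ... + b^d
    lo, hi = 1e-9, float(n + 1)     # total(b) grows monotonically in b
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if total(mid) < n + 1:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(round(effective_branching_factor(6, 2), 3))   # matches the example: b* = 2
```

Comparing b* across heuristics on the same problem instances is the standard way to quantify how much closer to 1 a better heuristic gets.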
Relaxed problems
◆ a relaxed problem has fewer restrictions on the successor function (operators)
◆ the cost of an optimal solution to a relaxed problem is an admissible heuristic for the original problem
Local Search and Optimization
◆ For some problem classes, it is sufficient to find a solution
◆ the path to the solution is not relevant
◆ Memory requirements can be dramatically relaxed by modifying the current state
◆ all previous states can be discarded
◆ since only information about the current state is kept, such methods are called local
Local Search Algorithms
➢ If the path does not matter, we can use local search.
➢ They use a single current state
➢ Move only to neighbors of that state
Advantages:
✓ Use very little memory
✓ Often find reasonable solutions
Find global maximum or minimum
Landscape
Iterative Improvement Search
◆ For some problems, the state description provides all the information required for a solution
◆ path costs become irrelevant
◆ global maximum or minimum corresponds to the optimal solution
◆ Iterative improvement algorithms start with some configuration, and try modifications to improve the quality
◆ 8-queens: number of un-attacked queens
◆ VLSI layout: total wire length
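As an illustration of iterative improvement, here is a hill-climbing sketch for n-queens that minimizes the number of attacking pairs (the complement of the slide's "un-attacked queens" measure), with random restarts to escape local minima. All names and parameters are illustrative assumptions.

```python
import random

def conflicts(state):
    """Number of attacking queen pairs; 0 means a solution.
    state[c] is the row of the queen in column c."""
    n = len(state)
    return sum(1 for a in range(n) for b in range(a + 1, n)
               if state[a] == state[b] or abs(state[a] - state[b]) == b - a)

def hill_climb(n=8, max_restarts=100):
    """Iterative improvement: repeatedly move one queen within its
    column to the strictly best neighboring state; restart from a
    random configuration when stuck in a local minimum."""
    random.seed(0)                      # deterministic for the demo
    for _ in range(max_restarts):
        state = [random.randrange(n) for _ in range(n)]
        while True:
            best, best_state = conflicts(state), None
            for col in range(n):
                for row in range(n):
                    if row == state[col]:
                        continue
                    neighbor = state[:]
                    neighbor[col] = row
                    c = conflicts(neighbor)
                    if c < best:
                        best, best_state = c, neighbor
            if best_state is None:      # no improving move: local minimum
                break
            state = best_state
        if conflicts(state) == 0:
            return state
    return None

solution = hill_climb()
print(solution, conflicts(solution))
```

Only the current board is ever stored, which is exactly the memory advantage claimed above; the restarts compensate for the method's tendency to get trapped on plateaus and local minima.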