Artificial Intelligence - Informed Search

Description

A-Level Artificial Intelligence Flashcards on Artificial Intelligence - Informed Search, created by Wojciech Gryncze on 10/01/2016.

Resource summary

Question Answer
Uninformed search vs Informed search Uninformed search methods have access only to the problem definition. Informed search methods may have access to a heuristic function h(n) that estimates the cost of a solution from n.
Best-first search The generic best-first search algorithm selects a node for expansion according to an evaluation function f(n): at each step, the frontier node with the lowest f-value is expanded.
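As an illustration of the card above, here is a minimal Python sketch of generic best-first search; the adjacency-dict graph format, the best_first_search name, and the evaluation-function signature f(g, state) are assumptions made for this example, not part of the original flashcards.

```python
import heapq

def best_first_search(start, goal, graph, f):
    """Generic best-first search: repeatedly expand the frontier node with the
    lowest value of the evaluation function f.
    graph: dict mapping state -> {neighbour: step_cost}
    f: function (path_cost_so_far, state) -> priority
    Returns (path, path_cost) or None if the goal is unreachable."""
    frontier = [(f(0, start), 0, start, [start])]   # (priority, g, state, path)
    best_g = {start: 0}                             # cheapest known path cost per state
    while frontier:
        _, g, state, path = heapq.heappop(frontier)
        if state == goal:
            return path, g
        for nxt, step in graph.get(state, {}).items():
            g2 = g + step
            if g2 < best_g.get(nxt, float("inf")):  # only keep improving paths
                best_g[nxt] = g2
                heapq.heappush(frontier, (f(g2, nxt), g2, nxt, path + [nxt]))
    return None
```

Greedy best-first search and A* are obtained from this same routine just by changing the evaluation function f, as the example after the A* card shows.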
Greedy best-first search Greedy best-first search expands nodes with minimal h(n). It is not optimal but is often efficient.
A* search (combines greedy and uniform-cost search) A* expands nodes with minimal f(n) = g(n) + h(n), where g(n) is the path cost so far and h(n) the estimated cost from n to the goal. A* is complete and optimal, provided that h(n) is admissible (for TREE-SEARCH) or consistent (for GRAPH-SEARCH). The space complexity of A* is still prohibitive.
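Building on the best_first_search sketch from the best-first card above (this snippet assumes that function is already defined), greedy best-first and A* differ only in their evaluation function. The toy graph and heuristic values below are invented for illustration; H is admissible (and consistent) for this graph.

```python
# Toy state graph: state -> {neighbour: step_cost} (illustrative values).
GRAPH = {
    "S": {"A": 1, "B": 4},
    "A": {"C": 1, "G": 12},
    "B": {"G": 2},
    "C": {"G": 1},
    "G": {},
}
# Heuristic estimate of the remaining cost from each state to the goal G.
H = {"S": 2, "A": 2, "B": 1, "C": 1, "G": 0}

# Greedy best-first: f(n) = h(n). Fast, but returns the suboptimal S-B-G (cost 6).
print(best_first_search("S", "G", GRAPH, lambda g, n: H[n]))
# A*: f(n) = g(n) + h(n). Returns the optimal path S-A-C-G (cost 3).
print(best_first_search("S", "G", GRAPH, lambda g, n: g + H[n]))
```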
Admissible heuristic? h(n) is admissible if h(n) never overestimates the cost to reach a goal. Consequences: • h(n) is always optimistic • f(n) never overestimates the cost of a solution through node n • h(n) = 0 if n is a goal
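On a small explicit graph, admissibility can be sanity-checked by comparing h(n) with the true cheapest cost h*(n) to the goal. The helper below is an illustrative sketch (the function names and the adjacency-dict/heuristic-dict format are assumptions matching the earlier examples); it computes h* with Dijkstra on the reversed graph.

```python
import heapq

def true_costs_to_goal(graph, goal):
    """Cheapest cost h*(n) from every node to `goal`, via Dijkstra on the
    reversed graph. graph: state -> {neighbour: step_cost}."""
    reverse = {}
    for u, edges in graph.items():
        for v, c in edges.items():
            reverse.setdefault(v, {})[u] = c
    dist = {goal: 0}
    frontier = [(0, goal)]
    while frontier:
        d, u = heapq.heappop(frontier)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, c in reverse.get(u, {}).items():
            if d + c < dist.get(v, float("inf")):
                dist[v] = d + c
                heapq.heappush(frontier, (d + c, v))
    return dist

def is_admissible(graph, h, goal):
    """h is admissible iff h(n) <= h*(n) for every node that can reach the goal.
    Assumes h assigns a value to every such node."""
    hstar = true_costs_to_goal(graph, goal)
    return all(h[n] <= cost for n, cost in hstar.items())
```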
Consistent heuristic The estimated cost from a node n to the goal is no greater than the step cost from n to a successor n' plus the estimated cost from n' to the goal: h(n) ≤ c(n, a, n') + h(n'). Every consistent heuristic is admissible.
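Consistency can be checked directly from the definition by testing h(n) ≤ c(n, a, n') + h(n') on every edge. A minimal illustrative helper in the same assumed graph/heuristic format; the h(goal) = 0 condition comes from the admissible-heuristic card above.

```python
def is_consistent(graph, h, goal):
    """h is consistent iff h(n) <= c(n, a, n') + h(n') for every edge (n, n'),
    with h(goal) == 0. graph: state -> {neighbour: step_cost}."""
    if h.get(goal, 0) != 0:
        return False
    return all(h[n] <= cost + h[n2]
               for n, edges in graph.items()
               for n2, cost in edges.items())
```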
Triangle inequality? Each side of a triangle cannot be longer than the sum of the other two sides. • h(n): estimated cost from n to the goal G • c(n, a, n'): cost of going from n to n' with action a • h(n'): estimated cost from n' to G
A* is Optimal: Proof A* (using Tree Search) is optimal if h(n) is admissible.
Proof:
• A solution is a path from the initial state to a goal state.
• Let C* be the lowest path cost among all solutions.
• We must show that A* will not return a suboptimal path to a goal.
Part (1)
• Let G be a goal node in the fringe that lies on a suboptimal path.
• Its path cost g(G) = C is not the lowest one (C > C*).
• f(G) = g(G) + h(G), and h(G) = 0 because G is a goal node and h is admissible.
• Therefore f(G) = g(G) = C > C*, i.e. f(G) > C*.
Part (2)
• Let n be a node in the fringe that lies on the path to the optimal solution (cost C*).
• f(n) = g(n) + h(n) ≤ C* because h is admissible.
Part (3)
• f(n) ≤ C* < f(G), so node G will never be selected before n (end of proof).
Consequence
• A* expands no nodes with f(n) > C*.
Properties of consistency? • If a heuristic is consistent, then it is also admissible • A* (using Graph Search) is optimal if h(n) is consistent
Monotonicity of f(n) If h(n) is consistent (for any action a taking a node n to a successor n', h(n) ≤ c(n, a, n') + h(n')), then f(n) is non-decreasing along any path: f(n') = g(n') + h(n') = g(n) + c(n, a, n') + h(n') ≥ g(n) + h(n) = f(n).
Properties of A* • With h(n) consistent, the sequence of nodes expanded by A* using Graph-Search is in non-decreasing order of f(n) • A* (using Graph-Search) is optimal if h(n) is consistent • A* expands all nodes with f(n) < C* • A* expands some (at least one) of the nodes on the C* contour before finding the goal • A* expands no nodes with f(n) > C*
A* is optimally efficient? A* is optimally efficient because it expands no nodes with f(n) > C*. No algorithm using the same heuristic can do better: any algorithm that does not expand all nodes with f(n) < C* risks missing the optimal solution.
Iterative Deepening A* IDA* combines A* with iterative deepening, using the f-cost (g + h) as the cutoff rather than the depth. At each iteration, the cutoff value is the smallest f-cost of any node that exceeded the cutoff on the previous iteration.
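A compact Python sketch of IDA*, assuming the same adjacency-dict graph and heuristic-dict format as the earlier examples; the depth-first helper and the cutoff handling follow the description on the card.

```python
import math

def ida_star(start, goal, graph, h):
    """Iterative Deepening A*: depth-first contours bounded by f = g + h.
    Each iteration raises the cutoff to the smallest f-value that exceeded
    the previous cutoff. graph: state -> {neighbour: step_cost}."""
    def search(path, g, bound):
        node = path[-1]
        f = g + h[node]
        if f > bound:
            return f                    # report the f-value that exceeded the cutoff
        if node == goal:
            return path
        minimum = math.inf
        for nxt, cost in graph.get(node, {}).items():
            if nxt in path:             # avoid cycles along the current path
                continue
            result = search(path + [nxt], g + cost, bound)
            if isinstance(result, list):
                return result           # solution path found
            minimum = min(minimum, result)
        return minimum

    bound = h[start]
    while True:
        result = search([start], 0, bound)
        if isinstance(result, list):
            return result
        if result == math.inf:
            return None                 # no solution
        bound = result                  # smallest f that exceeded the old cutoff
```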
Recursive Best-First Search A simple recursive algorithm that runs in linear space. Its structure is the same as depth-first search, except that it does not keep going down indefinitely: it uses an f_limit variable to keep track of the f-value of the best alternative path available from any ancestor of the current node, and when the current node exceeds this limit the recursion unwinds back to that alternative, backing up the best f-value of the forgotten subtree. IDA* and RBFS suffer from using too little memory.
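A Python sketch of RBFS in the same assumed graph/heuristic format, following the standard AIMA-style pseudocode; the path-based cycle check is an addition for this example so the sketch also behaves on graphs with loops.

```python
import math

def rbfs(start, goal, graph, h):
    """Recursive Best-First Search: linear-space best-first search that keeps
    only the current path plus sibling f-values, backing up the best f-value
    of a forgotten subtree into its parent."""
    def recurse(node, path, g, f, f_limit):
        if node == goal:
            return path, f
        successors = []
        for nxt, cost in graph.get(node, {}).items():
            if nxt in path:                       # skip cycles on the current path
                continue
            g2 = g + cost
            # Inherit the parent's backed-up f-value if it is larger.
            successors.append([max(g2 + h[nxt], f), g2, nxt])
        if not successors:
            return None, math.inf
        while True:
            successors.sort()                     # best (lowest f) first
            best_f, best_g, best = successors[0]
            if best_f > f_limit or math.isinf(best_f):
                return None, best_f               # back up the best forgotten f
            alternative = successors[1][0] if len(successors) > 1 else math.inf
            result, successors[0][0] = recurse(
                best, path + [best], best_g, best_f, min(f_limit, alternative))
            if result is not None:
                return result, successors[0][0]
            # Otherwise best_f was updated; re-sort and try the next-best child.

    solution, _ = recurse(start, [start], 0, h[start], math.inf)
    return solution
```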
Simple Memory-Bounded A* SMA* runs like A* until memory is full. When memory is full, it must discard a node in order to add a new one; it discards the worst one (the node with the highest f-value). SMA* backs up the value of the forgotten node to its parent, so that the ancestor knows the quality of the best path in that subtree.
Informed search comparison
Designing heuristics Choose an admissible and consistent heuristic. Among admissible heuristics, prefer the most dominant one (h2 dominates h1 if h2(n) ≥ h1(n) for all n): the dominant heuristic is the better informed one and leads A* to expand fewer nodes.
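As a concrete illustration of dominance, the two classic 8-puzzle heuristics below (misplaced tiles and Manhattan distance) are both admissible, and Manhattan distance dominates misplaced tiles; the tuple state encoding and the sample state are illustrative choices for this example.

```python
# 8-puzzle states are tuples of 9 entries read row by row, with 0 as the blank.
GOAL = (0, 1, 2, 3, 4, 5, 6, 7, 8)

def h_misplaced(state):
    """h1: number of tiles not in their goal position (blank excluded)."""
    return sum(1 for i, tile in enumerate(state) if tile != 0 and tile != GOAL[i])

def h_manhattan(state):
    """h2: sum of the Manhattan distances of each tile from its goal position."""
    total = 0
    for i, tile in enumerate(state):
        if tile == 0:
            continue
        goal_i = GOAL.index(tile)
        total += abs(i // 3 - goal_i // 3) + abs(i % 3 - goal_i % 3)
    return total

# h2 dominates h1: h2(n) >= h1(n) for every state, so A* with h2 never expands
# more nodes than A* with h1 (and both remain admissible).
state = (7, 2, 4, 5, 0, 6, 8, 3, 1)
print(h_misplaced(state), h_manhattan(state))   # -> 8 18
```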

Similar

Cognition: Artificial Intelligence
Hannah Shakeshaft
Artificial Intelligence - Solving Problems by Searching
Wojciech Gryncze
A2 CCEA Digital Technology (Whole course)
Abbie Welch
Functional Skills screening
Matt Freeman
Grafisk kommunikation
Pierre Norrbrink
How many people speak Spanish over time changed compared to all of the other major languages worldwide?
fionntanflanagan
Intelligence artificielle : ...
Manon Bricard
Artificial Intellegence
nicky elin
INTELIGENCIA ARTIFICIAL
Lizeth León
Artificial Intelligence
Fynder Enlil