A3 Solutions

The document contains solutions to a series of questions related to informed search algorithms, including A* search and its properties, heuristic functions, and the efficiency of different search strategies. It discusses the implications of various heuristic functions, the behavior of algorithms like DFS and IDA*, and the conditions under which these algorithms guarantee optimal solutions. Each solution is accompanied by explanations and justifications for the answers provided.


NPTEL: AI Assignment-3

Informed Search

Solution Q1) Answer: AD


Explanation:
Consider f(n) = (2-w) g(n) + w h(n)

- For w = 1, we get f(n) = g(n) + h(n). This is the A* search algorithm.


- For w = 2, we get f(n) = 2 h(n). This represents (scaled) greedy best-first
search. As explained in the videos, this is not complete and may get stuck in
loops (for example, Iasi -> Neamt -> Iasi -> Neamt -> ...).
- For w > 2, substitute y = w - 2 > 0. We get f(n) = (y+2) h(n) - y g(n). The
negative weight on g(n) makes more expensive partial paths look better, and it
is easy to construct counter-examples showing this is not optimal.
- For w = 0, we get f(n) = 2 g(n), which represents UCS (the factor of 2 just
scales the path costs).
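
The effect of w can be checked numerically. A small sketch with made-up (g, h)
values for a few hypothetical frontier nodes; only the ordering induced by f(n)
matters:

```python
def weighted_f(g, h, w):
    """Evaluation function f(n) = (2 - w) * g(n) + w * h(n)."""
    return (2 - w) * g + w * h

# Hypothetical (g, h) values for three frontier nodes.
frontier = {"n1": (3, 7), "n2": (6, 2), "n3": (9, 0)}

for w, name in [(0, "UCS (2*g)"), (1, "A* (g+h)"), (2, "greedy (2*h)")]:
    scores = {n: weighted_f(g, h, w) for n, (g, h) in frontier.items()}
    best = min(scores, key=scores.get)
    print(f"w={w} -> {name}: expand {best}, scores={scores}")
```

Each setting of w picks a different node here, which is exactly the point: the
weight changes which frontier node looks best.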

Solution Q2) Answer: SAG


Explanation:
- At S, we need to choose the next node. The candidates are A and B.
- For A, f(n) = 4 + 5*(1) = 9, and for B, f(n) = 2 + 5*(5) = 27. Thus, we choose
node A.
- At node A, we can now go to G. For G, g(n) = 4 + 4 = 8 and h(n) = 0, so f(n) = 8.
Since f(n) for G is less than that of B, we expand G and reach the goal.

Solution Q3) Answer: acbfdez


Explanation:
- Initialize a priority queue pq and push (a, 21) into it.
- Pop the min-cost entry from pq. Since a is the only entry, we pop a, add a to
the answer, and push the neighbors of a into pq: (b, 23), (c, 22) and (d, 25).
- Pop the min-cost entry from pq: c. Add c to the answer and push the neighbors
of c into pq: (f, {4+12+8 =} 24) and (e, {4+17+5 =} 26).
- Pop the min-cost entry from pq: b. Add b to the answer and push the neighbors
of b. The only neighbor of b is e; f(n) for e via this path is 9+11+5 = 25.
Since 25 < 26, we update the cost of e in the pq to 25.
- Pop the min-cost entry from pq: f. Add f to the answer and push
(z, {4+12+9 =} 25) into the pq.
- Pop the min-cost entry from pq. We have three candidates with the same value
of 25 (d, e and z), so we use lexicographic ordering to break the ties and pop
d first.
- Note that after we pop d and e, all remaining entries have cost > 25. Thus, the
goal z is reached at distance 25, and we add “dez” to our answer.
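
The trace above is standard A* with a priority queue and lexicographic
tie-breaking. Below is a minimal sketch of that procedure; the graph and
heuristic shown are hypothetical placeholders, not the assignment's actual graph.

```python
import heapq

def astar(graph, h, start, goal):
    """A* search. graph maps node -> {neighbor: edge_cost}; h maps node -> heuristic.
    Entries are (f, node) tuples, so equal f-values break ties lexicographically."""
    pq = [(h[start], start)]          # f(start) = 0 + h(start)
    g = {start: 0}
    order = []                        # expansion order, as asked for in Q3
    closed = set()
    while pq:
        f, node = heapq.heappop(pq)
        if node in closed:
            continue                  # stale entry with an outdated cost
        closed.add(node)
        order.append(node)
        if node == goal:
            return order, g[node]
        for nbr, cost in graph[node].items():
            new_g = g[node] + cost
            if nbr not in g or new_g < g[nbr]:
                g[nbr] = new_g
                heapq.heappush(pq, (new_g + h[nbr], nbr))
    return order, None

# Hypothetical example (not the assignment's graph):
graph = {"a": {"b": 9, "c": 4}, "b": {"z": 20}, "c": {"z": 21}, "z": {}}
h = {"a": 21, "b": 18, "c": 20, "z": 0}
print(astar(graph, h, "a", "z"))
```

Rather than decreasing a key in place (as in the "update the cost of e" step
above), the sketch pushes a new entry and discards stale ones when they are
popped; with a consistent heuristic each node is still expanded at most once.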

Solution Q4) Answer: A


Explanation:
- h+1 can never be an admissible heuristic. This is because the true remaining
cost at a goal state is 0, so an admissible heuristic must have h(goal) = 0,
whereas h+1 gives a value of at least 1 at the goal.
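
Spelling the argument out, with h* denoting the true cost-to-go:

```latex
h^*(\mathrm{goal}) = 0, \qquad
\text{admissibility requires } h(n) \le h^*(n)\ \forall n, \qquad
\text{but } (h+1)(\mathrm{goal}) \ge 1 > 0 = h^*(\mathrm{goal}).
```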

Solution Q5) Answer: BCD


Explanation:
- Note that h1 + h2 is at least as large as each of h1 and h2 and can exceed the
true cost, so it may or may not be admissible.
- min(h1, h2) <= h1 and min(h1, h2) <= h2, hence it is guaranteed to be
admissible.
- max(h1, h2) equals either h1 or h2 at every node, and both are <= the true
cost, hence it is guaranteed to be admissible.
- αh1 + (1-α)h2 for α ∈ [0,1] lies between h1 and h2, so it is <= max(h1, h2).
Since max(h1, h2) is admissible (option C), this is also admissible.
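
The options reduce to the following inequalities, given h1(n) <= h*(n) and
h2(n) <= h*(n) for every n:

```latex
\min(h_1, h_2) \;\le\; \max(h_1, h_2) \;\le\; h^*, \qquad
\alpha h_1 + (1-\alpha) h_2 \;\le\; \max(h_1, h_2) \;\le\; h^* \quad (\alpha \in [0,1]),
```

whereas h1 + h2 is only bounded by 2h*, so it can overestimate.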

Solution Q6) Answer: 2

i. This statement is false. A* with negative edge costs may not return the optimal
solution: its optimality argument assumes an admissible heuristic and non-negative
step costs, and with negative edges the cost of a partial path no longer bounds the
cost of its extensions, so A* can select paths that are not optimal.

ii. This statement is true. Iterative Deepening A* (IDA*) does not require a priority
queue because it performs a series of depth-first searches bounded by an f-cost
threshold, while A* relies on a priority queue to expand nodes in order of their
f-cost (see the sketch after statement v).

iii. This statement is true. If h1 and h2 are admissible, then max(h1, h2) will always
be greater than or equal to both h1 and h2. Therefore, max(h1, h2) dominates h1
and h2.

iv. This statement is false. An inconsistent heuristic can still be admissible if it never
overestimates the true cost to reach the goal. Consistency is the stronger property:
consistency implies admissibility, but not the other way around.

v. This statement is false. Depth First Search (DFS) does not guarantee optimality,
and it can terminate faster than A* in certain cases. A* with an admissible heuristic is
designed to find the optimal solution, but it may take longer due to its more informed
exploration of the search space.
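
To illustrate statement ii: IDA* replaces A*'s priority queue with repeated
depth-first searches bounded by an f-cost threshold. A minimal sketch, assuming
the graph and heuristic are supplied as plain dictionaries; the graph below is
hypothetical, not from the assignment.

```python
def ida_star(graph, h, start, goal):
    """Iterative Deepening A*: depth-first search bounded by an f-cost threshold.
    graph maps node -> {neighbor: edge_cost}; h maps node -> heuristic value."""

    def dfs(node, g, threshold, path):
        f = g + h[node]
        if f > threshold:
            return f, None            # report the f-value that broke the threshold
        if node == goal:
            return f, list(path)
        next_threshold = float("inf")
        for nbr, cost in graph[node].items():
            if nbr in path:           # avoid cycles along the current path
                continue
            path.append(nbr)
            t, found = dfs(nbr, g + cost, threshold, path)
            path.pop()
            if found is not None:
                return t, found
            next_threshold = min(next_threshold, t)
        return next_threshold, None

    threshold = h[start]
    while True:
        t, found = dfs(start, 0, threshold, [start])
        if found is not None:
            return found, t
        if t == float("inf"):
            return None, None         # no solution exists
        threshold = t                 # deepen to the next f-cost contour

# Hypothetical example:
graph = {"s": {"a": 2, "b": 5}, "a": {"g": 4}, "b": {"g": 1}, "g": {}}
h = {"s": 5, "a": 3, "b": 1, "g": 0}
print(ida_star(graph, h, "s", "g"))
```

The only state kept between iterations is the scalar threshold, which is why no
priority queue (and only linear memory in the path length) is needed.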

Solution Q7) Answer: BC

- b) This is correct because DFS B&B can quickly reach a solution (even if it is
suboptimal) and then use that solution's cost as a bound for pruning (see the
sketch after this list). If initial solutions are easily found, the algorithm
can effectively eliminate many paths that cannot lead to a better solution,
thereby improving efficiency.

- c) The branch and bound technique involves using bounds (cost limits) to
prune subtrees that cannot possibly lead to a better solution than the best one
found so far. This reduces the search space and improves efficiency by
avoiding the exploration of paths that are guaranteed to be less optimal.

The incorrect options are:

- a) While DFS B&B aims to be optimal by pruning paths that cannot lead to a
better solution, its performance and optimality can be severely hampered in
infinite search spaces due to its depth-first nature, potentially getting stuck in
deep, non-terminating paths without ever exploring other potentially optimal
paths.

- d) DFS B&B does not help when there is a single path to the goal. Its
efficiency depends more on the ability to quickly find a suboptimal solution to
establish bounds for pruning and less on the number of solutions. In cases
where the single solution is deep or difficult to find without guidance, DFS
B&B might not perform efficiently.
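
A minimal sketch of the pruning idea in DFS branch and bound, on a hypothetical
graph: the cost of the best solution found so far serves as the bound, and any
partial path whose cost already reaches it is cut off.

```python
def dfs_branch_and_bound(graph, start, goal):
    """DFS that keeps the best solution cost found so far as an upper bound
    and prunes any partial path whose cost already reaches that bound."""
    best_cost = float("inf")
    best_path = None

    def dfs(node, cost, path):
        nonlocal best_cost, best_path
        if cost >= best_cost:
            return                      # prune: cannot improve on the bound
        if node == goal:
            best_cost, best_path = cost, list(path)
            return
        for nbr, edge in graph[node].items():
            if nbr in path:             # avoid cycles on the current path
                continue
            path.append(nbr)
            dfs(nbr, cost + edge, path)
            path.pop()

    dfs(start, 0, [start])
    return best_path, best_cost

# Hypothetical example: the cheap solution found first tightens the bound
# and prunes the expensive branch through "d".
graph = {"s": {"a": 1, "d": 9}, "a": {"g": 3}, "d": {"g": 1}, "g": {}}
print(dfs_branch_and_bound(graph, "s", "g"))
```

If an admissible heuristic is available, the pruning test can be strengthened to
cost + h(node) >= best_cost without losing optimality.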

Solution Q8) Answer: CD


- This statement is incorrect. Removing edges from a graph actually makes the
problem harder or even unsolvable, as it restricts the possible paths from the
source to the destination. Problem relaxation typically involves loosening
constraints, such as removing restrictions or adding edges, which can make it
easier to find a solution, albeit a suboptimal one for the original problem.
- This statement is incorrect. As we remove more constraints, the problem
becomes easier, and hence h2 gives lower values than h1 for the same nodes,
which means h1 dominates h2.
- The total time to solve a problem = time to compute the heuristic + time to do
the actual search using the heuristic.
- If we remove all constraints, the heuristic is easy to compute but is not
very informative for the search, so the search itself takes long and the
total time is high.
- If we remove no constraints, computing the heuristic is itself as hard as
solving the original problem, which drives up the total time even though
the resulting search would be fast. Hence, as we increase the number of
constraints removed, the total time needed to solve the problem first
decreases and then increases (from the lecture slides).
- From the lecture slides and videos.

Solution Q9) Answer: ABD


- A* is guaranteed to give optimal solutions only when the heuristic used is
admissible; in other cases it may give non-optimal solutions.
- Both algorithms are exponential in the worst case, since it is possible that all
nodes are explored even with A*.
- A* is a systematic search algorithm since it visits each search node exactly
once.
- The worst-case time complexity is still exponential in the size of the search
space, as it potentially explores all reachable nodes.

Solution Q10) Answer: AC


Explanation:
- For greedy best-first search, f(n) = h(n); any evaluation function that gives
the same ordering of nodes as h(n) will exhibit the same behaviour as greedy
best-first search.
- The evaluation functions f(n) = 100 * h(n) and f(n) = h(n)^2 result in
identical behaviour to greedy best-first search: since heuristic values are
non-negative, both preserve the ordering given by h(n).
- The other functions either incorporate the cost to reach the node (g(n)) or
invert the ordering of nodes based on their heuristic values, thereby
altering the search behaviour.
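
A quick check of the ordering argument, using hypothetical heuristic and
path-cost values: any strictly increasing transformation of a non-negative h
preserves the greedy expansion order, while adding g(n) or negating h does not.

```python
h = {"p": 3.0, "q": 7.0, "r": 1.0}        # hypothetical heuristic values
g = {"p": 0.5, "q": 9.0, "r": 6.0}        # hypothetical costs-so-far

by_h    = sorted(h, key=lambda n: h[n])
by_100h = sorted(h, key=lambda n: 100 * h[n])
by_hsq  = sorted(h, key=lambda n: h[n] ** 2)     # order-preserving since h >= 0
by_gh   = sorted(h, key=lambda n: g[n] + h[n])   # A*-style f can reorder nodes
by_negh = sorted(h, key=lambda n: -h[n])         # inverts the order

print(by_h, by_100h, by_hsq)   # identical orderings -> same greedy behaviour
print(by_gh, by_negh)          # different orderings -> different behaviour
```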
