
COLLEGE OF ENGINEERING & TECHNOLOGY
Department : Computer Science
Lecturer   : Dr. Ibrahim Imam
Course     : Introduction to Artificial Intelligence
Course No. : CC 511 & CS 366        Sheet : 7
Q1. The following and-or trees were extracted from a set of decision rules.
The value associated with each node is the probability of the decision rule
obtained from that node and its children. Assume that all attributes have binary domains.
a) Derive all distinct decision rules from these and-or trees.
b) If you are given the case (A = 1) (B = 0) (E = 1) (F = 0) (D = 1) (H = 1), what
decision can you derive using backward chaining? Draw the list of conditions,
actions, and intermediate actions.
c) Design a neural network where D and I are the output attributes and all other
attributes are inputs.
d) Design a Bayesian network of the decision rules you extracted in a). Draw the truth
table for each node with one or more parents. Consider the probability of an event
being caused by something other than its parents to be 0.1.
e) Assume that P(A) = 0.3; P(B) = 0.7; P(E) = 0.6; P(F) = 0.2; P(D) = 0.3; P(H) = 0.5.
Calculate the probability P(~A, B, E, ~F, D, ~C, G, I, K).
f) Which probability is bigger, P(A, B, E, F, D, C, G, I, K) or P(~A, ~B, ~E, ~F, ~D,
~C, ~G, ~I, ~K), and why? (Think. Do not calculate.)
g) Suppose we can move between the following states:
from A to B
from A to K
from E to H
from K to B
from D to F
from K to H
- Start from the Bayesian network you obtained in d) and add the above
connections. Consider the new graph as a maze. Assume that the probability of moving from
one state to another follows a uniform distribution. Draw the transition matrix and
calculate the probability that you can move from room K to room G in two steps, P_KG(2).
What is the probability that a robot starts from state H and returns to room K for the first
time in three steps, f_HK(3)?
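Both quantities above are mechanical once the transition matrix is written down. The Python sketch below uses a made-up 3-room maze (states A, B, C and their doors are hypothetical, not the network from part d)); it only illustrates the method: P_xy(2) is entry (x, y) of P·P, and first-passage probabilities satisfy f_xy(n) = P_xy(n) - sum over m < n of f_xy(m)·P_yy(n - m).

```python
# Sketch on a hypothetical 3-room maze (NOT the network from part d).
# From each room, every outgoing door is equally likely (uniform distribution).
maze = {"A": ["B", "C"], "B": ["C"], "C": ["A", "B"]}
states = sorted(maze)
n = len(states)
idx = {s: i for i, s in enumerate(states)}

# Build the transition matrix row by row.
P = [[0.0] * n for _ in range(n)]
for s, doors in maze.items():
    for t in doors:
        P[idx[s]][idx[t]] = 1.0 / len(doors)

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)  # two-step transition matrix: P_xy(2) = [P.P]_xy

def first_passage(i, j, steps):
    """f_ij(n): probability of reaching j from i for the FIRST time at step n,
    via f_ij(n) = P_ij(n) - sum_{m<n} f_ij(m) * P_jj(n - m)."""
    powers = [None, P]
    for _ in range(2, steps + 1):
        powers.append(matmul(powers[-1], P))
    f = {}
    for k in range(1, steps + 1):
        f[k] = powers[k][i][j] - sum(f[m] * powers[k - m][j][j]
                                     for m in range(1, k))
    return f[steps]

print(P2[idx["A"]][idx["B"]])                 # P_AB(2) = 0.25 in this toy maze
print(first_passage(idx["A"], idx["B"], 3))   # f_AB(3) = 0.125 in this toy maze
```

The same two routines answer P_KG(2) and f_HK(3) once `maze` is replaced with the actual graph from part d) plus the listed connections.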
Q2. Assume you have the following set of rules and associated probabilities:
R1. (a = 0) (b = 0) (c = 1) → (f = 1)   0.5
R2. (a = 0) (b = 1) (c = 0) → (f = 1)   0.4
R3. (a = 1) (b = 0) (c = 0) → (f = 1)   0.6
R4. (d = 1) (e = 0) (c = 0) → (g = 0)   0.5
R5. (d = 0) (e = 1) (c = 0) → (g = 0)   0.6
R6. (d = 0) (e = 0) (c = 1) → (g = 0)   0.4
R7. (f = 1) (g = 1) (h = 0) → (m = 1)   0.5
R8. (f = 0) (g = 0) (h = 0) → (m = 1)   0.5
R9. (f = 0) (g = 0) (h = 1) → (m = 1)   0.8
R10. (n = 1) → (a = 0)   0.8
a) Can you transform these decision rules into one decision tree? If yes, show how.
b) If you are given the case (n = 1) (g = 1) (b = 1) (e = 1) (c = 0) (d = 1) (h = 0), what
decision can you derive using backward chaining? Explain your answer using
queues and and-or trees.
c) Suppose 0 means “false” and 1 means “true”. Draw a Bayesian network to represent
the causality relationship in these decision rules using the associated probabilities.
Calculate the truth table for each child node in the network (Hint: consider
P(false, false, false) = 0.1).
d) Using the Bayesian network in part c), assume that P(c=1) = P(n=1) = P(h=1) = 0.6 and
P(b=1) = P(d=1) = P(e=1) = 0.4. Calculate the probability
P((n = 1) (g = 1) (b = 1) (e = 1) (c = 0) (d = 1) (m = 1) (f = 1) (a = 0) (h = 0)).
In your opinion, what is the difference between backward chaining and a Bayesian network?
e) What is the domain of each attribute? Design two different neural networks such that
attributes that appear as actions or intermediate actions are output nodes, and all other
attributes are input nodes. How would the case in part (d) of this question be entered into
each neural network?
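As a sanity check on part b), backward chaining over rules of this shape can be sketched in a few lines of Python. The rule list below is a hand transcription of R1–R10 and the case is the one given in part b); `prove` is a minimal sketch (it assumes the rule set is acyclic, which holds here) and does not replace the queue and and-or-tree explanation the question asks for.

```python
# Rules transcribed from Q2: (set of premises, conclusion).
rules = [
    ({("a", 0), ("b", 0), ("c", 1)}, ("f", 1)),   # R1
    ({("a", 0), ("b", 1), ("c", 0)}, ("f", 1)),   # R2
    ({("a", 1), ("b", 0), ("c", 0)}, ("f", 1)),   # R3
    ({("d", 1), ("e", 0), ("c", 0)}, ("g", 0)),   # R4
    ({("d", 0), ("e", 1), ("c", 0)}, ("g", 0)),   # R5
    ({("d", 0), ("e", 0), ("c", 1)}, ("g", 0)),   # R6
    ({("f", 1), ("g", 1), ("h", 0)}, ("m", 1)),   # R7
    ({("f", 0), ("g", 0), ("h", 0)}, ("m", 1)),   # R8
    ({("f", 0), ("g", 0), ("h", 1)}, ("m", 1)),   # R9
    ({("n", 1)}, ("a", 0)),                       # R10
]

def prove(goal, facts):
    """Try to establish goal = (attribute, value) from the given facts,
    chaining backwards through any rule that concludes the goal."""
    if goal in facts:
        return True
    for premises, conclusion in rules:
        if conclusion == goal and all(prove(p, facts) for p in premises):
            return True
    return False

case = {("n", 1), ("g", 1), ("b", 1), ("e", 1), ("c", 0), ("d", 1), ("h", 0)}
# m = 1 is derivable: R7 needs f=1, g=1, h=0; f=1 follows from R2, whose
# premise a=0 follows from R10 and the fact n=1.
print(prove(("m", 1), case))   # True
```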
Q3. Suppose you have the following probabilities:
P(x | a, ~b, ~c) = 0.2 P(y | a, ~d, ~c) = 0.3
P(x | ~a, b, ~c) = 0.4 P(y | ~a, d, ~c) = 0.5
P(x | ~a, ~b, c) = 0.6 P(y | ~a, ~d, c) = 0.7
P(z | x, ~y, ~t) = 0.5 P(z | ~x, y, ~t) = 0.4
P(z | ~x, ~y, t) = 0.3 P(a) = P(t) = 0.8
P(d) = P(b) = 0.3
P(c) = 0.4
a. Draw the corresponding Bayesian Network, and obtain all truth tables for all nodes.
b. Calculate the probability P(~z, y, x, ~a, b, ~c, d, ~t).
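The joint probability in part b factors along the network as P(~z, y, x, ~a, b, ~c, d, ~t) = P(~z | x, y, ~t) P(y | ~a, d, ~c) P(x | ~a, b, ~c) P(~a) P(b) P(~c) P(d) P(~t). A minimal Python sketch of that product follows; note that P(z | x, y, ~t) is not among the listed entries, so the 0.9 below is a placeholder to be replaced by the truth-table entry derived in part a.

```python
# Factorised joint probability for part (b). Each factor is P(node | parents).
p_a, p_b, p_c, p_d, p_t = 0.8, 0.3, 0.4, 0.3, 0.8
p_x = 0.4                      # P(x | ~a, b, ~c), given above
p_y = 0.5                      # P(y | ~a, d, ~c), given above
p_z_given_x_y_not_t = 0.9      # PLACEHOLDER: not given; take it from part (a)

joint = ((1 - p_z_given_x_y_not_t)            # ~z given x, y, ~t
         * p_y                                # y given ~a, d, ~c
         * p_x                                # x given ~a, b, ~c
         * (1 - p_a) * p_b * (1 - p_c) * p_d * (1 - p_t))
print(joint)
```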
Q4. Assume you have the following set of decision rules with their probabilities:
R1. (a = 0) (b = 1) (c = 0) → (d = 0)   0.4
R2. (a = 1) (b = 0) (c = 1) → (d = 1)   0.4
R3. (a = 1) (b = 0) (c = 0) → (d = 1)   0.8
R4. (e = 1) (f = 1) (g = 1) → (h = 0)   0.8
R5. (e = 1) (f = 1) (g = 0) → (h = 1)   0.4
R6. (e = 1) (f = 0) (g = 1) → (h = 1)   0.4
R7. (d = 0) (h = 1) (i = 0) → (k = 0)   0.2
R8. (d = 1) (h = 1) (i = 0) → (k = 1)   0.4
R9. (d = 0) (h = 1) (i = 1) → (k = 1)   0.2
Assume that all attributes have binary domain.
a) Transform these 9 decision rules into one decision tree.
b) If you are given the case (a = 0) (b = 1) (c = 0) (e = 1) (f = 1) (g = 0) (i = 1), what
decision(s) can you derive using backward chaining? Explain your answer using
and-or trees.
c) Can you derive a decision for the case (a = 1) (b = 1) (c = 1) (e = 0) (f = 1) (g = 0)
(i = 1) using either forward or backward chaining? (Your answer is needed for part g.)
d) Suppose a database D is created using all attributes, where “k” is the decision
attribute. The data is partitioned into two parts: training and testing. Design two
different neural networks to classify the two decisions (I need only the input and
output layers).
e) Suppose the given decision rules represent a causality model. Draw a Bayesian
network to represent the causality model given by these rules. Draw the truth table
for each caused node (i.e., a node with one or more parents). Consider the
probability of a null causality to be 0.1 for all nodes (i.e., the probability that an event
occurs without the occurrence of its causing events).
f) Consider the Bayesian network you drew in part (e). Assume that P(a) = 0.3; P(b) =
0.7; P(c) = 0.6; P(e) = 0.2; P(f) = 0.3; P(g) = 0.5; P(i) = 0.5. Calculate the
probability P(~a, b, ~c, d, ~e, f, ~g, h, ~i, k)
g) Consider the Bayesian network in (e) and (f). Can you derive a probable decision for
the case in (c)? If yes, show how. From your answer, comment on the difference in
reasoning between Expert Systems and Bayesian Networks.
h) Given the Bayesian network in (e), suppose you are given an additional set of
causality rules:
R10. (k = 0) → (i = 0)
R11. (b = 1) → (a = 1)
R12. (c = 1) → (b = 1)
R13. (e = 1) → (c = 0)
R14. (f = 1) → (e = 1)
R15. (g = 1) → (f = 1)
R16. (i = 1) → (g = 1)
- Redraw the Bayesian network in (e) and add the seven new links to it. Consider
the new network as a maze whose directed links indicate that you can move from one state
to another. Assume that the probability of moving from one state to another follows a
uniform distribution. Use Markov chains as a model for planning. Draw the transition
matrix and calculate the probability that you can move from state (f) to state (k) in two
steps, P_fk(2). What is the probability that a robot starts from room k and returns to
room k for the first time in five steps, f_kk(5)?
i) Consider the graph you obtained in part (h) as your search space. The initial state is
e. List the first 10 states each of the following algorithms will visit: depth-first and
breadth-first. Which algorithm can reach state c first?
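For part (i), the two visit orders differ only in the frontier data structure: a stack for depth-first, a queue for breadth-first. The sketch below runs both on a small hypothetical graph (the adjacency list is made up, not the network from part h, and it has fewer than 10 nodes); ties are broken alphabetically.

```python
from collections import deque

# Hypothetical adjacency list; replace with the graph from part (h).
graph = {"e": ["c", "f"], "c": ["b"], "f": ["g"], "b": ["a"], "g": ["i"],
         "a": [], "i": []}

def dfs(start):
    order, stack, seen = [], [start], set()
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        order.append(node)
        # Push neighbours in reverse so the alphabetically first is expanded first.
        stack.extend(sorted(graph[node], reverse=True))
    return order

def bfs(start):
    order, queue, seen = [], deque([start]), {start}
    while queue:
        node = queue.popleft()
        order.append(node)
        for nbr in sorted(graph[node]):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return order

print(dfs("e")[:10])
print(bfs("e")[:10])
```

On this toy graph, depth-first reaches deep nodes earlier while breadth-first sweeps level by level; comparing the two printed orders shows which search meets a given state first.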
j) Suppose the cost of moving from one node to another is given by (1/number of
children). Is the best-first search algorithm faster than the breadth-first algorithm in
reaching any node from node e? Explain your answer.
k) Suppose a database is created from the first 9 decision rules. How many records are
in the data? If you are given the case in part (c), what decision can you derive using the
K-Nearest-Neighbor algorithm with the normalized difference function? Solve it for
K = 1 and K = 3.
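Part (k) uses the normalized difference distance, d(x, y) = sum over i of |x_i - y_i| / range_i, which reduces to the Hamming distance when every attribute is binary (range 1). The sketch below runs K-Nearest-Neighbor with that distance on a few made-up binary records, not the actual database built from R1–R9.

```python
from collections import Counter

# Hypothetical (attribute vector, decision) records; replace with the
# database derived from the 9 rules.
data = [
    ([0, 1, 0], "k0"),
    ([1, 0, 1], "k1"),
    ([1, 0, 0], "k1"),
    ([0, 0, 1], "k0"),
]

def normalised_diff(x, y, ranges):
    """Normalized difference: sum of per-attribute differences / range."""
    return sum(abs(a - b) / r for a, b, r in zip(x, y, ranges))

def knn(query, k):
    ranges = [1] * len(query)   # binary attributes: range = 1 for each
    ranked = sorted(data, key=lambda rec: normalised_diff(query, rec[0], ranges))
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

print(knn([1, 1, 0], 1))   # "k0": nearest record is ([0, 1, 0], "k0")
print(knn([1, 1, 0], 3))   # "k1": the 3 nearest vote k0, k1, k1
```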
l) Consider the data in part (k), and derive 4 association rules with frequency greater
than 1/17.
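The frequency (support) of an association rule is the fraction of records that satisfy all of its attribute-value pairs (antecedent and consequent together). A minimal sketch on made-up records, not the database from part (k):

```python
# Hypothetical binary records; replace with the database from part (k).
records = [
    {"a": 1, "b": 1, "c": 0},
    {"a": 1, "b": 1, "c": 1},
    {"a": 0, "b": 1, "c": 0},
    {"a": 1, "b": 0, "c": 0},
]

def support(rule):
    """rule maps attribute -> required value (antecedent + consequent)."""
    hits = sum(all(rec[k] == v for k, v in rule.items()) for rec in records)
    return hits / len(records)

print(support({"a": 1, "b": 1}))   # 2 of 4 records match: 0.5
```

A rule qualifies for part (l) when its support exceeds the stated threshold of 1/17.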