School of Computing
Discussion of Assignment 4
COS351D/204/2007
Outline of syllabus
Chapter 1
Read as background.
Chapter 2
Familiarise yourself with the concepts and terminology.
Chapter 3
Study everything. You have to understand all the algorithms and be able to apply them to a given
problem statement. You do not need to memorise any pseudo-code.
Chapter 4
Study everything except the following sections, which you can omit entirely:
- Memory-bounded heuristic search, pp. 101 – 104.
- 4.4 Local Search in Continuous Spaces, pp. 119 – 129.
You have to understand all the algorithms and be able to apply them to a given problem statement.
You do not need to memorise any pseudo-code.
Chapter 5
Study everything except the following section, which you can omit entirely:
- 5.4 The Structure of Problems, pp. 151 – 155.
You have to understand all the algorithms and be able to apply them to a given problem statement.
You do not need to memorise any pseudo-code.
Chapter 6
Study sections 6.1, 6.2 and 6.3. You have to understand the minimax and alpha-beta algorithms, and
be able to apply them to a given problem statement.
Read sections 6.4, 6.5, 6.6, 6.7 and 6.8. You have to understand the concepts covered in these
sections, but you do not need to be familiar with any particular game such as backgammon, chess or
any card game.
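Since you must be able to apply minimax with alpha-beta pruning to a given problem, a compact sketch may help. This is illustrative only: the `moves` and `value` helpers and the example tree are our own, not the textbook's interface.

```python
import math

def alphabeta(state, depth, alpha, beta, maximizing, moves, value):
    """Minimax with alpha-beta pruning over an abstract game tree."""
    successors = moves(state)
    if depth == 0 or not successors:
        return value(state)
    if maximizing:
        best = -math.inf
        for s in successors:
            best = max(best, alphabeta(s, depth - 1, alpha, beta, False, moves, value))
            alpha = max(alpha, best)
            if alpha >= beta:   # MIN already has a better option elsewhere: prune
                break
        return best
    best = math.inf
    for s in successors:
        best = min(best, alphabeta(s, depth - 1, alpha, beta, True, moves, value))
        beta = min(beta, best)
        if alpha >= beta:       # MAX already has a better option elsewhere: prune
            break
    return best

# The familiar three-branch example: MAX chooses among three MIN nodes.
tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
moves = lambda s: s if isinstance(s, list) else []
value = lambda s: s
print(alphabeta(tree, 2, -math.inf, math.inf, True, moves, value))  # minimax value 3
```

Note how the second and third MIN nodes are cut short as soon as their value can no longer beat the 3 already guaranteed to MAX.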
Chapter 7
Study everything except the following section, which you can omit entirely:
- 7.7 Agents based on propositional logic, pp. 225 – 232.
Many of the concepts introduced in this chapter should already be familiar to you from COS261.
However, there are some important algorithms in this chapter that you will probably not be familiar
with. In particular, you have to understand and be able to apply forward and backward chaining,
resolution, and the DPLL and WALKSAT algorithms.
Chapter 8
Most of the concepts introduced in this chapter should also already be familiar to you. Read the
chapter to refresh your memory. Make sure that you can translate English language sentences and
paragraphs into first-order logic.
Chapter 9
You have to be able to convert an English language paragraph into first-order logic, convert this into
conjunctive normal form, and use resolution to show that a given conclusion follows from the
premises. In order to apply the resolution algorithm, you will need to understand the unification of
terms. You can omit the following sections:
- 9.3 Forward chaining, pp. 280 – 287;
- 9.4 Backward chaining, pp. 287 – 295.
You can also omit everything from p. 300 Completeness of resolution to the end of the chapter.
Chapter 18
Understand the forms of learning and the concepts of inductive learning. You have to be able to
construct a decision tree and reason with the contents thereof. The theory of decision trees is also
important. You can omit the following sections:
- 18.4 Ensemble Learning, pp. 664 – 668.
Chapter 20
Sections 20.1 to 20.4 serve as background to much of the rest of the chapter. Read them and understand
the concepts as far as they pertain to sections 20.5 onwards. Section 20.5 forms the core of the work
that you need to understand fully. You will have to be able to construct a simple neural network and
train it using a given training set. You have to understand the limitations of the various models of
neural networks. You can omit the following sections:
- 20.6 Kernel machines, pp. 749 – 752;
- 20.7 Case Study, pp. 752 – 754.
Solution: Assignment 4
Answer 1
The key distinction between analogical and sentential representations is that an analogical
representation automatically generates consequences that can be “read off” whenever suitable
premises are encoded.
a) Symbols on the map may depend on the scale and type of map. Symbols may typically include
city and town markers, road symbols, lighthouses, historic monuments, etc.
b) When the map creator puts a symbol in a specific location, they say one explicit thing (Hillbrow
tower is here), but the analogical structure of the map representation means that we can
derive many implicit sentences.
Explicit sentences:
- There is a tower called Hillbrow tower here.
- Voortrekker Road runs approximately East-West.
- False Bay exists and is roughly circular.
Implicit sentences:
- The N1 is longer than the N3.
- Johannesburg is north of Cape Town.
- The shortest route from Johannesburg to Cape Town is via the N1.
d) Sentences that are easier to express in the map language: any sentence that can be
expressed easily in English is not going to be a good candidate for this question. Any linguistic
abstraction about the shape of False Bay can probably be expressed equally easily in the
predicate calculus, since that is what it was designed for. Facts such as the precise shape of the
coastline, however, are best expressed in the map language.
- Audio tape recordings. Advantages: simple circuits are required to record and reproduce
sounds. Disadvantages: subject to errors, noise, not easy to separate individual sounds.
- Traditional clock face. Advantages: easy to read quickly, easy to determine remaining
time. Disadvantages: hard to read precisely, cannot represent small units such as
milliseconds.
- Graphs, pie charts, bar charts. Advantages: enormous data compression, easy trend
analysis, easy to communicate information. Disadvantages: imprecise, cannot represent
disjunctive or negated information.
Answer 2
The main point of this exercise is to understand connectives and quantifiers, as well as the use of
predicates, functions, constants, and equality. Let the basic vocabulary be as follows:
d) The best score in Greek is always higher than the best score in French.
∀s ∃x ∀y Score(x, G, s) > Score(y, F, s)
g) There is an agent who sells policies only to people who are not insured.
∃x Agent(x) ∧ ∀y ∀z Policy(y) ∧ Sells(x, y, z) ⇒ (Person(z) ∧ ¬Insured(z))
h) There is a barber who shaves all men in town who do not shave themselves.
∃x Barber(x) ∧ ∀y Man(y) ∧ ¬Shaves(y, y) ⇒ Shaves(x, y)
j) A person born outside the UK, one of whose parents is a UK citizen by birth, is a UK citizen by
descent.
∀x Person(x) ∧ ¬Born(x, UK) ∧
(∃y Parent(y, x) ∧ Citizen(y, UK, birth)) ⇒ Citizen(x, UK, descent)
k) Politicians can fool some of the people all of the time, and they can fool all of the people some
of the time, but they can’t fool all of the people all of the time.
∀x Politician(x) ⇒
(∃y ∀t Person(y) ∧ Fools(x, y, t)) ∧
(∃t ∀y Person(y) ⇒ Fools(x, y, t)) ∧
¬(∀t ∀y Person(y) ⇒ Fools(x, y, t))
Answer 3
Answer 4
a) ∀x Horse(x) ⇒ Animal(x)
∀x ∀h Horse(x) ∧ HeadOf(h, x) ⇒ ∃y Animal(y) ∧ HeadOf(h, y)
b) ¬Horse(x) ∨ Animal(x) (1)
Horse(G) (2)
HeadOf(H, G) (3)
¬Animal(y) ∨ ¬HeadOf(H, y) (4)
Here 1 is derived from the first sentence in (a), while the other three come from the negation of the
second sentence, since we prove by refutation. H and G are Skolem constants. Resolving 1 with 2
(substituting x/G) yields Animal(G); resolving this with 4 (substituting y/G) yields ¬HeadOf(H, G);
resolving with 3 yields the empty clause, which completes the proof.
Answer 5
There are many possible answers to this question. The important point is that real-world situations
create a huge number of possible exceptions and the answer should reflect a sufficient number of
them to be usable in the real world. Below is one possible decision tree.
[Figure: a decision tree. The root tests FrontOfQueue?. If No, CarAheadMoving? decides (No → NO,
Yes → YES). If Yes, the tree tests IntersectionBlocked?, CrossTraffic? and Pedestrians? in turn
(Yes → NO at each), and finally Turning?, where going straight gives YES and a right or left turn is
subject to a further test (No → YES, Yes → NO) before deciding.]
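The tree above can be written as a short function. This is a sketch of one plausible tree: the parameter names are illustrative, and the final oncoming-traffic test for turns is an assumption, since the lower test labels are not fully legible in the figure.

```python
def should_advance(front_of_queue, car_ahead_moving, intersection_blocked,
                   cross_traffic, pedestrians, turning, oncoming_traffic):
    """Decide whether to move forward. `turning` is one of "no", "left", "right"."""
    if not front_of_queue:
        # Not at the front of the queue: simply follow the car ahead.
        return car_ahead_moving
    # At the front: any blockage, cross traffic or pedestrians means wait.
    if intersection_blocked or cross_traffic or pedestrians:
        return False
    if turning == "no":
        return True
    # Turning left or right: yield to oncoming traffic (assumed final test).
    return not oncoming_traffic

print(should_advance(True, False, False, False, False, "left", True))  # False
```

Each `if` corresponds to one internal node of the tree, read from the root down.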
Answer 6
XOR, and in fact any Boolean function, is easiest to construct using step-function units. Because XOR
is not linearly separable, we need a hidden layer; it turns out that a single hidden node suffices. To
design the network we can think of the XOR function as OR with the AND case (where both inputs are
true) ruled out. Thus the hidden layer computes AND while the output layer computes OR, but we set
the weight between the hidden node and the output layer to a negative value, so as to cancel the effect
of the AND. The network below illustrates this idea.
[Network figure: each input connects both to the hidden unit (threshold t = 0.5) and directly to the
output unit (threshold t = 0.2) with weight 0.3; the hidden unit connects to the output unit with
weight -0.6.]
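The construction can be checked with a short script using the weights above (a sketch; the `step` helper stands in for the threshold unit):

```python
from itertools import product

def step(x, t):
    """Threshold (step) activation: fire iff the weighted input reaches t."""
    return 1 if x >= t else 0

def xor_net(x1, x2):
    # Hidden unit computes AND: 0.3 + 0.3 = 0.6 >= 0.5 only when both inputs are 1.
    hidden = step(0.3 * x1 + 0.3 * x2, 0.5)
    # Output unit computes OR, with the AND case cancelled by the -0.6 weight.
    return step(0.3 * x1 + 0.3 * x2 - 0.6 * hidden, 0.2)

for x1, x2 in product([0, 1], repeat=2):
    print(x1, x2, "->", xor_net(x1, x2))  # reproduces the XOR truth table
```

On (1, 1) the direct contribution 0.6 is exactly cancelled by the hidden unit's -0.6, so the output stays below the 0.2 threshold.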
The following figure illustrates the output of the network above as a function of its two inputs x1 and x2.
Weights and other parameter values are as in the network above, and the plot is restricted to the unit
cube.
[Figure: surface plot of the network output over inputs x1, x2 ∈ [0, 1].]
Unisa
PO Box 392, UNISA, 0003