Chapter 7 - Find-S Algorithm, Candidate-Elimination Algorithm, Version Space & Inductive Bias

Find-S Algorithm
It finds the most specific hypothesis that fits all the positive examples; it considers only the positive examples.
The algorithm starts with the most specific hypothesis and generalizes it each time it fails to classify an observed positive training example. Hence Find-S moves from the most specific hypothesis towards the most general hypothesis.
Most specific hypothesis: ⟨∅, ∅, ..., ∅⟩ (no value is acceptable)
Most general hypothesis: ⟨?, ?, ..., ?⟩ (any value is acceptable)

Algorithm
Step 1: Initialize h with the most specific hypothesis
        h0 = ⟨∅, ∅, ∅, ∅, ∅, ∅⟩  (one ∅ per attribute)
Step 2: For each +ve sample:
            for each attribute:
                if attribute value == hypothesis value:
                    ignore
                else:
                    replace it with '?', the most general value

Numerical Example 1
h0 = ⟨∅, ∅, ∅, ∅, ∅⟩
h1 = ⟨Japan, Honda, Blue, 1980, Economy⟩
h2 = h1  (no change)
h3 = ⟨Japan, ?, Blue, ?, Economy⟩
h4 = h3  (no change)
h5 = ⟨Japan, ?, ?, ?, Economy⟩
h6 = ⟨Japan, ?, ?, ?, Economy⟩
h7 = h6  (no change)

Example 2
h0 = ⟨∅, ∅, ∅, ∅, ∅, ∅⟩
h1 = ⟨Sunny, Warm, Normal, Strong, Warm, Same⟩
h2 = ⟨Sunny, Warm, ?, Strong, Warm, Same⟩
h3 = h2  (no change)
h4 = ⟨Sunny, Warm, ?, Strong, ?, ?⟩
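As a concrete illustration, here is a minimal Python sketch of Find-S. The training set below is an assumption on my part (the standard EnjoySport data that reproduces the trace in Example 2); in the code '0' stands for the empty value ∅ and '?' for the most general value.

```python
def find_s(examples):
    """Find-S: return the most specific hypothesis that fits all positive examples."""
    n = len(examples[0][0])
    h = ['0'] * n                      # h0 = <∅, ..., ∅>, the most specific hypothesis
    for x, label in examples:
        if label != 'Yes':             # Find-S ignores negative examples
            continue
        for i in range(n):
            if h[i] == '0':
                h[i] = x[i]            # first positive example: copy its attribute values
            elif h[i] != x[i]:
                h[i] = '?'             # mismatch: generalize this attribute to '?'
    return h

# Assumed EnjoySport training set (not listed in these notes)
data = [
    (('Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'),   'Yes'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'),   'Yes'),
    (('Rainy', 'Cold', 'High',   'Strong', 'Warm', 'Change'), 'No'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'), 'Yes'),
]
print(find_s(data))   # ['Sunny', 'Warm', '?', 'Strong', '?', '?'] -- matches h4 above
```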

Disadvantages of Find-S
It considers only the positive examples.
The hypothesis it outputs may not be the sole hypothesis that fits the complete data.
Consistent hypothesis: a hypothesis h is consistent with a set of training examples D if and only if h(x) = c(x) for each example ⟨x, c(x)⟩ in D.
Consistent(h, D) ≡ (∀ ⟨x, c(x)⟩ ∈ D) h(x) = c(x)

For example, take a hypothesis h1 and check it against the training examples:
for example 1, h1 does not cover the instance and the label is also negative, so h1 is consistent with example 1;
for example 2, h1 covers the instance and the label is positive, so the two agree here as well.
h1 agrees with the target on all the examples, so h1 is consistent.
Now take another hypothesis h2:
for example 1, h2 covers the instance but the label is No, so we can directly say that h2 is not consistent, because the first example itself already contradicts it.
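A minimal sketch of this consistency check, assuming hypotheses are attribute tuples as above; the helper names predict and consistent, and the tiny two-attribute data set, are illustrative choices of mine, not from the notes.

```python
def predict(h, x):
    """h(x): a conjunctive hypothesis classifies x as positive
    iff every attribute is '?' or matches the instance value."""
    return all(hv == '?' or hv == xv for hv, xv in zip(h, x))

def consistent(h, D):
    """Consistent(h, D): h agrees with the label c(x) on every example in D."""
    return all(predict(h, x) == (c == 'Yes') for x, c in D)

D = [(('Sunny', 'Warm'), 'Yes'), (('Rainy', 'Cold'), 'No')]
print(consistent(('Sunny', '?'), D))   # True: agrees with both labels
print(consistent(('?', '?'), D))       # False: it also covers the negative example
```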

Version Space: the version space VS_{H,D} is the subset of the hypotheses from H consistent with the training examples in D.
VS_{H,D} = { h ∈ H | Consistent(h, D) }
H — hypothesis space
D — training examples
List-Then-Eliminate Algorithm: this algorithm is used to obtain the version space.
Step 1: Initially we consider all the available hypotheses:
        VersionSpace ← a list containing every hypothesis in H
Step 2: From this step we keep removing inconsistent hypotheses from the version space:
        for each training example ⟨x, c(x)⟩, remove any hypothesis h for which h(x) ≠ c(x)
Step 3: Output the list of hypotheses left in the version space after checking all the training examples.
Example: take a hypothesis space H = {h1, h2, ...} and a set of training examples D = {x1, x2, x3}.
Now we check every hypothesis against all the training examples in D.
For example, for h1:
h1 on x1 — consistent (✓)
h1 on x2 — consistent (✓)
h1 on x3 — not consistent (✗)
We can see h1 is not consistent with x3, so we eliminate h1. The same procedure is done for all the hypotheses, marking each one ✓ or ✗.
The hypotheses that stay consistent with every training example form the version space.
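A brute-force sketch of List-Then-Eliminate, assuming H is the space of conjunctions in which each attribute is either a concrete value or '?'; the tiny two-attribute domain below is hypothetical, chosen only to keep the enumeration small.

```python
from itertools import product

def covers(h, x):
    """True if hypothesis h classifies instance x as positive."""
    return all(hv == '?' or hv == xv for hv, xv in zip(h, x))

def list_then_eliminate(attribute_values, D):
    # Step 1: the version space initially lists every hypothesis in H
    H = product(*[values + ['?'] for values in attribute_values])
    # Steps 2-3: remove any h with h(x) != c(x); whatever survives is the version space
    return [h for h in H if all(covers(h, x) == (c == 'Yes') for x, c in D)]

attribute_values = [['Sunny', 'Rainy'], ['Warm', 'Cold']]   # hypothetical domain
D = [(('Sunny', 'Warm'), 'Yes'), (('Rainy', 'Cold'), 'No')]
print(list_then_eliminate(attribute_values, D))
# [('Sunny', 'Warm'), ('Sunny', '?'), ('?', 'Warm')]
```

Only the hypotheses that agree with every labelled example survive the elimination; the returned list is the version space.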

Candidate-Elimination Algorithm
Uses the concept of the version space.
Considers both +ve and −ve values (yes and no).
We get both a specific and a general hypothesis.
It is the extended version of the Find-S algorithm.
For positive samples we move from specific to general; for negative samples we move from general to specific.
S0 = {⟨∅, ∅, ..., ∅⟩}
G0 = {⟨?, ?, ..., ?⟩}
Algorithm
Step 1: Load the data set.
Step 2: Initialize the general and specific hypotheses:
        S0 = {⟨∅, ∅, ..., ∅⟩}  (length depends on the number of attributes)
        G0 = {⟨?, ?, ..., ?⟩}
Step 3: If the example is positive (+ve):
            for each attribute:
                if attribute value == hypothesis value:
                    do nothing
                else:
                    replace the attribute value in S with '?' (basically generalizing it)
        else:
            make the general hypothesis more specific.
Example
S0 = {⟨∅, ∅, ∅, ∅, ∅, ∅⟩}, G0 = {⟨?, ?, ?, ?, ?, ?⟩}

1) +ve:
S1 = ⟨Sunny, Warm, Normal, Strong, Warm, Same⟩
G1 = {⟨?, ?, ?, ?, ?, ?⟩}

2) +ve (specific to general):
S2 = ⟨Sunny, Warm, ?, Strong, Warm, Same⟩
G2 = {⟨?, ?, ?, ?, ?, ?⟩}

3) −ve (keep the specific hypothesis as it is):
S3 = ⟨Sunny, Warm, ?, Strong, Warm, Same⟩
G3 = {⟨Sunny, ?, ?, ?, ?, ?⟩, ⟨?, Warm, ?, ?, ?, ?⟩, ⟨?, ?, ?, ?, ?, Same⟩}

4) +ve:
S4 = ⟨Sunny, Warm, ?, Strong, ?, ?⟩
G4 = {⟨Sunny, ?, ?, ?, ?, ?⟩, ⟨?, Warm, ?, ?, ?, ?⟩}

Here, if you notice, we have removed the third hypothesis ⟨?, ?, ?, ?, ?, Same⟩ from G4 because 'Same' is no longer present in the specific hypothesis.
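The trace above can be reproduced with a short Python sketch of the simplified Candidate-Elimination procedure these notes describe (a single specific hypothesis S plus a list of general hypotheses G; the full algorithm also prunes redundant boundary members, which is not needed here). The EnjoySport training set used below is an assumption, chosen because it is the standard data behind this trace.

```python
def covers(h, x):
    """True if hypothesis h classifies instance x as positive."""
    return all(hv == '?' or hv == xv for hv, xv in zip(h, x))

def candidate_elimination(examples):
    n = len(examples[0][0])
    S = ['0'] * n           # S0 = <∅, ..., ∅>, '0' stands for ∅
    G = [['?'] * n]         # G0 = {<?, ..., ?>}
    for x, label in examples:
        if label == 'Yes':                        # positive example: generalize S
            for i in range(n):
                if S[i] == '0':
                    S[i] = x[i]                   # first positive: copy its values
                elif S[i] != x[i]:
                    S[i] = '?'
            G = [g for g in G if covers(g, x)]    # drop general hypotheses that reject a positive
        else:                                     # negative example: specialize G, keep S as it is
            new_G = []
            for g in G:
                if not covers(g, x):
                    new_G.append(g)               # g already excludes the negative example
                    continue
                for i in range(n):
                    # minimal specialization: pin one '?' to the value kept in S
                    if g[i] == '?' and S[i] not in ('0', '?', x[i]):
                        spec = list(g)
                        spec[i] = S[i]
                        new_G.append(spec)
            G = new_G
        print('S =', S, ' G =', G)
    return S, G

# Assumed EnjoySport training set (not listed in these notes)
enjoy_sport = [
    (('Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'),   'Yes'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'),   'Yes'),
    (('Rainy', 'Cold', 'High',   'Strong', 'Warm', 'Change'), 'No'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'), 'Yes'),
]
candidate_elimination(enjoy_sport)
# Final step prints S = ['Sunny', 'Warm', '?', 'Strong', '?', '?']
#                   G = [['Sunny', '?', '?', '?', '?', '?'], ['?', 'Warm', '?', '?', '?', '?']]
```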

Inductive Learning: here we derive rules from examples. It is also known as concept learning; it is how AI systems attempt to use generalized rules to carry out observations.
Deductive Learning: here existing rules are applied to our examples.
Biased hypothesis space: does not consider all types of training examples.
Solution: include all hypotheses.
Unbiased hypothesis space: provide a hypothesis space capable of representing every possible set of examples.

Inductive Bias
Based on some previous examples, the machine finds a generalized rule and applies this rule to all the new examples.
The learner generalizes beyond the observed training examples to classify new, unseen examples.

Inductively inferred from: x ≻ y means that y is inductively inferred from x.
Inductive bias is the set of assumptions a learner uses to predict results for inputs it has not yet encountered; it has some prior assumptions about the task.
An inductive bias allows a learning algorithm to prioritize one solution (or interpretation) over another.