Non Monotonic Reasoning System
Limitations of Monotonic Systems
Logic-based systems are monotonic in nature, i.e. once a fact is derived it remains true under all circumstances, and adding new facts never invalidates earlier conclusions.
AI provides solutions for problems whose facts and rules of inference are stored in a knowledge base. But, as mentioned earlier, data and knowledge are incomplete in nature.
E.g. we say Rohini is a bird, and the conclusion arrived at is that Rohini can fly.
But on the other hand, it is not necessary that Rohini can fly, for a variety of reasons:
-- Rohini could be an ostrich.
-- Rohini's wings are broken.
-- Rohini is too weak to fly.
-- Rohini could be caged.
-- Rohini could be a dead bird, etc.
Golden Rule of Default Reasoning
It is clear that it is not possible for a system to have all the information needed to arrive at a decision. Guessing of missing information is permitted as long as it does not contradict the existing facts; default reasoning assists in the generation of these guesses. The golden rule of default reasoning is:
Consider only those facts whose existence is required for getting a clear picture of the solution. This principle of avoiding all unnecessary details and taking into account only those that are absolutely essential is called circumscription.
Non Monotonic Reasoning
This formalism helps in representing axiomatically notions like "if an animal is a bird, then unless it is proven otherwise, it can fly".
If q is a formula in first-order logic, then Mq is also a formula in the logic. The modal operator M is read "is consistent": Mq holds if q is consistent with all current beliefs, in the sense that the negation of q cannot be proved from the current information.
E.g.
∀x: [Bird(x) ∧ M fly(x) → fly(x)]
This holds until it is possible to prove that x cannot fly.
Also consider:
Bird(Rohini)
From these we have fly(Rohini), because we cannot prove that Rohini cannot fly.
Let's add another axiom:
∀x: [Ostrich(x) → ~fly(x)]
The addition of this axiom alone does not have any impact, and the original deduction fly(Rohini) still holds good.
If new information is added
Ostrich(Rohini)
then fly(Rohini) can no longer be inferred. The simple addition of a new fact makes the system non-monotonic, because previously held inferences are no longer valid.
Non-monotonic reasoning systems are more complex than monotonic reasoning systems. Monotonic reasoning systems generally do not provide facilities for altering facts or deleting rules, because doing so would have an adverse effect on the reasoning process.
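The retraction behaviour described above can be sketched in a few lines of code. This is a minimal illustration, not a real default-logic prover; the knowledge base, fact tuples and the `can_fly` check are all hypothetical names chosen for this example.

```python
# A minimal sketch of non-monotonic (default) reasoning over a tiny
# knowledge base of ground facts, e.g. ("Bird", "Rohini").

class KB:
    def __init__(self):
        self.facts = set()

    def tell(self, fact):
        self.facts.add(fact)

    def can_fly(self, x):
        # Default rule: Bird(x) & M fly(x) -> fly(x).
        # M fly(x) holds when ~fly(x) is not provable; here the only
        # way to prove ~fly(x) is the axiom Ostrich(x) -> ~fly(x).
        if ("Bird", x) not in self.facts:
            return False
        return ("Ostrich", x) not in self.facts

kb = KB()
kb.tell(("Bird", "Rohini"))
print(kb.can_fly("Rohini"))   # True: nothing contradicts the default

kb.tell(("Ostrich", "Rohini"))
print(kb.can_fly("Rohini"))   # False: the new fact retracts the conclusion
```

Note that merely adding a fact changed a previously derived conclusion, which is exactly what makes the system non-monotonic.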
Bayes Theorem
This theorem provides a method for reasoning about partial beliefs. Every event that is happening or likely to happen is quantified by pieces of knowledge about the event, and the rules of probability dictate how these numerical values are to be calculated.
To illustrate this, let S stand for the statement "the horse Challenger with the jockey Abc will win the race this season".
One can associate with this statement two probability values.
1. Probability of Challenger winning the race: Prob(challenger-winning) = 60%. This is called the prior probability, because we do not know the ground situation as of today.
2. Once we have knowledge about the jockey, ground conditions and other relevant information, the probability might be revised. This probability value is called the posterior probability:
Prob(challenger-winning | jockey is Abc) = 65%
When there is a change in the belief about the condition of the horse, the jockey or the ground, the probability values are also revised.
The basic rules of probability are:
1. The probability of a statement always lies between zero and unity.
2. The probability of a sure proposition is unity.
3. Prob(A or B) = Prob(A) + Prob(B) if A and B are mutually exclusive.
4. Prob(Not A) = 1 - Prob(A)
In general, Bayes' theorem is:
Prob(Ei|Q) = Prob(Ei)*Prob(Q|Ei) / [Prob(E1)*Prob(Q|E1) + Prob(E2)*Prob(Q|E2) + ... + Prob(En)*Prob(Q|En)]
Let E1, E2 and E3 be the events that a bulb selected at random is made by machine M1, M2 and M3 respectively, and let Q denote that it is defective. Suppose M1, M2 and M3 produce 30%, 30% and 40% of the bulbs, and 2%, 3% and 4% of their respective outputs are defective.
Then
Prob(E1|Q) = Prob(E1)*Prob(Q|E1) / [Prob(E1)*Prob(Q|E1) + Prob(E2)*Prob(Q|E2) + Prob(E3)*Prob(Q|E3)]
= (0.3*0.02) / [(0.3*0.02) + (0.3*0.03) + (0.4*0.04)]
= 0.006 / 0.031
= 0.1935
Similarly,
Prob(E2|Q) = 0.009 / 0.031 = 0.2903
Prob(E3|Q) = 0.016 / 0.031 = 0.5161
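The defective-bulb calculation can be checked directly. The sketch below just re-runs the arithmetic above; the dictionary names are illustrative.

```python
# Verifying the defective-bulb example with Bayes' theorem.

priors = {"E1": 0.3, "E2": 0.3, "E3": 0.4}      # share of bulbs per machine
defect = {"E1": 0.02, "E2": 0.03, "E3": 0.04}   # P(Q | Ei): defect rate

# Total probability that a randomly chosen bulb is defective.
p_q = sum(priors[e] * defect[e] for e in priors)

# Posterior P(Ei | Q) for each machine, by Bayes' theorem.
posterior = {e: priors[e] * defect[e] / p_q for e in priors}

print({e: round(p, 4) for e, p in posterior.items()})
# {'E1': 0.1935, 'E2': 0.2903, 'E3': 0.5161}
```

As expected, the three posteriors sum to 1: given that the bulb is defective, it must have come from one of the three machines.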
Reasoning about Certainty Factors
Probability-based reasoning adopts Bayes' theorem for handling uncertainty. To apply Bayes' theorem, one needs to estimate prior and conditional probabilities, which are difficult to obtain in many domains. To overcome this problem, the developers of the MYCIN system adopted certainty factors.
A certainty factor (CF) is a numerical estimate of the belief or disbelief in a conclusion in the presence of a set of evidence. Various methods of using CFs have been adopted.
1. Use a scale from 0 to 1, where 0 represents total disbelief and 1 stands for total belief. Values between 0 and 1 represent varying degrees of belief and disbelief.
In expert systems, every production rule has a certainty factor associated with it. Herewith we give a typical rule:
IF there is enough fuel in the vehicle
AND the ignition system is working properly
AND the vehicle does not start
At any point of time, a rule should either enhance the belief or the disbelief in a conclusion. In most expert systems, multiple rules exist that relate to a given conclusion. If more than one rule is involved, then one has to find out the composite certainty factor. It is given by:
CF(composite)[c, e(all)] = MB[c, e(supp)] - MD[c, e(aga)]
Where:
c: conclusion
CF(composite)[c, e(all)]: composite or net CF on conclusion c after taking into account all evidences.
MB[c, e(supp)]: consolidated measure of belief in conclusion c given all evidences supporting it (e supp).
MD[c, e(aga)]: consolidated measure of disbelief in conclusion c given all evidences against it (e aga).
When belief in a conclusion comes from two sources, the measures of belief are combined incrementally; in MYCIN this is done as
MB[c, s1&s2] = MB[c, s1] + MB[c, s2] * (1 - MB[c, s1])
Here c is the conclusion, s1 & s2 are the two sources, and MB[c, s1&s2] is the measure of belief in conclusion c based on s1 & s2.
Similarly:
1. The CF of the conjunction of several facts is taken to be the minimum of the CFs of the individual facts.
2. The CF for the conclusion is obtained by multiplying the CF of the rule with the minimum CF of the IF part.
3. The CF for a fact produced as the conclusion of one or more rules is the maximum of the CFs produced. For example, if two rules conclude the same fact with CFs 0.195 and 0.35, the resulting CF
= Max(0.195, 0.35)
= 0.35
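The three combination rules above can be expressed as small helper functions. The function names are hypothetical, chosen only for this sketch.

```python
# A sketch of the three CF combination rules for rule-based systems.

def cf_antecedent(cfs):
    # Rule 1: CF of a conjunction of facts = minimum of the individual CFs.
    return min(cfs)

def cf_conclusion(rule_cf, antecedent_cfs):
    # Rule 2: CF of the conclusion = rule CF * minimum CF of the IF part.
    return rule_cf * cf_antecedent(antecedent_cfs)

def cf_combine(cfs_for_same_fact):
    # Rule 3: a fact concluded by several rules takes the maximum CF produced.
    return max(cfs_for_same_fact)

# Two rules conclude the same fact with CFs 0.195 and 0.35:
print(cf_combine([0.195, 0.35]))   # 0.35

# A rule with CF 0.7 whose three conditions hold with CFs 1.0, 1.0 and 0.8
# (as in the MYCIN example that follows):
print(cf_conclusion(0.7, [1.0, 1.0, 0.8]))
```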
Let's take a real-life example. In MYCIN, there is a production rule which states:
IF the organism is gram positive
AND the organism is a coccus
AND the organism is growing in chains
THEN the organism is streptococcus, CF = 0.7
The interpretation is:
"If conditions 1, 2 and 3 are 100% certain, then the organism is only 70% certain to be streptococcus."
Assume that the user is 100% certain of conditions 1 and 2. The system then poses the question:
MYCIN: Did the organism grow in clumps, chains or pairs?
User: chains (8), clumps (4), pairs (-5)
The answer implies that the user is certain to 0.8 for chains, 0.4 for clumps and -0.5 for pairs.
Since condition 3 is not 100% certain but only 80% certain, the new value is
MB[h1, s1&s2&s3] = 0.7 * 0.8 = 0.56
Since all the values of MB are greater than zero, the MD values are equal to zero.
Hence the new CF = 0.56 - 0 = 0.56.
Once set membership has been redefined in this way, it is possible to define a reasoning system based on techniques for combining distributions. Such reasoners have been applied in control systems for devices as diverse as trains and washing machines.
Dempster-Shafer Theory
This theory considers sets of propositions and assigns to each of them an interval [Bel, Pl] in which the degree of belief must lie. Belief (denoted by Bel) measures the strength of the evidence in favor of a set of propositions. It ranges from 0 (indicating no evidence) to 1 (denoting certainty).
Plausibility (Pl) is defined to be
Pl(s) = 1 - Bel(~s)
It also ranges from 0 to 1 and measures the extent to which evidence in favor of ~s leaves room for belief in s. In particular, if we have certain evidence in favor of ~s, then Bel(~s) will be 1 and Pl(s) will be 0, so the only possible value of Bel(s) is also 0.
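The belief/plausibility relationship can be illustrated with a short sketch. This takes Bel values for a proposition and its negation as given inputs (a full Dempster-Shafer implementation would derive them from mass assignments); the function names are illustrative.

```python
# A small sketch of the Dempster-Shafer belief interval [Bel(s), Pl(s)].

def plausibility(bel_not_s):
    # Pl(s) = 1 - Bel(~s)
    return 1.0 - bel_not_s

def belief_interval(bel_s, bel_not_s):
    # The degree of belief in s must lie in the interval [Bel(s), Pl(s)].
    return (bel_s, plausibility(bel_not_s))

# No evidence either way: the interval is maximally uncertain.
print(belief_interval(0.0, 0.0))   # (0.0, 1.0)

# Certain evidence for ~s: Pl(s) = 0, so Bel(s) can only be 0.
print(belief_interval(0.0, 1.0))   # (0.0, 0.0)
```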