AI and KR
What is KR & R?
Knowledge and Reasoning:
Humans are very good at acquiring new information by combining raw knowledge and experience with reasoning.
AI-slogan: “Knowledge is power” (or “Data is power”?)
Examples:
Medical diagnosis --- a physician diagnosing a patient infers the disease based on the knowledge he/she acquired as a student, from textbooks, and from prior cases.
Representation
How can we represent knowledge in a machine?
Knowledge Inference
This activity involves the design of software that enables the computer to make inferences based on the stored knowledge and the specifics of a problem.
Explanation and Justification
What is Logic?
Reasoning about the validity of arguments.
An argument is valid if its conclusion follows logically from its premises – even if the argument doesn't actually reflect the real world.
Logical Operators
And: ∧
Or: ∨
Not: ¬
Implies: → (if… then…)
Iff: ↔ (if and only if)
Translating between English and Logic
For example:
It is Raining and it is Thursday:
R ∧ T
R means "It is Raining", T means "It is Thursday".
Truth Tables
Tables that show truth values for all possible inputs to a logical
operator.
For example:
A ∧ A ≡ A : Idempotence law
A ∨ A ≡ A : Idempotence law
A ∧ (B ∧ C) ≡ (A ∧ B) ∧ C : Associative law
A ∨ (B ∨ C) ≡ (A ∨ B) ∨ C : Associative law
A ∧ (B ∨ C) ≡ (A ∧ B) ∨ (A ∧ C) : Distributive law
A ∧ (A ∨ B) ≡ A : Absorption law
A ∨ (A ∧ B) ≡ A : Absorption law
A ∧ true ≡ A    A ∧ false ≡ false
A ∨ true ≡ true    A ∨ false ≡ A
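These equivalence laws can be verified mechanically by enumerating every truth assignment; a minimal Python sketch (the `equivalent` helper is an illustrative assumption, not from the slides):

```python
from itertools import product

def equivalent(f, g, nvars):
    """Truth-table equality: f and g agree on every assignment of nvars variables."""
    return all(f(*vals) == g(*vals) for vals in product([True, False], repeat=nvars))

# Idempotence: A ∧ A ≡ A
idempotence = equivalent(lambda a: a and a, lambda a: a, 1)
# Distributive: A ∧ (B ∨ C) ≡ (A ∧ B) ∨ (A ∧ C)
distributive = equivalent(lambda a, b, c: a and (b or c),
                          lambda a, b, c: (a and b) or (a and c), 3)
# Absorption: A ∧ (A ∨ B) ≡ A
absorption = equivalent(lambda a, b: a and (a or b), lambda a, b: a, 2)
```

Each check enumerates all 2^n rows of the truth table, exactly as one would by hand.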
Propositional Logic
Most of the statements we make are propositions.
Ex 1: Hari is hardworking.
Ex 2: If Hari is hardworking and Hari is intelligent, then Hari scores good marks.
Any proposition in propositional logic can have one of two truth values: True or False.
P: “It is raining.”
Q: “Jack is having tea.”
Questions
1. It rains in July : P
2. If it rains today and one doesn't carry an umbrella, then he will be drenched : (P ∧ Q) → R
Rules: To simplify propositional sentences
p ∨ ¬p ….. Tautology (true under every assignment)
p ∨ p ….. Contingency (truth depends on p)
Prove that q ∧ ¬(p → q) is a contradiction without truth tables:
q ∧ ¬(p → q)
≡ q ∧ ¬(¬p ∨ q)    (implication elimination)
≡ q ∧ (p ∧ ¬q)    (De Morgan's law)
≡ (q ∧ ¬q) ∧ p    (commutativity and associativity)
≡ false ∧ p
≡ false
Hence q ∧ ¬(p → q) is a contradiction.
Validity
If some row of the truth table makes every premise column TRUE while the conclusion column is FALSE, then the argument is INVALID. Otherwise, the argument is VALID.
Validity
If ducks sink, then ducks are made of
small rocks.
Ducks do sink.
Conclusion: Therefore, ducks are made of
small rocks.
D: ducks sink
R: ducks are made of small rocks
D | R | D → R | Pre = (D → R) ∧ D | Pre → R
T | T |   T   |         T         |    T
T | F |   F   |         F         |    T
F | T |   T   |         F         |    T
F | F |   T   |         F         |    T
Pre → R is TRUE in every row, so the argument is valid.
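The row-by-row validity check can be automated; a sketch in Python, where `valid` and `implies` are hypothetical helpers (not from the slides):

```python
from itertools import product

def valid(premises, conclusion, nvars):
    """Valid iff every row making all premises true also makes the conclusion true."""
    for vals in product([True, False], repeat=nvars):
        if all(p(*vals) for p in premises) and not conclusion(*vals):
            return False
    return True

implies = lambda p, q: (not p) or q

# D: ducks sink, R: ducks are made of small rocks
ducks_valid = valid(premises=[lambda d, r: implies(d, r),  # if ducks sink, rocks
                              lambda d, r: d],             # ducks do sink
                    conclusion=lambda d, r: r, nvars=2)

# The converse (affirming the consequent) is invalid: D→R and R do not entail D.
affirming_consequent = valid(premises=[lambda d, r: implies(d, r),
                                       lambda d, r: r],
                             conclusion=lambda d, r: d, nvars=2)
```

The ducks argument is an instance of modus ponens, so `valid` reports True; the converse form fails on the row D=F, R=T.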
Rules of Inference
Modus Ponens: Example
Example:
"If I study well, then I will get a good mark in math."  p → q
"I study well."  p
Conclusion: q
Therefore, "I will get a good mark in math."
Modus Tollens: Example
Example:
"If I study well, then I will get a good mark in math."  p → q
"I will not get a good mark in math."  ¬q
Conclusion: ¬p
Therefore, "I did not study well."
Hypothetical Syllogism / Transitivity
Example:
"If I study well, then I will get a good mark in math."  p → q
"If I get a good mark in math, I will get a good CGPA."  q → r
Conclusion: p → r
Therefore, "If I study well, then I will get a good CGPA."
Disjunctive Syllogism / Elimination
Example:
"I will study Math or I will study AI."  p ∨ q
"I will not study Math."  ¬p
Conclusion: q
Therefore, "I will study AI."
Addition
Example:
"I will study math."  p
Conclusion: p ∨ q
Therefore, "I will study math or I will watch a movie."
Simplification
Example:
"I will study Math and I will study Physics."  p ∧ q
Conclusion: p
Therefore, "I will study Math."
Conjunction
Example:
"I will study Math."  p
"I will study Physics."  q
Conclusion: p ∧ q
Therefore, "I will study Math and I will study Physics."
Resolution
Example:
"I will not study math or I will study Physics."  ¬p ∨ r
"I will study math or I will study Chemistry."  p ∨ q
Conclusion: q ∨ r
Therefore, "I will study Chemistry or I will study Physics."
Contradiction
Sita falsely stated:
"It was not a mountain."  ¬p
Conclusion:
"It was a mountain."  p
Valid Arguments – Exercise1
Valid Arguments – Exercise2
With these hypotheses:
"It is not sunny this afternoon and it is colder than yesterday."
"If we go swimming, then it is sunny."
"If we do not go swimming, then we will take a canoe trip."
"If we take a canoe trip, then we will be home by sunset."
Conclusion: "We will be home by sunset."
Proof
p : “It is sunny this afternoon.”
q : “It is colder than yesterday.”
r : “We will go swimming.”
s : “We will take a canoe trip.”
t : “We will be home by sunset.”
Using the rules of inference to build arguments
Hypotheses:
1. ¬p ∧ q    (It is not sunny this afternoon and it is colder than yesterday)
2. r → p     (If we go swimming, then it is sunny)
3. ¬r → s    (If we do not go swimming, then we will take a canoe trip)
4. s → t     (If we take a canoe trip, then we will be home by sunset)
Conclusion: t (We will be home by sunset)

Step         Reason
1. ¬p ∧ q    Hypothesis
2. ¬p        Simplification using (1)
3. r → p     Hypothesis
4. ¬r        Modus tollens using (2) and (3)
5. ¬r → s    Hypothesis
6. s         Modus ponens using (4) and (5)
7. s → t     Hypothesis
8. t         Modus ponens using (6) and (7)
Predicate Logic
Predicate Logic or Quantifier Logic
Example:
All men are mortal.
Socrates is a man.
"x lives in y" is a predicate:
Mary lives in Austin.
P(Mary, Austin)
lives(Mary, Austin)
Propositional Vs Predicate Logic
Propositional logic is logic that includes sentences represented by letters (P, Q, R) and logical connectives, but not quantifiers.
E.g.: "Socrates is a Man" is represented as "P".
Quantifiers: The important difference is that we can use
predicate logic to say something about a set of objects, by
introducing the universal quantifier ("∀"), the existential
quantifier ("∃") and variables ("x", "y" or "z").
Predicate Calculus - Exercise
Today is wet
John likes apples
It rained on Tuesday
Ben is a good cat.
If it does not rain on Monday, Rama will go to
the mountains.
Mary is a child who takes Coconut-crunchy
Predicate Calculus
Properties: Wet(Today)
Relations: likes(John, Apples)
weather(Tuesday, rain) / rain(Tuesday)
Good_cat(Ben)
¬weather(Monday, rain) → go(Rama, mountains)
Child(Mary) ∧ Takes(Mary, Coconut-crunchy)
Predicate Logic/ calculus
All basketball players are tall.
Some people who like chocolates are tall.
Predicate Logic/ calculus
All basketball players are tall.
∀x (basketball-player(x) → tall(x))
Some people who like chocolates are tall.
∃x (likes(x, chocolates) ∧ tall(x))
The following statements are about persons:
a. Someone is sleeping
b. No one is sleeping
c. Everyone is sleeping
d. Not everyone is sleeping
1. Someone is sleeping
Let S = is sleeping, P = person.
There is some entity x such that x is a person and x is sleeping:
∃x (P(x) ∧ S(x))
2. No one is sleeping
It is not the case that there is some entity x such that x is a person and x is sleeping:
¬∃x (P(x) ∧ S(x))
3. Everyone is sleeping
For every entity x, if x is a person, then x is sleeping:
∀x (P(x) → S(x))
4. Not everyone is sleeping
It is not the case that for every entity x, if x is a person, then x is sleeping:
¬∀x (P(x) → S(x))
Some girl liked some boy
∃x ∃y [ (GIRL(x) ∧ BOY(y)) ∧ LIKE(x,y) ]
Every girl liked every boy
∀x ∀y [ (GIRL(x) ∧ BOY(y)) → LIKE(x,y) ]
a. Some dragon is sleeping and twitching
∃x [D(x) ∧ (S(x) ∧ T(x))]
b. No dragon is sleeping and twitching
¬∃x [D(x) ∧ (S(x) ∧ T(x))]
c. Every dragon is sleeping and twitching
∀x [D(x) → (S(x) ∧ T(x))]
d. Not every dragon is sleeping and
twitching
¬∀x [D(x) → (S(x) ∧ T(x))]
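Over a finite domain, these quantified patterns map directly onto Python's `any`/`all`; a minimal sketch, assuming a made-up list of dragons with sleeping/twitching flags:

```python
# Hypothetical finite domain: each dragon record carries its predicate values.
dragons = [
    {"name": "Puff",    "sleeping": True,  "twitching": True},
    {"name": "Norbert", "sleeping": False, "twitching": True},
]

# ∃x [D(x) ∧ (S(x) ∧ T(x))] -- some dragon is sleeping and twitching
some_st = any(d["sleeping"] and d["twitching"] for d in dragons)
# ∀x [D(x) → (S(x) ∧ T(x))] -- every dragon is sleeping and twitching
every_st = all(d["sleeping"] and d["twitching"] for d in dragons)
# ¬∃x [...] -- no dragon is sleeping and twitching
no_st = not some_st
# ¬∀x [...] -- not every dragon is sleeping and twitching
not_every_st = not every_st
```

Note how `any` plays the role of ∃ and `all` the role of ∀ once the domain is finite and explicit.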
Exercises
All saints were Hindus
All Hindus were either loyal to Ravana or hated him
Everyone is loyal to someone
∀x: saints(x) → hindus(x)
∀x: hindus(x) → loyalto(x, Ravana) ∨ hate(x, Ravana)
∀x: ∃y: loyalto(x, y)
FOPL to sentences
Theorems of logical operations
Laws of Inference
Inference Rule
Note: Find the particular x that satisfies the condition and then eliminate "there exists" (existential instantiation).
Combining rules of proposition and
quantified statements: Lifting
Arguments
Express the Argument
We can express the premises (above the line)
and the conclusion (below the line) in predicate
logic as an argument:
Unification: the Unify algorithm takes two sentences and returns a unifier/substitution that makes the two sentences look identical.
Forward chaining: from the available set of rules in the KB, start from simple atomic sentences and infer new facts using inference rules such as Modus Ponens.
A rule-based system is simply the KB with all the rules.
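Forward chaining over Horn-style rules can be sketched in a few lines; the rule encoding and fact names below are illustrative assumptions, not from the slides:

```python
# Each rule is (frozenset of antecedent facts, consequent fact).
def forward_chain(facts, rules):
    """Repeatedly fire any rule whose antecedents are all known (Modus Ponens)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)  # infer a new fact
                changed = True
    return facts

rules = [(frozenset({"studies_well"}), "good_math_mark"),
         (frozenset({"good_math_mark"}), "good_cgpa")]
derived = forward_chain({"studies_well"}, rules)
```

Starting from the single atomic fact, the loop fires both rules in turn, deriving the chained conclusion.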
Inference Rule/FC/BC
Forward chaining and Backward chaining: Inference Mechanism
Bob is a buffalo
Pat is a pig
Steve is a slug
Conclusion
Someone who passed the exam has not read
the book.
A(x): "x is in Section A of the course."
B(x): "x reads the book."
P(x): "x passed the exam."
Hypotheses:
∃x (A(x) ∧ ¬B(x))
∀x (A(x) → P(x))
Conclusion: ∃x (P(x) ∧ ¬B(x))
Semantic Networks
A knowledge representation technique that
focuses on the relationships between objects
Semantic Network
The syntax of a semantic net is simple. It is a network of labeled nodes and links:
− a directed graph with nodes corresponding to concepts, facts, objects, etc., and
− arcs showing the relation or association between two concepts.
The commonly used links in a semantic net are of the following types:
- isa: subclass of an entity (e.g., child hospital is a subclass of hospital); represented by a rectangle
- inst: particular instance of a class (e.g., India is an instance of country); represented by a rectangle
- prop: property link (e.g., a property of dog is 'bark'); represented by an ellipse, connected with a dotted arrow
Representation of Knowledge in Semantic Net
"Every human, animal and bird is a living thing that breathes and eats. All birds can fly. All men and women are humans who have two legs. Cat is an animal and has fur. All animals have skin and can move. Giraffe is an animal that is tall and has long legs. Parrot is a bird and is green in color."
Representation in Semantic Net
(Figure: the paragraph above drawn as a network; its links are listed here.)
Living_thing -prop-> breathe, eat
Human -isa-> Living_thing; Animal -isa-> Living_thing; Bird -isa-> Living_thing
Human -prop-> two legs; Bird -prop-> fly
Man -isa-> Human; Woman -isa-> Human
Giraffe -inst-> Animal; Cat -inst-> Animal; Parrot -inst-> Bird
john -inst-> Man
Animal -prop-> skin, move; Cat -prop-> fur; Giraffe -prop-> tall, long legs; Parrot -prop-> green
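The network above can be stored as (source, link, target) triples, with properties inherited along isa/inst links; a minimal sketch (the `inherited_props` helper is hypothetical):

```python
# Triples encoding the semantic net from the slide.
edges = [
    ("Human", "isa", "Living_thing"), ("Animal", "isa", "Living_thing"),
    ("Bird", "isa", "Living_thing"), ("Man", "isa", "Human"),
    ("Woman", "isa", "Human"), ("Giraffe", "inst", "Animal"),
    ("Cat", "inst", "Animal"), ("Parrot", "inst", "Bird"),
    ("john", "inst", "Man"),
    ("Living_thing", "prop", "breathe, eat"), ("Human", "prop", "two legs"),
    ("Bird", "prop", "fly"), ("Animal", "prop", "skin, move"),
    ("Cat", "prop", "fur"), ("Giraffe", "prop", "tall, long legs"),
    ("Parrot", "prop", "green"),
]

def inherited_props(node):
    """Collect prop links on node and everything reachable via isa/inst (inheritance)."""
    props, frontier = [], [node]
    while frontier:
        n = frontier.pop()
        for s, link, t in edges:
            if s == n and link == "prop":
                props.append(t)
            elif s == n and link in ("isa", "inst"):
                frontier.append(t)
    return props

cat_props = inherited_props("Cat")
```

Cat gets "fur" directly, "skin, move" from Animal, and "breathe, eat" from Living_thing, which is exactly the inheritance a semantic net is meant to provide.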
Module 5: Uncertainty and Knowledge Reasoning
Uncertainty
Agents almost never have access to the whole truth about their environment.
Uncertainty:
uncertain input
uncertain knowledge
uncertain output
Why do we need reasoning under uncertainty?
We cannot say that a rule such as "All patients with toothache have a cavity" is correct: some patients may have gum disease or other problems instead.
To make the rule true, we would need to add an almost endless list of possible causes. Hence, trying FOL for a domain like medical diagnosis fails for three reasons.
Ex: Weather is a random variable with
P(Weather = Sunny) = 0.1
P(Weather = Rain) = 0.7
P(Weather = Snow) = 0.2
3. P(¬A) = 1 − P(A)
Prior probability
4. Graphical representation
P(J, M, A, B, E) = P(J|A) P(M|A) P(A|B, E) P(B) P(E)
Some Properties of BN
BN is a DAG
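The BN factorization P(J,M,A,B,E) = P(J|A) P(M|A) P(A|B,E) P(B) P(E) can be evaluated numerically once CPTs are supplied; a sketch using illustrative CPT values in the style of the classic burglary-alarm network (the numbers are assumptions, not from the slides):

```python
# Assumed CPTs: B=Burglary, E=Earthquake, A=Alarm, J=JohnCalls, M=MaryCalls.
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(A | B, E)
P_J = {True: 0.90, False: 0.05}                      # P(J | A)
P_M = {True: 0.70, False: 0.01}                      # P(M | A)

def joint(j, m, a, b, e):
    """P(J,M,A,B,E) = P(J|A) P(M|A) P(A|B,E) P(B) P(E) -- the BN factorization."""
    def ev(p, v):  # probability of value v under Bernoulli parameter p
        return p if v else 1 - p
    return (ev(P_J[a], j) * ev(P_M[a], m) *
            ev(P_A[(b, e)], a) * ev(P_B, b) * ev(P_E, e))

# P(john calls, mary calls, alarm, no burglary, no earthquake)
p = joint(j=True, m=True, a=True, b=False, e=False)
```

Five small factors replace one table over 2^5 joint states, which is the point of the DAG structure.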
Processing Elements in NN
An ANN consists of perceptrons. Each perceptron receives inputs, processes them, and delivers a single output.
Perceptron
(Figure: inputs x1 … xn, each multiplied by a weight wi, summed into a single output O.)
xi = input
wi = weight
O = output
The Perceptron: Threshold Activation Function
Step threshold (with I = Σ xi·wi the weighted sum of the inputs):
O = f(I) = 1 if I > θ, 0 otherwise
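The step-threshold perceptron fits in a few lines; a sketch where the weights and threshold realizing an AND gate are illustrative choices, not from the slides:

```python
def perceptron(inputs, weights, theta):
    """Weighted-sum perceptron: O = 1 if I > theta else 0, with I = sum(xi * wi)."""
    I = sum(x * w for x, w in zip(inputs, weights))
    return 1 if I > theta else 0

# AND gate realized with weights (1, 1) and threshold 1.5 (assumed values):
and_11 = perceptron([1, 1], [1, 1], 1.5)  # I = 2, above threshold
and_10 = perceptron([1, 0], [1, 1], 1.5)  # I = 1, below threshold
```

AND is linearly separable, so one perceptron suffices; no single choice of weights and threshold can realize XOR, which is the point of the next slide.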
Perceptron and Linearly Separable Problems
(Figure: Class1 and Class2 separated by a straight line – linearly separable.)
XOR Problem: Not Linearly Separable
Sigmoid activation:
O = f(I) = 1 / (1 + e^(−I))
K Nearest Neighbor
How does the KNN algorithm work?
At k = 3, we classify the new data point as a square.
At k = 7, we classify the new data point as a triangle.
KNN Distance Metrics
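A minimal KNN classifier by majority vote; the 2-D points mirroring the square/triangle picture are made up for illustration:

```python
import math
from collections import Counter

def knn_classify(point, data, k):
    """Majority vote among the k nearest neighbours under Euclidean distance."""
    neighbours = sorted(data, key=lambda item: math.dist(item[0], point))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical labelled points: a cluster of squares and a cluster of triangles.
data = [((1, 1), "square"), ((1, 2), "square"), ((2, 1), "square"),
        ((5, 5), "triangle"), ((6, 5), "triangle"), ((5, 6), "triangle"),
        ((6, 6), "triangle")]
label_k3 = knn_classify((2, 2), data, k=3)  # three nearest are all squares
label_k7 = knn_classify((2, 2), data, k=7)  # all seven vote; triangles outnumber squares
```

This reproduces the slide's effect: the answer flips from square to triangle as k grows, because a larger neighbourhood pulls in the bigger class.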
Decision Tree
1. At each stage (node), pick the best feature as the test condition.
2. Split the node into the possible outcomes (internal nodes).
3. Repeat the above steps until all the test conditions have been exhausted into leaf nodes.
Decision Tree
For the decision tree on the right, calculate entropy and information gain.
Decision Tree
Entropy calculation
Example Dataset
Entropy(Buy Computer):
Info(D) = I(9,5) = −(9/14) log2(9/14) − (5/14) log2(5/14) = 0.940
Decision Tree
Tree Construction
Entropy(Buy Computer, Age):
Info_age(D) = (5/14) I(2,3) + (4/14) I(4,0) + (5/14) I(3,2) = 0.694
Gain(age) = 0.940 − 0.694 = 0.246
Gain(income) = 0.029
Gain(student) = 0.151
Gain(credit_rating) = 0.048
Age has the highest information gain, so it is selected as the test attribute at the root.
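The entropy and gain figures above can be reproduced directly; a sketch where the `entropy` helper implements I(c1,…,cn):

```python
import math

def entropy(counts):
    """I(c1,...,cn) = -sum (ci/N) log2 (ci/N), skipping zero counts."""
    n = sum(counts)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

info_d = entropy([9, 5])  # Info(D) = I(9,5): 9 "yes" vs 5 "no" overall
# Info_age(D): the age attribute splits D into partitions of sizes 5, 4, 5.
info_age = (5/14) * entropy([2, 3]) + (4/14) * entropy([4, 0]) + (5/14) * entropy([3, 2])
gain_age = info_d - info_age
```

Note the pure partition I(4,0) contributes zero entropy, which is what drives the gain for age.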
Unsupervised Learning Algorithms
K-Means Clustering
K Means Algorithm
Input: data points (V1, V2, …, Vn) and the number of clusters k
Step 1: Randomly choose k initial centroids C1, …, Ck
Step 2: Using Euclidean distance, assign each data point Vi to the closest centroid: arg min_j dist(Vi, Cj)
Step 3: Recompute each centroid Cj as the mean of the points assigned to it
Step 4: Repeat Steps 2 and 3 until none of the cluster assignments change
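The steps above can be sketched as Lloyd's algorithm; the sample points are illustrative:

```python
import math
import random

def kmeans(points, k, iters=100, seed=0):
    """Assign to nearest centroid, recompute centroids, repeat until stable."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)          # Step 1: random initial centroids
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                       # Step 2: nearest-centroid assignment
            j = min(range(k), key=lambda j: math.dist(p, centroids[j]))
            clusters[j].append(p)
        new = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[j]
               for j, cl in enumerate(clusters)]  # Step 3: means of each cluster
        if new == centroids:                   # Step 4: stop when nothing changes
            break
        centroids = new
    return centroids, clusters

points = [(0, 0), (0, 1), (1, 0), (9, 9), (9, 10), (10, 9)]
centroids, clusters = kmeans(points, k=2)
```

On these two well-separated groups the algorithm settles on two clusters of three points each, whatever the random initialization picks.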
K-Means Clustering Algorithm
Choosing the right number of clusters in K-Means Clustering
Elbow Method