
Course: Intelligent Systems

Knowledge Representation and Reasoning

Martin Molina
2021
Outline

• Example of knowledge representation
  – A rule-based system
• General view of knowledge representation and reasoning
  – Symbolic approach
• Other knowledge representation methods
  – Logic, taxonomies, constraints, etc.
Programming a deliberation component

[Figure: functional description of an intelligent system. The agent interacts with other agents and with its environment through perception and action control; its deliberation component has an operational version to be executed in a computer, i.e., a computer program for deliberation.]

A common approach:
• Knowledge representation: a symbolic representation of what the system knows (knowledge base)
• Reasoning: deriving conclusions from premises using the knowledge base (inference engine)
Example: Car failure diagnosis

Premises
• The car does not start
• There is gas in the fuel tank
• There is gas in the carburetor
• The engine turns over
• The lights come on

Conclusion
• The problem is spark plugs
Knowledge about car failures can be represented using rules

Rule 1: IF the engine is getting gas,
        AND the engine will turn over,
        THEN the problem is spark plugs.

Rule 2: IF the engine does not turn over,
        AND the lights do not come on,
        THEN the problem is battery or cables.

Rule 3: IF the engine does not turn over,
        AND the lights do come on,
        THEN the problem is the starter motor.

Rule 4: IF there is gas in the fuel tank,
        AND there is gas in the carburetor,
        THEN the engine is getting gas.

Al-Taani, A.T. (2007): "An Expert System for Car Failure Diagnosis". International Journal of Mechanical, Aerospace, Industrial, Mechatronic and Manufacturing Engineering, Vol. 1, No. 12.
Rule-based systems were proposed in the early years of artificial intelligence

Newell, A. (1973). "Production systems: Models of control structures". In Visual Information Processing (pp. 463-526). Academic Press.

Allen Newell (1927-1992), Carnegie Mellon University
The set of rules can be represented graphically

[Figure: Rules 1-4 above drawn as an inference network, with observable evidence (gas in the fuel tank, gas in the carburetor, the engine turns over, the lights come on) feeding the intermediate conclusion "the engine is getting gas" and the final diagnoses (spark plugs, battery or cables, starter motor).]
Inference can be done by "forward chaining" from premises

• Data driven (forward chaining)
• Based on deductive logical inference

[Figure: chaining forward from the premises, Rule 4 derives "the engine is getting gas" and Rule 1 then derives "the problem is spark plugs".]
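To make the mechanism concrete, here is a minimal forward-chaining sketch in Python over the car-diagnosis rules (the fact strings and rule encoding are illustrative, not the syntax of any particular rule engine):

# Rules as (premises, conclusion) pairs: Rule 4 and Rule 1 of the example.
RULES = [
    ({"gas in fuel tank", "gas in carburetor"}, "engine is getting gas"),
    ({"engine is getting gas", "engine turns over"}, "problem is spark plugs"),
]

def forward_chain(facts, rules):
    """Fire every rule whose premises hold until no new fact can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"gas in fuel tank", "gas in carburetor",
                         "engine turns over", "lights come on"}, RULES)
print("problem is spark plugs" in derived)  # True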
Rule-based representation can generate explanations

Explanation "how":

It has been established that the problem is spark plugs using:

Rule 1: IF the engine is getting gas,
        AND the engine will turn over,
        THEN the problem is spark plugs.

It has been established that the engine is getting gas using:

Rule 4: IF there is gas in the fuel tank,
        AND there is gas in the carburetor,
        THEN the engine is getting gas.

This explanation uses symbols understandable by general users.
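An inference engine can produce this "how" explanation simply by recording which rule derived each fact. A minimal sketch, extending the forward chainer above (the trace structure is an illustrative assumption):

def forward_chain_with_trace(facts, rules):
    """Like forward_chain, but remember the justification of each derived fact."""
    facts, why = set(facts), {}
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                why[conclusion] = premises  # premises that justified the conclusion
                changed = True
    return facts, why

# why["problem is spark plugs"] == {"engine is getting gas", "engine turns over"},
# and why["engine is getting gas"] points back to the Rule 4 premises.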
There are multiple tools and languages based on rules

• CLIPS
• Drools
• Jess
• RuleML
• OPS5, OPS83, …
Example of a rule in CLIPS

(deftemplate person
   (slot firstname)
   (slot name)
   (slot age (type INTEGER)))

; Note: asserting (adult ...) below also requires a template such as
; (deftemplate adult (slot firstname))

(defrule adult
   (person (firstname ?p_first) (age ?p_age))
   (test (>= ?p_age 18))
   =>
   (assert (adult (firstname ?p_first))))
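For instance, after (assert (person (firstname "Ana") (name "Lopez") (age 20))) and (run), the rule fires and working memory contains (adult (firstname "Ana")). The person data is hypothetical, just to show the rule firing.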
Examples of rules in Drools

rule "Cash purchases have no discount"
when
    $p : Purchase(paymentMethod == PaymentMethod.CASH)
then
    $p.setDiscount(0);
end

rule "Debit card purchases have a 5% discount"
when
    $p : Purchase(paymentMethod == PaymentMethod.DEBIT)
then
    $p.setDiscount(0.05);
end

rule "Credit card purchases have a 10% discount"
when
    $p : Purchase(paymentMethod == PaymentMethod.CREDIT)
then
    $p.setDiscount(0.1);
end
Conclusions learned from this example
• The system uses a symbolic knowledge representation
  – Human-understandable symbols
• The system reasons using a general inference mechanism
  – Deductive logical inference (also observed in humans)
• The set of rules is domain-specific
  – Car mechanical failures
• The system is able to explain its reasoning
  – The system simulates introspection
Reasoning can be simulated as symbol manipulation

Formal symbols represent what the agent believes about the world.

[Figure: what is observed in the agent (knowledge, reasoning) is simulated in the computer: knowledge becomes a model using symbols, and a program applies a general symbol-manipulation algorithm to simulate reasoning.]

Newell, A., & Simon, H. A. (1976). "Computer Science as Empirical Inquiry: Symbols and Search". Communications of the ACM, 19(3), 113-126.
Symbolism contrasts with connectionism in AI

• Symbolism: tries to create intelligence by manipulating symbols that represent what someone believes about the world.
• Connectionism: tries to create intelligence by simulating the structure of the brain (e.g., artificial neural networks).
There are multiple types of reasoning
• Logical reasoning
  – Deduction
  – Induction
• Reasoning about satisfiability
  – Boolean satisfiability (SAT)
  – Constraint satisfaction
• Taxonomic reasoning
  – Disjointness
  – Membership, …
• Non-monotonic reasoning
  – Default reasoning
• Approximate reasoning
  – Imprecise reasoning
  – Uncertain reasoning
  – Probabilistic reasoning
• Causal reasoning
  – Abduction
• Analogical reasoning
  – Case-based reasoning
• Other types of reasoning
  – Diagrammatic reasoning
  – …
There are multiple knowledge representation methods
• Rules
  – Production rules, classification rules, etc.
• Taxonomies
  – Semantic networks, frames, etc.
• Logic
  – First-order logic, Horn clauses, description logic, propositional logic, etc.
• Constraints
  – Qualitative constraints, binary constraints, etc.
• Representations for approximate reasoning
  – Bayesian networks, fuzzy logic, etc.
• Etc.
Properties of a knowledge representation
• Expressive
  – It is able to represent what an agent believes to be true (facts and general propositions)
• Generative
  – It is able to generate new conclusions from premises through inference mechanisms
• Natural
  – It is used in human expression (using symbols with commonly accepted meanings)
• Formal
  – It has formal semantics to avoid ambiguous interpretations
• Efficient
  – It makes efficient use of computational resources
Semantic networks have concepts and relations

Representation: concepts (nodes) + relations (links)

[Figure: an example network. Mammal, Reptile, Bird, and Fish are-a Animal; Lion, Zebra, and Whale are-a Mammal; Penguin is-a Bird; Mammal has Fur; Lion eats Zebra; Penguin eats Fish; Whale and Fish live-in Water.]

Quillian, M. R. (1968). "Semantic memory". In M. Minsky (Ed.), Semantic Information Processing. MIT Press, Cambridge.
Frames represent concepts, attributes and relations

Representation: concepts + attributes (slots) + relations (few types)

[Figure: a frame hierarchy.]
• Animal
  – Bird (has: wings; legs: 2; covered by: feathers; locomotion: flying)
    – Eagle (diet: carnivore)
    – Penguin (diet: piscivore; locomotion: swimming)
  – Mammal (legs: 4; covered by: fur; locomotion: walking)
    – Zebra (diet: herbivore)
    – Lion (diet: carnivore)
  – Fish (lives in: water; covered by: scales; locomotion: swimming)
    – Barracuda (diet: piscivore)

Minsky, M. (1974). "A Framework for Representing Knowledge". MIT-AI Laboratory Memo 306.
Example in RDF Schema

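The original figure is not available here; the following is a minimal sketch of how the animal taxonomy of the previous slides could be written in RDF Schema using the rdflib Python library (the namespace and class names are illustrative assumptions):

from rdflib import Graph, Namespace
from rdflib.namespace import RDFS

EX = Namespace("http://example.org/animals#")  # illustrative namespace
g = Graph()
g.bind("ex", EX)

# rdfs:subClassOf links mirroring the "is a" links of the frame example
for sub, sup in [(EX.Mammal, EX.Animal), (EX.Bird, EX.Animal),
                 (EX.Fish, EX.Animal), (EX.Penguin, EX.Bird),
                 (EX.Zebra, EX.Mammal), (EX.Lion, EX.Mammal)]:
    g.add((sub, RDFS.subClassOf, sup))

print(g.serialize(format="turtle"))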
Example: Logical reasoning
• John is allergic to Penicillin
• Anyone allergic to Penicillin is also allergic to Amoxicillin
THEREFORE
• John is allergic to Amoxicillin

j: John                         P(j, e)
e: Penicillin                   ∀x [P(x, e) → P(x, a)]
a: Amoxicillin                  ________________________
P(x, y): x is allergic to y     P(j, a)
Example of logic representation using Prolog

ancestor(jose,luis).
ancestor(jose,manuel).
ancestor(pilar,aurora).
ancestor(aurora,antonio).

man(jose).
man(luis).
man(manuel).
man(antonio).

woman(pilar).
woman(aurora).

father(X,Y) :- ancestor(X,Y), man(X).
mother(X,Y) :- ancestor(X,Y), woman(X).

son(X,Y) :- ancestor(Y,X), man(X).
daughter(X,Y) :- ancestor(Y,X), woman(X).

grandfather(X,Y) :- father(X,Z), ancestor(Z,Y).
grandmother(X,Y) :- mother(X,Z), ancestor(Z,Y).
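For instance, loading these clauses into a Prolog system and asking ?- grandmother(pilar, X). yields X = antonio, obtained by backward chaining through mother/2 and then ancestor/2.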
Example of logic representation using OWL

OWL abstract syntax:

Class(p:Pizza partial
  restriction(p:hasBase someValuesFrom(p:PizzaBase)))
DisjointClasses(p:Pizza p:PizzaBase)
Class(p:NonVegetarianPizza complete
  intersectionOf(p:Pizza complementOf(p:VegetarianPizza)))
ObjectProperty(p:isIngredientOf Transitive
  inverseOf(p:hasIngredient))

Meaning: a Pizza has a PizzaBase as its base; Pizza is disjoint with PizzaBase; NonVegetarianPizza is exactly a Pizza that is not a VegetarianPizza; isIngredientOf is a transitive property and the inverse of hasIngredient.

The same statements in Turtle syntax:

:Pizza rdfs:subClassOf
    [ a owl:Restriction ;
      owl:onProperty :hasBase ;
      owl:someValuesFrom :PizzaBase ] ;
    owl:disjointWith :PizzaBase .

:NonVegetarianPizza owl:equivalentClass
    [ owl:intersectionOf
        ( [owl:complementOf :VegetarianPizza]
          :Pizza ) ] .

:isIngredientOf
    a owl:TransitiveProperty , owl:ObjectProperty ;
    owl:inverseOf :hasIngredient .
Example: Assign patients to hospital beds

Example: Map coloring

[Figure: a map with six regions: N, E, W, A, S, O.]

Variables: N, S, E, A, W, O
Domains: {red, green, purple}
Constraints: R1: N ≠ S
             R2: E ≠ S
             R3: E ≠ A
             R4: W ≠ A
             …

A solution is an assignment satisfying all constraints:
{N = red, S = green, E = purple, A = red, W = green, O = purple}
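A minimal backtracking solver for this problem in Python (only the four constraints listed above are encoded; the "…" suggests the full map has more):

# Variables, domains, and binary "different color" constraints of the example.
VARIABLES = ["N", "S", "E", "A", "W", "O"]
DOMAIN = ["red", "green", "purple"]
CONSTRAINTS = [("N", "S"), ("E", "S"), ("E", "A"), ("W", "A")]

def consistent(var, value, assignment):
    """Check the candidate value against every constraint involving var."""
    for a, b in CONSTRAINTS:
        other = b if a == var else a if b == var else None
        if other is not None and assignment.get(other) == value:
            return False
    return True

def backtrack(assignment):
    """Depth-first search assigning one variable at a time."""
    if len(assignment) == len(VARIABLES):
        return assignment
    var = next(v for v in VARIABLES if v not in assignment)
    for value in DOMAIN:
        if consistent(var, value, assignment):
            solution = backtrack({**assignment, var: value})
            if solution is not None:
                return solution
    return None  # dead end: no value fits

print(backtrack({}))  # e.g., {'N': 'red', 'S': 'green', 'E': 'red', ...}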
There are multiple software tools for constraint satisfaction problems

[Figure: tool logos.]
Other tools: CHIP V5, Oz
Representations for approximate reasoning

• Vagueness: fuzzy logic (1974)
  – Example: the car goes "fast"
  – Lotfi Zadeh (1921-2017), University of California, Berkeley
• Uncertainty: Bayesian networks (1988)
  – Example: the "most likely" cause of lung cancer is smoking
  – Judea Pearl (1936), University of California, Los Angeles

[Figure: an example Bayesian network. Smoker (S) has links to Bronchitis (B) and Cancer (C); B and C link to Fatigue (F); C links to X-ray pattern (X). Conditional probability tables: P(s1) = 0.2; P(b1|s1) = 0.25, P(b1|s2) = 0.05; P(c1|s1) = 0.003, P(c1|s2) = 0.00005; P(f1|b1,c1) = 0.75, P(f1|b1,c2) = 0.10, P(f1|b2,c1) = 0.5, P(f1|b2,c2) = 0.05; P(x1|c1) = 0.6, P(x1|c2) = 0.02.]
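As a worked example on this network, the prior probability of cancer follows by marginalizing over the Smoker variable (a short Python check):

# P(c1) = P(c1|s1) P(s1) + P(c1|s2) P(s2), with P(s2) = 1 - P(s1)
p_s1 = 0.2
p_c1_given_s1 = 0.003
p_c1_given_s2 = 0.00005

p_c1 = p_c1_given_s1 * p_s1 + p_c1_given_s2 * (1 - p_s1)
print(p_c1)  # 0.00064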
Popular tools for symbolic knowledge representation

• Logic: Ciao, SWI-Prolog, TPTP language
• Rules: CLIPS, Drools, Jess
• Constraints: OptaPlanner, Ciao
Some guidelines for knowledge representation

• Rules
  – Pros: intuitive, efficient, and good for heuristic knowledge
  – Cons: not appropriate for some types of reasoning (e.g., satisfiability)
• Taxonomies
  – Pros: intuitive and useful to describe the concepts and relations of a domain
  – Cons: limited reasoning capabilities
• Logic
  – Pros: very expressive, with formal semantics
  – Cons: not appropriate for approximate or non-monotonic reasoning
• Constraints
  – Pros: concise, with powerful inference engines
  – Cons: constraint satisfaction problems are NP-complete
• Representations for approximate reasoning
  – Pros: useful for uncertainty (Bayesian networks) or vagueness (fuzzy logic)
  – Cons: not appropriate for other types of reasoning
Conclusions
• Good news:
  – We have multiple methods to represent knowledge for building intelligent systems
• Bad news:
  – We don't have a universal method of knowledge representation useful for all kinds of problems
Summary of knowledge representation

Main characteristics
• It represents what an agent believes to be true in the world
  – Declarative representation approach
• It is designed to simulate human reasoning efficiently
  – E.g., deductive reasoning, default reasoning
• It is based on human expression
  – Symbols with commonly accepted meanings

Universal relations
• Cause-effect, part-of, property-of, subclass-of, implies, same, different, negation, …

Utility of a knowledge representation
• Simulation of reasoning
  – Planning what to do, designing an artifact, etc.
• Exhibiting introspection ability
  – Talking about own knowledge, explaining own reasoning, etc.

Domain-specific symbols
• Failure, car, fuel, carburetor, engine, lights, spark plugs, battery, …
Lecture slides of the master course "Intelligent Systems".
© 2021 Martin Molina

This work is licensed under the Creative Commons license CC BY-SA 4.0:
https://creativecommons.org/licenses/by-sa/4.0/legalcode

Suggested citation:
Molina, M. (2021): "Intelligent Systems". Master course (lecture slides). Department of Artificial Intelligence. Universidad Politécnica de Madrid.