1.3. Knowledge Representation and Reasoning
Martin Molina
2021
Outline
Programming a deliberation component
Functional description of an intelligent system
[Figure: the agent interacts with the environment and with other agents through perception, deliberation and action control; an operational version of the deliberation component is executed in a computer.]
A common approach:
• Knowledge representation: symbolic representation of what the system knows (knowledge base)
• Reasoning: deriving conclusions from premises using the knowledge base (inference engine)
Example: Car failure diagnosis
Premises
• The car does not start
• There is gas in the fuel tank
• There is gas in the carburetor
• The engine turns over
• The lights come on
Conclusion
• The problem is spark plugs
Knowledge about car failures
can be represented using rules
Rule 1: IF the engine is getting gas,
AND the engine will turn over,
THEN the problem is spark plugs.
A.T. Al-Taani (2007): "An Expert System for Car Failure Diagnosis". International Journal of Mechanical, Aerospace, Industrial, Mechatronic and Manufacturing Engineering, Vol. 1, No. 12.
Rule-based systems were proposed in the
early years of artificial intelligence
Allen Newell (1927-1992), Carnegie Mellon University
The set of rules can be represented graphically
Inference can be done by "forward chaining" from premises
Forward chaining is data-driven and based on deductive logical inference; in the example, it derives the conclusion "the problem is spark plugs".
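Forward chaining can be sketched in a few lines of Python. The second rule below is Rule 1 from the slide; the first rule, which derives "engine is getting gas" from the gas-related premises, is an auxiliary rule assumed here for illustration.

```python
# Forward chaining (data-driven inference) over the car-diagnosis example.
# Each rule is a pair (set of premises, conclusion).
rules = [
    ({"gas in fuel tank", "gas in carburetor"}, "engine is getting gas"),  # assumed
    ({"engine is getting gas", "engine turns over"}, "problem is spark plugs"),  # Rule 1
]

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:  # keep firing rules until no new fact can be derived
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"gas in fuel tank", "gas in carburetor",
                         "engine turns over"}, rules)
print("problem is spark plugs" in derived)  # True
```

The loop runs until a fixed point is reached, which is exactly the data-driven behavior described above: conclusions of fired rules become new premises for further rules.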
Rule-based representation can generate explanations
Explanation “how”
It has been established that the problem is spark plugs using: [the chain of fired rules, shown graphically]
There are multiple tools and languages based on rules
• CLIPS
• Drools
• Jess
• RuleML
• OPS5, OPS83, …
Example of a rule in CLIPS
(deftemplate person
  (slot firstname)
  (slot name)
  (slot age (type INTEGER)))

(deftemplate adult        ; template for the derived "adult person" fact
  (slot firstname))

(defrule adult
  (person (firstname ?p_first)
          (age ?p_age))
  (test (>= ?p_age 18))
  =>
  (assert (adult (firstname ?p_first))))
Examples of rules in Drools
Reasoning can be simulated as symbol manipulation
[Figure: formal symbols represent what the agent believes about the world. The agent's knowledge is modeled using symbols, and a program applies a general symbol-manipulation algorithm to simulate reasoning: reasoning is observed in the agent and simulated in the computer.]
Newell, A., & Simon, H. A. (1976). Computer Science as Empirical Inquiry: Symbols and Search. Communications of the ACM, 19(3), 113-126.
Symbolism contrasts with connectionism in AI
[Figure: symbolic representations vs. connectionist (neural network) representations.]
There are multiple types of reasoning
• Logical reasoning
  • Deduction
  • Induction
• Approximate reasoning
  • Imprecise reasoning
  • Uncertain reasoning
There are multiple knowledge representation methods
• Rules
• Production rules, classification rules, etc.
• Taxonomies
• Semantic networks, frames, etc.
• Logic
• First order logic, Horn clauses, description logic, propositional logic, etc.
• Constraints
• Qualitative constraints, binary constraints, etc.
• Etc.
Properties of a knowledge representation
• Expressive
– It is able to represent what an agent believes to be true (facts and general propositions)
• Generative
– It is able to generate new conclusions from premises through inference mechanisms
• Natural
– It is used in human expression (using symbols with commonly accepted meanings)
• Formal
– It has formal semantics to avoid ambiguous interpretation
• Efficient
– It makes efficient use of computational resources
Semantic networks have concepts and relations
Representation: concepts (nodes) + relations (links)
[Figure: example semantic network with nodes such as Animal, Mammal, Reptile, Bird, Fish, Whale, Lion, Zebra, Penguin, Fur and Water, connected by links such as is-a (e.g., Lion is-a Mammal, Penguin is-a Bird), has (Mammal has Fur), eats (Lion eats Zebra) and lives-in (Fish lives in Water).]
Quillian, Ross (1968). "Semantic memory". In "Semantic Information Processing" (M. Minsky, ed.). MIT Press, Cambridge.
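As a sketch, a semantic network can be stored as plain dictionaries, with a query that follows is-a links upward so that concepts inherit properties from their ancestors. The specific nodes and links below are a partial, assumed reconstruction of the example network.

```python
# Semantic network as dictionaries: is-a links plus labeled properties.
is_a = {"Lion": "Mammal", "Zebra": "Mammal", "Whale": "Mammal",
        "Mammal": "Animal", "Bird": "Animal", "Fish": "Animal",
        "Reptile": "Animal", "Penguin": "Bird"}
properties = {"Mammal": {"has": "Fur"},
              "Fish": {"lives in": "Water"},
              "Lion": {"eats": "Zebra"},
              "Penguin": {"lives in": "Water"}}  # links assumed from the figure

def inherited(concept, attribute):
    """Look up an attribute, following is-a links upward (inheritance)."""
    while concept is not None:
        value = properties.get(concept, {}).get(attribute)
        if value is not None:
            return value
        concept = is_a.get(concept)  # climb one is-a link
    return None

print(inherited("Lion", "has"))  # Fur (inherited from Mammal)
```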
Frames represent concepts, attributes and relations
Representation: concepts + attributes (slots) + relations (few types)
[Figure: example frame hierarchy. Animal has subclasses Mammal (legs: 4, covered by: fur, locomotion: walking), Bird (has: wings, legs: 2, covered by: feathers, locomotion: flying) and Fish (lives in: water, covered by: scales, locomotion: swimming), each with further subclasses.]
Minsky, Marvin (1974). "A Framework for Representing Knowledge". MIT-AI Laboratory Memo 306.
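A minimal sketch of frames in Python: classes stand for concepts, class attributes for slots with default values, and class inheritance for the is-a relation. The slot values are taken from the frame hierarchy above.

```python
# Frames as Python classes: slots become class attributes (defaults),
# and the is-a relation becomes inheritance.
class Animal:
    pass

class Mammal(Animal):
    legs = 4
    covered_by = "fur"
    locomotion = "walking"

class Bird(Animal):
    has = "wings"
    legs = 2
    covered_by = "feathers"
    locomotion = "flying"

class Fish(Animal):
    lives_in = "water"
    covered_by = "scales"
    locomotion = "swimming"

print(Bird.covered_by)  # feathers
```

A subclass such as Penguin could override the inherited defaults (e.g., locomotion), which mirrors how frames handle exceptions to defaults.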
Example in RDF Schema
Example: Logical reasoning
• John is allergic to Penicillin
• Anyone allergic to Penicillin is also allergic to Amoxicillin
THEREFORE
• John is allergic to Amoxicillin
Formalization (j: John, e: Penicillin, a: Amoxicillin, P(x, y): x is allergic to y):
Premises: P(j, e); ∀x [P(x, e) → P(x, a)]
Conclusion: P(j, a)
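This inference can be sketched as simple symbol manipulation in Python: the universally quantified rule is applied to every matching fact. This is a hand-coded instantiation of one rule, not a general theorem prover.

```python
# Facts are tuples (predicate, subject, object).
facts = {("allergic", "John", "Penicillin")}

def apply_rule(facts):
    """Apply forall x: P(x, Penicillin) -> P(x, Amoxicillin)."""
    new = {("allergic", x, "Amoxicillin")
           for (p, x, y) in facts
           if p == "allergic" and y == "Penicillin"}
    return facts | new

print(("allergic", "John", "Amoxicillin") in apply_rule(facts))  # True
```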
Example of logic representation using Prolog
% Facts: ancestor(X, Y) means X is an ancestor of Y.
ancestor(jose,luis).
ancestor(jose,manuel).
ancestor(pilar,aurora).
ancestor(aurora,antonio).
man(jose).
man(luis).
man(manuel).
man(antonio).
woman(pilar).
woman(aurora).
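The same facts can be mirrored as Python data. The query below corresponds to the hypothetical Prolog goal `?- ancestor(X, Y), woman(X).` (female ancestors); this query is an assumed example, not one given on the slide.

```python
# The Prolog facts above, mirrored as Python data.
ancestor = [("jose", "luis"), ("jose", "manuel"),
            ("pilar", "aurora"), ("aurora", "antonio")]
woman = {"pilar", "aurora"}

# Conjunctive query: ancestor(X, Y), woman(X).
female_ancestors = [(x, y) for (x, y) in ancestor if x in woman]
print(female_ancestors)  # [('pilar', 'aurora'), ('aurora', 'antonio')]
```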
Example of logic representation using OWL
Abstract syntax:
Class(p:Pizza partial
  restriction(p:hasBase someValuesFrom(p:PizzaBase)))
DisjointClasses(p:Pizza p:PizzaBase)
Class(p:NonVegetarianPizza complete
  intersectionOf(p:Pizza complementOf(p:VegetarianPizza)))
ObjectProperty(p:isIngredientOf Transitive
  inverseOf(p:hasIngredient))

In words: Pizza has PizzaBase as its base; Pizza is disjoint with PizzaBase; NonVegetarianPizza is exactly Pizza that is not VegetarianPizza; isIngredientOf is a transitive property; isIngredientOf is the inverse of hasIngredient.

Turtle syntax:
:Pizza rdfs:subClassOf
    [ a owl:Restriction ;
      owl:onProperty :hasBase ;
      owl:someValuesFrom :PizzaBase ] ;
  owl:disjointWith :PizzaBase .
:NonVegetarianPizza owl:equivalentClass
    [ owl:intersectionOf
      ( [owl:complementOf :VegetarianPizza]
        :Pizza ) ] .
:isIngredientOf
  a owl:TransitiveProperty , owl:ObjectProperty ;
  owl:inverseOf :hasIngredient .
Example: Assign patients to hospital beds
Example: Map coloring
[Figure: a map with six regions labeled N, E, W, A, S, O.]
Variables: N, S, E, A, W, O
Domains: {red, green, purple}
Constraints: R1: N ≠ S
             R2: E ≠ S
             R3: E ≠ A
             R4: W ≠ A
             …
Other tools:
CHIP V5
Oz
…
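A backtracking solver for this constraint satisfaction problem can be sketched in Python. Only the four constraints listed above are encoded; the remaining adjacency constraints are elided on the slide and are left out here as well.

```python
# Backtracking search for the map-coloring CSP.
variables = ["N", "S", "E", "A", "W", "O"]
domain = ["red", "green", "purple"]
constraints = [("N", "S"), ("E", "S"), ("E", "A"), ("W", "A")]  # R1-R4 only

def consistent(assignment):
    """Check that no constraint between two assigned variables is violated."""
    return all(assignment[x] != assignment[y]
               for x, y in constraints
               if x in assignment and y in assignment)

def backtrack(assignment):
    if len(assignment) == len(variables):
        return assignment  # every variable has a consistent value
    var = next(v for v in variables if v not in assignment)
    for color in domain:
        assignment[var] = color
        if consistent(assignment):
            result = backtrack(assignment)
            if result:
                return result
        del assignment[var]  # undo and try the next color
    return None  # dead end: force backtracking in the caller

solution = backtrack({})
print(solution["N"] != solution["S"])  # True
```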
Representations for approximate reasoning
[Figure: Bayesian network for the claim "a cause of lung cancer is smoking". Nodes: Smoker (S), with children Bronchitis (B) and Cancer (C); B and C are parents of Fatigue (F); C is the parent of X-ray pattern (X). Conditional probabilities:
P(b1|s1) = 0.25, P(b1|s2) = 0.05
P(c1|s1) = 0.003, P(c1|s2) = 0.00005
P(f1|b1,c1) = 0.75, P(f1|b1,c2) = 0.10, P(f1|b2,c1) = 0.5, P(f1|b2,c2) = 0.05
P(x1|c1) = 0.6, P(x1|c2) = 0.02]
Judea Pearl (b. 1936), University of California, Los Angeles
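Given these conditional probabilities, inference by marginalization can be sketched in Python. The example computes P(f1|s1), the probability of fatigue given a smoker, summing over Bronchitis and Cancer and using the network's assumption that B and C are conditionally independent given S.

```python
# P(f1|s1) = sum over b, c of P(b|s1) * P(c|s1) * P(f1|b,c)
p_b_given_s1 = {"b1": 0.25, "b2": 0.75}      # P(B|s1); b2 = 1 - b1
p_c_given_s1 = {"c1": 0.003, "c2": 0.997}    # P(C|s1); c2 = 1 - c1
p_f1_given_bc = {("b1", "c1"): 0.75, ("b1", "c2"): 0.10,
                 ("b2", "c1"): 0.5,  ("b2", "c2"): 0.05}

p_f1_s1 = sum(p_b_given_s1[b] * p_c_given_s1[c] * p_f1_given_bc[(b, c)]
              for b in p_b_given_s1 for c in p_c_given_s1)
print(round(p_f1_s1, 4))  # 0.064
```

So even for a smoker, fatigue caused through bronchitis or cancer has probability of only about 6.4% under these (illustrative) numbers.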
Popular tools for symbolic knowledge representation
Some guidelines for knowledge representation
• Rules
  – Pros: intuitive, efficient, and good for heuristic knowledge
  – Cons: not appropriate for some types of reasoning (e.g., satisfiability)
• Taxonomies
  – Pros: intuitive and useful to describe the concepts and relations of a domain
  – Cons: limited reasoning capabilities
• Logic
  – Pros: very expressive, with formal semantics
  – Cons: not appropriate for approximate or non-monotonic reasoning
• Constraints
  – Pros: concise, with powerful inference engines
  – Cons: constraint satisfaction problems are NP-complete
• Representations for approximate reasoning
  – Pros: useful for uncertainty (Bayesian networks) or vagueness (fuzzy logic)
  – Cons: not appropriate for other types of reasoning
Conclusions
• Good news:
– We have multiple methods to represent knowledge for building
intelligent systems
• Bad news:
– We don't have a universal method of knowledge representation that is useful for all kinds of problems
Summary of knowledge representation
Lecture slides of the master's course "Intelligent Systems".
© 2021 Martin Molina