
First Order Logic

Predicates, Forall, & Exists
∃ vs ∀
The Proof Game
Negations
Reals
Games
Classifying Functions
Computable & Uncomputable
Time Complexity
Church's Thesis

Jeff Edmonds
Lecture 0, York University, Many Courses
Understand Quantifiers!!!
Say, I have a game for you.
We will each choose an integer.
You win if yours is bigger.
I am so nice, I will even let you go first.

Easy.
I choose a trillion trillion.

Well done. That is big!

But I choose a trillion trillion and one


so I win.
Understand Quantifiers!!!
You laugh but this is
a very important game
in theoretical computer science.

You choose the size of your Java program.


Then I choose the size of the input.
Likely |I| >> |J|
So you better be sure your Java program
can handle such long inputs.
Understand Quantifiers!!!
In first order logic we can state
that I win the game:
∀x, ∃y, y > x

The proof:
Let x be an arbitrary integer.
Let y = x+1.
Note y = x+1 > x.

Good game.
Let me try again. I will win this time!
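To make the game concrete, here is a minimal Python sketch (not from the slides) of the prover's winning strategy for ∀x, ∃y, y > x: whatever x the opponent names, answer y = x+1.

# Hypothetical sketch of the proof game for:  forall x, exists y, y > x.
# The disprover picks x first; the prover answers with y = x + 1.

def prover_strategy(x: int) -> int:
    """Prover's move: given the opponent's x, return a y with y > x."""
    return x + 1

def play_round(x: int) -> bool:
    """One round of the game; True means the prover (the statement) wins."""
    y = prover_strategy(x)
    return y > x

# No matter how large the disprover's number is, the prover wins.
for x in [0, -7, 10**24]:          # a trillion trillion = 10**24
    assert play_round(x)
print("Prover wins every round, so  forall x, exists y, y > x  holds.")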
Understand Quantifiers!!!
Loves(b,g)
[Figure: boys Sam, Bob, John, Fred and girls Mary, Beth, Marilyn Monroe (MM), Ann, with arrows showing who loves whom.]

A relation/predicate
• returns True/False
• depending on whether objects
b and g have the relation Loves.
Loves(Sam,Mary) = False
Loves(Sam,Beth) = True
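The predicate can be modelled as a boolean function over a small finite world. A minimal Python sketch follows; the particular set of arrows is an assumption, chosen only so that it matches the truth values the slides state.

# A toy model of the Loves(b, g) predicate from the slides.
# The arrows below are an assumption, chosen to match the truth values
# the slides give (e.g. Loves(Sam, Mary) = False, Loves(Sam, Beth) = True).

boys  = ["Sam", "Bob", "John", "Fred"]
girls = ["Mary", "Beth", "MM", "Ann"]        # MM = Marilyn Monroe

loves_pairs = {(b, "MM") for b in boys} | {("Sam", "Beth")}

def loves(b: str, g: str) -> bool:
    """The predicate: returns True/False depending on whether b loves g."""
    return (b, g) in loves_pairs

print(loves("Sam", "Mary"))   # False
print(loves("Sam", "Beth"))   # True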
Understand Quantifiers!!!
Loves(b,g)

“There exists a boy that loves Mary”
∃b, Loves(b, Mary)   False
∃b, Loves(b, Beth)   True

“Every boy loves Beth”
∀b, Loves(b, Beth)   False
∀b, Loves(b, MM)     True
Understand Quantifiers!!!
In all three examples,
every boy loves a girl.
The difference?

[Figure 1: One girl — the same girl is loved by every boy.]
[Figure 2: Could be a separate girl — each boy loves some girl, possibly a different one.]
[Figure 3: His special woman — each boy loves his own particular girl.]
Understand Quantifiers!!!
“There is a girl whom every boy loves.”
“There is an inverse (eg 1/3) for all reals.”   Not clear.
“For each boy, there is a girl.”
“For each person, there is God.”

[The same three figures: One girl / Could be a separate girl / His special woman.]
Understand Quantifiers!!!
∃g, ∀b, Loves(b, g)
[One girl.]

∀b, ∃g, Loves(b, g)
[Could be a separate girl.]
[His special woman.]
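The two orderings really do evaluate differently. A small Python sketch, in an assumed "his special woman" world where each boy loves exactly one girl of his own, shows ∀b ∃g true while ∃g ∀b is false.

# Order of quantifiers matters.  In this assumed "his special woman" world
# each boy loves exactly one girl of his own, so:
#   forall b, exists g, Loves(b,g)  is True
#   exists g, forall b, Loves(b,g)  is False

boys  = ["Sam", "Bob", "John", "Fred"]
girls = ["Mary", "Beth", "MM", "Ann"]
loves_pairs = {("Sam", "Beth"), ("Bob", "MM"), ("John", "Mary"), ("Fred", "Ann")}
loves = lambda b, g: (b, g) in loves_pairs

forall_b_exists_g = all(any(loves(b, g) for g in girls) for b in boys)
exists_g_forall_b = any(all(loves(b, g) for b in boys) for g in girls)

print(forall_b_exists_g)   # True  -- every boy loves some girl
print(exists_g_forall_b)   # False -- no single girl is loved by all boys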
Understand Quantifiers!!!
$g, "b, Loves(b, g, date) Sam Mary

 Property (date) 
Bob Beth
Marilin
John
Monro
Let’s understand this deeply One girl
Fred Ann

Which variable is this statement about?:


– boy b
– girl g
– time date

Here the date is considered free variable


because it is not bounded by a $ or a "
quantifier.
Hence, the sentence
= true says something
  about date.
= false
Understand Quantifiers!!!
$g,["b, Loves(b, g)] Sam Mary

   $ g , Property (g) 
Bob Beth
Marilin
John
Monro
Let’s understand this deeply One girl
Fred Ann

Which variable is this statement about?:


– boy b
– girl g
There are no free variables and hence depending
on the world, this is either true or false.

$ g is the left most quantifier. Hence,


this statement is “about” the collection of girls,
i.e. the existence of at least one with some property.
Understand Quantifiers!!!
"b, Loves(b, g) Sam Mary

  bProperty(g)
         ,     Property g    ( b 
Bob Beth
Marilin
John
Monro
Let’s understand this deeply One girl
Fred Ann

Which variable is this statement about?:


– boy b
– girl g
Girl g would have been a free variable,
but g denotes a specific girl like MM.

"b is the left most quantifier. Hence,


this statement is “about” the collection of boys,
i.e. that all of them have some property.
Understand Quantifiers!!!
$g, "b, Loves(b, g) Sam
Bob
Mary
Beth
Marilin
John
Monro
Fred Ann
One girl
Build understanding by building the statement backwards.
What does each subsequence say about its free variables.
(with little emphasis on the bound variables.)

Loves(b, g) “boy b loves girl g”.  


"b, Loves(b, g) “girl g is loved by all boys”.  
$ g, "b, Loves(b, g) “there is girl that is loved by all boys”.  
Understand Quantifiers!!!
∃g, ∀b, Loves(b, g)
∀b, ∃g, Loves(b, g)
[His special woman.]

Build understanding by building the statement backwards.
What does each subsequence say about its free variables?
(with little emphasis on the bound variables.)

Loves(b, g)            “boy b loves girl g”.
∀b, Loves(b, g)        “girl g is loved by all boys”.
∃g, ∀b, Loves(b, g)    “there is a girl that is loved by all boys”.

∃g, Loves(b, g)        “boy b loves some girl”.
∀b, ∃g, Loves(b, g)    “every boy loves some girl”.
Understand Quantifiers!!!
Proof: "b, $g, Loves(b, g)
I must insist that you prove this using
the following game.
Two players:
a prover and a disprover.
Understand Quantifiers!!!
Proof: "b, $g, Loves(b, g)
They read the statement left to right.

I produce the object when it is a $.


I produce the object when it is a ".
Sam Mary
I can always win
if and only if Bob Beth
statement is true. Marilin
John
Monro
Fred Ann
Understand Quantifiers!!!
Proof: "b, $g, Loves(b, g) true
No! I produce b=Bob.

Knowing you chose Bob,


I produce g = Marilin Monro
Bob loves MM so I win.
Sam Mary
b=John!
Bob Beth
g=Mary
Marilin
and I still win. John
Monro
Fred Ann
Understand Quantifiers!!!
Proof: "b, $g, Loves(b, g) true
$g, "b, Loves(b, g) false
The order the players go
REALY matters.
Ha ha
You need to go first now! Sam Mary

I produce g = MM Bob Beth

Nope. b=John Marilin


John
Monro
does not love her
Fred Ann
Understand Quantifiers!!!
Proof: "b, $g, Loves(b, g) true
$g, "b, Loves(b, g) false
The order the players go
REALY matters.

I could only win


Sam Mary
this game if there
was a single girl Bob Beth
loved by every boy.
Marilin
John
Yup Monro
Fred Ann
Understand Quantifiers!!!
Proof: "b, $g, Loves(b, g) true
$g, "b, Loves(b, g) false true
Good. Now I produce g=MM.

Knowing MM, I produce boy b.


Sam Mary
I prove b loves MM.
Bob Beth
Marilin
I can always win. John
Monro
Hence, statement is true. Fred Ann
Understand Quantifiers!!!
Prove: A ⇒ B
where A is ∀b, ∃g, Loves(b, g).
Assume A and prove B.
If A is true, then there is a strategy for the game,
i.e. for any b requested,
the strategy responds with a g
such that Loves(b, g) is true.
Understand Quantifiers!!!
Negations: ¬[ ∃g, ∀b, Loves(b, g) ]
= ∀g, ¬[ ∀b, Loves(b, g) ]
= ∀g, ∃b, ¬[ Loves(b, g) ]
= ∀g, ∃b, ¬Loves(b, g)

The ¬ moves right;
each ∃ and ∀ it passes gets flipped.
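The negation rule can be checked by brute force over a finite world. A minimal Python sketch (the tiny two-boy, two-girl world is an assumption) tries every possible "loves" relation and confirms the two sides always agree.

# Checking  not(exists g, forall b, L(b,g))  ==  forall g, exists b, not L(b,g)
# by brute force over every possible "loves" relation on a tiny assumed world.

from itertools import product

boys, girls = ["Sam", "Bob"], ["Mary", "Beth"]
pairs = [(b, g) for b in boys for g in girls]

for bits in product([False, True], repeat=len(pairs)):      # every possible world
    world = dict(zip(pairs, bits))
    loves = lambda b, g: world[(b, g)]
    lhs = not any(all(loves(b, g) for b in boys) for g in girls)
    rhs = all(any(not loves(b, g) for b in boys) for g in girls)
    assert lhs == rhs
print("not(exists-forall) and forall-exists-not agree in every world.")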
Reals
Build understanding by building the statement backwards.
What does each subsequence say about its free variables?
(with little emphasis on the bound variables.)

∀x, ∃y, x+y=0
∃y, ∀x, x+y=0
∃y, ∀x, x+y=x

x+y=0              “y is the additive inverse of x”.
∃y, x+y=0          “x has an additive inverse”.
∀x, ∃y, x+y=0      “Every real has an additive inverse”.

x+y=0              “x is the additive inverse of y”.
∀x, x+y=0          “Every real is an additive inverse of y”.
∃y, ∀x, x+y=0      “Some real has every real as an inverse”.

x+y=x              “Adding y does not change x”.
∀x, x+y=x          “Adding y makes no changes”.
∃y, ∀x, x+y=x      “There is an additive zero”.
Reals
Proof: "x, $y, x+y=0 true (additive inverse)

Let x be an arbitrary real number.

Let y = -x.

The relation is true.


x+y = x + (-x) = 0

I can always win.


Hence, the statement is true.
“Every real number has an additive inverse.”
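The prover's whole strategy is the one line y = -x. A Python sketch can only spot-check finitely many of the disprover's reals (which is exactly why the slide argues symbolically), but it shows what the strategy looks like as code.

# The prover's strategy for  forall x, exists y, x + y = 0  is y = -x.
# A program can only spot-check finitely many x's (hence the symbolic proof).

import random

def prover_strategy(x: float) -> float:
    return -x                          # the additive inverse

for _ in range(1000):
    x = random.uniform(-1e9, 1e9)      # the disprover's arbitrary real
    y = prover_strategy(x)
    assert x + y == 0
print("Strategy y = -x wins every sampled round.")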
Reals
Proof: "x, $y, x+y=0 true (additive inverse)
$y, "x, x+y=0

The order the players go


REALY matters.
Reals
Proof: "x, $y, x+y=0 true (additive inverse)
$y, "x, x+y=0 false
Let y = ??? aaah

Let x=-y+1
The relation is false.
x+y = (-y+1) + y ≠ 0

I can always win.


Hence, the statement is false.
Reals
Proof: "x, $y, x+y=0 true (additive inverse)
$y, "x, x+y=0 false

Again, I could only


win this game if there
was a single y that
works for every x.

Yup
Reals
Proof: "x, $y, x+y=0 true (additive inverse)
$y, "x, x+y=0 false
$y, "x, x+y=x true (additive zero)
Let y = 0.

Then x+y=x is true


for all y.

Many students answer this.


But NO!
I insist that you play the game.
Reals
Proof: "x, $y, x+y=0 true (additive inverse)
$y, "x, x+y=0 false
$y, "x, x+y=x true (additive zero)
Let y = 0.
Let x be an arbitrary real number.

The relation is true.


x+y = x + (0) = x

See there is my single y that


works for all x.
Reals
"x, $y, x×y=1 Build understanding by building the
statement backwards.
$x,"y, x×y  1 What does each subsequence say
about its free variables.
(with little emphasis on the bound
2 ½ variables.)
x×y=1 “y is the multiplicative inverse of x”.  
$ y, x×y=1 “x has an multiplicative inverse”.  
"x, $ y, x×y=1 “Every real has an multiplicative inverse”.  
What about zero.

x×y  1 “y is not the multiplicative inverse of x”.  


"y, x×y  1 “Everything fails to be x’s mult inverse”.
  “x’s does not have a mult inverse”.
$x,"y, x×y  1 “There is a real without x’s a mult inverse”.
Namely zero.
Reals

“Every real has a multiplicative inverse       ∀x, property(x)
except for zero.”

“x has a multiplicative inverse                property(x)
or x is zero.”                                 = (∃y, x×y=1) or (x=0)
True for all reals.

(∃y, x=0)?
= ∃y, x=0 or x×y=1
Reals
$a, "x, $y, x=a or x×y=1

“Every real has an additive inverse "x,   $ y, x=0 or x×y=1


except for zero.”
with one exception.”
Reals
Proof: "x, $y, x×y=1 (multiplicative inverse)

Let x be an arbitrary real number.

Let y = 1/x.

The relation is true.


x×y = x × 1/x = 1

I can always win.


Hence, the statement is true.
Wait a minute, the statement is false!
Reals
Proof: Ø[ "x, $y, x×y=1 ] = $x, "y, x×y  1
Let x=0.

Let y be an arbitrary real number.

The negated relation is true.


x×y = 0×y = 0 ≠ 1

I can always win.


Hence, the negated statement is true
and the original statement is false.
Reals
Proof: $a, "x, $y, x=a or x×y=1
How about
“Every real number has a
multiplicative inverse
except for zero”
Reals
Proof: $a, "x, $y, x=a or x×y=1
Let a=0.

Let x be an arbitrary real number.

If x= 0, let y=5.
Else let y= 1/x . I can
always win.
x=a or x×y=1 is always Hence,
true. the statement
If x= 0, then x=a. is true.
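The prover's two-stage strategy can also be written down directly. A minimal Python sketch follows (the sampled x values are an assumption; y=5 is the slide's throwaway answer for the exceptional x=0).

# Prover strategy for  exists a, forall x, exists y, (x = a or x*y = 1):
# choose a = 0 up front; then, after seeing x, answer y = 1/x (or anything if x = 0).

def prover_a() -> float:
    return 0.0                         # the one allowed exception

def prover_y(x: float) -> float:
    return 5.0 if x == 0 else 1.0 / x  # y = 5 is an arbitrary throwaway value

for x in [2.0, -0.5, 4.0, 0.0]:        # the disprover's choices (sampled)
    a, y = prover_a(), prover_y(x)
    assert x == a or x * y == 1
print("x = a or x*y = 1 holds for every tested x.")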
Playing Game
Who wins?
[Game tree: White moves, then Black moves; leaves are labeled "White wins" or "Black wins".]
Playing Game
Who wins?
[Game tree: White's first move M1w, then Black's reply M1b, leading to "White wins" / "Black wins" leaves.]

Black has a
winning strategy:
∀M1w, ∃M1b, Black-Wins(M1w, M1b)
Playing Game
Who wins?
[Another game tree, with moves M1w and M1b.]

White has a
winning strategy:
∃M1w, ∀M1b, White-Wins(M1w, M1b)

For a longer game:
∃M1w, ∀M1b, ∃M2w, ∀M2b, ∃M3w, ∀M3b, White-Wins(...)
Classifying Functions

f(n) = θ(g(n))

∃c1, c2, ∃n0, ∀n ≥ n0,  c1·g(n) ≤ f(n) ≤ c2·g(n)

f(n) is sandwiched between c1·g(n) and c2·g(n)
for some sufficiently small c1 (e.g. 0.0001)
Classifying Functions

f(n) = θ(g(n))

∃c1, c2, ∃n0, ∀n ≥ n0,  c1·g(n) ≤ f(n) ≤ c2·g(n)

For all sufficiently large n.
For some definition of “sufficiently large”.
Classifying Functions
Proof: ∃c, n0, ∀n ≥ n0,  8n² + 1000n ≤ c·n²    true
Let c=9 & n0=1000.
Let n be an arbitrary real number
≥ 1000.
The relation is true:
8n² + 1000n ≤ 8n² + n² = 9n² = c·n²

See there is my single c that


works for all sufficiently large n.
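The witnesses c=9 and n0=1000 can be spot-checked numerically; a program can only test finitely many n, and the slide's algebra covers the rest. A minimal sketch:

# Spot-checking the prover's witnesses c = 9 and n0 = 1000 for
#   exists c, n0, forall n >= n0,  8n^2 + 1000n <= c*n^2.

c, n0 = 9, 1000
for n in [1000, 1001, 10_000, 10**6, 10**9]:
    assert 8*n**2 + 1000*n <= c * n**2
print("8n^2 + 1000n <= 9n^2 holds at every sampled n >= 1000.")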
Classifying Functions
Proof: ∃c, n0, ∀n ≥ n0,  8n² + 1000n ≤ c·n²    true
       ∃c, n0, ∀n ≥ n0,  2^n ≤ c·n              false
Let c=10000000 & n0=1.
Let n=c.

The relation is false:
2^n > n·n = c·n
Classifying Functions
Proof: ∃c, n0, ∀n ≥ n0,  8n² + 1000n ≤ c·n²    true
       ∃c, n0, ∀n ≥ n0,  2^n ≤ c·n              false
       ∀n, ∃c, 2^n ≤ c·n

The order the players go
REALLY matters.
Classifying Functions
Proof: ∃c, n0, ∀n ≥ n0,  8n² + 1000n ≤ c·n²    true
       ∃c, n0, ∀n ≥ n0,  2^n ≤ c·n              false
       ∀n, ∃c, 2^n ≤ c·n                        true
Let n=10000000.

Let c=2^n.
The relation is true:
2^n ≤ 2^n·n = c·n
Classifying Functions
Theta f(n) = θ(g(n)) f(n) ≈ c g(n)
BigOh f(n) = O(g(n)) f(n) ≤ c g(n)
Omega f(n) = Ω(g(n)) f(n) ≥ c g(n)
Little Oh f(n) = o(g(n)) f(n) << c g(n)
Little Omega f(n) = ω(g(n)) f(n) >> c g(n)

Giving an idea of how fast a function grows


without going into too much detail.
Classifying Functions

3n² + 7n + 8 = n^θ(1)
Polynomial time

∃c1, c2, n0, ∀n ≥ n0,  n^c1 ≤ f(n) ≤ n^c2


Classifying Functions

3^n · n = 2^[ log(3)·n + log n ]
        = 2^θ(n)
Exponential time

∃c1, c2, n0, ∀n ≥ n0,  2^(c1·n) ≤ f(n) ≤ 2^(c2·n)


Computable Problem
A Computational Problem P states
• for each possible input I
• what the required output P(I) is.
Eg: Sorting

An Algorithm/Program/Machine M is
• a set of instructions
(described by a finite string “M”)
• on a given input I
• follows the instructions and
• produces output M(I)
• or runs forever.
Eg: Insertion Sort
Computable Problem
Problem P is
computable if
$M, "I, M(I)=P(I)
$M,“Machine/Algorithm M  
computes problem P.”
"I, “Machine/Algorithm M  
gives the answer required
by problem P on input I.”
M(I)=P(I)
Computable Problem
Problem P is
computable if
$M, "I, M(I)=P(I)
I have a machine M that I
claim works.
Oh yeah, I have an input I for
which it does not.
I win if M on input I gives
the correct output

What we have been doing all along.


Computable Problem
Problem P is
computable if
$M, "I, M(I)=P(I)
$M
I come up with this machine M
at compile time.
Without knowing the input!
It needs a finite description.
• Finite set of instructions
• Finite set of variables
• With finite ranges of values.
Computable Problem
Problem P is
computable if
$M, "I, M(I)=P(I)
"I
I choose the input I at run time
after I already know the machine M.
My input can be much bigger
than the machine.
Its computation may require
• a lot more memory to be allocated
dynamically.
• a lot more time.
Computable Problem
Problem P is
computable if
$M, "I, M(I)=P(I)

Here P is considered
free because it is not
bound by a quantifier.
Hence, the sentence says
something about P.
Computable Problem
Problem P is                  Problem P is
computable if                 uncomputable if
∃M, ∀I, M(I)=P(I)             ∀M, ∃I, M(I) ≠ P(I)
I have a machine M that I
claim works.
I find one counter example
input I for which
his machine M fails us.

I win if M on input I gives


the wrong output
Generally very hard to do.
Computable Problem
Problem P is                        Problem P is
computable if     true              uncomputable if     true
∃M, ∀I, M(I)=Sorting(I)             ∀M, ∃I, M(I) ≠ Halting(I)
∀I, ∃M, M(I) = Halting(I)
The order the players go
REALLY matters.

If you don’t know if it is


true or not, trust the
game.
Computable Problem
Problem P is                        Problem P is
computable if     true              uncomputable if     true
∃M, ∀I, M(I)=Sorting(I)             ∀M, ∃I, M(I) ≠ Halting(I)
∀I, ∃M, M(I) = Halting(I)    true
A tricky one.
I give you an input I.
Given I, either
Halting(I) = yes or
Halting(I) = no.
∀I, Myes(I) says yes.
∀I, Mno(I) says no.
I don't know which, but
one of these does the trick.
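The trick is easiest to see as code. A minimal Python sketch (not from the slides): for any fixed input I, one of two constant machines agrees with Halting on I; the statement only asks that such a machine exist, not that we can compute which one it is.

# The trick behind  forall I, exists M, M(I) = Halting(I):
# for each *fixed* input I, one of these two constant machines is correct.

def M_yes(I) -> str:            # ignores its input, always answers "halts"
    return "halts"

def M_no(I) -> str:             # ignores its input, always answers "does not halt"
    return "does not halt"

# For any particular I, Halting(I) is either "halts" or "does not halt",
# so M_yes or M_no (we don't know which) agrees with Halting on that I.
candidate_machines = [M_yes, M_no]
print([m("some fixed input I") for m in candidate_machines])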
Time Complexity
• Problem P is computable in n² time.
  ∃M, ∀I, M(I)=P(I) & Time(M,I) ≤ |I|²
• Problem P is not computable in n² time.
  ∀M, [If M solves P] ⇒ [M takes > n² time]
  ∀M, ∀I, [M(I)=P(I)] ⇒ [Time(M,I) > |I|²]
  ∀M, [∀I, M(I)=P(I)] ⇒ [∀I, Time(M,I) > |I|²]
No
Time Complexity
• Problem P is computable in n² time.
  ∃M, ∀I, M(I)=P(I) & Time(M,I) ≤ |I|²
• Problem P is not computable in n² time.
  Just negate the above statement!
  ∀M, ∃I, M(I)≠P(I) or Time(M,I) > |I|²
Let M be any machine.
I find one counterexample input I
for which his machine M fails us:
either because it gives the wrong
answer or takes too much time.
Computable with Fixed Resources

Computable mean
that some algorithm
computes it.
Computable with Fixed Resources
Note that the number of lines
in a printout of the code
does not actually depend on
the input.
Computable with Fixed Resources

These are the types of things


we will be saying with our
first order logic statements.
Halting problem ≤poly Math Truth

“TM M halts on input I”
= ∃C, “C is an integer encoding
  a valid halting computation
  for TM M on input I”
= ∀ “time t”, “a legal TM
  M step is taken”

Harder; for 4111.
Time Complexity
• Problem P is computable in polynomial time.
  ∃M, c, n0, ∀I,  M(I)=P(I) & (|I| < n0 or Time(M,I) ≤ |I|^c)
• Problem P is not computable in polynomial time.
  ∀M, c, n0, ∃I,  M(I)≠P(I) or (|I| ≥ n0 & Time(M,I) > |I|^c)
• Problem P is computable in exponential time.
  ∃M, c, n0, ∀I,  M(I)=P(I) & (|I| < n0 or Time(M,I) ≤ 2^(c·|I|))
• The computational class “Exponential Time”
  is strictly bigger than the computational
  class “Polynomial Time”.
  ∃P, [∀M, c, n0, ∃I,  M(I)≠P(I) or (|I| ≥ n0 & Time(M,I) > |I|^c)]
    & [∃M, c, n0, ∀I,  M(I)=P(I) & (|I| < n0 or Time(M,I) ≤ 2^(c·|I|))]
Church’s Thesis
A computational problem is computable
• by a Java Program
• by a Turing Machine

If a boy can do it, a girl can do it better.


If a Java program can do it,
a Turing Machine can simulate it.
Church’s Thesis
A computational problem is computable
• by a Java Program
• by a Turing Machine

Why are these the


same?
Church’s Thesis

Why are these the


same?
Proof
[A sequence of figure-only proof slides.]
End
Understand Quantifiers!!!
Proof: $g, "b, Loves(b, g)
I produce girl MM.

Let b be an arbitrary boy.

I give an “alg” Sam Mary


with input b,
and output Bob Beth
a proof that Marilin
John
Loves(b,MM) Monro
Fred Ann
Understand Quantifiers!!!
Proof: "b, $g, Loves(b, g)
Let b be an arbitrary boy.

I give an “alg”
with input b,
and output an g
Sam Mary
and a proof that
Loves(b, g). Bob Beth

You give Sam, I give Beth. Marilin


John
You give Bob, I give MM. Monro
… Fred Ann
Understand Quantifiers!!!
Proof: "b, Loves(b, MM)
I don’t like proving universal (")
statements by contradiction.

By way of Instead play the game.


contradiction assume
$b, ØLoves(b, MM) Sam Mary
Let b’ be the object
Bob Beth
that is assumed to
exist. Marilin
John
Prove Loves(b’,MM) Monro
Contradiction. Fred Ann
Understand Quantifiers!!!
Negations: "bBoys, Loves(g)
What is the negation of this?

$bBoys, ØLoves(b, g)

No, we are still talking about

properties of the b
in the set Boys.
Understand Quantifiers!!!
Negations: "x $yx, y=2x
What is the negation of this?
$x "y<x, y2x
$x "yx, y2x
No, we are still talking about

properties of the y
that are bigger than x.
Understand Quantifiers!!!
"x, $y, x×y=1
"x>0, $y, x×y=1

How about
“Every real number has a
multiplicative inverse
except for zero”
This does correctly express it.
But changing the range of x,
is more like 2nd order logic.
Understand Quantifiers!!!
"x, $y, x×y=1
"x, [x0] and [$y, x×y=1]

How about
“Every real number has a
multiplicative inverse
except for zero”

If you say "x,


then is x=0 a possibility.
Understand Quantifiers!!!
"x, $y, x×y=1
"x, [x=0] or [$y, x×y=1]

How about
“Every real number has a
multiplicative inverse
except for zero”

We need this statement true


for every value of x!
Understand Quantifiers!!!
"x, $y, x×y=1
"x, [x=0] or [$y, x×y=1]
"x, $y, x=0 or x×y=1

How about
“Every real number has a
multiplicative inverse
except for zero”

Usually we prefer the quantifiers


to be on the outside.
Understand Quantifiers!!!
"x, $y, x×y=1
"x, [x=0] or [$y, x×y=1]
"x, $y, x=0 or x×y=1
$a, "x, $y, x=a or x×y=1

How about
“Every real number has a
multiplicative inverse
except for zero”
Understand Quantifiers!!!
"b, Loves(b, g)
“Every boy loves g”

b vs g?
b is a bound variable. g is a free variable.
(appears in  or ) (does not appear in  or )
The statement is about The statement is about g.
the set of all boys. Free variable
x=5 Object
loop i = 1..9 Bound variable
++x
end loop
Understand Quantifiers!!!
"b, Loves(b, g)

g is a free variable.
(does not appear in  or )
A relation/predicate
• returns True/False The statement is about g.
• depending whether object
g has the property VeryLoved.
VeryLoved(g) = "b, Loves(b, g)
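A formula with one free variable defines a new predicate of that variable. A minimal Python sketch, reusing the same assumed world as earlier (every boy loves MM, Sam also loves Beth):

# A formula with free variable g defines a new predicate of g:
#   VeryLoved(g)  =  forall b, Loves(b, g).

boys = ["Sam", "Bob", "John", "Fred"]
loves_pairs = {(b, "MM") for b in boys} | {("Sam", "Beth")}
loves = lambda b, g: (b, g) in loves_pairs

def very_loved(g: str) -> bool:
    return all(loves(b, g) for b in boys)

print(very_loved("MM"))     # True
print(very_loved("Beth"))   # False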
Classifying Functions

3n² + 7n + 8 = θ(n²)
True

∃c1, c2, ∃n0, ∀n ≥ n0,  c1·g(n) ≤ f(n) ≤ c2·g(n)

c1=3, c2=4, n0=8:   ∀n ≥ 8,  3·n² ≤ 3n² + 7n + 8 ≤ 4·n²
because 7n + 8 ≤ 1·n²
i.e.    7 + 8/n ≤ 1·n   (true for n ≥ 8)
Classifying Functions

3n² + 7n + 8 = θ(n²)

∃c1, c2, ∃n0, ∀n ≥ n0,  c1·g(n) ≤ f(n) ≤ c2·g(n)
Classifying Functions

3n² + 7n + 8 = θ(n)

The reverse statement:

∃c1, c2, ∃n0, ∀n ≥ n0,  c1·g(n) ≤ f(n) ≤ c2·g(n)
∀c1, c2, ∀n0, ∃n ≥ n0,  c1·g(n) > f(n) or f(n) > c2·g(n)
A ⇒ B                                    Implications
• If A is true then B is true.
• This may be because A causes B.
• The contrapositive is ¬B ⇒ ¬A,
  because if B is not true, then A can't be true,
  because otherwise B would be true.
  Hence B may cause A.
• Or C may cause both A and B.
• Or maybe cause and effect is not involved at all.
• A ⇒ B formally means ¬(A and ¬B).
• Bringing the negation in gives ¬A or B.
  If A is false, then the statement A ⇒ B follows.
  If B is true, then A ⇒ B again follows.
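Because there are only four cases, this equivalence can be checked exhaustively. A minimal Python truth-table sketch:

# Truth-table check that  A => B,  not(A and not B),  and  (not A) or B
# are the same connective.

for A in (False, True):
    for B in (False, True):
        implies   = (not A) or B
        as_negand = not (A and (not B))
        assert implies == as_negand
        print(f"A={A!s:5}  B={B!s:5}  A=>B is {implies}")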
Proof of A ⇒ B                           Implications
• Officially two cases:
  • If A is false, then A ⇒ B is trivially true.
    (We usually ignore this case.)
  • If A is true, one must prove B.
• Indenting: pushing and popping a stack.
• A proof consists of a sequence of statements
that follow from the initial assumptions.
• When we make the new assumption A,
  we indent the lines for the duration
  of this new assumption.
• Then after B is proved,
  this indenting “stack” is popped
  with the conclusion that A ⇒ B.
Proof of A ⇒ B                           Implications
• Goal is to prove A ⇒ B.
• Assume that A is true.
• Goal is to prove B.
• ... proof of B.
• Hence B is true.
• Hence A ⇒ B is true.
Implications
Suppose you assume this is true:
S1: ∀x, P(x)
Then you know that P(x’) is true
for your favorite value x′.

Suppose you assume this is true:


S2: ∃y, Q(y)
Then you can ask for a y’
for which Q(y’) is true.
Implications
Suppose you assume this is true:
S3: ∀x, ∃y, R(x,y)

You can use it as an oracle.


If you have an x’, it gives you a y’.
“Let y’ be the value assumed to exist from S3.”
By definition of y’, we know that
R(x’, y’) is true.
Implications
Quantifiers,
Do you understand them?
Express:
• Problem P is computable by some algorithm.
• Problem P is computable in time T(n).
• Problem P is computable in polynomial time.
• The computational class “Exponential Time"
is strictly bigger than the computational
class “Polynomial Time”.

“A(I)=P(I)"means algorithm A gives the required


output for problem P on instance I.
Time(A,I) is the running time of algorithm A
on instance I.
T(n) some function like n2.
The Time Complexity
The minimum time needed by an algorithm to solve it.
Upper Bound:
Problem P is computable in time Tupper(n)
if there is an algorithm A which
• outputs the correct answer
• in this much time
given any input instance I.

Eg: Sorting is computable in Tupper(n) = O(n²) time.


The Time Complexity
Upper Bound:
A(I)=P(I)
We need our algorithm A
to give the correct answer
to problem P
on input I
The Time Complexity
Upper Bound:
A(I)=P(I)        Time(A,I) ≤ n²
We need it to run
in say n² time.
The Time Complexity
Upper Bound:
A(I)=P(I)        Time(A,I) ≤ Tupper(|I|)
But what is n?
The Time Complexity
Upper Bound:
A(I)=P(I) and Time(A,I) ≤ Tupper(|I|)
Between these?
The Time Complexity
Upper Bound:
∃A,
A(I)=P(I) and Time(A,I) ≤ Tupper(|I|)
What do we want from our
algorithm A?
Where does it come from?

I want to prove the statement is true.


I want the algorithm
to be one that works well,
so I will come up with it.
∃A
The Time Complexity
Upper Bound:
∃A,
∀I, A(I)=P(I) and Time(A,I) ≤ Tupper(|I|)
What do we want from our
input I?
Where does it come from?

I want to prove the statement is false. I


want the input to be the worst,
so I will come up with it.
"I
The Time Complexity
Upper Bound:
∃A,
∀I, A(I)=P(I) and Time(A,I) ≤ Tupper(|I|)
Who goes first
and why?

I want one fixed algorithm


that works for every input,
so I should go first.
The Time Complexity
Upper Bound:
∃A,
∀I, A(I)=P(I) and Time(A,I) ≤ Tupper(|I|)
Who goes first
and why?

I want my input to be
the worst for his algorithm
so I should go second.
The Time Complexity
Upper Bound:
$A, "I, A(I)=P(I) and Time(A,I) £ Tupper(|I|)
What do we want from our
problem P?
Where does it come from?

We are making a statement about P.


It comes from our boss.
The Time Complexity
Upper Bound:
$A, "I, A(I)=P(I) and Time(A,I) £ Tupper(|I|)
How does
the proof game go?
The Time Complexity
Upper Bound:
$A, "I, A(I)=P(I) and Time(A,I) £ Tupper(|I|)
I have an algorithm A that I
claim works and is fast.
Oh yeah, I have an input I for
which it does not.
I win if A on input I gives
• the correct output
• in the allotted time.

What we have been doing all along.


The Time Complexity
The minimum time needed by an algorithm to solve it.
Lower Bound:
Time Tlower(n) is a lower bound for problem P
if no algorithm solves the problem faster.

There may be algorithms that give the correct answer
or run quickly on some input instances.
The Time Complexity
The minimum time needed by an algorithm to solve it.
Lower Bound:
Time Tlower(n) is a lower bound for problem P
if no algorithm solves the problem faster.

But for every algorithm, there is at least one


instance I for which either the algorithm gives
the wrong answer or it runs in too much time.

Eg: No algorithm can sort N values


in Tlower = sqrt(N) time.
The Time Complexity
The minimum time needed by an algorithm to solve it.

Upper Bound:
$A, "I, A(I)=P(I) and Time(A,I) £ Tupper(|I|)
Lower Bound:
"A, $I, A(I) ≠ P(I) or Time(A,I) ³ Tlower(|I|)

“There is"and “there isn’t a faster algorithm”


are almost negations of each other.
The Time Complexity
Lower Bound:
"A, $I, [ A(I)  P(I) or Time(A,I) ³ Tlower(|I|)]
How does
the proof game go?
The Time Complexity
Lower Bound:
"A, $I, [ A(I)  P(I) or Time(A,I) ³ Tlower(|I|)]
I have an algorithm A that I
claim works and is fast.
Oh yeah, I have an input I for
which it does not .
I win if A on input I gives
• the wrong output or
• runs slow.
The Time Complexity
Lower Bound:
"A, $I, [ A(I)  P(I) or Time(A,I) ³ Tlower(|I|)]
I have an algorithm A that I
claim works and is fast.
Lower bounds are very hard
to prove, because I must
consider every algorithm
no matter how strange.
The Time Complexity
• Problem P is computable in polynomial time.
  ∃A, ∀I, A(I)=P(I) & Time(A,I) ≤ |I|^c
  ∃c,
Great, but which c?

I want to prove the statement is true


so I will come up with it.
∃c

Wow. That’s not fair.


A bigger c can always
make the statement true.
The Time Complexity
• Problem P is computable in polynomial time.
  ∃A, c, ∀I, A(I)=P(I) & Time(A,I) ≤ |I|^c
  ∃c,
Who goes first
and why?

I will let him choose any c.


Even c=1000 if he wants.
But this fixed c must work for
all inputs.
So I should go second.
The Time Complexity
• Problem P is computable in polynomial time.
  ∃A, c, ∀I, A(I)=P(I) & Time(A,I) ≤ |I|^c

Who goes first


and why?

If he comes up with
a bigger c,
I will come up with
an even bigger I.
The Time Complexity
• Problem P is computable in polynomial time.
  ∃A, c, ∀I, A(I)=P(I) & Time(A,I) ≤ |I|^c

Time(A,I) ≤ |I|^c
does not apply for small I.
The Time Complexity
• Problem P is computable in polynomial time.
  ∃A, c, n0, ∀I, A(I)=P(I) & (|I| < n0 or Time(A,I) ≤ |I|^c)

What about n0?

Time(A,I) ≤ |I|^c
Does not apply for small I.
The Time Complexity
• Problem P is computable in polynomial time.
  ∃A, c, n0, ∀I,  A(I)=P(I) & (|I| < n0 or Time(A,I) ≤ |I|^c)
• Problem P is not computable in polynomial time.
  ∀A, c, n0, ∃I,  A(I)≠P(I) or (|I| ≥ n0 & Time(A,I) > |I|^c)
• Problem P is computable in exponential time.
  ∃A, c, n0, ∀I,  A(I)=P(I) & (|I| < n0 or Time(A,I) ≤ 2^(c·|I|))
• The computational class “Exponential Time”
  is strictly bigger than the computational
  class “Polynomial Time”.
  ∃P, [∀A, c, n0, ∃I,  A(I)≠P(I) or (|I| ≥ n0 & Time(A,I) > |I|^c)]
    & [∃A, c, n0, ∀I,  A(I)=P(I) & (|I| < n0 or Time(A,I) ≤ 2^(c·|I|))]
Fixed/Constant vs Arbitrary/Finite
• Given the needs of the problem at hand,
• a programmer can give her Java program
• as many lines of code
• as many variables and
• as big of a (finite) range for each variable
as she wants.
But these numbers are fixed/constant. Meaning:
• If K(J,I) is these numbers
program J has on input I,
then this function can depend on J,
but can’t depend on I (or on |I|)
• "computable problems P,
 a Java program J,  an integer k,
"inputs I,
K(J,I) = k
Fixed/Constant vs Arbitrary/Finite
• Given the needs of the problem at hand,
• a programmer can give her Java program
• as many lines of code
• as many variables and
• as big of a (finite) range for each variable
as she wants.
But these numbers are fixed/constant. Meaning:
• If K(J,I) is these numbers
program J has on input I,
then this function can depend on J,
but can't depend on I (or on |I|).

What if K(J,I) is the number of lines actually
used by J on I?  Then K(J,I) does depend on I.
Clearly J can't use more lines
than the number k that it actually has.

• ∀ computable problems P,
  ∃ a Java program J, ∃ an integer k,
  ∀ inputs I,
  K(J,I) = k
Fixed/Constant vs Arbitrary/Finite
Remember that we say that
f(n) = 5 + sin(n) = θ(1)
because, though f(n) is not actually constant wrt n,
it is bounded above by a constant.

But these numbers are fixed/constant. Meaning:


In a “bigOh” sort of way,
we might still say that
K(J,I) is a fixed/constant number.
• "computable problems P,
 a Java program J,  an integer k,
"inputs I,
K(J,I)  k
Fixed/Constant vs Arbitrary/Finite
I come up with a hard problem P

• "computable problems P,
 a Java program J,  an integer k,
"inputs I,
K(J,I) = k
Fixed/Constant vs Arbitrary/Finite
I write a Java program J
and give it some k lines of code.
I can use as MANY as I like.
1,000,000,000,000,000!

• "computable problems P,
 a Java program J,  an integer k,
"inputs I,
K(J,I) = k
Fixed/Constant vs Arbitrary/Finite
I write a Java program J
and give it some k lines of code.
I can use as MANY as I like.
1,000,000,000,000,000!

Wow. That’s not fair.


With more and more code,
you can memorize
more and more answers.

• "computable problems P,
 a Java program J,  an integer k,
"inputs I,
K(J,I) = k
Fixed/Constant vs Arbitrary/Finite
I write a Java program J
and give it some k lines of code.
I can use as MANY as I like.
1,000,000,000,000,000!

I will let him use any


number of lines he wants,
But this fixed J must work
for all inputs.

• "computable problems P,
 a Java program J,  an integer k,
"inputs I,
K(J,I) = k
Fixed/Constant vs Arbitrary/Finite
I write a Java program J
and give it some k lines of code.
I can use as MANY as I like.
1,000,000,000,000,000!

If he uses more lines of code,


I will give him a bigger input I.
Hee Hee Hee
J must still solve the problem.

• "computable problems P,
 a Java program J,  an integer k,
"inputs I,
K(J,I) = k
Fixed/Constant vs Arbitrary/Finite

A similar statement
that speaks of all J,
even if not associated with a P.
• "computable problems P, " Java programs J,
 a Java program J,  an integer k,  an integer k,
"inputs I, "inputs I,
K(J,I) = k K(J,I) = k
Fixed/Constant vs Arbitrary/Finite
• Given the input,
• a Java program
• can dynamically allocate as many bytes of memory
as it wants.
• This number, though still finite, is not constant. Meaning:
• If K(J,I) is the number of bytes
program J uses on input I,
then this function can depend on J,
and on I (hence on |I|).

Careful with infinity:
there is more than one
infinity, and some are
in fact smaller than others.

• ∀ computable problems P,
  ∃ a Java program J,
  ∀ inputs I,
  K(J,I) < ∞
Fixed/Constant vs Arbitrary/Finite
• Given the input,
• a Java program
• can dynamically allocate as many bytes of memory
as it wants.
• This number, though still finite, is not constant. Meaning:
• If K(J,I) is the number of bytes
program J uses on input I,
then this function can depend on J,
and on I (hence on |I|).

In contrast, when we say
J(I) = ∞ to mean J runs
forever, we are really using
∞ as a symbol and not really
as meaning it to be infinity.

• ∀ computable problems P,
  ∃ a Java program J,
  ∀ inputs I,
  K(J,I) < ∞
Fixed/Constant vs Arbitrary/Finite
• Given the input,
• a Java program
• can dynamically allocate as many bytes of memory
as it wants.
• This number, though still finite, is not constant. Meaning:
• If K(J,I) is the number of bytes
program J uses on input I,
then this function can depend on J,
and on I (hence on |I|).

• ∀ computable problems P,
  ∃ a Java program J,
  ∀ inputs I,
  ∃ an integer k,
  K(J,I) = k
Fixed/Constant vs Arbitrary/Finite
I come up with a hard problem P

• "computable problems P,
 a Java program J,
"inputs I,
 an integer k,
K(J,I) = k
Fixed/Constant vs Arbitrary/Finite
I write a Java program J.
I don’t need to specify how many
bytes of memory k it will allocate.

Oops.
I now need to come up with
the input I before knowing this k.

• "computable problems P,
 a Java program J,
"inputs I,
 an integer k,
K(J,I) = k
Fixed/Constant vs Arbitrary/Finite
No problem.
If he gives me a bigger input I,
I will use even more memory!

• "computable problems P,
 a Java program J,
"inputs I,
 an integer k,
K(J,I) = k
Fixed/Constant vs Arbitrary/Finite
No problem.
If he gives me a bigger input I,
I will use even more memory!

Good thing he cannot increase


the number of lines of code
in the same way or else
• it would take infinite space
to write down the
algorithm
• such an algorithm could
solve any problem.
Fixed/Constant vs Arbitrary/Finite
Homework
to do starting now

3.
If a boy can do it, a girl can do it better.
If a Java program can do it,
a Turing Machine can simulate it.
Understand Quantifiers!!!
[Figure: voters Sam, Bob, John, Fred and politicians (e.g. Harper, Layton).]

∃ politician p, ∀ voters v, Loves(v, p)      [One politician]

∀ voters v, ∃ politician p, Loves(v, p)      [Could be a different politician.]
Understand Quantifiers!!!
“There is a politician that is loved by everyone.”
This statement is “about” a politician:
the existence of a politician with some property,
the property that he is “loved by every voter”.
∃ politician p, [∀ voters v, Loves(v, p)]

“Every voter loves some politician.”
This statement is “about” voters:
something is true about every voter.
We claim that he “loves some politician.”
∀ voters v, [∃ politician p, Loves(v, p)]
