
Lecture 5: Streaming Lower Bounds
Technique: 2-Party Communication Complexity
• In all the lower bounds we’ve seen…
• Abstract this as a 2-party “game”
One-Way Communication Complexity

[Diagram: Alice, holding x, sends a single message to Bob, holding y; Bob outputs the answer.]

A protocol computes f with error δ if: for every input (x, y), the answer on (x, y) equals f(x, y) with probability ≥ 1 − δ.

Question: how many bits do we need to send?

Notation: R_δ→(f) = the minimum worst-case message length of a one-way protocol that computes f with error δ.
One-Way CC vs. Streaming Algorithms
To prove a streaming lower bound for a problem P:
• Choose a communication problem f that is hard for one-way protocols
• Let A be a streaming algorithm for P with space s
• Solve f using A: Alice runs A on her part of the stream and sends the state of A to Bob (s bits); Bob resumes A on his part and outputs the answer

[Diagram: Alice (x) → state of A → Bob (y) → answer]

⇒ s ≥ R_δ→(f)
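To make the reduction concrete, here is a minimal sketch in Python (the `alg_factory`/`update`/`state`/`restore`/`query` interface is a hypothetical stand-in for the given streaming algorithm A, not an API from the lecture):

```python
# Sketch: a streaming algorithm with s bits of memory yields a one-way
# protocol in which Alice's message is s bits long.

def one_way_protocol(alg_factory, alice_stream, bob_stream):
    # Alice runs A on her half of the stream...
    alg = alg_factory()
    for item in alice_stream:
        alg.update(item)
    message = alg.state()      # ...and sends its memory contents: s bits

    # Bob restores A from Alice's state, finishes the stream, and answers.
    alg2 = alg_factory()
    alg2.restore(message)
    for item in bob_stream:
        alg2.update(item)
    return alg2.query()
```

So any lower bound on the one-way communication of f is also a lower bound on the space of A.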
Index
• Alice gets x ∈ {0,1}^n
• Bob gets j ∈ {1, …, n}
• Alice sends one message; Bob outputs x_j

Claim: R_δ→(Index) = Ω(n) for every constant δ < 1/2

[Diagram: lower bounds derived from Index — Median, Graph Connectivity, Gap Hamming Distance (→ F0), and ℓ0-Sampling.]
Today
1. Reductions from Index (Median, Graph Connectivity, Gap Hamming Distance → F0)
2. Lower bound for Index
3. Lower bound for ℓ0-sampling
Three Reductions from Index
Lower Bound for Median
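One standard way to carry out the Index → Median reduction (a sketch; the exact encoding is an assumption, not necessarily the one used in the lecture):

```python
# Alice encodes bit x_i as the number 2i + x_i; Bob pads the stream so that
# the median of the combined stream is exactly 2j + x_j, whose parity is x_j.

def index_via_median(x, j):                # j is 1-indexed, x in {0,1}^n
    n = len(x)
    stream  = [2 * i + x[i - 1] for i in range(1, n + 1)]   # Alice's part
    stream += [0] * (n - j)                # Bob: values below everything
    stream += [2 * n + 2] * (j - 1)        # Bob: values above everything
    stream.sort()                          # stand-in for a streaming median algorithm
    median = stream[len(stream) // 2]      # rank n out of 2n - 1 elements
    return median % 2                      # equals x_j

assert index_via_median([1, 0, 1, 1], j=2) == 0
```

Running a space-s streaming median algorithm over this stream gives an s-bit protocol for Index, so s = Ω(n).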
Lower Bound for Graph Connectivity
• Outline: reduce from Index

[Diagram: Alice adds edges E_A, then Bob adds edges E_B; the algorithm must decide whether the resulting graph is connected.]
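One standard construction for this reduction (a sketch; the vertex and edge names are illustrative, not taken from the slide):

```python
# Vertices: u, w, and v_1..v_n. Alice's edges encode x; Bob's edges connect
# everything except v_j. The graph is connected iff (u, v_j) exists, i.e. x_j = 1.

def connectivity_instance(x, j):           # j is 1-indexed, x in {0,1}^n
    n = len(x)
    E_A = [("u", f"v{i}") for i in range(1, n + 1) if x[i - 1] == 1]
    E_B = [("u", "w")] + [(f"v{i}", "w") for i in range(1, n + 1) if i != j]
    return E_A, E_B                        # stream E_A first (Alice), then E_B (Bob)
```

Feeding E_A and then E_B to a space-s streaming connectivity algorithm gives an s-bit protocol for Index, so s = Ω(n).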
Lower Bound for GHD
• Input: x, y ∈ {0,1}^n
• Definition: GHD(x, y) = 1 if Δ(x, y) ≥ n/2 + √n and 0 if Δ(x, y) ≤ n/2 − √n, where Δ is Hamming distance (any answer is allowed in between)
• Claim: R_δ→(GHD) = Ω(n)
From Index to GHD
Reduction: given input (x, j) to Index, build an instance of GHD.

“Experiment”:
• Alice generates a uniform bit a
• Bob generates a uniform bit b, slightly correlated with a (the direction of the correlation depends on x_j)

To solve Index:
1. Generate n independent copies (a_1, b_1), …, (a_n, b_n) of the experiment
2. Set u = (a_1, …, a_n), v = (b_1, …, b_n)
3. Return GHD(u, v)
Generating Slightly-Correlated Bits
• Want: bits a (Alice), b (Bob) with Pr[a = b] = 1/2 + Θ(x_j/√n)  (interpret x as a vector in {−1, +1}^n)
• How does the angle θ between x and e_j behave when x_j = +1 / x_j = −1?

[Figure: the vectors x and e_j with angle θ between them.]
Example
x = (+1, +1, +1), e_j = (+1, 0, 0): θ is acute

Example
x = (−1, +1, +1), e_j = (+1, 0, 0): θ is obtuse

Example
x = (−1, −1, +1), e_j = (+1, 0, 0): θ is obtuse

Example
x = (+1, −1, +1), e_j = (+1, 0, 0): θ is acute
Generating Slightly-Correlated Bits
• Want: bits a (Alice), b (Bob) with Pr[a = b] = 1/2 + Θ(x_j/√n)
• Claim: for a uniformly random direction r, Pr[sign⟨x, r⟩ ≠ sign⟨e_j, r⟩] = θ/π, where θ is the angle between x and e_j
• Proof: recall that the hyperplane orthogonal to r separates x from e_j exactly when it falls inside the angle θ
• In our case: cos θ = ⟨x, e_j⟩ / (‖x‖·‖e_j‖) = x_j/√n
Generating Slightly-Correlated Bits
• Want: Pr[a = b] = 1/2 + Θ(x_j/√n)
• Claim: Pr[sign⟨x, r⟩ ≠ sign⟨e_j, r⟩] = θ/π
• Claim: θ = arccos(x_j/√n) = π/2 − Θ(x_j/√n), since

  arccos θ = π/2 − θ − θ³/6 − O(θ⁵)
Generating Slightly-Correlated Bits
• Want: Pr[a = b] = 1/2 + Θ(x_j/√n)
• Claim: Pr[sign⟨x, r⟩ ≠ sign⟨e_j, r⟩] = θ/π
• Claim: θ = arccos(x_j/√n) = π/2 − Θ(x_j/√n)
• So: Pr[a = b] = 1 − θ/π = 1/2 + Θ(x_j/√n)
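A quick Monte Carlo check of the two claims (a sketch; it samples r from the full sphere via the Gaussian trick, which induces the same sign distribution as sampling within the plane spanned by x and e_j):

```python
# Empirically: Pr[sign<x,r> != sign<e_j,r>] = theta/pi, theta = arccos(x_j/sqrt(n)),
# i.e. the two bits agree with probability 1/2 + Theta(1/sqrt(n)).
import numpy as np

n, j, trials = 100, 0, 200_000
rng = np.random.default_rng(0)
x = rng.choice([-1, 1], size=n)            # x in {-1,+1}^n

r = rng.standard_normal((trials, n))       # uniformly random directions
disagree = np.mean(np.sign(r @ x) != np.sign(r[:, j]))   # <e_j, r> = r_j

theta = np.arccos(x[j] / np.sqrt(n))
print(disagree, theta / np.pi)             # both ~ 1/2 - x_j/(pi*sqrt(n))
```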
Generating Slightly-Correlated Bits
• Want: Pr[a = b] = 1/2 + Θ(x_j/√n)
• Wishful thinking:
  • Sample r uniform on the unit circle in the plane spanned by x and e_j
  • Alice: a = sign⟨x, r⟩
  • Bob: b = sign⟨e_j, r⟩
  Then Pr[a ≠ b] = θ/π, with θ = π/2 − Θ(x_j/√n)
• (But the plane depends on x, which Bob doesn’t know!)

[Figure: the plane spanned by x and e_j, split into + and − half-planes by the line orthogonal to r.]
Generating Slightly-Correlated Bits
• Want: Pr[a = b] = 1/2 + Θ(x_j/√n)
• Real experiment:
  • Sample r uniform on the unit sphere in ℝⁿ (shared randomness)
  • Alice: a = sign⟨x, r⟩
  • Bob: b = sign⟨e_j, r⟩
• The projection of r onto the plane spanned by x and e_j points in a uniformly random direction, so as before Pr[a ≠ b] = θ/π, with θ = π/2 − Θ(x_j/√n)
From Index to GHD
• To solve Index(x, j):
  • Generate shared random directions r⁽¹⁾, …, r⁽ⁿ⁾
  • Set u_i = sign⟨x, r⁽ⁱ⁾⟩ and v_i = sign⟨e_j, r⁽ⁱ⁾⟩
  • Solve GHD(u, v): Δ(u, v) ≈ n·(θ/π) = n/2 − Θ(√n)·x_j
• What about the error probability?
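Assembling the pieces in code (a sketch with illustrative constants; t = n copies of the experiment, with the shared randomness given as a matrix of random directions):

```python
# Index -> GHD: with shared random directions r^(1..n), Alice's GHD input is
# u_i = sign<x, r^(i)> and Bob's is v_i = sign<e_j, r^(i)>. Then
# E[Hamming(u, v)] = n/2 - x_j * sqrt(n)/pi, so the ~sqrt(n) gap encodes x_j.
import numpy as np

def ghd_instance(x, j, rng):               # x in {-1,+1}^n, j is 0-indexed
    n = len(x)
    R = rng.standard_normal((n, n))        # shared randomness
    return np.sign(R @ x), np.sign(R[:, j])

rng = np.random.default_rng(1)
n, j = 400, 7
x = rng.choice([-1, 1], size=n)
u, v = ghd_instance(x, j, rng)
# Below n/2 when x_j = +1, above when x_j = -1 -- but only in expectation:
# the standard deviation is also Theta(sqrt(n)), hence the constant error
# probability raised on the slide.
print(np.sum(u != v) - n / 2, x[j])
```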
From GHD to F0
• Given x, y ∈ {0,1}^n (Alice’s and Bob’s inputs, resp.):
• Alice’s part of the stream: (1, x_1), (2, x_2), …, (n, x_n)
• Bob’s part of the stream: (1, y_1), (2, y_2), …, (n, y_n)
• What is F0? Each coordinate with x_i = y_i contributes 1 distinct element and each coordinate with x_i ≠ y_i contributes 2, so F0 = n + Δ(x, y)
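The stream construction in code (a sketch; the (i, bit) pair encoding is assumed):

```python
# GHD -> F0: each party streams (i, bit). Agreeing coordinates contribute one
# distinct element, disagreeing ones two, so F0 = n + Hamming(x, y).

def f0_stream(x, y):
    alice = [(i, xi) for i, xi in enumerate(x)]   # Alice's part
    bob   = [(i, yi) for i, yi in enumerate(y)]   # Bob's part
    return alice + bob

x, y = [0, 1, 1, 0], [0, 0, 1, 1]
f0 = len(set(f0_stream(x, y)))                    # exact F0, for the sketch
assert f0 == len(x) + sum(a != b for a, b in zip(x, y))
```

Hence an F0 estimate with additive error below √n distinguishes the two GHD cases, so such an estimate requires Ω(n) bits of memory.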
Lower Bound for Index
Crash Course in Information Theory
Entropy of X with domain Ω:

H(X) = Σ_{x∈Ω} Pr[X = x] · log₂(1 / Pr[X = x])

• H(X) = #bits required to encode X (losslessly)

• Examples:
  • If X is deterministic: H(X) = 0
  • If X is uniform on Ω: H(X) = log₂|Ω|
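The definition in code (a short sketch):

```python
# H(X) = sum_x p(x) * log2(1 / p(x)); the two example cases from the slide.
from math import log2

def entropy(p):                            # p: list of probabilities
    return sum(q * log2(1 / q) for q in p if q > 0)

print(entropy([1.0]))                      # deterministic: 0 bits
print(entropy([0.25] * 4))                 # uniform on 4 values: log2(4) = 2 bits
```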
Crash Course in Information Theory
Conditional entropy:
• Let (X, Y) be jointly-distributed random variables
• Let p_Y be the marginal distribution of Y
• Let X|_{Y=y} be the distribution of X conditioned on Y = y

H(X | Y) = Σ_y p_Y(y) · H(X|_{Y=y})   (the expected entropy of X given Y)

Example
• Let …
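The same computation in code (a sketch; the joint distribution is given as a dictionary):

```python
# H(X | Y) = sum_{x,y} p(x,y) * log2( p(y) / p(x,y) )
from collections import defaultdict
from math import log2

def conditional_entropy(joint):            # joint: dict (x, y) -> probability
    p_y = defaultdict(float)
    for (x, y), p in joint.items():
        p_y[y] += p
    return sum(p * log2(p_y[y] / p) for (x, y), p in joint.items() if p > 0)

# Y determines X, so H(X | Y) = 0:
print(conditional_entropy({(0, 0): 0.5, (1, 1): 0.5}))
# X, Y independent uniform bits, so H(X | Y) = H(X) = 1:
print(conditional_entropy({(a, b): 0.25 for a in (0, 1) for b in (0, 1)}))
```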
Crash Course in Information Theory
Conditional entropy:

H(X | Y) = Σ_y p_Y(y) · H(X|_{Y=y})

• Properties:
  • 0 ≤ H(X | Y) ≤ H(X)
  • H(X | Y) = H(X) iff X, Y are independent
Crash Course in Information Theory
• Chain rule for entropy:
  H(X, Y) = H(Y) + H(X | Y), and in general
  H(X₁, …, X_n) = Σᵢ H(Xᵢ | X₁, …, Xᵢ₋₁)

Crash Course in Information Theory
• Exercise: …
Lower Bound for Index with Error δ
Augmented Index:
• Alice gets x ∈ {0,1}^n
• Bob gets j ∈ {1, …, n} and x_1, …, x_{j−1}
• Goal: output x_j
Lower Bound for Augmented Index
• Let X be uniform in {0,1}^n and J uniform in {1, …, n}
• Let M be Alice’s message
• Plan:
1. Show that H(X) is large
2. Show that H(X | M) is small
3. Remember that |M| ≥ H(M) ≥ I(M; X) = H(X) − H(X | M)

⇒ M is long
Step 1: H(X) is large
• X uniform in {0,1}^n ⇒ H(X) = n
Step 2: H(X | M) is small
• X uniform in {0,1}^n, J uniform in {1, …, n}
• From (M, J, X₁, …, X_{J−1}), Bob outputs his “guess” X̂_J

• Let Z_j indicate the event “X̂_j ≠ X_j”; by assumption Pr[Z_j = 1] ≤ δ

• … so H(X_j | M, X₁, …, X_{j−1}) ≤ H(Z_j) ≤ H₂(δ)
Step 2: H(X | M) is small

• Alice doesn’t know J!
⇒ M is independent of J
⇒ X is also independent of J
⇒ the bound H(X_j | M, X₁, …, X_{j−1}) ≤ H₂(δ) holds for every j
• Chain rule: H(X | M) = Σ_{j=1}^n H(X_j | M, X₁, …, X_{j−1}) ≤ n·H₂(δ)
Conclusion

|M| ≥ H(M) ≥ I(M; X) = H(X) − H(X | M) ≥ n(1 − H₂(δ))

⇒ R_δ→(Augmented Index) = Ω(n) for any constant δ < 1/2
Augmented Index
AugIndex_n:
• Alice gets x ∈ {0,1}^n
• Bob gets j ∈ {1, …, n} and x_1, …, x_{j−1}
• Goal: output x_j
Lower Bound for ℓ0-Sampling
Plan of Attack
0. New problem: the Universal Relation (UR)
1. Reduce UR to ℓ0-sampling
2. Reduce Augmented Index to UR
The Universal Relation
UR_n:
• Alice gets x ∈ {0,1}^n
• Bob gets y ∈ {0,1}^n
• Promise: x ≠ y
• Goal: find an index i such that x_i ≠ y_i
Step 1: Reduction from UR to ℓ0-Sampling
Given inputs x, y to UR:
• Alice’s stream: an insertion of i for every i with x_i = 1
• Bob’s stream: a deletion of i for every i with y_i = 1

• The resulting frequency vector is x − y: coordinate i is nonzero iff x_i ≠ y_i, so an ℓ0-sample from the stream is an answer to UR

• Conclusion: the memory required for ℓ0-sampling with universe {1, …, n} is ≥ R_δ→(UR_n) for any error δ
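The reduction in code (a sketch; the dictionary stands in for a streaming ℓ0-sampler, whose real implementation would use only small space):

```python
# UR -> l0-sampling: Alice inserts i for each x_i = 1, Bob deletes i for each
# y_i = 1. The residual vector is x - y, whose support is { i : x_i != y_i }.

def ur_stream(x, y):
    return [(i, +1) for i, xi in enumerate(x) if xi == 1] + \
           [(i, -1) for i, yi in enumerate(y) if yi == 1]

def solve_ur(x, y):
    freq = {}
    for i, delta in ur_stream(x, y):       # stand-in for the l0-sampler
        freq[i] = freq.get(i, 0) + delta
    return [i for i, v in freq.items() if v != 0][0]   # any support element

x, y = [1, 0, 1, 1], [1, 1, 1, 0]
print(solve_ur(x, y))                      # some i with x_i != y_i (1 or 3)
```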
Step 2: Lower Bound for UR
• Plan: reduce from Augmented Index to UR
• When the UR protocol succeeds: the reduction succeeds w.p. > 1/2
⇒ overall success probability only ≈ (1 − δ)/2
• Wait, what?

• Bob can tell whether the reduction succeeded
⇒ Check success and repeat if necessary (constantly many parallel copies suffice)
Reduction: Augmented Index to UR
• Alice gets z_1, …, z_s ∈ {1, …, k}; write e_z for the z-th standard basis vector in {0,1}^k (e.g. e_5 = 0000100000 for k = 10, and 0000000000 is an all-zeros block)
• Bob gets j and z_1, …, z_{j−1}
• Attempt #1: Bob learns some z_i with i ≥ j

  x = e_{z_1} e_{z_2} e_{z_3} e_{z_4} e_{z_5}    (s = 5)
  y = e_{z_1} e_{z_2} e_{z_3} 0…0 0…0    (j = 4)

• Improvement:
  • Choose a random permutation of the coordinates
  • Solve UR on the permuted pair

⇒ Bob learns z_i where i is uniform over {j, …, s}
Reduction: Augmented Index to UR
• For i ∈ {1, …, s} let w_i consist of copies of e_{z_i}

With one copy per block (s = 3, j = 2):
  x = e_{z_1} e_{z_2} e_{z_3}
  y = e_{z_1} 0 0
• Choose a random permutation, solve UR
⇒ Bob learns z_i where i is uniform over {2, 3}

With 4 copies, 2 copies, 1 copy:
  x = e_{z_1} e_{z_1} e_{z_1} e_{z_1} e_{z_2} e_{z_2} e_{z_3}
  y = e_{z_1} e_{z_1} e_{z_1} e_{z_1} 0 0 0
⇒ Bob learns z_j (here z_2) w.p. 2/3
Reduction: Augmented Index to UR
• General construction: w_i = 2^{s−i} copies of e_{z_i}

  x = w_1 w_2 ⋯ w_{s−1} w_s
  y = w_1 w_2 ⋯ 0 0    (every block from w_j on is replaced with 0)

• x − y has 2^{s−j} + ⋯ + 2 + 1 = 2^{s−j+1} − 1 nonzero coordinates, of which 2^{s−j} lie in block j
⇒ a uniformly returned differing coordinate hits block j w.p. > 1/2
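The construction in code (a sketch; the parameters are illustrative):

```python
# Augmented Index -> UR: block w_i is 2^(s-i) copies of e_{z_i}. Bob keeps
# blocks 1..j-1 and zeros out the rest, so x - y lives in blocks j..s, and
# block j holds 2^(s-j) of its 2^(s-j+1) - 1 nonzero coordinates (> 1/2).

def e(z, k):                               # z-th standard basis vector of {0,1}^k
    return [1 if t == z else 0 for t in range(k)]

def build_x_y(zs, j, k):                   # zs = (z_1..z_s), j is 1-indexed
    s = len(zs)
    x, y = [], []
    for i, z in enumerate(zs, start=1):
        block = e(z, k) * (2 ** (s - i))   # 2^(s-i) copies of e_{z_i}
        x += block
        y += block if i < j else [0] * len(block)   # Bob knows only the prefix
    return x, y

x, y = build_x_y([2, 0, 1], j=2, k=3)      # s = 3: 4, 2, 1 copies per block
diff = [i for i in range(len(x)) if x[i] != y[i]]
block_j = range(4 * 3, (4 + 2) * 3)        # block 2 spans coordinates 12..17
print(len(diff), sum(i in block_j for i in diff))   # 3 differing coords, 2 in block j
```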
Reduction: Augmented Index to UR
• Reduce from: Augmented Index with alphabet size k, length s
• To: UR with length n = k·(2^s − 1)
• Lower bound: R_δ→(UR_n) = Ω(s·log k)
• Set s = (log n)/2, k = √n to get R_δ→(UR_n) = Ω(log² n) ⇒ ℓ0-sampling requires Ω(log² n) bits of memory
Index
• Alice gets x ∈ {0,1}^n
• Bob gets j ∈ {1, …, n}
• Alice sends one message; Bob outputs x_j

Claim: R_δ→(Index) = Ω(n) for every constant δ < 1/2

[Diagram: lower bounds derived from Index — Median, Graph Connectivity, Gap Hamming Distance (→ F0), and ℓ0-Sampling.]
END PART I (Streaming Algorithms)
