
CSI3020

Advanced Graph Algorithms (Mod – 5)

By,
Dr. Kovendan AKP,
Senior Assistant Professor,
Department of Database Systems,
School of Computer Science and Engineering,
Vellore Institute of Technology, Vellore.

Email: kovendan.akp@vit.ac.in Mobile: +91-9677190102 Cabin: PRP 208-C


Module 5: Exponential Algorithm
Independent Set – Chromatic Number – Domatic Partition – The
Travelling Salesman Problem – Set Cover – Dominating Set – Subset Sum.

Monday, 03 July 2023 Instructor: Dr. Kovendan AKP


What does it mean for an algorithm to be
efficient?
Definitions of efficiency
• Fast in practice

• Qualitatively better worst-case performance than a brute-force algorithm
Polynomial time efficiency
• An algorithm is efficient if it has a polynomial run time
• Run time as a function of problem size
• Run time: count number of instructions executed on an underlying model of
computation
• T(n): maximum run time for all problems of size at most n
Polynomial Time
• Algorithms with polynomial run time have the property that
increasing the problem size by a constant factor increases the run
time by at most a constant factor (depending on the algorithm)
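For example (a small worked check, not from the slides): if T(n) = a·n^d for constants a and d, then scaling the input size by a constant factor c gives
T(c·n) = a·(c·n)^d = c^d · a·n^d = c^d · T(n),
so the run time grows by the constant factor c^d, which depends only on the algorithm (through d), not on n.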
Why Polynomial Time?
• Generally, polynomial time seems to capture the algorithms which are
efficient in practice

• The class of polynomial time algorithms has many good mathematical properties
Polynomial vs. Exponential Complexity
• Suppose you have an algorithm which takes n! steps
on a problem of size n
• If the algorithm takes one second for a problem of
size 10, estimate the run time for the following
problem sizes:

n = 12, 14, 16, 18, 20
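A rough way to fill in this estimate, assuming (as the slide does) that the run time is proportional to n! and that size 10 takes one second, is the short Python sketch below:

```python
from math import factorial

# Assumption from the slide: size 10 takes 1 second and run time scales like n!.
base_n, base_seconds = 10, 1.0

for n in (12, 14, 16, 18, 20):
    seconds = base_seconds * factorial(n) / factorial(base_n)
    years = seconds / (3600 * 24 * 365)
    print(f"n = {n}: about {seconds:.3g} seconds (~{years:.3g} years)")
```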
Ignoring constant factors
• Express run time as O(f(n))
• Emphasize algorithms with slower growth rates
• Fundamental idea in the study of algorithms
• Basis of Tarjan/Hopcroft Turing Award
Why ignore constant factors?
• Constant factors are arbitrary
• Depend on the implementation
• Depend on the details of the model

• Determining the constant factors is tedious and provides little insight


Why emphasize growth rates?
• The algorithm with the lower growth rate will be faster for all but a
finite number of cases
• Performance is most important for larger problem size
• As memory prices continue to fall, bigger problem sizes become
feasible
• Improving growth rate often requires new techniques
Formalizing growth rates
• T(n) is O(f(n)) [T : Z+ → R+]
• If n is sufficiently large, T(n) is bounded by a constant multiple of f(n)
• There exist c and n0 such that for n > n0, T(n) < c·f(n)

• T(n) is O(f(n)) will be written as: T(n) = O(f(n))


• Be careful with this notation
Prove 3n² + 5n + 20 is O(n²)
Let c =

Let n0 =

T(n) is O(f(n)) if there exist c, n0, such that for n > n0,
T(n) < c f(n)
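One possible choice (among many) to fill in the blanks: take c = 28 and n0 = 1. For every n > 1 we have n < n² and 1 < n², so
3n² + 5n + 20 < 3n² + 5n² + 20n² = 28n² = c·n²,
which is exactly the required condition T(n) < c·f(n) for all n > n0.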
Order the following functions in increasing
order by their growth rate
a) n log_4 n
b) 2n² + 10n
c) 2^n / 100
d) 1000n + log_8 n
e) n^100
f) 3^n
g) 1000 log_10 n
h) n^(1/2)
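For reference, one consistent answer, from slowest to fastest growth:
g) 1000 log_10 n, h) n^(1/2), d) 1000n + log_8 n, a) n log_4 n, b) 2n² + 10n, e) n^100, c) 2^n / 100, f) 3^n.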
Lower bounds
• T(n) is Ω(f(n))
• T(n) is at least a constant multiple of f(n)
• There exist n0 and ε > 0 such that T(n) > ε·f(n) for all n > n0
• Warning: definitions of Ω vary

• T(n) is Θ(f(n)) if T(n) is O(f(n)) and T(n) is Ω(f(n))


Useful Theorems
• If lim_{n→∞} (f(n) / g(n)) = c for c > 0 then f(n) is Θ(g(n))

• If f(n) is O(g(n)) and g(n) is O(h(n)) then f(n) is O(h(n))

• If f(n) is O(h(n)) and g(n) is O(h(n)) then f(n) + g(n) is O(h(n))


Ordering growth rates
• For b > 1 and x > 0
• log_b n is O(n^x)

• For r > 1 and d > 0
• n^d is O(r^n)
On Independent Sets And Cliques
• Independent set: a set W such that no two vertices in W share an edge.
• Clique: a set in which every two vertices are adjacent.
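As a small illustration (not from the slides; the dict-of-sets graph representation is an assumption), a Python sketch that checks the two definitions:

```python
# Hypothetical representation: an undirected graph as a dict mapping each
# vertex to the set of its neighbors.
def is_independent_set(graph, vertices):
    # No two vertices of the set may share an edge.
    return all(v not in graph[u] for u in vertices for v in vertices if u != v)

def is_clique(graph, vertices):
    # Every two distinct vertices of the set must be adjacent.
    return all(v in graph[u] for u in vertices for v in vertices if u != v)

# Example on the path a - b - c: {a, c} is independent, {a, b} is a clique.
g = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}
print(is_independent_set(g, {"a", "c"}))  # True
print(is_clique(g, {"a", "b"}))           # True
```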


Problem Definition
• INPUT: An undirected graph G = (V, E)
• Goal: A partition of V into as many subsets V = D1 ∪ D2 ∪ … ∪ Dk as possible, such that the sets Di are independent sets, pairwise disjoint, and there is an edge between every pair of sets Di, Dj with i ≠ j.
Example

• In the following graph, the optimum is 4.


A Partition Into Independent Sets Is Called A
Coloring
• The chromatic number problem is to color the graph with as few colors as possible, so that each color set is an independent set. Namely, to decompose G into as few independent sets as possible.
• It is denoted by χ(G).
• The achromatic number is the maximum number of colors in a coloring in which every color set is independent and every two color sets share an edge (the partition defined above). It is denoted by ψ(G).
• Clearly χ(G) ≤ ψ(G).
An Example Of A Difference Between The Two
Functions
• Say that (G) is a universal constant c.
• Then we are able to solve the problem in polynomial time by
exhaustive search.
• Search in time nc2 for the edges that cross between sets. This gives a
partial achromatic coloring. Later we shall see how to complete the
partial achromatic coloring into a full one.
• For Chromatic (namely minimum size) coloring the problem is NP-
Hard even if we are promised that (G)=3.
How Do We Find A Feasible Solution?
The Achromatic Number Is ‘Sensitive’ To Few
Edges
• A Complete Bipartite Graph:
About the Achromatic Number Problem
• The problem is very extensively studied in the branch of mathematics called Graph Theory.
• Sometimes in Graph Theory the authors are more concerned with proving a statement than with finding a structure in polynomial time.
• As the problem is central in Graph Theory, there are two surveys on the problem, one by MacGillivray and one by Edwards.
The complexity Of The Problem
• The decision version of the problem is NPC even if the underlying graph is a tree! (Cairnie and Edwards).
• For trees we can find a partition of size ψ − 1!
• We do not know (yet?) how to find the optimum for NP-Hard
questions in polynomial time.
About P=NP?
• It seems that we gave up on trying to show that an NPC problem, say the Traveling Sales Person, can be solved in polynomial time.
• I would imagine most people in theory believe that P ≠ NP.
• Still if we classified a problem to a certain complexity class, it does
not go away.
Approximation Algorithms
• But sometimes (only on hard inputs!) the running time indeed does not allow us to solve NP-Hard problems even in practice. Computers cannot run an algorithm that performs 2^n operations for large n.

• For n = 300 this is more than the number of atoms in the known universe.

• Some rule of thumb usually works great, in the sense that it is 5% from the optimum (how do we know that?).

• We want a provable multiplicative distance from the optimum. This is called the approximation ratio.
Approximation Ratio For Maximization
Problems
• Consider some abstract problem P with infinitely many inputs
• An approximation algorithm has an approximation ratio ρ if it runs in polynomial time and if for every instance I:
Val(OPT(I)) / Val(A(I)) ≤ ρ
It is conjectured by Chaudhary and Vishwanathan that the achromatic number problem admits a sqrt{ψ} approximation.
Some Known Approximation Algorithms
• For general graphs, very bad situation.
• An algorithm with ratio n/sqrt{log n} was given by Chaudhary and Vishwanathan.
• Krauthgamer and I gave a roughly n/log n ratio approximation algorithm, improving the above result.
Some Known Approximation Algorithms,
Continued
• In a paper with Krauthgamer, a min{sqrt{ψ}, n^(1/3)} approximation algorithm for graphs with girth at least 5.
• This improves a min{sqrt{ψ}, n^(3/8)} ratio approximation algorithm by Krysta and Lorys. And they even looked at an easier problem: girth at least 6!
Bipartite graphs
• For bipartite graphs (as for graphs of girth at least 5) we were able to get a truly sub-linear ratio approximation.
• We give an O(n^(4/5)) ratio approximation algorithm.
• This is joint work with Shende.
• The algorithm is truly complex and uses a previous result that we are
going to see in this talk. The result is from a previous paper.
A Lower Bound
• What is a lower bound on the approximability?
• If the problem can be approximated within a small enough ratio, then it can be solved in polynomial time, and thus P = NP.
• Indeed, we show (in a paper with Radhakrishnan and Sivasubramanian) that if you can approximate the achromatic number problem within sqrt{log n}, you can solve the problem exactly.
• This is called a sqrt{log n} lower bound.
Complete partitions of graphs

• It is defined just as the achromatic number, but the sets do not have to be independent sets.
• The problem is fully understood up to a constant with respect to approximation:
• O(sqrt{log n}) upper bound
• Ω(sqrt{log n}) lower bound
• Such a ratio is called TIGHT.
Why Not Approximate Via Small Maximal
Independent Sets?
• The algorithm that we saw finds a maximal independent set, gives it
color 1, and then recurses.
• Obviously, there is a maximal independent set of size n/ψ or less.
• Big difficulty: if you can approximate the minimum maximal independent set within a ratio of n^(1-ε), then P = NP. Thus a lower bound of n^(1-ε), by Magnus M. Halldorsson.
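For concreteness, a hedged Python sketch of the greedy heuristic mentioned in the first bullet (repeatedly peel off a maximal independent set and give it the next color); the representation and the arbitrary greedy order are assumptions, not part of the slides:

```python
def greedy_complete_coloring(graph):
    # Repeatedly take a maximal independent set of the remaining vertices
    # and assign it the next color.  Because every vertex left outside a
    # class has a neighbor inside it, every two classes end up sharing an edge.
    remaining = set(graph)
    color_classes = []
    while remaining:
        independent = set()
        for v in remaining:                   # arbitrary greedy order
            if not (graph[v] & independent):  # v has no neighbor chosen yet
                independent.add(v)
        color_classes.append(independent)     # a maximal IS of the remaining graph
        remaining -= independent
    return color_classes
```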
Graph Theory
• We show that if the girth of the graph is 5 or more, then the achromatic number is always at least m/n, with n = |V| and m = |E|.
• As happens a lot in Discrete Math, the proof is in fact an algorithm.
• Note, this result is tight. The example is a complete bipartite graph
without a perfect matching.
Proof That m/n In A Graph Of Girth At
Least 5
• Remove all vertices with degree strictly less then m/n.
• The graph does not turn empty.
• The minimum degree is at least m/n.
• Consider the 2 BFS layers of an arbitrary vertex v.
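A minimal sketch of the first (peeling) step in Python, with the same assumed representation; the threshold would be m/n here:

```python
def peel_low_degree(graph, threshold):
    # Repeatedly remove vertices whose current degree is strictly below the
    # threshold; degrees are updated as vertices disappear.  What survives
    # has minimum degree at least the threshold.
    g = {v: set(nbrs) for v, nbrs in graph.items()}   # work on a copy
    changed = True
    while changed:
        changed = False
        for v in list(g):
            if len(g[v]) < threshold:
                for u in g[v]:
                    g[u].discard(v)
                del g[v]
                changed = True
    return g
```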
How Do We Color The Graph: Example
• For a leaf ℓ and a star S, ℓ is adjacent to at most one leaf in S.
From Partial Coloring To A Complete Coloring
• Say that we have a partial achromatic coloring with p colors of a subset of the vertices.
• This means that only part of the vertices are inside independent sets.
• Also the independent sets form a partial achromatic coloring. They
admit the property that every two independent sets share an edge.
How to complete the coloring?
• Consider v that does not belong to any partial independent set.
• If there is some (partial) independent set with no neighbors of v, put
v in this set. Clearly the partial coloring is still legal.
• If a neighbor of v appears in every independent set, color v with a
new color.
• We again have a legal partial coloring which is larger by one (contains
one more vertex).
• Recurse.
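These bullets translate almost directly into code; a sketch (iterative rather than recursive, graph representation assumed as before):

```python
def complete_partial_coloring(graph, classes, uncolored):
    # Extend a legal partial coloring: for each uncolored vertex v, put v into
    # some class that contains no neighbor of v; if every class contains a
    # neighbor of v, open a new class for v.
    classes = [set(c) for c in classes]
    for v in uncolored:
        for c in classes:
            if not (graph[v] & c):   # no neighbor of v in this class
                c.add(v)
                break
        else:                        # a neighbor of v appears in every class
            classes.append({v})
    return classes
```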
An Equivalence Relation And Bipartite Graphs
• We say that v ≡ u if N(v) = N(u).
• Note that v and u cannot be neighbors.
• We are again (not talking about approximation but rather) showing a result in graph theory, on bipartite graphs.
• Say that q=sqrt{}. In this case the conjecture holds; We find a
partition of size q=sqrt{}. But in general its n/q for q sqrt{}.
No Huge Equivalence Classes

We may assume that the size of every equivalence class is at most sqrt{ψ}.
• If a class has sqrt{}+1 copies of a vertex x
and we use them all, each achromatic set
may contain a copy of x.
• But there will be a class with two copies of x.
This is not needed.
Small Size Classes Don’t Count

• Recompute n.
• Classes of size at most n/(2sqrt{ψ}) can contain at most n/2 vertices.
• Therefore equivalence classes of size at least n/(2sqrt{ψ}) contain at least n/2 vertices. Recall their size is also at most sqrt{ψ}.
• Thus there are at least n/(2sqrt{ψ}) equivalence sets with size at least n/(2sqrt{ψ}) ≥ sqrt{ψ}/2.
The Algorithm
• We show A,B and xV2 such that x is adjacent to (say) B but is not to
A.

(Diagram: the sets A and B, the vertex x, and a remaining set C.)
The Sets A, B And The Vertex x
• The sets A and B from the previous slide are not equivalent.
• Therefore there must be some x that is independent of A but not of B.
• We are going to partition V1 into sets that are adjacent to x and sets that are not adjacent to x. A PROPER partition.
• As we shall see, vertices are added to A and B, but only from V2 (so independent, and independent with respect to A and B).
Finding Some Adjacent Sets
• The vertex x has about sqrt{ψ} equivalent copies.
• Put (a copy of) x in every equivalence set that is independent of x.
• This partitions V1 into the equivalence classes S that contain x and the classes T that do not contain x.
How To Continue
• Note: every equivalence set in S is adjacent to every equivalence set
in T.
• This is because of the edges from x to T
• The sets in T do not share edges among themselves, nor do the sets that belong to S.
• Thus recurse on S and T (Quicksort?).
Why Can We Continue The Recursion?
• The sets in S were not separated by x.
• The sets in T were not separated by x.
• Therefore V2 contains a vertex y that is adjacent to some A' in S but not to some B' in S. Again, put a copy of y in all sets that are independent of y.
• Same for T .
Open Problems
• For general graphs the upper bound of n/log n on the approximation ratio is hugely far from the lower bound of sqrt{log n}.
• If this is very hard to narrow, then perhaps one can narrow the gap
for bipartite graphs?
