Algo - 1
(LECTURE-1)
SHAHBAZ AHMAD SAHI
MARKS DISTRIBUTION
◼ Assignments 10%
◼ Quizzes 05%
◼ Final Paper 25%
◼ Mid-Term 20%
◼ Final 40%
COURSE CONTENTS
◼ Graph Algorithms: Graph theory, optimization techniques, searching algorithms, minimal spanning
tree algorithms, shortest paths, maximum flow problem
◼ NP-completeness
◼ Advanced Techniques: Multithreaded/parallel algorithms, polynomials and FFT
◼ Approximation algorithms
◼ Number-theoretic algorithms, RSA cryptosystem
◼ Pattern matching, computational geometry
◼ Image processing and streaming algorithms
OBJECTIVE OF THIS COURSE
◼ A computer algorithm is a detailed step-by-step method for solving a problem using a computer.
◼ An algorithm is a sequence of unambiguous instructions for solving a problem in a finite amount of
time.
◼ More generally, an algorithm is any well-defined computational procedure that takes a collection of
elements as input and produces a collection of elements as output.
◼ Problem
◼ The statement of the problem specifies, in general terms, the desired input/output relationship.
◼ Algorithm
◼ The algorithm describes a specific computational procedure for achieving that input/output relationship.
◼ Example
◼ One might need to sort a sequence of numbers into non-decreasing order.
◼ Algorithms
◼ Various algorithms exist, e.g. merge sort, quick sort, heap sort, radix sort, counting sort, etc.
IMPORTANT DESIGNING TECHNIQUES
◼ Brute Force
◼ Straightforward, naïve/simple approach
◼ Usually expensive
◼ Divide-and-Conquer
◼ Divide into smaller sub-problems
◼ Iterative Improvement
◼ Improve one change at a time
◼ Decrease-and-Conquer
◼ Decrease instance size
◼ Transform-and-Conquer
◼ Modify problem first and then solve it
◼ Space and Time Tradeoffs
◼ Use more space now to save time later
SOME OF THE IMPORTANT DESIGNING TECHNIQUES
◼ Greedy Approach
◼ Locally optimal decisions that cannot be changed once made
◼ Efficient
◼ Easy to implement
◼ The solution is hoped to be optimal, but is not guaranteed
◼ Not every problem has a greedy solution
◼ Dynamic programming
◼ Decompose into sub-problems, like divide-and-conquer
◼ Sub-problems are dependent (they overlap)
◼ Record the results of smaller sub-problems
◼ Reuse them when a sub-problem recurs
◼ Often reduces complexity from exponential to polynomial
PROBLEM SOLVING PHASES
◼ Analysis
◼ How does the system work?
◼ Breaking a system down into known components
◼ How do components (processes) relate to each other?
◼ Breaking a process down into known functions
◼ Synthesis
◼ Building tools
◼ Building functions with supporting tools
◼ Composing functions to form a process
◼ How should components be put together?
◼ Final solution
PROBLEM SOLVING PROCESS
◼ Problem
◼ Strategy
◼ Algorithm
◼ Input
◼ Output
◼ Steps
◼ Analysis
◼ Correctness
◼ Time & Space
◼ Optimality
◼ Implementation
◼ Verification
MODEL OF COMPUTATION (ASSUMPTIONS)
◼ Design assumption
◼ A level of abstraction that meets our requirements
◼ Neither more nor less detail than needed, e.g. treating [0, 1] as an infinite continuous interval
◼ Analysis independent of the variations in
◼ Machine
◼ Operating system
◼ Programming languages
◼ Compiler etc.
◼ Low-level details will not be considered
◼ Our model will be an abstraction of a standard generic single-processor machine, called a random access machine or
RAM.
EXAMPLE: SELECTION PROBLEM
◼ We definitely do not want a laptop if there is another laptop that is both faster and cheaper
◼ We say that the fast, cheap laptop “dominates” the slow, expensive one
◼ So, given a list of laptops, we want those that are not dominated by any other
CRITERION FOR SELECTION
◼ Count the number of pseudocode steps that are executed
◼ Or count the number of times an element of p is accessed
◼ Or the number of comparisons that are performed
ANALYSIS OF SELECTION PROBLEM
◼ Input size is n
◼ Algorithm complexity is the work done by the algorithm; it can be measured in the
following ways:
◼ Worst case time
◼ Average case time
◼ Best case time
◼ Expected time
GROWTH OF FUNCTIONS
◼ At what rate do the functions grow as the input size approaches infinity?
HOW FUNCTIONS GROW
Solution time as a function of input size n:
Algorithm:    1     2          3      4       5
Time:         33n   46n lg n   13n²   3.4n³   2ⁿ
◼ Given a function g(n), we define Θ(g(n)) to be the set of functions that are asymptotically equivalent to
g(n)
◼ For example, the functions:
◼ 4n²
◼ 8n² + 2n − 3
◼ n²/5 + 10 log n
◼ are all asymptotically equivalent: each is in Θ(n²)
FORMAL DEFINITION
◼ Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that
◼ 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)
◼ for all n ≥ n0 }
GRAPHICALLY
(figure: f(n) sandwiched between c1·g(n) and c2·g(n) for all n ≥ n0)
ASYMPTOTIC NOTATION
◼ Algorithm 3                        Cost
◼ sum = 0;                           c1
◼ for (i = 0; i < N; i++)            c2
◼   for (j = 0; j < N; j++)          c2
◼     sum += arr[i][j];              c3
◼ ------------
◼ c1 + c2·(N+1) + c2·N·(N+1) + c3·N² = O(N²)
Ο-NOTATION AND Ω-NOTATION
◼ The definition of Θ-notation relies on proving both lower and upper asymptotic bounds
◼ Sometimes we are only interested in proving one bound or the other
◼ Ο-notation is used to define the upper bound
◼ Ω-notation is used to define the lower bound
Ο-NOTATION(BIG-OH)