
Measuring Time Complexity

• Time complexity is the amount of time required by an algorithm to run to completion.
• It is denoted by T(P) and is the sum of the compile time and the run (execution) time.
• The compile time does not depend on the instance characteristics.
• T(P) = C + tp, where C is the fixed compile time and tp is the variable run time.
• The fixed part is usually ignored and only the variable part is considered.

• Time complexity is measured in terms of a unit called the frequency count.
• tp = Cop × C(n), where
• Cop = the time taken by the basic operation
• C(n) = the number of times the basic operation is executed

The steps performed to measure time complexity are:

1) Identify the basic operation in the algorithm.

• e.g. searching for a key element in a list of n numbers:
• here the basic operation is comparing each element with the key element
• input size: n elements

• e.g. performing the addition of two numbers:
• here the basic operation is the addition of the two numbers
• input size: 2

2) Count program steps, i.e. syntactically or semantically meaningful segments of a program that have an execution time.
• e.g. comments count as zero steps
• an assignment statement counts as one step
• a for, while, or repeat-until loop counts as the step count of that control structure

• When we analyze algorithms, we should employ mathematical techniques that analyze algorithms independently of specific implementations, computers, or data.

• To analyze algorithms:
– First, we start to count the number of significant operations in a particular
solution to assess its efficiency.
– Then, we will express the efficiency of algorithms using growth functions.

The Execution Time of Algorithms

Each operation in an algorithm (or a program) has a cost.

→ Each operation takes a certain amount of time.

count = count + 1; → takes a certain amount of time, but it is constant

A sequence of operations:

count = count + 1; Cost: c1
sum = sum + count; Cost: c2

→ Total Cost = c1 + c2

Example: Simple If-Statement

                   Cost  Times
if (n < 0)         c1    1
  absval = -n;     c2    1
else
  absval = n;      c3    1

Total Cost <= c1 + max(c2, c3)

Example: Simple Loop


Cost Times
i = 1; c1 1
sum = 0; c2 1
while (i <= n) { c3 n+1
i = i + 1; c4 n
sum = sum + i; c5 n
}

Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*c5

→ The time required for this algorithm is proportional to n.

Example: Nested Loop


Cost Times
i=1; c1 1
sum = 0; c2 1
while (i <= n) { c3 n+1
j=1; c4 n
while (j <= n) { c5 n*(n+1)
sum = sum + i; c6 n*n
j = j + 1; c7 n*n
}
i = i +1; c8 n
}
Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*(n+1)*c5 + n*n*c6 + n*n*c7 + n*c8

→ The time required for this algorithm is proportional to n².

Example: Calculate the time complexity of the following algorithm.


Find the maximum in an array of elements.

Algorithm arrayMax(A, n)
  Input: array A of n integers
  Output: maximum element of A
  currentMax ← A[0]
  for i ← 1 to n - 1 do
    if A[i] > currentMax then
      currentMax ← A[i]
  return currentMax

Step                   Basic Operation            No. of Times
currentMax ← A[0]      indexing into the array    1
                       assigning a value          1
for i ← 1              assigning                  1
i < n                  comparing two values       n
if currentMax < A[i]   indexing                   1 per iteration (test true or false)
                       comparing                  1 per iteration (test true or false)
currentMax ← A[i]      indexing                   1 per iteration (test true only)
                       assigning                  1 per iteration (test true only)
i++                    summing                    n - 1
return                 returning                  1

The for body runs n - 1 times.

To summarize:
The number of primitive operations executed by the algorithm is at least
T(n) = 2 + 1 + n + 2(n-1) + 1 = 3n + 2
and at most
T(n) = 2 + 1 + n + 4(n-1) + 1 = 5n

General Rules for Estimation

• Loops: the running time of a loop is at most the running time of the statements inside the loop times the number of iterations.
• Nested loops: the running time of a statement in the innermost loop is its running time multiplied by the product of the sizes of all the loops.
• Consecutive statements: just add the running times of the consecutive statements.
• If/else: never more than the running time of the test plus the larger of the running times of S1 and S2.

Primitive Operations
 Basic computations performed by an algorithm
 Identifiable in pseudocode
 Largely independent of the programming language
 Assumed to take a constant amount of time in the RAM model

The Random Access Machine (RAM) Model


The RAM model consists of:
 A CPU
 A potentially unbounded bank of memory cells, each of which can hold an arbitrary number or character

Memory cells are numbered, and accessing any cell in memory takes unit time.

Examples:
 Evaluating an expression
 Assigning a value to a variable
 Indexing into an array
 Calling a method
 Returning from a method

Computing Order of Growth

Measuring the performance of an algorithm in relation to the input size n is called the order of growth, or growth rate.

• We measure an algorithm's time requirement as a function of the problem size.
– The problem size depends on the application: e.g. the number of elements in a list for a sorting algorithm, or the number of disks for the Towers of Hanoi.
• So, for instance, we say that (if the problem size is n)
– Algorithm A requires 5*n² time units to solve a problem of size n.
– Algorithm B requires 7*n time units to solve a problem of size n.
• The most important thing to learn is how quickly the algorithm's time requirement grows as a function of the problem size.
– Algorithm A requires time proportional to n².
– Algorithm B requires time proportional to n.
• An algorithm’s proportional time requirement is known as growth rate.
• We can compare the efficiency of two algorithms by comparing their growth rates.

Common Growth Rates

Function    Growth Rate Name

c           Constant
log N       Logarithmic
log² N      Log-squared
N           Linear
N log N     Linearithmic
N²          Quadratic
N³          Cubic
2^N         Exponential

[Graph: running times for small inputs]

[Graph: running times for moderate inputs]

• From the graphs it can be seen that:
• the logarithmic function is the slowest-growing function, and
• the exponential function is the fastest-growing function; it grows rapidly as the input size n increases.
• The exponential function produces huge values even for small inputs n.
