Performance Measuring Metrics For Computer System


Subject – CS8029 Parallel and Distributed Systems

Day-1

 Sequential computing and Concurrent computing

 What is Parallelism?

 Parallelism Vs. Concurrency

 Why do we use Parallelism?

 The subject “Parallel processing” and Application areas

 Parallel Computing vs. Distributed Computing

 Performance measuring metrics for computer system


Day-1
 Computation
• The term computation in computer science refers to any type of calculation that includes arithmetical
and non-arithmetical steps and that follows a well-defined model (e.g., an algorithm).

 Sequential computing
o The machine finishes each instruction (completely) before it starts the next one.
o It is exactly what a sequential computer does.

 Concurrent computing
o A form of computing in which several computations are executed concurrently (by different processors or processing
elements), i.e., during overlapping time periods.

o Here, execution of the next instruction can start before the earlier instructions have finished (see the sketch below).
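
A minimal sketch (not from these slides; Python is assumed, and the task names and delays are purely illustrative) contrasting the two modes: the sequential version finishes task A completely before starting task B, while the concurrent version has both tasks in progress during overlapping time periods.

import threading
import time

def task(name, delay):
    # Simulate an I/O-bound step (e.g., waiting for a device or the network).
    time.sleep(delay)
    print(name, "finished")

# Sequential computing: the second task starts only after the first finishes.
start = time.time()
task("A", 1.0)
task("B", 1.0)
print("Sequential: %.1f s" % (time.time() - start))   # roughly 2 s

# Concurrent computing: both tasks are in progress during overlapping
# time periods, so the total elapsed time is roughly 1 s.
start = time.time()
threads = [threading.Thread(target=task, args=(n, 1.0)) for n in ("A", "B")]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("Concurrent: %.1f s" % (time.time() - start))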
Day-1

What is Parallelism?
• Several activities may happen at the same time. This is the basic principle of parallelism.

• Parallelism need not refer to computing only. For example,


o Writing notes and listening to a lecture are parallel activities.
o When we talk with our friends, our heart pumps, our lungs breathe, our eyes move, etc., all in parallel.
o Many problems are inherently (built-in) parallel, for example, the universe, the functioning of the human body, human activities, etc.

• In computer science -
o In a uni-processor or multi-processor system, several time-shared users may appear to be computing at the
same time.
 In a uni-processor, we have the illusion of instructions being executed simultaneously, i.e., the illusion of parallelism.
 In fact, the processor is executing instructions for only one job at a time, so it is not true parallelism.
 For actual parallelism, we need to have several physical processors or processing elements that will be involved in
computation separately.
Day-1

Parallelism Vs. Concurrency


• Concurrency means the capability of operating at the same time; the simultaneity may be only an illusion.

• Parallelism is real (true) concurrency.

• It seems that “concurrency” and “parallelism” are synonyms, but there is a subtle difference
between the two.
o We reserve “parallelism” for situations where actions truly happen at the same time on
distinct processing agents (CPUs) simultaneously.
o “Concurrency” refers to both situations -- ones which are truly parallel and ones which give only the
illusion of parallelism, because execution of tasks is time-sliced on one processing agent (CPU). A sketch of this distinction follows below.

Note: In the context of parallel computing, a CPU is often called a PE (processing element), and a PE with its
own memory a CE (computing element).
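
A minimal sketch (not from these slides; it assumes CPython, whose global interpreter lock time-slices threads for CPU-bound work, and a multi-core machine) contrasting concurrency by time-slicing with true parallelism on distinct processing elements.

import time
from multiprocessing import Pool
from threading import Thread

def cpu_bound(n):
    # A purely CPU-bound loop; there is no I/O to overlap.
    total = 0
    for i in range(n):
        total += i * i
    return total

N = 2_000_000

if __name__ == "__main__":
    # Concurrency (illusion of parallelism): two threads share one interpreter,
    # so this CPU-bound work is effectively time-sliced on one processing agent.
    start = time.time()
    threads = [Thread(target=cpu_bound, args=(N,)) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("Threads (time-sliced): %.2f s" % (time.time() - start))

    # Parallelism (real concurrency): two worker processes run on distinct
    # processing elements (cores) at the same time.
    start = time.time()
    with Pool(processes=2) as pool:
        pool.map(cpu_bound, [N, N])
    print("Processes (parallel):  %.2f s" % (time.time() - start))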
Day-1

Why do we use Parallelism?


• The main reason is to achieve higher performance, i.e., greater speed (reduced time) of computation.
o Data is being generated at a rapid rate (and stored in the form of distributed databases), and
processing it within an acceptable time is a concern.
o The concept of parallelism can resolve such a concern to a great extent.

• The other reason is fault tolerance.


• All of today’s supercomputers use parallelism extensively to increase performance. Today’s
supercomputers are capable of trillions of operations per second, and no single processor is that fast.

We emphasize parallelism in design at both the hardware and software levels.


Day-1

The subject “Parallel processing” :


• Parallel processing (also called parallel computing) deals with all the subject matter where the principles of
parallelism are applicable.

• It is a sub-field of computer science that draws on ideas from theoretical computer science, computer
architecture, programming languages, and algorithms.

Application areas –
• Computer graphics and image processing

• Artificial intelligence

• Healthcare

• Numerical computing

• Fluid (e.g., gas, liquid) flow (motion)

• Weather forecasting, etc.


Day-1

Parallel Computing vs. Distributed Computing

• Parallel computing,
o It is usually confined to solving a single problem on multiple processing elements or processors.
o The processors are tightly coupled (highly dependent on each other; changes in one require changes in the others)
for coordination.

• Distributed computing,
o Not necessarily aimed at solving a single problem (e.g., several remote requests (queries) may be resolved in
distributed computing).
o The processors are loosely coupled, usually separated by distances of many feet or miles, and form a network.

Note: Distributed computing is a kind of parallel computing, but the problems to be solved may be different.
Examples:
o Parallel Computing: 10 People pulling a rope to lift a rock
o Distributed Computing: 10 people pulling 10 ropes to lift 10 rocks
Grid Computing:

Three persons, one in India, one in the USA, and the other in Norway,
working to complete a project; they are connected over the Internet.

Distributed computing is also sometimes referred to as concurrent computing.


Day-1
Performance measuring metrics for computer system:
• Parallel run time: The parallel run time is defined as the time that elapses:

- from the moment (t1) that a parallel computation starts

- to the moment (t2) that the last processor finishes execution.

Notation: serial run time Ts, parallel run time Tp.

• Throughput: It is the amount of work performed by a computer in a given time period.


o When the work is measured in instructions and the time in seconds, throughput is measured in instructions
per second; if one instruction completes per cycle, this equals the clock frequency.

• Speedup of a system:
o In computer science, speedup is a measure that gives the relative performance of two systems processing the same
problem.
o The speedup of a parallel system is defined as S = T1(N) / TP(N), where T1(N) is the execution time of the best
sequential algorithm on a problem of size N, and TP(N) is the execution time of the parallel algorithm using
P processors. A timing sketch follows below.
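
A minimal sketch (not from these slides; Python, the sum-of-squares problem, and the values of N and P are assumed for illustration) of estimating the speedup S = T1(N) / TP(N) by timing a serial run against a run on P worker processes.

import time
from multiprocessing import Pool

def partial_sum(bounds):
    # Sum of squares over [lo, hi); each worker handles one chunk.
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

N = 5_000_000   # problem size
P = 4           # number of processors (processing elements) used

if __name__ == "__main__":
    # T1(N): run time of the sequential version.
    t0 = time.time()
    serial = partial_sum((0, N))
    T1 = time.time() - t0

    # TP(N): run time of the parallel version using P processes.
    chunks = [(k * N // P, (k + 1) * N // P) for k in range(P)]
    t0 = time.time()
    with Pool(processes=P) as pool:
        parallel = sum(pool.map(partial_sum, chunks))
    TP = time.time() - t0

    assert serial == parallel
    print("T1 = %.2f s, TP = %.2f s, speedup S = %.2f" % (T1, TP, T1 / TP))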


Day-1/2

 How do we increase the speed of computers?


 FLYNN’S TAXONOMY OF COMPUTERS
