Week 09 2021a
Introduction to the Message Passing Interface (MPI), Irish Centre for High-End Computing (ICHEC) (www.ichec.ie)
Collective Communication
Characteristics of Collective Communication
Barrier Synchronization
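MPI_Barrier blocks each caller until every process in the communicator has entered it. A minimal sketch, assuming an MPI installation (compile with mpicc, run with e.g. mpirun -np 4):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    printf("rank %d: before the barrier\n", rank);

    /* No process leaves the barrier until all processes have entered it. */
    MPI_Barrier(MPI_COMM_WORLD);

    printf("rank %d: after the barrier\n", rank);

    MPI_Finalize();
    return 0;
}
```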
Broadcast
• C: int MPI_Bcast(void *buf, int count, MPI_Datatype datatype,
int root, MPI_Comm comm)
[Diagram: before the broadcast, only the root (e.g., root=1) holds the data; after the broadcast, every process holds a copy.]
• root: the rank of the sending process
• must be given identically by all processes
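A minimal broadcast sketch with root=1, as in the figure (the value 42 is purely illustrative; run with at least 2 processes):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, value = 0;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 1)          /* root=1 as in the figure */
        value = 42;         /* only the root has the data before the bcast */

    /* Every process passes the same root; afterwards all hold the value. */
    MPI_Bcast(&value, 1, MPI_INT, 1, MPI_COMM_WORLD);

    printf("rank %d now has value %d\n", rank, value);

    MPI_Finalize();
    return 0;
}
```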
Scatter
[Diagram: before the scatter, the root (e.g., root=1) holds the array A B C D E; after the scatter, each process holds one element.]
Gather
[Diagram: before the gather, each process holds one element A–E; after the gather, the root (e.g., root=1) holds the whole array A B C D E.]
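A minimal sketch of Scatter followed by the inverse Gather, with root=1 as in the figures; one character per process, so "ABCDE" corresponds to a run with 5 processes:

```c
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const int root = 1;                    /* root=1 as in the figures */
    char *sendbuf = NULL, recvd, *gathered = NULL;

    if (rank == root) {                    /* only the root needs full buffers */
        sendbuf = malloc(size);
        for (int i = 0; i < size; i++)
            sendbuf[i] = 'A' + i;          /* "ABCDE..." */
        gathered = malloc(size);
    }

    /* Scatter: the root sends element i to rank i. */
    MPI_Scatter(sendbuf, 1, MPI_CHAR, &recvd, 1, MPI_CHAR, root, MPI_COMM_WORLD);
    printf("rank %d received '%c'\n", rank, recvd);

    /* Gather: the inverse; rank i's element lands in slot i at the root. */
    MPI_Gather(&recvd, 1, MPI_CHAR, gathered, 1, MPI_CHAR, root, MPI_COMM_WORLD);
    if (rank == root)
        printf("root gathered '%.*s'\n", size, gathered);

    free(sendbuf);
    free(gathered);
    MPI_Finalize();
    return 0;
}
```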
Introduction to MPI Virtual Topologies
Topologies - Motivations
Groups & communicators
Working with groups
[Diagram: ranks 0–5 of MPI_COMM_WORLD split into Odd_group = {1, 3, 5} and Even_group = {0, 2, 4}.]
For the previous example
– Alternatively…
– color = myrank modulo 2 (0 for even ranks, 1 for odd)
– MPI_Comm_split(MPI_COMM_WORLD, color, key, &newcomm)
Group Management
• Group Accessors
– MPI_Group_size(…)
– MPI_Group_rank(…)
– …
• Group Constructors
– MPI_COMM_GROUP(…)
– MPI_GROUP_INCL(…)
– MPI_GROUP_EXCL(…)
– …
• Group Destructors
– MPI_GROUP_FREE(group)
Communicator Management
• Communicator Accessors
– MPI_COMM_SIZE(…)
– MPI_COMM_RANK(…)
– …
• Communicator Constructors
– MPI_COMM_CREATE(…)
– MPI_COMM_SPLIT(…)
• Communicator Destructors
– MPI_COMM_FREE(comm)
Virtual topology
• For more complex mapping, MPI routines are available
• Global array A(1:3000, 1:4000, 1:500) = 6·10^9 words
• on 3 × 4 × 5 = 60 processors
• process coordinates 0..2, 0..3, 0..4
• example: on the process with coordinates ic0=2, ic1=0, ic2=3 (rank=43),
  the decomposition yields, e.g., A(2001:3000, 1:1000, 301:400) = 0.1·10^9 words
Graphical representation
[Figure: the 3 × 4 × 5 process grid overlaid on the 3000 × 4000 × 500 global array, showing the distribution of processes over the grid and the distribution of the global array. Coordinate (2, 0, 3) represents process number 43, which is assigned the cube A(2001:3000, 1:1000, 301:400).]
Virtual Topologies
How to use a Virtual Topology
Topology Types
• Cartesian Topologies
– each process is connected to its neighbor in a virtual grid,
– boundaries can be cyclic, or not,
– processes are identified by Cartesian coordinates,
– communication between any two processes is, of course, still allowed.
• Graph Topologies
– general graphs,
– not covered here.
Creating a Cartesian Virtual Topology
comm_old = MPI_COMM_WORLD
ndims = 2
dims = ( 4, 3 )
periods = ( 1/.true., 0/.false. )
reorder = see next slide
[Diagram: the resulting 4 × 3 grid; ranks 0–11 with their coordinates, e.g., rank 0 at (0,0), rank 3 at (1,0), rank 11 at (3,2).]
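The parameters above translate into a call like the following sketch (run with exactly 4 × 3 = 12 processes for this grid; with more, the surplus processes receive MPI_COMM_NULL):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Comm comm_cart;
    int dims[2]    = {4, 3};   /* 4 x 3 grid, as on the slide */
    int periods[2] = {1, 0};   /* cyclic in the first dimension only */
    int reorder    = 1;        /* let MPI renumber ranks for the hardware */
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, reorder, &comm_cart);

    if (comm_cart != MPI_COMM_NULL) {      /* surplus processes get MPI_COMM_NULL */
        MPI_Comm_rank(comm_cart, &rank);
        printf("cart rank %d\n", rank);
        MPI_Comm_free(&comm_cart);
    }
    MPI_Finalize();
    return 0;
}
```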
Example – A 2-dimensional Cylinder
• Ranks and Cartesian process coordinates in comm_cart
• Ranks in comm and comm_cart may differ, if reorder = 1 or .TRUE.
• This reordering can allow MPI to optimize communications
[Diagram: the 4 × 3 cylinder; each cell shows the rank in comm (upper) and the rank and coordinates in comm_cart (lower), e.g., rank 7 in comm corresponds to rank 0 with coordinates (0,0) in comm_cart.]
Cartesian Mapping Functions
• Mapping ranks to process grid coordinates and back, e.g., rank 7 ↔ coordinates (2, 1)
Own coordinates
• MPI_Cart_rank: coordinates → rank
• MPI_Cart_coords: rank → coordinates
[Diagram: the 4 × 3 grid with ranks 0–11 and their coordinates, e.g., rank 7 at (2,1).]
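A minimal sketch of the two mapping directions on the 4 × 3 grid from the earlier slides:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Comm comm_cart;
    int dims[2] = {4, 3}, periods[2] = {1, 0};
    int rank, coords[2];

    MPI_Init(&argc, &argv);
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 0, &comm_cart);

    if (comm_cart != MPI_COMM_NULL) {
        /* rank -> coordinates */
        MPI_Comm_rank(comm_cart, &rank);
        MPI_Cart_coords(comm_cart, rank, 2, coords);

        /* coordinates -> rank (the inverse mapping) */
        int back;
        MPI_Cart_rank(comm_cart, coords, &back);

        printf("rank %d <-> coords (%d,%d), back to rank %d\n",
               rank, coords[0], coords[1], back);
        MPI_Comm_free(&comm_cart);
    }
    MPI_Finalize();
    return 0;
}
```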
MPI_Cart_shift – Example
[Diagram: two copies of the 4 × 3 grid illustrating a shift; for each process, MPI_Cart_shift returns the ranks of its source and destination neighbours along one dimension, wrapping around in the periodic dimension.]
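A minimal sketch of MPI_Cart_shift on the same 4 × 3 grid; the returned pair is ready to feed into MPI_Sendrecv:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Comm comm_cart;
    int dims[2] = {4, 3}, periods[2] = {1, 0};
    int rank, src, dest;

    MPI_Init(&argc, &argv);
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 0, &comm_cart);

    if (comm_cart != MPI_COMM_NULL) {
        MPI_Comm_rank(comm_cart, &rank);

        /* Shift by +1 along dimension 0 (the cyclic dimension): src is the
           neighbour we would receive from, dest the one we would send to. */
        MPI_Cart_shift(comm_cart, 0, 1, &src, &dest);
        printf("dim 0: rank %d receives from %d, sends to %d\n", rank, src, dest);

        /* Along the non-periodic dimension 1, boundary processes get
           MPI_PROC_NULL as their missing neighbour. */
        MPI_Cart_shift(comm_cart, 1, 1, &src, &dest);
        printf("dim 1: rank %d receives from %d, sends to %d\n", rank, src, dest);

        MPI_Comm_free(&comm_cart);
    }
    MPI_Finalize();
    return 0;
}
```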
MPI_Cart_sub – Example
• Ranks and Cartesian process coordinates in comm_sub
[Diagram: with remain_dims = (true, false), the 4 × 3 grid is split into three 1-dimensional sub-communicators, one per row; e.g., comm_cart ranks 0, 3, 6, 9 become ranks 0–3 with coordinates (0)–(3) in their comm_sub.]
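The row split above can be sketched as follows; each process ends up in the comm_sub belonging to its row:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Comm comm_cart, comm_sub;
    int dims[2] = {4, 3}, periods[2] = {1, 0};
    int rank, subrank;

    MPI_Init(&argc, &argv);
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 0, &comm_cart);

    if (comm_cart != MPI_COMM_NULL) {
        /* Keep dimension 0, drop dimension 1: one 1-D sub-communicator
           of 4 processes per value of the second coordinate. */
        int remain_dims[2] = {1, 0};   /* (true, false) as on the slide */
        MPI_Cart_sub(comm_cart, remain_dims, &comm_sub);

        MPI_Comm_rank(comm_cart, &rank);
        MPI_Comm_rank(comm_sub, &subrank);
        printf("cart rank %d -> sub rank %d\n", rank, subrank);

        MPI_Comm_free(&comm_sub);
        MPI_Comm_free(&comm_cart);
    }
    MPI_Finalize();
    return 0;
}
```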