An Introduction to Parallel
Programming

SECOND EDITION

Peter S. Pacheco
University of San Francisco

Matthew Malensek
University of San Francisco
Table of Contents

Cover image

Title page

Copyright

Dedication

Preface

Chapter 1: Why parallel computing

1.1. Why we need ever-increasing performance

1.2. Why we're building parallel systems

1.3. Why we need to write parallel programs

1.4. How do we write parallel programs?

1.5. What we'll be doing

1.6. Concurrent, parallel, distributed

1.7. The rest of the book


1.8. A word of warning

1.9. Typographical conventions

1.10. Summary

1.11. Exercises

Bibliography

Chapter 2: Parallel hardware and parallel software

2.1. Some background

2.2. Modifications to the von Neumann model

2.3. Parallel hardware

2.4. Parallel software

2.5. Input and output

2.6. Performance

2.7. Parallel program design

2.8. Writing and running parallel programs

2.9. Assumptions

2.10. Summary

2.11. Exercises

Bibliography

Chapter 3: Distributed memory programming with MPI


3.1. Getting started

3.2. The trapezoidal rule in MPI

3.3. Dealing with I/O

3.4. Collective communication

3.5. MPI-derived datatypes

3.6. Performance evaluation of MPI programs

3.7. A parallel sorting algorithm

3.8. Summary

3.9. Exercises

3.10. Programming assignments

Bibliography

Chapter 4: Shared-memory programming with Pthreads

4.1. Processes, threads, and Pthreads

4.2. Hello, world

4.3. Matrix-vector multiplication

4.4. Critical sections

4.5. Busy-waiting

4.6. Mutexes

4.7. Producer–consumer synchronization and semaphores


4.8. Barriers and condition variables

4.9. Read-write locks

4.10. Caches, cache-coherence, and false sharing

4.11. Thread-safety

4.12. Summary

4.13. Exercises

4.14. Programming assignments

Bibliography

Chapter 5: Shared-memory programming with OpenMP

5.1. Getting started

5.2. The trapezoidal rule

5.3. Scope of variables

5.4. The reduction clause

5.5. The parallel for directive

5.6. More about loops in OpenMP: sorting

5.7. Scheduling loops

5.8. Producers and consumers

5.9. Caches, cache coherence, and false sharing

5.10. Tasking
5.11. Thread-safety

5.12. Summary

5.13. Exercises

5.14. Programming assignments

Bibliography

Chapter 6: GPU programming with CUDA

6.1. GPUs and GPGPU

6.2. GPU architectures

6.3. Heterogeneous computing

6.4. CUDA hello

6.5. A closer look

6.6. Threads, blocks, and grids

6.7. Nvidia compute capabilities and device architectures

6.8. Vector addition

6.9. Returning results from CUDA kernels

6.10. CUDA trapezoidal rule I

6.11. CUDA trapezoidal rule II: improving performance

6.12. Implementation of trapezoidal rule with warpSize thread blocks

6.13. CUDA trapezoidal rule III: blocks with more than one warp
6.14. Bitonic sort

6.15. Summary

6.16. Exercises

6.17. Programming assignments

Bibliography

Chapter 7: Parallel program development

7.1. Two n-body solvers

7.2. Sample sort

7.3. A word of caution

7.4. Which API?

7.5. Summary

7.6. Exercises

7.7. Programming assignments

Bibliography

Chapter 8: Where to go from here

Bibliography

Bibliography

Bibliography
Index
Copyright
Morgan Kaufmann is an imprint of Elsevier
50 Hampshire Street, 5th Floor, Cambridge, MA 02139, United States

Copyright © 2022 Elsevier Inc. All rights reserved.

No part of this publication may be reproduced or transmitted in any


form or by any means, electronic or mechanical, including
photocopying, recording, or any information storage and retrieval
system, without permission in writing from the publisher. Details on
how to seek permission, further information about the Publisher's
permissions policies and our arrangements with organizations such
as the Copyright Clearance Center and the Copyright Licensing
Agency, can be found at our website:
www.elsevier.com/permissions.

This book and the individual contributions contained in it are


protected under copyright by the Publisher (other than as may be
noted herein).
Cover art: “seven notations,” nickel/silver etched plates, acrylic on
wood structure, copyright © Holly Cohn

Notices
Knowledge and best practice in this field are constantly changing.
As new research and experience broaden our understanding,
changes in research methods, professional practices, or medical
treatment may become necessary.
Practitioners and researchers must always rely on their own
experience and knowledge in evaluating and using any
information, methods, compounds, or experiments described
herein. In using such information or methods they should be
mindful of their own safety and the safety of others, including
parties for whom they have a professional responsibility.

To the fullest extent of the law, neither the Publisher nor the
authors, contributors, or editors, assume any liability for any injury
and/or damage to persons or property as a matter of products
liability, negligence or otherwise, or from any use or operation of
any methods, products, instructions, or ideas contained in the
material herein.

Library of Congress Cataloging-in-Publication Data


A catalog record for this book is available from the Library of
Congress

British Library Cataloguing-in-Publication Data


A catalogue record for this book is available from the British Library

ISBN: 978-0-12-804605-0

For information on all Morgan Kaufmann publications visit our


website at https://www.elsevier.com/books-and-journals

Publisher: Katey Birtcher


Acquisitions Editor: Stephen Merken
Content Development Manager: Meghan Andress
Publishing Services Manager: Shereen Jameel
Production Project Manager: Rukmani Krishnan
Designer: Victoria Pearson

Typeset by VTeX
Printed in the United States of America

Last digit is the print number: 9 8 7 6 5 4 3 2 1


Dedication

To the memory of Robert S. Miller


Preface
Parallel hardware has been ubiquitous for some time now: it's
difficult to find a laptop, desktop, or server that doesn't use a
multicore processor. Cluster computing is nearly as common today
as high-powered workstations were in the 1990s, and cloud
computing is making distributed-memory systems as accessible as
desktops. In spite of this, most computer science majors graduate
with little or no experience in parallel programming. Many colleges
and universities offer upper-division elective courses in parallel
computing, but since most computer science majors have to take a
large number of required courses, many graduate without ever
writing a multithreaded or multiprocess program.
It seems clear that this state of affairs needs to change. Whereas
many programs can obtain satisfactory performance on a single core,
computer scientists should be made aware of the potentially vast
performance improvements that can be obtained with parallelism,
and they should be able to exploit this potential when the need
arises.
An Introduction to Parallel Programming was written to partially
address this problem. It provides an introduction to writing parallel
programs using MPI, Pthreads, OpenMP, and CUDA, four of the
most widely used APIs for parallel programming. The intended
audience is students and professionals who need to write parallel
programs. The prerequisites are minimal: a college-level course in
mathematics and the ability to write serial programs in C.
The prerequisites are minimal, because we believe that students
should be able to start programming parallel systems as early as
possible. At the University of San Francisco, computer science
students can fulfill a requirement for the major by taking a course on
which this text is based immediately after taking the “Introduction
to Computer Science I” course that most majors take in the first
semester of their freshman year. It has been our experience that there
really is no reason for students to defer writing parallel programs
until their junior or senior year. To the contrary, the course is
popular, and students have found that using concurrency in other
courses is much easier after having taken this course.
If second-semester freshmen can learn to write parallel programs
by taking a class, then motivated computing professionals should be
able to learn to write parallel programs through self-study. We hope
this book will prove to be a useful resource for them.
The Second Edition
It has been nearly ten years since the first edition of An Introduction
to Parallel Programming was published. During that time much has
changed in the world of parallel programming, but, perhaps
surprisingly, much also remains the same. Our intent in writing this
second edition has been to preserve the material from the first
edition that continues to be generally useful, but also to add new
material where we felt it was needed.
The most obvious addition is the inclusion of a new chapter on
CUDA programming. When the first edition was published, CUDA
was still very new. It was already clear that the use of GPUs in high-
performance computing would become very widespread, but at that
time we felt that GPGPU wasn't readily accessible to programmers
with relatively little experience. In the last ten years, that has clearly
changed. Of course, CUDA is not a standard, and features are
added, modified, and deleted with great rapidity. As a consequence,
authors who use CUDA must present a subject that changes much
faster than a standard, such as MPI, Pthreads, or OpenMP. In spite of
this, we hope that our presentation of CUDA will continue to be
useful for some time.
Another big change is that Matthew Malensek has come onboard
as a coauthor. Matthew is a relatively new colleague at the
University of San Francisco, but he has extensive experience with
both the teaching and application of parallel computing. His
contributions have greatly improved the second edition.
About This Book
As we noted earlier, the main purpose of the book is to teach
parallel programming in MPI, Pthreads, OpenMP, and CUDA to an
audience with a limited background in computer science and no
previous experience with parallelism. We also wanted to make the
book as flexible as possible so that readers who have no interest in
learning one or two of the APIs can still read the remaining material
with little effort. Thus the chapters on the four APIs are largely
independent of each other: they can be read in any order, and one or
two of these chapters can be omitted. This independence has some
cost: it was necessary to repeat some of the material in these
chapters. Of course, repeated material can be simply scanned or
skipped.
On the other hand, readers with no prior experience with parallel
computing should read Chapter 1 first. This chapter attempts to
provide a relatively nontechnical explanation of why parallel
systems have come to dominate the computer landscape. It also
provides a short introduction to parallel systems and parallel
programming.
Chapter 2 provides technical background on computer hardware
and software. Chapters 3 to 6 provide independent introductions to
MPI, Pthreads, OpenMP, and CUDA, respectively. Chapter 7
illustrates the development of two different parallel programs using
each of the four APIs. Finally, Chapter 8 provides a few pointers to
additional information on parallel computing.
We use the C programming language for developing our
programs, because all four APIs have C-language interfaces, and,
since C is such a small language, it is a relatively easy language to
learn—especially for C++ and Java programmers, since they will
already be familiar with C's control structures.
Classroom Use
This text grew out of a lower-division undergraduate course at the
University of San Francisco. The course fulfills a requirement for the
computer science major, and it also fulfills a prerequisite for the
undergraduate operating systems, architecture, and networking
courses. The course begins with a four-week introduction to C
programming. Since most of the students have already written Java
programs, the bulk of this introduction is devoted to the use of pointers
in C.1 The remainder of the course provides introductions first to
programming in MPI, then Pthreads and/or OpenMP, and it finishes
with material covering CUDA.
We cover most of the material in Chapters 1, 3, 4, 5, and 6, and
parts of the material in Chapters 2 and 7. The background in Chapter
2 is introduced as the need arises. For example, before discussing
cache coherence issues in OpenMP (Chapter 5), we cover the
material on caches in Chapter 2.
The coursework consists of weekly homework assignments, five
programming assignments, a couple of midterms and a final exam.
The homework assignments usually involve writing a very short
program or making a small modification to an existing program.
Their purpose is to ensure that the students stay current with the
coursework, and to give the students hands-on experience with
ideas introduced in class. It seems likely that their existence has been
one of the principal reasons for the course's success. Most of the
exercises in the text are suitable for these brief assignments.
The programming assignments are larger than the programs
written for homework, but we typically give the students a good
deal of guidance: we'll frequently include pseudocode in the
assignment and discuss some of the more difficult aspects in class.
This extra guidance is often crucial: it's easy to give programming
assignments that will take far too long for the students to complete.
The results of the midterms and finals and the enthusiastic reports
of the professor who teaches operating systems suggest that the
course is actually very successful in teaching students how to write
parallel programs.
For more advanced courses in parallel computing, the text and its
online supporting materials can serve as a supplement so that much
of the material on the syntax and semantics of the four APIs can be
assigned as outside reading.
The text can also be used as a supplement for project-based
courses and courses outside of computer science that make use of
parallel computation.
Support Materials
An online companion site for the book is located at
www.elsevier.com/books-and-journals/book-
companion/9780128046050. This site will include errata and
complete source for the longer programs we discuss in the text.
Additional material for instructors, including downloadable figures
and solutions to the exercises in the book, can be downloaded from
https://educate.elsevier.com/9780128046050.
We would greatly appreciate readers' letting us know of any errors
they find. Please send email to mmalensek@usfca.edu if you do find
a mistake.
Acknowledgments
In the course of working on this book we've received considerable
help from many individuals. Among them we'd like to thank the
reviewers of the second edition, Steven Frankel (Technion) and Il-
Hyung Cho (Saginaw Valley State University), who read and
commented on draft versions of the new CUDA chapter. We'd also
like to thank the reviewers who read and commented on the initial
proposal for the book: Fikret Ercal (Missouri University of Science
and Technology), Dan Harvey (Southern Oregon University), Joel
Hollingsworth (Elon University), Jens Mache (Lewis and Clark
College), Don McLaughlin (West Virginia University), Manish
Parashar (Rutgers University), Charlie Peck (Earlham College),
Stephen C. Renk (North Central College), Rolfe Josef Sassenfeld (The
University of Texas at El Paso), Joseph Sloan (Wofford College),
Michela Taufer (University of Delaware), Pearl Wang (George Mason
University), Bob Weems (University of Texas at Arlington), and
Cheng-Zhong Xu (Wayne State University). We are also deeply
grateful to the following individuals for their reviews of various
chapters of the book: Duncan Buell (University of South Carolina),
Matthias Gobbert (University of Maryland, Baltimore County),
Krishna Kavi (University of North Texas), Hong Lin (University of
Houston–Downtown), Kathy Liszka (University of Akron), Leigh
Little (The State University of New York), Xinlian Liu (Hood
College), Henry Tufo (University of Colorado at Boulder), Andrew
Sloss (Consultant Engineer, ARM), and Gengbin Zheng (University
of Illinois). Their comments and suggestions have made the book
immeasurably better. Of course, we are solely responsible for
remaining errors and omissions.
Slides and the solutions manual for the first edition were prepared
by Kathy Liszka and Jinyoung Choi, respectively. Thanks to both of
them.
The staff at Elsevier has been very helpful throughout this project.
Nate McFadden helped with the development of the text. Todd
Green and Steve Merken were the acquisitions editors. Meghan
Andress was the content development manager. Rukmani Krishnan
was the production editor. Victoria Pearson was the designer. They
did a great job, and we are very grateful to all of them.
Our colleagues in the computer science and mathematics
departments at USF have been extremely helpful during our work
on the book. Peter would like to single out Prof. Gregory Benson for
particular thanks: his understanding of parallel computing—
especially Pthreads and semaphores—has been an invaluable
resource. We're both very grateful to our system administrators,
Alexey Fedosov and Elias Husary. They've patiently and efficiently
dealt with all of the “emergencies” that cropped up while we were
working on programs for the book. They've also done an amazing
job of providing us with the hardware we used to do all program
development and testing.
Peter would never have been able to finish the book without the
encouragement and moral support of his friends Holly Cohn, John
Dean, and Maria Grant. He will always be very grateful for their
help and their friendship. He is especially grateful to Holly for
allowing us to use her work, seven notations, for the cover.
Matthew would like to thank his colleagues in the USF
Department of Computer Science, as well as Maya Malensek and
Doyel Sadhu, for their love and support. Most of all, he would like to
thank Peter Pacheco for being a mentor and infallible source of
advice and wisdom during the formative years of his career in
academia.
Our biggest debt is to our students. As always, they showed us
what was too easy and what was far too difficult. They taught us
how to teach parallel computing. Our deepest thanks to all of them.
1 Interestingly, a number of students have said that they found the
use of C pointers more difficult than MPI programming.
Chapter 1: Why parallel
computing
From 1986 to 2003, the performance of microprocessors increased, on
average, more than 50% per year [28]. This unprecedented increase
meant that users and software developers could often simply wait
for the next generation of microprocessors to obtain increased
performance from their applications. Since 2003, however, single-
processor performance improvement has slowed to the point that in
the period from 2015 to 2017, it increased at less than 4% per year
[28]. This difference is dramatic: at 50% per year, performance will
increase by almost a factor of 60 in 10 years, while at 4%, it will
increase by about a factor of 1.5.
Furthermore, this difference in performance increase has been
associated with a dramatic change in processor design. By 2005,
most of the major manufacturers of microprocessors had decided
that the road to rapidly increasing performance lay in the direction
of parallelism. Rather than trying to continue to develop ever-faster
monolithic processors, manufacturers started putting multiple
complete processors on a single integrated circuit.
This change has a very important consequence for software
developers: simply adding more processors will not magically
improve the performance of the vast majority of serial programs,
that is, programs that were written to run on a single processor. Such
programs are unaware of the existence of multiple processors, and
the performance of such a program on a system with multiple
processors will be effectively the same as its performance on a single
processor of the multiprocessor system.
All of this raises a number of questions:
• Why do we care? Aren't single-processor systems fast
enough?
• Why can't microprocessor manufacturers continue to
develop much faster single-processor systems? Why build
parallel systems? Why build systems with multiple
processors?
• Why can't we write programs that will automatically convert
serial programs into parallel programs, that is, programs
that take advantage of the presence of multiple processors?

Let's take a brief look at each of these questions. Keep in mind,


though, that some of the answers aren't carved in stone. For
example, the performance of many applications may already be
more than adequate.

1.1 Why we need ever-increasing performance


The vast increases in computational power that we've been enjoying
for decades now have been at the heart of many of the most dramatic
advances in fields as diverse as science, the Internet, and
entertainment. For example, decoding the human genome, ever
more accurate medical imaging, astonishingly fast and accurate Web
searches, and ever more realistic and responsive computer games
would all have been impossible without these increases. Indeed,
more recent increases in computational power would have been
difficult, if not impossible, without earlier increases. But we can
never rest on our laurels. As our computational power increases, the
number of problems that we can seriously consider solving also
increases. Here are a few examples:

• Climate modeling. To better understand climate change, we


need far more accurate computer models, models that
include interactions between the atmosphere, the oceans,
solid land, and the ice caps at the poles. We also need to be
able to make detailed studies of how various interventions
might affect the global climate.
• Protein folding. It's believed that misfolded proteins may be
involved in diseases such as Huntington's, Parkinson's, and
Alzheimer's, but our ability to study configurations of
complex molecules such as proteins is severely limited by
our current computational power.
• Drug discovery. There are many ways in which increased
computational power can be used in research into new
medical treatments. For example, there are many drugs that
are effective in treating a relatively small fraction of those
suffering from some disease. It's possible that we can devise
alternative treatments by careful analysis of the genomes of
the individuals for whom the known treatment is ineffective.
This, however, will involve extensive computational analysis
of genomes.
• Energy research. Increased computational power will make it
possible to program much more detailed models of
technologies, such as wind turbines, solar cells, and batteries.
These programs may provide the information needed to
construct far more efficient clean energy sources.
• Data analysis. We generate tremendous amounts of data. By
some estimates, the quantity of data stored worldwide
doubles every two years [31], but the vast majority of it is
largely useless unless it's analyzed. As an example, knowing
the sequence of nucleotides in human DNA is, by itself, of
little use. Understanding how this sequence affects
development and how it can cause disease requires extensive
analysis. In addition to genomics, huge quantities of data are
generated by particle colliders, such as the Large Hadron
Collider at CERN, medical imaging, astronomical research,
and Web search engines—to name a few.
These and a host of other problems won't be solved without
tremendous increases in computational power.

1.2 Why we're building parallel systems


Much of the tremendous increase in single-processor performance
was driven by the ever-increasing density of transistors—the
electronic switches—on integrated circuits. As the size of transistors
decreases, their speed can be increased, and the overall speed of the
integrated circuit can be increased. However, as the speed of
transistors increases, their power consumption also increases. Most
of this power is dissipated as heat, and when an integrated circuit
gets too hot, it becomes unreliable. In the first decade of the twenty-
first century, air-cooled integrated circuits reached the limits of their
ability to dissipate heat [28].
Therefore it is becoming impossible to continue to increase the
speed of integrated circuits. Indeed, in the last few years, the
increase in transistor density has slowed dramatically [36].
But given the potential of computing to improve our existence,
there is a moral imperative to continue to increase computational
power.
How then, can we continue to build ever more powerful
computers? The answer is parallelism. Rather than building ever-
faster, more complex, monolithic processors, the industry has
decided to put multiple, relatively simple, complete processors on a
single chip. Such integrated circuits are called multicore processors,
and core has become synonymous with central processing unit, or
CPU. In this setting a conventional processor with one CPU is often
called a single-core system.

1.3 Why we need to write parallel programs


Most programs that have been written for conventional, single-core
systems cannot exploit the presence of multiple cores. We can run
multiple instances of a program on a multicore system, but this is
often of little help. For example, being able to run multiple instances
of our favorite game isn't really what we want—we want the
program to run faster with more realistic graphics. To do this, we
need to either rewrite our serial programs so that they're parallel, so
that they can make use of multiple cores, or write translation
programs, that is, programs that will automatically convert serial
programs into parallel programs. The bad news is that researchers
have had very limited success writing programs that convert serial
programs in languages such as C, C++, and Java into parallel
programs.
This isn't terribly surprising. While we can write programs that
recognize common constructs in serial programs, and automatically
translate these constructs into efficient parallel constructs, the
sequence of parallel constructs may be terribly inefficient. For
example, we can view the multiplication of two matrices as a
sequence of dot products, but parallelizing a matrix multiplication as
a sequence of parallel dot products is likely to be fairly slow on
many systems.
An efficient parallel implementation of a serial program may not
be obtained by finding efficient parallelizations of each of its steps.
Rather, the best parallelization may be obtained by devising an
entirely new algorithm.
As an example, suppose that we need to compute n values and
add them together. We know that this can be done with the
following serial code:
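(In the sketch below, Compute_next_value stands in for whatever
application-specific code produces each value.)

   sum = 0;
   for (i = 0; i < n; i++) {
      x = Compute_next_value(. . .);
      sum += x;
   }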

Now suppose we also have p cores and p is much smaller than n. Then
each core can form a partial sum of approximately n/p values:
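(Sketch; my_first_i and my_last_i stand for the bounds of the block
of values assigned to this core.)

   my_sum = 0;
   my_first_i = . . . ;
   my_last_i = . . . ;
   for (my_i = my_first_i; my_i < my_last_i; my_i++) {
      my_x = Compute_next_value(. . .);
      my_sum += my_x;
   }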
Here the prefix my_ indicates that each core is using its own, private
variables, and each core can execute this block of code
independently of the other cores.
After each core completes execution of this code, its variable
my_sum will store the sum of the values computed by its calls to
Compute_next_value. For example, if there are eight cores, n = 24,
and the 24 calls to Compute_next_value return the values

1, 4, 3, 9, 2, 8, 5, 1, 1, 6, 2, 7, 2, 5, 0, 4, 1, 8, 6, 5, 1, 2, 3, 9,

then the values stored in my_sum might be (taking the 24 values in
consecutive groups of three, one group per core):

Core      0    1    2    3    4    5    6    7
my_sum    8   19    7   15    7   13   12   14

Here we're assuming the cores are identified by nonnegative
integers in the range 0, 1, 2, …, p − 1, where p is the number of cores.
When the cores are done computing their values of my_sum, they can
form a global sum by sending their results to a designated “master”
core, which can add their results:
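(In pseudocode, since the details of sending and receiving depend on
the particular API:)

   if (I'm the master core) {
      sum = my_sum;
      for each core other than myself {
         receive value from core;
         sum += value;
      }
   } else {
      send my_sum to the master core;
   }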

In our example, if the master core is core 0, it would add the values
8 + 19 + 7 + 15 + 7 + 13 + 12 + 14 = 95.
But you can probably see a better way to do this—especially if the
number of cores is large. Instead of making the master core do all the
work of computing the final sum, we can pair the cores so that while
core 0 adds in the result of core 1, core 2 can add in the result of core
3, core 4 can add in the result of core 5, and so on. Then we can
repeat the process with only the even-ranked cores: 0 adds in the
result of 2, 4 adds in the result of 6, and so on. Now cores divisible
by 4 repeat the process, and so on. See Fig. 1.1. The circles contain
the current value of each core's sum, and the lines with arrows
indicate that one core is sending its sum to another core. The plus
signs indicate that a core is receiving a sum from another core and
adding the received sum into its own sum.

FIGURE 1.1 Multiple cores forming a global sum.
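
The pairing scheme is easy to check with ordinary serial C. The
following sketch (a serial simulation, not a parallel program) just
walks the tree in Fig. 1.1, starting from the eight partial sums in
our example, and prints the global sum:

   #include <stdio.h>

   int main(void) {
      int p = 8;                                 /* number of cores */
      int core_sum[] = {8, 19, 7, 15, 7, 13, 12, 14};

      /* At each stage, a core whose rank is a multiple of 2*stride
         adds in the sum from the core stride ranks above it. */
      for (int stride = 1; stride < p; stride *= 2)
         for (int rank = 0; rank + stride < p; rank += 2 * stride)
            core_sum[rank] += core_sum[rank + stride];

      printf("global sum = %d\n", core_sum[0]);  /* prints 95 */
      return 0;
   }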

For both “global” sums, the master core (core 0) does more work
than any other core, and the length of time it takes the program to
complete the final sum should be the length of time it takes for the
master to complete. However, with eight cores, the master will carry
out seven receives and adds using the first method, while with the
second method, it will only carry out three. So the second method
results in an improvement of more than a factor of two. The
difference becomes much more dramatic with large numbers of
cores. With 1000 cores, the first method will require 999 receives and
adds, while the second will only require 10—an improvement of
almost a factor of 100!
The first global sum is a fairly obvious generalization of the serial
global sum: divide the work of adding among the cores, and after
each core has computed its part of the sum, the master core simply
repeats the basic serial addition—if there are p cores, then it needs to
add p values. The second global sum, on the other hand, bears little
relation to the original serial addition.
The point here is that it's unlikely that a translation program
would “discover” the second global sum. Rather, there would more
likely be a predefined efficient global sum that the translation
program would have access to. It could “recognize” the original
serial loop and replace it with a precoded, efficient, parallel global
sum.
We might expect that software could be written so that a large
number of common serial constructs could be recognized and
efficiently parallelized, that is, modified so that they can use
multiple cores. However, as we apply this principle to ever more
complex serial programs, it becomes more and more difficult to
recognize the construct, and it becomes less and less likely that we'll
have a precoded, efficient parallelization.
Thus we cannot simply continue to write serial programs; we
must write parallel programs, programs that exploit the power of
multiple processors.

1.4 How do we write parallel programs?


There are a number of possible answers to this question, but most of
them depend on the basic idea of partitioning the work to be done
among the cores. There are two widely used approaches: task-
parallelism and data-parallelism. In task-parallelism, we partition
the various tasks carried out in solving the problem among the cores.
In data-parallelism, we partition the data used in solving the
problem among the cores, and each core carries out more or less
similar operations on its part of the data.
As an example, suppose that Prof P has to teach a section of
“Survey of English Literature.” Also suppose that Prof P has one
hundred students in her section, so she's been assigned four teaching
assistants (TAs): Mr. A, Ms. B, Mr. C, and Ms. D. At last the semester
is over, and Prof P makes up a final exam that consists of five
questions. To grade the exam, she and her TAs might consider the
following two options: each of them can grade all one hundred
responses to one of the questions; say, P grades question 1, A grades
question 2, and so on. Alternatively, they can divide the one
hundred exams into five piles of twenty exams each, and each of
them can grade all the papers in one of the piles; P grades the papers
in the first pile, A grades the papers in the second pile, and so on.
In both approaches the “cores” are the professor and her TAs. The
first approach might be considered an example of task-parallelism.
There are five tasks to be carried out: grading the first question,
grading the second question, and so on. Presumably, the graders
will be looking for different information in question 1, which is
about Shakespeare, from the information in question 2, which is
about Milton, and so on. So the professor and her TAs will be
“executing different instructions.”
On the other hand, the second approach might be considered an
example of data-parallelism. The “data” are the students' papers,
which are divided among the cores, and each core applies more or
less the same grading instructions to each paper.
The first part of the global sum example in Section 1.3 would
probably be considered an example of data-parallelism. The data are
the values computed by Compute_next_value, and each core carries out
roughly the same operations on its assigned elements: it computes
the required values by calling Compute_next_value and adds them together.
The second part of the first global sum example might be considered
an example of task-parallelism. There are two tasks: receiving and
adding the cores' partial sums, which is carried out by the master
core; and giving the partial sum to the master core, which is carried
out by the other cores.
When the cores can work independently, writing a parallel
program is much the same as writing a serial program. Things get a
great deal more complex when the cores need to coordinate their
work. In the second global sum example, although the tree structure
in the diagram is very easy to understand, writing the actual code is
relatively complex. See Exercises 1.3 and 1.4. Unfortunately, it's
much more common for the cores to need coordination.
In both global sum examples, the coordination involves
communication: one or more cores send their current partial sums to
another core. The global sum examples should also involve
coordination through load balancing. In the first part of the global
sum, it's clear that we want the amount of time taken by each core to
be roughly the same as the time taken by the other cores. If the cores
are identical, and each call to Compute_next_value requires the same amount
of work, then we want each core to be assigned roughly the same
number of values as the other cores. If, for example, one core has to
compute most of the values, then the other cores will finish much
sooner than the heavily loaded core, and their computational power
will be wasted.
A third type of coordination is synchronization. As an example,
suppose that instead of computing the values to be added, the values
are read from stdin. Say, x is an array that is read in by the master core:
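(Sketch; the test for the master core is pseudocode.)

   if (I'm the master core)
      for (my_i = 0; my_i < n; my_i++)
         scanf("%lf", &x[my_i]);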

In most systems the cores are not automatically synchronized.


Rather, each core works at its own pace. In this case, the problem is
that we don't want the other cores to race ahead and start computing
their partial sums before the master is done initializing x and making
it available to the other cores. That is, the cores need to wait before
starting execution of the code:
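(Sketch of the code in question:)

   my_sum = 0;
   for (my_i = my_first_i; my_i < my_last_i; my_i++)
      my_sum += x[my_i];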

We need to add in a point of synchronization between the
initialization of x and the computation of the partial sums:
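(Here Synchronize_cores is a hypothetical barrier function that
returns only after every core has called it:)

   Synchronize_cores();
   my_sum = 0;
   for (my_i = my_first_i; my_i < my_last_i; my_i++)
      my_sum += x[my_i];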
applause. The other five comedies met with equal commendation
from the Romans, though Volcatius[16], in his enumeration of them,
says,
Sumetur Hecyra sexta ex his fabula.
The Step-mother is reckoned the last of the six.
The Eunuch was acted twice in one day[17]; and the author
received for it a higher price than was ever paid for any comedy
before that time, viz., eight thousand sesterces[18]: on account of the
magnitude of the sum, it is mentioned in the title of that play.
Varro[19] even prefers the opening scenes of the Brothers of Terence
to the same part in Menander. The report that Terence was indebted
to Scipio and Lælius, with whom he was so intimate, for parts of his
comedies, is well known; and he himself scarcely seems to have
discouraged the assertion, as he never seriously denies it: witness
the Prologue to the Brothers:

Nam quod isti dicunt malevoli, homines nobiles


Eum adjutare, assidueque una scribere:
Quod illi maledictum vehemens existimant,
Eam laudem hic ducit maximam, cum illis placet
Qui vobis universis, et populo placent:
Quorum opera in bello, in otio, in negotio
Suo quisque tempore usus est sine superbia.

“And as for what those malicious railers say[20], who assert that
certain noble persons assist the poet, and very frequently write with
him, what they think a reproach, he considers as the highest praise;
that he should be thought to please those who please you, and all
Rome; those who have assisted every one in war, and peace, and
even in their private affairs, with the greatest services; and yet have
been always free from arrogance.” It is likely, that he might wish, in
some measure, to encourage this idea, because he knew that it
would not be displeasing to Scipio and Lælius: however, the opinion
has gained ground, and is strongly entertained even to the present
day. Quintus Memmius[21], in an oration in his own defence, says,
Publius Africanus, qui a Terentio personam mutuatus, quæ
domi luserat ipse, nomine illius in scenam detulit.——
“Publius Africanus, who borrowed the name of Terence for
those plays which he composed at home for his
diversion.——”
Cornelius Nepos[22] asserts, that he has it from the very first
authority, that Caius Lælius being at his country-house at [23]Puteoli,
on the first of March[24], and being called to supper by his wife at an
earlier hour than usual, requested that he might not be interrupted;
and afterwards coming to table very late, he declared that he had
scarcely ever succeeded better in composition than at that time; and,
being asked to repeat the verses, he read the following from the
Self-tormentor, Act IV, Scene III.

Satis pol proterve me Syri promissa huc induxerunt


Decem minas quas mihi dare pollicitus est, quod si is nunc me
Deceperit, sæpe obsecrans me, ut veniam, frustra veniet:
Aut, cum venturam dixero, et constituero, cum is certe
Renunciârit; Clitiphon cum in spe pendebit animi
Decipiam, ac non veniam; Syrus mihi tergo pænas pendet.

“Truly this Syrus has coaxed me hither, impertinently enough, with


his fine promises that I should receive ten minæ; but, if he deceives
me this time, ’twill be to no purpose to ask me to come again; or, if I
promise, and appoint to come, I’ll take good care to disappoint him.
Clitipho, who will be full of eager hope to see me, will I deceive, and
will not come; and Syrus’ back shall pay the penalty.”
Santra[25] thinks, that if Terence had required any assistance in
his comedies; he would not have requested it from Scipio and
Lælius, who were then extremely young[26]; but from [27]Caius
Sulpicius Gallus, a man of great learning, who also was the first
person who procured[28] the representation of comedies at the
consular games or from [29]Quintus Fabius Labeo; or from[30]
Marcus Popilius Lænas, two eminent poets, and persons[31] of
consular dignity: and Terence himself, speaking of those who were
reported to have assisted him, does not mention them as young
men, but as persons of weight and experience, who had served the
Romans in peace, in war, and in private business.
After the publication of his six comedies, he quitted Rome, in the
thirty-fifth year of his age, and returned no more. Some suppose that
he undertook this journey with a view to silence the reports of his
receiving assistance from others in the composition of his plays:
others, that he went with a design to inform himself more perfectly of
the manners and customs of Greece.
Volcatius speaks of his death as follows:

Sed ut Afer sex populo edidit comœdias


Iter hinc in Asiam fecit: navim cum semel
Conscendit, visus nunquam est. Sic vita vacat.

“Terence, after having written six comedies, embarked for Asia,


and was seen no more. He perished at sea.”
Quintus Consentius[32] writes, that he died at sea, as he was
returning from Greece, with one hundred and eight plays, translated
from Menander[33]. Other writers affirm, that he died at Stymphalus,
a town in Arcadia, or in Leucadia[34], in the consulate of[35] Cneus
Cornelius Dolabella and Marcus Fulvius Nobilior, and that his end
was hastened by extreme grief for the loss of the comedies which he
had translated, and some others which he had composed himself,
and sent before him in a vessel which was afterwards wrecked.
He is said to have been of a middle stature, well-shaped, and of a
dark complexion. He left one daughter, who was afterwards married
to [36]a Roman knight, and bequeathed to her a garden of [37]XX
jugera, near the Appian Way, and close to the [38]Villa Martis: it is
therefore surprising that Portius should write thus:

——nihil Publius
Scipio profuit, nihil ei Lælius, nihil Furius:
Tres per idem tempus qui agitabant nobiles facillime,
Eorum ille opera ne domum quidem habuit conductitiam:
Saltem ut esset, quo referret obitum domini servulus.

“His three great friends, Scipio, Lælius, and Furius, give him no
assistance, nor even enable him to hire a house, that there might at
least be a place where his slave might announce to Rome his
master’s death.”
Afranius[39] prefers Terence to all the comic poets, saying, in his
Compitalia[40].
Terentio non similem dices quempiam.

“Terence is without an equal.”


But Volcatius places him not only after [41]Nævius, [42]Plautus,
and [43]Cæcilius, but even after [44]Licinius. [45]Cicero, in his
ΛΕΙΜΩΝ, writes of Terence thus,

Tu quoque qui solus lecto sermone, Terenti,


Conversum, expressumque Latina voce Menandrum
In medio populi sedatis vocibus effers,
Quicquid come loquens, ac omnia dulcia dicens.

“And thou, also, O Terence, whose pure style alone could make
Menander speak the Latin tongue, thou, with the sweetest harmony
and grace, hast given him to Rome.”
Also Caius Julius Cæsar[46],
Tu quoque tu in Summis, O dimidiate Menander,
Poneris et merito, puri sermonis amator,
Lenibus atque utinam scriptis adjuncta foret vis
Comica ut æquato virtus polleret honore,
Cum Græcis neque in hac despectus parte jaceres,
Unum hoc maceror, et doleo tibi deesse Terenti.

“And thou, also, O thou half Menander, art justly placed among
the most divine poets, for the purity of thy style. O would that humour
had kept pace with ease in all thy writings; then thou wouldest not
have been compelled to yield even to the Greeks; nor could a single
defect have been objected to thee. But, as it is, thou hast this great
defect, and this, O Terence, I lament.”
THE ANDRIAN,
A Comedy,

ACTED AT

THE MEGALESIAN GAMES[47];


IN THE [48]CURULE ÆDILATE OF [49]MARCUS FULVIUS AND
MARCUS GLABRIO[50]; BY THE COMPANY[51] OF LUCIUS
AMBIVIUS TURPIO, AND LUCIUS ATTILIUS[52],
OF PRÆNESTE.

Flaccus, the Freedman of Claudius, composed the Music for


[53]equal Flutes, right and left handed.
[54]It is taken from the Greek, and was published during the
Consulate of Marcus Claudius Marcellus, and Cneus Sulpicius
Galba[55].
Year of Rome 587
Before Our Saviour 162
Author’s Age 27
THE ARGUMENT.
There were in Athens two brothers, Chremes and Phania. The
former making a voyage to Asia, left his infant daughter, named
Pasibula, under the protection of Phania; who, to avoid the
dangers of a war which shortly after convulsed the Grecian
States, quitted Athens, and embarked also for Asia with the infant
Pasibula, designing to rejoin his brother Chremes. His vessel
being wrecked off Andros, he was received and hospitably
entertained by an inhabitant of the island, where he died,
bequeathing his niece to his host, who generously educated her
with his own daughter Chrysis; changing her name from
Pasibula to Glycera. After some years he also died, and his
daughter Chrysis, finding herself reduced to poverty, and avoided
by her relations, removed to Athens, accompanied by her adopted
sister Glycera, or Pasibula. Here, supported by her industry, she
lived for some months in a virtuous seclusion; but after that period
became acquainted with several young Athenians of good family,
whose visits she admitted, hoping perhaps to accomplish an
advantageous marriage either for Glycera or for herself. She was
seduced by pleasure, and her conduct from that time became very
far from irreproachable. Meanwhile a young man, named
Pamphilus, is accidentally introduced at her house, sees Glycera,
is enamoured of her; she returns his affections, and they are
privately betrothed; a short time previous to the death of Chrysis,
which happens about three years after her removal to Athens.
Chremes, whom we left in Asia, returned to Athens, and became
the father of another daughter, who was called Philumena; he had
long before formed a friendship with Simo, the father of
Pamphilus. Pamphilus being a youth of great worth and high
reputation, Chremes wishes to bestow on him the hand of his
daughter Philumena. Here the play opens. A report of the
connexion between Pamphilus and Glycera reaching the ears of
Chremes, he breaks off the marriage. Simo conceals this, and to
try the truth of the rumour, proposes Philumena again to his son,
and desires him to wed her instantly. Apprized by his servant
Davus of his father’s artful stratagem, Pamphilus professes his
willingness to marry, thinking by this measure to disappoint it; but
he defeats himself, for from his ready consent, Chremes
concludes the rumour false, and renews the treaty to the great
embarrassment of Pamphilus, which, with the artifices Davus
employs to extricate him, form the most diverting scenes of the
play. However, when the affairs of Pamphilus and Davus are
reduced to extremity, and a breach between father and son
appears inevitable on account of the marriage with Glycera, and
the refusal to accept Philumena, a stranger called Crito, most
opportunely arrives from Andros, and discovers Glycera to be
Pasibula, the daughter of Chremes, who willingly confirms her
the wife of Pamphilus, and bestows Philumena, his other
daughter, on Charinus, a friend of Pamphilus, to the great
satisfaction of all parties.
DRAMATIS PERSONÆ.

Simo, an old man, the father of Pamphilus.


Sosia, the freedman of Simo.
Pamphilus, the son of Simo.
Davus, servant to Pamphilus.
Charinus, a young man, the friend of Pamphilus.
Byrrhia, servant to Charinus.
Chremes, an old man, the friend of Simo.
Crito, a stranger, from the island of Andros.
Dromo, a servant.
Glycera, the Andrian.
Mysis, her maid.
Lesbia, a midwife.

MUTES.

Archillis, Glycera’s nurse.


Servants belonging to Simo.

The Scene lies in Athens, in a street between the


houses of Simo and Glycera.
The Time is about nine hours.
PROLOGUE[56].

Our poet, when first he bent his mind to write, thought that he
undertook no more than to compose Comedies which should please
the people. But he finds himself not a little deceived; and is
compelled to waste his time in making Prologues; not to narrate the
plot of his play, but to answer the snarling malice of an older poet[57].
And now, I pray you, Sirs, observe what they object against our
Author: Menander wrote the [58]Andrian and Perinthian: he who
knows one of them knows both, their plots are so very similar; but
they are different in dialogue, and in style. He confesses that
whatever seemed suitable to the Andrian, he borrowed from the
Perinthian, and used as his own: and this, forsooth, these railers
carp at, and argue against him that Comedies thus mixed are good
for nothing. But, in attempting to shew their wit, they prove their folly:
since, in censuring him, they censure Nævius, Plautus[59], Ennius,
who have given our author a precedent for what he has done: and
whose careless ease he would much rather imitate than their
obscure correctness. But henceforth let them be silent, and cease to
rail; or I give them warning, they shall hear their own faults
published. And now deign to favour the play with your attention; and
give it an impartial hearing, that you may know what is in future to be
expected from the poet, and whether the Comedies that he may
write hereafter, will be worthy to be accepted, or to be rejected by
you.
THE ANDRIAN.

ACT I.

Scene I.
Simo, Sosia, and Slaves, carrying Provisions.
Simo. [60]Carry in those things, directly. (Exeunt Slaves.) Do you
come hither Sosia; I have something to say to you.
Sosia. You mean, I suppose, that I should take care that these
provisions are properly drest.
Simo. No; it’s quite another matter.
Sosia. In what else can my skill be of any service?
Simo. There is no need of your skill in the management of the
affair I am now engaged in; all that I require of you is faithfulness and
secrecy; qualities I know you to possess.
Sosia. I long to hear your commands.
Simo. You well know, Sosia, that from the time when I first bought
you as my slave;[61] even from your childhood until the present
moment; I have been a just and gentle master: you served me with a
free spirit; and I gave you freedom; [62]as the greatest reward in my
power to bestow.
Sosia. Believe me, Sir, I have not forgotten it.
Simo. Nor have you given me any cause to repent that I did
so.[63]
Sosia. I am very glad, Simo, that my past, and present conduct
has been pleasing to you; and I am grateful for your goodness in
receiving my poor services so favourably: but it pains me to be thus
reminded of the benefits you have conferred upon me, as it seems to
upbraid me with having forgotten them.[64] Pray, Sir, let me request
to know your will at once.
Simo. You shall; but first I must inform you that my son’s
marriage, which you expect to take place, is only a feigned marriage.
Sosia. But why do you make use of this deceit?
Simo. [65]You shall hear every thing from the beginning; by which
means you will learn my son’s course of life, my intentions, and the
part I wish you to take in this affair. When my son, Pamphilus,
arrived at man’s estate,[66] of course he was able to live more
according to his own inclination: for, until a man has attained that
age, his disposition does not discover itself, being kept in check
either by his tutor, or by bashfulness, or by his tender years.
Sosia. That is very true.
Simo. Most young men attach themselves chiefly to one particular
pursuit; such, for instance, as breeding horses, keeping hounds, or
frequenting the schools of the philosophers.[67] He did not devote
himself entirely to any one of these: but employed a moderate
portion of his time in each; and I was much pleased to see it.
Sosia. As well you might, for I think that every man, in the conduct
of his life, should adhere to this precept, “Avoid excess.”
Simo. This was his way of life; he bore patiently with every one,
accommodated himself to the tempers of his associates; and fell in
with them in their pursuits; avoided quarrels; and never arrogantly
preferred himself before his companions. Conduct like this will
ensure a man praise without envy, and gain many friends.
Sosia. This was indeed a wise course of life; for in these times[68],
flattery makes friends; truth, foes.
Simo. Meantime, about three years ago, a certain woman,
exceedingly beautiful, and in the flower of her age, removed into this
neighbourhood; she came from the Island of Andros[69]; being
compelled to quit it by her poverty and the neglect of her
relations[70].
Sosia. I augur no good from this woman of Andros.
Simo. At first she lived chastely, and penuriously, and laboured
hard, managing with difficulty to gain a livelihood[71] with the distaff
and the loom: but soon afterwards several lovers made their
addresses to her[72]; promising to repay her favours with rich
presents; and as we all are naturally prone to pleasure, and averse
to labour, she was induced to accept their offers; and at last admitted
all her lovers without scruple. It happened that some of them with
much persuasion prevailed on my son to accompany them to her
house. Aha! thought I, he is caught[73]: he is certainly in love with
her. In the morning I watched their pages going to her house and
returning; I called one of them; Hark ye, boy, prithee tell me who was
the favourite of Chrysis, yesterday? For this was the Andrian’s name.
Sosia. I understand you, Sir.
Simo. I was answered that it was Phædrus, or Clinia, or
Niceratus; for all these were her lovers at that time: well, said I, and
what did Pamphilus there! oh! he paid[74] his share and supped with
the rest. Another day I inquired and received the same answer; and I
was extremely rejoiced that I could learn nothing to attach any blame
to my son. Then I thought that I had proved him sufficiently; and that
he was a miracle of chastity:—for he who has to contend against the
example of men of such vicious inclinations, and can preserve his
mind from its pernicious influence, may very safely be trusted with
the regulation of his own conduct. To increase my satisfaction, every
body joined as if with one voice in the praise of Pamphilus, every
one extolled his virtues, and my happiness, in possessing a son
endued with so excellent a disposition. In short, this his high
reputation induced my friend Chremes to come to me of his own
accord, and offer to give his daughter to Pamphilus with a large
dowry[75]. I contracted [76]my son, as I was much pleased with the
match, which was to have taken place on this very day.
Sosia. And what has happened to prevent it?
Simo. You shall hear: within a few days of this time our neighbour
Chrysis died.
Sosia. O happy news! I was still fearful of some mischief from this
Andrian.
Simo. Upon this occasion my son was continually at the house
with the lovers of Chrysis, and joined with them in the care of her
funeral; meantime he was sad, and sometimes would even weep.
Still I was pleased with all this; if, thought I, he is so much concerned
at the death of so slight an acquaintance, how would he be afflicted
at the loss of one whom he himself loved, or at my death. I attributed
every thing to his humane and affectionate disposition; in short, I
myself, for his sake, attended the funeral, even yet suspecting
nothing.
Sosia. Ah! what has happened then?
Simo. I will tell you. The corpse is carried out; we follow: in the
mean time, among the women who were there[77], I saw one young
girl, with a form so——
Sosia. Lovely, without doubt.
Simo. And with a face, Sosia, so modest, and so charming, that
nothing can surpass it; and as she appeared more afflicted than the
others who were there, and so pre-eminently beautiful[78], and of so
noble a carriage, I approach the women who were following the
body[79], and inquire who she is: they answer, The sister of the
deceased. Instantly the whole truth burst upon me at once: hence
then, thought I, proceed those tears; this sister it is, who is the cause
of all his affliction.
Sosia. How I dread to hear the end of all this!
Simo. In the mean time the procession advances; we follow, and
arrive at the tomb[80]: the corpse is placed on the pile[81], and quickly
enveloped in flames; they weep; while the sister I was speaking of,
rushed forward in an agony of grief toward the fire; and her
imprudence exposed her to great danger. Then, then it was, that
Pamphilus, half dead with terror, publicly betrayed the love he had
hitherto so well concealed: he flew to the spot, and throwing his arms
around her with all the tenderness imaginable; my dearest Glycera,
cried he, what are you about to do? Why do you rush upon
destruction? Upon which she threw herself weeping upon his bosom
in so affectionate a manner, that it was easy enough to perceive their
mutual love.
Sosia. How! is this possible!
Simo. I returned home, scarcely able to contain my anger; but yet
I had not sufficient cause to chide Pamphilus openly; as he might
have replied to me, What have I done amiss, my father? or how have
I offended you? of what am I guilty? I have preserved the life of one
who was going to throw herself into the flames: I prevented her: this
would have been a plausible excuse.
Sosia. You consider this rightly, Sir; for if he who has helped to
save a life is to be blamed for it; what must be done to him who is
guilty of violence and injustice?
Simo. The next day Chremes came to me, and complained of
being shamefully used, as he had discovered for a certainty that
Pamphilus had actually married this strange woman[82]. I positively
denied that this was the case, and he as obstinately insisted on the
truth of it: at last I left him, as he was absolutely resolved to break off
the match.
Sosia. Did you not then rebuke Pamphilus?
Simo. No: there was nothing yet so flagrant as to justify my
rebuke.
Sosia. How so, Sir, pray explain?
Simo. He might have answered me thus: you yourself, my father,
have fixed the time when this liberty must cease; and the period is at
hand when I must conform myself to the pleasure of another: permit
me then, I beseech you, for the short space that remains to me, to
live as my own will prompts me.
Sosia. True. What cause of complaint can you then find against
him?
Simo. If he is induced by his love for this stranger, to refuse to
marry Philumena in obedience to my commands, that offence will lay
him open to my anger; and I am now endeavouring by means of this
feigned marriage, to find a just cause of complaint against him: and,
at the same time, if that rogue Davus has any subtle scheme on foot,
this will induce him to bring it forward now, when it can do no harm;
as I believe that rascal will leave no stone unturned in the affair;
though more for the sake of tormenting me, than with a view to serve
or gratify my son.
Sosia. Why do you suspect that?
Simo. Why? because of a wicked mind one can expect nothing
but wicked intentions[83]. But if I catch him at his tricks—However,
’tis in vain to say more: if it appear, as I trust it will, that my son
makes no objection to the marriage, I have only to gain Chremes,
whom I must prevail upon by entreaty; and I have great hopes that I
shall accomplish it. What I wish you to do is, to assist me in giving
out this marriage for truth, to terrify Davus, and to watch the conduct
of my son: what he does, and what course he and his hopeful
servant resolve upon.
Sosia. It is enough, Sir; I will take care to obey you. Now, I
suppose, we may go in.
Simo. Go, I will follow presently[84].
[Exit Sosia.

Scene II.
Simo, Davus.
Simo. My son, I have no doubt, will refuse to marry; for I observed
that Davus seemed terribly perplexed just now, when he heard that
the match was to take place: but here he comes[85].
Davus. (not seeing Simo.) I wondered that this affair seemed
likely to pass off so easily! and always mistrusted the drift of my old
master’s extraordinary patience and gentleness; who, though he was
refused the wife he wished for, for his son, never mentioned a word
of it to us, or seemed to take any thing amiss.
Simo. (aside.) But now he will, as you shall feel, rascal.
Davus. His design was to entrap us while we were indulging in an
ill-founded joy, and fancied ourselves quite secure. He wished to
take advantage of our heedlessness, and make up the match before
we could prevent him: what a crafty old fellow!
Simo. How this rascal prates[86]!
Davus. Here is my master! he has overheard me! I never saw
him!
Simo. Davus.
Davus. Who calls Davus?
Simo. Come hither, sirrah.
Davus. (aside.) What can he want with me?
Simo. What were you saying?
Davus. About what, Sir?
Simo. About what, Sir? The world says that my son has an
intrigue.
Davus. Oh! Sir, the world cares a great deal about that, no doubt.
Simo. Are you attending to this, Sir?
Davus. Yes, Sir, certainly.
Simo. It does not become me to inquire too strictly into the truth of
these reports. I shall not concern myself in what he has done
hitherto; for as long as circumstances allowed of it, I left him to
himself: but it is now high time that he should alter and lead a new
life. Therefore, Davus, I command, and even entreat, that you will
prevail on him to amend his conduct.
Davus. What is the meaning of all this discourse?
Simo. Those who have love intrigues on their hands are generally
very averse to marriage.
Davus. So I have heard.
Simo. And if any of them manage such an affair after the counsel
of a knave, ’tis a hundred to one but the rogue will take advantage of
their weakness, and lead them a step further, from being love-sick to
some still greater scrape or imprudence.
Davus. Truly, Sir, I don’t understand what you said last.
Simo. No! not understand it!
Davus. No. I am not Œdipus[87] but Davus.
Simo. Then you wish that what I have to say should be explained
openly and without reserve.
Davus. Certainly I do.
Simo. Then, sirrah, if I discover that you endeavour to prevent my
son’s marriage by any of your crafty tricks; or interfere in this
business to show your cunning; you may rely on receiving a few
scores of lashes, and a situation in the grinding-house[88] for life:
upon this token, moreover, that when I liberate you from thence, I will
grind in your stead. Is this plain enough for you, or don’t you
understand yet?
Davus. Oh, perfectly! you come to the point at once: you don’t use
much circumlocution, i’faith.
Simo. Remember! In this affair above all others, if you begin
plotting, I will never forgive it.
Davus. Softly, worthy Sir, softly, good words I beg of you.
Simo. So! you are merry upon it, are you, but I am not to be
imposed upon. I advise you, finally, to take care what you do: you
cannot say you have not had fair warning.
[Exit.

Scene III[89].
Davus.
In truth, friend Davus, from what I have just heard from the old
man about the marriage, I think thou hast no time to lose. This affair
must be handled[90] dexterously, or either my young master or I must
be quite undone. Nor have I yet resolved which side to take; whether
I shall assist Pamphilus, or obey his father. If I abandon the son, I
fear his happiness will be destroyed: if I help him, I dread the threats
of the old man, who is as crafty as a fox. First, he has discovered his
son’s intrigue, and keeps a jealous eye upon me, lest I should set
some scheme a-foot to retard the marriage. If he finds out the least
thing, I am undone[91]; for, right or wrong, if he once takes the whim
into his head, he will soon find a pretence for sending me to grind in
the mill for my life; and, to crown our disasters, this Andrian,
Pamphilus’s wife or mistress, I know not which, is with child by him:
’tis strange enough to hear their presumption. I think their
intentions[92] savour more of madness than of any thing else: boy or
girl, say they, the child shall be brought up[93]. They have made up
among them too, some story or other, to prove that she is a citizen of
Athens[94]. Thus runs the tale. Once upon a time there was a certain
old merchant[95], who was shipwrecked upon the island of Andros,
where he afterwards died, and the father of Chrysis took in his
helpless little orphan, who was this very Glycera. Fables! for my part
I don’t believe a word of it: however, they themselves are vastly
pleased with the story. But here comes her maid Mysis. Well, I’ll
betake myself to the Forum[96], and look for Pamphilus: lest his
father should surprise him with this marriage before I can tell him any
thing of the matter.
[Exit.

Scene IV.
Mysis.
I understand you, Archillis[97]: you need not stun me with the
same thing over so often: you want me to fetch the midwife Lesbia:
in truth, she’s very fond of the dram-bottle, and very headstrong; and
I should think she was hardly skilful enough to attend a woman in her
first labour.—However, I’ll bring her.——Mark how importunate[98]
this old baggage[99] is to have her fellow-gossip, that they may tipple
together. Well, may Diana grant my poor mistress[100] a happy
minute; and that Lesbia’s want of skill may be shewn any where
rather than here. But what do I see? here comes Pamphilus,
seemingly half-distracted, surely something is the matter. I will stay
and see whether this agitation is not the forerunner of some
misfortune.

Scene V.
Pamphilus, Mysis[101].
Pam. Heavens! is it possible that any human being, much less a
father, could be guilty of an action like this?
Mysis. (aside.) What can be the matter?
Pam. By the faith of gods and men, if ever any one was
unworthily treated, I am. He peremptorily resolved that I should be
married on this very day. Why was not I informed of this before? Why
was not I consulted?
Mysis. (aside.) Miserable woman that I am! what do I hear?
Pam. And why has Chremes changed his mind, who obstinately
persisted in refusing me his daughter, after he heard of my
imprudence[102]? Can he do this to tear me from my dearest
Glycera? Alas! if I lose her, I am utterly undone. Was there ever such
an unfortunate lover?—was there ever such an unhappy man as I
am? Heavens and earth! will this persecution never end? Shall I
