CHARLES BONCELET
PREFACE
I have many goals for this book, but this is foremost: I have always liked probability and
have been fascinated by its application to predicting the future. I hope to encourage this
generation of students to study, appreciate, and apply probability to the many applications
they will face in the years ahead.
To the student: This book is written for you. The prose style is less formal than many
textbooks use. This more engaging prose was chosen to encourage you to read the book.
I firmly believe a good textbook should help you learn the material. But it will not help
if you do not read it. Whenever I ask my students what they want to see in a text, the
answer is: "Examples. Lots of examples." I have tried to heed this advice and included
"lots" of examples. Many are small, quick examples to illustrate a single concept. Others
are long, detailed examples designed to demonstrate more sophisticated concepts. Finally,
most chapters end in one or more longer examples that illustrate how the concepts of that
chapter apply to engineering or scientific applications.
Almost all the concepts and equations are derived using routine algebra. Read the
derivations, and reproduce them yourselves. A great learning technique is to read through
a section, then write down the salient points. Read a derivation, and then reproduce it
yourself. Repeat the sequence, read and then reproduce, until you get it right.
I have included many figures and graphics. The old expression, "a picture is worth
a thousand words," is still true. I am a believer in Edward Tufte's graphics philosophy:
maximize the data-ink ratio.¹ All graphics are carefully drawn. They each have enough ink
to tell a story, but only enough ink.
To the instructor: This textbook has several advantages over other textbooks. It is the
right size: not too big and not too small. It should cover the essential concepts for the level
of the course, but should not cover too much. Part of the art of textbook writing is to decide
what should be in and what should be out.
The selection of topics is, of course, a determination on the part of the author and
represents the era in which the book is written. When I first started teaching my course
more than two decades ago, the selection of topics favored continuous random variables and
continuous time random processes. Over time, discrete random variables and discrete time
random processes have grown in importance. Students today are expected to understand
more statistics than in the past. Computation is much more important and more immediate.
Each year I add a bit more computation to the course than the prior year.
I like computation. So do most students. Computation gives a reality to the theoretical
concepts. It can also be fun. Throughout the book, there are computational examples and
exercises. Unfortunately, not everyone uses the same computational packages. The book uses
¹ Edward Tufte, The Visual Display of Quantitative Information, 2nd ed. Cheshire, CT: Graphics Press,
2001. A great book, highly recommended.
three of the most popular: Matlab, Python, and R. For the most part, we alternate between
Matlab and Python and postpone discussion of R until the statistics chapters.
Most chapters have a common format: introductory material, followed by deeper and
more involved topics, and then one or more examples illustrating the application of the
concepts, a summary of the main topics, and a list of homework problems. The instructor
can choose how far into each chapter to go. For instance, I usually cover entropy (Chapter
5) and Aloha (Chapter 6), but skip error-correcting coding (also Chapter 6).
I am a firm believer that before statistics or random processes can be understood,
the student must have a good knowledge of probability. A typical undergraduate class
can cover the first nine chapters in about two-thirds of a semester, giving the student a
good understanding of both discrete and continuous probability. The instructor can select
topics from the later chapters to fill out the rest of the semester. If students have had basic
probability in a prior course, the first nine chapters can be covered quickly and greater
emphasis placed on the remaining chapters.
Depending on the focus of the course, the instructor can choose to emphasize statistics
by covering the material in Chapters 10 through 12. Alternatively, the instructor can
emphasize random signals by covering Chapters 13 and 14.
The text can be used in a graduate class. Assuming the students have seen some
probability as undergraduates, the first nine chapters can be covered quickly and more
attention paid to the last five chapters. In my experience, most new graduate students need to
refresh their probability knowledge. Reviewing the first nine chapters will be time well spent.
Graduate students will also benefit from doing computational exercises and learning the
similarities and differences in the three computational packages discussed, Matlab, Python,
and R.
Chapter Coverage
Chapters 1 and 2 are a fairly standard introduction to probability. The first chapter
introduces the basic definitions and the three axioms, proves a series of simple theorems,
and concludes with detailed examples of calculating probabilities for simple networks.
The second chapter covers conditional probability, Bayes' theorem and the law of total
probability, and several applications.
Chapter 3 is a detour into combinatorics. A knowledge of combinatorics is essential
to understanding probability, especially discrete probability, but students often confuse the
two, thinking combinatorics to be a branch of probability. The two are different, and we
emphasize that. Much of the development of probability in history was driven by gambling.
I, too, use examples from gambling and game play in this chapter (and in some later chapters
as well). Students play games and occasionally gamble. Examples from these subjects help
bring probability to the student's life experience, and we show that gambling is unlikely to
be profitable!
Chapters 4 and 5 introduce discrete probability mass functions, distribution functions,
expected values, change of variables, and the uniform, geometric, and Poisson distributions.
Chapter 4 culminates with a discussion of the financial considerations of gambling versus
buying insurance. Chapter 5 ends with a long section on entropy and data compression.
(It still amazes me that most textbooks targeting an electrical and computer engineering
Acknowledgments
I would like to thank the reviewers who helped shape this text during its development.
Their many comments are much appreciated.
I would like to thank the following people from Oxford University Press who helped make
this book a reality: Nancy Blaine, John Appeldorn, Megan Carlson, Christine Mahon, Daniel
Kaveney, and Claudia Dukeshire.
Last, and definitely not least, I would like to thank my children, Matthew and Amy, and
my wife, Carol, for their patience over the years while I worked on this book.
Charles Boncelet
CHAPTER 1

PROBABILITY BASICS
Probability refers to how likely something is. By convention, probabilities are real numbers
between 0 and 1. A probability of 0 refers to something that never occurs; a probability of
1 refers to something that always occurs. Probabilities between 0 and 1 refer to things that
sometimes occur.
For instance, an ordinary coin when flipped will land heads up about half the time and
land tails up about half the time. We say the probability of heads is 0.5; the probability of
tails is also 0.5.
As another example, a typical telephone line has a probability of sending a data bit
correctly of around 0.9999, or 1 - 10⁻⁴. The probability the bit is incorrect is 10⁻⁴. A
fiber-optic line may have a bit error rate as low as 10⁻¹⁵.
Imagine Alice sends a message to Bob. For Bob to receive any information (any new
knowledge), the message must be unknown to Bob. If Bob knew the message before
receiving it, then he gains no new knowledge from hearing it. Only if the message is random
to Bob will Bob receive any information.
There are a great many applications where people try to predict the future. Stock
markets, weather, sporting events, and elections all are random. Successful prediction
of any of these would be immensely profitable, but each seems to have substantial
randomness.
Engineers worry about reliability of devices and systems. Engineers control complex
systems, often without perfect knowledge of the inputs. People are building self-driving
automobiles and aircraft. These devices must operate successfully even though all sorts of
unpredictable events may occur.
Probabilities may be functions of other variables, such as time and space. The
probability of someone getting cancer is a function of lots of things, including age, gender,
genetics, dietary habits, whether the person smokes, and where the person lives. Noise in
an electric circuit is a function of time and temperature. The number of questions answered
correctly on an exam is a function of what questions are asked-and how prepared the test
taker is!
In some problems, time is the relevant quantity. How many flips of a coin are required
before the first head occurs? How many before the 100th head?
The point of this is that many experiments feature randomness, where the result of
the experiment is not known in advance. Furthermore, repetitions of the same experiment
may produce different results. Flipping a coin once and getting heads does not mean that a
second flip will be heads (or tails). Probability is about understanding and quantifying this
randomness.
EXAMPLE 1.1 Let us test this question: How many flips are required to get a head? Find a coin, and flip it
until a head occurs. Record how many flips were required. Repeat the experiment again, and
record the result. Do this at least 10 times. Each of these is referred to as a run, a sequence
of tails ending with a head.
What is the longest run you observed? What is the shortest? What is the average run
length? Theory tells us that the average run length will be about 2.0, though of course your
average may be different.
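If a physical coin is not handy, the experiment is easy to simulate. The following Python sketch (Python is one of the three computational packages used in this book; the function and variable names are ours) generates 100,000 runs and reports the longest, shortest, and average run lengths. The average should come out near the theoretical value of 2.0.

```python
import random

def run_length(p_heads=0.5):
    """Simulate flipping a coin until a head occurs; return the number of flips."""
    flips = 1
    while random.random() >= p_heads:  # this flip was tails, so flip again
        flips += 1
    return flips

random.seed(1)
runs = [run_length() for _ in range(100_000)]
average = sum(runs) / len(runs)
print(f"longest: {max(runs)}  shortest: {min(runs)}  average: {average:.3f}")
```

With this many runs, the sample average is typically within a few hundredths of 2.0, though any single batch of 10 hand flips can differ noticeably.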
1.2 Experiments, Outcomes, and Events
An experiment is whatever is done. It may be flipping a coin, rolling some dice, measuring
a voltage or someone's height and weight, or numerous others.
The experiment results in outcomes. The outcomes are the atomic results of the
experiment. They cannot be divided further. For instance, for a coin flip, the outcomes
are heads and tails; for a counting experiment (e.g., the number of electrons crossing a PN
junction), the outcomes are the nonnegative integers, 0, 1, 2, 3, .... Outcomes are denoted
with italic lowercase letters, perhaps with subscripts, such as x, n, a₁, a₂, etc.
The number of outcomes can be finite or infinite, as in the two examples mentioned
in the paragraph above. Furthermore, the experiment can result in discrete outcomes, such
as the integers, or continuous outcomes, such as a person's weight. For now, we postpone
continuous experiments to Chapter 7 and consider only discrete experiments.
Sets of outcomes are known as events. Events are denoted with italic uppercase Roman
letters, perhaps with subscripts, such as A, B, and Aᵢ. The outcomes in an event are listed with
braces. For instance, A = {1, 2, 3, 4} or B = {2, 4, 6}. A is the event containing the outcomes 1,
2, 3, and 4, while B is the event containing outcomes 2, 4, and 6.
The set of all possible outcomes is the sample space and is denoted by S. For example,
the outcomes of a roll of an ordinary six-sided die are 1, 2, 3, 4, 5, and 6. The sample space
is S = {1, 2, 3, 4, 5, 6}. The set containing no outcomes is the empty set and is denoted by ∅.
The complement of an event A, denoted Ā, is the event containing every outcome not
in A. The sample space is the complement of the empty set, and vice versa.
The usual rules of set arithmetic apply to events. The union of two events, A ∪ B, is the
event containing outcomes in either A or B. The intersection of two events, A ∩ B or more
simply AB, is the event containing all outcomes in both A and B.
For any event A, A ∩ Ā = AĀ = ∅ and A ∪ Ā = S.
EXAMPLE 1.2 Consider a roll of an ordinary six-sided die, and let A = {1, 2, 3, 4} and B = {2, 4, 6}. Then,
A ∪ B = {1, 2, 3, 4, 6} and A ∩ B = {2, 4}. Ā = {5, 6} and B̄ = {1, 3, 5}.
EXAMPLE 1.3 Consider the following experiment: A coin is flipped three times. The outcomes are
the eight flip sequences: hhh, hht, ..., ttt. If A = {first flip is head} = {hhh, hht, hth, htt},
then Ā = {ttt, tth, tht, thh}. If B = {exactly two heads} = {hht, hth, thh}, then A ∪ B =
{hhh, hht, hth, htt, thh} and AB = {hht, hth}.
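Events map naturally onto Python's built-in set type, so the computations in Examples 1.2 and 1.3 can be checked by machine. A minimal sketch (the variable names are ours, not the book's):

```python
from itertools import product

# Example 1.2: one roll of an ordinary six-sided die.
S = {1, 2, 3, 4, 5, 6}      # sample space
A = {1, 2, 3, 4}
B = {2, 4, 6}
print(A | B)    # union A ∪ B: {1, 2, 3, 4, 6}
print(A & B)    # intersection A ∩ B: {2, 4}
print(S - A)    # complement of A: {5, 6}
print(S - B)    # complement of B: {1, 3, 5}

# Example 1.3: three coin flips, each outcome a string such as 'hht'.
S3 = {''.join(flips) for flips in product('ht', repeat=3)}
A3 = {s for s in S3 if s[0] == 'h'}           # first flip is a head
B3 = {s for s in S3 if s.count('h') == 2}     # exactly two heads
print(S3 - A3)  # complement of A: {'ttt', 'tth', 'tht', 'thh'}
print(A3 & B3)  # AB: {'hht', 'hth'}
```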
Comment 1.2: Be careful in defining events. In the coin flipping experiment above, an
event might be specified as C = {two heads}. Is this "exactly two heads" or "at least two
heads"? The former is {hht, hth, thh}, while the latter is {hht, hth, thh, hhh}.
DeMorgan's laws are handy when the complements of events are easier to define and specify
than the events themselves.
A is a subset of B, denoted A ⊂ B, if each outcome in A is also in B. For instance, if
A = {1, 2} and B = {1, 2, 4, 6}, then A ⊂ B. Note that any set is a subset of itself, A ⊂ A. If A ⊂ B
and B ⊂ A, then A = B.
Two events are disjoint (also known as mutually exclusive) if they have no outcomes in
common, that is, if AB = ∅. A collection of events, Aᵢ for i = 1, 2, ..., is pairwise disjoint if
each pair of events is disjoint, i.e., AᵢAⱼ = ∅ for all i ≠ j.
A collection of events, Aᵢ for i = 1, 2, ..., forms a partition of S if the events are pairwise
disjoint and the union of all events is the sample space:

⋃ᵢ Aᵢ = S
In the next chapter, we introduce the law of total probability, which uses a partition to divide
a problem into pieces, with each Aᵢ representing a piece. Each piece is solved and the pieces
combined to get the total solution.
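A partition check is easy to automate. The sketch below is a hypothetical helper of our own, not from the text: it verifies that a collection of events is pairwise disjoint and that its union is the sample space.

```python
def is_partition(events, S):
    """Return True if the events are pairwise disjoint and their union is S."""
    union = set()
    for E in events:
        if union & E:        # E overlaps an earlier event: not pairwise disjoint
            return False
        union |= E
    return union == S

S = {1, 2, 3, 4, 5, 6}
print(is_partition([{1, 2}, {3, 4}, {5, 6}], S))      # True
print(is_partition([{1, 2, 3}, {3, 4}, {5, 6}], S))   # False: 3 appears twice
print(is_partition([{1, 2}, {3, 4}], S))              # False: 5 and 6 are missing
```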
A useful tool for visualizing relationships between sets is the Venn diagram. Typically, Venn
diagrams use a box for the sample space and circles (or circle-like figures) for the various
events.
In Figure 1.1, we show a simple Venn diagram. The outer box, labeled S, denotes the
sample space. All outcomes are in S. The two circles, A and B, represent two events. The
shaded area is the union of these two events. One can see that A = AB ∪ AB̄, that B = AB ∪ ĀB,
and that A ∪ B = AB̄ ∪ AB ∪ ĀB.

[FIGURE 1.1: Venn diagram of two events, A and B, inside the sample space S, with regions AB̄, AB, and ĀB.]

FIGURE 1.2 A Venn diagram "proof" of the second of DeMorgan's laws (Equation 1.2). The
"dark" parts show (AB)̄ = Ā ∪ B̄, while the "light" parts show AB = (Ā ∪ B̄)̄.
Figure 1.2 presents a simple Venn diagram proof of Equation (1.2). The dark shaded
area in the leftmost box represents (AB)̄, and the shaded areas in the two rightmost boxes
represent Ā and B̄, respectively. The left box is the logical OR of the two rightmost boxes.
On the other hand, the light area on the left is AB. It is the logical AND of A and B. Figure 1.3
shows a portion of the Venn diagram of A ∪ B ∪ C. The shaded area, representing the union,
can be divided into seven parts. One part is ABC, another part is ABC̄, etc. Problem 1.13
asks the reader to complete the picture.
[FIGURE 1.3: Partial Venn diagram of A ∪ B ∪ C inside the sample space S; the union divides into seven regions, such as ABC̄, AB̄C, and ABC.]
of bits can be transmitted across a wireless communications network and the number of bits
received in error counted. A randomly chosen person's height, weight, age, temperature, and
blood pressure can be measured. All these quantities are represented by numbers.
Random variables are mappings from outcomes to numbers. We denote random
variables with bold-italic uppercase Roman letters (or sometimes Greek letters), such as
X and Y, and sometimes with subscripts, such as X₁, X₂, etc. The outcomes are denoted
with italic lowercase letters, such as x, y, and n. For instance,

X(heads) = 1
X(tails) = 0
Events, sets of outcomes, become relations on the random variables. For instance,

{hhh} = {Y = 3}
{hht, hth, thh} = {Y = 2}
{hhh, hht, hth, thh} = {2 ≤ Y ≤ 3} = {Y = 2} ∪ {Y = 3}
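These relations can be explored computationally. In the sketch below (our own illustration, not from the text), the random variable Y counts the heads in three coin flips, and each event such as {Y = 2} is recovered as the set of outcomes that map to that value:

```python
from itertools import product

# Three coin flips; the random variable Y maps each outcome to its number of heads.
outcomes = [''.join(f) for f in product('ht', repeat=3)]

def Y(outcome):
    return outcome.count('h')

print({s for s in outcomes if Y(s) == 3})        # {'hhh'}
print({s for s in outcomes if Y(s) == 2})        # {'hht', 'hth', 'thh'}
print({s for s in outcomes if 2 <= Y(s) <= 3})   # the union of the two events above
```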
In some experiments, the variables are discrete (e.g., counting experiments), and in
others, the variables are continuous (e.g., height and weight). In still others, both types of
random variables can be present. A person's height and weight are continuous quantities,
but a person's gender is discrete, say, 0 = male and 1 = female.
A crucial distinction is that between the random variable, say, N, and the outcomes,
say, k = 0, 1, 2, 3. Before the experiment is done, the value of N is unknown. It could be any
of the outcomes. After the experiment is done, N is one of the values. The probabilities of N
refer to before the experiment; that is, Pr[N = k] is the probability the experiment results in
the outcome k (i.e., that outcome k is the selected outcome).
Discrete random variables are considered in detail in Chapters 4, 5, and 6 and
continuous random variables in Chapters 7, 8, and 9.
1.5 Basic Probability Rules

In this section, we take an intuitive approach to the basic rules of probability. In the next
section, we give a more formal approach to the basic rules.
When the experiment is performed, one outcome is selected. Any event or events
containing that outcome are true; all other events are false. This can be a confusing point:
even though only one outcome is selected, many events can be true because many events
can contain the selected outcome.
For example, consider the experiment of rolling an ordinary six-sided die. The
outcomes are the numbers 1, 2, 3, 4, 5, and 6. Let A = {1, 2, 3, 4}, B = {2, 4, 6}, and C = {2}.
Then, if the roll results in a 4, events A and B are true while C is false.
Comment 1.3: The operations of set arithmetic are analogous to those of Boolean
algebra. Set union is analogous to Boolean OR, set intersection to Boolean AND, and set
complement to Boolean complement.
For example, if C = A ∪ B, then C contains the selected outcome if either A or B (or
both) contain the selected outcome. Alternatively, we say C is true if A is true or B is
true.
Probability is a function of events that yields a number. If A is some event, then the
probability of A, denoted Pr[A], is a number.
A probability of 0 means the event does not occur. The empty set ∅, for instance, has
probability 0, or Pr[∅] = 0, since it has no outcomes. By definition, whatever outcome is
selected is not in the empty set. Conversely, the sample space contains all outcomes. It is
always true. Probabilities are normalized so that the probability of the sample space is 1:

Pr[S] = 1
The probability of any event A is between 0 and 1; that is, 0 ≤ Pr[A] ≤ 1. Since A ∪ Ā = S,
it is reasonable to expect that Pr[A] + Pr[Ā] = 1. This is indeed true and can be handy.
Sometimes one of these probabilities, Pr[A] or Pr[Ā], is much easier to compute than the
other one. Reiterating, for any event A,

0 ≤ Pr[A] ≤ 1
Pr[A] + Pr[Ā] = 1
The second avoids the overlap by breaking the union into nonoverlapping pieces:

Pr[A ∪ B] = Pr[AB] + Pr[AB̄] + Pr[ĀB]
Consider flipping two coins, and let A be the event the first coin is heads and B the event
the second coin is heads. If the two coin flips are done in such a way that the result of the
first flip does not affect the second flip (as coin flips are usually done), then we say the two
flips are independent. When A and B are independent, the probabilities multiply:

Pr[AB] = Pr[A] Pr[B]
Another way of thinking about independence is that knowing A has occurred (or not
occurred) does not give us any information about whether B has occurred, and conversely,
knowing B does not give us information about A. See Chapter 2 for further discussion of
this view of independence.
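Independence can also be checked empirically. The sketch below (our construction, not the book's) flips two fair coins many times and compares the relative frequency of AB with the product of the relative frequencies of A and B; for independent flips, the two should nearly agree.

```python
import random

random.seed(2)
n = 100_000
count_A = count_B = count_AB = 0
for _ in range(n):
    first_heads = random.random() < 0.5
    second_heads = random.random() < 0.5   # does not depend on the first flip
    count_A += first_heads
    count_B += second_heads
    count_AB += first_heads and second_heads

pA, pB, pAB = count_A / n, count_B / n, count_AB / n
print(f"Pr[A] ~ {pA:.3f}  Pr[B] ~ {pB:.3f}  "
      f"Pr[AB] ~ {pAB:.3f}  Pr[A]Pr[B] ~ {pA * pB:.3f}")
```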
EXAMPLE 1.4 In Example 1.2, we defined two events, A and B, but said nothing about the probabilities.
Assume each side of the die is equally likely. Since there are six sides and each side is equally
likely, the probability of any one side must be 1/6:

Pr[A] = Pr[1] + Pr[2] + Pr[3] + Pr[4]   (break the event into its outcomes)
      = 1/6 + 1/6 + 1/6 + 1/6 = 4/6     (each side equally likely)

Pr[B] = Pr[2, 4, 6] = 3/6 = 1/2
Continuing, A ∪ B = {1, 2, 3, 4, 6} and AB = {2, 4}. Thus,

Pr[A ∪ B] = Pr[{1, 2, 3, 4, 6}] = 5/6   (first, solve directly)
          = Pr[A] + Pr[B] - Pr[AB]      (second, solve with union formula)
          = 4/6 + 3/6 - 2/6 = 5/6

Alternatively,

Pr[A ∪ B] = Pr[AB] + Pr[AB̄] + Pr[ĀB] = 2/6 + 2/6 + 1/6 = 5/6
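Exact arithmetic with Python's fractions module can confirm all three calculations in Example 1.4. A small sketch of our own, under the equally likely assumption:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {1, 2, 3, 4}
B = {2, 4, 6}

def pr(event):
    """Probability of an event when all six sides are equally likely."""
    return Fraction(len(event), len(S))

direct = pr(A | B)                          # first, solve directly
union_formula = pr(A) + pr(B) - pr(A & B)   # second, the union formula
pieces = pr(A & B) + pr(A - B) + pr(B - A)  # third, nonoverlapping pieces AB, AB̄, ĀB
print(direct, union_formula, pieces)        # all three give 5/6
```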
EXAMPLE 1.5 In Example 1.4, we assumed all sides of the die are equally likely. The probabilities do not
have to be equally likely. For instance, consider the following probabilities:

Pr[1] = 1/2,  Pr[2] = Pr[3] = Pr[4] = Pr[5] = Pr[6] = 1/10

Then,

Pr[A] = Pr[1] + Pr[2] + Pr[3] + Pr[4]     (break the event into its outcomes)
      = 1/2 + 1/10 + 1/10 + 1/10 = 8/10   (unequal probabilities)

Pr[B] = Pr[2, 4, 6] = 3/10
Continuing, A ∪ B = {1, 2, 3, 4, 6} and AB = {2, 4}. Thus,

Pr[A ∪ B] = Pr[{1, 2, 3, 4, 6}] = 9/10   (first, solve directly)
          = Pr[A] + Pr[B] - Pr[AB]       (second, solve with union formula)
          = 8/10 + 3/10 - 2/10 = 9/10

Alternatively,

Pr[A ∪ B] = Pr[AB] + Pr[AB̄] + Pr[ĀB] = 2/10 + 6/10 + 1/10 = 9/10
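The same machine check works for the unequal probabilities of Example 1.5; the only change is that the probability of an event is now the sum of its outcome probabilities rather than a count. A sketch (our own helper, not from the text):

```python
from fractions import Fraction

# Side probabilities from the example: Pr[1] = 1/2, the other five sides 1/10 each.
p = {1: Fraction(1, 2), 2: Fraction(1, 10), 3: Fraction(1, 10),
     4: Fraction(1, 10), 5: Fraction(1, 10), 6: Fraction(1, 10)}
assert sum(p.values()) == 1                  # a valid probability assignment

def pr(event):
    """Probability of an event: the sum of its outcome probabilities."""
    return sum(p[outcome] for outcome in event)

A, B = {1, 2, 3, 4}, {2, 4, 6}
print(pr(A | B))                             # directly: 9/10
print(pr(A) + pr(B) - pr(A & B))             # union formula: 9/10
print(pr(A & B) + pr(A - B) + pr(B - A))     # nonoverlapping pieces: 9/10
```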
1.6 Probability Formalized

A formal development begins with three axioms. Axioms are truths that are unproven but
accepted. We present the three axioms of probability, then use these axioms to prove several
basic theorems.
The first two axioms are simple, while the third is more complicated:

1. For any event A, Pr[A] ≥ 0.
2. Pr[S] = 1.
3. If the events A₁, A₂, ... are pairwise disjoint, then Pr[A₁ ∪ A₂ ∪ ···] = Pr[A₁] + Pr[A₂] + ···.
From these three axioms, the basic theorems about probability are proved. The first
axiom states that all probabilities are nonnegative. The second axiom states that the
probability of the sample space is 1. Since the sample space contains all possible outcomes
(by definition), the result of the experiment (the outcome that is selected) is contained in
S. Thus, S is always true, and its probability is 1. The third axiom says that the probabilities
of nonoverlapping events add; that is, if two or more events have no outcomes in common,
then the probability of the union is the sum of the individual probabilities.
Probability is like mass. The first axiom says mass is nonnegative, the second says the
mass of the universe is 1, and the third says the masses of nonoverlapping bodies add. In
advanced texts, the word "measure" is often used in discussing probabilities.
This third axiom is handy in computing probabilities. Consider an event A containing
outcomes a₁, a₂, ..., aₙ. Then,

Pr[A] = Pr[a₁] + Pr[a₂] + ··· + Pr[aₙ]
when the Aᵢ are pairwise disjoint. A common special case holds for two disjoint events:
if AB = ∅, then Pr[A ∪ B] = Pr[A] + Pr[B].
Both of these are special cases of the third axiom (just let the excess Aᵢ = ∅). But, for
technical reasons that are beyond this text, the finite version does not imply the infinite
version.