
Random Variables (1)

EMSE 6115: Uncertainty Analysis for Engineers

Dr. A. Etemadi

EMSE Department
George Washington University

Lecture 2



Outline

Random variables
Discrete random variables
Probability mass function (PMF)
Cumulative distribution function (CDF)
Bernoulli random variable
Binomial random variable
Negative binomial random variable
Geometric random variable
Hypergeometric random variable
Poisson random variable
Change of variable


Random Variables

An association of a number with each outcome in Ω


Definition
A random variable is a function from the sample space Ω to the real numbers R

Discrete or continuous
Can have several random variables defined on the same sample space
Notation:
random variable X
numerical value x


Discrete random variable example

Let the random variable X be given by the maximum of two 4-sided die rolls
X is a mapping from Ω to the real number line R

Lowercase x is used to denote a value of X
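
As a small illustration (not part of the original slide), the mapping can be written out in Python; the helper name X below is just the random variable applied as a function, assuming fair, independent dice:

from itertools import product

# Sample space for two rolls of a 4-sided die: all ordered pairs of faces
omega = list(product(range(1, 5), repeat=2))

def X(outcome):
    # The random variable maps each outcome in the sample space to a real number
    return max(outcome)

for outcome in omega:
    print(outcome, "->", X(outcome))   # e.g. (2, 3) -> 3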


Example

A coin is tossed 2 times, Ω = {TT, HT, TH, HH}

Let X be the number of heads:

        0  if TT
X =     1  if HT or TH
        2  if HH

Question
HT was observed, X =


Probability mass function (PMF)

Consider the probability that X equals x, i.e. P (X = x).


this probability, as a function of x, is given by the probability mass function (PMF)

Definition
The probability mass function pX of a discrete random variable X is the function pX : R → [0, 1] defined by

pX (a) = P (X = a)   for −∞ < a < ∞


PMF (cont.)

a.k.a. “probability law”, “probability distribution”


Notation: pX (x) = P (X = x)
Example: X = number of coin tosses until first head
assume independent tosses, P (H) = p > 0

pX (k) = P (X = k)
       = P (TT · · · TH)
       = (1 − p)^(k−1) p,   k = 1, 2, . . .

this is the geometric PMF, and we may write X ∼ Geo(p) to indicate the random variable X has the
geometric PMF with parameter p
note that k is often used as the dummy variable to denote integer values
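
A minimal Python sketch of this PMF (standard library only; the value of p and the function name are illustrative):

def geometric_pmf(k, p):
    # P(first head occurs on toss k) = (1 - p)**(k - 1) * p, for k = 1, 2, ...
    return (1 - p) ** (k - 1) * p if k >= 1 else 0.0

p = 0.3  # assumed P(H), for illustration only
# the probabilities over k = 1, 2, ... should sum to (approximately) 1
print(sum(geometric_pmf(k, p) for k in range(1, 200)))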



PMF (cont.)

PMFs may be represented graphically, e.g.,

[Figure: an example PMF pX (x) plotted over x = 1, 2, 3, 4, with 1/4 marked on the vertical axis]

Question
For any PMF: Σ_x pX (x) = ?


Cumulative distribution function (CDF)

Often we are interested in the probability of the event {X ≤ x}


The CDF represents this probability, and is denoted here FX (x).

Definition
The distribution function FX of a random variable X is the function FX : R → [0, 1], defined by

FX (a) = P (X ≤ a)   for −∞ < a < ∞

Note: The text generally uses “F (x)” or “F (a)”, where we use “FX (x)”, and “distribution
function” instead of “CDF”



PMF and CDF example

Recall, the PMF and CDF are defined, respectively, as

pX (x) = P (X = x)
FX (x) = P (X ≤ x)

For example, a discrete r.v. may have the PMF and CDF shown in [text Fig. 4.1]


Toss fair coin twice example

X: the number of heads


As before

        0  if TT
X =     1  if HT or TH
        2  if HH

So

          1/4  if x = 0 or x = 2
pX (x) =  1/2  if x = 1
          0    otherwise

Question
Σ_x pX (x) = ?


Toss fair coin twice: PMF and CDF

FX (x) = Σ_{xi ≤ x} pX (xi)

[Figure: the PMF pX (x) and CDF FX (x) of the number of heads in two fair coin tosses, plotted over x = 0, 1, 2]

We see that, for a discrete r.v. X, the CDF FX (x)
– increases at each x = xi that has non-zero probability
– increases at each x = xi according to the value pX (xi)
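
A short Python sketch of this relation, using the coin-toss PMF from the previous slide (the dictionary and function names are illustrative):

# PMF of X = number of heads in two fair coin tosses
pmf = {0: 0.25, 1: 0.50, 2: 0.25}

def cdf(x):
    # F_X(x) = sum of p_X(x_i) over all x_i <= x
    return sum(p for xi, p in pmf.items() if xi <= x)

for x in (-0.5, 0, 0.5, 1, 1.5, 2, 3):
    print(x, cdf(x))   # the CDF jumps by p_X(x_i) at each x_i with non-zero probability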


Using PMFs

Once we know pX (x) we can calculate probabilities of various events, e.g.

{X ≤ x} or {x1 ≤ X ≤ x2 }

Consider a fair coin tossed twice. The probability that at least one head is observed is

P (X ≥ 1) = P ({X = 1} ∪ {X = 2})
          = P (X = 1) + P (X = 2)
          = pX (1) + pX (2)
          = 3/4


Commonly encountered PMFs

Some PMFs describe events associated with coin tosses:


the probability that the first toss is a head: Bernoulli distribution
the probability of r heads in n tosses: Binomial distribution
the probability that the first head occurs at the nth toss: Geometric distribution
Some PMFs have other applications:
the probability that x white balls are drawn from an urn that contains black and white balls:
Hypergeometric distribution
the probability that k events happen in a time interval: Poisson distribution


Bernoulli PMF

Consider an experiment with two possible outcomes: one labeled as a success, and the other as a
failure. Let p be the probability of a success. If a random variable X is defined
X =   1  if the experiment is a success
      0  otherwise

then the Bernoulli PMF is given by

           p      if x = 1
pX (x) =   1 − p  if x = 0
           0      otherwise
We may write X ∼ Ber(p)


Combinations


Let C(n, k), also written nCk or Cn,k (read “n choose k”), be the number of unique k-element subsets possible,
using n “candidate” elements to choose from.

For example, C(10, 6) = 210 is the number of ways one can choose 6 correct answers from a list of 10
It turns out that

C(n, k) = n! / (k! (n − k)!)
where n! (read “n factorial”) is calculated as

n! = (n)(n − 1)(n − 2) · · · (2)(1)

and by convention
0! = 1
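
In Python (3.8+), math.comb and math.factorial compute these quantities directly; a quick check of the C(10, 6) = 210 example above:

import math

print(math.comb(10, 6))                                                     # 210
print(math.factorial(10) // (math.factorial(6) * math.factorial(10 - 6)))  # 210, via n!/(k!(n-k)!)
print(math.factorial(0))                                                    # 1, by convention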


Binomial PMF

X: number of heads in n independent coin tosses


P (H) = p. Let n = 4

pX (2) = P (HHTT) + P (HTHT) + P (HTTH) + P (THHT) + P (THTH) + P (TTHH)
       = 6 p^2 (1 − p)^2
       = C(4, 2) p^2 (1 − p)^2

In general

pX (k) = C(n, k) p^k (1 − p)^(n−k),   k = 0, 1, . . . , n
We may write X ∼ Bin(n, p)
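
A sketch of the binomial PMF in Python, checked against the n = 4, k = 2 calculation above (p = 0.5 is just an illustrative value):

import math

def binomial_pmf(k, n, p):
    # P(k heads in n independent tosses with P(H) = p), k = 0, 1, ..., n
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

p = 0.5  # illustrative value
print(binomial_pmf(2, 4, p))                          # C(4, 2) p^2 (1 - p)^2
print(6 * p**2 * (1 - p)**2)                          # same value, from the expansion above
print(sum(binomial_pmf(k, 4, p) for k in range(5)))   # the PMF sums to 1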


Negative Binomial PMF

X: The number of tails that appear before the rth head, in repeated flips of a coin.
P (H) = p.
In general

pX (k) = C(r − 1 + k, k) p^r (1 − p)^k,   k = 0, 1, 2, . . .
We may write X ∼ N B(r, p).
The special case r = 1 corresponds to Geometric(p).
X: Number of failures before first success.
X ∼ Geometric(p) ⇒ pX (k) = (1 − p)^k p, k = 0, 1, 2, . . .
X: Number of trials until first success. X ∼ Geo(p) ⇒ pX (k) = (1 − p)^(k−1) p, k = 1, 2, 3, . . .
Examples:
The number of light bulbs tested until the 3rd one that does not work.
In baseball, number of at-bats without a hit until the 4th hit.
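
A sketch of the negative binomial PMF and its r = 1 special case (the failures-before-first-success form of the geometric); the function names and p = 0.4 are illustrative:

import math

def neg_binomial_pmf(k, r, p):
    # P(k tails before the r-th head), k = 0, 1, 2, ...
    return math.comb(r - 1 + k, k) * p**r * (1 - p)**k

def geometric_failures_pmf(k, p):
    # P(k failures before the first success), k = 0, 1, 2, ...
    return (1 - p)**k * p

p = 0.4
for k in range(5):
    print(k, neg_binomial_pmf(k, 1, p), geometric_failures_pmf(k, p))  # the two columns agree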


Example

Example: Suppose the probability that a man hits a target is 0.75 (a worked sketch follows the questions).


1 What is the probability that he hits the target 3 times when he fires 10 times?
2 What is the probability that he hits the target at least 3 times when he fires 10 times?
3 What is the probability that he fires 5 times before he hits the target for the first time?
4 What is the probability of firing 10 times before hitting the target for the 6th time?
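
One possible way to compute these, using the binomial, geometric, and negative binomial PMFs above with p = 0.75; reading questions 3 and 4 as counts of misses is an interpretation, not something stated on the slide:

import math

p = 0.75

def binom(k, n):
    # P(exactly k hits in n shots)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

q1 = binom(3, 10)                                     # 1) exactly 3 hits in 10 shots
q2 = 1 - sum(binom(k, 10) for k in range(3))          # 2) at least 3 hits in 10 shots
q3 = (1 - p)**5 * p                                   # 3) reading: 5 misses, then the first hit
q4 = math.comb(6 - 1 + 10, 10) * p**6 * (1 - p)**10   # 4) reading: 10 misses before the 6th hit
print(q1, q2, q3, q4)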


Hypergeometric PMF

Suppose that an urn contains M white balls and N − M black balls. Suppose we draw n ≤ N
balls from the urn.
What is the probability that x white balls are drawn? max(0, n + M − N ) ≤ x ≤ min(n, M )

P (X = x) = C(M, x) C(N − M, n − x) / C(N, n)

X ∼ Hypergeometric(N, M, n)
Applicable to sampling without replacement from a population of size N, where each element
either has or lacks a characteristic
Question
In case of sampling WITH replacement, what is the distribution of the number of white balls observed
in n draws?


Hypergeometric PMF Example

A crate contains 50 light bulbs of which 5 are defective. A Quality Control Inspector randomly
samples 4 bulbs without replacement. Let X be the number of defective bulbs selected. Find the
probability P (X = k), k = 0, 1, 2, 3, 4.
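
A sketch of this calculation in Python, treating the 5 defective bulbs as the “white” balls (N = 50, M = 5, n = 4):

import math

def hypergeom_pmf(x, N, M, n):
    # P(x marked items in n draws without replacement from N items, M of which are marked)
    return math.comb(M, x) * math.comb(N - M, n - x) / math.comb(N, n)

N, M, n = 50, 5, 4
for k in range(n + 1):
    print(k, hypergeom_pmf(k, N, M, n))
print(sum(hypergeom_pmf(k, N, M, n) for k in range(n + 1)))  # the probabilities sum to 1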


Poisson PMF
k: number of times an event occurs in an interval of time or space
Examples: 1) Number of patients arriving in an emergency room between 10 and 11 am, 2)
number of emails that arrive at a server in a time interval
When is Poisson an appropriate model?
k can take values 0,1,2,. . .
Events occur independently.
The rate at which events occur is constant.
No more than one event can occur in a very small subinterval.

Definition
A discrete random variable X has a Poisson distribution (X ∼ Pois(λ)) if

pX (k) = P (X = k) = (λ^k / k!) e^(−λ),   k = 0, 1, 2, . . .

where λ > 0 is the average number of events per interval.

Poisson PMF (Cont’d)

Example: Assume that the average number of goals in a World Cup soccer match is approximately
2.5 and the Poisson model is appropriate. Find the probability of k goals in a match, for
k = 0, 1, 2, . . . , 6. Note that λ = 2.5.

k = 1 : P (X = 1) = (2.5^1 / 1!) e^(−2.5) = 0.205
k 0 1 2 3 4 5 6
P (X = k) 0.082 0.205 0.257 0.213 0.133 0.067 0.028
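
A quick Python check of this table (λ = 2.5, values rounded to three decimals):

import math

lam = 2.5

def poisson_pmf(k, lam):
    # P(X = k) = lam**k * exp(-lam) / k!
    return lam**k * math.exp(-lam) / math.factorial(k)

for k in range(7):
    print(k, round(poisson_pmf(k, lam), 3))  # compare with the table above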
If X ∼ Bin(n, p), and we set p = λ/n, λ > 0, as n → ∞, the PMF of binomial approaches that of
Poisson.
Example: If an aircraft flies a large number of times n, and the probability of getting hijacked on each flight is
p = λ/n, then the probability that it is hijacked k times is approximately λ^k e^(−λ) / k!.


Poisson PMF (Cont’d)

Example: Consider a telephone help line. If calls are received at a rate of r per unit of time, then during a
time period of length t the number of arrivals can be modeled by a Pois(rt) random variable. Suppose calls
arrive at the rate of two per minute (see the sketch after the questions).
1 What is the probability that five calls are made in the next 2 minutes?
2 What is the probability that five calls are made in the next 2 minutes and then five more in the
following 2 minutes?
3 What is the probability that no call is answered during a 10-minute period?
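
One way to compute these with the Pois(rt) model (r = 2 calls per minute); question 2 treats the two 2-minute windows as disjoint, hence independent, and question 3 is read as “no call arrives”:

import math

def poisson_pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

r = 2                          # calls per minute
q1 = poisson_pmf(5, r * 2)     # 1) five calls in the next 2 minutes, lambda = 4
q2 = q1 * q1                   # 2) five calls in each of two disjoint 2-minute intervals
q3 = poisson_pmf(0, r * 10)    # 3) no call during a 10-minute period, lambda = 20
print(q1, q2, q3)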


Change of Variable: Discrete Case

Example: Let X be the number showing on a fair six-sided die. Let Y = X^2 − 3X + 2. Find the
probability distribution of Y.
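
A sketch of the push-forward computation: tabulate Y = X^2 − 3X + 2 over the six equally likely faces and accumulate the probabilities that land on the same value of Y:

from collections import defaultdict
from fractions import Fraction

pmf_Y = defaultdict(Fraction)
for x in range(1, 7):                  # X is uniform on {1, ..., 6}
    y = x**2 - 3*x + 2
    pmf_Y[y] += Fraction(1, 6)

for y in sorted(pmf_Y):
    print(y, pmf_Y[y])                 # x = 1 and x = 2 both map to y = 0, so P(Y = 0) = 2/6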
