
ECE 44000 Fall 2020 – Transmission of Information

Lecture 16
- Chapter 6: Probability and Random Variables
6.2.1 Random Variables

 Random Variable: A “rule” that assigns a numerical value to each possible outcome.
– The term is a misnomer; a random variable is really a function, since it is a rule that assigns the members of one set to those of another.

 Example: The random variable $X_1$ maps the outcomes of a coin toss to either $-1$ or $+1$.
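A minimal Python sketch of this mapping (the outcome labels "heads"/"tails" and the fair-coin simulation are illustrative assumptions; only the ±1 mapping comes from the example):

```python
import random

def X1(outcome: str) -> int:
    """Random variable X1: maps a coin-toss outcome to a number."""
    return 1 if outcome == "heads" else -1

# Simulate one toss of a fair coin and apply the random variable.
outcome = random.choice(["heads", "tails"])
print(outcome, "->", X1(outcome))
```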

6.2.2 Probability (Cumulative) Distribution Functions (CDF)

 The Cumulative Distribution Function (CDF) of a random variable $X$ is defined as:

$F_X(x_1) = P(X \le x_1)$

 The CDF has the following properties:
– $0 \le F_X(x) \le 1$, with $F_X(-\infty) = 0$ and $F_X(\infty) = 1$
– $F_X(x)$ is non-decreasing in $x$
– $P(x_1 < X \le x_2) = F_X(x_2) - F_X(x_1)$

6.2.3 Probability-Density Function

 The Probability-Density Function (PDF) of a random variable is defined as the derivative of the CDF:

$f_X(x) = \dfrac{d}{dx} F_X(x)$
 The PDF has the following properties:

$f_X(x) \ge 0$

$\displaystyle\int_{-\infty}^{\infty} f_X(x)\,dx = 1$

$\displaystyle\int_{-\infty}^{x} f_X(\lambda)\,d\lambda = F_X(x)$

$\displaystyle\int_{a}^{b} f_X(x)\,dx = P(a < X \le b)$
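A small numerical check of these properties, using the standard Gaussian pdf as a stand-in density (an illustrative assumption; any valid pdf would do):

```python
import numpy as np

# Standard Gaussian pdf as an example density.
def f_X(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

x = np.linspace(-8, 8, 20001)
dx = x[1] - x[0]

total = np.sum(f_X(x)) * dx        # should be close to 1 (normalization)
F_X = np.cumsum(f_X(x)) * dx       # running integral approximates the CDF
a, b = -1.0, 1.0
prob = F_X[np.searchsorted(x, b)] - F_X[np.searchsorted(x, a)]

print(f"integral of pdf ≈ {total:.4f}")   # close to 1.0000
print(f"P(-1 < X <= 1) ≈ {prob:.4f}")     # close to 0.6827
```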
6.2.3 Probability-Density Function

[Figure: a CDF $F_X(x)$ and the corresponding pdf $f_X(x)$, related by]

$f_X(x) = \dfrac{d}{dx} F_X(x)$

6.4 Some Useful Distributions

 Uniform Distribution – each value is equally likely to be observed.

Discrete:

$P(X = k) = \dfrac{1}{N}, \qquad k = 1, 2, \ldots, N$

Continuous:

$f_X(x) = \begin{cases} \dfrac{1}{b-a}, & a \le x \le b, \\ 0, & \text{otherwise.} \end{cases}$

https://en.wikipedia.org/wiki/Discrete_uniform_distribution
https://en.wikipedia.org/wiki/Uniform_distribution_(continuous)
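A short sketch checking the continuous uniform pdf against samples (the interval $[2, 5]$ and sample size are illustrative choices):

```python
import numpy as np

a, b = 2.0, 5.0                       # illustrative interval endpoints
samples = np.random.uniform(a, b, size=100_000)

# The pdf is constant at 1/(b - a) on [a, b]; the sample mean should be near (a + b)/2.
print("pdf height 1/(b-a):", 1 / (b - a))        # 0.333...
print("sample mean:", samples.mean())            # close to 3.5
print("theoretical mean (a+b)/2:", (a + b) / 2)  # 3.5
```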
6.4.1 Binomial Distribution

 Binomial Distribution – describes the number of successes in a series of $n$ independent yes/no experiments, all with the same probability of success $p$ (and failure probability $q = 1 - p$):

$P(X = k) = \dbinom{n}{k} p^{k} q^{\,n-k}, \qquad k = 0, 1, 2, \ldots, n$

– Example: Find the probability of $k$ heads in $n$ tosses of a coin if the probability of a head is $p$ and of a tail is $q$. Any particular sequence of $k$ heads and $n-k$ tails occurs with probability $p^{k} q^{\,n-k}$, and it is only one of $\dbinom{n}{k}$ such sequences.
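A minimal check of the binomial pmf (the values $n = 10$, $k = 3$, $p = 0.5$ are only illustrative):

```python
from math import comb

n, k, p = 10, 3, 0.5     # illustrative: 10 tosses, 3 heads, fair coin
q = 1 - p

pmf = comb(n, k) * p**k * q**(n - k)
print(f"P(X = {k}) = {pmf:.4f}")   # 120 * 0.5**10, about 0.1172
```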

6.4 Some Useful Distributions

 Bernoulli Distribution: takes value 1 with probability $p$ and value 0 with probability $q = 1 - p$. It can be used to represent a (possibly biased) coin toss where 1 and 0 represent “heads” and “tails” (or vice versa), respectively, and $p$ is the probability of the coin landing on the side labeled 1.

$P(X = 0) = q, \qquad P(X = 1) = p$
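As a quick worked example (anticipating the expectation formulas of Section 6.3; these results are also among those listed in Table 6.4 of the text), the Bernoulli mean and variance follow directly:

$$E[X] = 0 \cdot q + 1 \cdot p = p, \qquad E[X^2] = 0^2 \cdot q + 1^2 \cdot p = p,$$
$$\sigma_X^2 = E[X^2] - (E[X])^2 = p - p^2 = pq.$$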

6.4 Some Useful Distributions

 Table 6.4 in the text gives a listing of several other discrete and continuous distribution functions along with their means and variances.

6.2.4 Joint CDFs and pdfs

 Some chance experiments must be characterized by two or more random variables.

 Consider the chance experiment in which darts are repeatedly thrown at a target.

 The joint CDF of $X$ and $Y$ and the joint pdf of $X$ and $Y$ are defined below.

Note: The comma in $P(X \le x, Y \le y)$ is interpreted as meaning “and”.
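In equation form:

$$F_{XY}(x, y) = P(X \le x,\; Y \le y), \qquad f_{XY}(x, y) = \frac{\partial^{2} F_{XY}(x, y)}{\partial x\, \partial y}$$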

6.2.4 Joint CDFs and pdfs

 The probability that the random variables $X$ and $Y$ lie within a given range is then given by the double integral below.

 Just as with one variable, if we include the entire sample space by letting the limits go to infinity, the normalization condition below follows.
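In equation form:

$$P(a_1 < X \le b_1,\; a_2 < Y \le b_2) = \int_{a_2}^{b_2}\!\int_{a_1}^{b_1} f_{XY}(x, y)\,dx\,dy$$

$$F_{XY}(\infty, \infty) = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} f_{XY}(x, y)\,dx\,dy = 1$$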

6.2.4 Joint CDFs and pdfs

 The marginal CDFs can be found by including the entire sample space of the other variable.

 From this we have the following relationships:
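In equation form:

$$F_X(x) = F_{XY}(x, \infty) = \int_{-\infty}^{x}\!\int_{-\infty}^{\infty} f_{XY}(x', y)\,dy\,dx', \qquad F_Y(y) = F_{XY}(\infty, y) = \int_{-\infty}^{y}\!\int_{-\infty}^{\infty} f_{XY}(x, y')\,dx\,dy'$$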

6.2.4 Joint CDFs and pdfs

 We can find the marginal pdfs from the marginal CDFs as follows. Since $F_X(x) = F_{XY}(x, \infty)$ and $F_Y(y) = F_{XY}(\infty, y)$, differentiating gives the marginal pdfs below:
– integrate over $y$ for a particular value of $x$ to obtain $f_X(x)$;
– integrate over $x$ for a particular value of $y$ to obtain $f_Y(y)$.
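In equation form:

$$f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dx$$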

6.2.4 Joint CDFs and pdfs

 The conditional pdf of a random variable $Y$, given that the value of the random variable $X$ is equal to $x$, is:
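In equation form (together with the companion conditional pdf of $X$ given $Y = y$):

$$f_{Y|X}(y \mid x) = \frac{f_{XY}(x, y)}{f_X(x)}, \qquad f_{X|Y}(x \mid y) = \frac{f_{XY}(x, y)}{f_Y(y)}$$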

 If the pdf after knowledge of $X$ is the same as the pdf without knowledge of $X$, then the random variables are said to be statistically independent.

6.2.4 Joint CDFs and pdfs

 If two random variables are independent, the following relations will be true:
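Namely, the joint functions factor and conditioning has no effect:

$$f_{XY}(x, y) = f_X(x)\, f_Y(y), \qquad F_{XY}(x, y) = F_X(x)\, F_Y(y), \qquad f_{Y|X}(y \mid x) = f_Y(y)$$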

6.2.4 Joint CDFs and pdfs

 Summary: the preceding slides collect the key properties of joint and marginal CDFs and pdfs.

6.2.4 Joint CDFs and pdfs

Example: Let the joint pdf of $X$ and $Y$ be $f_{XY}(x, y) = C(1 + xy)$ for $0 \le x \le 4$, $0 \le y \le 2$, and zero otherwise.

a) Find $C$ so that $f_{XY}(x, y)$ is a valid joint pdf:

$1 = \displaystyle\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} f_{XY}(x, y)\,dx\,dy = \int_{0}^{2}\!\int_{0}^{4} C(1 + xy)\,dx\,dy$

$= \displaystyle\int_{0}^{2}\!\int_{0}^{4} C\,dx\,dy + \int_{0}^{2}\!\int_{0}^{4} Cxy\,dx\,dy = 8C + \int_{0}^{2} \frac{C}{2}(4)^{2}\, y\,dy = 8C + 8C\left(\tfrac{1}{2}\right)(2)^{2}$

$= 8C + 16C = 24C \;\;\Rightarrow\;\; C = \dfrac{1}{24}, \qquad f_{XY}(x, y) = \dfrac{1}{24}(1 + xy)$

b) $f_{XY}(1, 1.5) = \dfrac{1}{24}(1 + 1 \cdot 1.5) = 0.104$
6.2.4 Joint CDFs and pdfs

c) $f_{XY}(1, 3) = 0$ (outside the range of $y$)

d) Note, from conditional probability: $f_{X|Y}(x \mid y) = \dfrac{f_{XY}(x, y)}{f_Y(y)}$

$f_Y(y) = \displaystyle\int_{0}^{4} \frac{1}{24}(1 + xy)\,dx = \frac{1}{24}\left[x + \frac{1}{2}x^{2}y\right]_{0}^{4} = \frac{1}{24}(4 + 8y) = \frac{1}{6}(1 + 2y)$

$f_{X|Y}(x \mid y) = \dfrac{f_{XY}(x, y)}{f_Y(y)} = \dfrac{\frac{1}{24}(1 + xy)}{\frac{1}{6}(1 + 2y)} = \dfrac{1 + xy}{4(1 + 2y)}$

$f_{X|Y}(x \mid 1) = \dfrac{1 + x}{4(1 + 2)} = \dfrac{1 + x}{12}$
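A quick numerical sanity check of part (a), sketched here with scipy (any numerical integrator would do):

```python
from scipy.integrate import dblquad

# Integrate (1 + x*y) over 0 <= x <= 4, 0 <= y <= 2; the result should equal 1/C = 24.
val, _ = dblquad(lambda y, x: 1 + x * y, 0, 4, 0, 2)
print("integral of (1 + xy):", val)            # 24.0, so C = 1/24
print("f_XY(1, 1.5):", (1 + 1 * 1.5) / val)    # about 0.104
```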
6.2.5 Transformation of Random Variables

 Situations are often encountered where the pdf (or CDF) of a random variable $X$ is known and we desire the pdf of a second random variable $Y$ defined as a function of $X$:

$Y = g(X)$

 The following relationship can be made between the pdfs:
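For a monotonic transformation $y = g(x)$, the standard relationship is:

$$f_Y(y) = f_X(x)\left|\frac{dx}{dy}\right|\Bigg|_{x = g^{-1}(y)}$$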

6.2.5 Transformation of Random Variables

 For two or more random variables:

 Suppose two new random variables $U$ and $V$ are defined in terms of two original, jointly distributed random variables $X$ and $Y$ by the relations $U = g_1(X, Y)$ and $V = g_2(X, Y)$.

 The new pdf is obtained from the old pdf using the Jacobian of the transformation:
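In standard form, with $x$ and $y$ expressed in terms of $u$ and $v$:

$$f_{UV}(u, v) = f_{XY}(x, y)\left|\frac{\partial(x, y)}{\partial(u, v)}\right|, \qquad \frac{\partial(x, y)}{\partial(u, v)} = \det\begin{bmatrix} \dfrac{\partial x}{\partial u} & \dfrac{\partial x}{\partial v} \\[1ex] \dfrac{\partial y}{\partial u} & \dfrac{\partial y}{\partial v} \end{bmatrix}$$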

6.3 Statistical Averages
Expectation of Discrete and Continuous RV

 The Expectation of a discrete random variable $X$, which takes on the possible values $x_1, x_2, \ldots, x_M$ with respective probabilities $P_1, P_2, \ldots, P_M$, is:

Discrete RV: $\bar{X} = E[X] = \mu_X = \displaystyle\sum_{j=1}^{M} x_j P_j$

 For a continuous random variable this is:

Continuous RV: $E[X] = \mu_X = \displaystyle\int_{-\infty}^{\infty} x\, f_X(x)\,dx$
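A short sketch computing the discrete expectation for a fair six-sided die (an illustrative choice of random variable):

```python
# Fair six-sided die: values 1..6, each with probability 1/6.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

mean = sum(x * p for x, p in zip(values, probs))
print("E[X] =", mean)   # 3.5
```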

6.3.3 Average of a Function of a Random Variable

 The Expected Value of a function of a discrete random variable can be found by setting $Y = g(X)$ and substituting in the equations before. The expected value of $g(X)$ is then:

Discrete RV: $E[g(X)] = \displaystyle\sum_{j=1}^{M} g(x_j)\, P(X = x_j)$

 For a continuous random variable this is:

Continuous RV: $E[g(X)] = \displaystyle\int_{-\infty}^{\infty} g(x)\, f_X(x)\,dx$
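Continuing the illustrative die example, a sketch of $E[g(X)]$ with $g(x) = x^2$:

```python
# E[g(X)] for g(x) = x**2, using the same fair-die values and probabilities.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

mean_sq = sum((x ** 2) * p for x, p in zip(values, probs))
print("E[X^2] =", mean_sq)   # 91/6, about 15.1667
```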

6.3.5 Variance of a Random Variable

 The variance of the random variable $X$ is given below.

 The symbol $\sigma_X$ is called the standard deviation of $X$ and is a measure of the concentration of the pdf of $X$ about its mean.

 A useful relation for obtaining $\sigma_X^2$ is also given below.
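The standard definition and the useful shortcut relation are:

$$\sigma_X^{2} = E\!\left[(X - \mu_X)^{2}\right] = \int_{-\infty}^{\infty} (x - \mu_X)^{2}\, f_X(x)\,dx$$

$$\sigma_X^{2} = E[X^{2}] - \mu_X^{2}$$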
6.3.6 Average of a Linear Combination of N Random Variables

 If $X_1, X_2, \ldots, X_N$ are random variables and $a_1, a_2, \ldots, a_N$ are arbitrary constants, then the average of the linear combination is as given below.

 If $X_1, X_2, \ldots, X_N$ are statistically independent random variables, then the further relation below also holds.
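The first relation below holds for any constants; the second (for statistically independent $X_i$) is the standard companion result for the variance of the combination:

$$E\!\left[\sum_{i=1}^{N} a_i X_i\right] = \sum_{i=1}^{N} a_i\, E[X_i]$$

$$\operatorname{Var}\!\left[\sum_{i=1}^{N} a_i X_i\right] = \sum_{i=1}^{N} a_i^{2}\, \sigma_{X_i}^{2} \qquad \text{(statistically independent } X_i\text{)}$$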

6.3.10 Covariance and the Correlation Coefficient

 Two useful joint averages of a pair of random variables $X$ and $Y$ are their covariance $\mu_{XY}$, defined below,

 and their correlation coefficient $\rho_{XY}$, which is written in terms of the covariance as shown below.

 From these there is the following relationship:
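In equation form (the bound on $\rho_{XY}$ is the relationship that follows):

$$\mu_{XY} = E\big[(X - \mu_X)(Y - \mu_Y)\big] = E[XY] - \mu_X\,\mu_Y$$

$$\rho_{XY} = \frac{\mu_{XY}}{\sigma_X\,\sigma_Y}, \qquad -1 \le \rho_{XY} \le 1$$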

