
EE 309: Probability Methods in Engineering
Lecture 6
Dr. Sadia Murawwat
BE V (2016-2020)
Fall 2018
2
Chapter 3: Random Variables and Probability Distributions
 Concept of random variable
 Discrete probability distributions
 Continuous probability distributions
 Joint/marginal/conditional probability distributions
 Statistical independence
 Examples: 16
 Definitions: 13
3
Joint Probability Distributions
4 Joint Probability Distributions
If X and Y are two discrete random variables, the probability
distribution for their simultaneous occurrence can be
represented by a function with values f(x, y) for any pair of
values (x, y) within the ranges of the random variables X and
Y. It is customary to refer to this function as the joint
probability distribution of X and Y.
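The formula slides that follow are images and are not reproduced in this extract; the standard textbook conditions defining such a joint pmf are:

```latex
% Joint pmf of two discrete random variables X and Y
f(x, y) = P(X = x, \, Y = y), \qquad
f(x, y) \ge 0 \ \text{for all } (x, y), \qquad
\sum_{x}\sum_{y} f(x, y) = 1
```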
5 Joint Probability Density Function
6 Joint Probability Mass Function
7 In conclusion, joint
8 Example#1
A certain market has both an express checkout line and a
superexpress checkout line. Let X denote the number of
customers in line at the express checkout at a particular
time of day, and let Y denote the number of customers in
line at the superexpress checkout at the same time. Suppose
the joint pmf of X and Y is as given in the following table:
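The table itself is on a slide image and is not reproduced in this extract. The sketch below uses a made-up joint pmf of the same shape to show how such a table is queried; every numeric value here is hypothetical, not the slide's data.

```python
# Hypothetical joint pmf for the checkout example; keys are (x, y),
# with x customers in the express line and y in the superexpress line.
# All values below are made up for illustration.
joint_pmf = {
    (0, 0): 0.08, (0, 1): 0.07, (0, 2): 0.04,
    (1, 0): 0.06, (1, 1): 0.15, (1, 2): 0.05, (1, 3): 0.04,
    (2, 0): 0.05, (2, 1): 0.04, (2, 2): 0.10, (2, 3): 0.06,
    (3, 1): 0.03, (3, 2): 0.04, (3, 3): 0.07,
    (4, 1): 0.01, (4, 2): 0.05, (4, 3): 0.06,
}

# A valid joint pmf must sum to 1 over all (x, y) pairs.
total = sum(joint_pmf.values())
print(round(total, 10))            # 1.0

# P(X = 1 and Y = 1): read the table entry directly.
print(joint_pmf[(1, 1)])           # 0.15

# P(X = Y): sum the entries on the diagonal of the table.
p_equal = sum(p for (x, y), p in joint_pmf.items() if x == y)
print(round(p_equal, 10))          # 0.4
```

Any event about (X, Y), such as "both lines are equally long", reduces to summing the relevant table entries in this way.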
9 Solution
10 Cont..
11 Contd.
12 H.W
An instructor has given a short quiz consisting of two
parts. For a randomly selected student, let X be the
number of points earned on the first part and Y be the
number of points earned on the second part. Suppose the
joint pmf of X and Y is given in the following table:
13 Example JOINT (discrete)
14 Solution:
15
16 Example#2
17 Cont..
18 Example #3 Joint (Cont.)
19 Solution
20 Example #4
21
Conditional Probability Distributions
22 Conditional
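The content of this slide is an image not reproduced in the extract; in the notation used later in this deck (joint pmf f(x, y), marginals g(x) and h(y)), the standard definition is:

```latex
% Conditional distribution of Y given X = x (defined when g(x) > 0),
% and symmetrically for X given Y = y (defined when h(y) > 0)
f(y \mid x) = \frac{f(x, y)}{g(x)}, \qquad
f(x \mid y) = \frac{f(x, y)}{h(y)}
```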
23 Example#1
24 Cont..
25
Marginal Probability Distributions
26 Marginal distribution
 More challenging, but also more useful, is to find
the marginal density from the joint density,
assuming it exists.
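In the discrete case, each marginal pmf is obtained by summing the joint pmf over the other variable (for densities, the sum becomes an integral). A minimal sketch, using a made-up 2×2 joint pmf written with exact fractions:

```python
from collections import defaultdict
from fractions import Fraction

# Hypothetical joint pmf f(x, y); the values are illustrative only.
f = {
    (0, 0): Fraction(1, 10), (0, 1): Fraction(2, 10),
    (1, 0): Fraction(3, 10), (1, 1): Fraction(4, 10),
}

g = defaultdict(Fraction)  # marginal of X: g(x) = sum over y of f(x, y)
h = defaultdict(Fraction)  # marginal of Y: h(y) = sum over x of f(x, y)
for (x, y), p in f.items():
    g[x] += p
    h[y] += p

print(dict(g))  # {0: Fraction(3, 10), 1: Fraction(7, 10)}
print(dict(h))  # {0: Fraction(2, 5), 1: Fraction(3, 5)}
```

Each marginal is itself a valid pmf: its values are nonnegative and sum to 1.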
27 Marginal distribution
28 Example#1 Marginal
29 Example#2 Marginal
30
Statistical Independence
31 5. Statistical independence
 When two events are statistically independent, knowing
whether one of them occurs makes it neither more probable
nor less probable that the other occurs. In other words, the
occurrence of one event does not affect the occurrence of
the other event.

 Similarly, when we assert that two random variables are
independent, we intuitively mean that knowing something
about the value of one of them does not yield any
information about the value of the other.
32 Statistical Independence (Representation)
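The representation slide is an image; with joint pmf f(x, y) and marginals g(x) and h(y), as used in Example#2 below, the standard condition is:

```latex
% X and Y are statistically independent if and only if the joint pmf
% factors into the product of the marginals at every point:
f(x, y) = g(x)\, h(y) \quad \text{for all } (x, y)
```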
33 Example#1
 The number of people crossing the road in one direction
has no bearing on the number of people crossing in the
opposite direction. The two occurrences are therefore said
to be statistically independent of each other.

 The number appearing on the upward face of a die the
first time it is thrown and that appearing on the same die
the second time are independent, e.g. the event of getting
a "1" on the first throw and the event of getting a "1" on
the second throw are independent.
34
 Exercise:
 What conditions make two random variables
statistically independent?
35 Answer
 The condition for statistical independence is that
the outcome of one event does not affect the
outcome of the other.
36 Example#2
Show that the random variables X and Y are not
statistically independent.
37 Ex.
 Let us consider the point (0, 1).
 From the table we find three probabilities:
f(0, 1) = 3/14, g(0) = 5/14, and h(1) = 3/7.
Clearly, f(0, 1) ≠ g(0)h(1).
Therefore, X and Y are not statistically independent.
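The check above can be reproduced with exact arithmetic; the three probabilities are the slide's values, while the variable names are introduced here for illustration:

```python
from fractions import Fraction

# Probabilities read from the slide's table for the point (0, 1)
f_01 = Fraction(3, 14)  # joint pmf value f(0, 1)
g_0 = Fraction(5, 14)   # marginal g(0)
h_1 = Fraction(3, 7)    # marginal h(1)

# Independence would require f(0, 1) == g(0) * h(1)
print(g_0 * h_1)          # 15/98
print(f_01 == g_0 * h_1)  # False: 3/14 = 21/98, so X and Y are dependent
```

A single point where the joint pmf fails to factor is enough to rule out independence; the product condition must hold at every (x, y).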
