ECN 652 Lecture 3 Random Variable


ECN 652: Quantitative Methods

Handout # 3

Random Variable and Probability Distribution

Prof. M. B. Ranathilaka and Dr. T. Vinayagathasan
Department of Economics and Statistics
University of Peradeniya

Random Variable
• In probability and statistics, a random variable is described
– informally as a variable whose value depends on the outcome of a
random phenomenon.
– formally as a measurable function defined on a probability space,
whose outcomes are typically real numbers.
• A random variable has a probability distribution, which specifies the
probability of each of its values.
• Probability distributions are classified as either discrete or continuous,
depending on the nature of the variable being considered.
• Therefore, a random variable can be a
 Discrete Random Variable
 Continuous Random Variable



Random Variable….
• Discrete random variables: take any value from a specified finite or
countable list of values. A discrete random variable is endowed with a
probability mass function, which characterizes its probability
distribution.
• Continuous random variables: take any numerical value in an
interval or collection of intervals, via a probability density function,
which characterizes the random variable's probability distribution.
Such a variable has an infinite continuum of possible values.
– Examples: blood pressure, weight, the speed of a car, the real
numbers from 1 to 6.
• A random variable X takes on a defined set of values with different
probabilities.
– For example, if you roll a die, the outcome is random (not fixed),
and there are 6 possible outcomes, each of which occurs with
probability one-sixth.
• Roughly, probability is how frequently we expect different outcomes
to occur if we repeat the experiment over and over.
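This frequency interpretation can be illustrated with a short simulation, sketched here in Python (the seed and number of rolls are arbitrary choices, not part of the handout):

```python
import random
from collections import Counter

random.seed(42)  # fixed seed so the run is reproducible

# Repeat the die-roll experiment many times and tabulate relative frequencies.
n = 60_000
counts = Counter(random.randint(1, 6) for _ in range(n))
freqs = {face: counts[face] / n for face in range(1, 7)}

for face, freq in sorted(freqs.items()):
    print(face, round(freq, 3))  # each should be near 1/6 ≈ 0.167
```

As the number of repetitions grows, each relative frequency settles near the theoretical probability 1/6.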



Probability Distribution
• In probability theory & statistics, a probability distribution
is a mathematical function that provides the probabilities
of occurrence of different possible outcomes in an
experiment.
• In other words, the probability distribution is a description
of a random phenomenon in terms of the probabilities of
events. For instance:
– if the random variable Y is used to denote the outcome of a
coin toss ("the experiment"), then the probability
distribution of Y would take the value 0.5 for Y = heads, and
0.5 for Y= tails (assuming the coin is fair).
– if the random variable Y is used to denote the outcome of
throwing a fair die ("the experiment"), then each of the six
values 1 to 6 has probability 1/6.
– Examples of random phenomena can include the results of
an experiment or survey.
Probability Distribution…
• A probability distribution is defined in terms of an
underlying sample space, which is the set of all possible
outcomes of the random phenomenon being observed.
• The sample space may be the set of real numbers or a
higher-dimensional vector space, or it may be a list of
non-numerical values; for example, the sample space of a
coin flip would be {heads, tails}.
• Probability distributions are generally divided into two
classes.
1. A discrete probability distribution
2. A continuous probability distribution



Probability Distribution…
1. A discrete probability distribution can be encoded by a discrete list of the
probabilities of the outcomes, known as a probability mass function. This
function is applicable to the scenarios where the set of possible outcomes
is discrete.
– Example: a coin toss or a roll of a die.
– Some examples of discrete probability distributions are:
 Bernoulli distribution
 Binomial distribution
 Poisson distribution
 Uniform distribution
2. A continuous probability distribution is typically described by a probability
density function (with the probability of any individual outcome being
exactly 0). This applies to scenarios where the set of possible
outcomes can take on values in a continuous range (e.g., real numbers).
– Example: the temperature on a given day
– Some examples of continuous probability distributions are:
 Normal and Standard Normal probability distribution
 Chi-squared probability distribution
 F- probability distribution
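For concreteness, the PMFs of two of the discrete distributions named above can be written directly from their standard formulas. A minimal sketch using only the standard library (the function names are illustrative, not from the handout):

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): C(n, k) * p**k * (1-p)**(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam): e**(-lam) * lam**k / k!."""
    return exp(-lam) * lam**k / factorial(k)

# The Bernoulli distribution is the n = 1 special case of the binomial.
# Example: probability of exactly 2 heads in 3 fair tosses = 3/8.
print(binom_pmf(2, 3, 0.5))  # 0.375
```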
Probability Distribution…
• A continuous probability distribution differs from a discrete probability
distribution in several ways.
– The probability that a continuous random variable will assume a particular
value is zero. As a result, a continuous probability distribution cannot be
expressed in tabular form.
– A probability distribution whose sample space is the set of real numbers is
called univariate, while a distribution whose sample space is a vector
space is called multivariate.
– A univariate distribution gives the probabilities of a single random
variable taking on various alternative values.
– Important and commonly encountered univariate probability distributions
include the binomial distribution, the hypergeometric distribution, and
the normal distribution.
– A multivariate distribution (a joint probability distribution) gives the
probabilities of a random vector – a list of two or more random variables –
taking on various combinations of values.
– The multivariate normal distribution is a commonly encountered
multivariate distribution.
Probability functions
• The probability distribution of a discrete random
variable is a graph, table or formula that specifies
the probability associated with each possible
outcome the random variable can assume.
– p(x) ≥ 0 for all values of x
– ∑ₓ p(x) = 1
• A probability function maps the possible values of
x against their respective probabilities of
occurrence, p(x)
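The two conditions above can be checked mechanically for any candidate PMF. A minimal Python sketch (`is_valid_pmf` is a hypothetical helper name, not from the handout):

```python
# A PMF represented as a dict mapping each value x to p(x) (fair-die example).
die_pmf = {x: 1/6 for x in range(1, 7)}

def is_valid_pmf(pmf, tol=1e-9):
    """Check the two defining conditions: p(x) >= 0 for all x, and sum of p(x) = 1."""
    nonnegative = all(p >= 0 for p in pmf.values())
    sums_to_one = abs(sum(pmf.values()) - 1.0) < tol
    return nonnegative and sums_to_one

print(is_valid_pmf(die_pmf))           # True
print(is_valid_pmf({0: 0.5, 1: 0.6}))  # False: probabilities sum to 1.1
```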



Discrete variable example: Roll of a die (PMF)
X      P(X)
1      P(X = 1) = 1/6
2      P(X = 2) = 1/6
3      P(X = 3) = 1/6
4      P(X = 4) = 1/6
5      P(X = 5) = 1/6
6      P(X = 6) = 1/6

∑ P(x) = 1, summing over all x

[Figure: PMF bar chart — p(x) = 1/6 for each x = 1, 2, …, 6]



Cumulative distribution function (CDF)
• CDF: The probability mass function (PMF) is one way to describe the
distribution of a discrete random variable. However, a PMF cannot be
defined for a continuous variable. The CDF of a random variable is another
way to describe its distribution. The advantage of the CDF is that it can be
defined for any kind of random variable (discrete, continuous, and mixed).
• The CDF of random variable X is defined as:
F_X(x) = P(X ≤ x), for all x ∈ ℝ
• Note that the subscript X indicates that this is the CDF of the RV X. Also
note that the CDF is defined for all 𝑥 ∈ ℝ.
• Let us look at an example: rolling a fair die

[Figure: CDF step plot for a fair die — F(x) increases by 1/6 at each of x = 1, 2, …, 6, rising from 1/6 up to 1.0]
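For the die example, the CDF at any point is just the running sum of the PMF up to that point. A small Python sketch (`F` is a hypothetical helper name, not from the handout):

```python
from itertools import accumulate

# PMF of a fair die; the CDF at each face is the running total of the PMF.
faces = [1, 2, 3, 4, 5, 6]
pmf = [1/6] * 6
cdf = list(accumulate(pmf))  # F(1) = 1/6, F(2) = 2/6, ..., F(6) = 1

def F(x):
    """F_X(x) = P(X <= x), defined for every real x (0 below 1, 1 above 6)."""
    return sum(p for face, p in zip(faces, pmf) if face <= x)

print(round(F(3.5), 3))  # P(X <= 3.5) = 3/6 = 0.5
```

Note that `F` is defined for all real x, matching the definition above, while the PMF assigns mass only at the six faces.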



Mean
• The mean, or expected value, of a discrete random
variable is defined as:
E(X) = ∑ᵢ₌₁ⁿ Xᵢ P(Xᵢ), for i = 1, 2, …, n
• Example: Assume that we have three fair coins and we toss them
simultaneously. The possible numbers of heads that can appear as a result of the
random experiment are given in the following table.

# of Heads (X)    0      1      2      3
P(X)              1/8    3/8    3/8    1/8

E(X) = ∑ᵢ₌₀³ Xᵢ P(Xᵢ), for i = 0, 1, 2, 3
E(X) = X₀P(X₀) + X₁P(X₁) + X₂P(X₂) + X₃P(X₃)
E(X) = 0·(1/8) + 1·(3/8) + 2·(3/8) + 3·(1/8) = 12/8 = 1.5
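The three-coin calculation can be reproduced in a few lines of Python (a sketch; the lists simply encode the example's values and probabilities):

```python
# Number of heads in three fair-coin tosses and its PMF.
xs = [0, 1, 2, 3]
ps = [1/8, 3/8, 3/8, 1/8]

# E(X) = sum over i of X_i * P(X_i)
mean = sum(x * p for x, p in zip(xs, ps))
print(mean)  # 1.5
```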
Mean
• The mean, or expected value, of a continuous random variable
is defined as:

E(X) = ∫₋∞^∞ x f(x) dx

Example: f(x) = 1 − (1/2)x, 0 < x < 2

E(X) = ∫₀² x(1 − (1/2)x) dx
     = ∫₀² (x − (1/2)x²) dx = [x²/2 − (1/2)(x³/3)]₀²
     = (2²/2 − (1/2)(2³/3)) − 0
     = 4/2 − (1/2)(8/3)
     = 2 − 4/3
     = (6 − 4)/3 = 2/3
Variance
• The variance of a discrete random variable x is
Var(X) = E(X²) − [E(X)]²
where E(X²) = ∑ᵢ₌₁ⁿ xᵢ² P(xᵢ), for i = 1, 2, 3, …, n
and E(X) = μ, so that
Var(X) = ∑ᵢ₌₁ⁿ Xᵢ² P(Xᵢ) − μ²
• The standard deviation of a discrete random
variable x is
SD(X) = √Var(X) = √(∑ᵢ₌₁ⁿ Xᵢ² P(Xᵢ) − μ²), for i = 1, 2, 3, …, n



Variance: Example
Example: Assume that we have three fair coins and we toss them simultaneously.
The possible number of heads that can appear as a result of the random
experiment are given in the following table.
# of Heads (X)    X²    P(X)    X·P(X)    X²·P(X)
0                 0     1/8     0         0
1                 1     3/8     3/8       3/8
2                 4     3/8     6/8       12/8
3                 9     1/8     3/8       9/8

E(X) = ∑ᵢ₌₀³ Xᵢ P(Xᵢ) = X₀P(X₀) + X₁P(X₁) + X₂P(X₂) + X₃P(X₃)
E(X) = 0·(1/8) + 1·(3/8) + 2·(3/8) + 3·(1/8) = 12/8 = 1.5
E(X²) = ∑ᵢ₌₀³ xᵢ² P(xᵢ) = x₀²P(x₀) + x₁²P(x₁) + x₂²P(x₂) + x₃²P(x₃)
E(X²) = 0·(1/8) + 1·(3/8) + 4·(3/8) + 9·(1/8) = 24/8 = 3
Var(X) = E(X²) − [E(X)]² = 3 − 1.5² = 3 − 2.25 = 0.75
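The table's arithmetic can be reproduced directly (a Python sketch of the same calculation):

```python
# Three-coin example: Var(X) = E(X^2) - [E(X)]^2.
xs = [0, 1, 2, 3]
ps = [1/8, 3/8, 3/8, 1/8]

ex = sum(x * p for x, p in zip(xs, ps))       # E(X)   = 1.5
ex2 = sum(x**2 * p for x, p in zip(xs, ps))   # E(X^2) = 3.0
var = ex2 - ex**2                             # 3 - 2.25 = 0.75
sd = var ** 0.5                               # standard deviation
print(var)  # 0.75
```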



Variance
• The variance of a continuous random variable is defined as:
Var(X) = E(X²) − [E(X)]² = ∫₋∞^∞ x² f(x) dx − (∫₋∞^∞ x f(x) dx)²

Example: f(x) = 1 − (1/2)x, 0 < x < 2

E(X) = ∫₀² x(1 − (1/2)x) dx
     = ∫₀² (x − (1/2)x²) dx = [x²/2 − (1/2)(x³/3)]₀²
     = (2²/2 − (1/2)(2³/3)) − 0
     = 4/2 − (1/2)(8/3)
     = 2 − 4/3
     = (6 − 4)/3 = 2/3



Variance
• Continuing the example, compute E(X²) and then the variance:

E(X²) = ∫₀² x²(1 − (1/2)x) dx
      = ∫₀² (x² − (1/2)x³) dx = [x³/3 − (1/2)(x⁴/4)]₀²
      = (2³/3 − (1/2)(2⁴/4)) − 0
      = 8/3 − (1/2)(16/4)
      = 8/3 − 8/4
      = 8/3 − 2
      = (8 − 6)/3 = 2/3

Var(X) = E(X²) − [E(X)]² = 2/3 − (2/3)² = 2/3 − 4/9 = (6 − 4)/9 = 2/9
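Both moments can be checked numerically with the same midpoint-rule idea used for the mean (a sketch; `moment` is a hypothetical helper name):

```python
# Midpoint Riemann sums for the moments of f(x) = 1 - x/2 on (0, 2),
# to check E(X) = 2/3, E(X^2) = 2/3, and Var(X) = 2/9.
def f(x):
    return 1 - x / 2

def moment(k, n=100_000):
    """Approximate the integral of x**k * f(x) over (0, 2)."""
    h = 2 / n
    return sum(((i + 0.5) * h) ** k * f((i + 0.5) * h) for i in range(n)) * h

ex, ex2 = moment(1), moment(2)
var = ex2 - ex ** 2
print(round(var, 4))  # 0.2222, i.e. 2/9
```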



Laws of Expected Value and Variance
Laws of Expected Value
1. E(c) = c
2. E(cX) = cE(X)
3. E(X + Y) = E(X) + E(Y)
4. E(X − Y) = E(X) − E(Y)
5. E(XY) = E(X)·E(Y) if X and Y are independent
Laws of Variance
1. V(c) = 0
2. V(cX) = c²V(X)
3. V(X + c) = V(X)
4. V(X + Y) = V(X) + V(Y) if X and Y are independent
5. V(X − Y) = V(X) + V(Y) if X and Y are independent (the variances add even when subtracting)
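The addition laws can be illustrated by simulation with two independent draws (a sketch; the seed and sample size are arbitrary choices, and the variance identity holds only approximately in a finite sample, since the sample covariance is not exactly zero):

```python
import random

random.seed(1)  # arbitrary seed for reproducibility

# Independent draws: X ~ fair die, Y ~ fair coin taking values in {0, 1}.
n = 200_000
xs = [random.randint(1, 6) for _ in range(n)]
ys = [random.randint(0, 1) for _ in range(n)]

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / len(v)

sums = [x + y for x, y in zip(xs, ys)]
# E(X + Y) = E(X) + E(Y) holds exactly in any sample;
# V(X + Y) = V(X) + V(Y) holds approximately when X and Y are independent.
print(round(mean(sums) - (mean(xs) + mean(ys)), 6))  # both differences
print(round(var(sums) - (var(xs) + var(ys)), 3))     # are close to 0
```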
