
Introducción a la Inferencia Estadística (Introduction to Statistical Inference)

2nd year, Degree in Business Administration and Management

Facultad de Economía
Universitat de València

All rights reserved.


Economic exploitation or transformation of this work is not permitted. Printing it in full is permitted.
Statistics

Unit 1: probability models

Univariate random variables
1. Random variables and probability distributions

A random variable is a variable that takes specific values with given probabilities.
- We denote random variables with capital letters near the end of the alphabet:
X, Y, U, V, W.
- Notation: a capital letter (X) denotes the random variable itself; the corresponding
lowercase letter (x) denotes a specific value it may take.

Types of random variables:
- Discrete: the outcome of a counting process (e.g. the number of passengers going through
customs on any given day).
- Continuous: the outcome of a measuring process; can take any value in a given
interval (e.g. the time to complete statistics homework).
Defining discrete random variables:
- The probability distribution function P(X = x) assigns probabilities to values of the random
variable. It measures the probability that the random variable X takes on the value x.
- Alternatively, the cumulative distribution function F(x) measures the probability that the
random variable X is less than or equal to x. F(x) = P(X ≤ x)
Defining continuous random variables:
- X is continuous → P(X = x) = 0 for every possible value x.
- The probability density function f(x) assigns probabilities to intervals of values of the
random variable, rather than to specific values.
- In practice, we will use the cumulative distribution function F(x) (defined as in the
discrete case, F(x) = P(X ≤ x)).
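The pmf/cdf relation above can be sketched in a few lines of Python (the notes contain no code; the fair six-sided die used here is an illustrative assumption):

```python
# P(X = x) for a fair die: each face has probability 1/6 (hypothetical example).
pmf = {x: 1 / 6 for x in range(1, 7)}

def F(x):
    """Cumulative distribution function F(x) = P(X <= x)."""
    return sum(p for value, p in pmf.items() if value <= x)

print(F(3))               # P(X <= 3) = 0.5
print(sum(pmf.values()))  # a valid pmf sums to 1
```

Note how F(x) is obtained by accumulating the pmf over all values up to x, exactly as in the definition F(x) = P(X ≤ x).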
Mean and variance of a random variable:
- The mean, or expected value, of a random variable X is E(X) = μ_X. It can be thought of
as the average value attained by the random variable.
- The variance of X is Var(X) = σ²_X. Generally speaking, the greater the variance (or
standard deviation), the more spread out the possible values of the random variable.
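A minimal Python sketch of both definitions, using an invented pmf (the number of heads in two fair coin tosses):

```python
# Illustrative pmf: number of heads in two fair coin tosses (not from the notes).
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

# E(X) = sum of x * P(X = x) over all values x.
mean = sum(x * p for x, p in pmf.items())

# Var(X) = E[(X - mu)^2] = sum of (x - mu)^2 * P(X = x).
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())

print(mean, variance)  # 1.0 0.5
```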

2. Specific univariate models

Discrete probability models:

Continuous probability models:
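The tables of specific models are not reproduced here, but three standard ones used later in this unit (binomial, Poisson, normal) can be sketched with the standard library only; the parameter values below are illustrative:

```python
import math

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ B(n, p): k successes in n Bernoulli trials."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def normal_cdf(x, mu=0.0, sigma=1.0):
    """F(x) = P(X <= x) for X ~ N(mu, sigma^2), via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

print(binomial_pmf(2, 4, 0.5))  # 0.375
print(normal_cdf(0.0))          # 0.5 by symmetry of the normal density
```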


3. Univariate linear transformations

Linear transformations of random variables:


- Let X be a random variable and a and b constants. Then Y = a + bX is a linear
transformation of X. Note that b scales X (for |b| > 1 the transformation amplifies X;
for |b| < 1 it dampens it) and a shifts it.
Mean and variance of linear transformations:
- Y = a + bX
- Mean of Y → E(Y) = a + bE(X) = a + bμ_X
- Variance of Y → Var(Y) = b²Var(X) = b²σ²_X (so the standard deviation of Y is |b|σ_X)
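These two rules can be verified exactly on a small discrete pmf; the values, probabilities and coefficients below are illustrative assumptions:

```python
# Hypothetical pmf and transformation coefficients (not from the notes).
pmf = {0: 0.2, 1: 0.5, 2: 0.3}
a, b = 3.0, 2.0

mean_x = sum(x * p for x, p in pmf.items())
var_x = sum((x - mean_x) ** 2 * p for x, p in pmf.items())

# Y = a + bX keeps each probability, but on the transformed value.
pmf_y = {a + b * x: p for x, p in pmf.items()}
mean_y = sum(y * p for y, p in pmf_y.items())
var_y = sum((y - mean_y) ** 2 * p for y, p in pmf_y.items())

print(abs(mean_y - (a + b * mean_x)) < 1e-9)  # True: E(Y) = a + b*E(X)
print(abs(var_y - b**2 * var_x) < 1e-9)       # True: Var(Y) = b^2 * Var(X)
```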

Special case: transformation of a normal variable


- Let X be a normal random variable: X ~ N(μ_X, σ²_X)
- Define Y = a + bX
- Then Y is also a normal random variable: Y ~ N(a + bμ_X, b²σ²_X)

Bivariate random variables

1. Jointly distributed variables

Goal: to examine the probability distribution of two (bivariate) or more (multivariate) possibly
related random variables.

Bivariate example: a basic economic relation links income (X) and consumption (Y). If we assume
X and Y are random variables then we can define the joint distribution for (X,Y).
The discrete case:
- When (X,Y) are discrete, their distribution is defined by a joint probability distribution
function.
- Joint probability for (X, Y) → P_XY(x, y) = P(X = x, Y = y): the probability that X = x and
Y = y.
- Marginal distribution of X → P_X(x) = P(X = x): the distribution of X regardless of the value
of Y (obtained by summing the joint probabilities over all values of y).

The continuous case:
- If (X,Y) are continuous, their distribution is defined by a joint cumulative distribution
function.
- Joint cumulative prob. for (X, Y) → F_XY(x, y) = P(X ≤ x, Y ≤ y): the probability that X ≤ x
and Y ≤ y.
- Marginal distribution of X → F_X(x) = P(X ≤ x): the cumulative distribution of X regardless
of Y.
Independence:
- (X,Y) are independent if the joint distribution is the product of the marginal distributions.
- Discrete case → P_XY(x, y) = P_X(x) × P_Y(y)
- Continuous case → F_XY(x, y) = F_X(x) × F_Y(y)
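In the discrete case, independence can be checked directly by comparing the joint pmf with the product of its marginals; the joint table below is an invented example:

```python
# Hypothetical joint pmf for a discrete pair (X, Y), built so that
# joint = product of marginals at every point.
joint = {
    (0, 0): 0.12, (0, 1): 0.28,
    (1, 0): 0.18, (1, 1): 0.42,
}

# Marginals: P_X(x) sums over y, P_Y(y) sums over x.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

independent = all(abs(joint[(x, y)] - px[x] * py[y]) < 1e-12
                  for (x, y) in joint)
print(independent)  # True: P(X=x, Y=y) = P_X(x) * P_Y(y) everywhere
```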

2. Measuring linear relations

Covariance is a measure of the linear relation between two random variables:

- σ_XY = Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] = E(XY) − μ_X μ_Y
- If σ_XY > 0 → X tends to be large when Y is large.
- If σ_XY < 0 → X tends to be large when Y is small.
- If σ_XY = 0 → (X, Y) are uncorrelated (no linear relation).
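The shortcut formula σ_XY = E(XY) − μ_X μ_Y translates directly to code; the joint distribution below is an illustrative assumption:

```python
# Invented joint pmf in which large X tends to go with large Y.
joint = {
    (0, 0): 0.3, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.5,
}

mu_x = sum(x * p for (x, y), p in joint.items())
mu_y = sum(y * p for (x, y), p in joint.items())
e_xy = sum(x * y * p for (x, y), p in joint.items())

cov = e_xy - mu_x * mu_y  # sigma_XY = E(XY) - mu_X * mu_Y
print(cov)  # positive: X tends to be large when Y is large
```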

Correlation:

- The correlation coefficient → ρ_XY = σ_XY / (σ_X σ_Y)
- −1 ≤ ρ_XY ≤ 1
- ρ_XY measures the strength of the linear relationship between X and Y.
o ρ_XY > 0 → positive linear relationship.
o ρ_XY < 0 → negative linear relationship.
o ρ_XY = 0 → no linear relationship.
Independence, covariance and correlation:
- If X and Y are independent (no relationship between them), then they are also
uncorrelated (σ_XY = ρ_XY = 0): independent random variables have zero
covariance/correlation.
- However, the converse is not necessarily true. Two variables that are uncorrelated may
or may not be independent.
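A classic counterexample (not from the notes) makes the last point concrete: take X uniform on {−1, 0, 1} and Y = X². The pair is uncorrelated, yet Y is a function of X, so they are clearly dependent:

```python
# X uniform on {-1, 0, 1}; Y = X^2 is completely determined by X.
pmf_x = {-1: 1 / 3, 0: 1 / 3, 1: 1 / 3}
joint = {(x, x * x): p for x, p in pmf_x.items()}

mu_x = sum(x * p for (x, y), p in joint.items())      # 0 by symmetry
mu_y = sum(y * p for (x, y), p in joint.items())      # 2/3
e_xy = sum(x * y * p for (x, y), p in joint.items())  # 0 by symmetry

cov = e_xy - mu_x * mu_y
print(abs(cov) < 1e-12)  # True: uncorrelated

# Dependence: P(X=1, Y=1) = 1/3, but P(X=1) * P(Y=1) = (1/3) * (2/3) = 2/9.
print(abs(joint[(1, 1)] - (1 / 3) * (2 / 3)) > 0.1)  # True: joint != product
```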

3. Multivariate linear transformations

Notation for bivariate distributions:


- Let X and Y be two random variables. A bivariate distribution gives the probability that
each variable falls in any particular range or discrete set of values.
- Let their means, variances and covariance be μ_X, μ_Y; σ²_X, σ²_Y; and σ_XY respectively. We
use the notation:

Linear transformations of two random variables:
- Let W = a + bX + cY. Then E(W) = a + bμ_X + cμ_Y and
Var(W) = b²σ²_X + c²σ²_Y + 2bcσ_XY.

Special case: independent normal random variables:
- If X and Y are independent normal random variables, then W = a + bX + cY is also
normal, with σ_XY = 0 in the variance formula.
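The standard formulas E(W) = a + bμ_X + cμ_Y and Var(W) = b²σ²_X + c²σ²_Y + 2bcσ_XY can be checked exactly on a small joint pmf; the table and coefficients are illustrative assumptions:

```python
# Invented joint pmf and coefficients for W = a + bX + cY.
joint = {
    (0, 0): 0.3, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.5,
}
a, b, c = 1.0, 2.0, -1.0

def E(f):
    """Expectation of f(x, y) under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

mu_x, mu_y = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - mu_x) ** 2)
var_y = E(lambda x, y: (y - mu_y) ** 2)
cov = E(lambda x, y: (x - mu_x) * (y - mu_y))

mean_w = E(lambda x, y: a + b * x + c * y)
var_w = E(lambda x, y: (a + b * x + c * y - mean_w) ** 2)

print(abs(mean_w - (a + b * mu_x + c * mu_y)) < 1e-9)                    # True
print(abs(var_w - (b**2 * var_x + c**2 * var_y + 2 * b * c * cov)) < 1e-9)  # True
```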

Sum of n independent random variables:
- If W = X1 + X2 + ⋯ + Xn with the Xi independent, then E(W) = E(X1) + ⋯ + E(Xn) and
Var(W) = Var(X1) + ⋯ + Var(Xn).

Special case: sum of n independent normal random variables:
- If, in addition, each Xi is normal, then W is also normal.

Reproductive property of specific distributions:

- Let X1, X2, …, Xn be n independent random variables and W their sum: W = X1 + X2 +
⋯ + Xn.
- If the Xi are binary (Bernoulli) random variables, each with the same probability of
success p, then W is a binomial random variable: W ~ B(n, p).
- If the Xi are Poisson random variables, each with the same mean λ, then W is a Poisson
random variable: W ~ Poisson(nλ).
- If the Xi are normal random variables, each with the same mean μ_X and variance σ²_X, then
W is a normal random variable: W ~ N(nμ_X, nσ²_X).
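The reproductive property can be checked numerically for the Poisson case: convolving two Poisson pmfs reproduces the pmf of Poisson(λ1 + λ2). The means below are illustrative:

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

lam1, lam2 = 1.5, 2.5  # hypothetical means

def sum_pmf(w):
    """P(W = w) for W = X1 + X2, by convolving the two Poisson pmfs."""
    return sum(poisson_pmf(k, lam1) * poisson_pmf(w - k, lam2)
               for k in range(w + 1))

ok = all(abs(sum_pmf(w) - poisson_pmf(w, lam1 + lam2)) < 1e-12
         for w in range(15))
print(ok)  # True: W ~ Poisson(lam1 + lam2)
```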

4. The bivariate normal distribution

Let X and Y be two normal random variables:

Consider the linear transformation: W = a + bX + cY


If W is also normal for any set of values a, b, c ∈ ℝ, then (X, Y) follows a bivariate normal distribution.
Notation:
Special case: uncorrelated implies independence:
- If (X, Y) is bivariate normal and ρ_XY = 0, then X and Y are independent: for bivariate
normal variables, the converse of the general rule does hold.
