STA 4853, Lecture 2

Stochastic Process, ACF, PACF, White Noise, Estimation

§2.1: Stationarity §2.2: Autocovariance and Autocorrelation Functions §2.4: White Noise R: Random Walk Homework 1b

Outline

1 §2.1: Stationarity

2 §2.2: Autocovariance and Autocorrelation Functions

3 §2.4: White Noise

4 R: Random Walk

5 Homework 1b

Arthur Berg Stationarity, ACF, White Noise, Estimation 2/ 21


§2.1: Stationarity §2.2: Autocovariance and Autocorrelation Functions §2.4: White Noise R: Random Walk Homework 1b

Stochastic Process
Definition (stochastic process)
A stochastic process is a sequence of indexed random variables denoted as Z(ω, t)
where ω belongs to a sample space and t belongs to an index set.

From here on out, we will simply write a stochastic process (or time series) as
{Zt } (dropping the ω).
Notation
For the time units t1 , t2 , . . . , tn , denote the n-dimensional distribution function of
Zt1 , Zt2 , . . . , Ztn as

FZt1 ,...,Ztn (x1 , . . . , xn ) = P (Zt1 ≤ x1 , . . . , Ztn ≤ xn )

Example
Let Z1, . . . , Zn be iid N(0, 1). Then FZt1,...,Ztn(x1, . . . , xn) = ∏_{j=1}^{n} Φ(xj), where Φ is the standard normal CDF.

Stationarity
Definition (Strong Stationarity (aka Strict, aka Complete Stationarity))
A time series {xt } is strongly stationary if any collection

{xt1 , xt2 , . . . , xtn }

has the same joint distribution as the time shifted set

{xt1 +h , xt2 +h , . . . , xtn +h }

Strong stationarity implies the following:


All marginal distributions are equal, i.e. P(Zs ≤ c) = P(Zt ≤ c) for all s, t,
and c. This is what is called “first-order stationarity”.
The covariance of Zs and Zt (if it exists) is shift-invariant, i.e.
cov(Zs+h, Zt+h) = cov(Zs, Zt) for any choice of h.
Strong stationarity typically assumes too much. This leads us to the weaker
assumption of weak stationarity.

Mean Function
Definition (Mean Function)
The mean function of a time series {Zt } (if it exists) is given by
µt = E(Zt) = ∫_{−∞}^{∞} x ft(x) dx

Example (Mean of an iid MA(q))


Let at be iid N(0, σ²) and

Zt = at + θ1 at−1 + θ2 at−2 + · · · + θq at−q

then

µt = E(Zt ) = E(at ) + θ1 E(at−1 ) + θ2 E(at−2 ) + · · · + θq E(at−q ) ≡ 0

(free of the time variable t)
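As a quick numerical sanity check (an illustrative sketch, not part of the slides; the coefficients 0.5 and 0.3 and the sample size are arbitrary choices), one can simulate an MA(2) in R and confirm that the sample mean is near the theoretical mean of 0:

```r
set.seed(42)
n <- 10000
a <- rnorm(n + 2)   # iid N(0, 1) innovations
# Z_t = a_t + 0.5 a_{t-1} + 0.3 a_{t-2}  (illustrative theta values)
Z <- a[3:(n + 2)] + 0.5 * a[2:(n + 1)] + 0.3 * a[1:n]
mean(Z)             # close to 0 for large n
```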


White Noise

Definition (White Noise)


White noise is a collection of uncorrelated random variables with constant mean
and variance.

Notation
at ∼ WN(0, σ²) — white noise with mean zero and variance σ²
IID White Noise
If as is independent of at for all s ≠ t, then at ∼ IID(0, σ²)
Gaussian White Noise ⇒ IID
Suppose at is normally distributed. Then
uncorrelated + normality ⇒ independent,
so it follows that at ∼ IID(0, σ²) (a stronger assumption).
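A short empirical illustration (a sketch with arbitrary seed and sample size, not part of the slides): simulated Gaussian white noise has sample autocorrelation 1 at lag 0 and near 0 at every other lag, within the usual ±2/√n sampling band.

```r
set.seed(1)
a <- rnorm(5000)                              # Gaussian white noise, sigma^2 = 1
r <- acf(a, lag.max = 5, plot = FALSE)$acf    # sample autocorrelations
r[1]        # lag 0: exactly 1
r[2:6]      # lags 1..5: all near 0 (order 2/sqrt(5000) ~ 0.03)
```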

Now your turn!


Mean of a Random Walk with Drift
Suppose Z0 = 0, and for t > 0, Zt = δ + Zt−1 + at where δ is a constant and
at ∼ WN(0, σ 2 ). What is the mean function of Zt ? (That is compute E [Zt ].)

Note that

Zt = δ + Zt−1 + at
   = 2δ + Zt−2 + at + at−1        (since Zt−1 = δ + Zt−2 + at−1)
   ⋮
   = δt + ∑_{j=1}^{t} aj

Therefore we see

µt = E[Zt] = E[δt] + E[∑_{j=1}^{t} aj] = δt + ∑_{j=1}^{t} E[aj] = δt
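The result µt = δt can be checked by Monte Carlo (an illustrative sketch; δ = 0.2, t = 100, and the number of replications are arbitrary choices): averaging Z_100 over many simulated paths should give approximately δ·100 = 20.

```r
set.seed(7)
delta <- 0.2; t <- 100; nrep <- 2000
# each column is one simulated path Z_1, ..., Z_t = delta*t + cumsum(a)
Z <- apply(matrix(rnorm(t * nrep), t, nrep), 2,
           function(a) delta * (1:t) + cumsum(a))
mean(Z[t, ])   # sample mean of Z_100 across paths; theory says delta * t = 20
```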

The Covariance Function


Definition (Covariance Function)
The covariance function of a random sequence {Zt } is

γ(s, t) = cov(Zs , Zt ) = E [(Zs − µs ) (Zt − µt )]

So in particular, we have

var(Zt) = cov(Zt, Zt) = γ(t, t) = E[(Zt − µt)²]
Also note that γ(s, t) = γ(t, s) since cov(Zs , Zt ) = cov(Zt , Zs ).


Example (Covariance Function of White Noise)
Let at ∼ WN(0, σ²). By the definition of white noise, we have

γ(s, t) = E(as at) =
    σ²,  s = t
    0,   s ≠ t


Example (Covariance Function of MA(1))


Let at ∼ WN(0, σ 2 ) and Zt = at + θat−1 (where θ is a constant), then

γ(s, t) = cov(at + θat−1, as + θas−1)
        = cov(at, as) + θ cov(at, as−1) + θ cov(at−1, as) + θ² cov(at−1, as−1)
If s = t, then

γ(s, t) = γ(t, t) = σ² + θ²σ² = (θ² + 1)σ²

If s = t − 1 or s = t + 1, then

γ(s, t) = γ(t, t + 1) = γ(t, t − 1) = θσ²

So all together we have

γ(s, t) =
    (θ² + 1)σ²,  if s = t
    θσ²,         if |s − t| = 1
    0,           else
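These three values can be checked by simulation (a sketch with arbitrary θ = 0.6, σ² = 1, and sample size): the sample autocovariances of a simulated MA(1) should be close to (1 + θ²)σ² at lag 0, θσ² at lag 1, and 0 beyond.

```r
set.seed(3)
theta <- 0.6; n <- 20000
a <- rnorm(n + 1)                       # WN(0, 1) innovations
Z <- a[2:(n + 1)] + theta * a[1:n]      # MA(1): Z_t = a_t + theta * a_{t-1}
g <- acf(Z, lag.max = 2, type = "covariance", plot = FALSE)$acf
g[1]   # lag 0: ~ (1 + theta^2) * sigma^2 = 1.36
g[2]   # lag 1: ~ theta * sigma^2 = 0.6
g[3]   # lag 2: ~ 0
```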


Now your turn!


Covariance Function of a Random Walk with Drift
Suppose Z0 = 0, and for t > 0, Zt = δ + Zt−1 + at where δ is a constant and
at ∼ WN(0, σ 2 ). What is the covariance function of Zt ? (That is compute
cov (Zs , Zt ).)

From the representation

Zt = δt + ∑_{j=1}^{t} aj

we see

γ(s, t) = cov(Zs, Zt) = cov(∑_{j=1}^{s} aj, ∑_{k=1}^{t} ak)
        = ∑_{j=1}^{s} ∑_{k=1}^{t} cov(aj, ak) = min(s, t) σ²
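The formula min(s, t)σ² can be verified by Monte Carlo (a sketch; s = 30, t = 80, σ² = 1, and the replication count are arbitrary): the sample covariance of Z_30 and Z_80 across many simulated paths should be near min(30, 80) = 30.

```r
set.seed(11)
s <- 30; t <- 80; nrep <- 5000; delta <- 0.2
# each column is one drifting random-walk path of length t
paths <- replicate(nrep, delta * (1:t) + cumsum(rnorm(t)))
cov(paths[s, ], paths[t, ])   # theory: min(s, t) * sigma^2 = 30
```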


The Autocorrelation Function

Definition (Correlation Function)


The correlation function of a random sequence {Zt } is

ρ(s, t) = γ(s, t) / √(γ(s, s) γ(t, t))

Cauchy–Schwarz inequality ⇒ |ρ(s, t)| ≤ 1


Weak Stationarity

Definition (Weak Stationarity)


A weakly stationary time series is a finite-variance process that
(i) has a mean function, µ, that is constant (so it doesn’t depend on t);
(ii) has a covariance function, γ(s, t), that depends on s and t only through
the difference |s − t|.

From now on when we say stationary, we mean weakly stationary.


For stationary processes, the mean function is µ (no t).
Since γ(s, t) = γ(s + h, t + h) for stationary processes, we have
γ(s, t) = γ(s − t, 0).
Therefore we can view the covariance function as a function of only one
variable (h = s − t); this is the autocovariance function γ(h).


Autocovariance and Autocorrelation Functions

Definition (Autocovariance Function)


The autocovariance function of a stationary time series is

γh = γ(h) = E [(Zt+h − µ) (Zt − µ)]

(for any value of t).

Definition (Autocorrelation Function)


The autocorrelation function of a stationary time series is

ρh = ρ(h) = γ(h) / γ(0)


Properties of the Autocovariance/Autocorrelation Functions


γ0 = var(Zt ); ρ0 = 1
|γk | ≤ γ0 ; |ρk | ≤ 1
γk = γ−k ; ρk = ρ−k
γk and ρk are positive semidefinite (p.s.d.), i.e.

∑_{i=1}^{n} ∑_{j=1}^{n} αi αj γ|ti−tj| ≥ 0

and

∑_{i=1}^{n} ∑_{j=1}^{n} αi αj ρ|ti−tj| ≥ 0
for any set of times t1 , . . . , tn and any real numbers α1 , . . . , αn .
Proof.
Define X = ∑_{i=1}^{n} αi Zti. Then

0 ≤ var(X) = ∑_{i=1}^{n} ∑_{j=1}^{n} αi αj cov(Zti, Ztj) = ∑_{i=1}^{n} ∑_{j=1}^{n} αi αj γ|ti−tj|
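The p.s.d. property says that every matrix built from the autocovariances, G[i, j] = γ|i−j|, has nonnegative eigenvalues. A quick check in R (a sketch using the MA(1) autocovariances derived earlier, with the illustrative choice θ = 0.6, σ² = 1):

```r
theta <- 0.6
# MA(1) autocovariances: gamma_0 = (1 + theta^2), gamma_1 = theta, gamma_k = 0 for k > 1
gamma <- c(1 + theta^2, theta, rep(0, 8))
G <- toeplitz(gamma)                      # 10 x 10 matrix with G[i, j] = gamma_{|i-j|}
min(eigen(G, symmetric = TRUE)$values)    # nonnegative, so G is p.s.d.
```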


White Noise

White noise, denoted by at ∼ WN(0, σ²), is by definition a weakly stationary
process with autocovariance function

γk =
    σ²,  k = 0
    0,   k ≠ 0

and autocorrelation function

ρk =
    1,  k = 0
    0,  k ≠ 0

Not all white noise is boring like iid N(0, σ²).

For example, a stationary GARCH process can have all the second-order properties
of white noise while still being dependent.


Random Walk Simulation in R


Simulate the random walk Zt = 0.2 + Zt−1 + at where at is iid N(0, σ²).
Hint: use the representation Zt = 0.2t + ∑_{j=1}^{t} aj!

w <- rnorm(200, 0, 1)                  # iid N(0, 1) innovations
x <- cumsum(w)                         # random walk without drift
wd <- w + 0.2                          # innovations plus drift
xd <- cumsum(wd)                       # random walk with drift
plot.ts(xd, ylim = c(-5, 55))          # drifting walk
lines(x)                               # walk without drift, for comparison
lines(0.2 * (1:200), lty = "dashed")   # mean function mu_t = 0.2 t


Homework 1b

Read the following sections from the textbook


§2.5: Estimation of the Mean, Autocovariances, and Autocorrelations
§2.6: Moving Average and Autoregressive Representations of Time Series
Processes
Do the following exercise.
Prove that a time series of iid random variables is strongly stationary.
