
Stat 150 Stochastic Processes

Spring 2009

Lecture 28: Brownian Bridge


Lecturer: Jim Pitman

From last time: Problem: Find the hitting probability for BM with drift $\mu$: with $D_t := a + \mu t + B_t$, find $P_a(D_t \text{ hits } b \text{ before } 0)$.

Idea: Find a suitable MG. Found $e^{-2\mu D_t}$ is a MG. Use this: under $P_a$, start at $a$, so $E_a e^{-2\mu D_0} = e^{-2\mu a}$. Let $T$ = first time $D_t$ hits $0$ or $b$. Stop the MG $M_t = e^{-2\mu D_t}$ at time $T$ and look at $(M_{t \wedge T}, t \geq 0)$. For simplicity, take $\mu > 0$; then for $0 \leq D_t \leq b$,
$$1 \geq e^{-2\mu D_t} \geq e^{-2\mu b} > 0,$$
so for $0 \leq t \leq T$ we get a MG with values in $[0,1]$ when we stop at $T$. The final value of the MG is
$$M_T = 1 \cdot \mathbf{1}(\text{hit } 0 \text{ before } b) + e^{-2\mu b} \, \mathbf{1}(\text{hit } b \text{ before } 0).$$
By optional stopping,
$$e^{-2\mu a} = E_a M_T = 1 \cdot P(\text{hit } 0) + e^{-2\mu b} P(\text{hit } b) = 1 - P(\text{hit } b) + e^{-2\mu b} P(\text{hit } b),$$
which gives
$$P(\text{hit } b) = \frac{1 - e^{-2\mu a}}{1 - e^{-2\mu b}}.$$

Equivalently, starting at $0$: $P(B_t + \mu t \text{ ever reaches } -a) = e^{-2\mu a}$. Let $M := -\min_t (B_t + \mu t)$; then $P(M \geq a) = e^{-2\mu a}$, so $M \sim \mathrm{Exp}(2\mu)$. Note the memoryless property:
$$P(M > a + b) = P(M > a) P(M > b).$$
This can be understood in terms of the strong Markov property of the drifting BM at its first hitting time of $-a$: to get below $-(a+b)$ the process must first get down to $-a$, and the process thereafter must get down a further amount $b$.
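As a sanity check on this formula, here is a minimal Monte Carlo sketch (not from the lecture; the parameter values and step size are assumptions, and the Euler discretization overshoot introduces a small bias):

```python
# Minimal Monte Carlo check of P_a(hit b before 0) = (1 - e^{-2 mu a}) / (1 - e^{-2 mu b})
# for the drifting BM D_t = a + mu*t + B_t. Parameters and step size are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def hits_b_before_0(a, b, mu, dt=1e-3):
    """Euler walk of D_t = a + mu*t + B_t until it exits (0, b); True if it exits at b."""
    x, sd = a, np.sqrt(dt)
    while 0.0 < x < b:
        x += mu * dt + sd * rng.standard_normal()
    return x >= b

a, b, mu, trials = 1.0, 2.0, 0.5, 2000
est = np.mean([hits_b_before_0(a, b, mu) for _ in range(trials)])
exact = (1 - np.exp(-2 * mu * a)) / (1 - np.exp(-2 * mu * b))
print(f"simulated {est:.3f} vs exact {exact:.3f}")  # both near 0.731
```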


Brownian Bridge

This is a process obtained by conditioning on the value of $B$ at a fixed time, say time $= 1$. Look at $(B_t, 0 \leq t \leq 1 \mid B_1)$, where $B_0 = 0$.

Key observation: $E(B_t \mid B_1) = tB_1$.

To see this, write $B_t = tB_1 + (B_t - tB_1)$, and observe that $tB_1$ and $B_t - tB_1$ are independent. This independence is due to the fact that $B$ is a Gaussian process, so two linear combinations of values of $B$ at fixed times are independent if and only if their covariance is zero, and we can compute:
$$E(B_1(B_t - tB_1)) = E(B_t B_1) - t\,E(B_1^2) = t - t \cdot 1 = 0$$

since $E(B_s B_t) = s$ for $s \leq t$. Continuing in this vein, let us argue that the process $(B_t - tB_1, 0 \leq t \leq 1)$ and the r.v. $B_1$ are independent.

Proof: Let $\hat{B}(t_i) := B_{t_i} - t_i B_1$ at fixed times $t_i \in [0,1]$, and take any linear combination:
$$E\Big(\Big(\sum_i \lambda_i \hat{B}(t_i)\Big) B_1\Big) = \sum_i \lambda_i E(\hat{B}(t_i) B_1) = \sum_i \lambda_i \cdot 0 = 0.$$
This shows that every linear combination of values of $\hat{B}$ is independent of $B_1$. It follows by measure theory (proof beyond the scope of this course) that $B_1$ is independent of every finite-dimensional vector $(\hat{B}(t_i), 0 \leq t_1 < \cdots < t_n \leq 1)$, and hence that $B_1$ is independent of the whole process $(\hat{B}(t), 0 \leq t \leq 1)$.

This process $\hat{B}(t) := B_t - tB_1$, $0 \leq t \leq 1$, is called the Brownian bridge. Notice: $\hat{B}(0) = \hat{B}(1) = 0$. Intuitively, $\hat{B}$ is $B$ conditioned on $B_1 = 0$. Moreover, $B$ conditioned on $B_1 = b$ is $(\hat{B}(t) + tb, 0 \leq t \leq 1)$. These assertions about conditioning on events of probability zero can be made rigorous either in terms of limiting statements about conditioning on e.g. $B_1 \in (b, b + \epsilon)$ and letting $\epsilon$ tend to $0$, or by integration, similar to the interpretation of conditional distributions and densities.
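The definition $\hat{B}(t) := B_t - tB_1$ translates directly into simulation. A minimal sketch, assuming a uniform grid on $[0,1]$ (the grid size and variable names are illustrative, not from the notes):

```python
# Build a Brownian bridge path from a BM path via Bhat(t) = B_t - t*B_1,
# then shift it to a bridge conditioned on B_1 = b. Grid choices are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 1000                                       # grid steps on [0, 1]
t = np.linspace(0.0, 1.0, n + 1)
dB = rng.standard_normal(n) * np.sqrt(1.0 / n)
B = np.concatenate([[0.0], np.cumsum(dB)])     # B_0 = 0, ..., B_1
bridge = B - t * B[-1]                         # Bhat(t) = B_t - t*B_1
assert bridge[0] == 0.0 and bridge[-1] == 0.0  # Bhat(0) = Bhat(1) = 0

b_target = 0.7                                 # B conditioned on B_1 = b is
conditioned = bridge + t * b_target            # Bhat(t) + t*b
```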


Properties of the bridge $\hat{B}$:

(1) Continuous paths.

(2) Any linear combination
$$\sum_i \lambda_i \hat{B}(t_i) = \sum_i \lambda_i (B(t_i) - t_i B(1)) = \sum_i \lambda_i B(t_i) - \Big(\sum_i \lambda_i t_i\Big) B(1)$$
is again some linear combination of values of $B$, so $\hat{B}(\cdot)$ is a Gaussian process. As such it is determined by its mean and covariance functions: $E(\hat{B}(t)) = 0$, and for $0 \leq s < t \leq 1$,
\begin{align*}
E(\hat{B}(s)\hat{B}(t)) &= E[(B_s - sB_1)(B_t - tB_1)] \\
&= E(B_s B_t) - s E(B_t B_1) - t E(B_s B_1) + st\,E(B_1^2) \\
&= s - st - st + st = s(1-t).
\end{align*}
Note the symmetry between $s$ and $1-t$: $(\hat{B}(1-t), 0 \leq t \leq 1) \stackrel{d}{=} (\hat{B}(t), 0 \leq t \leq 1)$.

(3) $\hat{B}$ is an inhomogeneous Markov process. (Exercise: describe its transition rules, i.e. the conditional distribution of $\hat{B}_t$ given $\hat{B}_s$ for $0 < s < t$.)
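The mean and covariance formulas can be checked empirically by averaging over many simulated bridge paths; a sketch (path counts, grid size, and the test points $s, t$ are arbitrary assumptions):

```python
# Empirical check that E[Bhat(s)Bhat(t)] = s(1 - t) for s = 0.3, t = 0.7.
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps = 20000, 100
t = np.linspace(0.0, 1.0, n_steps + 1)
dB = rng.standard_normal((n_paths, n_steps)) * np.sqrt(1.0 / n_steps)
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])
bridge = B - t * B[:, -1:]                    # one bridge per row

s_idx, t_idx = 30, 70                         # grid indices of s = 0.3, t = 0.7
emp = np.mean(bridge[:, s_idx] * bridge[:, t_idx])
print(f"empirical {emp:.4f} vs s(1-t) = {0.3 * (1 - 0.7):.4f}")  # both near 0.09
```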

Application to statistics

Idea: How to test if variables $X_1, X_2, \ldots$ are like an independent sample from some continuous cumulative distribution function $F$? The empirical distribution is
$$F_n(x) := \text{fraction of } X_i,\ 1 \leq i \leq n,\ \text{with values} \leq x.$$
$F_n(x)$ is a random function of $x$ which increases by $1/n$ at each data point. If the $X_i$'s are IID($F$), then $F_n(x) \to F(x)$ as $n \to \infty$ by the LLN. In fact, $\sup_x |F_n(x) - F(x)| \to 0$ (Glivenko-Cantelli theorem). Now $\mathrm{Var}(F_n(x)) = F(x)(1 - F(x))/n$. So consider
$$D_n := \sqrt{n}\, \sup_x |F_n(x) - F(x)|$$
as a test statistic for data distributed as $F$.

Key observation: Let $U_i := F(X_i) \sim U[0,1]$. The distribution of $D_n$ does not depend on $F$ at all. To compute it, we may as well assume that $F(x) = x$, $0 \leq x \leq 1$; that is, sampling from $U[0,1]$. Then
$$D_n = \sqrt{n}\, \sup_u |F_n(u) - u|, \qquad \text{where } F_n(u) = \frac{1}{n} \sum_{i=1}^n \mathbf{1}(U_i \leq u).$$
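Because $F_n$ only jumps at the order statistics $U_{(1)} \leq \cdots \leq U_{(n)}$, the supremum $\sup_u |F_n(u) - u|$ is attained at one side of a jump, so $D_n$ is computable from the sorted sample. A sketch (sample size and seed are assumptions):

```python
# Compute D_n = sqrt(n) * sup_u |F_n(u) - u| for a uniform sample.
import numpy as np

rng = np.random.default_rng(3)
n = 100
u = np.sort(rng.uniform(size=n))   # order statistics U_(1) <= ... <= U_(n)
i = np.arange(1, n + 1)
# F_n jumps from (i-1)/n to i/n at u[i-1], so check both sides of each jump.
d_plus = np.max(i / n - u)
d_minus = np.max(u - (i - 1) / n)
D_n = np.sqrt(n) * max(d_plus, d_minus)
print(f"D_n = {D_n:.3f}")
```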


Idea: Look at the process $X_n(t) := (\sqrt{n}(F_n(t) - t), 0 \leq t \leq 1)$. You can easily check for $0 \leq s \leq t \leq 1$:
$$E(X_n(t)) = 0, \qquad \mathrm{Var}(X_n(t)) = t(1-t), \qquad E[X_n(s)X_n(t)] = s(1-t).$$
We see $(X_n(t), 0 \leq t \leq 1) \stackrel{d}{\to} (\hat{B}(t), 0 \leq t \leq 1)$ in the sense of convergence in distribution of all finite-dimensional distributions. In fact, with care from linear combinations,
$$(X_n(t_1), \ldots, X_n(t_k)) \stackrel{d}{\to} (\hat{B}(t_1), \ldots, \hat{B}(t_k)),$$
the left side centered and scaled discrete, the right side Gaussian. Easy: for any choice of $\lambda_i$ and $t_i$, $1 \leq i \leq m$, $\sum_{i=1}^m \lambda_i X_n(t_i)$ is a centered sum of $n$ IID random variables $\to$ Normal by the usual CLT. And it can be shown that also:
$$D_n := \sup_t |X_n(t)| \stackrel{d}{\to} \sup_t |\hat{B}(t)|.$$
The formula for the distribution function of the limit is a theta function series which has been widely tabulated. The convergence in distribution to an explicit limit distribution is due to Kolmogorov and Smirnov in the 1930s. Identification of the limit in terms of Brownian bridge was made by Doob in the 1940s. Rigorous derivation of a family of such limit results for various functionals of random walks converging to Brownian motion is due to Donsker in the 1950s (Donsker's theorem).

Theorem (Dvoretzky-Erdős-Kakutani): With probability 1 the path of a BM is nowhere differentiable.
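For reference, the theta series in question is the Kolmogorov distribution, $P(\sup_t |\hat{B}(t)| \leq x) = 1 - 2\sum_{k=1}^{\infty} (-1)^{k-1} e^{-2k^2 x^2}$; this is a standard fact, stated here as a supplement to the notes. A sketch comparing the series against Monte Carlo samples of $D_n$ at a single point $x$ (sample sizes and $x$ are assumptions):

```python
# Compare the Kolmogorov theta series with simulated values of D_n at x = 1.0.
import numpy as np

def kolmogorov_cdf(x, terms=100):
    """P(sup_t |Bhat(t)| <= x) = 1 - 2 * sum_{k>=1} (-1)^(k-1) * exp(-2 k^2 x^2)."""
    k = np.arange(1, terms + 1)
    return 1 - 2 * np.sum((-1.0) ** (k - 1) * np.exp(-2 * k**2 * x**2))

rng = np.random.default_rng(4)
n, reps, x = 500, 5000, 1.0
i = np.arange(1, n + 1)
hits = 0
for _ in range(reps):
    u = np.sort(rng.uniform(size=n))
    d = max(np.max(i / n - u), np.max(u - (i - 1) / n))
    hits += (np.sqrt(n) * d <= x)
print(f"P(D_n <= {x}): simulated {hits / reps:.3f} vs limit {kolmogorov_cdf(x):.3f}")
# Both values should be near 0.730.
```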
