Kirk20130318 Book Club StochProcesses
19/03/2013
Probability
Consider a 6-sided die, and let X be the value we get when we roll it. The probability distribution of X describes the likelihood that X will take each of the values 1, . . . , 6. For a fair die, each value occurs 1/6 of the time.
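As a quick numerical sketch (not part of the original slides), we can roll a simulated fair die many times and check that each face appears roughly 1/6 of the time:

```python
import random
from collections import Counter

random.seed(0)  # reproducible rolls

N = 60_000
rolls = [random.randint(1, 6) for _ in range(N)]
freqs = {face: count / N for face, count in Counter(rolls).items()}

for face in range(1, 7):
    # each face should occur roughly 1/6 ≈ 0.167 of the time
    print(face, round(freqs[face], 3))
```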
Paul Kirk
Mutually exclusive events

X can take precisely one of the values between 1 and 6, so events such as {X = 3} and {X = 4} are mutually exclusive. Probabilities of mutually exclusive events sum:

Prob(X = 3 or X = 4) = Prob(X = 3) + Prob(X = 4).
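A short exact check of the sum rule (a sketch added here, not from the slides), using Python's Fraction to avoid rounding:

```python
from fractions import Fraction

# uniform pmf of a fair six-sided die
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# {X = 3} and {X = 4} are mutually exclusive, so their probabilities sum
p_3_or_4 = sum(p for x, p in pmf.items() if x in (3, 4))
print(p_3_or_4)  # 1/3
assert p_3_or_4 == Fraction(1, 6) + Fraction(1, 6)
```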
So, rather than writing Prob(X = 3) or Prob(X = x), we will (in later
chapters) tend to write Prob(3) or Prob(x).
Expectation
The expectation of an arbitrary function, f(X), with respect to the distribution P(x), is

⟨f(X)⟩ = ∫ P(x) f(x) dx.
Variance

The variance measures the spread of X about the mean:

V[X] = ∫ P(x)(x − ⟨X⟩)² dx
     = ∫ P(x)(x² + ⟨X⟩² − 2x⟨X⟩) dx
     = ∫ P(x)x² dx + ∫ P(x)⟨X⟩² dx − ∫ P(x)2x⟨X⟩ dx
     = ⟨X²⟩ + ⟨X⟩² ∫ P(x) dx − 2⟨X⟩ ∫ P(x)x dx
     = ⟨X²⟩ + ⟨X⟩² − 2⟨X⟩²
     = ⟨X²⟩ − ⟨X⟩².
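The identity V[X] = ⟨X²⟩ − ⟨X⟩² can be sanity-checked numerically (a sketch, not from the slides): both formulas agree, up to floating-point error, on the same sample.

```python
import random

random.seed(1)
xs = [random.gauss(2.0, 3.0) for _ in range(200_000)]  # mean 2, sd 3

n = len(xs)
mean_x  = sum(xs) / n
mean_x2 = sum(x * x for x in xs) / n

# direct definition: V[X] = <(X - <X>)^2>
var_direct   = sum((x - mean_x) ** 2 for x in xs) / n
# identity derived above: V[X] = <X^2> - <X>^2
var_identity = mean_x2 - mean_x ** 2

print(round(var_direct, 3), round(var_identity, 3))  # both ≈ 9
```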
1.2 Independence

Independent probabilities multiply.
For independent variables, P_{X,Y}(x, y) = P_X(x) P_Y(y).
For independent variables, ⟨XY⟩ = ⟨X⟩⟨Y⟩.
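A quick simulation sketch (added, not from the slides): draw X and Y independently and check that ⟨XY⟩ matches ⟨X⟩⟨Y⟩.

```python
import random

random.seed(2)
N = 100_000
xs = [random.uniform(0, 1) for _ in range(N)]
ys = [random.uniform(0, 1) for _ in range(N)]  # drawn independently of xs

mean_xy = sum(x * y for x, y in zip(xs, ys)) / N
mean_x  = sum(xs) / N
mean_y  = sum(ys) / N

# for independent X and Y, <XY> should match <X><Y> (here both ≈ 0.25)
print(round(mean_xy, 3), round(mean_x * mean_y, 3))
```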
Dependence

If X and Y are dependent, then P_{X,Y}(x, y) does not factor as the product P_X(x) P_Y(y). The conditional probability of X given Y is

P_{X|Y}(x | y) = P_{X,Y}(X = x, Y = y) / P_Y(Y = y).
The correlation coefficient between X and Y is

ρ = (⟨XY⟩ − ⟨X⟩⟨Y⟩) / √(V[X] V[Y]).
If Z = X + Y, then

P_Z(z) = ∫ P_{X,Y}(z − s, s) ds = ∫ P_{X,Y}(s, z − s) ds.

If X and Y are independent, this becomes the convolution

P_Z(z) = ∫ P_X(z − s) P_Y(s) ds = (P_X ∗ P_Y)(z).
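The discrete analogue of this convolution can be checked exactly for the sum of two fair dice (a sketch, not from the slides):

```python
from fractions import Fraction

# pmf of one fair die
p = {x: Fraction(1, 6) for x in range(1, 7)}

# P_Z(z) = sum_s P_X(z - s) P_Y(s): the discrete convolution
pz = {}
for z in range(2, 13):
    pz[z] = sum(p.get(z - s, Fraction(0)) * ps for s, ps in p.items())

print(pz[7])  # 1/6: seven is the most likely total
assert sum(pz.values()) == 1
```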
Consider independent random variables X_1, . . . , X_N, each with the same variance σ² = E[X_n²] − (E[X_n])². Moreover,

V[X_n/N] = E[(X_n/N)²] − (E[X_n/N])² = (E[X_n²] − (E[X_n])²)/N² = V[X_n]/N².

So,

V[Σ_{n=1}^N X_n/N] = Σ_{n=1}^N V[X_n]/N² = (1/N²) Σ_{n=1}^N σ² = σ²/N.
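The σ²/N scaling of the sample mean can be illustrated by simulation (a sketch, not from the slides), here with uniform variables whose variance is 1/12:

```python
import random

random.seed(3)
N = 100          # number of variables averaged
TRIALS = 20_000  # number of independent sample means

# each X_n is uniform on [0, 1], so sigma^2 = 1/12
means = [sum(random.uniform(0, 1) for _ in range(N)) / N for _ in range(TRIALS)]

m = sum(means) / TRIALS
var_of_mean = sum((v - m) ** 2 for v in means) / TRIALS

# the observed variance of the mean should be close to sigma^2 / N = 1/1200
print(round(var_of_mean, 5), round((1 / 12) / N, 5))
```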
If Y = g(X) with g monotonic, then

⟨f(Y)⟩ = ∫_{x=a}^{x=b} P_X(x) f(g(x)) dx = ∫_{y=g(a)}^{y=g(b)} P_Y(y) f(y) dy,

where

P_Y(y) = P_X(g⁻¹(y)) / |g′(g⁻¹(y))|.
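A numerical sketch of the change-of-variables formula (added, not from the slides): take X uniform on [0, 1] and Y = g(X) = exp(X), so P_Y(y) = 1/y on [1, e], and compute ⟨f(Y)⟩ both ways for f(y) = y. Both integrals should give e − 1.

```python
import math

def riemann(f, a, b, n=100_000):
    # simple midpoint-rule quadrature
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# X uniform on [0, 1]; Y = g(X) = exp(X), a monotone map, so
# P_Y(y) = P_X(g^{-1}(y)) / |g'(g^{-1}(y))| = 1 / y   for y in [1, e]
lhs = riemann(lambda x: math.exp(x), 0.0, 1.0)       # ∫ P_X(x) f(g(x)) dx
rhs = riemann(lambda y: (1.0 / y) * y, 1.0, math.e)  # ∫ P_Y(y) f(y) dy

print(round(lhs, 6), round(rhs, 6))  # both ≈ e - 1 ≈ 1.718282
```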
The probability density is the derivative of the cumulative distribution function, D(x):

P(x) = (d/dx) D(x).
The moment generating function of X is

M(t) = ⟨e^{tX}⟩ = ∫ (1 + tx + (1/2!) t²x² + . . .) P(x) dx
     = 1 + t m_1 + (1/2!) t² m_2 + . . . + (1/r!) t^r m_r + . . . ,

where m_r = ⟨X^r⟩ is the r-th (raw) moment.
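A numerical sketch (added, not from the slides): for a fair die, M(t) = (1/6) Σ e^{tk}, and finite differences at t = 0 recover the raw moments m_1 = 3.5 and m_2 = 91/6.

```python
import math

def M(t):
    # moment generating function of a fair die: <e^{tX}>
    return sum(math.exp(t * k) for k in range(1, 7)) / 6

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)          # M'(0)  = first raw moment
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2  # M''(0) = second raw moment

print(round(m1, 3), round(m2, 3))  # ≈ 3.5 and ≈ 91/6 ≈ 15.167
```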
Taking the derivative of the logarithm of M(t) gives

R(t) = (d/dt) log M(t) = M′(t)/M(t),

and differentiating again,

R′(t) = [M(t) M″(t) − (M′(t))²] / (M(t))².

At t = 0 these give the mean and the variance, respectively.
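A sketch of this (added, not from the slides): for a fair die, the first two derivatives of log M(t) at t = 0, estimated by finite differences, give the mean 3.5 and the variance 35/12.

```python
import math

def M(t):
    # MGF of a fair six-sided die
    return sum(math.exp(t * k) for k in range(1, 7)) / 6

def logM(t):
    return math.log(M(t))

h = 1e-4
# R(0) = M'(0)/M(0): first derivative of log M at 0, i.e. the mean
r0 = (logM(h) - logM(-h)) / (2 * h)
# R'(0) = [M M'' - (M')^2]/M^2 at 0: second derivative of log M, i.e. the variance
r1 = (logM(h) - 2 * logM(0) + logM(-h)) / h**2

print(round(r0, 3), round(r1, 3))  # ≈ 3.5 and ≈ 35/12 ≈ 2.917
```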
The characteristic function can be expanded as

φ(s) = Σ_{n=0}^∞ φ⁽ⁿ⁾(0) sⁿ/n!,

where φ⁽ⁿ⁾(s) is the n-th derivative of φ(s). But we also have:

φ(s) = ⟨e^{isX}⟩ = ⟨Σ_{n=0}^∞ (isX)ⁿ/n!⟩ = Σ_{n=0}^∞ iⁿ ⟨Xⁿ⟩ sⁿ/n!,

and hence, equating coefficients,

⟨Xⁿ⟩ = φ⁽ⁿ⁾(0)/iⁿ.
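A numerical sketch of ⟨X⟩ = φ′(0)/i (added, not from the slides), again for a fair die, using a complex-valued central difference:

```python
import cmath

def phi(s):
    # characteristic function of a fair die: <e^{isX}>
    return sum(cmath.exp(1j * s * k) for k in range(1, 7)) / 6

h = 1e-5
# <X> = phi'(0) / i, estimated by a central difference
m1 = ((phi(h) - phi(-h)) / (2 * h)) / 1j

print(round(m1.real, 3))  # ≈ 3.5 (mean of a fair die)
```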
Cumulants

The n-th order cumulant of X is the n-th derivative of the log of the characteristic function, evaluated at zero (up to a factor of iⁿ).
For independent random variables, X and Y, if Z = X + Y then the n-th cumulant of Z is the sum of the n-th cumulants of X and Y.
The Gaussian distribution is also the only absolutely continuous distribution all of whose cumulants beyond the first two (i.e. other than the mean and variance) are zero.
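The additivity of cumulants under independent sums can be checked exactly for discrete distributions (a sketch, not from the slides): for n ≤ 3 the n-th cumulant equals the n-th central moment (with the first being the mean), so we can compare directly.

```python
from fractions import Fraction

def cumulants_123(pmf):
    # first three cumulants of a discrete pmf: mean, variance, and third
    # central moment (for n <= 3 the cumulant equals the central moment)
    mean = sum(x * p for x, p in pmf.items())
    k2 = sum((x - mean) ** 2 * p for x, p in pmf.items())
    k3 = sum((x - mean) ** 3 * p for x, p in pmf.items())
    return mean, k2, k3

# a deliberately skewed three-point distribution, and a fair die
px = {0: Fraction(1, 2), 1: Fraction(1, 3), 5: Fraction(1, 6)}
py = {x: Fraction(1, 6) for x in range(1, 7)}

# pmf of Z = X + Y for independent X, Y (discrete convolution)
pz = {}
for x, p in px.items():
    for y, q in py.items():
        pz[x + y] = pz.get(x + y, Fraction(0)) + p * q

for n in range(3):
    # n-th cumulant of Z equals the sum of the n-th cumulants of X and Y
    print(cumulants_123(pz)[n] == cumulants_123(px)[n] + cumulants_123(py)[n])
```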
For a zero-mean Gaussian random variable, all odd moments vanish:

⟨X^(2n−1)⟩ = 0.
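A deterministic numerical sketch of this (added, not from the slides): integrating xⁿ against the standard Gaussian density on a symmetric grid gives (numerically) zero for odd n, and the familiar value 1 for n = 2.

```python
import math

def gauss_moment(n, a=-10.0, b=10.0, steps=200_000):
    # <X^n> for a standard (zero-mean, unit-variance) Gaussian,
    # by midpoint integration on a symmetric grid
    h = (b - a) / steps
    total = 0.0
    for k in range(steps):
        x = a + (k + 0.5) * h
        total += x**n * math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
    return total * h

m1, m2, m3 = gauss_moment(1), gauss_moment(2), gauss_moment(3)
print(round(m1, 6), round(m2, 6), round(m3, 6))  # ≈ 0, 1, 0
```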