
Mathematical Finance

17.01.2020

1. Probability

2. Conditional Probability

3. Independent Events

4. Random Variable

5. Expectation

6. Variance

7. Covariance

8. Correlation

1 Probability
A French nobleman of the 17th century who was interested in gambling raised
the question: Is it favourable to bet on the occurrence of at least one double
six in 24 throws of a pair of dice?

Blaise Pascal and Pierre de Fermat took an interest in the problem, which led
to the development of probability theory. In the same century, the other mathe-
maticians who contributed to the theory were Jakob Bernoulli and Abraham
de Moivre. Only in 1933 was an axiomatic definition of probability intro-
duced, using measure theory, by the Russian mathematician Kolmogorov.
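De Méré's question can be checked numerically. A minimal Python sketch (the simulation, seed and trial count are illustrative additions, not part of the notes) computes the exact probability and a Monte Carlo estimate:

```python
import random

# Exact calculation: the chance of no double six in a single throw of a
# pair of dice is 35/36, so over 24 independent throws
# P(at least one double six) = 1 - (35/36)**24.
p_exact = 1 - (35 / 36) ** 24

# Monte Carlo estimate (seed and trial count chosen for illustration).
random.seed(1)
trials = 100_000
hits = 0
for _ in range(trials):
    if any(random.randint(1, 6) == 6 and random.randint(1, 6) == 6
           for _ in range(24)):
        hits += 1
p_sim = hits / trials

print(round(p_exact, 4))  # 0.4914: slightly below 1/2, so the bet is unfavourable
```

The exact value is just under one half, which is why the bet puzzled de Méré.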

Sample Space S: Set of all possible outcomes of an experiment


Example1: Tossing a coin, S = {H, T }
Example2: Throwing a die, S = {1, 2, 3, 4, 5, 6}
Example3: Throwing two dice, S = {(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6), (2, 1), (2, 2), ...}
Example4: Selecting a chip from a set of 10 chips numbered 1 to 10
Elements of a sample space are called elementary events.
A subset A of S is called an event.
If A ∩ B = ∅, A and B are mutually exclusive.
AC is the complement of A, i.e., S − A.

Probability:
Probability of an event A is denoted as P (A). P (S) = 1.
a posteriori
P (A) = lim_{n→∞} m/n
where n is the total number of trials and m is the number of occurrences favor-
able to A.
Example5: The probability that the share price of a company increases as com-
pared to the previous day can be estimated by computing the proportion of
days on which the share showed an increase.
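The a posteriori recipe can be illustrated with a quick simulation. A minimal sketch in Python (the fair-coin setup, seed and sample sizes are assumptions for illustration):

```python
import random

# A posteriori: the relative frequency m/n settles near P(A) as n grows.
# Illustration with a fair coin, where the true P(Heads) is 0.5.
random.seed(0)

def relative_frequency(n):
    # m = number of occurrences favourable to A (here, "heads")
    m = sum(random.random() < 0.5 for _ in range(n))
    return m / n

for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))  # estimates approach 0.5
```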

a priori
P (A) is determined on the basis of reasoning.
Example6: Let S = {H, T } in the coin-tossing experiment. We can assign
probabilities 0.5 each to the elementary events H and T, if we know that it
is a fair coin and H and T are equally likely.

Question1: In a throw of a pair of dice, what is the probability of getting
a sum of 7 ?
Question2: The probability of occurrence of i, i = 1, 2, ..., 6, in a throw of a
die is P (i) = 1/6. Let A = {2, 3, 4, 5, 6}. What is P (A) ?
Question3: If S = {1, 2, ..., 6}, is it possible that P (i) = i/21, i = 1, 2, ..., 5
and P (6) = 7/21 ?

1. Probability of the sample space is 1

2. P (AC ) = 1 − P (A)

3. P (A ∪ B) = P (A) + P (B) − P (AB).
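Rules 2 and 3 can be verified by direct enumeration. A small Python check on a pair of fair dice (the choice of events A and B is illustrative):

```python
from itertools import product

# Enumerate the 36 equally likely outcomes of a pair of fair dice.
S = list(product(range(1, 7), repeat=2))

def P(event):
    return sum(1 for s in S if event(s)) / len(S)

A = lambda s: s[0] % 2 == 0   # first die shows an even number
B = lambda s: sum(s) == 7     # the sum is 7

# Rule 2: P(A^C) = 1 - P(A)
assert P(lambda s: not A(s)) == 1 - P(A)

# Rule 3: P(A ∪ B) = P(A) + P(B) - P(AB)
lhs = P(lambda s: A(s) or B(s))
rhs = P(A) + P(B) - P(lambda s: A(s) and B(s))
assert abs(lhs - rhs) < 1e-12
```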

Question4: Let S = {1, 2, ..., 5}. Let P (1) = 0.4, P (2) = 0.21, P (3) =
0.004, P (4) = 0.034 and P (5) = 0.352. What is P (i ≥ 2) ?

Question5: Let S = {0, 1, 2, ...}. If P (i) = βα^(i−1) , what is β in terms
of α ?

Example7: Suppose the probability that the SEBI index increases on a day
is 0.54 and that it increases on two successive days is 0.28. What is the
probability that it does not increase on either day?
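Example7 can be worked out with rule 3 and the complement rule; a short sketch in Python:

```python
# A = index increases on day 1, B = index increases on day 2.
p_a = p_b = 0.54   # P(A) = P(B)
p_ab = 0.28        # P(AB), increase on both days

# P(neither) = P((A ∪ B)^C) = 1 - [P(A) + P(B) - P(AB)]
p_neither = 1 - (p_a + p_b - p_ab)
print(round(p_neither, 2))  # 0.2
```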

1.1 Conditional probability
1. The conditional probability of A given B is denoted P (A|B):
   P (A|B) = P (AB)/P (B).
   Conditional probability is calculated by taking the given event B as
   the sample space.

2. P (AB) = P (B)P (A|B)

3. P (ABC) = P (A)P (B|A)P (C|AB)

4. Let B1 , B2 and B3 be disjoint events and B = B1 ∪ B2 ∪ B3 . If
   A ⊂ B, then
   P (A) = P (B1 )P (A|B1 ) + P (B2 )P (A|B2 ) + P (B3 )P (A|B3 ).
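Item 4 (the law of total probability) in numbers; the partition probabilities below are made-up values for illustration:

```python
# Disjoint events B1, B2, B3 with B = B1 ∪ B2 ∪ B3 and A ⊂ B.
p_B = {"B1": 0.5, "B2": 0.3, "B3": 0.2}          # P(Bi)
p_A_given_B = {"B1": 0.1, "B2": 0.4, "B3": 0.5}  # P(A|Bi)

# P(A) = Σ P(Bi) P(A|Bi)
p_A = sum(p_B[b] * p_A_given_B[b] for b in p_B)
print(round(p_A, 2))  # 0.05 + 0.12 + 0.10 = 0.27
```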

Example8: In a throw of a fair die, the probability of occurrence of 2, given
that an even number occurs, is (1/6) / (3/6) = 1/3.
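Example8 amounts to counting inside the reduced sample space; a one-line check in Python:

```python
# Conditioning on B = "an even number occurs" makes B the new sample space.
B = {2, 4, 6}
A = {2}
p = len(A & B) / len(B)   # P(A|B) = |A ∩ B| / |B| for equally likely outcomes
print(p)  # 1/3 ≈ 0.3333
```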

Question5: What is the probability that 1 occurs in a throw of a die given
that an even number occurs ?

Question6: An urn contains 4 white and 6 red balls. 2 balls are drawn
one after the other without replacement. Given that the first draw turns out
to be a white ball, what is the probability that the second draw results in a
white ball ?

Question7: An urn contains 10 chips numbered 1, 2, ..., 10. Three chips are
drawn from the urn without replacement. What is the probability that the
three draws produce 4, 7 and 8 in the first, second and third draws respectively ?

Question8: There are two urns, the first containing 2 white and 3 red balls,
and the second 4 white and 3 red balls. A fair coin is tossed and the first urn
is selected when a Head turns up; otherwise, the second urn is selected. A
ball is drawn at random from the selected urn. Given that a red ball is drawn
in this process, what is the probability that it is drawn from the first urn ?

1.2 Independent events
A and B are independent if P (AB) = P (A)P (B).
When A and B are independent, P (A|B) = P (A) and P (B|A) = P (B).
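The definition can be checked by enumeration; for two fair dice the events "first die even" and "second die even" are independent (an illustrative choice):

```python
from itertools import product

S = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes

def P(event):
    return sum(1 for s in S if event(s)) / len(S)

A = lambda s: s[0] % 2 == 0   # first die even
B = lambda s: s[1] % 2 == 0   # second die even

# P(AB) = P(A)P(B) holds, so A and B are independent.
assert P(lambda s: A(s) and B(s)) == P(A) * P(B) == 0.25
```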

Question9: The probabilities that three different call options are gainful are
0.5, 0.7 and 0.1 respectively. If investments are made in all three call
options, what is the probability that at least one call option is gainful ?

Question10: Solve Chevalier de Méré's problem.

1.3 Random variable
A random variable X is a function that maps S into the number line.
Example9: S = {H, T } in the coin-tossing experiment. Let X be the map:
T → 0 and H → 1.
Example10: In the experiment of throwing a die, S = {e1 , e2 , ..., e6 }. Let
X be the map: ei → i.
Example11: Tossing 10 coins can result in i Heads, i = 0, 1, ..., 10, which
we denote as ei . Here, S = {e0 , e1 , ..., e10 }. One choice of a random
variable is X, which takes on the values 0, 1, ..., 10.
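A random variable is just a function on S; Examples 9 and 10 can be written directly (a sketch, with outcomes encoded as strings):

```python
# Example9: coin toss, T -> 0 and H -> 1.
X = {"T": 0, "H": 1}

# Example10: die throw with outcomes e1, ..., e6, mapped by ei -> i.
S_die = ["e1", "e2", "e3", "e4", "e5", "e6"]
Y = {e: i for i, e in enumerate(S_die, start=1)}

print(X["H"], Y["e4"])  # 1 4
```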

1.4 Expectation
The expected value of a random variable X, denoted E(X), is the weighted
average of its values, the weights being the corresponding probabilities:
E(X) = Σ xi P (xi ), where xi and P (xi ) are the values taken on by X and the
corresponding probabilities respectively.
In Example 9, E(X) = 0 ∗ 0.5 + 1 ∗ 0.5 = 0.5.
In Example 10, E(X) = 1 ∗ (1/6) + 2 ∗ (1/6) + ... + 6 ∗ (1/6) = 21/6 = 3.5.

1. E(X + Y ) = E(X) + E(Y )

2. E(a + bX) = a + bE(X)

3. E((X + Y )Z) = E(XZ) + E(Y Z)

4. E(XY ) = E(X)E(Y ), if X and Y are independent

5. E(X) = E(E(X|Y ))

Example12: In Example 9, let Y be another r.v. corresponding to the throw of
another coin. Then X + Y can take the values 0, 1, 2, and E(X + Y ) = 0.5 + 0.5 = 1.
If f (X) is a one-one function of X into the range of f , then E(f (X)) =
Σ f (xi )pi .
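The defining sum and the linearity property can be verified exactly with fractions; a sketch for the die of Example 10 (the constants a = 1, b = 2 are arbitrary choices):

```python
from fractions import Fraction

# E(X) = Σ xi P(xi) for a fair die, computed exactly.
p = Fraction(1, 6)
E_X = sum(x * p for x in range(1, 7))
print(E_X)  # 7/2, i.e. 21/6 = 3.5

# Property 2: E(a + bX) = a + b E(X), here with a = 1, b = 2.
E_lin = sum((1 + 2 * x) * p for x in range(1, 7))
assert E_lin == 1 + 2 * E_X
```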

1.5 Variance
V (X) = E(X − m)^2 , where m = E(X).
Equivalently, V (X) = E(X^2 ) − (E(X))^2 .
V (a + bX) = b^2 V (X)
If X and Y are independent, V (X + Y ) = V (X) + V (Y ).
(What is V (X − Y ) ?)
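Both variance formulas can be checked exactly for the die of Example 10; a sketch (the parenthetical question follows from V (a + bX) = b^2 V (X) with b = −1):

```python
from fractions import Fraction

# Fair die: V(X) = E(X^2) - (E(X))^2, computed exactly.
p = Fraction(1, 6)
E_X = sum(x * p for x in range(1, 7))            # 7/2
E_X2 = sum(x * x * p for x in range(1, 7))       # 91/6
V_X = E_X2 - E_X ** 2
print(V_X)  # 35/12

# Same answer from the definition V(X) = E(X - m)^2 with m = E(X).
assert sum((x - E_X) ** 2 * p for x in range(1, 7)) == V_X
```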

1.6 Covariance
Cov(X, Y ) = E(XY ) − E(X)E(Y ) = E[(X − E(X))(Y − E(Y ))]
Cov(aX, Y ) = a Cov(X, Y )
Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z)
Cauchy–Schwarz (C–S) inequality: (Σ ai bi )^2 ≤ (Σ ai^2 )(Σ bi^2 )
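The covariance identities can be checked exactly by enumeration; below X is the first die and Y the total of two fair dice (an illustrative choice, for which Cov(X, Y ) = V (X)):

```python
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))
p = Fraction(1, len(S))     # each of the 36 outcomes is equally likely

def E(f):
    return sum(f(s) * p for s in S)

X = lambda s: s[0]          # first die
Y = lambda s: s[0] + s[1]   # total of the two dice

# Cov(X, Y) = E(XY) - E(X)E(Y); here it equals V(X) = 35/12.
cov = E(lambda s: X(s) * Y(s)) - E(X) * E(Y)
print(cov)  # 35/12

# Cov(aX, Y) = a Cov(X, Y), with a = 3 as an example.
a = 3
assert E(lambda s: a * X(s) * Y(s)) - E(lambda s: a * X(s)) * E(Y) == a * cov
```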

1.7 Correlation
ρ(X, Y ) = Cov(X, Y ) / √(V ar(X)V ar(Y ))
From the C–S inequality, we see that
−1 ≤ ρ(X, Y ) ≤ 1
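Putting the last two sections together, ρ can be computed for the same dice pair (X = first die, Y = total of both dice, an illustrative choice); the result must respect the C–S bound:

```python
import math
from itertools import product

S = list(product(range(1, 7), repeat=2))

def E(f):
    return sum(f(s) for s in S) / len(S)

X = lambda s: s[0]
Y = lambda s: s[0] + s[1]

def var(f):
    return E(lambda s: f(s) ** 2) - E(f) ** 2

# ρ(X, Y) = Cov(X, Y) / sqrt(Var(X) Var(Y))
cov = E(lambda s: X(s) * Y(s)) - E(X) * E(Y)
rho = cov / math.sqrt(var(X) * var(Y))
print(round(rho, 4))  # 1/sqrt(2) ≈ 0.7071
assert -1 <= rho <= 1
```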
