
1.

Sample Space and Probability

陳信宏 Sin-Horng Chen
schen@nycu.edu.tw
ext. 31822, 03-5731822
Room 805, Engineering Building 4 (工程四館 805 室)
Outline
• Introduction
• Sets
• Probabilistic Models
• Conditional Probability
• Total Probability Theorem and Bayes' Rule
• Independence
• Counting
2
Introduction
• Probability is usually defined in terms of frequency of occurrence. For example: Jeremy Lin's (林書豪) three-point shooting percentage, or the probability of winning the lottery jackpot with a single computer-picked ticket.
• Another definition is in terms of subjective belief. For example: Morris Chang (張忠謀) saying that the semiconductor industry has a 90% chance of sustaining high growth in the second half of the year, or lottery fans believing that a "lucky" number is far more likely to win the jackpot than a computer-picked one.
• Our main objective in this course is to develop the art of describing uncertainty in terms of probabilistic models, as well as the skill of probabilistic reasoning.
• The subject of this chapter is to describe the generic structure of such models, and their basic properties.
3
1.1 Sets
• A set is a collection of objects, which are the elements of the set.
• If S is a set and x is an element of S, we write x ∈ S.
• If x is not an element of S, we write x ∉ S.
• A set can have no elements, in which case it is called the empty set, denoted by Ø.
• Notation: S = {x1, x2, . . . , xn}.
4
Countability
• If S contains infinitely many elements x1, x2, ..., which can be enumerated in a list, we write S = {x1, x2, ...}, and we say that S is countably infinite.
• Ex: {0, 2, −2, 4, −4, . . .} is countably infinite.
• Notation: {x | x satisfies P}, where P is a certain property.
  • The symbol "|" is to be read as "such that."
• Ex: {x | 0 ≤ x ≤ 1} is uncountable.
5
Relations between Sets
• If every element of a set S is also an element of a set T, we say that S is a subset of T, and we write S ⊂ T or T ⊃ S.
• If S ⊂ T and T ⊂ S, the two sets are equal, and we write S = T.
• The universal set, denoted by Ω, contains all objects that could conceivably be of interest in a particular context.
6
Set Operations
• The complement of a set S, with respect to the universe Ω, is the set {x ∈ Ω | x ∉ S} of all elements of Ω that do not belong to S, and is denoted by Sᶜ.
  • Note that Ωᶜ = Ø.
• Union: S ∪ T = {x | x ∈ S or x ∈ T}.
• Intersection: S ∩ T = {x | x ∈ S and x ∈ T}.
7
Disjoint and Partition
• Two sets are said to be disjoint if their intersection is empty.
• Several sets are said to be disjoint if no two of them have a common element.
• A collection of sets is said to be a partition of a set S if the sets in the collection are disjoint and their union is S.
8
Ordered Pair
• If x and y are two objects, we use (x, y) to denote the ordered pair of x and y.
• The set of scalars (real numbers) is denoted by R.
• The set of pairs (or triplets) of scalars, i.e., the two-dimensional plane (or three-dimensional space, respectively), is denoted by R² (or R³, respectively).
• Sets and the associated operations are easy to visualize in terms of Venn diagrams.
9
Venn Diagrams

10
The Algebra of Sets
• S ∪ T = T ∪ S
• S ∪ (T ∪ U) = (S ∪ T) ∪ U
• S ∩ (T ∪ U) = (S ∩ T) ∪ (S ∩ U)
• S ∪ (T ∩ U) = (S ∪ T) ∩ (S ∪ U)
• (Sᶜ)ᶜ = S
• S ∩ Sᶜ = Ø
• S ∪ Ω = Ω
• S ∩ Ω = S
11
De Morgan’s Laws
(∪ₙ Sₙ)ᶜ = ∩ₙ Sₙᶜ   and   (∩ₙ Sₙ)ᶜ = ∪ₙ Sₙᶜ

Proof: If x ∈ (∪ₙ Sₙ)ᶜ, then x ∉ ∪ₙ Sₙ, so x ∉ Sₙ for any n. This implies x ∈ Sₙᶜ for all n, so x ∈ ∩ₙ Sₙᶜ. This shows (∪ₙ Sₙ)ᶜ ⊂ ∩ₙ Sₙᶜ. The converse inclusion is established by reversing the above argument, and the first law follows. The argument for the second law is similar.
12
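These identities are easy to spot-check numerically. A minimal Python sketch (the universe `omega` and the family of sets are arbitrary choices for illustration):

```python
# Spot-check De Morgan's laws on small finite sets.
omega = set(range(10))                        # universe Ω
family = [{1, 2, 3}, {3, 4, 5}, {5, 6, 7}]    # arbitrary sets S_n

def complement(s):
    return omega - s

union_all = set().union(*family)
inter_all = omega.intersection(*family)

# (∪_n S_n)^c = ∩_n S_n^c
law1 = complement(union_all) == omega.intersection(*[complement(s) for s in family])
# (∩_n S_n)^c = ∪_n S_n^c
law2 = complement(inter_all) == set().union(*[complement(s) for s in family])
print(law1, law2)  # True True
```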
1.2 Probabilistic Model
 A probabilistic model is a mathematical
description of an uncertain situation.
 Elements of a probabilistic model
 The sample space Ω, which is the set of all
possible outcomes of an experiment.
 The probability law, which assigns to a set A of
possible outcomes (also called an event) a
nonnegative number P(A) (called the
probability of A) that encodes our knowledge
or belief about the collective “likelihood” of the
elements of A. The probability law must satisfy
certain properties (to be discussed later).
13
The Main Ingredients of a Probabilistic model

14
Sample Spaces and Events
• Every probabilistic model involves an underlying process, called the experiment, that produces exactly one out of several possible outcomes.
• The set of all possible outcomes is called the sample space of the experiment, denoted by Ω.
• A subset of the sample space (a collection of possible outcomes) is called an event.

• Examples of the experiment:
  • A single toss of a coin (finite outcomes)
  • Three tosses of two dice (finite outcomes)
  • An infinite sequence of tosses of a coin (infinite outcomes)
  • Throwing a dart on a square target (infinite outcomes), etc.
15
Choosing an Appropriate Sample Space
• Different elements of the sample space should be distinct and mutually exclusive, so that when the experiment is carried out there is a unique outcome.
• The sample space chosen for a probabilistic model must be collectively exhaustive.
• The sample space should be at the "right" granularity (avoiding irrelevant details).
16
Granularity of the Sample Space
 Example 1.1. Consider two alternative games, both
involving ten successive coin tosses:
 Game 1: We receive $1 each time a head comes up
 Game 2: We receive $1 for every coin toss, up to and including
the first time a head comes up. Then, we receive $2 for every
coin toss, up to the second time a head comes up. More
generally, the dollar amount per toss is doubled each time a
head comes up

>> Game 1 consists of 11 possible outcomes (0, 1, ..., 10) of money received.
>> Game 2 consists of ?? possible outcomes of money received.
• A finer description is needed:
  • E.g., each outcome corresponds to a possible ten-long sequence of heads and tails, such as the payoff sequence 1, 1, 1, 2, 2, 2, 4, 4, 4, 8 (will each sequence have a distinct outcome?)
17
Sequential Probabilistic Models
 Many experiments have an inherent sequential
character
 Tossing a coin three times
 Observing the value of stock on five successive days
 Receiving eight successive digits at a communication
receiver

>> They can be described by means of a tree-based sequential description.
18
Probability Laws
• Once the sample space associated with an experiment is settled on, a probability law
  • specifies the likelihood of any outcome, and/or of any set of possible outcomes (an event);
  • or, alternatively, assigns to every event A a number P(A), called the probability of A, satisfying the following axioms.

• Probability Axioms
  • Nonnegativity: P(A) ≥ 0, for every event A.
  • Additivity: If A1, A2, ... are disjoint events, then P(A1 ∪ A2 ∪ ···) = P(A1) + P(A2) + ···.
  • Normalization: P(Ω) = 1.
19
• The other properties can be derived from the above three axioms.
• By the Additivity Axiom, P(A) equals the sum of the probabilities of all the elements A contains.
• A more concrete interpretation of the probability of an event is relative frequency: if the experiment is repeated many times, the number of times event A occurs is approximately P(A) times the number of trials.
• P(Ø) = 0.
20
Example 1.2. Discrete Model
• Consider an experiment involving three coin tosses. Assume the coin is fair.
• Ω = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}.
• A = {exactly 2 heads occur} = {HHT, HTH, THH}.
P({HHT, HTH, THH})
= P({HHT}) + P({HTH}) + P({THH})
= 1/8 + 1/8 + 1/8 = 3/8.
21
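Example 1.2 can be verified by brute-force enumeration over the discrete uniform sample space; a minimal Python sketch:

```python
from itertools import product
from fractions import Fraction

# Sample space of three fair coin tosses: 8 equally likely outcomes.
omega = list(product("HT", repeat=3))
p = Fraction(1, len(omega))  # probability of each single outcome

# A = {exactly 2 heads occur}
A = [w for w in omega if w.count("H") == 2]
print(sorted("".join(w) for w in A))  # ['HHT', 'HTH', 'THH']
print(p * len(A))                     # 3/8
```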
Discrete Probability Law
• The probability of any event {s1, s2, . . . , sn} is the sum of the probabilities of its elements:
  P({s1, s2, . . . , sn}) = P(s1) + P(s2) + ··· + P(sn).
• If the sample space consists of n possible outcomes which are equally likely, then the probability of any event A is given by
  P(A) = (number of elements of A) / n.
22
Example 1.3
• The experiment of rolling a pair of 4-sided dice.
23
Continuous Models
• Probabilistic models with continuous sample spaces:
  • It is inappropriate to assign probability to each single-element event (?).
  • Instead, it makes sense to assign probability to any interval (one-dimensional) or area (two-dimensional) of the sample space.
• Example 1.4. A wheel of fortune, Ω = [0, 1]:
  P({0.3}) = ?
  P({0.33}) = ?
  P({0.333}) = ?
  P({x | a ≤ x ≤ b}) = ?
24
Another Example for Continuous Models
• Example 1.5: Romeo and Juliet have a date at a given time, and each will arrive at the meeting place with a delay between 0 and 1 hour, with all pairs of delays being equally likely. The first to arrive will wait for 15 minutes and will leave if the other has not yet arrived. What is the probability that they will meet?

x: arrival time of Romeo
y: arrival time of Juliet
M: the event that Romeo and Juliet will meet

M = {(x, y) | |x − y| ≤ 1/4, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1}

P(M) = 1 − (3/4)(3/4) = 7/16
25
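The geometric answer can be sanity-checked by simulation; a minimal Monte Carlo sketch (the trial count and seed are arbitrary choices):

```python
import random

def meet_probability(trials=200_000, wait=0.25, seed=0):
    """Monte Carlo estimate of P(|x - y| <= wait) with x, y uniform on [0, 1]."""
    rng = random.Random(seed)
    hits = sum(abs(rng.random() - rng.random()) <= wait for _ in range(trials))
    return hits / trials

est = meet_probability()
print(est)  # close to 7/16 = 0.4375
```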
Properties of Probability Laws
 Probability laws have a number of
properties, which can be deduced from
the axioms. Some of them are
summarized below

26
Visualization and verification using Venn diagrams

27
Models and Reality
• The framework of probability theory can be used to analyze uncertainty.
• In science and engineering, the choice of a model often involves a tradeoff between accuracy, simplicity, and tractability.
• Choosing an appropriate model relies on knowledge and experience.
• Once a model is chosen, we can use mathematics to derive the probabilities of certain events, or other interesting properties the model has.
• Probability theory is full of "paradoxes" due to poorly specified or ambiguous probabilistic models.
28
Bertrand’s paradox
• P16, Fig. 1.7: Draw a chord at random on the circle circumscribing an equilateral triangle. What is the probability that the length of the chord is greater than the side of the triangle? 1/2 or 1/3?

29
Conditional Probability
• Conditional probability provides us with a way to reason about the outcome of an experiment, based on partial information. Some examples:
  • How likely is it that a person has a certain disease given that a medical test was negative?
  • A spot shows up on a radar screen. How likely is it to correspond to an aircraft?
  • Given that it rains today, how likely is it to rain tomorrow?
  • How likely is Jeremy Lin to score when guarded by Kobe Bryant?
31
Conditional Probability
• For any events A and B, the conditional probability of A given B is denoted by P(A|B).
• Ex: P(the outcome is 6 | the outcome is even) = 1/3.
• Definition: If P(B) > 0,
  P(A|B) = P(A ∩ B) / P(B).
  It is undefined for P(B) = 0.
32
Conditional Probabilities Specify a
Probability Law
• P(·|B) specifies a new probability law on Ω. All properties of probability laws remain valid for conditional probability laws:
  • Nonnegativity: clear.
  • Normalization: P(Ω | B) = 1.
  • Additivity: for disjoint A1 and A2, P(A1 ∪ A2 | B) = P(A1 | B) + P(A2 | B).
  • P(A ∪ C | B) ≤ P(A | B) + P(C | B).
33
Properties of Conditional Probability
• Conditional probabilities can also be viewed as a probability law on a new universe B, since P(B | B) = 1. In this view, an event A is seen through its intersection with B, i.e., A ∩ B.
• In the case where the possible outcomes are finitely many and equally likely, we have
  P(A|B) = (number of elements of A ∩ B) / (number of elements of B).
34
Example 1.6: Toss a Coin
• We toss a fair coin three successive times.
• A = {more heads than tails come up}, B = {1st toss is a head}.
• B = {HHH, HHT, HTH, HTT}.
• A ∩ B = {HHH, HHT, HTH}.
• Thus, P(A|B) = |A ∩ B| / |B| = 3/4.
35
Example 1.7

36
Example 1.8

Outcomes: SS, SF, FS, FF.
37
Using Conditional Probability for Modeling
• It is often natural and convenient to first specify conditional probabilities and then use them to determine unconditional probabilities.

• An alternative way to write the definition of conditional probability:
  P(A ∩ B) = P(B) P(A|B).
38
Example 1.9: Radar Detection
• If an aircraft is present in a certain area, a radar correctly registers its presence with probability 0.99.
• If it is not present, the radar falsely registers an aircraft presence with probability 0.10.
• We assume that an aircraft is present with probability 0.05.
• Find P(false alarm) and P(missed detection).
39
Example: Radar Detection - Solution
 Let A = {an aircraft is present}, B = {the radar
registers an aircraft presence}.
 P(false alarm) = P(Ac ∩ B) = P(Ac) P(B | Ac) =
0.95 · 0.10 = 0.095.
 P(missed detection) = P(A ∩ Bc) = P(A) · P(Bc
| A) = 0.05 · 0.01 = 0.0005.

40
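The two computations follow directly from the rule P(A ∩ B) = P(A)P(B|A); the variable names below are just for illustration:

```python
# Numbers from the radar example.
p_A = 0.05              # P(aircraft present)
p_B_given_A = 0.99      # P(radar registers | present)
p_B_given_Ac = 0.10     # P(radar registers | not present)

false_alarm = (1 - p_A) * p_B_given_Ac       # P(A^c ∩ B) = 0.95 · 0.10
missed_detection = p_A * (1 - p_B_given_A)   # P(A ∩ B^c) = 0.05 · 0.01
print(false_alarm, missed_detection)  # ≈ 0.095 and 0.0005
```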
Multiplication (Chain) Rule
• Assuming that all of the conditioning events have positive probability, we have
  P(∩_{i=1}^{n} A_i) = P(A1) P(A2 | A1) P(A3 | A1 ∩ A2) ··· P(A_n | ∩_{i=1}^{n−1} A_i).
• The above formula can be verified by writing
  P(∩_{i=1}^{n} A_i) = P(A1) · [P(A1 ∩ A2) / P(A1)] · [P(A1 ∩ A2 ∩ A3) / P(A1 ∩ A2)] ··· [P(∩_{i=1}^{n} A_i) / P(∩_{i=1}^{n−1} A_i)].
• For the case of just two events, the multiplication rule is simply the definition of conditional probability:
  P(A1 ∩ A2) = P(A1) P(A2 | A1).
41
Example 1.10
• Three cards are drawn from an ordinary 52-card deck without replacement (drawn cards are not placed back in the deck). We wish to find the probability that none of the three cards is a heart. Define the events
  A_i = {the i-th card is not a heart}, i = 1, 2, 3.
  P(A1 ∩ A2 ∩ A3) = P(A1) P(A2 | A1) P(A3 | A1 ∩ A2)
  = (39/52) · (38/51) · (37/50) = C(39,3) / C(52,3)?
42
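The question mark above can be resolved exactly: the sequential (chain-rule) product and the counting answer coincide. A quick check with exact rational arithmetic:

```python
from fractions import Fraction
from math import comb

# Sequential (chain-rule) computation:
chain = Fraction(39, 52) * Fraction(38, 51) * Fraction(37, 50)

# Counting computation: choose 3 of the 39 non-hearts over choose 3 of all 52 cards.
count = Fraction(comb(39, 3), comb(52, 3))

print(chain == count, float(chain))  # True, ≈ 0.4135
```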
Example 1.11
• A class consisting of 4 graduate and 12 undergraduate students is randomly divided into 4 groups of 4. What is the probability that each group includes a graduate student? Label the four graduate students 1, 2, 3, 4 and define
A1 = {graduate students 1 and 2 are in different groups}
A2 = {graduate students 1, 2, and 3 are in different groups}
A3 = {graduate students 1, 2, 3, and 4 are in different groups}

P(A3) = P(A1 ∩ A2 ∩ A3) = P(A1) P(A2 | A1) P(A3 | A1 ∩ A2)
P(A1) = 12/15
P(A2 | A1) = 8/14
P(A3 | A1 ∩ A2) = 4/13
⇒ P(A3) = (12/15)(8/14)(4/13)
43
Example 1.12: The Monty Hall Problem
• Three closed doors.
• A prize is behind one of them.
• You point to one of the doors.
• A friend opens for you one of the remaining two doors, after making sure that the prize is not behind it.
44
Example: The Monty Hall Problem (2/2)
• What is the best strategy?
(a) Stick to your initial choice.
(b) Switch to the other unopened door.
(c) You first point to door 1. If door 2 is opened, you do not switch. If door 3 is opened, you switch.
• Sol: (a) 1/3 (win when the prize is behind door 1)
(b) 2/3 (win when the prize is behind door 2 or 3)
(c) Case 1: If the prize is behind door 1, your friend always opens door 2. You win with probability 1/3 + 1/3 = 2/3 (win when the prize is behind door 1, and also when it is behind door 2).
Case 2: If the prize is behind door 1, your friend is equally likely to open either door 2 or 3. You win with probability (1/3)(1/2) + 1/3 = 1/2 (win when the prize is behind door 1 and door 2 is opened, and when the prize is behind door 2).
45
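Strategies (a) and (b) are easy to verify by simulation; a minimal sketch (trial count and seed are arbitrary; the host here opens a uniformly chosen allowable door):

```python
import random

def monty_trial(rng, switch):
    """One round: prize and initial pick are uniform over doors 0-2;
    the host opens a door that is neither the pick nor the prize."""
    prize = rng.randrange(3)
    pick = rng.randrange(3)
    opened = rng.choice([d for d in range(3) if d != pick and d != prize])
    if switch:
        pick = next(d for d in range(3) if d != pick and d != opened)
    return pick == prize

rng = random.Random(1)
n = 100_000
stick_rate = sum(monty_trial(rng, switch=False) for _ in range(n)) / n
switch_rate = sum(monty_trial(rng, switch=True) for _ in range(n)) / n
print(stick_rate, switch_rate)  # near 1/3 and 2/3
```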
1.4 Total Probability Theorem and Bayes’ Rule
• Total Probability Theorem
  • A "divide-and-conquer" approach.
  • Let A1, . . . , An be disjoint events that form a partition of the sample space, and assume that P(Ai) > 0 for all i.
  • Then, for any event B, we have
    P(B) = P(A1 ∩ B) + ··· + P(An ∩ B)
         = P(A1)P(B | A1) + ··· + P(An)P(B | An).
46
Total Probability Theorem (2/2)
Figure 1.13:

47
Example 1.13

48
Example 1.14: Roll a Die
• You roll a fair four-sided die.
• If the result is 1 or 2, you roll once more; otherwise, you stop.
• What is the probability that the sum total of your rolls is at least 4?
49
Example: Rolling a Die - Solution
• Let Ai be the event that the result of the first roll is i. P(Ai) = 1/4.
• Let B be the event that the sum total is at least 4.
• P(B | A1) = 1/2, P(B | A2) = 3/4, P(B | A3) = 0, P(B | A4) = 1. (Why?)
• P(B) = (1/4)(1/2) + (1/4)(3/4) + (1/4)(0) + (1/4)(1) = 9/16.
50
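The answer 9/16 can be confirmed by exact enumeration of the two-stage experiment:

```python
from fractions import Fraction

# Four-sided die: roll again only if the first result is 1 or 2.
p = Fraction(0)
for first in range(1, 5):
    if first <= 2:                      # a second roll happens
        for second in range(1, 5):
            if first + second >= 4:
                p += Fraction(1, 16)    # each pair has probability 1/4 · 1/4
    elif first >= 4:                    # stop after one roll; need total ≥ 4
        p += Fraction(1, 4)
print(p)  # 9/16
```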
Example 1.15
• Example 1.15. Alice is taking a probability class, and at the end of each week she can be either up-to-date or she may have fallen behind. If she is up-to-date in a given week, the probability that she will be up-to-date (or behind) in the next week is 0.8 (or 0.2, respectively). If she is behind in a given week, the probability that she will be up-to-date (or behind) in the next week is 0.4 (or 0.6, respectively). Alice is (by default) up-to-date when she starts the class. What is the probability that she is up-to-date after three weeks?

Let Ui = {up-to-date after i weeks} and Bi = {behind after i weeks}. By the total probability theorem,
P(U3) = P(U2)P(U3 | U2) + P(B2)P(U3 | B2) = P(U2) · 0.8 + P(B2) · 0.4
P(U2) = P(U1)P(U2 | U1) + P(B1)P(U2 | B1) = P(U1) · 0.8 + P(B1) · 0.4
P(B2) = P(U1)P(B2 | U1) + P(B1)P(B2 | B1) = P(U1) · 0.2 + P(B1) · 0.6
Since P(U1) = 0.8 and P(B1) = 0.2,
P(U2) = 0.8 · 0.8 + 0.2 · 0.4 = 0.72
P(B2) = 0.8 · 0.2 + 0.2 · 0.6 = 0.28
⇒ P(U3) = 0.72 · 0.8 + 0.28 · 0.4 = 0.688
Recursion formula: P(U_{i+1}) = P(U_i) · 0.8 + P(B_i) · 0.4, P(B_{i+1}) = P(U_i) · 0.2 + P(B_i) · 0.6.
51
Bayes’ Rule
• Let A1, A2, . . . , An be disjoint events that form a partition of the sample space, and assume that P(Ai) > 0 for all i.
• Then, for any event B such that P(B) > 0, we have
  P(Ai | B) = P(Ai)P(B | Ai) / P(B)
            = P(Ai)P(B | Ai) / [P(A1)P(B | A1) + ··· + P(An)P(B | An)].
52
Inference Using Bayes’ Rule

malignant tumor (惡性腫瘤)
benign tumor (良性腫瘤)

Figure 1.14:

53
Inference by using Bayes’ rule
• Bayes' rule is often used for inference.
• There are a number of "causes" that may result in a certain "effect".
• The events A1, A2, . . . , An are associated with the causes (the diseases), and the event B represents the effect (the symptoms or observations).
• P(B | Ai) is the probabilistic model of the cause-effect relation; it characterizes the capability of the measuring instrument.
• P(Ai | B) is referred to as the posterior probability; P(Ai) is called the prior probability, and comes from long-term case statistics.
54
Example 1.16
• A = {an aircraft is present}.
• B = {the radar registers an aircraft presence}.
• Given P(A) = 0.05, P(B | A) = 0.99, P(B | Ac) = 0.1,
P(aircraft present | radar registers)
= P(A|B)
= P(A)P(B | A)/P(B)
= P(A)P(B | A)/[P(A)P(B | A) + P(Ac)P(B | Ac)]
= 0.05 · 0.99/(0.05 · 0.99 + 0.95 · 0.1)
≈ 0.3426.
This shows that even though the radar detection itself is quite accurate, the probability that a registered detection is a false alarm is still as high as 0.6574. 55
Example 1.18: The False-Positive Puzzle
• Probability that a test for a rare disease is correct: 0.95; i.e.,
P(positive | person has the disease) = 0.95
P(negative | person does not have the disease) = 0.95
• P(a random person has the disease) = 0.001
• P(person has the disease | positive) = ?
Ans: Let A = {the person has the disease} and B = {the test is positive}. Note that P(B | Ac) = 1 − P(Bc | Ac) = 0.05. By Bayes' rule,
P(A|B) = P(A)P(B | A) / [P(A)P(B | A) + P(Ac)P(B | Ac)]
       = 0.001 · 0.95 / (0.001 · 0.95 + 0.999 · 0.05) ≈ 0.0187
Note that even though the test was assumed to be fairly accurate, a person who has tested positive is still very unlikely (less than 0.02) to have the disease. (A second opinion/test is advisable.) At top U.S. hospitals, 80% of those asked thought the answer was a 95% chance of having the disease.
56
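The computation is easy to reproduce with exact rational arithmetic, which removes any rounding doubt:

```python
from fractions import Fraction

p_disease = Fraction(1, 1000)
sensitivity = Fraction(95, 100)     # P(positive | disease)
specificity = Fraction(95, 100)     # P(negative | no disease)

# Total probability theorem, then Bayes' rule.
p_positive = p_disease * sensitivity + (1 - p_disease) * (1 - specificity)
posterior = p_disease * sensitivity / p_positive
print(float(posterior))  # ≈ 0.0187
```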
1.5 Independence
• Definition: Two events A and B are said to be independent if P(A ∩ B) = P(A)P(B).
• If in addition P(B) > 0, independence is equivalent to the condition P(A|B) = P(A). In other words, event B carries no information about A.
• If A and B are independent, so are A and Bc.
• Are two disjoint events A and B independent? The answer is no (when P(A) > 0 and P(B) > 0)!
• Independence is often easy to grasp intuitively: two events generated by unrelated physical phenomena are independent.
57
Example 1.19: Independence (1/3)
• Two successive rolls of a 4-sided die.
• 16 outcomes, each has probability 1/16.
• Are the events Ai = {1st roll results in i} and Bj = {2nd roll results in j} independent?
• P(Ai ∩ Bj) = 1/16, P(Ai) = P(Bj) = 4/16.
• P(Ai ∩ Bj) = P(Ai)P(Bj) ⇒ Independent.
58
Example: Independence (2/3)
• Are the events A = {1st roll is a 1} and B = {sum of the two rolls is a 5} independent?
• P(A ∩ B) = P({(1,4)}) = 1/16.
• P(A) = P({(1,1),(1,2),(1,3),(1,4)}) = 4/16.
• P(B) = P({(1,4),(2,3),(3,2),(4,1)}) = 4/16.
• P(A ∩ B) = P(A)P(B) ⇒ Independent.
59
Example: Independence (3/3)
• Are the events A = {maximum of the two rolls is 2} and B = {minimum of the two rolls is 2} independent?
• P(A ∩ B) = P({(2,2)}) = 1/16.
• P(A) = P({(1,2),(2,1),(2,2)}) = 3/16.
• P(B) = P({(2,2),(2,3),(2,4),(3,2),(4,2)}) = 5/16.
• P(A)P(B) = 15/256 ≠ P(A ∩ B).
• A and B are not independent.
60
Conditional Independence
• Two events A and B are said to be conditionally independent, given another event C with P(C) > 0, if
  P(A ∩ B | C) = P(A | C) P(B | C).
• If in addition P(B | C) > 0, conditional independence is equivalent to the condition P(A | B ∩ C) = P(A | C).
• Independence does not imply conditional independence, and vice versa.
61
Conditional Independence
• Given an event C, the events A and B are called conditionally independent if
  P(A ∩ B | C) = P(A | C) P(B | C).   (1)
• We also know that
  P(A ∩ B | C) = P(A ∩ B ∩ C) / P(C)
               = P(C) P(B | C) P(A | B ∩ C) / P(C)   (by the multiplication rule)   (2)
• If P(B | C) > 0, we have an alternative way to express conditional independence:
  P(A | B ∩ C) = P(A | C).   (3)
62
Conditional Independence
• Notice that independence of two events A and B with respect to the unconditional probability law,
  P(A ∩ B) = P(A)P(B),
does not imply conditional independence,
  P(A ∩ B | C) = P(A | C) P(B | C),
and vice versa.
• If A and B are independent, the same holds for
  (i) A and Bc
  (ii) Ac and Bc
• How can we verify it? (See Problem 38.)
63
Example 1.20
• Consider two independent fair coin tosses, in which all four possible outcomes are equally likely.
• H1 = {1st toss is a head}.
• H2 = {2nd toss is a head}.
• D = {the two tosses have different results}.
• H1 and H2 are (unconditionally) independent.
64
Example 1.20
• P(H1 | D) = 1/2.
• P(H2 | D) = 1/2.
• P(H1 ∩ H2 | D) = 0.
• P(H1 ∩ H2 | D) ≠ P(H1 | D) P(H2 | D).
• H1 and H2 are not conditionally independent.
65
Example 1.21
• There are two coins, a blue and a red one.
• We choose one of the two at random, each being chosen with probability 1/2, and proceed with two independent tosses.
• The coins are biased: with the blue coin, the probability of heads in any given toss is 0.99, whereas for the red coin it is 0.01.
• Let B be the event that the blue coin was selected. Let also Hi be the event that the i-th toss resulted in heads.

Conditional case: P(H1 ∩ H2 | B) = P(H1 | B) P(H2 | B). Given the choice of a coin, the events H1 and H2 are independent.
Unconditional case: Is P(H1 ∩ H2) = P(H1)P(H2)?
P(H1) = P(B)P(H1 | B) + P(Bc)P(H1 | Bc) = (1/2) · 0.99 + (1/2) · 0.01 = 1/2
P(H2) = P(B)P(H2 | B) + P(Bc)P(H2 | Bc) = (1/2) · 0.99 + (1/2) · 0.01 = 1/2
P(H1 ∩ H2) = P(B)P(H1 ∩ H2 | B) + P(Bc)P(H1 ∩ H2 | Bc)
           = (1/2) · 0.99 · 0.99 + (1/2) · 0.01 · 0.01 ≈ 0.49 ≠ 1/4
So H1 and H2 are not (unconditionally) independent.
66
Definition of Independence of Several Events
• We say that the events A1, A2, …, An are independent if
  P(∩_{i∈S} A_i) = ∏_{i∈S} P(A_i)
for every subset S of {1, 2, …, n}.

• For example, the independence of three events A1, A2, A3 amounts to satisfying the four conditions:
P(A1 ∩ A2) = P(A1)P(A2)
P(A1 ∩ A3) = P(A1)P(A3)
P(A2 ∩ A3) = P(A2)P(A3)
P(A1 ∩ A2 ∩ A3) = P(A1)P(A2)P(A3)
67
Example 1.22: Pairwise Independence
• Recall Example 1.20 on pp. 37:
  • H1 = {1st toss is a head}.
  • H2 = {2nd toss is a head}.
  • D = {the two tosses have different results}.
• H1 and H2 are independent.
• P(D | H1) = P(H1 ∩ D)/P(H1) = (1/4)/(1/2) = 1/2 = P(D) → H1 and D are independent.
• Similarly, H2 and D are independent.
• P(H1 ∩ H2 ∩ D) = 0 ≠ (1/2)(1/2)(1/2) = P(H1)P(H2)P(D) → these three events are not independent.
68
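Pairwise-but-not-mutual independence can be confirmed by brute-force enumeration of the four equally likely outcomes:

```python
from fractions import Fraction
from itertools import product

omega = list(product("HT", repeat=2))  # 4 equally likely outcomes

def P(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

H1 = lambda w: w[0] == "H"             # 1st toss is a head
H2 = lambda w: w[1] == "H"             # 2nd toss is a head
D  = lambda w: w[0] != w[1]            # the tosses differ

pairwise = (P(lambda w: H1(w) and H2(w)) == P(H1) * P(H2)
            and P(lambda w: H1(w) and D(w)) == P(H1) * P(D)
            and P(lambda w: H2(w) and D(w)) == P(H2) * P(D))
triple = P(lambda w: H1(w) and H2(w) and D(w)) == P(H1) * P(H2) * P(D)
print(pairwise, triple)  # True False
```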
Example 1.23
• The equality P(A ∩ B ∩ C) = P(A)P(B)P(C) is not enough for independence.
• Consider two independent rolls of a fair six-sided die, and the following events:
A = {1st roll is 1, 2, or 3},
B = {1st roll is 3, 4, or 5},
C = {the sum of the two rolls is 9}.

P(A ∩ B ∩ C) = 1/36 = (1/2)(1/2)(4/36) = P(A)P(B)P(C)
However,
P(A ∩ B) = 1/6 ≠ (1/2)(1/2) = P(A)P(B)
P(A ∩ C) = 1/36 ≠ (1/2)(4/36) = P(A)P(C)
P(B ∩ C) = 1/12 ≠ (1/2)(4/36) = P(B)P(C)
69
Example 1.24 : Network Connectivity
• Reliability: when analyzing the operation of a complex system consisting of many components, the components are often assumed to be independent.

• What is the probability that there is a path connecting A and B in which all links are up? 70
Solution (1/2)
 P(series subsystem succeeds)
= p1p2 · · · pm.
 P(parallel subsystem succeeds)
= 1− (1 − p1)(1 − p2) · · · (1 − pm).
 P(C → B)
= 1 − (1 − P(C → E and E → B)) ·
(1 − P(C → F and F → B))
= 1− (1 − pCE pEB)(1 − pCF pFB)
= 1− (1 − 0.8 · 0.9)(1 − 0.95 · 0.85)
= 0.946.
71
Solution (2/2)
• P(A → C and C → B) = P(A → C)P(C → B) = 0.9 · 0.946 = 0.851.
• P(A → D and D → B) = P(A → D)P(D → B) = 0.75 · 0.95 = 0.712.
• P(A → B)
= 1 − (1 − P(A → C and C → B)) · (1 − P(A → D and D → B))
= 1 − (1 − 0.851)(1 − 0.712)
= 0.957.
72
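The series/parallel reduction used above can be written as two small helper functions (the names are mine; the individual link probabilities are those assumed in the solution):

```python
def series(*ps):
    """All links in series must be up."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def parallel(*ps):
    """At least one parallel branch must be up."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

p_CB = parallel(series(0.8, 0.9), series(0.95, 0.85))   # C→B via E or via F
p_AB = parallel(series(0.9, p_CB), series(0.75, 0.95))  # A→B via C or via D
print(round(p_CB, 3), round(p_AB, 3))  # 0.946 0.957
```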
Independent Trials and Bernoulli Trials
 If an experiment involves a sequence of
independent but identical stages, we
say that we have a sequence of
independent trials.
 In the special case where there are only
two possible results at each stage, we
say that we have a sequence of
independent Bernoulli trials.

73
Three Independent Tosses of a Coin

74
Binomial Coefficients, Probabilities,
and Formula
• Binomial coefficients:
  C(n, k) = n! / (k!(n − k)!)
  • Number of distinct n-toss sequences that contain k heads.
  • i! = 1 · 2 ··· (i − 1) · i.
  • 0! = 1.

• Binomial probabilities:
  p(k) = P(k heads come up in an n-toss sequence) = C(n, k) pᵏ(1 − p)ⁿ⁻ᵏ, 0 ≤ k ≤ n.

• Binomial formula: ∑_{k=0}^{n} C(n, k) pᵏ(1 − p)ⁿ⁻ᵏ = 1.
75
Example 1.25: Grade of Service
• c modems, n customers, each customer needing a connection with probability p, independently.
• What is the probability that there are more customers needing a connection than there are modems?
• Sol: P(more than c customers need a connection) = ∑_{k=c+1}^{n} C(n, k) pᵏ(1 − p)ⁿ⁻ᵏ.

• If n = 100, p = 0.1, and c = 15, the desired probability turns out to be 0.0399.
• The real problem is more complicated, e.g., p is time-dependent and location-dependent.
76
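The tail probability can be computed directly from the binomial formula. A sketch (the function name is mine):

```python
from math import comb

def too_many_customers(n, p, c):
    """P(more than c of the n customers need a connection)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1, n + 1))

prob = too_many_customers(100, 0.1, 15)
print(prob)  # the slides quote 0.0399 for this setting
```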
1.6 Counting
• Two applications of the discrete uniform probability law:
  • When the sample space Ω has a finite number of equally likely outcomes, the probability of any event A is given by
    P(A) = (number of elements of A) / (number of elements of Ω).
  • When we want to calculate the probability of an event A with a finite number of equally likely outcomes, each of which has an already known probability p. Then the probability of A is given by
    P(A) = p · (number of elements of A).
  • E.g., the calculation of the probability of k heads in n coin tosses.
77
The Counting Principle
 Combinatorics
 Consider a process that consists of r stages.
Suppose that:
 There are n1 possible results at the first stage.
 For every possible result of the first stage,
there are n2 possible results at the second
stage.
 More generally, for any possible results of the
first i − 1 stages, there are ni possible results
at the i-th stage.
 Then, the total number of possible results of the
r-stage process is
n1 n2 ··· nr.
78
• Example 1.27: The Number of Subsets of an n-Element Set
  • Consider an n-element set {s1, s2, . . . , sn}.
  • How many subsets does it have (including itself and the empty set)?
  • Sol: 2ⁿ (each element is either included or excluded, and the n choices are made independently).
79
Permutation, Combination, and Partition
• Select k objects out of a collection of n objects.
• If the order of selection matters, the selection is called a permutation.
• Otherwise, it is called a combination.
• We will then discuss a more general type of counting, involving a partition of a collection of n objects into multiple subsets.
80
k-Permutations
• The number of different ways that we can pick k out of n distinct objects and arrange them in a sequence:
  n(n − 1) ··· (n − k + 1) = n!/(n − k)!.
• If n = k:
  • Simply called permutations.
  • n(n − 1)(n − 2) ··· 2 · 1 = n!.
81
Example 1.28: Number of “wxyz”
• Let us count the number of words that consist of four distinct letters.
• Sol: 26!/(26 − 4)! = 26 · 25 · 24 · 23 = 358,800.
82
Combinations
• Counting the number of k-element subsets of a given n-element set.
• Ans: C(n, k) = n! / (k!(n − k)!).

• Example: The number of combinations of two out of the four letters A, B, C, and D is found by letting n = 4 and k = 2.
• It is C(4, 2) = 4!/(2! 2!) = 6: AB, AC, AD, BC, BD, CD.
83
• Some formulas are hard to derive purely algebraically.
• For example, from
  ∑_{k=0}^{n} C(n, k) pᵏ(1 − p)ⁿ⁻ᵏ = 1,
setting p = 1/2 gives
  ∑_{k=0}^{n} C(n, k) = 2ⁿ.

• The following formula can be obtained by two different counting approaches:
  ∑_{k=1}^{n} k C(n, k) = n 2ⁿ⁻¹.
84
Partitions
• We consider partitions of an n-element set into r disjoint subsets, with the i-th subset containing exactly ni elements.
  • The ni are nonnegative integers.
  • n1 + ··· + nr = n.
• How many ways can this be done?
• Sol: n! / (n1! n2! ··· nr!).

• This is called the multinomial coefficient and is denoted by (n; n1, n2, . . . , nr).
85
Example 1.32: Anagrams
• How many different words (letter sequences) can be obtained by rearranging the letters in the word TATTOO?
• Sol: Write it in the form T1AT2T3O1O2. The answer is 6!/(3! 1! 2!) = 60.

• Anagram: a word formed by rearranging the letters of another word (e.g., rearranging "now" gives "won").
Homework: 2, 5, 6, 11, 15, 24, 31, 37, 39, 53, 59
86
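The multinomial count for TATTOO can be checked with a few lines (the helper name is mine):

```python
from math import factorial

def multinomial(n, *parts):
    """n! / (n1! n2! ... nr!), the number of partitions into labeled subsets."""
    assert sum(parts) == n
    out = factorial(n)
    for k in parts:
        out //= factorial(k)
    return out

# TATTOO: 6 letters = 3 T's, 1 A, 2 O's
print(multinomial(6, 3, 1, 2))  # 60
```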
