
ECE 863: Analysis of Stochastic Systems
Welcome to ECE 863

Part I.1: Introduction
Prof. Hayder Radha
Class Web Page: www.egr.msu.edu/classes/ece863
(Type the full address into your web browser.)

From Unix: /web/classes/ece863

ECE 863
This course teaches mathematical tools that are useful for a wide range of disciplines:
Communications and Networking: information theory, coding, modulation, queuing theory, traffic modeling, etc.
Signal, Speech and Image Processing: statistical signal processing, filtering, signal modeling, etc.
Many others: control; other areas in engineering and science

ECE 863: Part I
Introduction to Probability Theory:
Definition of random experiments
Axioms of probability
Mutual exclusivity
Conditional probability
Partition of the sample space
Total probability
Bayes' rule
Independence


Definition of Random Experiment
Procedures/steps (e.g., tossing a coin)
Measurements/observations
The set of all possible outcomes: the sample space S

Outcomes; events; sample space
An event A is a set of outcomes, e.g., A = { s : s is an even number }
An outcome s can NOT be decomposed into other outcomes
[Figure: an outcome s, an event B, and the sample space S]


Examples of random experiments
Roll a die once and record the result of the top face:
S = {1, 2, 3, 4, 5, 6}
A = the outcome is even = {2, 4, 6}
B = the outcome is larger than 3 = {4, 5, 6}
C = the outcome is odd = {1, 3, 5}
Roll a die once and see whether the top face is even:
S = { even, odd } = { A, C }

Axioms of Probability
The probability of any event A is non-negative: P[A] ≥ 0
The probability that the outcome belongs to the sample space is 1: P[S] = 1
The probability of the union of mutually exclusive events is the sum of their probabilities:
If A1 ∩ A2 = ∅, then P[A1 ∪ A2] = P[A1] + P[A2]
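The axioms can be checked numerically on the die example above. Below is a minimal sketch (not from the slides), assuming equally likely outcomes and using exact fractions:

```python
# A minimal sketch: checking the three axioms numerically for the fair-die
# experiment, assuming equally likely outcomes.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}                      # sample space
P = lambda E: Fraction(len(E & S), len(S))  # P[E] under equally likely outcomes

A = {2, 4, 6}   # outcome is even
B = {4, 5, 6}   # outcome is larger than 3
C = {1, 3, 5}   # outcome is odd

assert P(A) >= 0 and P(B) >= 0 and P(C) >= 0   # Axiom 1: non-negativity
assert P(S) == 1                               # Axiom 2: P[S] = 1
assert A & C == set()                          # A and C are mutually exclusive
assert P(A | C) == P(A) + P(C)                 # Axiom 3: additivity for M.E. events
print(P(A), P(B), P(C))                        # 1/2 1/2 1/2
```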

Mutual Exclusivity
The probability of the union of mutually exclusive events is the sum of their probabilities:
If Ai ∩ Aj = ∅ for all i ≠ j, then P[A1 ∪ A2 ∪ ...] = P[A1] + P[A2] + ...
[Figure: disjoint events A1, A2, A3 inside the sample space S]

Mutual Exclusivity
However, in general:
P[A1 ∪ A2] = P[A1] + P[A2] - P[A1 ∩ A2]
This formula works for both mutually exclusive and non-mutually-exclusive events
[Figure: overlapping events A1, A2, A3 inside the sample space S]
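For a concrete check of the general union formula, here is a small sketch (my own illustration, not from the slides) using the non-mutually-exclusive die events A = {2, 4, 6} and B = {4, 5, 6}:

```python
# Illustrating P[A ∪ B] = P[A] + P[B] - P[A ∩ B] on a fair die.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
P = lambda E: Fraction(len(E), len(S))

A, B = {2, 4, 6}, {4, 5, 6}
lhs = P(A | B)                       # P[A ∪ B] = 4/6
rhs = P(A) + P(B) - P(A & B)         # 3/6 + 3/6 - 2/6
assert lhs == rhs == Fraction(2, 3)
```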


Example I.1
Roll a die twice and record the number of dots on the top face:
S = { (1,1) (2,1) (3,1) (4,1) (5,1) (6,1)
      (1,2) (2,2) (3,2) (4,2) (5,2) (6,2)
      (1,3) (2,3) (3,3) (4,3) (5,3) (6,3)
      (1,4) (2,4) (3,4) (4,4) (5,4) (6,4)
      (1,5) (2,5) (3,5) (4,5) (5,5) (6,5)
      (1,6) (2,6) (3,6) (4,6) (5,6) (6,6) }

Example I.1
Define the following events:
A1 = the first roll gives an odd number ;  P[A1] = 18/36
A2 = the second roll gives an odd number ;  P[A2] = 18/36
C = the sum is odd
Compute the probability of the event C (i.e., P[C]) using the probabilities P[A1] and P[A2]

Example I.1
Solution:
C is the union C1 ∪ C2, where
C1 = first roll is odd and second roll is even = A1 ∩ A2ᶜ
C2 = first roll is even and second roll is odd = A1ᶜ ∩ A2
C = (A1 ∩ A2ᶜ) ∪ (A1ᶜ ∩ A2)
Since C1 and C2 are mutually exclusive:
P[C] = P[C1 ∪ C2] = P[C1] + P[C2] = (9/36) + (9/36) = 1/2

ECE 863
Make sure that you read and understand:
Set operations
Corollaries 1 through 7 in the book
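A brute-force check of this result, enumerating all 36 equally likely outcomes (a sketch added for illustration):

```python
# Verifying Example I.1 by enumerating all 36 equally likely outcomes.
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))                      # all (first, second) pairs
C = [(i, j) for (i, j) in S if (i + j) % 2 == 1]              # the sum is odd
C1 = [(i, j) for (i, j) in S if i % 2 == 1 and j % 2 == 0]    # A1 ∩ A2ᶜ
C2 = [(i, j) for (i, j) in S if i % 2 == 0 and j % 2 == 1]    # A1ᶜ ∩ A2

assert set(C) == set(C1) | set(C2) and not (set(C1) & set(C2))
print(Fraction(len(C), len(S)))                               # 1/2, as derived above
```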


Conditional Probabilities
Given that an event B has occurred, what is the probability of A?
Knowing that B has occurred reduces the sample space from S to B
[Figure: event B inside the sample space S]

Conditional Probabilities
We need to:
compute the intersection of A with B
normalize the probabilities by P[B]
P[A/B] = P[A ∩ B] / P[B]
Think of P[A/S] = P[A ∩ S] / P[S] = P[A]
[Figure: events A and B overlapping inside the sample space S]
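As an illustration (not part of the slides), the sketch below applies the two steps (intersect with B, then normalize by P[B]) to the single-die events A = even and B = larger than 3:

```python
# Conditioning on B for the fair-die events A = {2,4,6}, B = {4,5,6}.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
P = lambda E: Fraction(len(E), len(S))

A, B = {2, 4, 6}, {4, 5, 6}
P_A_given_B = P(A & B) / P(B)        # intersect with B, then normalize by P[B]
print(P_A_given_B)                   # 2/3: knowing B occurred raises P[A] from 1/2
assert P(A & S) / P(S) == P(A)       # conditioning on S changes nothing
```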

So far, we have learned...
An outcome s can NOT be decomposed into other outcomes
An event A is a set of outcomes in the sample space S
P[A ∪ B] = P[A] + P[B] - P[A ∩ B]
For M.E. events, A ∩ B = ∅ and P[A ∪ B] = P[A] + P[B]
Conditional probability reduces the sample space: P[A/B] = P[A ∩ B] / P[B]
[Figure: an outcome s, an event A, and the sample space S]

Partition of the Sample Space
B1, B2, ..., Bn form a partition of S when:
S = B1 ∪ B2 ∪ ... ∪ Bn
Bi ∩ Bj = ∅, i ≠ j
[Figure: the sample space S divided into regions B1, B2, B3, B4]
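A tiny sketch (illustrative only; the particular blocks are my choice) that checks both partition properties for the die sample space:

```python
# Checking that B1, B2, B3 cover S and are pairwise disjoint.
from itertools import combinations

S = {1, 2, 3, 4, 5, 6}
blocks = [{1, 2}, {3, 4}, {5, 6}]                                   # B1, B2, B3

assert set().union(*blocks) == S                                    # S = B1 ∪ B2 ∪ ... ∪ Bn
assert all(Bi & Bj == set() for Bi, Bj in combinations(blocks, 2))  # Bi ∩ Bj = ∅, i ≠ j
```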

Total Probability
If B1, B2, ..., Bn form a partition of S, then for any event A:
(A ∩ Bi) ∩ (A ∩ Bj) = ∅, i ≠ j
A = (A ∩ B1) ∪ (A ∩ B2) ∪ ... ∪ (A ∩ Bn)
[Figure: event A overlapping the partition B1, B2, B3, B4 of S]

Total Probability
Since A can be expressed as the union of mutually exclusive events:
A = (A ∩ B1) ∪ (A ∩ B2) ∪ ... ∪ (A ∩ Bn)
P[A] = P[A ∩ B1] + P[A ∩ B2] + ... + P[A ∩ Bn]


Total Probability
Therefore, if B1, B2, ... form a partition of S, then for any event A:
P[A] = P[A ∩ B1] + P[A ∩ B2] + ...

Total Probability
Using the definition of conditional probability:
P[A/Bi] = P[A ∩ Bi] / P[Bi], so P[A ∩ Bi] = P[A/Bi] P[Bi]
Substituting into P[A] = P[A ∩ B1] + P[A ∩ B2] + ... yields the law of total probability below


Total Probability
The Law of Total Probability:
If B1, B2, ... form a partition of S, then for any event A:
P[A] = P[A/B1] P[B1] + P[A/B2] P[B2] + ...
     = P[A ∩ B1] + P[A ∩ B2] + ...

Bayes' Rule
If B1, B2, ..., Bn form a partition of S, then for any event A:
P[Bj/A] = P[A/Bj] P[Bj] / ( P[A/B1] P[B1] + ... + P[A/Bn] P[Bn] )
[Figure: event A overlapping the partition B1, B2, B3, B4 of S]
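The following sketch evaluates both formulas with made-up priors and likelihoods for a three-block partition; the numbers are assumptions for illustration, not values from the course:

```python
# Law of total probability and Bayes' rule with hypothetical numbers.
prior = [0.5, 0.3, 0.2]          # P[B1], P[B2], P[B3]   (assumed)
likelihood = [0.9, 0.5, 0.1]     # P[A/B1], P[A/B2], P[A/B3]   (assumed)

# Total probability: P[A] = sum_i P[A/Bi] P[Bi]
P_A = sum(l * p for l, p in zip(likelihood, prior))

# Bayes' rule: P[Bj/A] = P[A/Bj] P[Bj] / P[A]
posterior = [l * p / P_A for l, p in zip(likelihood, prior)]
print(P_A, posterior)            # the posteriors sum to 1
```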


Derivation of Bayes' Rule
Recall that, if B1, B2, ..., Bn form a partition of S, then for any event A:
P[A] = P[A/B1] P[B1] + P[A/B2] P[B2] + ...

Derivation of Bayes' Rule
Also recall that the conditional probability P[Bj/A] can be expressed as follows:
P[Bj/A] = P[A ∩ Bj] / P[A]
[Figure: event A overlapping the partition B1, B2, B3, B4 of S]

Derivation of Bayes' Rule
Applying the definition of conditional probability again, now to the numerator:
P[Bj/A] = P[A ∩ Bj] / P[A]
P[Bj/A] = P[A/Bj] P[Bj] / P[A]

Bayes' Rule
Using the law of total probability to express P[A], we arrive at the expression for Bayes' rule:
P[Bj/A] = P[A/Bj] P[Bj] / ( P[A/B1] P[B1] + ... + P[A/Bn] P[Bn] )
[Figure: event A overlapping the partition B1, B2, B3, B4 of S]

Bayes' Rule
B1, B2, ..., Bn are known as the a priori events (i.e., we know about them before the experiment is performed)
P[Bj/A] is the a posteriori probability (i.e., after performing the experiment, A has occurred; what then is the probability of Bj?)

Bayes' Rule
Typically:
We perform an experiment and observe an event A
Given that A has been observed, we are interested in finding out which a priori event is most likely
E.g., we compute P[B1/A], P[B2/A], P[B3/A], and P[B4/A]
[Figure: event A overlapping the partition B1, B2, B3, B4 of S]
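A short sketch of this typical use: given assumed priors P[Bj] and likelihoods P[A/Bj] (the numbers below are hypothetical), compute all the a posteriori probabilities and pick the most likely a priori event.

```python
# After observing A, rank the a priori events by their posteriors.
prior = {"B1": 0.4, "B2": 0.3, "B3": 0.2, "B4": 0.1}            # P[Bj]   (assumed)
likelihood = {"B1": 0.05, "B2": 0.20, "B3": 0.60, "B4": 0.90}   # P[A/Bj] (assumed)

P_A = sum(likelihood[b] * prior[b] for b in prior)              # total probability
posterior = {b: likelihood[b] * prior[b] / P_A for b in prior}  # P[Bj/A]

most_likely = max(posterior, key=posterior.get)
print(posterior, most_likely)     # the a priori event that best explains A
```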


Example I.2
A transmitter sends either a 1 or a 0 over a communication system
The receiver makes a decision based on the received signal
[Figure: Transmitter {T0, T1} → communication system → Receiver {R0, R1}]

Example I.2
P[T0] = 1 - p ;  P[T1] = p
Probability of error = e
Compute P[Ti ∩ Rj] and P[Ti/Rj], i, j = 0, 1
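Before working the probabilities out analytically, the channel can also be simulated. The sketch below (with assumed values p = 0.2 and e = 0.1) estimates the joint probabilities P[Ti ∩ Rj] by Monte Carlo; it is an illustration, not part of the original example.

```python
# Monte Carlo estimate of the joint probabilities for the binary channel.
import random

p, e = 0.2, 0.1                 # P[T1] = p, error probability e (assumed values)
counts = {(t, r): 0 for t in (0, 1) for r in (0, 1)}
N = 100_000

for _ in range(N):
    t = 1 if random.random() < p else 0        # transmit 1 with probability p, else 0
    r = 1 - t if random.random() < e else t    # the channel flips the bit with probability e
    counts[(t, r)] += 1

for (t, r), c in counts.items():
    print(f"P[T{t} ∩ R{r}] ≈ {c / N:.3f}")      # compare with (1-e)(1-p), (1-e)p, e(1-p), ep
```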

Example I.2
Computing P[Ti ∩ Rj] = P[Rj/Ti] P[Ti]:
P[T0 ∩ R0] = P[R0/T0] P[T0] = (1 - e)(1 - p)
P[T1 ∩ R1] = (1 - e) p
P[T0 ∩ R1] = e (1 - p)
P[T1 ∩ R0] = e p
[Figure: Transmitter {T0, T1} → communication system (probability of error = e) → Receiver {R0, R1}]

Example I.2
Computing P[Ti/Rj]:
P[T0/R0] = P[T0 ∩ R0] / P[R0]
Since T0 and T1 are mutually exclusive and form a partition,
P[R0] = P[R0/T0] P[T0] + P[R0/T1] P[T1] = (1 - e)(1 - p) + e p

Example I.2
Therefore,
P[T0/R0] = P[T0 ∩ R0] / P[R0] = (1 - e)(1 - p) / [(1 - e)(1 - p) + e p]
[Figure: Transmitter {T0, T1} → communication system (probability of error = e) → Receiver {R0, R1}]
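The closed form above is easy to evaluate numerically; the sketch below (added for illustration) tabulates P[T0/R0] for p = 0.1, 0.5, 0.9, the values used in the plot that follows, and a few error probabilities e.

```python
# Evaluating P[T0/R0] = (1-e)(1-p) / [(1-e)(1-p) + e p] for several (p, e) pairs.
def post_T0_given_R0(p: float, e: float) -> float:
    return (1 - e) * (1 - p) / ((1 - e) * (1 - p) + e * p)

for p in (0.1, 0.5, 0.9):
    for e in (0.0, 0.1, 0.5):
        print(f"p={p}, e={e}: P[T0/R0] = {post_T0_given_R0(p, e):.3f}")
```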

Example I.2
P[T0/R0] = given that the receiver outputs (decides on) a "0", the probability that a "0" was transmitted
[Figure: P[T0/R0] plotted as a function of the error probability e, for p = 0.1, 0.5, and 0.9]

Example I.2
As an exercise, derive the expressions for the rest of the P[Ti/Rj]'s: P[T0/R1], P[T1/R0], and P[T1/R1]
Plot their values as functions of e for different values of p

Independence
The definition of independence is based on preserving the value of the probability:
A and B are independent ⇔ P[A/B] = P[A]
P[A/B] = P[A ∩ B] / P[B] = P[A] ⇔ P[A ∩ B] = P[A] P[B]

Example I.3
P[A ∩ B] = 1/6 = P[A] P[B] = (3/6)(2/6) = 1/6; therefore, A and B are independent
P[A ∩ C] = 1/6 ≠ P[A] P[C] = (3/6)(3/6) = 1/4; therefore, A and C are dependent
[Figure: events A, B, and C on the die sample space S = {1, 2, 3, 4, 5, 6}]
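The independence test can be checked mechanically. In the sketch below, the exact events B and C used on the slide are not fully legible, so the sets are chosen only to reproduce the stated probabilities; treat them as an illustrative assumption.

```python
# Testing independence via P[A ∩ B] =? P[A] P[B] on a fair die.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
P = lambda E: Fraction(len(E), len(S))

A = {2, 4, 6}          # P[A] = 3/6
B = {3, 4}             # assumed set with P[B] = 2/6 and P[A ∩ B] = 1/6
C = {2, 3, 5}          # assumed set with P[C] = 3/6 and P[A ∩ C] = 1/6

print(P(A & B) == P(A) * P(B))   # True  -> A and B are independent
print(P(A & C) == P(A) * P(C))   # False -> A and C are dependent
```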

Example I.3
P[A/B] = 1/2 = P[A] ; B did not change P[A]
Therefore, A and B are independent
P[A/C] = 1/3 ≠ P[A] ; C changed P[A]
Therefore, A and C are dependent
[Figure: events A, B, and C on the die sample space S = {1, 2, 3, 4, 5, 6}]

Mutual Exclusivity & Independence
If A and B are M.E., then A ∩ B = ∅ and P[A ∩ B] = 0
Remember M.E.: if P[A ∩ B] = 0, does P[A ∩ B] = P[A] P[B] ?
If P[A] ≠ 0 and P[B] ≠ 0, then P[A] P[B] ≠ 0
In this case, mutual exclusivity implies dependence

ECE 863: Part I.1


At this point, you should know:
How to define a random experiment
The axioms of probability
The definition of Mutual Exclusivity (M.E.)
The definition and impact of conditional probability
How to form a partition of the sample space
The law of Total Probability
Bayes' rule and how to use it
The notion of Independence and its relationship with Mutual Exclusivity