w5 TEL502E 2021
decide H0 : x[0] < γ
decide H1 : x[0] > γ
• Bayesian detector
17
Minimum Probability of Error
Assume the prior probabilities of H0 and H1 are known and represented by P (H0 ) and
P (H1 ), respectively. The probability of error, Pe , is then defined as
Pe = P(H1) P(H0|H1) + P(H0) P(H1|H0) = P(H1) P_M + P(H0) P_FA
Our goal is to design a detector that minimizes Pe. It can be shown that the following detector is optimal in this case: decide H1 if

p(x|H1) / p(x|H0) > P(H0) / P(H1) = γ
In case P (H0 ) = P (H1 ), the detector is called the maximum likelihood detector.
32
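The optimality of the prior-weighted threshold can be checked numerically. Below is a minimal Monte Carlo sketch (signal level A, noise variance, and priors are assumed values for illustration) comparing the minimum-Pe threshold P(H0)/P(H1) against the ML threshold of 1 when the priors are unequal:

```python
import numpy as np

rng = np.random.default_rng(0)
A, sigma = 1.0, 1.0            # assumed signal level and noise std
P0, P1 = 0.7, 0.3              # assumed (unequal) priors
M = 200_000                    # Monte Carlo trials

# draw the true hypothesis, then x[0] under that hypothesis
h1 = rng.random(M) < P1                    # True -> H1 is in force
x = rng.normal(0.0, sigma, M) + A * h1     # x[0] = A + w[0] under H1

def pe(threshold):
    """Empirical Pe of the test p(x|H1)/p(x|H0) > threshold."""
    # for Gaussian likelihoods the LRT reduces to x > (sigma^2/A) ln(t) + A/2
    gamma = (sigma**2 / A) * np.log(threshold) + A / 2
    return np.mean((x > gamma) != h1)

print(pe(P0 / P1))   # minimum-Pe (Bayesian) threshold
print(pe(1.0))       # ML threshold, suboptimal here
```

With unequal priors the Bayesian threshold gives a smaller error rate; with P(H0) = P(H1) the two tests coincide.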
Minimum Probability of Error - Derivation
We know that

∫_{R0} p(x|H1) dx = 1 − ∫_{R1} p(x|H1) dx,

such that

Pe = P(H1) ( 1 − ∫_{R1} p(x|H1) dx ) + P(H0) ∫_{R1} p(x|H0) dx
   = P(H1) + ∫_{R1} [ P(H0) p(x|H0) − P(H1) p(x|H1) ] dx
33
Minimum Probability of Error - Derivation
Pe = P(H1) + ∫_{R1} [ P(H0) p(x|H0) − P(H1) p(x|H1) ] dx

To minimize Pe we should assign x to R1 exactly when the integrand is negative, i.e. when P(H1) p(x|H1) > P(H0) p(x|H0). Rearranging gives the likelihood-ratio test p(x|H1)/p(x|H0) > P(H0)/P(H1).
34
Minimum Probability of Error – Example DC in WGN
Consider the following signal detection problem

H0 : x[n] = w[n]            n = 0, 1, . . . , N − 1
H1 : x[n] = s[n] + w[n]     n = 0, 1, . . . , N − 1

where the signal is s[n] = A for A > 0 and w[n] is WGN with variance σ². Now the minimum probability of error detector decides H1 if p(x|H1)/p(x|H0) > P(H0)/P(H1) = 1 (assuming P(H0) = P(H1) = 0.5), leading to

  (2πσ²)^(−N/2) exp[ −(1/(2σ²)) Σ_{n=0}^{N−1} (x[n] − A)² ]
  ──────────────────────────────────────────────────────────  >  1
  (2πσ²)^(−N/2) exp[ −(1/(2σ²)) Σ_{n=0}^{N−1} x²[n] ]

Taking logarithms and simplifying, this reduces to the sample-mean test (1/N) Σ_{n=0}^{N−1} x[n] > A/2.
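Taking logarithms in the ratio above reduces the test to comparing the sample mean with A/2, and the resulting error probability has the standard closed form Pe = Q(√(N A² / (4σ²))). A small Monte Carlo sketch, with assumed values for A, σ² and N, checking the simulated error rate against this formula:

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(1)
A, sigma2, N = 0.5, 1.0, 20      # assumed example values
M = 100_000                      # Monte Carlo trials per hypothesis

def Q(z):
    """Gaussian right-tail probability."""
    return 0.5 * erfc(z / sqrt(2))

# equal priors: the detector decides H1 when the sample mean exceeds A/2
w = rng.normal(0.0, np.sqrt(sigma2), (M, N))
xbar_h0 = w.mean(axis=1)             # sample means under H0
xbar_h1 = (w + A).mean(axis=1)       # sample means under H1

pe_mc = 0.5 * np.mean(xbar_h0 > A / 2) + 0.5 * np.mean(xbar_h1 < A / 2)
pe_theory = Q(sqrt(N * A**2 / (4 * sigma2)))
print(pe_mc, pe_theory)
```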
For the same model, recall that the NP detector decides H1 if

  (2πσ²)^(−N/2) exp[ −(1/(2σ²)) Σ_{n=0}^{N−1} (x[n] − A)² ]
  ──────────────────────────────────────────────────────────  >  γ
  (2πσ²)^(−N/2) exp[ −(1/(2σ²)) Σ_{n=0}^{N−1} x²[n] ]

where γ is set by the desired false-alarm probability rather than by the priors.
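Since under H0 the sample mean is N(0, σ²/N), the NP threshold on the sample mean for a target P_FA is √(σ²/N) · Q⁻¹(P_FA). A sketch with assumed parameter values, checking the achieved false-alarm rate by simulation:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(3)
A, sigma2, N = 0.5, 1.0, 20      # assumed example values
PFA = 0.05                       # target false-alarm probability
M = 100_000

# under H0 the sample mean is N(0, sigma2/N), so the NP threshold is
thr = np.sqrt(sigma2 / N) * NormalDist().inv_cdf(1 - PFA)

xbar_h0 = rng.normal(0.0, np.sqrt(sigma2), (M, N)).mean(axis=1)
pfa_mc = np.mean(xbar_h0 > thr)     # achieved false-alarm rate
print(pfa_mc)
```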
Starting from

p(x|H1) / p(x|H0) > P(H0) / P(H1)

we can use Bayes' rule

p(Hi|x) = p(x|Hi) P(Hi) / p(x)

to arrive at

p(H1|x) > p(H0|x).

This is called the MAP (maximum a posteriori) detector, which, if P(H1) = P(H0), reduces again to the ML detector.
37
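The equivalence of the likelihood-ratio form and the posterior-comparison (MAP) form can be checked directly; the priors and the observation value below are hypothetical:

```python
import numpy as np

A, sigma = 1.0, 1.0
P0, P1 = 0.7, 0.3     # hypothetical priors
x = 0.9               # hypothetical observation

def gauss(x, mu):
    """N(mu, sigma^2) density."""
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

# Bayes' rule: p(Hi|x) = p(x|Hi) P(Hi) / p(x)
px = gauss(x, 0.0) * P0 + gauss(x, A) * P1
post0 = gauss(x, 0.0) * P0 / px
post1 = gauss(x, A) * P1 / px

map_h1 = post1 > post0                             # MAP form
lrt_h1 = gauss(x, A) / gauss(x, 0.0) > P0 / P1     # likelihood-ratio form
print(post0, post1, map_h1 == lrt_h1)
```

Both forms make the same decision for every x; only the bookkeeping differs.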
Another form of the minimum probability of error detector
follows directly from Bayes' rule: we decide H1
if P(H1|x) > P(H0|x).
A generalisation of the minimum Pe criterion is one where costs are assigned to each type
of error:
Let Cij be the cost if we decide Hi while Hj is true. Minimizing the expected cost we get

R = E[C] = Σ_{i=0}^{1} Σ_{j=0}^{1} Cij P(Hi|Hj) P(Hj)

If C10 > C00 and C01 > C11, the detector that minimises the Bayes risk is to decide H1 when

p(x|H1) / p(x|H0) > [ (C10 − C00) P(H0) ] / [ (C01 − C11) P(H1) ] = γ
38
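A numerical check of the Bayes-risk threshold: for the single-observation Gaussian problem the likelihood-ratio threshold γ = (C10 − C00)P(H0) / [(C01 − C11)P(H1)] translates to the decision point t = (σ²/A) ln γ + A/2 on x. The sketch below sweeps a threshold on x and confirms the risk is minimized there; the costs and priors are assumed values:

```python
import numpy as np
from math import erfc, sqrt, log

A, sigma = 1.0, 1.0
P0, P1 = 0.6, 0.4                        # assumed priors
C00, C10, C01, C11 = 0.0, 1.0, 5.0, 0.0  # assumed costs: a miss is expensive

def Q(z):
    """Gaussian right-tail probability."""
    return 0.5 * erfc(z / sqrt(2))

def risk(t):
    """Bayes risk when deciding H1 iff x > t (x ~ N(0,s^2) or N(A,s^2))."""
    return (C00 * P0 * (1 - Q(t / sigma)) + C10 * P0 * Q(t / sigma)
          + C01 * P1 * (1 - Q((t - A) / sigma)) + C11 * P1 * Q((t - A) / sigma))

ts = np.linspace(-3, 4, 7001)
t_best = ts[np.argmin([risk(t) for t in ts])]

gamma = (C10 - C00) * P0 / ((C01 - C11) * P1)   # Bayes-risk LRT threshold
t_theory = (sigma**2 / A) * log(gamma) + A / 2
print(t_best, t_theory)
```

Because missing H1 is costly (C01 large), the optimal threshold moves below A/2, biasing decisions toward H1.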
Consider we have one observation x[0] (N = 1). To minimize Pe (with P(H0) = P(H1)) we decide H1 if

x[0] > A/2,

the point halfway between the means of the two conditional PDFs, where the equally weighted likelihoods cross.
Consider we have multiple observations (N > 1): we cannot plot the multivariate PDFs and observe the regions over which each one yields the maximum; instead we need to derive a test statistic. The conditional PDF is

p(x|Hi) = (2πσ²)^(−N/2) exp[ −(1/(2σ²)) Σ_{n=0}^{N−1} (x[n] − s_i[n])² ]

where s_i[n] denotes the signal under hypothesis Hi. To maximize the likelihood we minimize the exponential term

D_i² = Σ_{n=0}^{N−1} (x[n] − s_i[n])²

Minimum Distance Receiver
We decide the hypothesis whose signal s_i is closest to the observation x, i.e. the one minimizing D_i².
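A minimal sketch of a minimum distance receiver for an M-ary problem; the signal set, dimension, and noise level below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
N, sigma = 8, 0.7                        # hypothetical dimension and noise std
# hypothetical signal set: three candidate signals s_i[n]
signals = np.array([np.zeros(N),
                    np.ones(N),
                    np.cos(2 * np.pi * np.arange(N) / N)])

def min_distance_receiver(x, signals):
    """Decide the hypothesis whose signal is closest to x."""
    d2 = np.sum((signals - x)**2, axis=1)   # D_i^2 = ||x - s_i||^2
    return int(np.argmin(d2))

# transmit signal 1 through additive white Gaussian noise and detect
x = signals[1] + rng.normal(0.0, sigma, N)
print(min_distance_receiver(x, signals))
```

With white Gaussian noise of equal variance and equal priors, this rule is exactly the ML (and hence minimum-Pe) detector.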