Classical Detection Theory
• Simplest case:
the output is one of two choices. We refer to
them as hypotheses: H0 and H1 .
• More generally:
the output might be one of M hypotheses: H0,
H1, . . . , H(M−1)
Source Mechanisms:
The source generates an output: one of the hypotheses.
Examples: a digital communication source that emits symbol 0 or symbol 1; a radar target that is either absent (H0) or present (H1).
Transition Mechanisms:
A probabilistic transition mechanism maps each hypothesis into a point R in the observation space.
Example: the source output is observed in additive noise, so the same hypothesis can produce different observations on different trials.
• Bayes criterion
Bayes Criterion:
based on 2 assumptions:
- the prior probabilities P0 and P1 are known
- a cost is assigned to each possible course of action:
C00, C10, C11, C01 (Cij is the cost of deciding Hi when Hj is true)
Each time the experiment is conducted, a certain cost is incurred.
We would like to design the decision rule so that, on average, the
cost is as small as possible.
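As a rough sketch of "smallest average cost", the snippet below evaluates the average cost of a simple threshold rule on a hypothetical single-sample Gaussian problem (r ~ N(0,1) under H0, r ~ N(1,1) under H1); the priors, costs, and threshold grid are all illustrative assumptions, not values from the text.

```python
import math

def Q(x):
    # Gaussian tail probability, Pr{N(0,1) > x}
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Hypothetical problem: r ~ N(0,1) under H0, r ~ N(1,1) under H1.
# Decision rule: say H1 when r > gamma.  All numbers below are illustrative.
P0, P1 = 0.6, 0.4                          # assumed priors
C00, C10, C11, C01 = 0.0, 1.0, 0.0, 2.0    # Cij = cost of deciding Hi when Hj is true

def average_cost(gamma):
    PF = Q(gamma)        # Pr{say H1 | H0 true}: false alarm
    PD = Q(gamma - 1.0)  # Pr{say H1 | H1 true}: detection
    # Average over both the priors and the conditional decision probabilities.
    return (P0 * (C00 * (1 - PF) + C10 * PF) +
            P1 * (C11 * PD + C01 * (1 - PD)))

# Sweep thresholds; the Bayes rule is the one with the smallest average cost.
costs = {g / 10: average_cost(g / 10) for g in range(-20, 41)}
best = min(costs, key=costs.get)
```

Sweeping the threshold and keeping the minimizer is exactly the "design the rule to minimize average cost" idea, specialized to one-parameter threshold rules.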
P0=Pr(H0 is true)
P1=Pr(H1 is true)
Note: there are two probabilities that we must average over:
the a priori probability and the probability that a particular course of action will
be taken.
The decision rule is equivalent to dividing the observation space (signal space)
into 2 regions, Z0 and Z1:
Bayes risk:
Detection rule: Minimize the Average Cost (risk)

$$\mathcal{R} = C_{00} P_0 \int_{Z_0} p(R|H_0)\,dR + C_{10} P_0 \int_{Z_1} p(R|H_0)\,dR + C_{11} P_1 \int_{Z_1} p(R|H_1)\,dR + C_{01} P_1 \int_{Z_0} p(R|H_1)\,dR$$

Observing that Z = Z0 ∪ Z1, so that

$$\int_{Z_1} p(R|H_i)\,dR = 1 - \int_{Z_0} p(R|H_i)\,dR,$$

we can write the risk as an integral over Z0 only. We get:

$$\mathcal{R} = P_0 C_{10} + P_1 C_{11} + \int_{Z_0} \big[ P_1 (C_{01} - C_{11})\, p(R|H_1) - P_0 (C_{10} - C_{00})\, p(R|H_0) \big]\,dR$$

The first two terms are a fixed cost.
“The integral represents the cost controlled by those points R that we assign to Z0.”
Finding the Z0 that minimizes the integral yields the Bayes detection rule.
In general, C10 > C00 and C01 > C11
(the cost of a wrong decision is higher than the cost of a correct decision),
so the two cost differences inside the brackets are positive.
Therefore all values of R where the second term is larger than the first should be
included in Z0 because they contribute a negative amount to the integral. Similarly,
all values of R where the first term is larger than the second should be excluded
from Z0 because they would contribute a positive amount to the integral. Values of
R where the two terms are equal have no effect on the cost and may be assigned
arbitrarily.
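The point-by-point assignment argument above can be sketched numerically: include each observation value in Z0 exactly when the bracketed integrand is negative there. The densities, priors, and costs below are illustrative assumptions (p(R|H0) = N(0,1), p(R|H1) = N(1,1), equal priors, unit error costs), under which the boundary falls at R = 0.5.

```python
import math

def gauss(r, mean):
    # N(mean, 1) density
    return math.exp(-0.5 * (r - mean) ** 2) / math.sqrt(2 * math.pi)

# Illustrative setup: p(R|H0) = N(0,1), p(R|H1) = N(1,1),
# equal priors and unit error costs (C00 = C11 = 0, C10 = C01 = 1).
P0 = P1 = 0.5
C00 = C11 = 0.0
C10 = C01 = 1.0

def integrand(r):
    # Bracketed term of the risk integral over Z0:
    # P1 (C01 - C11) p(R|H1) - P0 (C10 - C00) p(R|H0)
    return P1 * (C01 - C11) * gauss(r, 1.0) - P0 * (C10 - C00) * gauss(r, 0.0)

# Assign to Z0 every point that contributes a negative amount to the risk.
grid = [g / 100 for g in range(-300, 401)]
Z0 = [r for r in grid if integrand(r) < 0]
```

For this symmetric setup the densities cross at R = 0.5, so Z0 is (approximately) the half-line R < 0.5, matching the likelihood-ratio boundary.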
Thus the decision regions are defined by the statement:
If

$$\frac{p(R|H_1)}{p(R|H_0)} \;\gtrless\; \frac{P_0 (C_{10} - C_{00})}{P_1 (C_{01} - C_{11})}$$

assign R to Z1 (decide H1) when the left side exceeds the right, and to Z0 (decide H0) otherwise.
The quantity on the left is called the likelihood ratio and is denoted by L(R):

$$L(R) = \frac{p(R|H_1)}{p(R|H_0)}$$

The quantity on the right is the threshold of the test and is denoted by η:

$$\eta = \frac{P_0 (C_{10} - C_{00})}{P_1 (C_{01} - C_{11})}$$
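A minimal sketch of the likelihood ratio test, again assuming the hypothetical Gaussian pair p(R|H0) = N(0,1), p(R|H1) = N(1,1) and illustrative priors and costs:

```python
import math

def gauss(r, mean):
    # N(mean, 1) density; the two hypothesized means are illustrative.
    return math.exp(-0.5 * (r - mean) ** 2) / math.sqrt(2 * math.pi)

def likelihood_ratio(r):
    # L(R) = p(R|H1) / p(R|H0)
    return gauss(r, 1.0) / gauss(r, 0.0)

# Assumed priors and costs (Cij = cost of deciding Hi when Hj is true).
P0, P1 = 0.6, 0.4
C00, C10, C11, C01 = 0.0, 1.0, 0.0, 2.0

# Threshold of the test.
eta = (P0 * (C10 - C00)) / (P1 * (C01 - C11))

def decide(r):
    return "H1" if likelihood_ratio(r) > eta else "H0"
```

For this Gaussian pair L(R) = exp(R − 1/2), so the test is equivalent to comparing R against 1/2 + ln η, i.e. the whole rule collapses to a single scalar threshold.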
Neyman-Pearson Tests:
When priors and costs are difficult to assign, fix the false-alarm probability at an acceptable level, PF = α, and choose the decision rule that maximizes PD subject to that constraint; this again leads to a likelihood ratio test, with the threshold set by the constraint rather than by priors and costs.
- PF is the probability of a false alarm (i.e., we say the target is present when it is not)
- PM is the probability of a miss (we say the target is absent when it is present)
- PD is the probability of detection (i.e., we say the target is present when it is); PD = 1 − PM
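A sketch of Neyman-Pearson threshold setting for a Gaussian mean-shift problem, assuming the test statistic has been normalized to N(0,1) under H0 and N(d,1) under H1; the values of α and d are illustrative.

```python
import math

def Q(x):
    # Gaussian tail probability, Pr{N(0,1) > x}
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def Qinv(alpha, lo=-10.0, hi=10.0):
    # Invert the Gaussian tail probability by bisection (Q is decreasing).
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if Q(mid) > alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative design: allowed false-alarm rate alpha, detection index d.
alpha = 0.01
d = 3.0
gamma = Qinv(alpha)   # threshold chosen so that PF = alpha
PF = Q(gamma)         # achieved false-alarm probability
PD = Q(gamma - d)     # resulting probability of detection
```

The threshold is fixed entirely by the false-alarm constraint; the detection index d then determines how much PD that constraint buys.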
Bayes detection (Gaussian example):
With N independent observations, r_i = m + n_i under H1 and r_i = n_i under H0, where n_i ~ N(0, σ²), the likelihood ratio test reduces to comparing the sufficient statistic l = Σ r_i to a threshold λ:

$$l \;\gtrless\; \lambda = \frac{\sigma^2}{m}\ln\eta + \frac{Nm}{2}$$

$$P_F = \int_{\lambda}^{\infty} p(l|H_0)\,dl, \qquad P_D = \int_{\lambda}^{\infty} p(l|H_1)\,dl$$

The performance depends only on the detection index d, where d² = Nm²/σ².
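A Monte Carlo check of PF and PD for an N-sample Gaussian mean-shift problem (H1: r_i = m + n_i, H0: r_i = n_i, n_i ~ N(0, σ²)); N, m, σ, and η below are illustrative numbers, and the analytic comparison uses the standard closed forms in terms of the detection index d.

```python
import math
import random

random.seed(0)

# Illustrative N-sample Gaussian problem.
N, m, sigma = 8, 0.5, 1.0
eta = 1.0  # e.g. equal priors and unit error costs
# LRT reduces to sum(r_i) > lam:
lam = (sigma**2 / m) * math.log(eta) + N * m / 2.0

def trial(h1):
    # One experiment: draw N samples under the chosen hypothesis, apply the test.
    l = sum((m if h1 else 0.0) + random.gauss(0.0, sigma) for _ in range(N))
    return l > lam

trials = 50_000
PF = sum(trial(False) for _ in range(trials)) / trials  # false-alarm estimate
PD = sum(trial(True) for _ in range(trials)) / trials   # detection estimate

def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2.0))

d = math.sqrt(N) * m / sigma  # detection index, d^2 = N m^2 / sigma^2
# Analytic values: PF = Q(ln(eta)/d + d/2), PD = Q(ln(eta)/d - d/2).
```

The simulated rates should agree with the closed-form Q-function expressions to within Monte Carlo error.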
When d increases:
- SNR increases
- better performance
When the threshold λ increases:
- PD decreases
- PF decreases
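Both trade-offs can be sketched by sweeping the threshold and plotting the resulting (PF, PD) operating points, i.e. the ROC. This assumes the Gaussian closed forms PF = Q(ln η/d + d/2) and PD = Q(ln η/d − d/2); the threshold grid and d values are illustrative.

```python
import math

def Q(x):
    # Gaussian tail probability, Pr{N(0,1) > x}
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def roc_point(ln_eta, d):
    # Operating point of the Gaussian likelihood-ratio test:
    # (PF, PD) = (Q(ln_eta/d + d/2), Q(ln_eta/d - d/2))
    return Q(ln_eta / d + d / 2), Q(ln_eta / d - d / 2)

# Sweep ln(eta) for two illustrative detection indices.
thresholds = [t / 2 for t in range(-10, 11)]
roc_d1 = [roc_point(t, 1.0) for t in thresholds]
roc_d2 = [roc_point(t, 2.0) for t in thresholds]
```

Along either curve, raising the threshold lowers PF and PD together; at the same threshold, the larger d gives a point with both lower PF and higher PD, i.e. uniformly better performance.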