
Classical Detection Theory

• Simplest case: the output is one of two choices. We refer to them as hypotheses: H0 and H1.

• More generally: the output might be one of M hypotheses: H0, H1, . . . , H(M−1).
Source Mechanisms and Transition Mechanisms:

The source generates one of the hypotheses; a probabilistic transition mechanism then maps that hypothesis to a point in the observation space, with known conditional densities p(R|Hi).

[Figures: examples of source and transition mechanisms]

Example:

Suppose we observe R = −2. Do we choose H0 or H1?

Answer: H0, since p(−2|H0) = 0.25 while p(−2|H1) = 0.
This is detection using the maximum-likelihood criterion: choose the hypothesis under which the observed value is more likely.

If R = 0, H0 and H1 are equally likely, and either choice may be made (see the sketch below).
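A minimal sketch of this rule in Python, assuming illustrative discrete likelihood tables (the slide only states p(−2|H0) = 0.25 and p(−2|H1) = 0; the other values below are placeholders):

```python
# Maximum-likelihood decision for a discrete observation.
# Only p(-2|H0) = 0.25 and p(-2|H1) = 0 are given in the text;
# the remaining table entries are illustrative placeholders.

p_H0 = {-2: 0.25, -1: 0.25, 0: 0.25, 1: 0.25}   # p(R | H0)
p_H1 = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}     # p(R | H1)

def ml_decide(r):
    """Pick the hypothesis with the larger likelihood p(r | Hi)."""
    l0 = p_H0.get(r, 0.0)
    l1 = p_H1.get(r, 0.0)
    if l0 > l1:
        return "H0"
    if l1 > l0:
        return "H1"
    return "H0 and H1 equally likely"   # tie, e.g. r = 0 here

print(ml_decide(-2))  # -> H0, since p(-2|H0) = 0.25 > p(-2|H1) = 0
print(ml_decide(0))   # -> tie: both likelihoods equal
```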

Decision Rule: After observing the outcome in the observation space, we guess which hypothesis was true. To accomplish this, we develop a decision rule that assigns each point of the space to one of the hypotheses.

Classical Decision Criteria:

• Bayes criterion

• Neyman-Pearson criterion


Binary hypothesis problem:

There are 4 possible actions, each with an assigned cost:

1. H0 true; choose H0: cost C00 (true negative, TN)
2. H0 true; choose H1: cost C10 (false positive, FP)
3. H1 true; choose H1: cost C11 (true positive, TP)
4. H1 true; choose H0: cost C01 (false negative, FN)

Bayes Criterion:
Based on 2 assumptions:
- prior probabilities P0 and P1 are known;
- a cost is assigned to each possible course of action: C00, C10, C11, C01.
Each time the experiment is conducted, a certain cost is incurred. We would like to design the decision rule so that, on the average, the cost is as small as possible.

Expected Value of the Cost, or Risk:

Using $P(A,B) = P(A)\,P(B \mid A)$, with

$P_0 = \Pr(H_0 \text{ is true})$, $P_1 = \Pr(H_1 \text{ is true})$,

the risk is

$$\mathcal{R} = \sum_{i=0}^{1}\sum_{j=0}^{1} C_{ij}\, P_j\, \Pr(\text{say } H_i \mid H_j \text{ true}).$$

Note: there are two probabilities that we must average over: the a priori probability and the probability that a particular course of action will be taken. A numeric sketch of this average appears below.
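A minimal sketch in Python, with hypothetical costs, priors, and conditional decision probabilities (none of the numbers come from the slides):

```python
# Average cost (Bayes risk): average over both the priors P(Hj) and
# the conditional probabilities P(choose Hi | Hj true).
# All numerical values are illustrative, not from the slides.

P0, P1 = 0.5, 0.5                    # priors P(H0), P(H1)
C = {(0, 0): 0.0, (1, 0): 1.0,       # C_ij = cost of choosing Hi when Hj true
     (1, 1): 0.0, (0, 1): 1.0}

# P(choose Hi | Hj true) for some decision rule (illustrative values)
P_choose = {(0, 0): 0.9, (1, 0): 0.1,
            (1, 1): 0.8, (0, 1): 0.2}

prior = {0: P0, 1: P1}
risk = sum(C[i, j] * prior[j] * P_choose[i, j]
           for i in (0, 1) for j in (0, 1))
print(f"Bayes risk = {risk}")        # 0.5*0.1 + 0.5*0.2 = 0.15
```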
The decision rule is equivalent to dividing the observation space (signal space) Z into 2 regions, Z0 and Z1:

Bayes detection rule: minimize the average cost.

Determine the decision regions Z0 and Z1 in an optimal way:

If the received signal r ∈ Z0 → decide H0.

If the received signal r ∈ Z1 → decide H1.


We can now write the expression for the risk in terms of the transition probabilities and the decision regions:

$$\mathcal{R} = C_{00}P_0 \int_{Z_0} p(R\mid H_0)\,dR + C_{10}P_0 \int_{Z_1} p(R\mid H_0)\,dR + C_{11}P_1 \int_{Z_1} p(R\mid H_1)\,dR + C_{01}P_1 \int_{Z_0} p(R\mid H_1)\,dR$$
Observing that $Z = Z_0 \cup Z_1$, we can write $\int_{Z_1} = \int_{Z} - \int_{Z_0}$, and given that

$$\int_{Z} p(R\mid H_0)\,dR = \int_{Z} p(R\mid H_1)\,dR = 1,$$

we get:

$$\mathcal{R} = \underbrace{P_0 C_{10} + P_1 C_{11}}_{\text{fixed cost}} + \int_{Z_0} \Big\{ P_1 (C_{01} - C_{11})\, p(R\mid H_1) - P_0 (C_{10} - C_{00})\, p(R\mid H_0) \Big\}\, dR$$

The integral represents the cost controlled by those points R that we assign to Z0.
Finding the Z0 that minimizes the integral yields the Bayes detection rule.

In general, $C_{10} > C_{00}$ and $C_{01} > C_{11}$
(the cost of a wrong decision is higher than the cost of a correct decision),
so both terms inside the brackets of the integrand are positive.

Therefore all values of R where the second term is larger than the first should be
included in Z0 because they contribute a negative amount to the integral. Similarly,
all values of R where the first term is larger than the second should be excluded
from Z0 because they would contribute a positive amount to the integral. Values of
R where the two terms are equal have no effect on the cost and may be assigned
arbitrarily.
Thus the decision regions are defined by the following statement. If

$$P_1 (C_{01} - C_{11})\, p(R\mid H_1) > P_0 (C_{10} - C_{00})\, p(R\mid H_0),$$

assign R to Z1 and consequently say that H1 is true. Otherwise, assign R to Z0 and say that H0 is true.

Alternatively, we may write:

$$\frac{p(R\mid H_1)}{p(R\mid H_0)} \;\underset{H_0}{\overset{H_1}{\gtrless}}\; \underbrace{\frac{P_0 (C_{10} - C_{00})}{P_1 (C_{01} - C_{11})}}_{\text{Threshold}}$$
The quantity on the left is called the likelihood ratio and is denoted by Λ(R):

$$\Lambda(R) \triangleq \frac{p(R\mid H_1)}{p(R\mid H_0)}$$

Because it is the ratio of two functions of a random variable, it is itself a random variable. Note that regardless of the dimensionality of R, Λ(R) is a one-dimensional variable.

The quantity on the right is the threshold of the test and is denoted by η:

$$\eta \triangleq \frac{P_0 (C_{10} - C_{00})}{P_1 (C_{01} - C_{11})}$$

Thus the Bayes criterion leads us to a likelihood ratio test (LRT):

$$\Lambda(R) \;\underset{H_0}{\overset{H_1}{\gtrless}}\; \eta$$

If C00 = C11 = 0 and C10 = C01, the threshold reduces to η = P0/P1; if in addition P0 = P1, then η = 1 and the LRT becomes the maximum-likelihood criterion of the previous example. A hedged sketch of the Bayes-threshold LRT appears below.
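A minimal sketch, assuming two unit-variance Gaussian densities with means 0 and m; the densities, costs, and priors here are illustrative, not taken from the slides:

```python
from scipy.stats import norm

# Likelihood ratio test with the Bayes threshold. The two hypothesized
# densities (unit-variance Gaussians with means 0 and m), the costs, and
# the priors are illustrative assumptions.
m, sigma = 1.0, 1.0
P0, P1 = 0.6, 0.4
C00, C11, C10, C01 = 0.0, 0.0, 1.0, 1.0

eta = (P0 * (C10 - C00)) / (P1 * (C01 - C11))   # Bayes threshold

def lrt(r):
    """Compare Lambda(r) = p(r|H1)/p(r|H0) against eta."""
    lam = norm.pdf(r, loc=m, scale=sigma) / norm.pdf(r, loc=0.0, scale=sigma)
    return "H1" if lam > eta else "H0"

print(f"eta = {eta}")        # 1.5 for these numbers
print(lrt(0.2), lrt(1.4))    # -> H0 H1
```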
• All the data processing is involved in computing Λ(R) and
is not affected by a priori probabilities or cost assignments. This invariance
of the data processing is of considerable practical importance.

• Because the natural logarithm is a monotonic function, and both sides are positive, an equivalent test is:

$$\ln \Lambda(R) \;\underset{H_0}{\overset{H_1}{\gtrless}}\; \ln \eta$$

[Figure: block diagram of the likelihood ratio processor]


This is an example of a sufficient statistic, which we denote by ℓ(R).
It is simply a function of the received data.
In other words, when making a decision, knowing the value of the sufficient statistic ℓ(R) is just as good as knowing R itself (see the sketch below).
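For instance, in the Gaussian case sketched below (N i.i.d. samples with mean 0 under H0 and mean m under H1; parameters illustrative), the log-likelihood ratio depends on the data only through their sum, which is therefore a sufficient statistic:

```python
import numpy as np

# Sufficient-statistic sketch for N i.i.d. Gaussian samples with mean 0
# under H0 and mean m under H1 (common variance sigma^2). The data enter
# the log-likelihood ratio only through sum(r):
#   ln Lambda(R) = (m / sigma^2) * sum(r) - N * m^2 / (2 * sigma^2)
# Parameters are illustrative.
m, sigma, N = 1.0, 1.0, 10

def log_lrt(r):
    s = np.sum(r)                    # l(R): all the test needs from the data
    return (m / sigma**2) * s - N * m**2 / (2 * sigma**2)

rng = np.random.default_rng(0)
r = m + sigma * rng.standard_normal(N)   # a draw under H1
print(log_lrt(r))                        # compare against ln(eta)
```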
Neyman-Pearson Criterion:

In many physical situations it is difficult to assign realistic costs or a priori probabilities. A simple procedure to bypass this difficulty is to work with the conditional probabilities PF and PD:

- PF is the probability of a false alarm (i.e., we say the target is present when it is not);
- PM is the probability of a miss (we say the target is absent when it is present);
- PD is the probability of detection (i.e., we say the target is present when it is); PD = 1 − PM.

In general, we would like to make PF as small as possible and PD as large as possible. But for most problems of practical importance, these are conflicting objectives.

An obvious criterion is therefore to constrain one of the probabilities and maximize (or minimize) the other.

A specific statement of this criterion is the following: constrain PF ≤ α and maximize PD (equivalently, minimize PM) subject to this constraint. The resulting test is again a likelihood ratio test,

$$\Lambda(R) \;\underset{H_0}{\overset{H_1}{\gtrless}}\; \eta,$$

where the threshold η is found by solving the constraint equation

$$P_F = \int_{\eta}^{\infty} p(\Lambda \mid H_0)\, d\Lambda = \alpha.$$

A hedged design sketch appears below.
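A minimal Neyman-Pearson design sketch, assuming the test statistic is N(0,1) under H0 and N(d,1) under H1 (the Gaussian model, α, and d are assumptions for illustration):

```python
from scipy.stats import norm

# Neyman-Pearson design sketch: fix PF = alpha and read the threshold off
# the H0 distribution of the test statistic. The statistic is assumed to be
# N(0,1) under H0 and N(d,1) under H1; alpha and d are illustrative.
alpha = 0.01
d = 2.0

gamma = norm.isf(alpha)       # solves P(l > gamma | H0) = alpha
PD = norm.sf(gamma - d)       # detection probability achieved at that threshold
print(f"threshold={gamma:.3f}, PD={PD:.3f}")
```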
Performance: Receiver Operating Characteristic (ROC)

ROC curve: plot of PD versus PF as the discrimination threshold is varied

For a Neyman-Pearson test, the values of PF and PD completely specify the test performance.

For Bayes detection, PF and PD can be calculated.

Likelihood ratio test: Λ(R) ≷ η.

If the threshold η increases, this is equivalent to shrinking Z1, the region where we say H1; thus PD decreases (as does PF). Sweeping η over all values traces out the ROC, as in the sketch below.
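A sketch of this threshold sweep for an assumed Gaussian pair (statistic N(0,1) under H0, N(d,1) under H1; d illustrative), tabulating PD against PF:

```python
import numpy as np
from scipy.stats import norm

# ROC sketch: sweep the threshold gamma and tabulate (PF, PD) for a test
# statistic that is N(0,1) under H0 and N(d,1) under H1 (d is illustrative).
d = 1.0
gammas = np.linspace(-3.0, 5.0, 9)
PF = norm.sf(gammas)          # P(l > gamma | H0)
PD = norm.sf(gammas - d)      # P(l > gamma | H1)

# As gamma grows, both PF and PD fall: the ROC runs from (1, 1) to (0, 0).
# A larger d bends the curve toward the upper-left corner (better performance).
for g, pf, pd in zip(gammas, PF, PD):
    print(f"gamma={g:+.1f}: PF={pf:.3f}, PD={pd:.3f}")
```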
Example 1: (Gaussian variables with unequal means)

Under H0 the N independent observations Ri are zero-mean Gaussian with variance σ²; under H1 they are Gaussian with mean m and the same variance.

Bayes detection: the LRT reduces to comparing the sample mean ℓ = (1/N) Σ Ri against a threshold λ:

$$\ell \;\underset{H_0}{\overset{H_1}{\gtrless}}\; \lambda \triangleq \frac{\sigma^2}{Nm}\ln\eta + \frac{m}{2}$$

$$P_F = \int_{\lambda}^{\infty} p(\ell \mid H_0)\, d\ell \qquad P_D = \int_{\lambda}^{\infty} p(\ell \mid H_1)\, d\ell$$

with detection index $d \triangleq \sqrt{N}\, m / \sigma$ (a signal-to-noise-ratio measure).
When d increases: the SNR increases and performance improves.

When the threshold increases: PD decreases and PF decreases.
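A hedged Monte Carlo check of these expressions; all parameters (m, σ, N, η) below are assumptions for illustration, not slide values:

```python
import numpy as np
from scipy.stats import norm

# Monte Carlo check of Example 1. All parameters (m, sigma, N, eta)
# are illustrative assumptions, not values from the slides.
m, sigma, N, eta = 1.0, 2.0, 16, 1.0
d = np.sqrt(N) * m / sigma                          # detection index
lam = (sigma**2 / (N * m)) * np.log(eta) + m / 2    # threshold on the sample mean
print(f"d = {d}")

rng = np.random.default_rng(1)
trials = 100_000
r0 = sigma * rng.standard_normal((trials, N))       # data under H0
r1 = m + sigma * rng.standard_normal((trials, N))   # data under H1
PF_mc = np.mean(r0.mean(axis=1) > lam)              # empirical false-alarm rate
PD_mc = np.mean(r1.mean(axis=1) > lam)              # empirical detection rate

# Theory: the sample mean is N(0, sigma^2/N) under H0, N(m, sigma^2/N) under H1.
scale = sigma / np.sqrt(N)
print(f"PF: MC={PF_mc:.4f}  theory={norm.sf(lam / scale):.4f}")
print(f"PD: MC={PD_mc:.4f}  theory={norm.sf((lam - m) / scale):.4f}")
```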

[Figure: receiver operating characteristic for Example 1]
