
2 QUIZZES: 10%
OFFICE HOURS: 5-6:30

E1244 DETECTION & ESTIMATION THEORY

PROBLEM:

    TRANSMITTER --> CHANNEL --> RECEIVER
        b             s            b̂

    b = bit ∈ {0, 1}   (bits in general)
    s = symbol ∈ {-2, +2}

* Bits are equally likely: Pr(b=0) = Pr(b=1) = 0.5

Bit to symbol mapping:

    bit     symbol
     0  ->    -2
     1  ->    +2

The symbols are sent over the channel.

GOAL: Receiver wants to reconstruct the transmitted bit.

# CASE 1: CHANNEL IS PERFECT:  y = s

Now, if y = -2, then b̂ = 0    (no confusion, since s only takes values in {-2, +2})
     if y = +2, then b̂ = 1

as the channel is perfect.
# CASE 2: CHANNEL ADDS GAUSSIAN NOISE:  y = s + w

    w ~ N(0, σ²)
=> y can take any real value in such a case.

HOW TO IDENTIFY THE BIT IN SUCH A CASE??

-> If b = 0  =>  s = -2,  y ~ N(-2, σ²)
   If b = 1  =>  s = +2,  y ~ N(+2, σ²)

Note: f0(y) = pdf of y given b = 0;  f1(y) = pdf of y given b = 1.

If y = 2.5, then what should b̂ be?  => b̂ = 1.
In such a case, what should the decision rule be?

RULE 1:   if y < 0  =>  b̂ = 0
          if y > 0  =>  b̂ = 1

Equivalently,  f1(y)/f0(y) ≷ 1   (decide b̂ = 1 if >, b̂ = 0 if <).
This is called a likelihood ratio test, and f1(y)/f0(y) is the likelihood ratio.

=> CAN WE FIND A BETTER RULE THAN THIS??
=> HOW TO DECIDE WHETHER A RULE IS BETTER??
   -> ERROR PROBABILITY gives a way.

=> Error probability Pr(e):

    Pr(e) = Pr(b=0)·Pr(e | b=0) + Pr(b=1)·Pr(e | b=1)

For Rule 1:

    Pr(e) = 0.5·∫_{0}^{∞} f0(y) dy + 0.5·∫_{-∞}^{0} f1(y) dy
          = 0.5·Q(2/σ) + 0.5·Q(2/σ) = Q(2/σ),

    where Q(x) = Pr(N(0,1) > x).

=> What does Pr(e) become as σ becomes very large?
   The two curves f0 and f1 become nearly equal, with the right curve (f1) still
   above the left curve (f0) for y > 0, and Pr(e) approaches 0.5.
Another decision rule (RULE 2):

    b̂ = 0 if y < -10
    b̂ = 1 if y ≥ -10

# NOW, WHICH RULE IS BETTER?
# WHY IS RULE 1 BETTER THAN RULE 2?

-> Intuitively, Rule 2 declares b̂ = 1 in the region (-10, 0), where in fact the
   likelihood of b = 0 is more.

   Hence Rule 2 is not optimal.
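A quick numeric sketch of this comparison (not from the notes; sigma = 1 and the shifted threshold are illustrative choices). Writing the error probability of a generic threshold rule with the Q-function shows the threshold at 0 is far better than a shifted one when the bits are equally likely:

```python
# Sketch: error probability of the rule "decide b = 1 if y > t"
# for symbols -2 (bit 0) and +2 (bit 1) in N(0, sigma^2) noise, equal priors.
from scipy.stats import norm

def Q(x):
    """Gaussian tail probability Q(x) = Pr(N(0,1) > x)."""
    return norm.sf(x)

def pe(t, sigma=1.0):
    """Pr(error): bit 0 errs if y > t (y ~ N(-2, s^2)); bit 1 errs if y <= t (y ~ N(+2, s^2))."""
    return 0.5 * Q((t + 2) / sigma) + 0.5 * Q((2 - t) / sigma)

print("Rule 1, t =   0:", pe(0.0))     # = Q(2/sigma) ~ 0.023 for sigma = 1
print("Rule 2, t = -10:", pe(-10.0))   # ~ 0.5, much worse
```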

=>
Now, what if the input bits are not equally likely?

    Pr(b=0) = 0.25
    Pr(b=1) = 0.75

-> Previously the threshold was zero.
-> Now the threshold should decrease, since bit 1 is more likely.

** Now, if an error in bit zero is not as important as an error in bit 1,
   then what should the threshold be?
   · The threshold should decrease.

=> Moving on, we denote the received measurement by y.

# DETECTION THEORY (HYPOTHESIS TESTING)
-> You are given a set of possibilities (states of nature), out of which only
   1 is correct.

GOAL: Decide / select one choice based on a set of measurements & some prior
knowledge.

-> The measurements are random.

EXAMPLES:
· Digital communication - deciding on which bit is sent.
· Finding out whether an aircraft is present or not.
· Voice or face recognition.

# ESTIMATION THEORY
· Determine the value of unknown parameters based on a set of measurements
  (with some prior knowledge).
· Measurements are random.
· There are no finite choices.

EXAMPLES:
· Estimate the target distance.
· Estimate the frequency of a signal.

EX: Estimation of a DC signal in noise.

    y = a + w      (only one measurement is available)

    a is an unknown constant
    w ~ N(0, σ²)

GOAL: Compute an estimate of a based on y.   â = y?  WHY??
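A tiny sketch of this estimation problem (a_true and sigma below are arbitrary illustrative values): with a single measurement, the natural guess â = y is unbiased and its error is exactly the noise w, so the estimation error has standard deviation σ.

```python
# Sketch: single-measurement estimation of a DC level a in Gaussian noise.
import numpy as np

rng = np.random.default_rng(0)
a_true, sigma, n_trials = 3.0, 1.0, 100_000

y = a_true + sigma * rng.standard_normal(n_trials)   # y = a + w, w ~ N(0, sigma^2)
a_hat = y                                             # estimate: a_hat = y

print("mean of a_hat :", a_hat.mean())                # ~ a_true (unbiased)
print("std  of error :", (a_hat - a_true).std())      # ~ sigma
```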
05 JAN, 2023
# BINARY DETECTION [HYPOTHESIS TESTING]

Let X = {0, 1} be the possible states of nature:

    State 0 / H0 -> null hypothesis
    State 1 / H1 -> alternate hypothesis.

Let y = [y1 ... yM]^T denote the random measurements.

Let Γ denote the space in which the measurements y lie,
i.e., y ∈ Γ; typically Γ ⊆ R^M.

· The distribution of y depends on the state of nature. In general, these
  distributions are different.

· Let f(y|H0) denote the pdf/PMF of y given that H0 is true.
  Let f(y|H1) denote the pdf/PMF of y given that H1 is true.
  Both should be different to distinguish b/w the two hypotheses.

GOAL: We need to determine the decision rule δ: Γ -> X that maps each element
of Γ to an element of X. In other words, divide Γ into two regions Z0 & Z1
(Z0 ∪ Z1 = Γ) such that

    δ(y) = H0  if y ∈ Z0
    δ(y) = H1  if y ∈ Z1

{X, Γ, f(y|H0), f(y|H1), Pr(H0), Pr(H1)}  -> STATISTICAL MODEL GIVEN TO US.

To find: Decision rule δ(·)  <->  Regions Z0 & Z1.
EX: For the earlier Gaussian example:

    f(y|H0) = f0(y)  (bit b = 0),    f(y|H1) = f1(y)  (bit b = 1)

    Γ = R,  with Z0 = {all -ve reals} and Z1 = {all +ve reals}.

EX: Binary erasure channel.

    X = {0, 1},   Γ = {0, 1, e}

    TX              RX
    0 --(1-p)-->  0
       \--( p)-->  e
    1 --(1-p)-->  1
       \--( p)-->  e

    f(y|H0) = { 1-p  if y = 0          f(y|H1) = { 1-p  if y = 1
              {  p   if y = e                    {  p   if y = e

HOW TO DEFINE Z0 & Z1?
    e.g.  Z0 = {0, e} & Z1 = {1},   or   Z0 = {0} & Z1 = {1, e}.

* Properties of Z0 & Z1: HOW TO FIND THE OPTIMUM DECISION RULE?

ERRORS IN HYPOTHESIS TESTING:

-> Pr[Declare H1 | H0 is true]  ->  Type I error (prob of false alarm)
-> Pr[Declare H0 | H1 is true]  ->  Type II error (prob of missed detection)
   NOT POSSIBLE TO MINIMISE BOTH AT A TIME.
-> Pr[Declare H1 | H1 is true]  ->  probability of detection
                                 =  1 - probability of missed detection.

A-PRIORI PROB <-> THRESHOLD:
    if threshold -> 0 :  always choose H1  ->  (Type I, Type II) = (1, 0)
    if threshold -> ∞ :  always choose H0  ->  (Type I, Type II) = (0, 1)
    (increasing the threshold: Type I error ↓, Type II error ↑)

EXAMPLE:
· WHICH ERROR TO REDUCE? TYPE I OR TYPE II?

Note: Both cannot be reduced simultaneously. There is always a trade-off if you
reduce one over the other.

SOLN: We can try to minimize the total error, provided that the a-priori
probabilities are known, i.e.

    Pr(e) = Pr(H0)·Pr(Ĥ=H1 | H0) + Pr(H1)·Pr(Ĥ=H0 | H1)

-> In some situations, you may want to give more preference to a particular
   type of error: use appropriate weights instead of the a-priori probabilities,

    Pr(e) = c1·Pr(Ĥ=H1 | H0) + c2·Pr(Ĥ=H0 | H1)
                 (Type I)          (Type II)

   [MOTIVATION: DISEASE-DETECTION EXAMPLE]
Three different approaches for detection:

1) Bayesian Approach
   - Hypothesis is considered as a R.V.
   - A-priori probabilities are known.
   - Assign preferences/weights to the different errors.

2) Min-Max Approach
   - Hypothesis is considered as a R.V.
   - A-priori probabilities are NOT known.
   - Assign preferences/weights to the different errors.

3) Neyman-Pearson Approach
   - Hypothesis itself is not considered random.
   - No preferences/weights.
   - f(y|H0) & f(y|H1) are the given (conditional) densities.

# BAYESIAN APPROACH    (a-priori probabilities are known)

· Let H be the random hypothesis:   H = H0 with prob P0
                                    H = H1 with prob P1
· Let Ĥ = δ(y) be the declared hypothesis.
· Let Cij be the cost / penalty associated with the event
  {Ĥ = Hi, H = Hj},  i, j ∈ {0, 1}.
  The costs {C00, C01, C10, C11} are given; in general C10 > C00 and C01 > C11.

Expected cost:

    J = C00·Pr(Ĥ=H0, H=H0) + C01·Pr(Ĥ=H0, H=H1)
      + C10·Pr(Ĥ=H1, H=H0) + C11·Pr(Ĥ=H1, H=H1)

NOW, HOW TO SOLVE THE DETECTION PROBLEM?

Goal is to determine the decision rule (Z0, Z1) that minimises the expected cost.
· To find: Decision Rule <-> Decision Regions.

    Z0 ∪ Z1 = Γ,      δ(y) = { H0  if y ∈ Z0
                              { H1  if y ∈ Z1

10 JAN, 2023

Expected cost:

    J = C00·Pr[Ĥ=H0, H=H0] + C01·Pr[Ĥ=H0, H=H1]
      + C10·Pr[Ĥ=H1, H=H0] + C11·Pr[Ĥ=H1, H=H1]

Need to minimise this expected cost, with Z0 & Z1 as the variables.
∴ By conditioning, we need to bring Z0 & Z1 into the expected cost:

    J = C00·Pr(H0)·Pr(Ĥ=H0|H=H0) + C01·Pr(H1)·Pr(Ĥ=H0|H=H1)
      + C10·Pr(H0)·Pr(Ĥ=H1|H=H0) + C11·Pr(H1)·Pr(Ĥ=H1|H=H1)

Need to bring in the regions Z0 & Z1 through the conditional densities:

    J = C00·Pr(H0)·∫_{Z0} f(y|H0) dy + C01·Pr(H1)·∫_{Z0} f(y|H1) dy
      + C10·Pr(H0)·∫_{Z1} f(y|H0) dy + C11·Pr(H1)·∫_{Z1} f(y|H1) dy

Using ∫_{Z0} f(y|Hj) dy = 1 - ∫_{Z1} f(y|Hj) dy:

    J = C00·Pr(H0) + C01·Pr(H1) + ∫_{Z1} g(y) dy,

    where g(y) = Pr(H0)(C10 - C00) f(y|H0) - Pr(H1)(C01 - C11) f(y|H1).

(C10 - C00) and (C01 - C11) are always +ve, because we need to penalise a wrong
decision more than a correct decision; so the first term of g(y) is +ve and the
second term is -ve.

∴ Z1 is chosen in such a way that ∫_{Z1} g(y) dy is minimum,
i.e., choose Z1 as the region where g(y) is negative (and only there).

∴ Decide Ĥ = H1 if g(y) < 0
   Decide Ĥ = H0 if g(y) > 0

Decide Ĥ = H1 if

    Pr(H0)(C10 - C00) f(y|H0) - Pr(H1)(C01 - C11) f(y|H1) < 0

=>  f(y|H1)/f(y|H0)  ≷  Pr(H0)(C10 - C00) / [Pr(H1)(C01 - C11)]  =  τ   (threshold;
                                                                   decide H1 if >)

    L(y) = f(y|H1)/f(y|H0)   -- Likelihood Ratio

    L(y) ≷ τ  is the optimal test that minimises the Bayes risk.

    -> LIKELIHOOD RATIO TEST.
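A minimal sketch of this test in code (function and variable names are my own; the densities passed in stand for whatever f(y|H0), f(y|H1) are in a given problem):

```python
# Sketch: Bayes likelihood ratio test.  Decide H1 iff L(y) = f1(y)/f0(y) >= tau,
# with tau = P0*(C10 - C00) / (P1*(C01 - C11)).  Names here are illustrative.
from scipy.stats import norm

def bayes_lrt(y, f0, f1, P0, P1, C00=0.0, C01=1.0, C10=1.0, C11=0.0):
    """Return 1 if H1 is declared for measurement y, else 0."""
    tau = P0 * (C10 - C00) / (P1 * (C01 - C11))
    L = f1(y) / f0(y)                      # likelihood ratio
    return int(L >= tau)

# Example use: a DC level a in Gaussian noise (the example that follows),
# H0: y ~ N(0, sigma^2), H1: y ~ N(a, sigma^2), with illustrative a, sigma.
a, sigma = 2.0, 1.0
f0 = lambda y: norm.pdf(y, loc=0.0, scale=sigma)
f1 = lambda y: norm.pdf(y, loc=a,   scale=sigma)
print(bayes_lrt(1.3, f0, f1, P0=0.5, P1=0.5))   # 1: since 1.3 > a/2 = 1, declare H1
```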

Here, L(y) is a sufficient statistic for this problem.
  -> Why? L(y) alone is sufficient to carry out the test; we do not need y itself
     once the ratio is known.

# If a > b and h(·) is a monotonically increasing function, then
      a > b  <=>  h(a) > h(b),    e.g.  a > b  <=>  log a > log b.

∴ Let l(y) = h(L(y)), where h(·) is a monotonically increasing function. Then

      l(y) ≷ h(τ)   (decide H1 if >, H0 if <)

  is equivalent to  L(y) ≷ τ   (it preserves the inequality).

  If h(·) = log(·), then log L(y) is called the log-likelihood ratio.
EXAMPLE: DC signal detection in noise.

    H1: y = a + w;    H0: y = w;    a is known
    w ~ N(0, σ²)

    P0 = Pr(H0) = 0.5,   P1 = Pr(H1) = 0.5
    C00 = C11 = 0,   C10 = C01 = 1

    f(y|H0) = (1/√(2πσ²)) exp(-y²/(2σ²))
    f(y|H1) = (1/√(2πσ²)) exp(-(y-a)²/(2σ²))

Likelihood Ratio:

    L(y) = f(y|H1)/f(y|H0) = exp[(2ya - a²)/(2σ²)]  ≷  τ = 1   (decide H1 if >)

    ln L(y) = (2ya - a²)/(2σ²)  ≷  ln(1) = 0

    =>  y  ≷  a/2    <- THRESHOLD.   This is the optimal (minimum-error) rule.

=> With this rule, what would be the expected cost?

    J = Pr(H0)·Pr(Ĥ=H1 | H0) + Pr(H1)·Pr(Ĥ=H0 | H1)
      = 0.5·∫_{a/2}^{∞} f(y|H0) dy + 0.5·∫_{-∞}^{a/2} f(y|H1) dy
      = Q(a/(2σ)).
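A quick Monte Carlo sketch (a and sigma are illustrative values) to check this expected-cost expression:

```python
# Sketch: check that the rule "decide H1 iff y > a/2" has error probability
# Q(a/(2*sigma)) when P0 = P1 = 0.5 and the costs are 0-1.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
a, sigma, n = 2.0, 1.0, 500_000

H = rng.integers(0, 2, size=n)                    # true hypothesis, P0 = P1 = 0.5
y = H * a + sigma * rng.standard_normal(n)        # y = w under H0, y = a + w under H1
H_hat = (y > a / 2).astype(int)                   # LRT with threshold a/2

print("simulated Pr(error):", np.mean(H_hat != H))
print("Q(a/(2*sigma))     :", norm.sf(a / (2 * sigma)))
```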

EXAMPLE: BINARY ERASURE CHANNEL

    TX              RX
    0 --(1-p)-->  0
       \--( p)-->  e
    1 --(1-p)-->  1
       \--( p)-->  e

    f(y|H0) = { 1-p  if y = 0          f(y|H1) = { 1-p  if y = 1
              {  p   if y = e                    {  p   if y = e

    Pr(H0) = q,   Pr(H1) = 1 - q
    C00 = C11 = 0,   C10 = C01 = 1    =>   τ = q/(1-q)

Here the problem is discrete.

    L(y) = f(y|H1)/f(y|H0) = {  0   if y = 0
                             {  ∞   if y = 1
                             {  1   if y = e

LRT:
    If y = 0  =>  L(y) = 0 < τ   =>  Ĥ = H0
    If y = 1  =>  L(y) = ∞ > τ   =>  Ĥ = H1
    If y = e  =>  compare 1 ≷ q/(1-q):  choose Ĥ = H1 iff 1 > q/(1-q), i.e. iff q < 1/2.

    => on an erasure, Ĥ = H1 if q < 1/2 (and Ĥ = H0 otherwise).
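A small sketch of this discrete LRT (the values of q below are illustrative):

```python
# Sketch: LRT for the binary erasure channel with 0-1 costs.
# Decide H0 if y = 0, H1 if y = 1; on an erasure compare L(e) = 1 with tau = q/(1-q).
def bec_decision(y, q):
    """y in {0, 1, 'e'}; q = Pr(H0).  Returns the declared hypothesis, 0 or 1."""
    if y == 0:
        return 0                      # L(y) = 0 < tau
    if y == 1:
        return 1                      # L(y) = infinity > tau
    return 1 if q < 0.5 else 0        # erasure: 1 > q/(1-q) exactly when q < 1/2

print(bec_decision('e', q=0.3))   # 1 : H1 is a-priori more likely
print(bec_decision('e', q=0.7))   # 0 : H0 is a-priori more likely
```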
DIFFERENT CASES UNDER THE SAME FRAMEWORK:

· Minimum Probability of Error (MPE) & Maximum A Posteriori (MAP):

    C00 = C11 = 0,   C10 = C01 = 1

    J = Pr(total error) = Pr(H0)·Pr(Ĥ=H1 | H0) + Pr(H1)·Pr(Ĥ=H0 | H1)

    LRT:   f(y|H1)/f(y|H0)  ≷  Pr(H0)/Pr(H1)    (decide H1 if >)
     <=>   Pr(H1|y)  ≷  Pr(H0|y)                (using Bayes' rule)

    -> MAP DETECTOR: using the a-posteriori probabilities Pr(Hi|y), choose the
       hypothesis that results in the largest a-posteriori probability.

MAXIMUM LIKELIHOOD DETECTOR:

    C00 = C11 = 0,   C10 = C01 = 1
    Pr(H0) = Pr(H1) = 0.5

    ML DETECTOR:    f(y|H1)  ≷  f(y|H0)    (decide H1 if >)
OPERATING CHARACTERISTICS:

· The performance of an LRT is defined by two error probabilities:

    PF = prob of false alarm      = Pr(Ĥ=H1 | H0)
    PM = prob of missed detection = Pr(Ĥ=H0 | H1)
    PD = 1 - PM = Pr(Ĥ=H1 | H1)   = prob of detection.

· The performance is illustrated by the PD-PF curve, called the Receiver
  Operating Characteristic (ROC) curve.
· We ideally want to be in the region where PD is large while PF is small;
  the corner (PF, PD) = (0, 1) is ideal.
  [Sketch: PD vs PF plane with two ROC curves, case 1 and case 2.]

· If τ = 0:    PD = 1, PF = 1.
· If τ -> ∞:  PD = 0, PF = 0.

PROPERTIES OF THE ROC (PD-PF) CURVE:

1) PD is an increasing function of PF.
2) The PD-PF curve always lies above the diagonal.
3) The PD-PF curve is always concave.
4) The PD-PF curve represents all achievable points as the threshold is varied
   (for each τ, calculate (PF, PD); the point will always lie on the curve).
   In the sketch, the detection probability of case 2 > case 1.

5) For continuous valued observations:

       dPD/dPF = τ

   i.e. the slope of the ROC curve at any point (PF*, PD*) equals the threshold
   τ that was used to obtain that point.

14 JAN, 2023

Justifying the above properties:

-> From the inequality L(y) > τ we can find the regions Z0 & Z1, and PF, PD are
   obtained by integrating f(y|H0), f(y|H1) over Z1.
-> As τ increases, Z1 shrinks; as a result PD decreases, and so does PF.
-> Suppose we ignore the measurements and make decisions based on a biased coin,
   Pr(Heads) = p:

       δ(y) = H1 if heads,  H0 if tails    (for any y)

   Then PF = Pr(Ĥ=H1 | H0) = p and PD = Pr(Ĥ=H1 | H1) = p, so PD = PF.
   Sweeping p from 0 to 1 gives the diagonal of the PD-PF plane.

   This is without using the data. So, with the measurements given, we will
   clearly perform better.
   ∴ The PD-PF curve will always lie above the diagonal.
EX: DC signal in noise, a is known.

    H0: y = w    (vs)    H1: y = a + w,   a > 0
    w ~ N(0, σ²)

    ln L(y) = (2ya - a²)/(2σ²)

Test:
    (2ya - a²)/(2σ²)  ≷  ln τ    =>    y  ≷  (σ²/a)·ln τ + a/2  =  γ

Aim: To plot the ROC.

    PF = Pr(y > γ | H0) = Q(γ/σ)
    PD = Pr(y > γ | H1) = Q((γ - a)/σ) = Q( Q⁻¹(PF) - d ),

    where d = a/σ is the signal-to-noise ratio.

    [Sketch: ROC curves for d1 < d2 < d3 move towards the top-left corner.]

    Plot PD vs PF on MATLAB to verify.

-> If d is large, the curves f(y|H0) and f(y|H1) are well separated, and the
   detection is much better, which is also quite evident from the plot
   (as d increases, the curve moves towards the less-error region).
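The notes suggest MATLAB; an equivalent Python sketch (the SNR values d are illustrative) plots PD = Q(Q⁻¹(PF) - d):

```python
# Sketch: ROC curves PD = Q(Q^{-1}(PF) - d) for the DC-signal-in-noise test,
# for a few illustrative SNR values d = a/sigma.
import numpy as np
from scipy.stats import norm
import matplotlib.pyplot as plt

pf = np.linspace(1e-4, 1 - 1e-4, 500)
for d in (0.5, 1.0, 2.0, 3.0):
    pd = norm.sf(norm.isf(pf) - d)          # Q(Q^{-1}(PF) - d)
    plt.plot(pf, pd, label=f"d = {d}")

plt.plot([0, 1], [0, 1], "k--", label="coin flip (diagonal)")
plt.xlabel("PF"); plt.ylabel("PD"); plt.legend(); plt.title("ROC")
plt.show()
```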


-> To understand the intuition for why certain terms appear in the numerator &
   denominator of the threshold, and their connection with the Z0 & Z1 regions,
   rewrite the cost equation:

    J = C00·Pr(H0) + C01·Pr(H1) + ∫_{Z1} g(y) dy,

    g(y) = Pr(H0)(C10 - C00) f(y|H0) - Pr(H1)(C01 - C11) f(y|H1)

    J = C00·Pr(H0) + C01·Pr(H1) + Pr(H0)(C10 - C00)·PF - Pr(H1)(C01 - C11)·PD

With PF & PD given, we can calculate J.
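For example (a sketch with illustrative numbers), once PF and PD are known the cost follows directly from this expression:

```python
# Sketch: Bayes cost from (PF, PD) via
# J = C00*P0 + C01*P1 + P0*(C10-C00)*PF - P1*(C01-C11)*PD.
def bayes_cost(PF, PD, P0, P1, C00=0.0, C01=1.0, C10=1.0, C11=0.0):
    return C00 * P0 + C01 * P1 + P0 * (C10 - C00) * PF - P1 * (C01 - C11) * PD

# With 0-1 costs and equal priors this reduces to 0.5*PF + 0.5*(1 - PD), the error prob.
print(bayes_cost(PF=0.1, PD=0.8, P0=0.5, P1=0.5))   # 0.15
```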

MIN-MAX HYPOTHESIS TESTING:

· There are situations where Pr(H0) & Pr(H1) are not known, but the costs
  (C00, C01, C10, C11) are given.
· True prob: Pr(H1) = p    (not known).
· We will assume/guess some Pr(H1) = Pa.

NOTE THESE POINTS:

=> Let the assumed Pa = 0.5, while the true p = 0.7.
   PF -> does it depend on Pa, on p, on both, or on neither?
   · It depends only on Pa: inherently, we are optimising the test for Pa.
     The threshold

         τ = (1 - Pa)(C10 - C00) / [Pa (C01 - C11)]

     depends only on Pa, hence so do the regions Z0, Z1 and therefore PF.
     (Assuming a Pa closer to the true probability gives better performance.)

· We design the test using Pa. The PF & PD of this test depend only on Pa
  (and not on p); denote them by PF(Pa) & PD(Pa).

· The expected cost for assumed Pa and true p:

    J(Pa, p) = C00(1-p) + C01·p + (1-p)(C10 - C00)·PF(Pa) - p(C01 - C11)·PD(Pa)
               (depends on both Pa & p)

             = (1-p)·[C00 + (C10 - C00)·PF(Pa)] + p·[C01 - (C01 - C11)·PD(Pa)]

             = (1-p)·R0(Pa) + p·R1(Pa),

    where  R0(Pa) = C00·Pr(Ĥ=H0|H0) + C10·Pr(Ĥ=H1|H0)    (exp. cost given H0)
           R1(Pa) = C01·Pr(Ĥ=H0|H1) + C11·Pr(Ĥ=H1|H1)    (exp. cost given H1).

HOW CAN WE MAKE THIS ROBUST TO p?  HOW TO ATTAIN THAT?

· Choice: pick Pa such that R0(Pa) = R1(Pa).  Mathematically,

      => J(Pa, p) = R0(Pa)     (independent of p).

· We wish to select Pa such that

      min over Pa [ max over p J(Pa, p) ]    <- min-max criterion:
                                                minimise the worst case.
FACTS:

 i)  J(Pa, p) is linear in p.
 ii) J(Pa, p) ≥ J(p, p)    (using the mismatched prior Pa can only cost more).
iii) V(p) := J(p, p) is a concave function of p.
 iv) J(Pa, 0) = R0(Pa),   J(Pa, 1) = R1(Pa).

  [Sketch: J(Pa, p) is a straight line in p from R0(Pa) at p = 0 to R1(Pa) at
   p = 1, lying above the concave curve V(p) = J(p, p).]

· Since J(Pa, p) is linear in p, its maximum over p is attained at either p = 0
  or p = 1, i.e. max over p of J(Pa, p) = max{R0(Pa), R1(Pa)}.
· min over Pa of (max over p J(Pa, p)) is therefore achieved by the Pa that
  equalises R0(Pa) = R1(Pa).
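As an illustration (a sketch for the DC-signal example with 0-1 costs, where R0(Pa) = PF(Pa) and R1(Pa) = 1 - PD(Pa); a and sigma are illustrative values), the equalising Pa can be found by a simple bisection:

```python
# Sketch: min-max prior for the DC-signal test with 0-1 costs.
# Threshold on y: gamma = (sigma^2/a)*ln((1-Pa)/Pa) + a/2,
# R0(Pa) = PF = Q(gamma/sigma),  R1(Pa) = 1 - PD = 1 - Q((gamma - a)/sigma).
# Solve R0(Pa) = R1(Pa) by bisection (R0 - R1 is increasing in Pa).
import numpy as np
from scipy.stats import norm

a, sigma = 2.0, 1.0

def risks(Pa):
    gamma = (sigma**2 / a) * np.log((1 - Pa) / Pa) + a / 2
    R0 = norm.sf(gamma / sigma)               # PF
    R1 = 1 - norm.sf((gamma - a) / sigma)     # missed-detection probability
    return R0, R1

lo, hi = 1e-6, 1 - 1e-6
for _ in range(60):
    mid = 0.5 * (lo + hi)
    R0, R1 = risks(mid)
    if R0 > R1:
        hi = mid
    else:
        lo = mid

Pa = 0.5 * (lo + hi)
# For this symmetric example the equaliser is Pa = 0.5, with R0 = R1 = Q(a/(2*sigma)).
print("min-max Pa ~", Pa, " R0 =", risks(Pa)[0], " R1 =", risks(Pa)[1])
```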
