Title: Decision Theory
Slide 1: Lecture 3
Slide 2: Contents
- Introduction
- Transmission and reception probabilities in the presence of noise
- Bayes's decision criterion
- Neyman-Pearson criterion
Slide 3: Introduction
- There are two types of decision:
- Hard decision decoding, where it is immediately decided whether a 1 or a 0 was transmitted.
- Soft decision decoding, where a scale of 8 to 16 levels (3 or 4 bits) is used to record exactly how close to a 0 or a 1 the actual received signal was. This is often used in conjunction with a soft-input decoding algorithm.
- Soft decision decoding outperforms hard decision decoding by a significant amount.
- For this lecture only, we will assume a hard decision is what is required.
Slide 4: The following probabilities need to be understood:
- P(0): the a priori probability of transmitting a 0
- P(1): the a priori probability of transmitting a 1
- p(v|0): the conditional probability of receiving v given that a 0 was transmitted
- p(v|1): the conditional probability of receiving v given that a 1 was transmitted
- P(0|v): the a posteriori probability that a 0 was transmitted given that v was received
- P(1|v): the a posteriori probability that a 1 was transmitted given that v was received
Slide 5: a priori means before transmission; a posteriori means after transmission.
- 1) and 2) are properties of the source and are often known in advance, e.g. random binary data has P(0) = P(1) = 0.5.
- 3) and 4) are properties of the channel (not always known).
- 5) and 6) are what we would like to know!
Slide 6: Example - Binary symmetric channel
A binary symmetric channel is one in which the probability of error in a single bit, p, is the same whichever of the two binary symbols (1 or 0) is sent. The schematic of the transition probabilities is shown overleaf.
Slide 7: [Figure: transition-probability diagram of the binary symmetric channel - each symbol is received correctly with probability 1 - p and flipped with probability p.]
Slide 8: Some properties of this channel are:

P(0_RX | 0_TX) + P(1_RX | 0_TX) = 1
P(0_RX | 1_TX) + P(1_RX | 1_TX) = 1
(because only a 1 or a 0 can be received)

P(0_RX | 1_TX) = P(1_RX | 0_TX) = p
(because p is the probability of error)

P(0_RX | 0_TX) = P(1_RX | 1_TX) = 1 - p
(because 1 - p is the probability of correct transmission)
Slide 9: Thus:

P(0_RX) = P(0_TX) P(0_RX | 0_TX) + P(1_TX) P(0_RX | 1_TX)
P(1_RX) = P(1_TX) P(1_RX | 1_TX) + P(0_TX) P(1_RX | 0_TX)

or alternatively:

P(0_RX) = P(0_TX) (1 - p) + P(1_TX) p
P(1_RX) = P(1_TX) (1 - p) + P(0_TX) p
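The slide-9 equations can be sketched in a few lines of code. The priors and error probability below are illustrative values, not taken from the lecture:

```python
# Reception probabilities for a binary symmetric channel (BSC),
# following the slide-9 equations.

def bsc_reception_probs(p_tx0, p_err):
    """Return (P(0_RX), P(1_RX)) for a BSC with bit-error probability p_err."""
    p_tx1 = 1.0 - p_tx0
    # P(0_RX) = P(0_TX)(1 - p) + P(1_TX) p
    p_rx0 = p_tx0 * (1.0 - p_err) + p_tx1 * p_err
    # P(1_RX) = P(1_TX)(1 - p) + P(0_TX) p
    p_rx1 = p_tx1 * (1.0 - p_err) + p_tx0 * p_err
    return p_rx0, p_rx1

# Example: unequal priors, 10% bit-error probability.
p_rx0, p_rx1 = bsc_reception_probs(p_tx0=0.8, p_err=0.1)
print(p_rx0, p_rx1)  # P(0_RX) = 0.8*0.9 + 0.2*0.1 = 0.74, P(1_RX) = 0.26
```

Note that with equal priors (P(0_TX) = P(1_TX) = 0.5) the reception probabilities stay equal regardless of p, which is the symmetry the channel is named for.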
Slide 10: Multiple Signal Transmission Example
If we have a source that produces 6 different signals (A-F), then we have a 6 x 6 transition matrix. Each entry gives P(Y_RX | X_TX), with the transmitted symbol X indexing the columns and the received symbol Y indexing the rows.

[6 x 6 transition matrix not reproduced in this transcript; the worked answers on slides 13 and 14 quote the entries needed.]
Slide 11: ...and assuming we have the following a priori transmission probabilities:

[Table of a priori probabilities for A-F not reproduced in this transcript.]
Slide 12:
1) Calculate the probability of error if a single D is transmitted.
2) Calculate the probability of error if a random string (obeying the properties on the previous slide) is transmitted.
3) Calculate the probability of receiving a C in the above two cases.
Slide 13:
1) The probability of receiving a D when a D was transmitted is 0.6667 (row 4, column 4 of the matrix), so the probability of error is 1.0 - 0.6667 = 0.3333.
2) For a random data stream of A-F, we take the probability of occurrence of each symbol, multiply it by the probability of error for that symbol, and sum over the symbols.
Slide 14:
3) i. For a D transmitted, the probability of receiving a C is read straight from the matrix (column 4, row 3) as 0.0.
3) ii. For a random stream of data, we multiply the probability of occurrence of each symbol by the probability of receiving a C for that symbol (row 3 in the matrix), and sum over the symbols.
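The slide-13/14 method can be sketched as follows. The lecture's 6 x 6 matrix is not reproduced in this transcript, so the 3-symbol matrix and priors below are made up purely to illustrate the calculation (note the layout assumption: here rows index the transmitted symbol, the transpose of the lecture's matrix):

```python
# Average symbol-error probability from a channel transition matrix.
# trans[i][j] = P(symbol j received | symbol i transmitted); rows sum to 1.
# All numbers here are hypothetical, for illustration only.
trans = [
    [0.90, 0.05, 0.05],
    [0.10, 0.80, 0.10],
    [0.05, 0.05, 0.90],
]
priors = [0.5, 0.3, 0.2]  # a priori transmission probabilities

n = len(trans)

# P(error | symbol i sent) = 1 - P(i received | i sent), the diagonal entry.
p_err_given = [1.0 - trans[i][i] for i in range(n)]

# Weight each conditional error probability by the symbol's prior and sum.
p_err = sum(priors[i] * p_err_given[i] for i in range(n))
print(p_err)  # 0.5*0.10 + 0.3*0.20 + 0.2*0.10 = 0.13

# Probability of receiving a particular symbol (here symbol 1) from a
# random stream: weight its column by the priors and sum.
p_rx1 = sum(priors[i] * trans[i][1] for i in range(n))
print(p_rx1)  # 0.5*0.05 + 0.3*0.80 + 0.2*0.05 = 0.275
```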
Slide 15: Bayes's Decision Criterion
- This rule minimizes the average cost of deciding at the receiver whether a 1 or a 0 was transmitted. In a binary transmission system there are only two costs associated with each decision:
- C0: the cost of incorrectly deciding at the receiver that a transmitted 1 was a 0.
- C1: the cost of incorrectly deciding at the receiver that a transmitted 0 was a 1.
- In many cases C0 = C1, but not always. C0 and C1 can have any units at all (money, or even lives).
Slide 16:

P(1 | v_RX) p(v_RX) = p(v_RX | 1) P(1)

In words, both sides of the equation give us the joint probability that v_RX is received and a 1 was transmitted. Rearranging gives us Bayes's rule (Bayes's theorem):

P(1 | v_RX) = p(v_RX | 1) P(1) / p(v_RX)
Slide 17: The conditional loss in deciding 0 is:

L(0 | v_RX) = C0 P(1 | v_RX)

In words, the average cost of deciding v_RX was a 0 is the cost of the mistake, C0, multiplied by the probability that v_RX was caused by a 1. From the general symmetry of the problem:

L(1 | v_RX) = C1 P(0 | v_RX)

We take the decision that has the lower conditional loss; for example, we decide on a 1 for v_RX if:

L(1 | v_RX) < L(0 | v_RX)
Slide 18: Substituting the first two equations on the previous slide into the last equation on the previous slide yields that we should decide 1 if:

C1 P(0 | v_RX) < C0 P(1 | v_RX)

Now we use Bayes's rule, which states that:

P(0 | v_RX) = p(v_RX | 0) P(0) / p(v_RX)

and

P(1 | v_RX) = p(v_RX | 1) P(1) / p(v_RX)
Slide 19: Substituting Bayes's rule into the first equation on the previous slide yields that we should decide 1 if:

p(v_RX | 1) / p(v_RX | 0) > C1 P(0) / (C0 P(1))

This is Bayes's decision criterion. The left-hand side is the likelihood ratio and the right-hand side is the threshold.
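Bayes's decision criterion can be sketched directly as a likelihood-ratio test. The Gaussian conditional pdfs and signal levels below are illustrative assumptions, not the lecture's:

```python
# Bayes's decision criterion as a likelihood-ratio test:
# decide "1" when p(v|1)/p(v|0) exceeds the threshold C1 P(0) / (C0 P(1)).
import math

def gaussian_pdf(v, mean, sigma):
    return math.exp(-(v - mean) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def bayes_decide(v, p0, p1, c0, c1, pdf0, pdf1):
    """Return 1 if the likelihood ratio exceeds the Bayes threshold, else 0."""
    likelihood_ratio = pdf1(v) / pdf0(v)
    threshold = (c1 * p0) / (c0 * p1)
    return 1 if likelihood_ratio > threshold else 0

# Illustrative channel: 0 sent as -1 V, 1 sent as +1 V, unit-variance noise,
# equal priors and equal costs (so the threshold is 1, i.e. a 0 V crossing).
pdf0 = lambda v: gaussian_pdf(v, -1.0, 1.0)
pdf1 = lambda v: gaussian_pdf(v, +1.0, 1.0)
print(bayes_decide(0.3, p0=0.5, p1=0.5, c0=1.0, c1=1.0, pdf0=pdf0, pdf1=pdf1))  # -> 1
```

Raising C1 or P(0) raises the threshold, making the receiver more reluctant to declare a 1, which is exactly the behaviour the criterion is designed to trade off.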
Slide 20: Maximum Likelihood Decision Criterion
If the costs are equal and the probabilities of transmitting a 1 and a 0 are both 0.5, as is the case in many channels, then we have:

C0 = C1

and

P(0) = P(1) = 0.5

so the threshold C1 P(0) / (C0 P(1)) = 1, and we should decide on a 1 if:

p(v_RX | 1) > p(v_RX | 0)

In words, we should decide on a 1 if the received voltage v_RX is more likely to have been caused by a 1 than by a 0. This is the maximum likelihood decision criterion.
Slide 21: Comparison of receiver types

Receiver                   | A priori probabilities known | Decision costs known | Assumptions
---------------------------|------------------------------|----------------------|------------
Bayes                      | Yes                          | Yes                  | None
Maximum a posteriori (MAP) | Yes                          | No                   | Equal costs
Maximum likelihood         | No                           | No                   | Equal costs and equal a priori probabilities
Slide 22: Binary Transmission Example
A binary transmission process is subject to additive Gaussian noise with a mean of 1 V and a standard deviation of 2.5 V. A logical 1 is transmitted as +4 V and a logical 0 is transmitted as -4 V, before the noise is added. In this case C0 = C1 = 100.00 and P(1) = 2 P(0).
a) Find the average probability of error per bit transmitted if a 0 V decision threshold is used.
b) Find the average cost per bit transmitted with a 0 V threshold.
Slide 23:
c) Find the optimum decision threshold voltage for the minimum average cost.
d) Find the average probability of error for the optimum threshold from part c).
e) Find the average cost per bit transmitted with the optimum decision threshold as calculated in part c) above.
Slide 24: [Figure: pdf of the transmitted signal - impulses of weight 2/3 at +4 V and 1/3 at -4 V - and pdf of the noise - a Gaussian centred on 1 V. Horizontal axes: amplitude (volts).]
Slide 25: [Figure: pdf of the received signal plus noise - Gaussians centred on +5 V and -3 V. Horizontal axis: amplitude (volts).]
Slide 26: Binary Transmission Example - Solution
a) With a 0 V threshold, an error occurs if a transmitted 1 (received mean +5 V) falls below 0 V or a transmitted 0 (received mean -3 V) rises above 0 V:

P(error) = P(1) Q((5 - 0)/2.5) + P(0) Q((0 - (-3))/2.5)
         = (2/3) Q(2) + (1/3) Q(1.2)
         = (2/3)(0.0228) + (1/3)(0.1151)
         ≈ 0.0535

where Q(x) is the tail probability of the zero-mean, unit-variance Gaussian.
Slide 27: b) The average cost per bit is:

Cost = C0 P(1) Q(2) + C1 P(0) Q(1.2) = 100.00 x 0.0535 ≈ 5.35
Slide 28: c) P(1) = 2/3 and P(0) = 1/3. The optimum decision threshold occurs when Bayes's decision rule becomes an equality:

p(v_th | 1_TX) / p(v_th | 0_TX) = C1 P(0) / (C0 P(1)) = 1/2

The distributions p(v_th | 1_TX) and p(v_th | 0_TX) are given by:

p(v_th | 1_TX) = (1 / (2.5 sqrt(2 pi))) exp(-(v_th - 5)^2 / (2 x 2.5^2))
p(v_th | 0_TX) = (1 / (2.5 sqrt(2 pi))) exp(-(v_th + 3)^2 / (2 x 2.5^2))
Slide 29: c) (cont.) Thus:

exp([(v_th + 3)^2 - (v_th - 5)^2] / (2 x 2.5^2)) = 1/2
Slide 30: c) (cont.) Taking natural logs (ln) of both sides of this equation:

[(v_th + 3)^2 - (v_th - 5)^2] / 12.5 = -ln 2
(16 v_th - 16) / 12.5 = -0.6931
Slide 31: c) (cont.)

16 v_th = 16 - 12.5 x 0.6931 = 7.336
v_th ≈ 0.4585 V
Slide 32: d) Now we have our new threshold, we can substitute it into the earlier equation for total error probability:

P(error) = (2/3) Q((5 - 0.4585)/2.5) + (1/3) Q((0.4585 + 3)/2.5)
         = (2/3) Q(1.817) + (1/3) Q(1.383)
         ≈ (2/3)(0.0346) + (1/3)(0.0833)
         ≈ 0.0509
Slide 33: d) (cont.) This is less than the answer to part a). That isn't always the case: Bayes's decision rule can actually increase the probability of error when the costs are unequal, making more cheap wrong decisions in order to make fewer expensive wrong decisions.

e) The average cost per bit is now:

Cost = 100.00 x 0.0509 ≈ 5.09

which is less than part b), as should always be the case, since Bayes's decision rule minimizes the average cost.
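The worked example above can be checked numerically. This is a sketch assuming the usual Gaussian tail function Q(x) = 0.5 erfc(x/sqrt(2)) and the closed-form Bayes threshold for two equal-variance Gaussians:

```python
# Numerical check of the slide 22-33 worked example.
import math

def Q(x):
    """Tail probability of the zero-mean, unit-variance Gaussian."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

sigma = 2.5                # noise standard deviation (V)
mu1, mu0 = 4 + 1, -4 + 1   # received means: +/-4 V signal plus the 1 V noise mean
P1, P0 = 2 / 3, 1 / 3      # a priori probabilities, P(1) = 2 P(0)
C0 = C1 = 100.0            # equal decision costs

def p_error(vth):
    # P(err) = P(1) P(v < vth | 1 sent) + P(0) P(v > vth | 0 sent)
    return P1 * Q((mu1 - vth) / sigma) + P0 * Q((vth - mu0) / sigma)

pe_a = p_error(0.0)                      # a) 0 V threshold
print(f"a) P(err) = {pe_a:.4f}")         # 0.0535
print(f"b) cost   = {C0 * pe_a:.2f}")    # 5.35

# c) Optimum threshold solves p(vth|1)/p(vth|0) = C1 P(0) / (C0 P(1));
# for equal-variance Gaussians this has a closed form.
vth = (mu1 + mu0) / 2 + (sigma ** 2 / (mu1 - mu0)) * math.log((C1 * P0) / (C0 * P1))
print(f"c) vth    = {vth:.3f} V")        # 0.458

pe_d = p_error(vth)                      # d), e) at the optimum threshold
print(f"d) P(err) = {pe_d:.3f}")         # 0.051
print(f"e) cost   = {C0 * pe_d:.1f}")    # 5.1
```

The optimum threshold sits below the 1 V midpoint of the two received means because a 1 is twice as likely a priori as a 0, so the receiver gives the 1 region more of the voltage axis.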
Slide 34: Neyman-Pearson Criterion
This criterion does not require any knowledge of the source statistics. It also works well in situations where the cost of a missed detection is very much greater than the cost of a false alarm, C0 >> C1. If we have a detection threshold v_th, the probability of detection PD is:

PD = ∫_{v_th}^∞ p(v | target present) dv

Unfortunately, the above equation doesn't help if you don't know in advance the probability of the target being present, in a RADAR system for example.
Slide 35: The probability of false alarm is:

PFA = ∫_{v_th}^∞ p(v | noise only) dv
Slide 36: The threshold is set to give an acceptable PFA. It is important that the noise pdf used is accurate, but this should not be a problem, as you can measure the noise at your receiver for long periods: for example, point your RADAR system at an area of space where you know there are no targets and measure the noise statistics.
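Setting a Neyman-Pearson threshold can be sketched as follows. Gaussian noise is an assumption here (the criterion itself only needs the measured noise pdf), and the numbers are illustrative:

```python
# Neyman-Pearson threshold setting: choose vth so that the false-alarm
# probability PFA = Q((vth - noise_mean)/sigma) equals a target value,
# assuming Gaussian receiver noise.
import math

def Q(x):
    """Tail probability of the zero-mean, unit-variance Gaussian."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def Q_inv(p):
    """Inverse of Q via bisection (Q is monotonically decreasing)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if Q(mid) > p:
            lo = mid          # Q too large -> threshold must move right
        else:
            hi = mid
    return (lo + hi) / 2

# Illustrative noise statistics, e.g. measured by pointing the RADAR at
# empty space as described above.
noise_mean, sigma = 0.0, 1.0
pfa_target = 1e-3             # acceptable false-alarm probability

vth = noise_mean + sigma * Q_inv(pfa_target)
print(round(vth, 2))          # ~3.09 sigma above the noise mean for PFA = 1e-3
```

Note that the threshold depends only on the noise statistics and the chosen PFA; no source priors or costs are needed, which is the point of the criterion.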