Title: Channel capacity
C = max I(x; y), that is, the maximum information transfer over the channel.
Binary Symmetric Channels
Because the noise in the system is random, the probability of error is the same for a transmitted 0 as for a transmitted 1. The channel is therefore characterised by a single value p, the binary error probability.
[Diagram: binary symmetric channel. Input x (transmit) sends 0 or 1 with probabilities p(0) and p(1) = 1 - p(0); output y (receive). Each digit is flipped with probability p and received correctly with probability 1 - p.]
Channel capacity of this channel
Mutual information increases as the error rate decreases.
I(x; y) = H(y) - H(y|x), where H(y|x) is the backward equivocation (error entropy). Because p is fixed, H(y|x) = H(p) is fixed, so I(x; y) is maximum when H(y) is maximum. This occurs when p(0) = p(1) = 1/2 at the receiver (output), giving H(y) = 1 bit. Hence
C = 1 - H(p)
where H(p) = -p log2(p) - (1 - p) log2(1 - p).
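A minimal sketch of this calculation in Python (the function names are illustrative, not from the original):

import math

def binary_entropy(p: float) -> float:
    # Entropy H(p) of a binary source, in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    # Capacity C = 1 - H(p) of a binary symmetric channel
    # with bit error probability p, in bits per symbol.
    return 1.0 - binary_entropy(p)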
Example: Find the capacity of a binary symmetric channel with a binary error probability of 0.125.
[Plots: (a) variation of information transfer with output probability; (b) variation of capacity with error probability.]
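Solution: C = 1 - H(0.125). H(0.125) = -0.125 log2(0.125) - 0.875 log2(0.875) ≈ 0.375 + 0.169 = 0.544, so C ≈ 0.456 bits per symbol (bsc_capacity(0.125) in the sketch above gives the same value).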
How can the problem of information loss in a noisy channel be overcome?
- (a) Physical solution (improve the channel itself)?
- (b) System solution: channel coding.
Source coding: the task of source coding is to represent the source information with the minimum number of symbols, under the assumption that the channel is noise-free. When a code is transmitted over a channel in the presence of noise, errors will occur.
Channel coding: the task of channel coding is to represent the source information in a manner that minimises the error probability in decoding.
Redundancy: extra information added to compensate for information loss (compare the temperature control of a room in winter: extra heating compensates for different outdoor temperatures).
Symbol error is an error under some decision rule: a received code word (in which some bits may be in error) is classified as the wrong symbol (different from the original symbol it represented).
The binomial distribution plays an important role in channel coding. A binomial experiment consists of n identical trials (think of coding a symbol as a binary digit sequence, i.e. a code word, so n is the length of the code word). Each trial has two possible outcomes, S or F, with S occurring with probability p. Here S can be defined as a transmission error (1→0 or 0→1), so p is the bit error rate. The binomial probability
P(r) = C(n, r) p^r (1 - p)^(n - r)
is used to calculate the probability of r bit errors in a code word.
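A sketch of this formula in Python, using only the standard library (p_r_errors is an illustrative name):

from math import comb

def p_r_errors(n: int, r: int, p: float) -> float:
    # Binomial probability of exactly r bit errors
    # in an n-bit code word, given bit error rate p.
    return comb(n, r) * p**r * (1 - p)**(n - r)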
Error protection: improve tolerance of errors, through either
- error detection: indicate the occurrence of errors, or
- error correction: locate and correct the errors.
- Binary coding for error protection
Example: Assume a binary symmetric channel with p = 0.01 (bit error probability).
1) Coding by repetition
Code A = 00000, B = 11111, and use a majority decision rule: if there are more 0s than 1s, decide A. Up to 2 errors are then tolerated without producing a symbol error. Use the binomial probability distribution to find the symbol error probability p(e).
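A symbol error occurs only when 3 or more of the 5 digits are in error, so
p(e) = Σ (r = 3..5) C(5, r) (0.01)^r (0.99)^(5-r) ≈ 9.9 × 10^-6
(equivalently, sum(p_r_errors(5, r, 0.01) for r in range(3, 6)) with the sketch above).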
Information rate
R = log2(M) / n, where M is the number of equiprobable code words and n is the number of binary digits per code word.
For coding by repetition, P(e) can be reduced only by reducing R (longer code words for the same two symbols).
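For example, the repetition code above has M = 2 and n = 5, so R = log2(2)/5 = 0.2 bits per digit.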
2) Coding by selection of code words
(Using 5 digits there are 32 possible code words, but we do not have to use them all.)
- Two selections (i.e. repetition): A = 00000, B = 11111.
  This gives R = log2(2)/5 = 0.2 and p(e) ≈ 1.0 × 10^-5, as above.
- Thirty-two selections (all 32 code words used):
  This gives R = log2(32)/5 = 1, but any bit error now produces a symbol error, so p(e) = 1 - (0.99)^5 ≈ 0.049.
- Four selections: a compromise between the two extremes.
  - Plenty of code words, to give a reasonable R.
  - Code words as different as possible, to reduce p(e). E.g.
A 00000
B 00111
C 11001
D 11110
Each code word differs from all the others in at least three digit positions.
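This code gives R = log2(4)/5 = 0.4 bits per digit, double that of the repetition code.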
Hamming distance is the number of digit positions in which a pair of code words differ.
Minimum Hamming distance (MHD) is the smallest Hamming distance over the set of code words. Here MHD = 3, so one error can be tolerated: a received word containing a single error remains closer to its original code word than to any other.
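A minimal sketch of Hamming distance and minimum-distance (nearest code word) decoding for the four-selection code above (names are illustrative):

def hamming_distance(a: str, b: str) -> int:
    # Number of digit positions in which two code words differ.
    return sum(x != y for x, y in zip(a, b))

# The four-selection code from the example; MHD = 3, so one error is correctable.
CODE = {"A": "00000", "B": "00111", "C": "11001", "D": "11110"}

def decode(received: str) -> str:
    # Pick the symbol whose code word is closest to the received word.
    return min(CODE, key=lambda s: hamming_distance(CODE[s], received))

print(decode("00101"))  # B's code word 00111 with one bit error -> still decodes to "B"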
32 selections          2 selections    4 selections
A 00000    Q 10000     A 00000         A 00000
B 00001    R 10001     B 11111         B 00111
C 00010    S 10010                     C 11001
D 00011    T 10011                     D 11110
E 00100    U 10100
F 00101    V 10101
G 00110    W 10110
H 00111    X 10111
I 01000    Y 11000
J 01001    Z 11001
K 01010      11010
L 01011    . 11011
M 01100    , 11100
N 01101      11101
O 01110      11110
P 01111    ? 11111