Title: Information theory
1 Information theory
- Multi-user information theory
- Part 3: Slepian-Wolf source coding
- A.J. Han Vinck
- Essen, 2002
2 Contents
- Source coding for dependent sources that are separated
3 Goal of the lecture
- Explain the idea of independent source coding for dependent sources
4 Problem explanation
[Diagram: separate encoders for X and Y feed one joint decoder that reconstructs (X, Y)]
- Independent encoding of X: we use nH(X) bits
- Independent encoding of Y: we use nH(Y) bits
- Total: n[H(X) + H(Y)] ≥ nH(X,Y) = n[H(X) + H(Y|X)]
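To put numbers to this, here is a minimal Python check; the doubly symmetric binary source (X uniform, Y = X ⊕ N, N i.i.d. Bernoulli(p)) is my illustration, chosen to match the realization on slide 9:

```python
import math

def h(p):
    """Binary entropy function h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.1              # crossover probability of the noise N
H_X = 1.0            # X is uniform binary
H_Y = 1.0            # Y = X xor N is uniform as well
H_Y_given_X = h(p)   # given X, only the noise N is unknown
H_XY = H_X + H_Y_given_X

print(f"separate coding: H(X) + H(Y)   = {H_X + H_Y:.3f} bits/symbol")
print(f"joint limit:     H(X,Y)        = {H_XY:.3f} bits/symbol")
print(f"savings:         H(Y) - H(Y|X) = {H_Y - H_Y_given_X:.3f} bits/symbol")
```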
9 Realization
[Diagram: X is compressed to nH(X) bits and de-compressed at the decoder; a syndrome former maps Y to the syndrome YH^T; the decoder forms NH^T = YH^T ⊕ XH^T, decodes N, and outputs Y = X ⊕ N]
- Y = X ⊕ N, with N an i.i.d. noise sequence of crossover probability p
- Syndrome cost: n − k ≈ nh(p) bits
- Total rate: nH(X) + nH(Y|X) = nH(X) + nh(p)
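A runnable sketch of this realization, assuming the (7,4) Hamming code as the linear code; the matrix H and the helper names are my choices, not from the slides. The decoder knows X exactly and receives only the 3-bit syndrome YH^T:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column j is j written in
# binary, so the syndrome of a single error spells out the error position.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]], dtype=int)

def syndrome(v):
    return v @ H.T % 2

rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=7)     # X, delivered losslessly at rate H(X)
n_noise = np.zeros(7, dtype=int)
n_noise[rng.integers(0, 7)] = 1    # N: a single error (weight <= 1 is correctable)
y = (x + n_noise) % 2              # Y = X xor N

s_y = syndrome(y)                  # encoder for Y sends only these n-k = 3 bits

# Decoder: N H^T = Y H^T xor X H^T; the syndrome names the error position.
s_n = (s_y + syndrome(x)) % 2
n_hat = np.zeros(7, dtype=int)
pos = 4 * s_n[0] + 2 * s_n[1] + s_n[2]
if pos > 0:
    n_hat[pos - 1] = 1

y_hat = (x + n_hat) % 2
assert (y_hat == y).all()
print("Y recovered from X and a 3-bit syndrome:", y_hat)
```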
13 Can we do with less?
- Generate the 2^{nH(X)} typical X sequences: the decoder needs only nH(X) bits to determine X
- Generate the 2^{nH(Y)} typical Y sequences: how should the encoder for Y operate?
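As a worked illustration (the numbers are mine, not from the slides): for a Bernoulli(0.1) source with n = 100, h(0.1) ≈ 0.47, so there are roughly 2^{47} typical sequences among the 2^{100} possible ones, and an index of about 47 bits suffices instead of 100.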
14 Intermezzo
- With N = 2^{n(H(Y|X)+ε)} different colors we make M = 2^{nH(Y|X)} random selections
- Probability(all different colors in M drawings) = N(N−1)(N−2)···(N−M+1)/N^M
- For the decoder it is enough that one fixed drawing's color differs from the other M−1; that event has probability (1 − 1/N)^{M−1} ≈ e^{−M/N} → 1 for M/N → 0 and N large
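A quick numerical sketch of this argument (the parameters are illustrative stand-ins, not from the slides):

```python
import math

def p_fixed_unique(N, M):
    """P(a fixed drawing's color differs from the other M-1 random drawings)."""
    return (1 - 1 / N) ** (M - 1)

M = 2**12                        # stand-in for 2^{nH(Y|X)} selections
for eps_bits in (0, 4, 8, 12):   # N/M = 2^{eps_bits}, stand-in for 2^{n eps}
    N = M * 2**eps_bits
    print(f"M/N = 2^-{eps_bits:2d}: P = {p_fixed_unique(N, M):.4f}  "
          f"(e^-M/N = {math.exp(-M / N):.4f})")
```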
15 Coding for Y
- Y generates 2^{nH(Y)} typical sequences; every sequence gets one of 2^{n(H(Y|X)+ε)} colors
- The decoder knows everything about X, Y and the coloring
16 Decoding
- 1) Decode X from the nH(X) received bits
- 2) Find the 2^{nH(Y|X)} possible typical Y sequences for that particular X
- 3) Use the color to find Y from the n(H(Y|X)+ε) received bits
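The three steps can be sketched end to end on a toy example; everything here (the distance-1 "typical set", the parameters M and N, the random coloring) is my own illustration of the binning idea, not the slides' construction:

```python
import random

random.seed(0)
n = 10
x = tuple(random.randint(0, 1) for _ in range(n))

def candidates(x):
    """Toy stand-in for the conditionally typical Y given X: distance <= 1."""
    yield x
    for i in range(n):
        y = list(x); y[i] ^= 1
        yield tuple(y)

M = n + 1                # number of candidates, stand-in for 2^{nH(Y|X)}
N = 1000 * M             # number of colors, stand-in for 2^{n(H(Y|X)+eps)}
color = {y: random.randrange(N) for y in candidates(x)}

y_true = random.choice(list(candidates(x)))
sent = color[y_true]     # the Y encoder transmits only this color

# Steps 1-3: the decoder knows x and the coloring, and looks for the color match.
matches = [y for y in candidates(x) if color[y] == sent]
print("decoded uniquely:", matches == [y_true],
      "| candidates with the sent color:", len(matches))
```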
17 Result
- Sum rate nH(X) + n(H(Y|X)+ε) → nH(X,Y) for ε small and n large
18 Homework
- Formalize the proof
19 Alternative
- For linear systematic codes:
- H = [H1, H2] = [H1, I_{n−k}]
- k ≤ n − nh(m/n) (Hamming bound)
- m = number of correctable errors
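A small check of the bound, comparing the exact sphere-packing count with the entropy approximation; the (7,4) Hamming-code numbers are my example:

```python
import math

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def hamming_bound_k(n, m):
    """Largest k with 2^(n-k) >= |Hamming ball of radius m| (sphere packing)."""
    vol = sum(math.comb(n, i) for i in range(m + 1))
    return n - math.ceil(math.log2(vol))

n, m = 7, 1
print("exact bound:           k <=", hamming_bound_k(n, m))   # 4, met by Hamming(7,4)
print("entropy approximation: k <= n - nh(m/n) =", round(n - n * h(m / n), 2))
# The nh(m/n) form is asymptotic; at small n it is loose.
```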
20 General transmission scheme
- Assume A and B differ in ≤ m positions
- A = (a1, a2, a3), B = (b1, b2, b3), with block lengths k1, k2, n−k
- Transmitter:
- X = (a1, AH^T) and Y = (b2, BH^T) → 2n − k bits in total
- Receiver:
- S = AH^T ⊕ BH^T = (A ⊕ B)H^T; decode (e1, e2) = (a1 ⊕ b1, a2 ⊕ b2) → a2 = b2 ⊕ e2, b1 = a1 ⊕ e1
- a3 = (a1, a2)H1^T ⊕ AH^T and b3 = (b1, b2)H1^T ⊕ BH^T
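A runnable instance of the whole scheme, assuming m = 1 and the systematic (7,4) Hamming code with H = [H1, I3]; the split k1 = k2 = 2 and all names are my choices:

```python
import numpy as np

H1 = np.array([[1, 1, 0, 1],
               [1, 0, 1, 1],
               [0, 1, 1, 1]], dtype=int)
H = np.concatenate([H1, np.eye(3, dtype=int)], axis=1)  # H = [H1, I3], 3 x 7

def syn(v):
    return v @ H.T % 2          # syndrome v H^T (mod 2)

rng = np.random.default_rng(2)
n, k, k1, k2 = 7, 4, 2, 2       # A = (a1, a2, a3) of lengths k1, k2, n-k
A = rng.integers(0, 2, size=n)
E = np.zeros(n, dtype=int)
E[rng.integers(0, n)] = 1       # A and B differ in <= m = 1 positions
B = (A + E) % 2

# Transmitter: X = (a1, A H^T), Y = (b2, B H^T)  ->  2n - k bits in total
X = (A[:k1], syn(A))
Y = (B[k1:k1 + k2], syn(B))

# Receiver: S = A H^T xor B H^T = E H^T; a single error matches one column of H.
S = (X[1] + Y[1]) % 2
E_hat = np.zeros(n, dtype=int)
if S.any():
    E_hat[np.flatnonzero((H.T == S).all(axis=1))[0]] = 1

a1, b2 = X[0], Y[0]
a2 = (b2 + E_hat[k1:k1 + k2]) % 2                     # a2 = b2 xor e2
b1 = (a1 + E_hat[:k1]) % 2                            # b1 = a1 xor e1
a3 = (np.concatenate([a1, a2]) @ H1.T + X[1]) % 2     # a3 = (a1,a2) H1^T xor A H^T
b3 = (np.concatenate([b1, b2]) @ H1.T + Y[1]) % 2     # b3 = (b1,b2) H1^T xor B H^T

assert (np.concatenate([a1, a2, a3]) == A).all()
assert (np.concatenate([b1, b2, b3]) == B).all()
print("recovered A and B from", 2 * n - k, "bits instead of", 2 * n)
```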
21 Efficiency
- Entropy: H(A) + H(B|A) = n + nh(m/n)
- Transmitted: 2n − k = n + (n − k) ≈ n + nh(m/n)
- Optimal if we have optimal codes!
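As a quick check with the perfect (7,4) Hamming code (my numbers, not in the slides): for m = 1, B given A can take Σ_{i≤1} C(7,i) = 8 values, so H(B|A) ≤ log2 8 = 3 bits; the scheme spends exactly n − k = 3 bits on the syndrome, so 2n − k = 10 transmitted bits against 2n = 14 for naive transmission, meeting the entropy bound with equality.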