Title: PRESENTATION FROM THIS PART ONWARDS
1 PRESENTATION FROM THIS PART ONWARDS
BY Dr. P. S. SATYANARAYANA
PROFESSOR & H.O.D.
DEPARTMENT OF E&C, B. M. S. COLLEGE OF ENGINEERING
BULL TEMPLE ROAD, BANGALORE 560 019
e-mail: pssvittala@yahoo.com
2 SESSION I
- DISCRETE CHANNELS WITH MEMORY
- PROPERTIES OF ENTROPY
- EXTENSION OF A ZERO MEMORY SOURCE
- DISCRETE MEMORYLESS CHANNELS
3 DISCRETE CHANNELS WITH MEMORY
- MEMORYLESS CHANNELS: NO INTER-SYMBOL INFLUENCE
- PRACTICAL CHANNELS: INTER-SYMBOL INFLUENCE DOES EXIST
- ERRORS OCCUR IN BURSTS
4 EXAMPLES
1. TELEPHONE CHANNELS THAT ARE AFFECTED BY SWITCHING TRANSIENTS AND DROPOUTS
2. MICROWAVE RADIO LINKS THAT ARE SUBJECTED TO FADING
5 GAUSSIAN NOISE - INDEPENDENT ERRORS --- CHARACTERISTIC OF MEMORYLESS CHANNELS
IMPULSE NOISE - BURST ERRORS --- CHARACTERISTIC OF CHANNELS WITH MEMORY
COMPLEX PHYSICAL PHENOMENA ARE INVOLVED, MAKING SUCH CHANNELS DIFFICULT TO CHARACTERIZE. THE GILBERT MODEL IS WIDELY USED TO DESCRIBE THEM.
6 HERE THE CHANNEL IS MODELED AS A DISCRETE MEMORYLESS BSC, WHERE THE PROBABILITY OF ERROR IS A TIME-VARYING PARAMETER. THE CHANGES IN THE PROBABILITY OF ERROR ARE MODELED BY A MARKOFF PROCESS, AS SHOWN IN FIG. 1 BELOW.
7 (Fig. 1: State diagram of the Gilbert model - a three-state Markoff chain with a bit error probability attached to each state. Image not transcribed.)
8 THE ERROR-GENERATING MECHANISM IN THE CHANNEL OCCUPIES ONE OF THREE STATES. TRANSITION FROM ONE STATE TO ANOTHER IS MODELED BY A DISCRETE, STATIONARY MARKOFF PROCESS.
9 ILLUSTRATION: SUPPOSE THE CHANNEL IS IN STATE 2, WHERE THE ERROR PROBABILITY IS $10^{-2}$. THE CHANNEL STAYS IN THIS STATE DURING THE SUCCEEDING BIT INTERVAL WITH A PROBABILITY OF 0.998. HOWEVER, THE CHANNEL MAY GO TO STATE 1, WHERE THE BIT ERROR PROBABILITY IS 0.5.
10 SINCE THE SYSTEM STAYS IN STATE 1 WITH A PROBABILITY OF 0.99, ERRORS TEND TO OCCUR IN BURSTS (OR GROUPS). STATE 3 REPRESENTS A LOW BIT ERROR RATE; ERRORS IN THIS STATE ARE PRODUCED BY GAUSSIAN NOISE AND VERY RARELY OCCUR IN BURSTS WHILE THE CHANNEL IS IN THIS STATE. OTHER DETAILS OF THE MODEL ARE AS SHOWN IN FIG. 1.
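To make the bursty behaviour concrete, here is a minimal Python sketch of the three-state model. The stay probabilities 0.99 (state 1) and 0.998 (state 2) and the error rates 0.5 and $10^{-2}$ come from slides 9-10; the off-diagonal transition probabilities and the state-3 error rate are illustrative assumptions, since Fig. 1 is not reproduced in this transcript.

```python
import random

# Per-state bit error probabilities. States 1 and 2 are from the slides;
# state 3's "low BER" value is an assumed placeholder.
ERROR_PROB = {1: 0.5, 2: 1e-2, 3: 1e-5}

# Transition probabilities of the stationary Markoff process. The diagonal
# stay-probabilities 0.99 and 0.998 are from the slides; the rest are
# assumptions standing in for the untranscribed Fig. 1.
TRANSITIONS = {
    1: [(1, 0.99), (2, 0.005), (3, 0.005)],
    2: [(1, 0.001), (2, 0.998), (3, 0.001)],
    3: [(1, 0.0005), (2, 0.0005), (3, 0.999)],
}

def simulate(n_bits: int, seed: int = 1) -> list[int]:
    """Return a 0/1 error sequence; 1 marks a bit received in error."""
    rng = random.Random(seed)
    state = 3                                  # start in the quiet state
    errors = []
    for _ in range(n_bits):
        errors.append(1 if rng.random() < ERROR_PROB[state] else 0)
        states, weights = zip(*TRANSITIONS[state])
        state = rng.choices(states, weights)[0]
    return errors

if __name__ == "__main__":
    e = simulate(100_000)
    print("overall BER:", sum(e) / len(e))
```

Plotting the error sequence would show the characteristic clustering: long quiet stretches from state 3, punctuated by dense bursts while the chain sojourns in state 1.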
11 2. LOGARITHMIC INEQUALITIES
12 FROM THE GRAPH, $\ln x \le (x - 1)$, WITH EQUALITY IFF $x = 1$.
13 3. SOME PROPERTIES OF ENTROPY H(.)
- 1. THE H FUNCTION IS ALWAYS A CONTINUOUS FUNCTION OF ITS ARGUMENTS. THIS PROPERTY IS OBVIOUS BECAUSE THE PROBABILITIES ARE CONTINUOUS IN THE INTERVAL (0, 1) AND THE LOGARITHM OF A CONTINUOUS FUNCTION IS ITSELF CONTINUOUS.
- 2. THE H FUNCTION IS SYMMETRICAL W.R.T. ITS ARGUMENTS. THIS PROPERTY SIMPLY MEANS THAT THE ENTROPY FUNCTION DOES NOT DEPEND ON THE ORDER IN WHICH THE PROBABILITIES ARE ARRANGED.
14 SPECIFICALLY, FOR A SOURCE WITH THREE SYMBOLS,
$S = \{s_1, s_2, s_3\}$, $P = \{p_1, p_2, p_3\}$:
$H(p_1, p_2, p_3) = H(p_1, p_3, p_2) = H(p_3, p_1, p_2)$, AND SO ON.
3. THE H FUNCTION IS MAXIMUM WHEN ALL SOURCE SYMBOLS ARE EQUALLY PROBABLE. THIS PROPERTY CAN BE PROVED USING THE LOG INEQUALITIES.
15 CONSIDER $H(S) - \log q$, WHERE $q$ = NO. OF SOURCE SYMBOLS.
16 (Derivation slide; image not transcribed.)
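Since slide 16's derivation is not transcribed, here is a LaTeX reconstruction of the standard argument, using the log inequality of slide 12 ($\ln x \le x - 1$):

```latex
\begin{align*}
H(S) - \log q
  &= \sum_{k=1}^{q} p_k \log \frac{1}{p_k} - \log q \sum_{k=1}^{q} p_k
   = \sum_{k=1}^{q} p_k \log \frac{1}{q\,p_k} \\
  &\le \log e \sum_{k=1}^{q} p_k \left( \frac{1}{q\,p_k} - 1 \right)
   = \log e \left( \sum_{k=1}^{q} \frac{1}{q} - \sum_{k=1}^{q} p_k \right)
   = \log e \,(1 - 1) = 0 .
\end{align*}
```

Hence $H(S) \le \log q$, with equality iff $p_k = 1/q$ for every $k$, i.e., when all source symbols are equally probable.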
17 PARTICULAR CASE: ZERO MEMORY BINARY SOURCES. THE MAXIMUM ENTROPY PROPERTY FOR A BINARY SOURCE IS DEPICTED IN FIG. 3.
18 (Fig. 3: Entropy of a binary source versus symbol probability, peaking at 1 bit when the two symbols are equally probable. Image not transcribed.)
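As a quick numerical check of the curve in Fig. 3, a minimal Python sketch of the binary entropy function (my addition, not from the slides):

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits (0 at p = 0 or 1)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# The entropy peaks at p = 1/2, in line with the maximum entropy property.
for p in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(f"H({p}) = {binary_entropy(p):.4f} bits")
```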
19 EXTENSION OF A ZERO MEMORY SOURCE: THE CONCEPT IS USED IN CODING, WHERE SEQUENCES OF SYMBOLS ARE USED AS CODEWORDS. CONSIDER A TERNARY SOURCE $S = \{S_1, S_2, S_3\}$. ITS 3RD EXTENSION IS AS BELOW ($q^3 = 3^3 = 27$ symbols):
S1S1S1, S1S1S2, S1S1S3, S1S2S1, S1S2S2, S1S2S3, S1S3S1, S1S3S2, S1S3S3,
S2S1S1, S2S1S2, S2S1S3, S2S2S1, S2S2S2, S2S2S3, S2S3S1, S2S3S2, S2S3S3,
S3S1S1, S3S1S2, S3S1S3, S3S2S1, S3S2S2, S3S2S3, S3S3S1, S3S3S2, S3S3S3
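The listing above can be generated mechanically; a minimal Python sketch (my addition):

```python
from itertools import product

symbols = ["S1", "S2", "S3"]          # the ternary source alphabet
third_extension = ["".join(t) for t in product(symbols, repeat=3)]

print(len(third_extension))           # 27 = 3**3 composite symbols
print(third_extension[:3])            # ['S1S1S1', 'S1S1S2', 'S1S1S3']
```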
20 THE nth EXTENSION OF A q-SYMBOL DMS WILL HAVE $q^n$ COMPOSITE SYMBOLS. EACH COMPOSITE SYMBOL IS A SEQUENCE OF n ORIGINAL SOURCE SYMBOLS. THE nth EXTENSION OF A DMS, $S$, WILL BE DENOTED BY $S^n$. A TYPICAL COMPOSITE SYMBOL APPEARS LIKE THIS:
$\sigma_i = (s_{i1}, s_{i2}, s_{i3}, \ldots, s_{in})$
EACH $s_{ij}$ IS ANY ONE OF THE q ORIGINAL SOURCE SYMBOLS. SINCE SYMBOLS ARE INDEPENDENT,
$P(\sigma_i) = P(s_{i1}, s_{i2}, s_{i3}, \ldots, s_{in}) = P(s_{i1}) \cdot P(s_{i2}) \cdot P(s_{i3}) \cdots P(s_{in})$
21 THUS, ALL SYMBOL PROBABILITIES OF THE EXTENSION ADD TO ONE: $\sum_{S^n} P(\sigma_i) = 1$. NOW WE SHALL FIND THE ENTROPY OF THE EXTENSION.
22 (Derivation of the entropy of the extension; image not transcribed.)
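Slide 22's derivation is not transcribed; the following LaTeX sketch reconstructs the standard argument, using the factorization from slide 20:

```latex
\begin{align*}
H(S^n) &= \sum_{S^n} P(\sigma_i) \log \frac{1}{P(\sigma_i)}
        = \sum_{S^n} P(\sigma_i) \sum_{j=1}^{n} \log \frac{1}{P(s_{ij})} \\
       &= \sum_{j=1}^{n} \sum_{S^n} P(\sigma_i) \log \frac{1}{P(s_{ij})}
        = \sum_{j=1}^{n} H(S) = n\,H(S) ,
\end{align*}
```

where the inner sum reduces to $H(S)$ because, for each fixed $j$, summing $P(\sigma_i)$ over the other $n-1$ positions leaves $\sum_{s} P(s) \log \frac{1}{P(s)}$. The example on the next slide verifies this for $n = 2$.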
23 Example: A zero memory source has a source alphabet $S = \{s_1, s_2, s_3\}$ with $P = \{1/2, 1/4, 1/4\}$. Find the entropy of the source. Find the entropy of the second extension and verify $H(S^2) = 2 \cdot H(S)$.
We have $H(S) = (1/2)\log 2 + 2 \times (1/4)\log 4 = 1.5$ bits/sym.
The second extension and the corresponding probabilities are tabulated below:
S^2:     s1s1, s1s2, s1s3, s2s1, s2s2, s2s3, s3s1, s3s2, s3s3
P(S^2):  1/4,  1/8,  1/8,  1/8,  1/16, 1/16, 1/8,  1/16, 1/16
Hence, $H(S^2) = (1/4)\log 4 + 4 \times (1/8)\log 8 + 4 \times (1/16)\log 16 = 3.0$ bits/sym.
$H(S^2) / H(S) = 3 / 1.5 = 2$. Indeed, $H(S^2) = 2 \cdot H(S)$.
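A minimal Python check of this worked example (my addition):

```python
from itertools import product
from math import log2

p = {"s1": 0.5, "s2": 0.25, "s3": 0.25}

def entropy(dist):
    """Shannon entropy in bits of a probability mapping."""
    return sum(q * log2(1 / q) for q in dist.values())

# Second extension: joint probabilities factor since the source is memoryless.
p2 = {a + b: p[a] * p[b] for a, b in product(p, repeat=2)}

print(entropy(p))    # 1.5
print(entropy(p2))   # 3.0  ==  2 * H(S)
```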
24 4. DISCRETE MEMORYLESS CHANNELS
Fig. 4.1: A Simple Communication System
25 LET THE JOINT PROBABILITY MATRIX (JPM) HAVE ENTRIES $p(x_k, y_j)$. THEN:
Sum of all entries of the JPM: $\sum_k \sum_j p(x_k, y_j) = 1$
Sum of all entries of the JPM in the kth row: $\sum_j p(x_k, y_j) = p(x_k)$
Sum of all entries of the JPM in the jth column: $\sum_k p(x_k, y_j) = p(y_j)$
And also $\sum_k p(x_k) = \sum_j p(y_j) = 1$.
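A small Python sketch of these marginalization rules on an example JPM (the matrix values are made up purely for illustration):

```python
# Joint probability matrix p(x_k, y_j); rows index x, columns index y.
jpm = [
    [0.25, 0.10, 0.05],
    [0.05, 0.30, 0.05],
    [0.05, 0.05, 0.10],
]

p_x = [sum(row) for row in jpm]          # row sums    -> p(x_k)
p_y = [sum(col) for col in zip(*jpm)]    # column sums -> p(y_j)

print(sum(map(sum, jpm)))   # 1.0: all entries of the JPM add to one
print(p_x, sum(p_x))        # marginal of X, sums to 1
print(p_y, sum(p_y))        # marginal of Y, sums to 1
```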
26 FIVE ENTROPY FUNCTIONS:
H(X, Y), H(X), H(Y), H(X|Y) AND H(Y|X)
$H(X, Y) = \sum_{k=1}^{m} \sum_{j=1}^{n} p(x_k, y_j) \log \frac{1}{p(x_k, y_j)}$
WRITTEN OUT TERM BY TERM, THIS IS
$p(x_1, y_1)\log\frac{1}{p(x_1, y_1)} + p(x_1, y_2)\log\frac{1}{p(x_1, y_2)} + \cdots + p(x_m, y_n)\log\frac{1}{p(x_m, y_n)}$
27 $H(X, Y) = \sum_k \sum_j p(x_k, y_j) \log \frac{1}{p(x_k, y_j)}$
$H(X) = \sum_k p(x_k) \log \frac{1}{p(x_k)}$
REPLACING $p(x_k)$ BY $\sum_j p(x_k, y_j)$:
$H(X) = \sum_k \sum_j p(x_k, y_j) \log \frac{1}{p(x_k)}$
Similarly, $H(Y) = \sum_j \sum_k p(x_k, y_j) \log \frac{1}{p(y_j)}$
28 i.e., $p(x_k \mid y_j) = p(x_k, y_j) / p(y_j)$
29 $H(X|Y) = E_j\{H(X \mid y_j)\} = \sum_j p(y_j)\, H(X \mid y_j) = \sum_j \sum_k p(x_k, y_j) \log \frac{1}{p(x_k \mid y_j)}$
30 RELATION AMONG ENTROPY FUNCTIONS
Or, $H(Y|X) = H(X, Y) - H(X)$
That is, $H(X, Y) = H(X) + H(Y|X)$
Similarly, you can show $H(X, Y) = H(Y) + H(X|Y)$
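A minimal Python check of these identities, reusing the illustrative JPM from the slide 25 sketch (my addition; entropies in bits):

```python
from math import log2

# Illustrative JPM (made-up numbers, same as the earlier sketch).
jpm = [
    [0.25, 0.10, 0.05],
    [0.05, 0.30, 0.05],
    [0.05, 0.05, 0.10],
]

p_x = [sum(row) for row in jpm]          # marginal of X
p_y = [sum(col) for col in zip(*jpm)]    # marginal of Y

def H(ps):
    """Shannon entropy in bits of a list of probabilities."""
    return sum(p * log2(1 / p) for p in ps if p > 0)

H_xy = H([p for row in jpm for p in row])    # joint entropy H(X, Y)

# H(X|Y) computed directly from the definition:
# sum_j sum_k p(x_k, y_j) log 1/p(x_k|y_j), with p(x_k|y_j) = p(x_k, y_j)/p(y_j)
H_x_given_y = sum(
    jpm[k][j] * log2(p_y[j] / jpm[k][j])
    for k in range(3) for j in range(3) if jpm[k][j] > 0
)

# Verify H(X, Y) = H(Y) + H(X|Y) numerically.
assert abs(H_xy - (H(p_y) + H_x_given_y)) < 1e-12
print(f"H(X,Y) = {H_xy:.4f}, H(Y) = {H(p_y):.4f}, H(X|Y) = {H_x_given_y:.4f}")
```

The symmetric identity $H(X, Y) = H(X) + H(Y|X)$ can be checked the same way by conditioning on $x_k$ instead.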