1
Information Theory
  • Trac D. Tran
  • ECE Department
  • The Johns Hopkins University
  • Baltimore, MD 21218

2
Outline
  • Probability
    • Definition
    • Properties
    • Examples
  • Random variable
  • Information theory
    • Self-information
    • Entropy
    • Entropy and probability estimation
    • Examples

(Photo: Dr. Claude Elwood Shannon)
3
Deterministic versus Random
  • Deterministic
    • Signals whose values can be specified explicitly
    • Example: a sinusoid
  • Random
    • Digital signals in practice can be treated as a collection of random variables, or a random process
    • The symbols, which occur randomly, carry information
  • Probability theory
    • The study of random outcomes/events
    • Uses mathematics to capture the behavior of random outcomes and events

4
Probability
  • Events and outcomes
    • Let X be an event with N possible mutually exclusive outcomes
  • Examples
    • A coin toss is an event with 2 outcomes: Head (H) or Tail (T)
    • A die toss is an event with 6 outcomes: 1, 2, 3, 4, 5, 6
  • Probability
    • The likelihood of observing a particular outcome of the event
    • Standard notation: p_i = P(X = x_i), the probability that event X takes the outcome x_i

5
Important Properties
  • Probability computation or estimation
  • Basic properties
    • Every probability lies between 0 and 1 inclusive: 0 ≤ P(x_i) ≤ 1
    • The probabilities of all outcomes sum to unity: Σ_i P(x_i) = 1
    • For N equally likely outcomes: P(x_i) = 1/N
    • For two statistically independent events X and Y: P(X and Y) = P(X)·P(Y)

6
Probability Examples
  • Fair coin flip: P(H) = P(T) = 1/2
  • Tossing two honest coins, what is the probability of observing two heads or two tails?
    • Four equally likely outcomes: HH, HT, TH, TT, so P = 2/4 = 1/2
  • In a poker game with a standard deck of 52 cards, what is the probability of getting a 5-card heart flush?
    • Possible flush outcomes: C(13,5) = 1,287
    • Total possible outcomes: C(52,5) = 2,598,960
    • P(heart flush) = 1,287 / 2,598,960 ≈ 4.95 × 10^-4
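As a quick sanity check (not from the original slides), both answers can be verified with Python's math.comb:

    from math import comb

    # Two honest coins: 4 equally likely outcomes (HH, HT, TH, TT);
    # "two heads or two tails" covers 2 of them.
    print(2 / 4)                          # 0.5

    # Heart flush: choose 5 of the 13 hearts, out of all
    # 5-card hands drawn from a 52-card deck.
    print(comb(13, 5) / comb(52, 5))      # ~0.000495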
7
Probability Examples
  • Dropping needle game (see below)
  • Winning the lottery jackpot
    • Numbers from 1 to 49
    • Pick 6 numbers
    • Total possible combinations: C(49,6) = 13,983,816
    • Chance of winning: 1 in roughly 14 million
    • What about sharing the jackpot?

Dropping needle game (Buffon's needle): X is the event that a needle of length L, dropped at random, touches one of the regularly spaced parallel lines. With line spacing d ≥ L, P(X) = 2L / (π·d).
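A minimal Monte Carlo sketch of the needle drop (not from the original slides; it assumes unit line spacing d = 1 and needle length L = 0.5):

    import math
    import random

    def needle_touches(length=0.5, spacing=1.0):
        # Distance from the needle's center to the nearest line,
        # and the needle's angle relative to the lines.
        y = random.uniform(0, spacing / 2)
        theta = random.uniform(0, math.pi / 2)
        # The needle touches a line when its half-projection
        # across the lines reaches the nearest line.
        return y <= (length / 2) * math.sin(theta)

    N = 100_000
    hits = sum(needle_touches() for _ in range(N))
    print(hits / N)                    # simulated P(X)
    print(2 * 0.5 / (math.pi * 1.0))   # theory: 2L/(pi d) ~ 0.3183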
8
Random Variables
  • Random variable
    • A random variable is a mapping that assigns a real number to each possible outcome of a random experiment
    • A random variable X takes on a value from a given set; thus it is simply an event whose outcomes have numerical values
  • Examples
    • X in a coin toss: X = 1 for Head, X = 0 for Tail
    • The angular position of a rotating wheel
    • The output of a quantizer at time n
  • Digital signals can be viewed as a collection of random variables, or a random process

9
Information Theory
  • A measure of information
    • We have explored various signals; however, we have not quantified the information that a signal carries
    • The amount of information in a signal might not equal the amount of data it produces
    • The amount of information about an event is closely related to its probability of occurrence
  • Self-information
    • The information conveyed by an event A with probability of occurrence P(A) is I(A) = log2(1/P(A)) = -log2 P(A) bits
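This definition translates directly into Python (a sketch; the function name is mine, not from the slides):

    import math

    def self_information(p):
        # I(A) = -log2 P(A): the rarer the event, the more
        # information its occurrence conveys.
        return -math.log2(p)

    print(self_information(0.5))   # 1 bit    (fair coin flip)
    print(self_information(1/6))   # ~2.58 bits (one face of a die)
    print(self_information(1.0))   # 0 bits   (a certain event)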

10
Information = Degree of Uncertainty
  • Zero information
    • The sun rises in the east
    • If an integer n is greater than 2, then a^n + b^n = c^n has no solutions in non-zero integers a, b, and c
  • Little information
    • It will snow in Baltimore in January
    • JHU stays in the top 20 of U.S. News & World Report's Best Colleges within the next 5 years
  • A lot of information
    • A Hopkins mathematician proves P = NP
    • The housing market will recover tomorrow!

11
Entropy
  • Entropy
    • Average amount of information of a source; more precisely, the average number of bits of information required to represent the symbols the source produces
    • For a source containing N independent symbols with probabilities p_i, its entropy is defined as H = -Σ_{i=1}^{N} p_i log2 p_i
    • Unit of entropy: bits/symbol
  • C. E. Shannon, "A mathematical theory of communication," Bell System Technical Journal, 1948
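The definition is one line of Python (a sketch; the function name is mine):

    import math

    def entropy(probs):
        # H = -sum(p * log2 p); symbols with p == 0 contribute nothing.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))                # 1.0 bit/symbol (fair coin)
    print(entropy([1.0]))                     # 0.0 (no uncertainty)
    print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits/symbol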

12
Entropy Example
  • Find and plot the entropy of the binary code in
    which the probability of occurrence for the
    symbol 1 is p and for the symbol 0 is 1-p

(Plot: binary entropy H(p) = -p log2 p - (1 - p) log2 (1 - p); H rises from 0 at p = 0 to its maximum of 1 bit at p = 1/2, then falls back to 0 at p = 1)
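The curve is easy to reproduce numerically (a self-contained sketch):

    import math

    def binary_entropy(p):
        # H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0.
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    for p in (0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0):
        print(f"p = {p:4.2f}   H = {binary_entropy(p):.3f}")
    # The maximum, 1 bit/symbol, occurs at p = 1/2.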
13
Two Extreme Cases
P(X=H) = P(X=T) = 1/2 (maximum uncertainty): minimum (zero) redundancy, compression impossible
P(X=H) = 1, P(X=T) = 0 (minimum uncertainty): maximum redundancy, compression trivial (1 bit is enough)
Redundancy is the opposite of uncertainty
14
Entropy Example
  • Find the entropy of a DNA sequence containing
    four equally-likely symbols A,C,T,G
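The slide leaves the computation to the audience; working it out from the definition, each symbol has p_i = 1/4, so H = -4 × (1/4) × log2(1/4) = log2 4 = 2 bits/symbol.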

15
Estimating Probability and Entropy
  • Occurrence probabilities are usually not available
  • We need to estimate the probabilities by observing the data
  • Effective probability
    • Perform an experiment N times, count the number of times n_i that outcome x_i occurs, and estimate P(x_i) ≈ n_i / N
    • Need a large value of N to be accurate
  • Effective entropy
    • Can be computed from the estimated probabilities: H ≈ -Σ_i (n_i/N) log2 (n_i/N)
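A compact sketch of this procedure in Python (the function name is mine, not from the slides):

    import math
    from collections import Counter

    def effective_entropy(samples):
        # Estimate P(x_i) ~ n_i / N from observed counts, then
        # evaluate H = -sum(p * log2 p) using the estimates.
        counts = Counter(samples)
        N = len(samples)
        return -sum((n / N) * math.log2(n / N) for n in counts.values())

    print(effective_entropy("HHTHTTHH"))   # estimate from 8 observed coin flips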

16
Example
  • DNA sequence ACATAGCTCA
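The slide leaves the answer open; working it out, the counts in ACATAGCTCA are A: 4, C: 3, T: 2, G: 1, so the effective probabilities are 0.4, 0.3, 0.2, and 0.1, and the effective entropy is H ≈ -(0.4 log2 0.4 + 0.3 log2 0.3 + 0.2 log2 0.2 + 0.1 log2 0.1) ≈ 1.85 bits/symbol. The effective_entropy sketch above returns the same value, ≈ 1.846, for this sequence.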