Title: ECE 801.02 Information Theory Course Project
ECE 801.02 Information Theory Course Project
- Non-Shannon-Type Information Inequalities
- Presented by Zhoujia Mao and Bo Ji
[1] Z. Zhang and R. W. Yeung, "On
characterization of entropy function via
information inequalities", IEEE Transactions on
Information Theory, vol. IT-44, no. 4, pp.
1440-1452, July 1998.
Outline
- Introduction
- Problem Statement
- Main Results
- Theorem 3 in the paper
- Outer bound
- Inner bound
- Applications
- Discussion and Conclusion
Introduction
- Basic information inequalities of Shannon's information measures (written out below)
- Non-negative: implied by the non-negativity of joint entropy
- Non-decreasing: implied by the non-negativity of conditional joint entropy
- Two-alternating (submodular): implied by the non-negativity of the conditional mutual information
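Written out explicitly (a standard formulation; the subset labels A and B below are ours, since the slide's own symbols did not survive), the three basic properties of an entropy function h(A) = H(X_A) are:
\[
H(X_A) \ge 0, \qquad
H(X_A) \le H(X_B) \ \text{for } A \subseteq B, \qquad
H(X_A) + H(X_B) \ge H(X_{A \cup B}) + H(X_{A \cap B}),
\]
corresponding respectively to the non-negativity of joint entropy, of the conditional entropy H(X_B | X_A), and of the conditional mutual information I(X_{A \setminus B}; X_{B \setminus A} \mid X_{A \cap B}).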
Introduction
Let H_n be the set of all functions defined on
the nonempty subsets of N_n = {1, ..., n} and taking values in R;
within it, Γ_n denotes the set of functions satisfying the basic
inequalities and Γ_n* the set of entropy functions (stated formally below).
If Γ_n* is a proper subset of Γ_n, then there exist constraints
on an entropy function which are not implied by
the basic inequalities. Such a constraint, if in
the form of an inequality, is referred to as a
non-Shannon-type inequality.
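Stated formally, following the notation of [1] (a sketch; see the paper for the precise setup):
\[
\Gamma_n = \{\, g : 2^{N_n} \setminus \{\emptyset\} \to \mathbb{R} \ \mid\ g \text{ satisfies all basic (Shannon-type) inequalities} \,\},
\]
\[
\Gamma_n^{*} = \{\, h \ \mid\ \exists\, X_1, \dots, X_n \text{ such that } h(A) = H(X_A) \text{ for all } \emptyset \ne A \subseteq N_n \,\},
\]
so that the basic inequalities give \(\Gamma_n^{*} \subseteq \Gamma_n\); the question is whether this inclusion is strict.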
Problem Statement
- For n ≤ 2, [2] proved that Γ_n* = Γ_n
- For n = 3, [2] proved that the closure of Γ_3* equals Γ_3,
- but Γ_3* ≠ Γ_3
- For n ≥ 4, [1] has the following conjecture (Conjecture 1): the closure of Γ_n* is a proper subset of Γ_n
- [1] proved Conjecture 1 by Theorem 3, which means
that Γ_n cannot fully characterize the entropy
function.
[2] Z. Zhang and R. W. Yeung, "A non-Shannon-type
conditional inequality of information
quantities," IEEE Trans. Inform. Theory, vol. 43,
pp. 1982-1985, Nov. 1997.
Main Results
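The main result being referred to is Theorem 3 of [1], which (as it is usually stated) asserts that for any four discrete random variables X_1, X_2, X_3, X_4,
\[
2\, I(X_3; X_4) \le I(X_1; X_2) + I(X_1; X_3, X_4) + 3\, I(X_3; X_4 \mid X_1) + I(X_3; X_4 \mid X_2).
\]
This inequality is not implied by the basic inequalities, i.e., it is a non-Shannon-type inequality.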
- The multivariate mutual information is defined by
McGill (see below), and it will be used frequently in the proof of
Theorem 3.
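For reference (the slide's formula did not survive extraction; this is the standard definition), McGill's multivariate mutual information can be written recursively, or equivalently by inclusion-exclusion over joint entropies:
\[
I(X_1; \dots; X_n) = I(X_1; \dots; X_{n-1}) - I(X_1; \dots; X_{n-1} \mid X_n),
\qquad
I(X_1; \dots; X_n) = \sum_{\emptyset \ne T \subseteq \{1,\dots,n\}} (-1)^{|T|+1} H(X_T).
\]
Unlike the two-variable mutual information, this quantity can be negative for n ≥ 3.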
Main Results
Proof: Define a function F by letting ...
Outer Bound
- Intersecting Γ_4 with the half-space cut out by the new inequality gives a region that still contains the closure of Γ_4* but is strictly smaller than Γ_4, i.e., an outer bound on the entropy region (a numerical sanity check is sketched below)
- And Theorem 3 says that every entropy function of four random variables satisfies this inequality
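As a sanity check on the inequality behind this outer bound, the following is a minimal numerical sketch (our own code, not from the slides or from [1]; the function and variable names are ours, and it assumes the form of the Theorem 3 inequality quoted earlier): it samples random joint distributions of four binary random variables and verifies that the inequality holds on every sample.

import numpy as np

def entropy(p):
    # Shannon entropy (in bits) of a probability vector; zero entries are skipped.
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def joint_entropy(pmf, subset):
    # H(X_subset): marginalize the joint pmf onto the given axes, then take entropy.
    keep = tuple(sorted(subset))
    drop = tuple(i for i in range(pmf.ndim) if i not in keep)
    marginal = pmf.sum(axis=drop) if drop else pmf
    return entropy(np.asarray(marginal).ravel())

def mi(pmf, a, b, cond=()):
    # I(X_a ; X_b | X_cond) = H(a,c) + H(b,c) - H(a,b,c) - H(c).
    a, b, c = set(a), set(b), set(cond)
    H = lambda s: joint_entropy(pmf, s) if s else 0.0
    return H(a | c) + H(b | c) - H(a | b | c) - H(c)

rng = np.random.default_rng(0)
for _ in range(1000):
    pmf = rng.random((2, 2, 2, 2))   # random joint pmf of (X1, X2, X3, X4)
    pmf /= pmf.sum()
    # Theorem 3 inequality; axes 0..3 stand for X1..X4:
    # 2 I(X3;X4) <= I(X1;X2) + I(X1;X3,X4) + 3 I(X3;X4|X1) + I(X3;X4|X2)
    lhs = 2 * mi(pmf, {2}, {3})
    rhs = (mi(pmf, {0}, {1}) + mi(pmf, {0}, {2, 3})
           + 3 * mi(pmf, {2}, {3}, {0}) + mi(pmf, {2}, {3}, {1}))
    assert lhs <= rhs + 1e-9, (lhs, rhs)
print("Theorem 3 inequality held on all sampled distributions")

Sampling can of course only corroborate the inequality, not prove it; the point of [1] is the analytical proof.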
Inner Bound
- Need for an inner bound: since the outer bound may not be
tight, we need both outer and inner bounds to get
closer to the exact region
- We will first introduce a new coordinate system
- By induction from
- we can have
- So, define
- accordingly
- The following lemma is used to simplify the
following calculation and is not proved in [1],
so we give the proof
- Lemma 1 of [1]:
- Proof
- 1) Extend ... to form ... Thus, when ..., ...
- 2) By induction, suppose when ..., ...
- 3) When ..., the right-side summation of ... can be
grouped as pairs: take ... as ... and take ... as ..., so the above
summation is ..., which is an item of case ...
- 4) Since case ... has ... items, and
every two items make a pair, after reduction
exactly ... items remain, as in case ...
- Now we start to find a certain inequality to define
the inner bound
- Define ...; after some calculation using the former
lemma, we have ...
- Here, ... is the same as in Theorem 3,
used for finding the outer bound
- Changing the general function into entropy
functions, we have ...
- So, this item can be negative
- Define ...
- Since it takes a subset of the above item,
it is reasonable to guess ...
- Theorem 6 in [1]
- The details of this proof can be found in [1]; we
state the main idea here: first, list some
classical cases of constructions for entropy
functions, and then find a sequence of
constructible functions and nonnegative
coefficients such that ...
for any ... in ...
Application
- Scenario
- Multi-source multi-sink network coding
- Sources are independent
- Channels are error-free (reason: since data is
coded together, channel noise would mix in similarly,
so channel noise is assumed not to exist in
order to simplify the analysis)
- Notation
- ... denotes the set of sources, ... denotes the
set of channels, and ... denotes the set of
receivers; ... denotes the maximum rate at link ...
- ... is the rate of ..., and ... are auxiliary r.v.s
- A receiver requires data from a set of
sources to decode the coded data
- Let ..., ..., and ...
- Define constrained regions (a possible formalization is sketched after this list)
- By independent sources, let
- By error-free channel, define
- By rate constraint, define
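One possible formalization of these three constraint sets, in the spirit of the usual entropy-function framework for network coding (the symbols are assumptions on our part, since the slide's own notation did not survive: Y_s is the random variable of source s, S the source set, U_e the random variable carried on channel e, and R_e its capacity):
\[
\mathcal{C}_{\mathrm{ind}} = \Big\{ h : h(Y_S) = \sum_{s \in S} h(Y_s) \Big\}, \qquad
\mathcal{C}_{\mathrm{ef}} = \Big\{ h : h\big(U_e \mid \text{inputs of } e\big) = 0 \ \ \forall e \Big\}, \qquad
\mathcal{C}_{\mathrm{rate}} = \Big\{ h : h(U_e) \le R_e \ \ \forall e \Big\},
\]
encoding independent sources, error-free (deterministic) coding on each channel, and the link-rate constraints, respectively.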
- The region studied before is combined
with these specific constrained regions to give a
good capacity region for this scenario
- This region is ..., where ..., as given in [1]
Contribution
- Makes a step toward fully characterizing the region,
and has some applications in simple network
coding cases
- The main technique they use to find a region is
to find inequalities for entropy functions and
then generalize them to general functions; the
main idea is to use easier-to-obtain outer
and inner bounds to get close to the exact region.
These techniques and ideas are quite useful
for finding such a region
Shortcomings
- The outer and inner bounds are still neither
tight nor fully understood; for example, whether
... holds cannot be proved. Therefore, the case ...
is more difficult
- The region is used in simple scenarios with
constraints like independent sources and error-free
channels. Thus, it is still of interest to find more applications
Future Research
- Theorem 3 and Theorem 6 of [1] both first
construct a region from certain inequalities and
equalities, and these inequalities
come without strong intuition and are not
unique. That means there may be other inequalities
which characterize a tighter bound.
Therefore, a more valuable research direction is
to find the smallest inequality satisfied by
entropy functions. Here, "smallest" means that any
function satisfying the other entropy inequalities
must satisfy that inequality. Then we can
construct the tightest bound
- As for applications, we see from [1] that the
complexity of more complex scenarios lies in the
constrained capacity regions, which depend on the
specifics of the different cases. Therefore,
tightening the theoretical region will result in
improvement in all application cases. That also
means that if the simple network coding case can use this
region, complex cases like dependent sources and
inter-session coding can also use these regions
as long as the special constraints are well defined
Thanks!