Title: Quantum Information Theory
1. Quantum Information Theory
- Graduate Course - Spring 2005
- Lecture 5, 2/21/05
- Marco Lanzagorta, Jeff Tollaksen
- George Mason University
2. Von Neumann Entropy unifies aspects of QI
- Transmission of classical info over quantum channels (i.e. what is the max info in bits that can be obtained using the best measurements); measure of success: probability of error → 0
- Tradeoff between gaining info about a quantum state and disturbance of the state
- Quantifying entanglement: $E(|\psi_{AB}\rangle) = S(\rho_A)$
- Transmission of quantum info over quantum channels (compressibility of an ensemble of pure quantum states, i.e. the minimum # of qubits per letter of message needed to encode the information); measure of success: fidelity → 1
3. Basic properties of the von Neumann entropy
Classically, information is a function of probability distributions; quantum information is a function of density matrices.
- von Neumann entropy: $S(\rho) \equiv \langle -\log \rho \rangle = -\mathrm{tr}\,(\rho \log \rho)$
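A minimal numerical check of this definition (not from the slides; base-2 logs, as used throughout the lecture):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho log2 rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # 0 log 0 = 0 by convention
    return float(0.0 - np.sum(evals * np.log2(evals)))

# Pure state: S = 0; maximally mixed qubit: S = 1 bit.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])
mixed = np.eye(2) / 2
print(von_neumann_entropy(pure))   # -> 0.0
print(von_neumann_entropy(mixed))  # -> 1.0
```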
4. Basic properties of the von Neumann entropy
- Suppose we have an ensemble of pure states $\rho = \sum_k q_k |\varphi_k\rangle\langle\varphi_k|$; then $H(q) \geq S(\rho)$
- Equality holds only if the $|\varphi_k\rangle$ are orthogonal, in which case the quantum source reduces to a classical source and all signal states can be distinguished
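A concrete instance, using the ensemble that reappears in the compression examples below (letters $|\!\uparrow_z\rangle$ and $|\!\uparrow_x\rangle$, each with probability 1/2):

$$\rho = \tfrac12|\!\uparrow_z\rangle\langle\uparrow_z| + \tfrac12|\!\uparrow_x\rangle\langle\uparrow_x| = \begin{pmatrix} 3/4 & 1/4 \\ 1/4 & 1/4 \end{pmatrix}, \qquad \lambda_\pm = \cos^2\tfrac{\pi}{8},\ \sin^2\tfrac{\pi}{8},$$

so $S(\rho) = H(\cos^2\tfrac{\pi}{8}) \approx 0.6009 < 1 = H(q)$: the inequality is strict because $\langle\uparrow_z|\!\uparrow_x\rangle \neq 0$.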
5. Transmission of quantum info over quantum channels: the quantum analog of the noiseless coding theorem
How many qubits are needed to store the output of the source, so that the output can be reliably recovered?
In this case each letter of the message is drawn from an ensemble of pure states $|\varphi_k\rangle$ (e.g. polarization states of a photon that are not necessarily orthogonal) with probability $p_k$. Thus each letter is described by the density matrix $\rho = \sum_k p_k |\varphi_k\rangle\langle\varphi_k|$.
The entire message is given by $\rho^{\otimes n}$.
How redundant is this message? Schematic of the scheme: $\rho^{\otimes n}$ (n uses) → compress → nR qubits (a smaller Hilbert space) → decompress → $\rho^{\otimes n}$.
6. The typical subspace
To devise a quantum code that compresses without loss of fidelity, we develop the notion of a typical subspace rather than a typical sequence.
7. Quantum analog of the noiseless coding theorem
- Use the diagonal form $\rho = \sum_x \lambda_x |x\rangle\langle x|$
- A given sequence $x_1 \ldots x_n$ has probability $\lambda_{x_1}\cdots\lambda_{x_n} \approx 2^{-nS(\rho)}$
- The typical subspace has dimension $\approx 2^{nS(\rho)}$
- Therefore, we can Schumacher compress $n \to nS(\rho)$ qubits
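A sketch of the counting behind this claim, for the two-eigenvalue density matrix used in this lecture ($\lambda = \cos^2\tfrac{\pi}{8}, \sin^2\tfrac{\pi}{8}$); the tolerance delta and the block sizes n are illustrative choices, not values from the slides:

```python
import math

lam0, lam1 = math.cos(math.pi/8)**2, math.sin(math.pi/8)**2
S = -(lam0*math.log2(lam0) + lam1*math.log2(lam1))   # ~0.6009 bits/letter

def log2_comb(n, k):
    return (math.lgamma(n+1) - math.lgamma(k+1) - math.lgamma(n-k+1)) / math.log(2)

def typical_stats(n, delta=0.05):
    """Sequences with k 'small' eigenvalues all have probability
    lam0^(n-k) * lam1^k; keep those within 2^{-n(S +/- delta)}."""
    log2_dim_terms, prob = [], 0.0
    for k in range(n + 1):
        log2_p = (n-k)*math.log2(lam0) + k*math.log2(lam1)
        if -n*(S + delta) <= log2_p <= -n*(S - delta):
            log2_c = log2_comb(n, k)
            log2_dim_terms.append(log2_c)
            prob += 2.0**(log2_c + log2_p)
    m = max(log2_dim_terms)
    log2_dim = m + math.log2(sum(2.0**(t - m) for t in log2_dim_terms))
    return log2_dim / n, prob

for n in (20, 200, 2000):
    rate, prob = typical_stats(n)
    print(n, round(rate, 3), round(prob, 3))
# qubits per letter -> S(rho) ~ 0.601 and captured probability -> 1
# as n grows: n letters compress into ~ n*S(rho) qubits.
```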
8.
- The von Neumann entropy is the # of qubits of quantum info carried per letter of the message
- Use large blocks of N independent inputs, with N large (so we can use the properties of the typical subspace)
- Use a coding scheme with NS(ρ) qubits that is
- Faithful for states in the typical subspace
- No good otherwise
- Can always compress unless ρ = ½I (can't compress random qubits)
- Schumacher's protocol:
- Project onto the typical subspace
- If the projection succeeds, then encode
- If not successful, then do nothing (by the law of large numbers, the probability of failure → 0 as N → ∞)
- It can then be shown that the average fidelity → 1
9. Quantum data compression: 1-letter example
- Suppose the letters are single qubits taken from the ensemble $\{|\!\uparrow_z\rangle, |\!\uparrow_x\rangle\}$, each with probability 1/2
- Density matrix of each letter: $\rho = \tfrac12 |\!\uparrow_z\rangle\langle\uparrow_z| + \tfrac12 |\!\uparrow_x\rangle\langle\uparrow_x|$
- The eigenstates $|0'\rangle, |1'\rangle$ of the density matrix are qubits along the axis $\hat n$ bisecting $\hat z$ and $\hat x$
- These overlap with the initial states: $|\langle 0'|\!\uparrow_z\rangle|^2 = |\langle 0'|\!\uparrow_x\rangle|^2 = \cos^2(\pi/8)$
10. Quantum data compression: 1-letter example
- On the other hand, the overlaps with the unlikely eigenstate are $|\langle 1'|\!\uparrow_z\rangle|^2 = |\langle 1'|\!\uparrow_x\rangle|^2 = \sin^2(\pi/8)$
- Thus, if we don't know whether up-z or up-x was sent, the best guess we can make is $|0'\rangle$, which has maximal fidelity
- For either input $|\varphi\rangle$, this fidelity is 0.8535
- Thus, by diagonalizing the density matrix we can decompose the Hilbert space of one qubit into a likely 1-d subspace and an unlikely 1-d subspace
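Explicitly, writing the likely eigenstate in the computational basis (a quick check of the 0.8535 figure):

$$|0'\rangle = \cos\tfrac{\pi}{8}|0\rangle + \sin\tfrac{\pi}{8}|1\rangle, \qquad F = |\langle 0'|\!\uparrow_z\rangle|^2 = |\langle 0'|\!\uparrow_x\rangle|^2 = \cos^2\tfrac{\pi}{8} = \tfrac{2+\sqrt{2}}{4} \approx 0.8536.$$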
11. Quantum data compression: 3-letter example
- Suppose Alice needs to send 3 letters but can only afford 2 qubits
- She could send 2 of the 3 qubits and ask Bob to guess $|0'\rangle$ for the third, with F = 0.8535 overall. Is there a better procedure?
- Yes: in the 1-letter example we saw that by diagonalizing the density matrix we can decompose the Hilbert space of one qubit into a likely 1-d subspace and an unlikely 1-d subspace
- Similarly, we can decompose the Hilbert space of 3 qubits into likely and unlikely subspaces and encode only the most likely
12. Quantum data compression: 3-letter example
- If the signal state is $|\varphi\rangle = |\varphi_1\rangle|\varphi_2\rangle|\varphi_3\rangle$ (where each of the qubits is either $|\!\uparrow_z\rangle$ or $|\!\uparrow_x\rangle$), then the decomposition of the Hilbert space of 3 qubits into likely and unlikely subspaces is given in the eigenbasis $\{|0'\rangle, |1'\rangle\}$
- Likely subspace spanned by $|0'0'0'\rangle,\ |0'0'1'\rangle,\ |0'1'0'\rangle,\ |1'0'0'\rangle$ (at most one unlikely letter)
13. Quantum data compression: 3-letter example
- If we make a fuzzy measurement that projects the signal onto the likely subspace, the probability of a successful projection is $P = \cos^6(\pi/8) + 3\cos^4(\pi/8)\sin^2(\pi/8) \approx 0.942$
- The probability of projecting onto the unlikely subspace is $1 - P \approx 0.058$
- E.g., Alice could apply a unitary U that rotates the 4 high-probability basis states to $|\cdot\rangle|\cdot\rangle|0\rangle$ and the low-probability ones to $|\cdot\rangle|\cdot\rangle|1\rangle$
- Then Alice performs a measurement on the 3rd qubit. If the outcome is $|0\rangle$, then Alice's input state has been projected onto the likely subspace
14. Quantum data compression: 3-letter example
- She then sends the remaining 2 unmeasured qubits to Bob
- When Bob receives this compressed 2-qubit state, he decompresses it by appending a $|0\rangle$ and applying $U^{-1}$
- If Alice's measurement of the third qubit gives $|1\rangle$, then the best she can do is send Bob the state that he will decompress as the most likely state $|0'0'0'\rangle$
- Thus, if Alice encodes the 3-qubit signal state $|\varphi\rangle$ and sends 2 qubits to Bob, who decodes them, then Bob obtains the state $\rho' = E|\varphi\rangle\langle\varphi|E + \langle\varphi|(\mathbb{1}-E)|\varphi\rangle\,|0'0'0'\rangle\langle 0'0'0'|$, where E projects onto the likely subspace
15. Quantum data compression: 3-letter example
- The fidelity of this procedure is $F = \langle\varphi|\rho'|\varphi\rangle = P^2 + (1-P)\,|\langle\varphi|0'0'0'\rangle|^2 \approx 0.9234$
- This is better than the procedure of sending 2 of the 3 qubits with perfect fidelity (total fidelity 0.8535)
- With longer messages the fidelity improves; the eigenvalues of the diagonalized density matrix are $\cos^2(\pi/8) \approx 0.8536$ and $\sin^2(\pi/8) \approx 0.1464$
- The von Neumann entropy of the 1-qubit ensemble is $S(\rho) = H(\cos^2\tfrac{\pi}{8}) \approx 0.6009$; thus we can shorten the message by a factor of 0.6009
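A short numerical check of this protocol, a sketch assembled from the description on slides 13-14 (with $|0'\rangle = \cos\tfrac{\pi}{8}|0\rangle + \sin\tfrac{\pi}{8}|1\rangle$; on failure Bob decodes the most likely state $|0'0'0'\rangle$):

```python
import itertools
import numpy as np

c, s = np.cos(np.pi/8), np.sin(np.pi/8)
up_z = np.array([1.0, 0.0])
up_x = np.array([1.0, 1.0]) / np.sqrt(2)
e0 = np.array([c, s])          # |0'>: likely eigenvector of rho
e1 = np.array([-s, c])         # |1'>: unlikely eigenvector

# Likely subspace: primed basis strings with at most one 1'.
basis = [e0, e1]
likely = [np.kron(np.kron(basis[a], basis[b]), basis[d])
          for (a, b, d) in itertools.product((0, 1), repeat=3)
          if a + b + d <= 1]
E = sum(np.outer(v, v) for v in likely)    # projector onto likely subspace
worst = np.kron(np.kron(e0, e0), e0)       # Bob's guess on failure

fids = []
for letters in itertools.product([up_z, up_x], repeat=3):
    psi = np.kron(np.kron(letters[0], letters[1]), letters[2])
    p_likely = psi @ E @ psi               # prob of successful projection
    # rho' = E|psi><psi|E + (1 - P)|0'0'0'><0'0'0'|
    fids.append(p_likely**2 + (1 - p_likely) * (psi @ worst)**2)

print(np.mean(fids))   # ~0.9234, beating the 0.8535 of sending 2 qubits
```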
16. Quantum data compression
- If Alice just sent classical information (orthogonal quantum states), then Bob could follow the previous procedure to reconstruct Alice's initial state and achieve high-fidelity compression to H(X) bits per letter
- But if the letters are drawn from non-orthogonal pure states, then this compression is not optimal: classical information about the preparation of the state is redundant, because non-orthogonal states cannot be perfectly distinguished
- Hence Schumacher coding can achieve optimal compression to S(ρ) qubits per letter, but at a price:
- Bob receives the message from Alice, but he doesn't know what it is
- Bob can't make a measurement that will determine Alice's message reliably, because the measurement would disturb the state
17. Quantum data compression
[Figure: compression and decompression schematic]
18. Outline of Schumacher's data compression
19. Schumacher compression
20. Fidelity
- Begin with $|\varphi\rangle$ and try to preserve it
- Wind up with some state ρ
- Did we succeed?
- Bad criterion
- Good criterion: one subject to a strong observational test
- This gives the fidelity $F = \langle\varphi|\rho|\varphi\rangle$
23. The idea of the proof is similar to Shannon's proof.
There are two known proofs.
One is a complicated kludge, done from first principles.
The other is an elegant, easy proof that relies on other deep theorems.
24. Some remarks about Schumacher compression
The states do not have to be orthogonal; in fact, it works better if they are not!
Classical messages require H ≥ S bits.
We can also transmit entangled states with F → 1 by using S qubits.
25. Entropy and Thermodynamics
- Two approaches to quantum statistical physics:
- Evolution of a closed system, with coarse graining performed to obtain thermodynamic variables
- Evolution of an open system: a quantum system in contact with an environment; track only the system, don't monitor the environment
- If system A and environment E are initially uncorrelated, $\rho_{AE} = \rho_A \otimes \rho_E$, then the entropy is additive: $S(\rho_{AE}) = S(\rho_A) + S(\rho_E)$
- If we let the open system evolve via $U_{AE}$, then $\rho'_{AE} = U_{AE}\,\rho_{AE}\,U_{AE}^\dagger$
26. Entropy and Thermodynamics
- Since unitary evolution preserves S: $S(\rho'_{AE}) = S(\rho_{AE})$
- Applying subadditivity: $S(\rho'_A) + S(\rho'_E) \geq S(\rho'_{AE})$
- If A and E are uncorrelated, these are equal
- Hence the entropy of the world cannot decrease (see the chain below)
- Info initially encoded in the system (i.e. the ability to distinguish one state from another) is LOST and ends up encoded in the entanglement between system and environment (in principle recoverable, but not in practice)
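Chaining the three steps above into one line:

$$S(\rho'_A) + S(\rho'_E) \;\underset{\text{subadditivity}}{\geq}\; S(\rho'_{AE}) \;\underset{\text{unitarity}}{=}\; S(\rho_{AE}) \;\underset{\text{initial product}}{=}\; S(\rho_A) + S(\rho_E).$$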
27. Entropy and Thermodynamics
- System-environment interactions lead to well-defined global properties, while local properties become uncertain
- Simple derivation of decoherence (without reference to a Hamiltonian):
- Specify the # of states of the system (gas g, Hilbert space dimension $n_g$)
- Specify the # of states of the environment (container c, Hilbert space dimension $n_c$)
- The ensemble average of the von Neumann entropy approaches its maximum as $n_c$ increases
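One standard way to make this quantitative (an addition, not from the slide) is Page's estimate for the average entanglement entropy of a random pure state of gas plus container:

$$\langle S_g \rangle \;\approx\; \log_2 n_g \;-\; \frac{n_g}{2\,n_c \ln 2} \qquad (1 \ll n_g \leq n_c),$$

which approaches the maximum $\log_2 n_g$ as $n_c$ grows.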
28. Entropy and Thermodynamics
- Even though we don't know the actual pure state of the whole, almost all states have the same local properties, in terms of objective uncertainties described by a finite local entropy
- This is generic for any bipartite system and gives rise to thermodynamic behavior
- Highly entangled states are typical, and so are uncertain local properties
29. Thermodynamics of computation
- All computers produce waste heat; how much waste heat is necessary?
- Landauer 1961: the only operations that are thermodynamically irreversible are the logically irreversible ones
- Bennett 1982: the only inherently irreversible operation is erasure of info
- Landauer's principle: the thermodynamic cost of erasure is ΔQ = kT ln 2 per bit
- If the answer has 40 bits, then the minimum energy needed is 40 kT ln 2
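Plugging numbers into Landauer's bound (T = 300 K is our illustrative choice; the slide fixes only the kT ln 2 per-bit cost):

```python
import math

k_B = 1.380649e-23                  # Boltzmann constant, J/K
T = 300.0                           # assumed room temperature
per_bit = k_B * T * math.log(2)     # ~2.87e-21 J per erased bit
print(per_bit, 40 * per_bit)        # 40-bit answer: ~1.1e-19 J minimum
```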
30. Thermodynamics of computation
- If we could discriminate non-orthogonal states, then we could violate the 2nd law of thermodynamics
- We could make a closed loop in which non-orthogonal states are made orthogonal by perfect distinguishability, then used to do work, finally returning to the same state
- No heat is dissipated, but work is done
31. Encoding classical information into quantum states
Alice prepares $\rho_k^Q$ with probability $p_k$.
Bob measures a decoding observable B and tries to infer k.
Mutual information: $H(K{:}B) = H(K) - H(K|B)$. How much classical info can a quantum system carry?
32. Encoding classical information into quantum states
- Potential problems
- Alice could use non-orthogonal states that cannot be perfectly distinguished
- Noise
- Bob can choose different decoding observables
33. Encoding classical information into quantum states
- Alice prepares the state $\rho_k^Q$ with probability $p_k$; the average signal is $\rho^Q = \sum_k p_k \rho_k^Q$
- What is the most efficient way to distinguish them? Define $\chi^Q = S(\sum_k p_k \rho_k^Q) - \sum_k p_k S(\rho_k^Q)$
- Entropy of the average signal minus the average signal entropy
- $\chi^Q$ measures the distinguishability of the signal states $\rho_k^Q$:
- $\chi^Q \geq 0$ (by concavity of S)
- $\chi^Q = 0$ iff the $\rho_k^Q$ are all the same
- $\chi^Q \leq H(p)$, with equality iff the signals are orthogonal
- If the signals are pure states, then $\chi^Q = S(\rho^Q)$
34. Encoding classical information into quantum states
- No measurement can provide more than $\chi^Q$ bits of info about the preparation of Q:
- $I \leq \chi^Q = S(\sum_k p_k \rho_k^Q) - \sum_k p_k S(\rho_k^Q)$
- Entropy of the average signal minus the average signal entropy
- Note: if $\rho_k^Q = |\varphi_k\rangle\langle\varphi_k|$ then $\chi^Q = S^Q$
- This quantity depends not just on ρ but on how it is realized as an ensemble of mixed states
- It is also similar to the mutual information, which tells us how much, on average, the Shannon entropy of Y is reduced once we learn X; similarly, $\chi^Q$ tells us how much, on average, the von Neumann entropy of an ensemble is reduced when the preparation is known
- By suitable choice of code and decoding observable, the alphabet can be used to send up to $\chi^Q$ bits of information per letter with arbitrarily low probability of error
35. Encoding classical information into quantum states
- If Alice prepares the state $\rho_k^{RQ}$ with probability $p_k$, then $\chi^Q \leq \chi^{RQ}$ (a property of the partial trace)
- Distinguishability is not increased by discarding a subsystem
- Distinguishability cannot be increased by dynamics
- Mutual info between the input K and the measurement A, by the Holevo bound: $H(A{:}K) \leq \chi^A \leq \chi^Q$
36. Accessible information: what is the max info in bits that can be obtained using the best measurements?
- The close analogy between the Holevo info and the classical mutual info suggests that $\chi^Q$ is related to the amount of classical info that can be stored in a quantum source
- What is the classical info content that can be extracted?
- Let $I_{\max} = \max\, H(A{:}K)$ (maximized over all decoding observables)
- $I \leq \chi^A$; for example, $I = H(A{:}K) = 0.278$ bits (one can make $I = \chi^A$ iff the $\rho_k^Q$ commute)
37. Accessible information: example
- Suppose $p_a = p_b = p_c = 1/3$; then $\chi^A = 1$
- The best measurement gives I = 0.585 bits, so use 2 copies
- There are 9 possible states: $|aa\rangle, |ab\rangle$, etc.
- Using only $|aa\rangle, |bb\rangle, |cc\rangle$, the entropy is $\chi^{Q_1Q_2} = S^{Q_1Q_2} = 1.369$ bits
- This is 0.685 bits/system
- We improved H(A:K) by using blocks of systems, only some of the signal states, and a good decoding observable
Material in this talk used w/ permission from M.
Nielsen
38. Error correction overview
- Digitization of errors: project back onto the state with no errors
- Measure the errors without measuring the data (see the sketch below)
- Errors are local, but the quantum information is stored in a non-local way
- Info is stored in correlations among several qubits
- Assumption: errors affecting a qubit are uncorrelated
- If they are correlated, then this coding will not improve the reliability
- Non-local information is robust to local disturbances
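A minimal illustration of "measure errors without measuring data", using the 3-qubit bit-flip code (an assumed example; the slide names no specific code). The parity checks Z1Z2 and Z2Z3 locate a single X error while revealing nothing about the encoded amplitudes:

```python
import numpy as np

I2 = np.eye(2); X = np.array([[0.0, 1.0], [1.0, 0.0]]); Z = np.diag([1.0, -1.0])

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

a, b = 0.6, 0.8                      # arbitrary logical amplitudes (hidden data)
logical = np.zeros(8)
logical[0], logical[7] = a, b        # a|000> + b|111>

Z1Z2, Z2Z3 = kron(Z, Z, I2), kron(I2, Z, Z)
errors = [kron(X, I2, I2), kron(I2, X, I2), kron(I2, I2, X)]
for qubit, err in enumerate(errors):
    corrupted = err @ logical
    syndrome = (corrupted @ Z1Z2 @ corrupted, corrupted @ Z2Z3 @ corrupted)
    # Syndromes (-1,+1), (-1,-1), (+1,-1) identify the flipped qubit,
    # independent of a and b: the data is never measured.
    print(qubit, syndrome)
```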