Title: What Entropy Means to You
1 What Entropy Means to You
- Professor Jamie Sleigh
- Waikato Clinical School
- University of Auckland
- Hamilton, New Zealand
2 Stole it from us they did. Stole the precious!
3 Dangerous Metaphors
4 Complexity
- Freedom
- Randomness
- Thermodynamic Entropy
- Information Entropy / Uncertainty
- Disorder
- Irregularity
- Information
5 The Tale of 2 Entropies (and 3 Men)
Opening the Black Box
6 Entropy Talk Outline
- What is Entropy?
- Thermodynamic Entropy: macro and micro
- Shannon Entropy/Uncertainty
- How do we calculate Spectral Entropy?
- Examples
- What does Entropy mean in the Brain?
- Does it tell you something about the Mechanisms of Consciousness?
7 Beware impending maths!
8 (1) The Heat Macroscope: Clausius 1865 (Greek "entrope", transformation)
- Transformation
- 1st law: Quantity is constant but exchangeable
- 2nd law: Quality is degraded and becomes unavailable for work
- dS = dE/T
9 (2) The Heat Microscope: Boltzmann/Maxwell (1876)
- W is the number of arrangements that achieve the same state (e.g. a teenager's bedroom)
- S = k · loge(W)
10 Microstates and Macrostates
- MICROSTATES: several distinct arrangements, e.g. (1), (2) and (3), can produce the same observed state
- MACROSTATE "Voltage of 2 volts": reachable by 3 microstates, so S = k · log(3)
- MACROSTATE "Voltage of 0 volts": reachable by only 1 microstate, so S = k · log(1) = 0 (a small numerical sketch follows below)
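A minimal numerical sketch of the microstate counting above, assuming (purely for illustration) that Boltzmann's constant is set to 1; the microstate counts of 3 and 1 are the slide's own example:

```python
import math

# Boltzmann's formula S = k * log(W): entropy is the log of the number of
# microstates (W) that are consistent with the observed macrostate.
# k is set to 1 here purely for illustration (units are arbitrary).
k = 1.0

def boltzmann_entropy(n_microstates: int) -> float:
    """S = k * log_e(W) for a macrostate realisable in W ways."""
    return k * math.log(n_microstates)

print(boltzmann_entropy(3))  # "2 volts" macrostate, 3 microstates -> ~1.10
print(boltzmann_entropy(1))  # "0 volts" macrostate, 1 microstate  -> 0.0
```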
11 Thermodynamic Entropy (Mixed-up-ness): the spontaneous tendency of objects to become randomly scattered, i.e. the dispersal of energy
[Figure: energy dispersing from a "Start" state to an "End" state]
- Requires: (1) continuous interchange of energy (temperature); (2) ability to access microstates
12 (3) Information, not Heat: Shannon (1948)
- "No one really knows what entropy is, so in a debate you will always have the advantage" (J. von Neumann to Claude Shannon)
- H = -∑ p_i · log2(p_i)
13 How to Calculate Shannon Entropy/Uncertainty (H)
- H = -∑ p_i · log2(p_i)
[Figure: two example probability distributions (p_i) summed term by term, giving H = 0.81 and H = 0.94; a worked sketch follows below]
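A worked sketch of the H = -∑ p_i·log2(p_i) sum. The two distributions below are illustrative stand-ins (the slide's own bar charts are not recoverable from the text), chosen so the results land near the quoted 0.81 and 0.94 bits:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p_i * log2(p_i)), skipping any zero-probability bins."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A more "peaked" distribution gives a lower H than a more even one.
print(round(shannon_entropy([0.75, 0.25]), 2))  # 0.81 bits
print(round(shannon_entropy([0.64, 0.36]), 2))  # 0.94 bits
```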
14 Shannon: linking Mixed-up-ness with Probability
- High probability = Certainty = 1/Randomness
- Information is the reduction in Uncertainty
- Betting in a 10-horse race vs betting in a 2-horse race
15 The Formula ("E = mc2")
- Thermodynamic (Boltzmann) Entropy: S = -k_B · ∑ p_i · log2(p_i)
- Shannon Entropy / Information / Logical Entropies: Uncertainty H = -∑ p_i · log2(p_i)
16 Information Entropy is now divorced from any driving energy!
17 Entropy and Anaesthesia? Axiom: Consciousness has something to do with information processing in the cortex
18 What is a Power Spectrum?
[Figure: a voltage trace decomposed into two component frequencies; the power spectrum plots amplitude against frequency. An illustrative calculation follows below.]
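A small sketch of a power-spectrum calculation, assuming NumPy; the two-component test signal, the 3 Hz and 12 Hz frequencies and the 100 Hz sampling rate are arbitrary choices for illustration, not values from the talk:

```python
import numpy as np

fs = 100.0                           # sampling rate (Hz), illustrative
t = np.arange(0, 10, 1 / fs)         # 10 s of "voltage" samples
x = 1.0 * np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

freqs = np.fft.rfftfreq(len(x), d=1 / fs)
power = np.abs(np.fft.rfft(x)) ** 2  # power at each frequency bin

# The spectrum has two peaks, one per component, with heights following amplitude.
top_two = np.sort(freqs[np.argsort(power)[-2:]])
print(top_two)                       # -> [ 3. 12.]
```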
19 Unconsciousness = loss of EEG high frequencies
[Figure: spectrograms (frequency in Hz against time in sec) for AWAKE and ASLEEP states]
20 Shannon Entropy: a number that measures spread-out-ness
21 The EEG Entropy Zoo
- 1) Time-Series Entropies
  - Shannon (H) = -∑ p · log(p) (the spread of the EEG)
- 2) Spectral Entropies
  - Spectral Entropy (SEN) = H of the spectral powers (p)
  - Renyi Entropies (GSE) = 1/(1-α) · log ∑ p^α
  - Kullback-Leibler / Relative Entropies (K-L) = SEN of a specified EEG epoch compared to the awake epoch
  - Others... also CUP... Fisher, HOS, Wavelet...
- 3) Phase-space / Embedding Entropies
  - Kolmogorov-Sinai entropy is a measure of the rate of information loss from the system. Practical estimators are:
    - Approximate Entropy (ApEn)
    - Entropy of the Singular-Value Decomposition (SVDen)
23 Two Main EEG Entropies
- Both quantify the amount of freedom (not disorder!) in the system
- Spectral entropy
  - freedom in frequency space
  - I.A. Rezek et al., IEEE Trans. Biomed. Eng. (1998)
  - few relevant frequencies → low entropy
  - wide spectrum of frequencies → high entropy
- Approximate Kolmogorov-Sinai entropy
  - freedom in Takens' embedding space
  - Pincus et al., J. Clin. Monit. (1991)
  - Bruhn et al., Anesthesiology (2000)
24 A Spread-out Spectrum: White Noise
- Each of the 50 frequencies (1 to 50 Hz) has the same amplitude
- H = -(1/log 50) × 50 × (1/50) · log(1/50) = 1
25 A Bunched-up Spectrum: Sine Wave
- A single frequency is present (p = 1 there, p = 0 elsewhere)
- H = -(1/log 50) × ∑ 0 · log(0) = 0 (the single p = 1 term contributes log(1) = 0; a sketch comparing the two spectra follows below)
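A sketch of normalised spectral entropy contrasting the two cases above. It assumes NumPy; the bin count comes from the FFT length rather than the slide's 50 frequencies, but the end-points are the same: roughly 1 for white noise, roughly 0 for a pure sine wave.

```python
import numpy as np

def spectral_entropy(x):
    """Shannon entropy of the normalised power spectrum, scaled to [0, 1]."""
    power = np.abs(np.fft.rfft(x)) ** 2
    power = power[1:]                     # drop the DC bin
    p = power / power.sum()               # spectral "probabilities"
    p = p[p > 0]
    h = -(p * np.log(p)).sum()
    return h / np.log(len(power))         # normalise by log(number of bins)

rng = np.random.default_rng(0)
t = np.arange(0, 10, 0.01)
print(spectral_entropy(rng.standard_normal(len(t))))  # white noise -> close to 1
print(spectral_entropy(np.sin(2 * np.pi * 5 * t)))    # pure sine   -> close to 0
```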
26 EEG Power Spectrum - Alert Patient
Spectral Entropy = 0.9
27 EEG Power Spectrum - Anaesthetised (awake spectrum shown for comparison)
Spectral Entropy = 0.4
28 Short Correlation Time - Awake ("Freedom")
Correlation time < 10 msec
29 Long Correlation Time - Asleep ("Prison")
Correlation time ≈ 125 msec (a rough correlation-time estimator is sketched below)
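A rough estimator of correlation time (the lag at which the autocorrelation first drops below 1/e), assuming NumPy; the two test signals are illustrative stand-ins, not EEG:

```python
import numpy as np

def correlation_time(x, dt):
    """Lag (in seconds) at which the autocorrelation first falls below 1/e."""
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf = acf / acf[0]
    below = np.where(acf < 1 / np.e)[0]
    return below[0] * dt if below.size else np.inf

rng = np.random.default_rng(0)
dt = 0.001                                            # 1 kHz sampling
noise = rng.standard_normal(5000)                     # fast, irregular, "awake-like"
slow = np.sin(2 * np.pi * 4 * np.arange(5000) * dt)   # slow 4 Hz wave, "asleep-like"
print(correlation_time(noise, dt))  # about a millisecond
print(correlation_time(slow, dt))   # tens of milliseconds
```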
30 Approximate Entropy ≈ Prediction
- A practical way to estimate the Kolmogorov entropy (a sketch follows below)
- Measures the ability to predict the next point in the EEG
- Is a phase-space method
- Slow EEG waves are more predictable...
- Similar end-result to Spectral Entropy
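A compact sketch of the Pincus-style Approximate Entropy estimator ApEn(m, r). It assumes NumPy, with m = 2 and r = 0.2 × SD as commonly quoted defaults rather than values taken from the talk:

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """ApEn(m, r): low values mean the next point is easy to predict."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def phi(m):
        # All length-m templates of the series
        templates = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
        # Chebyshev distance between every pair of templates
        dists = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        # Fraction of templates within r of each template (self-matches included)
        counts = (dists <= r).mean(axis=1)
        return np.log(counts).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
slow = np.sin(np.linspace(0, 8 * np.pi, 500))  # regular, predictable signal
noisy = rng.standard_normal(500)               # irregular signal
print(approximate_entropy(slow))    # low ApEn
print(approximate_entropy(noisy))   # high ApEn
```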
31 Spectral Entropy vs Log Correlation Time
[Figure: Spectral Entropy (H) plotted against ln(Tcorrel)]
32 A Half-Time Break / Summary
- 1) Thermodynamic entropy explains the dispersal of energy
- 2) Information flow has a similar formula
- 3) General anaesthesia → narrowing of the EEG frequency spectrum
- 4) The Spectral Entropy (= Shannon entropy of the frequencies) measures the narrowing of the EEG spectrum
But wait, there's more: a reconciliation of H and S? Shannon = Boltzmann?
Back to Boltzmann
33 Ceaselessly Colliding: Entropy = energy diffusion (Molecules → Neurons)
34 Thermodynamics → Corticodynamics: Thermal agitation vs Neural agitation
- Temperature (T) = result of the transfer of kinetic energy of molecules by collisions
  - S = ΔE / ΔT
- Cooling reduces the available energy states and slows dispersion
- Excitability = result of the transfer of voltage changes by synapses
  - S = ΔE / Δ(excitability)
- Anaesthesia reduces the synaptic efficiency
- Coma = Freezing
35 Spectral Entropy = Synaptic Agitation (Ornstein-Uhlenbeck process)
- Thermodynamics / Gas
  - Temperature
  - Velocity of molecules ∝ 1/Drift
  - Spectral Entropy ∝ loge(Drift)
  - → Spectral Entropy of the kinetic energy measures Temperature
- Synaptic Excitability
  - Drift term = intensity of the inputs into the dendritic tree
  - Spectral Entropy ∝ loge(Drift)
  - → Spectral Entropy of the EEG measures synaptic activity (an illustrative simulation follows below)
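An illustrative simulation of the drift/spectral-entropy link, assuming NumPy and an Euler-Maruyama discretisation of an Ornstein-Uhlenbeck process; the drift values, step size and noise amplitude are arbitrary choices for the sketch, not figures from the talk:

```python
import numpy as np

def simulate_ou(drift, sigma=1.0, dt=0.001, n=20000, seed=0):
    """Euler-Maruyama simulation of dx = -drift * x * dt + sigma * dW."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = x[i - 1] - drift * x[i - 1] * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

def spectral_entropy(x):
    """Normalised Shannon entropy of the power spectrum (DC bin excluded)."""
    power = np.abs(np.fft.rfft(x)) ** 2
    p = power[1:] / power[1:].sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum() / np.log(len(power) - 1)

# Larger drift -> correlations decay faster -> broader ("whiter") spectrum
# -> higher spectral entropy, in line with the loge(Drift) relation above.
print(spectral_entropy(simulate_ou(drift=5.0)))    # lower
print(spectral_entropy(simulate_ou(drift=500.0)))  # higher
```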
36 Spectral Entropy: Technical Considerations
- Are you measuring the EEG?
  - (or EEG + EMG, eye movements, blinks etc.)
- Are you interested in Unconsciousness or Depth-of-anaesthesia?
- Spectral Entropy is amplitude-independent!
  - (burst suppression → entropy of measurement noise)
- Other agents / consciousness states?
  - (REM sleep, opioids, N2O, ketamine...)
37 Spectral Entropy and Brain Thiopentone Concentrations
38 Sleep and Spectral Entropy
[Figure: Spectral Entropy (SEN) and BIS plotted against time (epochs, 0-1500) across sleep stages for volunteers 3 and 6; SEN spans roughly 0.4 to 1.0]
39 Ketamine
40 Remifentanil and EEG
[Figure: Spectral Entropy traces across awake and surgery periods]
41 Conclusions: Entropy is...
- Transformation
- Ratio of heat to temperature
- Unavailability for work
- Log of the number of available microstates
- Mixed-up-ness
- Dispersal of energy
- Randomness
- Spread-out-ness
- Information
- Disorder
- Uncertainty
- Surprisal
- Freedom
42 Conclusions: Entropy and Anaesthesia
- Awake brain: High Entropy → Freedom ("boiling brain")
  - there are many available microstates
  - energy spreads out easily (spatial coherence or decoherence?)
  - accurate, fast cortical information processing
- Comatose brain: Low Entropy → Prison ("frozen brain")
  - few microstates
  - slow, inaccurate information processing
43 References
- www.phys.waikato.ac.nz/cortex
- http://pespmc1.vub.ac.be/ENTRTHER.html
- http://www.2ndlaw.com/entropy.html
- Inouye T et al. Electroenceph Clin Neurophysiol 1991;79:204.
- Rezek IA et al. IEEE Trans Biomed Eng 1998;45:1186.
- Styer DF. Insight into entropy. Am J Phys 2000;68:1090.