What Entropy Means to You - PowerPoint PPT Presentation


Transcript and Presenter's Notes

Title: What Entropy Means to You


1
What Entropy Means to You
  • Professor Jamie Sleigh
  • Waikato Clinical School
  • University of Auckland
  • Hamilton, New Zealand

2
Stole it from us they did. Stole the precious!
3
Dangerous Metaphors
4
Complexity
Freedom
Randomness
Thermodynamic Entropy
Information Entropy
Uncertainty
Disorder
Irregularity
-Information
5
The Tale of 2 Entropies (and 3 Men)
Opening the Black Box
6
Entropy Talk Outline
  • What is Entropy?
  • Thermodynamic Entropy macro and micro
  • Shannon Entropy/Uncertainty
  • How do we calculate Spectral Entropy?
  • Examples
  • What does Entropy mean in the Brain?
  • Does it tell you something about the Mechanisms
    of Consciousness?

7
Beware impending maths!
8
(1) The Heat Macroscope: Clausius 1865 (Greek η τροπη = Transformation)
  • 1st law - Quantity is constant but exchangeable
  • 2nd law - Quality is degraded and becomes unavailable for work

dS = dE / T
9
(2) The Heat Microscope: Boltzmann/Maxwell (1876)
  • W is the number of arrangements that achieve the same state
  • (e.g. a teenager's bedroom)

S = k · loge(W)
10
Microstates and Macrostates
  • MICROSTATES: the individual arrangements, e.g. three different arrangements (1), (2), (3) versus a single arrangement
  • MACROSTATE: the state actually observed
  • Voltage of 2 volts (3 microstates): S = k · log(3)
  • Voltage of 0 volts (1 microstate): S = k · log(1) = 0
  • (a numerical sketch follows below)
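A tiny numerical sketch of the two cases above, assuming Boltzmann's S = k·log(W); the slide does not say which logarithm base it uses, so the natural log is assumed here:

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant (J/K); any constant k works for the comparison

def boltzmann_entropy(W, k=k_B):
    """S = k * log(W), where W = number of microstates giving the same macrostate."""
    return k * log(W)

print(boltzmann_entropy(3))  # 2-volt macrostate, W = 3 -> S = k*log(3) > 0
print(boltzmann_entropy(1))  # 0-volt macrostate, W = 1 -> S = k*log(1) = 0
```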

11
Thermodynamic Entropy (Mixed-up-ness): the spontaneous tendency of objects to become randomly scattered; the Dispersal of Energy
[Figure: Start and End states of the dispersal]
  • Requires
  • (1) Continuous interchange of energy (temperature)
  • (2) Ability to access microstates

12
(3) Information, not Heat: Shannon (1948)
  • "No one really knows what entropy is, so in a debate you will always have the advantage"
  • J von Neumann to Claude Shannon

H = −Σ pi · log2(pi)
13
How to Calculate Shannon Entropy/Uncertainty (H)
H = −Σ pi · log2(pi)
[Worked example: the formula is applied to each probability pi and the terms summed; the two illustrated distributions give H = 0.81 and H = 0.94 (see the sketch below)]
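A minimal sketch of the calculation above, assuming a discrete probability distribution; the example values are illustrative, not the distributions shown on the slide:

```python
import numpy as np

def shannon_entropy(p, base=2):
    """H = -sum(pi * log(pi)); zero-probability bins contribute 0 by convention."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()          # normalise so the probabilities sum to 1
    p = p[p > 0]             # drop zeros, since 0*log(0) is taken as 0
    return float(-np.sum(p * np.log(p)) / np.log(base))

# Illustrative distributions (not the slide's data):
print(shannon_entropy([0.7, 0.2, 0.1]))   # bunched-up -> lower H
print(shannon_entropy([0.4, 0.3, 0.3]))   # more even -> higher H
```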
14
Shannon linking Mixed-up-ness with Probability
  • High probability = Certainty = 1/Randomness
  • Information is the reduction in Uncertainty
  • Betting in a 10 horse race vs
  • Betting in a 2 horse race

15
The Formulae (entropy's 'E = mc2')
  • Thermodynamic Boltzmann Entropy
  • Entropy (S) = −kB · Σ pi · log2(pi)
  • Shannon Entropy / Information / Logical Entropies
  • Uncertainty (H) = −Σ pi · log2(pi)

16
Information Entropy is now divorced from any
driving energy!
17
Entropy and Anaesthesia? Axiom: Consciousness has something to do with information processing in the cortex
18
What is a Power Spectrum?
[Figure: a voltage trace built from two components (Frequency 1 and Frequency 2) and its spectrum showing the Amplitude at each Frequency; a sketch follows below]
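A minimal sketch of the idea, assuming a toy "voltage" built from two sine waves; the FFT recovers the amplitude at each frequency (sampling rate, frequencies and amplitudes are all illustrative):

```python
import numpy as np

fs = 200.0                                   # sampling rate in Hz (illustrative)
t = np.arange(0, 2.0, 1 / fs)                # 2 seconds of signal
voltage = 1.0 * np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)

spectrum = np.fft.rfft(voltage)
freqs = np.fft.rfftfreq(len(voltage), 1 / fs)
amplitude = 2 * np.abs(spectrum) / len(voltage)   # amplitude per frequency bin

# Prints the two component frequencies: amplitude ~1.0 at 5 Hz and ~0.5 at 20 Hz
for f, a in zip(freqs, amplitude):
    if a > 0.1:
        print(f"{f:.1f} Hz : amplitude {a:.2f}")
```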
19
Unconsciousness = loss of EEG high frequencies
[Spectrogram: EEG Frequency (Hz) against Time (sec), AWAKE vs ASLEEP]
20
Shannon Entropy: a number that measures spread-out-ness
21
The EEG Entropy Zoo
  • 1) Time Series Entropies
  • Shannon (H) = −Σ p·log(p) (the spread of the EEG)
  • 2) Spectral Entropies
  • Spectral Entropy (SEN) = H of the spectral powers (p)
  • Renyi Entropies (GSE) = 1/(1−α) · log(Σ p^α) (see the sketch after this list)
  • Kullback-Leibler/Relative Entropies (K-L) = SEN of a specified EEG epoch compared to the awake epoch
  • Others... also CUP, Fisher, HOS, Wavelet...
  • 3) Phase-space/Embedding Entropies
  • Kolmogorov-Sinai entropy is a measure of the rate of information loss from the system. Practical estimators are:
  • Approximate Entropy (ApEn)
  • Entropy of the Singular-Value Decomposition (SVDen)
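A hedged sketch of the first two zoo members, treating the spectral powers p as a probability distribution; the Rényi order α and the toy distribution are illustrative choices, not values from the talk:

```python
import numpy as np

def renyi_entropy(p, alpha, base=2):
    """Renyi entropy 1/(1-alpha) * log(sum(p**alpha)); alpha -> 1 recovers Shannon's H."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log(p)) / np.log(base))       # Shannon limit
    return float(np.log(np.sum(p ** alpha)) / ((1 - alpha) * np.log(base)))

# Illustrative spectral power distribution (not real EEG):
p = [0.5, 0.25, 0.15, 0.07, 0.03]
print(renyi_entropy(p, alpha=1.0))   # Shannon (H) of the spectral powers
print(renyi_entropy(p, alpha=2.0))   # Renyi order 2
```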

22
(No Transcript)
23
Two Main EEG Entropies
  • Quantifies the amount of freedom (not disorder!)
    in the system
  • Spectral entropy
  • freedom in frequency space
  • I.A. Rezek et al., IEEE Trans. Biomed. Eng.
    (1998)
  • few relevant frequencies => low entropy
  • wide spectrum of frequencies => high entropy
  • Approximate Kolmogorov-Sinai entropy
  • freedom in Takens' embedding space
  • Pincus et al., J. Clin. Monit. (1991)
  • Bruhn et al., Anesthesiology (2000)

24
White Noise - each of the 50 frequency bins (1-50 Hz) has the same amplitude:
H = −[1/log(50)] · Σ (1/50)·log(1/50) = 1 (the 50 equal terms sum to the maximum, normalised value of 1)
A Spread-out Spectrum
25
Sine Wave - only a single one of the 50 frequency bins (1-50 Hz) is present:
H = −[1/log(50)] · [1·log(1) + Σ 0·log(0)] = 0
A Bunched-up Spectrum (see the sketch below)
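A sketch of the two worked examples, assuming the normalised spectral entropy H = −Σ p·log(p) / log(N) over the N power-spectrum bins (sampling rate and signal lengths are illustrative):

```python
import numpy as np

def spectral_entropy(signal):
    """Normalised Shannon entropy of the power spectrum: 1 = flat spectrum, 0 = single peak."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    power = power[1:]                      # ignore the DC bin
    p = power / power.sum()                # spectral powers treated as probabilities
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(len(power)))

fs = 100.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

print(spectral_entropy(rng.standard_normal(len(t))))    # white noise -> close to 1
print(spectral_entropy(np.sin(2 * np.pi * 10 * t)))     # pure sine wave -> close to 0
```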
26
EEG Power Spectrum - Alert Patient
Spectral Entropy = 0.9
27
EEG Power Spectrum - Anaesthetised (Awake spectrum shown for comparison)
Spectral Entropy = 0.4
28
Short Correlation Time - Awake
Freedom
Correlation Time < 10 msec
29
Long Correlation Time - Asleep
Prison
Correlation Time = 125 msec
30
Approximate Entropy = Prediction
  • A practical way to estimate the Kolmogorov
    entropy.
  • Measures ability to predict the next point in the
    EEG
  • Is a Phase-Space method
  • Slow EEG waves are more predictable...
  • Similar end-result to Spectral Entropy (a sketch follows below)
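A compact sketch of approximate entropy in the usual Pincus formulation, with the common choices m = 2 and r = 0.2·SD; these parameters are illustrative, not a validated depth-of-anaesthesia pipeline:

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """ApEn(m, r): lower values mean the next point is easier to predict."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)                       # tolerance: 20% of the signal's SD

    def phi(m):
        # All length-m templates, and for each the fraction of templates within tolerance r
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        counts = np.array([
            np.sum(np.max(np.abs(templates - tmpl), axis=1) <= r)
            for tmpl in templates
        ])
        return np.mean(np.log(counts / (n - m + 1)))

    return phi(m) - phi(m + 1)

t = np.linspace(0, 10, 1000)
print(approximate_entropy(np.sin(2 * np.pi * t)))                            # slow, regular -> low ApEn
print(approximate_entropy(np.random.default_rng(0).standard_normal(1000)))   # noisy -> higher ApEn
```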

31
Spectral Entropy vs Log Correlation Time
[Plot: Spectral Entropy (H) against ln(Tcorrel)]
32
A Half-Time Break / Summary:
1) Thermodynamic entropy explains the dispersal of energy
2) Information flow has a similar formula
3) General anaesthesia => narrowing of the EEG frequency spectrum
4) The Spectral Entropy (= Shannon entropy of the spectral frequencies) measures the narrowing of the EEG spectrum
But wait, there's more: a reconciliation of H and S? Shannon = Boltzmann?
Back to Boltzmann
33
Ceaselessly Colliding: Entropy = energy diffusion
Molecules / Neurons
34
Thermodynamics / Corticodynamics: Thermal agitation vs Neural agitation
  • Temperature (T) = result of transfer of the kinetic energy of molecules by collisions
  • S = ΔE / ΔT
  • Cooling reduces the available energy states and slows dispersion
  • Excitability = result of transfer of voltage changes by synapses
  • S = ΔE / Δ(excitability)
  • Anaesthesia reduces the synaptic efficiency
  • Coma = 'Freezing'

35
Spectral Entropy = Synaptic Agitation: the Ornstein-Uhlenbeck process
  • Thermodynamics (Gas)
  • Temperature: the velocity of molecules decorrelates over a time ~ 1/Drift
  • Spectral Entropy ~ loge(Drift)
  • => Spectral Entropy of the kinetic energy measures Temperature
  • Synaptic Excitability
  • Drift term = intensity of inputs into the dendritic tree
  • Spectral Entropy ~ loge(Drift)
  • => Spectral Entropy of the EEG measures synaptic activity (see the sketch after this list)
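A sketch of this claim, assuming an Ornstein-Uhlenbeck process dx = −drift·x·dt + σ·dW simulated with Euler-Maruyama; it only checks the qualitative trend (larger drift gives a broader spectrum and hence a higher spectral entropy), and all parameter values are illustrative:

```python
import numpy as np

def simulate_ou(drift, sigma=1.0, dt=0.001, n=100_000, seed=0):
    """Euler-Maruyama simulation of dx = -drift * x * dt + sigma * dW."""
    rng = np.random.default_rng(seed)
    dW = rng.standard_normal(n) * np.sqrt(dt)
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = x[i - 1] - drift * x[i - 1] * dt + sigma * dW[i]
    return x

def spectral_entropy(signal):
    """Normalised Shannon entropy of the power spectrum (as in the earlier sketch)."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    power = power[1:]
    p = power / power.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(len(power)))

# Larger drift -> faster decorrelation -> flatter spectrum -> higher spectral entropy
for drift in (5.0, 50.0, 500.0):
    print(drift, round(spectral_entropy(simulate_ou(drift)), 3))
```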

36
Spectral Entropy Technical Considerations
  • Are you measuring the EEG?
  • (or EEG + EMG, eye movements, blinks etc)
  • Are you interested in Unconsciousness or
    Depth-of-anaesthesia?
  • Spectral Entropy is amplitude independent!
  • (burst suppression => entropy of measurement noise)
  • Other agents/consciousness states?
  • (REM sleep, opioids, N2O, ketamine...)

37
Spectral Entropy and Brain Thiopentone
concentrations
38
Sleep and Spectral Entropy
[Figure: Spectral Entropy (SEN, 0.4-1.0) and BIS against Time (epochs, 0-1500), with sleep stages for volunteers 3 and 6]
39
Ketamine
40
Remifentanil and EEG
[Figure: Spectral Entropy during Surgery vs Awake]
41
Conclusions: Entropy is...
  • Transformation
  • Ratio of heat to temperature
  • Unavailability for work
  • Log of the number of available microstates
  • Mixed-up-ness
  • Dispersal of energy
  • Randomness
  • Spread-out-ness
  • Information
  • Disorder
  • Uncertainty
  • Surprisal
  • Freedom

42
Conclusions: Entropy and Anaesthesia
  • Awake brain = High Entropy => Freedom: a 'Boiling Brain'
  • there are many available microstates
  • energy spreads out easily (spatial coherence or decoherence?)
  • accurate, fast cortical information processing
  • Comatose brain = Low Entropy => Prison: a 'Frozen Brain'
  • few microstates
  • slow, inaccurate information processing

43
References
  • www.phys.waikato.ac.nz/cortex
  • http://pespmc1.vub.ac.be/ENTRTHER.html
  • http://www.2ndlaw.com/entropy.html
  • Inouye T et al. Electroenceph Clin Neurophysiol 1991;79:204-
  • Rezek IA et al. IEEE Trans Biomed Eng 1998;45:1186
  • Styer DF. Insight into entropy. Am J Phys 2000;68:1090

44
(No Transcript)