A Literature Survey On Entropy Measures
Assignment IV Report
University of Manitoba
Faculty of Engineering
Department of Mechanical and Manufacturing Engineering
Contents

CHAPTER 1 - DYNAMICAL SYSTEMS, SHORT-NOISY DATA SAMPLES AND ENTROPY MEASUREMENTS
1.1 Dynamical Systems
1.2 Shannon's Entropy
1.3 Concept of Entropy in Dynamical Systems
1.4 Entropy in Time Measured Signals
1.5 Short and Noisy Time Series
CHAPTER 2 - GRAHAM'S ENTROPIES
2.1 Coarse Time Series Quantization and Vector Identifiers
2.2 Quantized Dynamical Entropy
2.3 Quantized Approximation of Sample Entropy
CHAPTER 3 - ENTROPY FEATURES ANALYSIS AND COMPARISON
3.1 Dependence on Data Length
3.2 Robustness to Observational Noise
3.3 Computational Efficiency
CHAPTER 4 - CONCLUSIONS
References
Shannon's entropy of a discrete probability distribution \{p_i\} is

H = -\sum_{i=1}^{n} p_i \log p_i    (1.1)
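As a concrete illustration of equation (1.1), a minimal Python sketch (illustrative only; the report's own programs were written in MATLAB) computes the entropy directly from the probabilities:

```python
import math

def shannon_entropy(probs, base=math.e):
    # H = -sum p_i * log(p_i); terms with p_i == 0 contribute 0 by convention
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# With base 2 the result is in bits: a fair coin carries exactly one bit.
h_coin = shannon_entropy([0.5, 0.5], base=2)
```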
Suppose that there is an attractor in phase space and that the trajectory x(t) is in box i_1 at time t = 0, in box i_2 at t = \tau, and so on up to box i_d at t = (d-1)\tau, where \tau is the sampling interval and the boxes have side length \varepsilon. The Kolmogorov-Sinai (KS) entropy is then the average rate of information loss:

K = -\lim_{\tau \to 0} \lim_{\varepsilon \to 0} \lim_{d \to \infty} \frac{1}{d\tau} \sum_{i_1, i_2, \ldots, i_d} p(i_1, i_2, \ldots, i_d) \ln p(i_1, i_2, \ldots, i_d)    (1.2)
Generalizing, the order-q Rényi entropies are

K_q = -\lim_{\tau \to 0} \lim_{\varepsilon \to 0} \lim_{d \to \infty} \frac{1}{d\tau} \frac{1}{q-1} \ln \sum_{i_1, i_2, \ldots, i_d} p^q(i_1, i_2, \ldots, i_d)    (1.3)
Among the quantities K_q, K_2 is more practical due to its ease of calculation from a measured time series. Generically, the whole trajectory can be reconstructed from d measurements where d \ge d_F. From the reconstructed delay vectors X_n, we consider then:

C_d(\varepsilon) = \lim_{N \to \infty} \frac{1}{N^2} \, [\text{number of pairs } (n, m) \text{ with } |X_n - X_m| < \varepsilon], \quad |X_n - X_m| = \Big( \sum_{i=1}^{d} (x_{n+i} - x_{m+i})^2 \Big)^{1/2}    (1.4)

Then K_{2,d}(\varepsilon) is obtained as follows:

K_{2,d}(\varepsilon) = \frac{1}{\tau} \ln \frac{C_d(\varepsilon)}{C_{d+1}(\varepsilon)}    (1.5)

and,

K_2 = \lim_{\varepsilon \to 0} \lim_{d \to \infty} K_{2,d}(\varepsilon)    (1.6)
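The finite-N correlation sum underlying this estimate can be sketched in Python (a brute-force illustration, not the report's MATLAB code; for finite data the outer limits are simply dropped):

```python
import math

def correlation_sum(x, d, eps):
    """Finite-N estimate of C_d(eps): the fraction of ordered pairs of
    d-dimensional delay vectors whose Euclidean distance is below eps."""
    M = len(x) - d + 1                      # number of delay vectors
    vecs = [x[n:n + d] for n in range(M)]
    count = 0
    for n in range(M):
        for m in range(M):
            if n == m:
                continue                    # skip self-pairs
            dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(vecs[n], vecs[m])))
            if dist < eps:
                count += 1
    return count / M ** 2
```

In practice C_d(eps) is evaluated for increasing d, and the ratio of successive values gives the K_{2,d} estimate of equation (1.5).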
For a measured time series u(1), u(2), \ldots, u(N), approximate entropy works with the vectors

X_i = [u(i), u(i+1), \ldots, u(i+m-1)]    (1.7)

obtained from m consecutive measurements. For each i, the fraction of vectors within tolerance r is

C_i^m(r) = [\text{number of } X_j \text{ such that } d[X_i, X_j] \le r] \, / \, (N - m + 1)    (1.8)

where the distance between X_i and X_j is taken as

d[X_i, X_j] = \max_{k=1,\ldots,m} |u(i+k-1) - u(j+k-1)|    (1.9)
Next define

\Phi^m(r) = \frac{1}{N-m+1} \sum_{i=1}^{N-m+1} \ln C_i^m(r)    (1.10)

and define

ApEn(m, r, N) = \Phi^m(r) - \Phi^{m+1}(r)    (1.11)

ApEn(m, r, N) is approximately equal to the negative average natural logarithm of the conditional probability that sequences which are close (within r, per equation (1.8)) for m points remain close when m is incremented. As ApEn counts each vector as matching itself to avoid ln(0) in equation (1.10), it is biased; this bias causes the ApEn measure to be heavily dependent on the record length, where for short time series, more similarity than is present is observed. Despite this inefficiency, ApEn is widely applied in cardiovascular studies.
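The recipe of equations (1.7)-(1.11) can be sketched directly as an O(N^2) Python function (an illustrative sketch; the report's analysis programs were written in MATLAB):

```python
import math

def apen(u, m, r):
    """Approximate entropy: phi(m) - phi(m+1), with self-matches counted
    (the source of the short-record bias discussed above)."""
    N = len(u)

    def phi(k):
        M = N - k + 1
        vecs = [u[i:i + k] for i in range(M)]
        total = 0.0
        for i in range(M):
            # Chebyshev distance; count includes j == i, so count >= 1
            count = sum(
                1 for j in range(M)
                if max(abs(a - b) for a, b in zip(vecs[i], vecs[j])) <= r
            )
            total += math.log(count / M)
        return total / M

    return phi(m) - phi(m + 1)
```

A perfectly constant record yields ApEn = 0, since every template always matches.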
The bias of ApEn motivated sample entropy (SampEn). Given N data points u(1), u(2), \ldots, u(N) = \{u(j) : 1 \le j \le N\}, form the vectors

x_m(i) = [u(i), u(i+1), \ldots, u(i+m-1)], \quad 1 \le i \le N-m+1    (1.12)

and take the distance between two such vectors to be

d[x_m(i), x_m(j)] = \max_{k=0,\ldots,m-1} |u(i+k) - u(j+k)|    (1.13)

Next define

B_i^m(r) = \frac{1}{N-m-1} \times [\text{number of } x_m(j) \text{ within } r \text{ of } x_m(i), \; j = 1, \ldots, N-m, \; j \ne i]    (1.14)

and, using the distance d[x_{m+1}(i), x_{m+1}(j)] of the (m+1)-point vectors,

A_i^m(r) = \frac{1}{N-m-1} \times [\text{number of } x_{m+1}(j) \text{ within } r \text{ of } x_{m+1}(i), \; j = 1, \ldots, N-m, \; j \ne i]    (1.15)

From B_i^m(r) and A_i^m(r) we proceed to define

B^m(r) = \frac{1}{N-m} \sum_{i=1}^{N-m} B_i^m(r)    (1.16)

A^m(r) = \frac{1}{N-m} \sum_{i=1}^{N-m} A_i^m(r)    (1.17)

Since self-matches are excluded, when no matches are found a ln(0) results and the estimate is undefined. B^m(r) is the probability that two sequences match for m points, whereas A^m(r) is the probability that two sequences match for m+1 points. Sample entropy is then

SampEn(m, r) = \lim_{N \to \infty} -\ln \frac{A^m(r)}{B^m(r)}    (1.18)

which for a finite record is estimated as

SampEn(m, r, N) = -\ln \frac{A^m(r)}{B^m(r)}    (1.19)

where m, r, and N are as above. SampEn(m, r) differs from ApEn in two ways:
1- self-matches are not counted; and
2- the logarithm is taken of the ratio of the totals rather than averaging per-template logarithms, which reduces the record-length dependence of SampEn(m, r).
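Equations (1.12)-(1.19) can likewise be sketched as a short Python function (illustrative only, not the report's MATLAB implementation); since the denominators in (1.14)-(1.17) are shared between the m and m+1 counts, they cancel in the ratio and only the raw match totals are needed:

```python
import math

def sampen(u, m, r):
    """Sample entropy: -ln(A/B), with self-matches excluded.
    Both template lengths use the same N-m starting points, so the
    shared normalisation cancels in the ratio A/B."""
    N = len(u)

    def total_matches(k):
        templates = [u[i:i + k] for i in range(N - m)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):   # j > i: no self-matches
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits

    B = total_matches(m)
    A = total_matches(m + 1)
    return -math.log(A / B)   # undefined (error) when no matches occur
```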
The time series X is first coarsely quantized with resolution r:

X_q = \left\lfloor \frac{X - \min(X)}{r} \right\rfloor, \quad r > 0

where \lfloor \cdot \rfloor is the floor function: it rounds the value inside the function to the nearest whole number towards negative infinity.

Then these quantized data samples are grouped (called vector groups) using an embedding dimension m (m \in \mathbb{N}, m \ge 1). This embedding dimension determines the length of the vector. Let us call these vector groups V_j, where 1 \le j \le N-m+1.

The vector identifiers Q(j) encode each vector group as a single integer:

Q(j) = \sum_{i=1}^{m} V_j(i) \, h^{i-1}

where h is the number of quantization levels. The probability of each identifier is estimated by its relative frequency:

p(j) = \frac{\text{number of occurrences of } Q(j)}{N-m+1}

Therefore, in analogy with Shannon's entropy H = -\sum_i p_i \log p_i, the Quantized Dynamical Entropy H(m, r) is

H(m, r) = -\sum_j p(j) \log_2 p(j)

The unit of QDE is bits due to the base 2 of the logarithm. Since the calculation of QDE depends only on finite parameters, compared to the difficult and data-sensitive limit calculations in some entropy measures such as KS entropy, QDE paves the way to much easier calculations of entropy measures.
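The three steps above (quantize, form identifiers, take the base-2 Shannon entropy of the identifier frequencies) can be sketched in Python (an illustrative sketch with hypothetical names, not the report's MATLAB program):

```python
import math
from collections import Counter

def qde(x, m, r):
    """Quantized Dynamical Entropy, in bits."""
    # 1) coarse quantization with resolution r (floor towards -infinity)
    lo = min(x)
    q = [math.floor((v - lo) / r) for v in x]
    h = max(q) + 1                                   # number of quantization levels
    # 2) m-length vectors -> integer identifiers Q(j) = sum V_j(i) * h^(i-1)
    ids = [sum(q[j + i] * h ** i for i in range(m))
           for j in range(len(x) - m + 1)]
    # 3) base-2 Shannon entropy of the identifier distribution
    n = len(ids)
    return -sum((c / n) * math.log2(c / n) for c in Counter(ids).values())
```

A constant series maps to a single identifier and therefore carries zero bits.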
Recall the finite-N estimate of sample entropy,

SampEn(m, r, N) = -\ln \left[ \frac{A^{m+1}(r)}{B^m(r)} \right]

where

B^m(r) = (N-m)^{-1} \sum_{i=1}^{N-m} B_i^m(r)

A^{m+1}(r) = (N-m)^{-1} \sum_{i=1}^{N-m} A_i^{m+1}(r)

with

B_i^m(r) = \frac{\text{number of vectors } X_j^m \text{ within } r \text{ of } X_i^m \; (j \ne i)}{N-m-1}

and A_i^{m+1}(r) defined analogously for the (m+1)-dimensional vectors X_i^{m+1}.
Then, the SampEn vector matches are approximated with counts of vector identifiers in order to arrive at QASE. Writing n_i^{(m)} for the number of occurrences of the identifier Q of the i-th m-dimensional quantized vector,

\hat{B}^m(2r) = (N-m)^{-1} \sum_{i=1}^{N-m} \frac{n_i^{(m)} - 1}{N-m-1}

\hat{A}^{m+1}(2r) = (N-m)^{-1} \sum_{i=1}^{N-m} \frac{n_i^{(m+1)} - 1}{N-m-1}

The self-matches of vector identifiers are removed by deducting one from the number of occurrences of each vector identifier. Then QASE can be derived as

QASE(m, 2r) = -\ln \frac{\hat{A}^{m+1}(2r)}{\hat{B}^m(2r)}
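A minimal Python sketch of this approximation (illustrative; identifier equality stands in for SampEn's template matching, and tuples of quantized values are used in place of the integer encoding, which is equivalent for counting):

```python
import math
from collections import Counter

def _quantized_vectors(x, k, r):
    # quantize with resolution r, then form k-length vectors as tuples
    lo = min(x)
    q = [math.floor((v - lo) / r) for v in x]
    return [tuple(q[j:j + k]) for j in range(len(x) - k + 1)]

def qase(x, m, r):
    """Quantized approximation of sample entropy: -ln(A_hat/B_hat),
    with self-matches removed by subtracting one per occurrence count."""
    N = len(x)

    def total_matches(k):
        vecs = _quantized_vectors(x, k, r)[:N - m]   # same N-m templates for both k
        counts = Counter(vecs)
        # ordered non-self matches: sum over templates of (occurrences - 1)
        return sum(counts[t] - 1 for t in vecs)

    B = total_matches(m)
    A = total_matches(m + 1)
    return -math.log(A / B)
```

Because matches are found by hashing identifiers rather than by pairwise distance comparisons, this runs in roughly linear rather than quadratic time, which is the computational advantage analyzed in Chapter 3.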
CHAPTER 3 - ENTROPY FEATURES ANALYSIS AND COMPARISON
Graham's two entropies are analyzed against approximate entropy and sample entropy. The effect of data length (convergence of entropy values with the number of data samples), robustness to observational noise, and finally computational efficiency were analyzed. For the analysis, four MATLAB programs were implemented, one each for ApEn, SampEn, QDE and QASE.
The test data were generated with the logistic map

x_{i+1} = r x_i (1 - x_i)

with the control parameter r varied from 3.5 to 4. Four Gaussian noise levels were used: 0.00 (noiseless), 0.02, 0.06, and 0.20 STD. The noiseless model was used as the benchmark against which each entropy measure was analyzed with noisy data. The following figures illustrate the behavior of each entropy under the noiseless case and the three noise levels.
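The test-signal generation described above can be sketched in Python (illustrative; the report used MATLAB, and the burn-in length and seed here are assumptions, not parameters stated in the report):

```python
import random

def logistic_series(r, n, noise_std=0.0, x0=0.1, burn_in=500, seed=0):
    """n samples of the logistic map x_{i+1} = r x_i (1 - x_i), with
    optional additive Gaussian observational noise of the given STD."""
    rng = random.Random(seed)
    x = x0
    for _ in range(burn_in):              # discard the initial transient
        x = r * x * (1.0 - x)
    out = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(x + rng.gauss(0.0, noise_std))
    return out
```

Sweeping r from 3.5 to 4 and feeding each series to the four entropy programs reproduces the comparison plotted below.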
[Figures 05.a-d: ApEn, SampEn, QDE and QASE versus the logistic-map parameter r at the four noise levels; y-axis: entropy value, x-axis: r. Figure 05.d shows QASE with noise.]
From Figures 05.a-d it can be seen that all entropy values get higher as the noise level goes up. SampEn and ApEn behave very similarly, and no significant difference between them can be identified from the graphs alone; an error calculation between the noisy and noiseless plots might reveal a significant difference between the two. It is possible to deduce from the graphs that both sample and approximate entropies are almost robust to small levels of noise. In real situations the practical noise levels would range between 0 and 0.06 STD, so the 0.20 level is unlikely to be observed in practice; that high level of noise was nevertheless used to check the robustness of the entropies at higher noise levels.

In contrast, QDE and QASE are not as robust as the aforementioned entropies, although at the lower noise levels (0.02 and 0.06 STD) neither deviates much from the original noiseless shape of the graph. QASE is more robust than QDE: QDE loses its robustness to noise at the 0.2 STD level, while QASE manages to keep the basic shape of the plot.
CHAPTER 4 - CONCLUSIONS
Both sample entropy and the quantized approximation of sample entropy show better convergence with the number of data samples than approximate and quantized dynamical entropies. Comparing SampEn and QASE, it was observed that SampEn performs slightly better than the latter. Therefore it can be concluded that, in terms of convergence performance, sample entropy performs best and QASE follows.

Approximate entropy and sample entropy both show robust characteristics under observational noise. At the maximum noise level among the tests (0.2 STD of Gaussian noise), SampEn performs slightly better than ApEn. QDE and QASE show weaker robustness to noise, with QASE preserving the basic shape of the plot better than QDE.
References

[1] C. E. Shannon, "A mathematical theory of communication," SIGMOBILE Mob. Comput. Commun. Rev., vol. 5, no. 1, 2001.