Analysis of Shannon-Fisher Information Plane in Time Series Based On Information Entropy
Yuanyuan Wang and Pengjian Shang
The paper proposes a Shannon-Fisher information plane based on the information entropy to analyze the financial time series of U.S., Australian, and Asian stock markets. As we know, the probability distribution P mainly depends on the number of possible states of the system, M. We find that as the embedding dimension m and the parameter M increase, the normalized Shannon entropy increases and the Fisher information measure (FIM) decreases, indicating that the financial time series becomes more complex: it yields a large normalized Shannon entropy, and the corresponding FIM is smaller. In addition, applying this method to the financial stock markets, we conclude that the ten stock markets fall into three groups according to their own characteristics. The three Asian stock markets TWII, SSEC, and CTSP300 are grouped together. The second group consists of the American stock markets DJI, NDX, NYA, and S&P500. The remaining Australian stock markets AFLI, AORD, and AXJO form the third group. We observe that the FIM values of the four American stock markets DJI, NDX, NYA, and S&P500 are bigger than those of the other stock markets, which suggests that these four American stock markets are less active than the Asian and Australian stock markets.

a) Author to whom correspondence should be addressed: 17121633@bjtu.edu.cn
b) pjshang@bjtu.edu.cn

1054-1500/2018/28(10)/103107/9/$30.00 28, 103107-1 Published by AIP Publishing.
Y. Wang and P. Shang, Chaos 28, 103107 (2018)

I. INTRODUCTION

Time series of measurements are the basic elements for investigating natural phenomena. From time series, we can extract much information on dynamical systems. Recently, information-based measures of randomness (or "regularity") of time series have attracted more and more attention in various branches of science.1–11 As is well known, the Shannon entropy is an important magnitude for quantifying the degree of disorder in dynamical systems. It is a standard measure of the order state of sequences and has been applied previously to DNA sequences. The Shannon entropy quantifies the probability density function of the distribution of values. In addition, it can also be used to represent the degree of uncertainty involved in predicting the output of a probabilistic event.12,13 For discrete distributions, the probability is maximal if one predicts the result exactly before it happens; as a consequence, the Shannon entropy is then minimal. If one is absolutely capable of predicting the result of an incident, the Shannon entropy is zero. For continuous distributions, that is not the case: over the real line, for instance, the Shannon entropy can take any value, positive or negative. In practice, some studies have analyzed the physical and chemical properties of atomic and molecular systems from an information-theoretical point of view.14–25

Nowadays, the Fisher information measure (FIM),26 which was introduced by Fisher in 1925 in the context of statistical estimation, is also playing an increasing role in scientific analysis. It is a powerful tool to detect complex and non-stationary signals. The Frieden extreme physical information principle27 utilizes FIM to derive important laws of chemistry and physics,28 such as the equations of nonrelativistic quantum mechanics29 or relevant results in density functional theory.30,31 Since FIM allows one to accurately describe the behaviors of dynamic systems,32 its application to the characterization of complex signals issued from these systems appears quite natural. Martin et al. have employed this approach to characterize the dynamics of electroencephalogram (EEG) signals.33 They have shown the informative content of FIM in disclosing significant changes in the behavior of nonlinear dynamical systems.34 Therefore, FIM is an important quantity involved in many aspects of the theoretical and observational description of natural phenomena. Moreover, one of the interesting results of their study is that FIM can detect non-stationary behaviors in situations where the Shannon entropy shows limited dynamics.
Motivated by this work, some researchers define a Shannon-Fisher information plane as a measure of complexity. In previous studies, Romera and Dehesa35 introduced the concept of the Shannon-Fisher information plane as a specific correlation measure. The Shannon-Fisher plane has since been applied successfully in many fields, such as the analysis of signals32 and the study of electron correlation35 within the He-isoelectronic series. The Shannon-Fisher information plane allows one to quantify the global versus local character of the time series generated by the dynamical process; that is, it incorporates a global measure of uncertainty (Shannon) and a local measure of accuracy (Fisher). The Shannon-Fisher information plane is commonly based on permutation entropy. In this paper, however, we propose a Shannon-Fisher information plane based on the information entropy to analyze financial time series, which gives more detailed, accurate, and clearer information on the classification of financial stock markets. The information entropy rests on a complete quantification of the information, where the information is interpreted in terms of spatial structures (in contrast with the temporal structures used in the multiscale entropy). In the Shannon-Fisher information plane, since the FIM quantifies the amount of organization or order in a system while the Shannon entropy measures its degree of uncertainty or disorder, the larger the FIM, the smaller the Shannon entropy. Applying the Shannon-Fisher information plane to financial time series is therefore very effective and also gives insight into the inner dynamics of a system. For complex systems, there remains the possibility of increasing order, in the sense of ever more intricate patterns, in the presence of increasing noise.

The rest of this paper is arranged as follows. In Sec. II, we present the details of the Shannon-Fisher information plane based on information entropy. Section III shows the results for two types of artificial time series: Autoregressive Fractionally Integrated Moving Average (ARFIMA) models36 and the Chebyshev map model. Application to financial time series is illustrated in Sec. IV. Finally, we summarize and draw conclusions.

II. METHODOLOGY

Consider the relevant Fisher- and Shannon-associated quantities.34 Let f = ψ² be a probability density in R^d (d ≥ 1). Fisher's information associated with f (or with the probability amplitude ψ) is expressed as the (possibly infinite) non-negative number I,

\[ I(f) = \int_{\mathbb{R}^d} \frac{|\nabla f|^2}{f}\,dx, \tag{1} \]

or, in terms of the amplitudes,

\[ I(\psi) = 4 \int_{\mathbb{R}^d} dx\,(\nabla\psi \cdot \nabla\psi), \tag{2} \]

where ∇ is the differential operator. This formula defines a convex, isotropic functional I, which was proposed by Fisher (1925) in the context of statistical estimation,26 and it plays a key role in information theory. From Eq. (2), Martin et al. note that the integrand, being the scalar product of two vectors, is independent of the reference frame.34

Let us pay attention to the one-dimensional case. Consider a random variable x whose probability density function is f(x). Its associated FIM27,37 is given by

\[ I = \int_{-\infty}^{+\infty} \frac{1}{f(x)} \left( \frac{df(x)}{dx} \right)^2 dx = 4 \int_{-\infty}^{+\infty} \left( \frac{d\psi(x)}{dx} \right)^2 dx, \tag{3} \]

where x ∈ R and ∫_{−∞}^{+∞} f(x) dx = 1. By Eq. (3), the division by f(x) is not convenient if f(x) → 0 at certain x-values. To avoid this, some researchers use real probability amplitudes f(x) = ψ²(x),27,37 which gives a simpler form (no divisors) and shows that the FIM simply measures the gradient content of ψ(x). This measure is therefore called a "local" one. The Shannon entropy,12 as a measure of "global character," is defined as

\[ H = -\int_{-\infty}^{+\infty} f(x) \log f(x)\,dx. \tag{4} \]

For convenience, the alternative concept of entropy power12 can be used:

\[ N = \frac{1}{2\pi e}\, e^{2H}; \tag{5} \]

this satisfies the so-called "isoperimetric inequality" IN ≥ D,35,38–40 where D represents the dimension of the space of the variable x.

The probability density function f(x) can be estimated by the kernel density estimator technique,41,42 which approximates the density as in Eq. (6); these forms are shown in Refs. 43 and 44,

\[ \hat{f}_M(x) = \frac{1}{Mb} \sum_{i=1}^{M} K\!\left( \frac{x - x_i}{b} \right), \tag{6} \]

where b is the bandwidth and M is the number of data. The most successful method among current bandwidth selection methods, both empirically and theoretically, is the solve-the-equation plug-in method.45 In addition, K(u) is the kernel function; it should satisfy Eq. (7),

\[ K(u) \ge 0 \quad \text{and} \quad \int_{-\infty}^{+\infty} K(u)\,du = 1. \tag{7} \]

In the estimation procedure, the kernel used is the Gaussian of zero mean and unit variance. As a result, Eq. (6) can also be written as

\[ \hat{f}_M(x) = \frac{1}{M\sqrt{2\pi b^2}} \sum_{i=1}^{M} e^{-\frac{(x - x_i)^2}{2b^2}}. \tag{8} \]

In our study, we let the probability distribution be P = {p_i : i = 1, 2, . . . , M}. In this case, the Shannon entropy is given by

\[ H = -\sum_{i=1}^{M} p_i \ln p_i. \tag{9} \]
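As a concrete illustration of Eqs. (3)-(5) together with the Gaussian kernel estimator of Eq. (8), the sketch below estimates f(x) from a sample and then evaluates the FIM, the Shannon entropy, and the entropy power numerically. It is only a minimal sketch: Silverman's rule-of-thumb bandwidth is used here as a simple stand-in for the solve-the-equation plug-in selector cited in the text, and the grid limits are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(0.0, 1.0, size=1000)   # M data points from a unit Gaussian
M = sample.size

# Gaussian kernel density estimate, Eq. (8); Silverman's rule-of-thumb
# bandwidth stands in for the solve-the-equation plug-in selector.
b = 1.06 * sample.std() * M ** (-1 / 5)
x = np.linspace(-8.0, 8.0, 2001)
dx = x[1] - x[0]
f = np.exp(-(x[:, None] - sample[None, :]) ** 2 / (2 * b**2)).sum(axis=1)
f /= M * np.sqrt(2 * np.pi * b**2)

# FIM, Eq. (3); Shannon entropy, Eq. (4); entropy power, Eq. (5).
df = np.gradient(f, dx)
I = np.sum(df**2 / np.maximum(f, 1e-300)) * dx
H = -np.sum(np.where(f > 0, f * np.log(f), 0.0)) * dx
N = np.exp(2 * H) / (2 * np.pi * np.e)

# For a unit Gaussian the exact values are I = 1, H = 0.5*ln(2*pi*e),
# and N = 1, and the isoperimetric inequality I*N >= D (here D = 1)
# holds with equality; the kernel smoothing perturbs these slightly.
print(I, H, N, I * N)
```

The Gaussian case makes a useful sanity check precisely because it saturates the isoperimetric inequality IN ≥ D of Eq. (5).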
Moreover, the "normalized Shannon entropy" is calculated as

\[ S = \frac{H}{H_{\max}} = -\frac{1}{H_{\max}} \sum_{i=1}^{M} p_i \ln p_i. \tag{10} \]

It is obvious that H_max = ln(M) is obtained for the uniform probability distribution P_e = {p_i = 1/M, ∀ i = 1, 2, . . . , M}.

In practice, discretizing the equivalent continuous expressions in Eqs. (1) and (3) yields discrete forms of Fisher information that do not have the same value. Either Eq. (1) or Eq. (3) can serve as a starting point for the FIM in the discrete case. The proposals of Frieden27 and of Ferri et al.46 take Eq. (1) as the starting point. However, the concomitant discretizations can be considered just approximations to what results when using Eq. (3) as the starting point. We emphasize that a discrete normalized FIM, convenient for our purposes, is defined as

\[ I = I_0 \sum_{i=1}^{M-1} \left[ (p_{i+1})^{1/2} - (p_i)^{1/2} \right]^2. \tag{11} \]

It has been extensively discussed that this discretization is well behaved in a discrete environment,47 and this method for discrete distributions has been employed in other research.48,49 Here, the normalization constant I_0 is given by

\[ I_0 = \begin{cases} 1 & \text{if } p_{i^*} = 1 \text{ for } i^* = 1 \text{ or } i^* = M \text{ and } p_i = 0 \ \forall\, i \neq i^*, \\ 1/2 & \text{otherwise.} \end{cases} \tag{12} \]

The Shannon-Fisher information plane was initially proposed by Vignat and Bercher,32 and it has been applied in many works.50–52 In this plane, if the system lies in a very ordered state, the normalized Shannon entropy S ∼ 0 and the FIM I ∼ 1. On the contrary, when the system is in a very disordered state, S ∼ 1 and I ∼ 0. The Shannon entropy (and better the Shannon length e^{S[f]}, which is always non-negative and has dimension of length)53 is a more appropriate measure of the global spreading because it does not depend on any specific point of the interval of definition of the distribution, and it gives a similar weight to all points of that interval. On the contrary, the local sensitivity of the FIM for discrete probability distributions is reflected in the fact that the specific "i-ordering" of the discrete values p_i must be taken seriously into account in evaluating the sum in Eq. (11). The summands can be regarded as a kind of "distance" between two contiguous probabilities; a different ordering of the pertinent summands would lead to a different FIM value, hence its local nature. Thus, the Shannon-Fisher information plane is an effective tool for contrasting the global and local characteristics of a probability distribution.

A. Algorithm

To put the Shannon-Fisher information plane based on information entropy to use, the method proposed above can be divided into four steps.

Step 1: State-space reconstruction. Consider an original series {u(i), 1 ≤ i ≤ N} and form (N − m + 1) vectors X(i) by X(i) = {u(i), u(i + 1), . . . , u(i + m − 1)}, 1 ≤ i ≤ N − m + 1. Here, m represents the embedding dimension, which is strictly determined by using the Takens theorem.54

Step 2: Distance matrix construction. Define the distance matrix D = {d_ij} among the vectors X(i) and X(j) for all 1 ≤ i, j ≤ N − m + 1. The distance d_ij is given by

\[ d_{ij} = \max\{ |u(i+k) - u(j+k)|,\ 0 \le k \le m-1 \}. \tag{13} \]

Step 3: Probability density estimation. The distribution of all d_ij for 1 ≤ i, j ≤ N − m + 1 should completely quantify the information underlying the distance matrix D. In this study, the histogram approach is applied to estimate the empirical probability mass function of D. In the histogram approach, the bin width is [max(d_ij) − min(d_ij)]/M, where M represents the number of bins. This means that the first bin covers the range {min(d_ij), min(d_ij) + [max(d_ij) − min(d_ij)]/M} and the last bin covers {max(d_ij) − [max(d_ij) − min(d_ij)]/M, max(d_ij)}. Then we let p_i (i = 1, 2, . . . , M) be the probability of each bin when the histogram has M bins. The probability p_i is therefore

\[ p_i = \frac{\text{number of } d_{ij} \text{ in the } i\text{th bin}}{(N - m + 1)^2}. \tag{14} \]

Step 4: Calculation. From the probability distribution P, calculate the normalized Shannon entropy and the FIM via Eqs. (10) and (11), respectively.

III. NUMERICAL RESULTS FOR ARTIFICIAL TIME SERIES

To evaluate the effectiveness of the Shannon-Fisher information plane method, two types of artificial time series, generated by ARFIMA models36 and by the Chebyshev map model, are tested.

A. ARFIMA models

The power-law auto-correlations in stochastic variables can be expressed by an ARFIMA process:55

\[ x_t = \sum_{n=1}^{\infty} a_n(d)\, x_{t-n} + \epsilon_t, \tag{15} \]

where ε_t ∼ N(0, 1), d ∈ (0, 0.5) is a memory parameter correlated with the Hurst exponent as h_xx = 0.5 + d,56 and the weight a_n(d) is given by

\[ a_n(d) = \frac{d\, \Gamma(n - d)}{\Gamma(1 - d)\, \Gamma(n + 1)}, \tag{16} \]

where Γ is the Gamma function. The two-component ARFIMA process36 is defined as follows:

\[ x_t = [W X_t + (1 - W) Y_t] + \epsilon_t, \tag{17} \]
\[ y_t = [(1 - W) X_t + W Y_t] + \nu_t, \tag{18} \]
\[ X_t = \sum_{n=1}^{\infty} a_n(d_1)\, X_{t-n}, \tag{19} \]
\[ Y_t = \sum_{n=1}^{\infty} a_n(d_2)\, Y_{t-n}. \tag{20} \]
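The four steps above can be sketched in code. The following is a minimal NumPy implementation under our reading of Eqs. (10)-(14); the function name and default parameters are our own choices, and the histogram uses M equal-width bins over [min d_ij, max d_ij] as described in Step 3.

```python
import numpy as np

def shannon_fisher_point(u, m=3, M=64):
    """Map a series u onto one (S, I) point of the Shannon-Fisher plane,
    following Steps 1-4 of Sec. II A."""
    u = np.asarray(u, dtype=float)
    n_vec = u.size - m + 1
    # Step 1: state-space reconstruction into (N - m + 1) m-dim vectors.
    X = np.lib.stride_tricks.sliding_window_view(u, m)
    # Step 2: Chebyshev distance matrix, Eq. (13).
    D = np.abs(X[:, None, :] - X[None, :, :]).max(axis=2)
    # Step 3: M equal-width bins; Eq. (14) normalizes by (N - m + 1)^2.
    counts, _ = np.histogram(D, bins=M)
    p = counts / n_vec**2
    # Step 4a: normalized Shannon entropy, Eq. (10), with H_max = ln M.
    nz = p[p > 0]
    S = float(-(nz * np.log(nz)).sum() / np.log(M))
    # Step 4b: discrete normalized FIM, Eqs. (11) and (12); I0 = 1 only
    # for a single probability spike at either end of the distribution.
    spike_at_edge = p.max() == 1.0 and int(np.argmax(p)) in (0, M - 1)
    I0 = 1.0 if spike_at_edge else 0.5
    I = float(I0 * np.sum(np.diff(np.sqrt(p)) ** 2))
    return S, I

# Ordered versus disordered input: a constant series gives (S, I) = (0, 1),
# while white noise moves toward larger S and smaller I.
print(shannon_fisher_point(np.zeros(100)))
print(shannon_fisher_point(np.random.default_rng(1).standard_normal(500)))
```

The two calls at the end illustrate the extremes of the plane discussed in the text: a perfectly ordered state lands at S ∼ 0, I ∼ 1, and disorder pushes the point toward the opposite corner.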
FIG. 2. Shannon-Fisher information plane with ARFIMA time series on various embedding dimensions m as a function of M = 2^5, 2^6, 2^7, 2^8, 2^9, 2^10.
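As a sketch of how such test series can be generated, the snippet below implements the one-component process of Eqs. (15) and (16), truncating the infinite sum at a finite number of lags. The truncation depth, burn-in, and function names are our own choices; the two-component process of Eqs. (17)-(20) couples two such fractional components through the weight W in the same way.

```python
import math
import numpy as np

def arfima_weights(d, n_max):
    """a_n(d) = d * Gamma(n - d) / [Gamma(1 - d) * Gamma(n + 1)], Eq. (16),
    computed through log-gamma for numerical stability."""
    n = np.arange(1, n_max + 1)
    log_a = np.array([math.lgamma(k - d) - math.lgamma(k + 1) for k in n])
    return d * np.exp(log_a - math.lgamma(1 - d))

def arfima_series(d, length, lags=500, seed=0):
    """One-component ARFIMA process, Eq. (15):
    x_t = sum_n a_n(d) x_{t-n} + eps_t, with the infinite sum truncated
    at `lags` terms and an initial burn-in segment discarded."""
    a = arfima_weights(d, lags)
    rng = np.random.default_rng(seed)
    x = np.zeros(length + lags)
    for t in range(1, length + lags):
        k = min(t, lags)
        past = x[t - 1::-1][:k]   # the k most recent values, newest first
        x[t] = a[:k] @ past + rng.standard_normal()
    return x[lags:]

# Sanity check: Eq. (16) gives a_1(d) = d and a_2(d) = d(1 - d)/2.
w = arfima_weights(0.4, 5)
print(w[0], w[1])   # approximately 0.4 and 0.12
```

For d ∈ (0, 0.5) the weights decay as a power law, which is what produces the long-range correlations (Hurst exponent h_xx = 0.5 + d) probed in Fig. 2.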
FIG. 4. Shannon-Fisher information plane with the Chebyshev map on various embedding dimensions m as a function of M = 2^5, 2^6, 2^7, 2^8, 2^9, 2^10.

FIG. 7. The classical Shannon-Fisher information plane of financial time series with m = 5 and m = 6, respectively.
method. Considering the properties obtained from Figs. 5 and 6, we set m = 12 and M = 2^6 = 64. For a specific embedding dimension m and parameter M, we can calculate the normalized Shannon entropy and its FIM for every stock market. From Fig. 8, we find that the financial stock markets are partitioned into three groups, a finding also visible in Figs. 5 and 6. It is obvious that the four American stock markets are grouped together. The second group includes the three Australian stock markets AFLI, AORD, and AXJO, whose FIM values are nearly identical. The Asian stock markets form the remaining group. In addition, we observe that the American stock markets have the biggest FIM and the Asian stock markets have the smallest FIM, indicating that the American stock markets are less active than the other stock markets, including the three Australian stock markets and the three Asian stock markets. However, when the classical Shannon-Fisher information plane is applied to the financial time series, whether with m = 5 or m = 6, it is hard to classify the financial stock markets.

In summary, the proposed method is very useful and effective for analyzing financial time series. Compared with the classical Shannon-Fisher information plane, it provides more detailed, accurate, and clearer information on the classification of financial stock markets.

FIG. 8. Shannon-Fisher information plane based on information entropy with financial time series for m = 12 and M = 64.

V. CONCLUSION

In this paper, we propose a Shannon-Fisher information plane based on the information entropy. In our simulation experiments, for a given M, which represents the number of possible states of the system, we find that as the embedding dimension m increases, the normalized Shannon entropy increases and the FIM decreases. The choice of m is determined by using the false nearest-neighbor method. We know that the probability distribution P mainly depends on the parameter M. When analyzing the results of the Shannon-Fisher information plane as a function of M, it is obvious that as the parameter M increases, the curve shows a declining trend. In addition, when M is not big enough, we observe that the FIM changes a lot, which may suggest that the embedding dimension m plays the leading role in determining the FIM when M is not so big. As m runs from 11 to 15, the curve generally moves to the lower right; however, the results obtained for m from 6 to 10 may not be as regular. In the real-series experiments, similar properties are obtained; that is, as the parameters m and M increase, the normalized Shannon entropy increases and the FIM tends to become smaller. Perhaps this may be explained by the fact that as m and M increase, the financial time series is more complex, producing a large normalized Shannon entropy with a correspondingly smaller FIM. As we know, the Shannon entropy measures the degree of disorder in dynamical systems; on the contrary, the FIM quantifies the degree of order. These properties conform fully to the characteristics of the Shannon-Fisher information plane. Moreover, applying the proposed method to financial stock markets, we conclude that the ten stock markets are accurately divided into three groups. The FIM values of the four American stock markets DJI, NDX, NYA, and S&P500 are bigger than those of the other stock markets. These show that these four American stock markets may be less active than
the Asian stock markets TWII, SSEC, and CTSP300 and the Australian stock markets AFLI, AORD, and AXJO.

In summary, compared with the classical Shannon-Fisher information plane based on the permutation entropy, our proposed method gives more exact information on the classification of financial stock markets. We hope that this method has great potential for studying other fields.

ACKNOWLEDGMENTS

The financial support from the Fundamental Research Funds for the Central Universities (2018YJS178, 2018JBZ104), the China National Science Foundation (61771035), and the Beijing National Science Foundation (4162047) is gratefully acknowledged.

1. J. F. Bercher and C. Vignat, "On minimum Fisher information distributions with restricted support and fixed variance," Inform. Sci. 179, 3832–3842 (2009).
2. A. L. Berger, V. J. Della-Pietra, and S. A. Della-Pietra, "A maximum entropy approach to natural language processing," Comput. Linguist. 22, 39–71 (1996).
3. S. A. Della-Pietra, V. J. Della-Pietra, and J. Lafferty, "Inducing features of random fields," IEEE Trans. Pattern Anal. Mach. Int. 19, 380–393 (1997).
4. B. R. Frieden, Physics from Fisher Information (Cambridge University Press, 1998), Vol. 33, pp. 327–343.
5. Y. Y. Wang and P. J. Shang, "Analysis of financial stock markets through multidimensional scaling based on information measures," Nonlinear Dyn. 89, 1827–1844 (2017).
6. H. Xiong and P. J. Shang, "Weighted multifractal cross-correlation analysis based on Shannon entropy," Commun. Nonlinear Sci. Numer. Simulat. 30, 268–283 (2016).
7. J. S. A. E. Fouda and W. Koepf, "Detecting regular dynamics from time series using permutations slopes," Commun. Nonlinear Sci. Numer. Simulat. 27, 216–227 (2015).
8. A. M. Lopes and J. A. T. Machado, "Analysis of temperature time-series: Embedding dynamics into the MDS method," Commun. Nonlinear Sci. Numer. Simulat. 19, 851–871 (2014).
9. J. N. Xia and P. J. Shang, "Multiscale entropy analysis of financial time series," Fluct. Noise Lett. 11, 1250033 (2012).
10. Y. Yin and P. J. Shang, "Comparison of multiscale methods in the stock markets for detrended cross-correlation analysis and cross-sample entropy," Fluct. Noise Lett. 13, 1450023 (2014).
11. Q. Tian, P. J. Shang, and G. C. Feng, "Financial time series analysis based on information categorization method," Physica A 416, 183–191 (2014).
12. C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J. 27, 379–423 (1948).
13. R. C. Hilborn, Chaos and Nonlinear Dynamics (Oxford University Press, 2000).
14. S. B. Sears, R. G. Parr, and U. Dinur, "On the quantum-mechanical kinetic energy as a measure of the information in a distribution," Israel J. Chem. 19, 165–173 (2013).
15. S. R. Gadre, "Information entropy and Thomas-Fermi theory," Phys. Rev. A 30, 620–621 (1984).
16. S. R. Gadre, S. B. Sears, S. J. Chakravorty, and R. D. Bendale, "Some novel characteristics of atomic information entropies," Phys. Rev. A 32, 2602–2606 (1985).
17. R. J. Yáñez, W. V. Assche, and J. S. Dehesa, "Position and momentum information entropies of the D-dimensional harmonic oscillator and hydrogen atom," Phys. Rev. A 32, 3065–3079 (1994).
18. M. Hô, R. P. Sagar, V. H. Smith, Jr., and R. O. Esquivel, "Atomic information entropies beyond the Hartree-Fock limit," J. Phys. B 27, 5149 (1994).
19. M. Hô, R. P. Sagar, J. M. Pérez-Jordá, V. H. Smith, Jr., and R. O. Esquivel, "A numerical study of molecular information entropies," Chem. Phys. Lett. 219, 15–20 (1994).
20. Á. Nagy and R. G. Parr, "Information entropy as a measure of the quality of an approximate electronic wave function," Int. J. Quantum Chem. 58, 323–327 (1996).
21. N. L. Guevara, R. P. Sagar, and R. O. Esquivel, "Shannon-information entropy sum as a correlation measure in atomic systems," Phys. Rev. A 67, 5149–5153 (2003).
22. N. L. Guevara, R. P. Sagar, and R. O. Esquivel, "Information uncertainty-type inequalities in atomic systems," J. Chem. Phys. 119, 7030–7036 (2003).
23. N. L. Guevara, R. P. Sagar, and R. O. Esquivel, "Local correlation measures in atomic systems," J. Chem. Phys. 122, 084101 (2005).
24. C. C. Moustakidis and S. E. Massen, "The dependence of information entropy of uniform Fermi systems on correlations and thermal effects," Phys. Rev. B 71, 045102 (2005).
25. K. D. Sen, "Characteristic features of Shannon information entropy of confined atoms," J. Chem. Phys. 123, 074110 (2005).
26. R. A. Fisher, "Theory of statistical estimation," Proc. Camb. Phil. Soc. 22, 700–725 (1925).
27. B. R. Frieden, Science from Fisher Information (Cambridge University Press, 2004).
28. B. R. Frieden and B. H. Soffer, "Lagrangians of physics and the game of Fisher-information transfer," Phys. Rev. E 52, 2274–2286 (1995).
29. M. Reginatto, "Erratum: Derivation of the equations of nonrelativistic quantum mechanics using the principle of minimum Fisher information [Phys. Rev. A 58, 1775 (1998)]," Phys. Rev. A 60, 1730 (1999).
30. R. Nalewajski, "Information principles in the theory of electronic structure," Chem. Phys. Lett. 372, 28–34 (2003).
31. Á. Nagy, "Fisher information in density functional theory," J. Chem. Phys. 119, 9401–9405 (2003).
32. C. Vignat and J. F. Bercher, "Analysis of signals in the Fisher-Shannon information plane," Phys. Lett. A 312, 27–33 (2003).
33. M. T. Martin, F. Pennini, and A. Plastino, "Fisher's information and the analysis of complex signals," Phys. Lett. A 256, 173–180 (1999).
34. M. T. Martin, J. Perez, and A. Plastino, "Fisher information and nonlinear dynamics," Physica A 291, 523–532 (2001).
35. E. Romera and J. S. Dehesa, "The Fisher-Shannon information plane, an electron correlation tool," J. Chem. Phys. 120, 8906–8912 (2004).
36. B. Podobnik, D. Horvatic, A. L. Ng, H. E. Stanley, and P. C. Ivanov, "Modeling long-range cross-correlations in two-component ARFIMA and FIARCH processes," Physica A 387, 3954–3959 (2008).
37. R. A. Fisher, "On the mathematical foundations of theoretical statistics," Philos. Trans. R. Soc. Lond. 222, 309–368 (1922).
38. A. Dembo, T. A. Cover, and J. A. Thomas, "Information theoretic inequalities," IEEE Trans. Inform. Theory 37, 1501–1518 (1991).
39. J. C. Angulo, J. Antolin, and K. D. Sen, "Fisher-Shannon plane and statistical complexity of atoms," Phys. Lett. A 372, 670–674 (2008).
40. R. O. Esquivel, J. C. Angulo, J. Antolin, J. S. Dehesa, S. Lopez-Rosa, and N. Flores-Gallegos, "Analysis of complexity measures and information planes of selected molecules in position and momentum spaces," Phys. Chem. Chem. Phys. 12, 7108–7116 (2010).
41. L. Devroye, A Course on Density Estimation (Birkhauser, 1987).
42. A. Janicki and A. Weron, Simulation and Chaotic Behavior of Stable Stochastic Processes (Marcel Dekker, 1994).
43. L. Telesca, M. Lovallo, H. L. Hsu, and C. C. Chen, "Analysis of dynamics in magnetotelluric data by using the Fisher-Shannon method," Physica A 390, 1350–1355 (2011).
44. L. Telesca, M. Lovallo, A. Chamoli, V. P. Dimri, and K. Srivastava, "Fisher-Shannon analysis of seismograms of tsunamigenic and non-tsunamigenic earthquakes," Physica A 392, 3424–3429 (2013).
45. V. C. Raykar and R. Duraiswami, "Fast optimal bandwidth selection for kernel density estimation," in Proceedings of the SIAM International Conference on Data Mining (Bethesda, Maryland, 2006), pp. 524–528.
46. G. L. Ferri, F. Pennini, and A. Plastino, "LMC-complexity and various chaotic regimes," Phys. Lett. A 373, 2210–2214 (2009).
47. P. Sánchez-Moreno, R. J. Yáñez, and J. S. Dehesa, "Discrete densities and Fisher information," in Proceedings of the 14th International Conference on Difference Equations and Applications (Bahçeşehir University Press, 2009), pp. 291–298.
48. B. A. Gonçalves, L. Carpi, and O. A. Rosso, "Time series characterization via horizontal visibility graph and Information Theory," Physica A 464, 93–102 (2016).
49. O. A. Rosso, F. Olivares, and A. Plastino, "Noise versus chaos in a causal Fisher-Shannon plane," Physics 7, 070006 (2015).
50. M. G. Ravetti, L. C. Carpi, B. A. Gonçalves, A. C. Frery, and O. A. Rosso, "Distinguishing noise from chaos: Objective versus subjective criteria using horizontal visibility graph," PLoS One 9, e108004 (2014).
51. O. A. Rosso, F. Olivares, L. Zunino, L. D. Micco, A. Aquino, A. Plastino, and H. Larrondo, "Characterization of chaotic maps using the permutation Bandt-Pompe probability distribution," Eur. Phys. J. B 86, 116–129 (2013).
52. F. Olivares, A. Plastino, and O. A. Rosso, "Contrasting chaos with noise via local versus global information quantifiers," Phys. Lett. A 376, 1577–1583 (2012).
53. C. E. Shannon and W. Weaver, The Mathematical Theory of Communication (University of Illinois Press, Urbana, 1947).
54. F. Black and M. Scholes, "The pricing of options and corporate liabilities," J. Polit. Econ. 81, 637–654 (1973).
55. J. R. M. Hosking, "Fractional differencing," Biometrika 68, 165–176 (1981).
56. B. Podobnik and H. E. Stanley, "Detrended cross-correlation analysis: A new method for analyzing two nonstationary time series," Phys. Rev. Lett. 100, 084102 (2008).
57. See http://finance.yahoo.com for daily closing prices of financial stock markets.
58. S. H. Bian and P. J. Shang, "Refined two-index entropy and multiscale analysis for complex system," Commun. Nonlinear Sci. Numer. Simul. 39, 233–247 (2016).