This repository generates a plot which displays the dependence of marginal redundancy on time lag for a range of embedding dimensions. The asymptotic slope of these curves provides an estimate of the Kolmogorov-Sinai entropy of the underlying dynamical system.
Let

$$H(X_1, \ldots, X_m) = -\sum p(x_1, \ldots, x_m)\,\log p(x_1, \ldots, x_m)$$

denote the joint Shannon entropy of the random variables $X_1, \ldots, X_m$, where $p(x_1, \ldots, x_m)$ is the joint probability of the variables falling into a given cell of a partition (here, a histogram with a fixed number of bins). Consider a scalar time series $\{x_t\}$. Following Fraser [1], [2] and Paluš [3], the redundancy of $m$ samples separated by a time lag $\tau$ is

$$R_m(\tau) = \sum_{i=0}^{m-1} H(x_{t+i\tau}) - H(x_t, x_{t+\tau}, \ldots, x_{t+(m-1)\tau}).$$

Like mutual information, this quantity represents the average amount of information that is shared, or redundant, among the time series of interest. Marginal redundancy is the increment gained by adding one more sample,

$$\varrho_m(\tau) = R_{m+1}(\tau) - R_m(\tau),$$

i.e. the average information about $x_{t+m\tau}$ contained in the $m$ preceding samples.
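These quantities reduce to box-counting entropies over a histogram partition. As a concrete illustration of the definitions, here is a minimal sketch using plain `numpy` (an independent illustration, not the repository's implementation):

```python
import numpy as np

def shannon_entropy(*columns, bins=16):
    """Joint Shannon entropy (nats) of one or more equal-length 1-D arrays,
    estimated by box-counting on a uniform histogram partition."""
    counts, _ = np.histogramdd(np.column_stack(columns), bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def redundancy(x, m, tau, bins=16):
    """R_m(tau): sum of marginal entropies minus the joint entropy of the
    delay embedding (x_t, x_{t+tau}, ..., x_{t+(m-1)tau})."""
    cols = [x[i * tau : len(x) - (m - 1 - i) * tau] for i in range(m)]
    joint = shannon_entropy(*cols, bins=bins)
    return sum(shannon_entropy(c, bins=bins) for c in cols) - joint

def marginal_redundancy(x, m, tau, bins=16):
    """rho_m(tau) = R_{m+1}(tau) - R_m(tau)."""
    return redundancy(x, m + 1, tau, bins) - redundancy(x, m, tau, bins)
```

With these conventions, $\varrho_1(\tau)$ reduces to the familiar time-lagged mutual information $I(x_t;\, x_{t+\tau})$.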
Kolmogorov-Sinai entropy, also known as measure-theoretic entropy, metric entropy, Kolmogorov entropy, or simply KS entropy, represents the information production rate of a dynamical system [4]. For a given partition $\beta$ of the state space of a map $T$, define the entropy rate

$$h(T, \beta) = \lim_{n \to \infty} \frac{1}{n}\, H\!\left(\bigvee_{i=0}^{n-1} T^{-i}\beta\right),$$

where $\bigvee_{i=0}^{n-1} T^{-i}\beta$ is the refinement of $\beta$ under $n$ iterations of the dynamics; the KS entropy is then the supremum $h_{KS} = \sup_{\beta} h(T, \beta)$ over all finite partitions.
In the absence of noise, non-chaotic systems exhibit zero KS entropy while
chaotic systems display positive values. For signals containing any amount of
noise, KS entropy diverges to infinity as the partition is refined, i.e. as the bin size shrinks to zero.
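To see this divergence concretely, the following sketch (a minimal illustration with `numpy`, independent of the repository) estimates the one-step conditional entropy of pure white noise, which keeps growing as the partition is refined instead of converging:

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.standard_normal(100_000)

def entropy(counts):
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

for bins in (8, 32, 128, 512):
    h1, _ = np.histogram(noise[:-1], bins=bins)
    h2, _, _ = np.histogram2d(noise[:-1], noise[1:], bins=bins)
    # One-step conditional entropy H(x_{t+1} | x_t): for white noise this
    # grows with the number of bins (up to finite-sample bias) rather
    # than converging to a finite rate.
    print(bins, entropy(h2) - entropy(h1))
```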
The Python function `plot_marginal_redundancies` takes in a scalar time series array with the specified parameters (`max_dim`, `max_lag`, `bins`) and outputs a plot with time lag on the x-axis and marginal redundancy on the y-axis. It plots a separate curve for each embedding dimension up to `max_dim`, evaluated over the time lags 1 through `max_lag`.
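A minimal usage sketch (hypothetical: the call assumes the function signature mirrors the parameter list above; check `redundancy_analysis.py` for the exact argument names):

```python
import numpy as np
import redundancy_analysis

# Any scalar series works as input; here, a noisy sine wave for illustration.
rng = np.random.default_rng(0)
t = np.arange(0, 500, 0.05)
series = np.sin(t) + 0.1 * rng.standard_normal(t.size)

# Assumed signature: (time series, max_dim, max_lag, bins).
redundancy_analysis.plot_marginal_redundancies(series, max_dim=4, max_lag=50, bins=16)
```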
An example is shown below for the Lorenz system using 1,000,000 data points. The parameters and time series generation function are included in `example.py`. The computation time is approximately 30 seconds on an Intel® Core™ i7-1255U at base clock speed.
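The exact generator lives in `example.py`; purely for reference, here is a self-contained sketch of one way such a series could be produced (the classic parameters σ = 10, ρ = 28, β = 8/3 and the step size `dt = 0.01` are assumptions, not values taken from the repository):

```python
import numpy as np

def lorenz_x(n_points, dt=0.01, state=(1.0, 1.0, 1.0)):
    """Integrate the Lorenz system with a fixed-step RK4 scheme; return x(t)."""
    def f(s):
        x, y, z = s
        return np.array([10.0 * (y - x), x * (28.0 - z) - y, x * y - (8.0 / 3.0) * z])

    s = np.array(state, dtype=float)
    out = np.empty(n_points)
    for i in range(n_points):
        k1 = f(s)
        k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2)
        k4 = f(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        out[i] = s[0]
    return out

series = lorenz_x(1_000_000)  # matches the 1,000,000 points quoted above
```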
In the marginal redundancy plot, the curves should converge to a straight-line relation for sufficiently large embedding dimension $m$:

$$\varrho_m(\tau) \approx c - h_{KS}\,\tau,$$

where $c$ is a constant. The negative of the slope of this asymptote line approximates the KS entropy of the system.
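As a sketch of that final step, reusing the `marginal_redundancy` and `lorenz_x` helpers defined in the earlier sketches (the fit window is a judgment call and must lie inside the linear region of the curve):

```python
import numpy as np

dt = 0.01                                # sampling step assumed in the Lorenz sketch
series = lorenz_x(100_000)               # shorter run than the full example, for speed
lags = np.arange(1, 51)
rho = np.array([marginal_redundancy(series, m=4, tau=int(lag), bins=16)
                for lag in lags])

# Fit the linear tail of the highest-dimension curve; the slope is in
# nats per lag step, so divide by dt to get nats per unit time.
slope, intercept = np.polyfit(lags[10:40], rho[10:40], 1)
print(f"h_KS estimate: {-slope / dt:.3f} nats per unit time")
```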
- Installation: Download `redundancy_analysis.py` into your project directory and import the file as a Python module using `import redundancy_analysis`.
- Required Packages: `numpy`, `matplotlib`
[1] A. M. Fraser, “Using Mutual Information to Estimate Metric Entropy,” Springer Series in Synergetics, pp. 82–91, Jan. 1986, doi: https://doi.org/10.1007/978-3-642-71001-8_11.
[2] A. M. Fraser, “Information and entropy in strange attractors,” IEEE Transactions on Information Theory, vol. 35, no. 2, pp. 245–262, Mar. 1989, doi: https://doi.org/10.1109/18.32121.
[3] M. Paluš, “Kolmogorov Entropy From Time Series Using Information-Theoretic Functionals,” Neural Network World, vol. 7, 1997.
[4] Y. Sinai, “Kolmogorov-Sinai entropy,” Scholarpedia, vol. 4, no. 3, p. 2034, 2009, doi: https://doi.org/10.4249/scholarpedia.2034.
[5] G. P. Williams, “Chaos Theory Tamed,” CRC Press eBooks, Sep. 1997, doi: https://doi.org/10.1201/9781482295412.