An information-theoretic approach to numerically determine the Markov order of discrete stochastic processes defined over a finite state space is introduced. To measure statistical dependencies between different time points of symbolic time series, two information-theoretic measures are proposed. The first measure is time-lagged mutual information between the random variables Xn and Xn+k, representing the values of the process at time points n and n + k, respectively. The measure will be termed autoinformation, in analogy to the autocorrelation function for metric time series, but using Shannon entropy rather than linear correlation. This measure is complemented by the conditional mutual information between Xn and Xn+k, removing the influen...
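The autoinformation described above, i.e. the time-lagged mutual information I(Xn; Xn+k), can be estimated from a symbolic sequence via plug-in (empirical) probabilities as I = H(Xn) + H(Xn+k) − H(Xn, Xn+k). A minimal sketch follows; the function names `entropy` and `autoinformation` are illustrative, not from the cited work:

```python
from collections import Counter
from math import log2

def entropy(counts):
    """Shannon entropy (bits) of an empirical distribution given by counts."""
    n = sum(counts)
    return -sum(c / n * log2(c / n) for c in counts if c > 0)

def autoinformation(seq, k):
    """Plug-in estimate of the time-lagged mutual information
    I(X_n; X_{n+k}) = H(X_n) + H(X_{n+k}) - H(X_n, X_{n+k})
    for a symbolic sequence and lag k >= 1."""
    pairs = list(zip(seq[:-k], seq[k:]))
    h_x = entropy(Counter(x for x, _ in pairs).values())
    h_y = entropy(Counter(y for _, y in pairs).values())
    h_xy = entropy(Counter(pairs).values())
    return h_x + h_y - h_xy
```

For a constant sequence the estimate is 0 bits, while for a strictly period-2 sequence it approaches 1 bit at lag 1; note that the plug-in estimator is biased upward for short sequences.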
We present an information-theoretical analysis of temporal dependencies in EEG microstate sequences ...
This paper presents symbolic time series analysis (STSA) of multi-dimensional measurement data for p...
Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomne...
Information theoretic measures (entropies, entropy rates, mutual information) ...
In this paper we present the concept of description of random processes in complex systems with disc...
A technique for identification and quantification of chaotic dynamics in experimental time series ...
Synchronization, a basic nonlinear phenomenon, is widely observed in diverse complex systems studied...
We study how the Shannon entropy of sequences produced by an information source converges to the sou...
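The convergence just mentioned is commonly examined through block entropies: with H(n) the Shannon entropy of length-n blocks, the finite-n estimates h(n) = H(n) − H(n−1) decrease toward the source entropy rate as n grows (sample size permitting). A minimal plug-in sketch, with illustrative function names:

```python
from collections import Counter
from math import log2

def block_entropy(seq, n):
    """Shannon entropy (bits) of the empirical distribution of length-n blocks."""
    blocks = [seq[i:i + n] for i in range(len(seq) - n + 1)]
    total = len(blocks)
    return -sum(c / total * log2(c / total) for c in Counter(blocks).values())

def entropy_rate_estimates(seq, n_max):
    """Finite-n entropy-rate estimates h(n) = H(n) - H(n-1) for n = 1..n_max.

    For n = 1 this reduces to the single-symbol entropy, since H(0) = 0.
    Finite-sample bias makes estimates unreliable once n-blocks are undersampled.
    """
    return [block_entropy(seq, n) - block_entropy(seq, n - 1)
            for n in range(1, n_max + 1)]
```

For a strictly period-2 sequence, h(1) is 1 bit while h(2) is already near 0, reflecting that one symbol of memory removes all remaining uncertainty.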
Entropy measures are widely applied to quantify the complexity of dynamical systems in diverse field...
Information storage, reflecting the capability of a dynamical system to keep predictable information...
We consider the Markov and non-Markov processes in complex systems by the dynamical information Shan...
A recent publication has reported a ...
Measures of entropy have been widely used to characterize complexity, particularly in physiological ...
Kinetic behaviour of dynamical information Shannon entropy is discussed for complex systems: physica...