In summary, for the present Special Issue, manuscripts focused on any of the above-mentioned information-theoretic measures, such as mutual information, permutation entropy approaches, sample entropy, wavelet entropy and their evaluations, as well as on their interdisciplinary applications, were more than welcome. The issue thus gathers a series of articles under the common denominator of information-theoretic measures and their applications. A brief description of the content of each of the included papers is given below, followed by a short illustrative sketch of one of the measures involved.
A technique for identification and quantification of chaotic dynamics in experimental time series ...
There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynam...
Statistical relationships among the variables of a complex system reveal a lot about its physical be...
We give a survey of the basic statistical ideas underlying the definition of entropy in information...
Given a probability space, we analyze the uncertainty, that is, the amount of information of a finit...
The concept of information theory originated when an attempt was made to create a theoretical model ...
In the last two decades, the understanding of complex dynamical systems underwent important conceptu...
Information Theory is studied from the following viewpoints: (1) the theory of entropy as amount of...
Modeling and inference are central to most areas of science and especially to evolving and complex s...
“Information Theory and Language” is a collection of 12 articles that appeared recently in Entropy a...
We review the interface between (theoretical) physics and information for non-experts. The origin...
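To make one of the measures named above concrete, the following minimal Python sketch estimates the normalized permutation entropy of a scalar time series using the standard Bandt-Pompe ordinal-pattern construction. It is purely illustrative and is not taken from any of the contributions in this issue; the function name, the default embedding order and delay, and the toy signals at the end are arbitrary choices made only for the example.

```python
# Illustrative sketch only: normalized permutation entropy (Bandt-Pompe),
# not the implementation used by any paper in this Special Issue.
import math
from collections import Counter


def permutation_entropy(series, order=3, delay=1):
    """Normalized permutation entropy in [0, 1] from ordinal patterns."""
    n = len(series)
    if n < (order - 1) * delay + 1:
        raise ValueError("series too short for the chosen order and delay")
    counts = Counter()
    for i in range(n - (order - 1) * delay):
        window = [series[i + j * delay] for j in range(order)]
        # The ordinal pattern is the permutation of indices that sorts the window.
        counts[tuple(sorted(range(order), key=window.__getitem__))] += 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))  # normalize by log(order!)


if __name__ == "__main__":
    import random

    random.seed(0)
    noise = [random.random() for _ in range(2000)]  # irregular signal
    ramp = list(range(2000))                        # fully ordered signal
    print("white noise  :", round(permutation_entropy(noise, order=4), 3))  # close to 1
    print("monotone ramp:", round(permutation_entropy(ramp, order=4), 3))   # exactly 0
```

Values near 1 indicate ordinal-pattern statistics close to those of white noise, while values near 0 indicate strongly ordered dynamics, which is one reason this family of measures is widely used to distinguish chaos from noise in experimental time series.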