Introduction: It is a remarkable fact that we can assign a numerical measure to certain quantities which are closely related to our semantic conception of the word information. Depending on the model under consideration, these quantities are variously called the uncertainty, the average uncertainty or entropy, the mutual information, and the average mutual information or trans-information.
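For concreteness, recall the standard definitions behind these names: for discrete random variables X and Y with joint distribution p(x, y),

H(X) = -\sum_{x} p(x) \log p(x),
I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X \mid Y),

so the entropy H(X) measures the average uncertainty in X, and the mutual information I(X;Y) measures the average reduction of that uncertainty obtained by observing Y.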
A generalized information theory is proposed as a natural extension of Shannon's information theory …
Information is a mathematical concept introduced in 1949 by C. E. Shannon in the mathematical theory of communication …
We introduce an information-theoretical approach for analyzing information transfer between time series …
We discuss a recently proposed quantity, called transfer entropy, which uses time series data to measure …
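Because transfer entropy is defined directly on time series data, a small numerical sketch may make the quantity concrete. The snippet below is a minimal plug-in estimator of Schreiber's transfer entropy T_{Y->X} = \sum p(x_{n+1}, x_n, y_n) \log [ p(x_{n+1} \mid x_n, y_n) / p(x_{n+1} \mid x_n) ] for discrete sequences with history length 1; the function name, the history length, and the synthetic example are illustrative assumptions, not details drawn from the papers summarized here.

import numpy as np
from collections import Counter

def transfer_entropy(x, y, base=2):
    # Plug-in estimate of T_{Y->X} with history length 1 (illustrative sketch):
    # T = sum over (x1, x0, y0) of p(x1, x0, y0) * log[ p(x1 | x0, y0) / p(x1 | x0) ]
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # counts of (x_{n+1}, x_n, y_n)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # counts of (x_n, y_n)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # counts of (x_{n+1}, x_n)
    singles = Counter(x[:-1])                       # counts of x_n
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]            # p(x1 | x0, y0)
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]  # p(x1 | x0)
        te += p_joint * np.log(p_cond_full / p_cond_self)
    return te / np.log(base)  # convert nats to the requested base (bits by default)

# Toy check: x copies y with a one-step delay, so information flows y -> x only.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 10_000)
x = np.roll(y, 1)
print(transfer_entropy(x.tolist(), y.tolist()))  # approx. 1 bit (y drives x)
print(transfer_entropy(y.tolist(), x.tolist()))  # approx. 0 bits (no flow x -> y)

On the toy data the estimator recovers the expected asymmetry, which is exactly the directional property that distinguishes transfer entropy from the symmetric mutual information; for real-valued series one would first discretize or use a dedicated estimator.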
We give a survey of the basic statistical ideas underlying the definition of entropy in information theory …
Modeling and inference are central to most areas of science and especially to evolving and complex systems …
Information Theory is studied from the following viewpoints: (1) the theory of entropy as the amount of information …
We review the interface between (theoretical) physics and information for non-experts. The origin …
Information theory provides an interdisciplinary method to understand important phenomena in many research fields …
Shannon's famous paper [1] paved the way for a theory called information theory. In essence, the …
We apply a certain unifying physical description to the results of Information Theory. Assuming that …
This book presents the concepts needed to deal with self-organizing complex systems from a unifying ...
We study complexity and information and introduce the idea that while complexity is relative to a given …
The definition of the information predictability of a stochastic process and its parameters is given in the article …
In summary, in the present Special Issue, manuscripts focused on any of the above-mentioned “Information …