Currently we are witnessing a revolution in huge data resources that should be analyzed carefully in order to make the right decisions about real-world problems. Such big data are statistically risky because any data set is a combination of (useful) signal and (useless) noise; in raw form it consists of unorganized facts that need to be filtered and processed. Using only the signal and discarding the noise means that the data are restructured and reorganized into something useful, which is called information. So, for any data set, we need only the information it contains. In the context of information theory, entropy is used as a statistical measure to quantify the maximum amount of information in a random event.
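To make that last statement concrete, the following minimal sketch (added here purely for illustration; the function name shannon_entropy and the coin-toss probabilities are our own choices, not drawn from any of the works discussed below) computes the Shannon entropy H(p) = -∑_i p_i log p_i of a discrete distribution, which is largest when the distribution is uniform.

```python
# Minimal illustrative sketch: Shannon entropy of a discrete distribution,
# H(p) = -sum_i p_i * log(p_i). The uniform distribution attains the maximum,
# matching the "maximum amount of information" reading above.
import numpy as np

def shannon_entropy(p, base=2):
    """Entropy (in bits when base=2) of a discrete probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # convention: 0 * log(0) = 0
    return float(-np.sum(p * np.log(p)) / np.log(base))

print(shannon_entropy([0.5, 0.5]))     # 1.0 bit   (fair coin)
print(shannon_entropy([0.9, 0.1]))     # ~0.47 bits (biased coin, less uncertain)
```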
In summary, in the present Special Issue, manuscripts focused on any of the above-mentioned “Informa...
In the last two decades, the understanding of complex dynamical systems underwent important conceptu...
Kullback-Leibler relative-entropy or KL-entropy of P with respect to R is defined as $\int_X \ln \frac{dP}{dR}\, dP$, where...
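For discrete distributions the relative entropy above reduces to a finite sum, D(P‖R) = ∑_x P(x) ln(P(x)/R(x)), with the convention that terms with P(x) = 0 vanish. The sketch below is a minimal illustration under that discrete assumption; the function name kl_divergence and the example vectors are ours, not taken from the cited work.

```python
# Minimal illustrative sketch: discrete Kullback-Leibler relative entropy,
# D(P || R) = sum_x P(x) * ln(P(x) / R(x)), assuming R(x) > 0 wherever P(x) > 0.
import numpy as np

def kl_divergence(p, r):
    """KL relative entropy of p with respect to r, in nats."""
    p = np.asarray(p, dtype=float)
    r = np.asarray(r, dtype=float)
    mask = p > 0                       # terms with P(x) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / r[mask])))

print(kl_divergence([0.5, 0.5], [0.5, 0.5]))   # 0.0   (identical distributions)
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))   # ~0.37 nats
```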
In this study we illustrate a Maximum Entropy (ME) methodology for modeling incomplete information a...
Maximum entropy estimation is a relatively new estimation technique in econometrics. We carry out se...
This thesis is a formal presentation of entropy and related principles as they relate to probability...
This paper is a review of a particular approach to the method of maximum entropy as a general framew...
The origin of entropy dates back to the 19th century. In 1948, the entropy concept as a measure of uncer...
In many practical situations, we have only partial information about the probabilities. In some case...
A coherent statistical methodology is necessary for analyzing and understanding complex economic sys...
The Principle of Maximum Entropy is often used to update probabilities due to evidence instead of pe...
This chapter focuses on the notions of entropy and of maximum entropy distribu...
The maximum entropy method has been widely used in many fields, such as statistical mechanics, econom...
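As a concrete, deliberately simple illustration of the method, the sketch below works through the classic die example often attributed to Jaynes, chosen here for exposition rather than taken from any of the applications above: among all distributions on the faces 1-6 whose mean equals an assumed value of 4.5, the entropy-maximizing one has the exponential form p_i ∝ exp(λ·i), and the multiplier λ is found by root-finding.

```python
# Minimal illustrative sketch of the maximum entropy method (toy die example;
# the target mean 4.5 is an assumed constraint chosen for demonstration).
# Maximizing entropy on faces {1,...,6} subject to a fixed mean gives
# p_i proportional to exp(lam * i); we solve for lam numerically.
import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)
target_mean = 4.5

def mean_given_lam(lam):
    w = np.exp(lam * faces)            # unnormalized exponential-family weights
    return float(np.dot(faces, w / w.sum()))

lam = brentq(lambda l: mean_given_lam(l) - target_mean, -10.0, 10.0)
p = np.exp(lam * faces)
p /= p.sum()

print(np.round(p, 4))                     # probabilities skewed toward high faces
print(round(float(np.dot(faces, p)), 4))  # 4.5, reproducing the constraint
```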
Methodologies related to information theory have been increasingly used in studies in economics and ...