We review a decision theoretic, i.e., utility-based, motivation for entropy and Kullback-Leibler relative entropy, the natural generalizations that follow, and various properties of these generalized quantities. We then consider these generalized quantities in an easily interpreted special case. We show that the resulting quantities share many of the properties of entropy and relative entropy, such as the data processing inequality and the second law of thermodynamics. We formulate an important statistical learning problem – probability estimation – in terms of a generalized relative entropy. The solution of this problem reflects general risk preferences via the utility function; moreover, the solution is optimal in a sense of robust absolute ...
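As a small illustration of one property named above, the data processing inequality for the classical (logarithmic) relative entropy says that pushing two distributions through the same stochastic kernel cannot increase their divergence. The sketch below checks this numerically on random discrete distributions; it uses the standard KL form, not the paper's utility-based generalization, and all names and numbers are illustrative.

```python
import numpy as np

def kl(p, r):
    """Discrete KL relative entropy, with the convention 0 * log(0/r) = 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / r[mask])))

rng = np.random.default_rng(1)
p = rng.dirichlet(np.ones(4))          # two random distributions on 4 points
r = rng.dirichlet(np.ones(4))
K = rng.dirichlet(np.ones(3), size=4)  # 4x3 row-stochastic kernel (a channel)
print(kl(p, r), ">=", kl(p @ K, r @ K))  # data processing: left >= right
```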
We introduce entropy coherent and entropy convex measures of risk and prove a collection of axiomati...
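For background, these classes are built around the classical entropic risk measure ρ_γ(X) = (1/γ) log E[exp(−γX)]; the minimal sketch below computes that classical quantity from equally weighted sample payoffs (standard background, not the new classes introduced in the paper; the payoff numbers are synthetic).

```python
import numpy as np

def entropic_risk(x, gamma):
    """Sample-based entropic risk (1/gamma) * log E[exp(-gamma * X)]."""
    return float(np.log(np.mean(np.exp(-gamma * np.asarray(x)))) / gamma)

payoffs = np.array([-1.0, 0.0, 2.0])   # synthetic payoff scenarios
for gamma in (0.01, 1.0, 10.0):        # risk-aversion parameter
    print(gamma, entropic_risk(payoffs, gamma))
```

As γ → 0 the value approaches the negative mean payoff, and as γ → ∞ it approaches the worst-case loss, which is why γ is read as a risk-aversion parameter.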
Perfectly rational decision-makers maximize expected utility, but crucially ignore the resource cost...
The measure-theoretic definition of Kullback-Leibler relative-entropy (or simply KL-entropy) plays a...
Information measures arise in many disciplines, including forecasting (where scoring rules are used ...
Kullback-Leibler relative-entropy or KL-entropy of P with respect to R, defined as ∫_X ln(dP/dR) dP, where...
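On a finite or countable space the Radon-Nikodym derivative dP/dR is just the ratio of point masses, so the integral above reduces to a sum. A minimal sketch (the function name kl_entropy is ours):

```python
import numpy as np

def kl_entropy(p, r):
    """sum_i p_i * log(p_i / r_i); +inf unless P << R (r_i = 0 forces p_i = 0)."""
    p, r = np.asarray(p, float), np.asarray(r, float)
    if np.any((r == 0) & (p > 0)):
        return np.inf                  # P not absolutely continuous w.r.t. R
    mask = p > 0                       # convention: 0 * log(0/r) = 0
    return float(np.sum(p[mask] * np.log(p[mask] / r[mask])))

print(kl_entropy([0.5, 0.5], [0.9, 0.1]))  # ~0.5108 nats
print(kl_entropy([0.5, 0.5], [1.0, 0.0]))  # inf: P puts mass where R has none
```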
The refinement axiom for entropy has been provocative in providing foundations of information theory...
We introduce an axiomatic approach to entropies and relative entropies that relies only on minimal i...
We give a new characterization of relative entropy, also known as the Kullback-Leibler divergence. W...
The maximum entropy principle can be used to assign utility values when only partial information is ...
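As a generic illustration of the principle (not the paper's utility-assignment scheme): among distributions on a finite support with a prescribed mean, the maximum entropy distribution has exponential-family form p_i ∝ exp(λ x_i), and λ can be solved for numerically. The support and target mean below are synthetic.

```python
import numpy as np
from scipy.optimize import brentq

support = np.arange(4)          # outcomes 0, 1, 2, 3
target_mean = 2.0               # the only information assumed known

def mean_at(lam):
    w = np.exp(lam * support)   # unnormalized exponential-family weights
    return float(np.dot(support, w / w.sum()))

# mean_at is monotone in lam, so a bracketing root-finder suffices.
lam = brentq(lambda l: mean_at(l) - target_mean, -50.0, 50.0)
p = np.exp(lam * support)
p /= p.sum()
print("max-ent distribution:", p, "mean:", float(np.dot(support, p)))
```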
Entropy, conditional entropy and mutual information for discrete-valued random variables play impor...
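These three quantities are tied together by the standard identities H(X|Y) = H(X,Y) − H(Y) and I(X;Y) = H(X) + H(Y) − H(X,Y); a minimal sketch computing them from a joint pmf table (textbook definitions, synthetic numbers):

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a pmf given as an array of any shape."""
    p = np.asarray(p, float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

pxy = np.array([[0.25, 0.25],   # synthetic joint pmf of (X, Y)
                [0.00, 0.50]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)    # marginals
print("H(X|Y) =", H(pxy) - H(py))            # chain rule H(X,Y) = H(Y) + H(X|Y)
print("I(X;Y) =", H(px) + H(py) - H(pxy))
```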
It is well known that in Information Theory and Machine Learning the Kullback-Leibler divergence, wh...
This paper is a review of a particular approach to the method of maximum entropy as a general framew...
A new definition of generalized information measures is introduced so as to investigate the finite-p...
Many algorithms of machine learning use an entropy measure as optimization criterion. Among the wide...
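One familiar instance (a generic example, not necessarily one treated in the paper) is the information-gain criterion for decision-tree splits, which picks the threshold that most reduces label entropy; the data below are synthetic.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy in bits of the empirical label distribution."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

x = np.array([0.1, 0.4, 0.35, 0.8, 0.9, 0.7])   # synthetic feature values
y = np.array([0,   0,   0,    1,   1,   1])     # synthetic binary labels

def gain(threshold):
    """Information gain: entropy drop from splitting at the threshold."""
    left, right = y[x <= threshold], y[x > threshold]
    h_split = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
    return entropy(y) - h_split

xs = np.sort(x)
candidates = (xs[:-1] + xs[1:]) / 2   # midpoints between sorted feature values
best = max(candidates, key=gain)
print("best threshold:", best, "gain:", gain(best))
```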