Estimating information-theoretic quantities such as entropy and mutual information is central to many problems in statistics and machine learning, but challenging in high dimensions. This paper presents estimators of entropy via inference (EEVI), which deliver upper and lower bounds on many information quantities for arbitrary variables in a probabilistic generative model. These estimators use importance sampling with proposal distribution families that include amortized variational inference and sequential Monte Carlo, which can be tailored to the target model and used to squeeze true information values with high accuracy. We present several theoretical properties of EEVI and demonstrate scalability and efficacy on two problems from the me...
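The abstract above describes upper and lower bounds on entropy obtained by importance sampling over a proposal q(z | x). As a hedged illustration only (not the authors' code), the sketch below computes such a pair of bounds on H(X) for a toy Gaussian latent-variable model where the true entropy is known in closed form; the hand-widened Gaussian proposal stands in for the amortized variational or SMC proposals mentioned in the abstract.

```python
# Minimal sketch of importance-sampling entropy bounds in the spirit of EEVI.
# Toy model: z ~ N(0, 1), x | z ~ N(z, sigma^2), so H(X) is known analytically.
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

rng = np.random.default_rng(0)
sigma = 0.5          # observation noise
N, K = 2000, 50      # outer joint samples, inner proposal samples per x

# Joint samples (x, z) ~ p(x, z)
z = rng.normal(0.0, 1.0, size=N)
x = rng.normal(z, sigma)

def log_joint(x, z):
    return norm.logpdf(z, 0.0, 1.0) + norm.logpdf(x, z, sigma)

# Proposal q(z | x): the exact Gaussian posterior, deliberately widened a
# little so the two bounds are not exactly tight.
post_var = sigma**2 / (1.0 + sigma**2)
q_mean, q_std = x / (1.0 + sigma**2), np.sqrt(1.5 * post_var)

# Lower bound on H(X): reuse the z paired with each x (a posterior sample);
# E_{z~p(z|x)}[log p(x, z) - log q(z | x)] >= log p(x).
H_lower = np.mean(-(log_joint(x, z) - norm.logpdf(z, q_mean, q_std)))

# Upper bound on H(X): K importance samples from q(z | x) per x;
# E[log (1/K) sum_k p(x, z_k) / q(z_k | x)] <= log p(x).
zk = rng.normal(q_mean, q_std, size=(K, N))
log_w = log_joint(x, zk) - norm.logpdf(zk, q_mean, q_std)
H_upper = np.mean(-(logsumexp(log_w, axis=0) - np.log(K)))

H_true = 0.5 * np.log(2 * np.pi * np.e * (1.0 + sigma**2))
print(f"lower {H_lower:.3f} <= true {H_true:.3f} <= upper {H_upper:.3f}")
```

A better proposal (or larger K) tightens the sandwich around the true value, which is the sense in which the estimators can be "tailored to the target model".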
We develop a simple Quantile Spacing (QS) method for accurate probabilistic estimation of one-dimens...
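The QS algorithm itself is not reproduced here; as a hedged sketch of the same family of ideas, the closely related classical Vasicek m-spacing estimator below also turns order-statistic spacings of a one-dimensional sample into an entropy estimate.

```python
# Sketch of a spacing-based entropy estimator (Vasicek m-spacing, not the QS
# method itself), checked against the known entropy of a Gaussian sample.
import numpy as np

def vasicek_entropy(samples, m=None):
    """m-spacing entropy estimate (in nats) from a one-dimensional sample."""
    x = np.sort(np.asarray(samples, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(np.sqrt(n)))              # common heuristic window size
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    spacings = np.maximum(upper - lower, 1e-12)  # guard against tied values
    return np.mean(np.log(n / (2.0 * m) * spacings))

rng = np.random.default_rng(1)
sample = rng.normal(0.0, 2.0, size=5000)
true_H = 0.5 * np.log(2 * np.pi * np.e * 2.0**2)
print(f"estimate {vasicek_entropy(sample):.3f}  vs  true {true_H:.3f}")
```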
This paper extends the formulation of the Shannon entropy under probabilistic uncertainties which ar...
We explore a supervised machine learning approach to estimate the entanglement entropy of multi-qubi...
In previous work, we studied four well known systems of qualitative probabilistic inference, and pre...
In summary, in the present Special Issue, manuscripts focused on any of the above-mentioned “Informa...
This paper is a sequel to an earlier result of the authors that in making inferences from ce...

This paper is a review of a particular approach to the method of maximum entropy as a general framew...
The entropy rate quantifies the amount of uncertainty or disorder produced by any dynamical system. ...
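As a small worked illustration of the quantity discussed above (not taken from that paper), the entropy rate of a stationary finite-state Markov chain with transition matrix P and stationary distribution π is H = -Σ_i π_i Σ_j P_ij log P_ij; the sketch below evaluates it for a hypothetical two-state chain.

```python
# Hedged illustration: entropy rate of a two-state Markov chain (in nats/step),
# H = -sum_i pi_i sum_j P_ij log P_ij.
import numpy as np

P = np.array([[0.9, 0.1],     # hypothetical transition matrix
              [0.4, 0.6]])

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

entropy_rate = -np.sum(pi[:, None] * P * np.log(P))
print(f"entropy rate: {entropy_rate:.4f} nats/step")
```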
Evidence theory (ET), based on imprecise probabilities, is often more appropriate than the classica...
Given a probability space, we analyze the uncertainty, that is, the amount of information of a finit...
This short note is a critical discussion of the quantification of aleatoric and epistemic uncertaint...
We advocate the use of a notion of entropy that reflects the relative abundances of the symbols in a...
Neural networks have dramatically increased our capacity to learn from large, high-dimensional datas...
In information theory, one major goal is to find useful functions that summarize the amount of infor...
The combination of mathematical models and uncertainty measures can be applied in the area of data m...