Min-entropy is a statistical measure of the amount of randomness that a particular distribution contains. In this paper we investigate the notion of computational min-entropy, which is the computational analog of statistical min-entropy. We consider three possible definitions for this notion, and show equivalence and separation results for these definitions in various computational models. We also study whether or not certain properties of statistical min-entropy have a computational analog. In particular, we consider the following questions: 1. Let X be a distribution with high computational min-entropy. Does one get a pseudorandom distribution when applying a “randomness extractor” to X? 2. Let X and Y be (possibly dependent) random varia...
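Statistical min-entropy, the quantity these abstracts build on, has a simple closed form: H_∞(X) = −log₂ max_x Pr[X = x]. A minimal sketch in Python (the example distribution is illustrative, not from any of the cited works):

```python
import math

def min_entropy(probs):
    """Min-entropy H_inf(X) = -log2(max_x Pr[X = x]) in bits.

    Determined entirely by the most likely outcome, so it lower-bounds
    Shannon entropy and is the measure randomness extractors work with.
    """
    return -math.log2(max(probs))

# A slightly biased 4-outcome distribution: the max probability 0.5
# caps the extractable randomness at 1 bit.
print(min_entropy([0.5, 0.25, 0.125, 0.125]))
```

Note that a distribution can have many outcomes yet low min-entropy; only the single most likely outcome matters.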
This dissertation explores the multifaceted interplay between efficient computation and probability ...
In order to determine the limiting speed at which a specific problem can be solved using a computer, this...
Maximum entropy models are increasingly being used to describe the collective activity of neural pop...
A “randomness extractor” is an algorithm that given a sample from a distribution with sufficiently h...
We investigate how information leakage reduces computational entropy of a random variable X. Recall ...
There are two popular ways to measure computational entropy in cryptography: (HILL) pseudoentropy an...
Random numbers are essential for cryptography. In most real-world systems, these values come from a...
We give a characterization of Maximum Entropy/Minimum Relative Entropy inference by providing two ‘s...
In many areas of computer science, it is of primary importance to assess the r...
This book is the first one that provides a solid bridge between algorithmic information theory and s...
Shannon entropy of a probability distribution gives a weighted mean of a measure of information that...
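The "weighted mean" view of Shannon entropy mentioned above can be made concrete: H(X) = −Σ_x Pr[X = x] log₂ Pr[X = x], i.e. the expectation of the surprisal −log₂ Pr[X = x]. A minimal sketch (the example distribution is illustrative):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the probability-weighted mean of the
    surprisal -log2 p of each outcome (zero-probability terms are skipped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform over 4 outcomes: every outcome has surprisal 2 bits,
# so the weighted mean is 2 bits.
print(shannon_entropy([0.25] * 4))
```

Unlike min-entropy, Shannon entropy averages over all outcomes, so a distribution with one likely outcome and a long tail can still have high Shannon entropy.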
Let X_1,..., X_n be a sequence of n classical random variables and consider a sample Xs_1,..., Xs_r ...
The combination of mathematical models and uncertainty measures can be applied in the area of data m...
A common practice in the estimation of the complexity of objects, in particular of graphs, is to rel...
Many algorithms of machine learning use an entropy measure as optimization criterion. Among the wide...