Optimisation problems typically involve finding the ground state (i.e. the minimum-energy configuration) of a cost function with respect to many variables. If the variables are corrupted by noise, then finding the ground state maximises the likelihood that the solution is correct. The maximum-entropy solution, on the other hand, takes the form of a Boltzmann distribution over the ground and excited states of the cost function, which corrects for noise. Here we use a programmable annealer for the information decoding problem, which we simulate as a random Ising model in a field. We show experimentally that finite-temperature maximum-entropy decoding can give slightly better bit-error rates than the maximum-likelihood approach, confirming that useful information can be e...
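The abstract above contrasts maximum-likelihood decoding (taking the ground state of the Ising cost function) with maximum-entropy decoding (taking the sign of each spin's Boltzmann-distribution marginal at finite temperature). As a minimal brute-force sketch of that distinction, not the paper's implementation, the following toy code computes both decoders exactly for a small random-field Ising model (the function names and the example instance are illustrative assumptions):

```python
import itertools
import math

def ising_energy(spins, h, J):
    """Random-field Ising energy: E = -sum_i h_i s_i - sum_{ij} J_ij s_i s_j."""
    e = -sum(h[i] * s for i, s in enumerate(spins))
    for (i, j), coupling in J.items():
        e -= coupling * spins[i] * spins[j]
    return e

def boltzmann_marginals(h, J, beta):
    """Exact Boltzmann magnetisations <s_i> at inverse temperature beta (brute force)."""
    n = len(h)
    Z = 0.0
    mags = [0.0] * n
    for spins in itertools.product([-1, 1], repeat=n):
        w = math.exp(-beta * ising_energy(spins, h, J))
        Z += w
        for i, s in enumerate(spins):
            mags[i] += w * s
    return [m / Z for m in mags]

def me_decode(h, J, beta):
    """Maximum-entropy decoding: sign of each finite-temperature marginal."""
    return [1 if m >= 0 else -1 for m in boltzmann_marginals(h, J, beta)]

def ml_decode(h, J):
    """Maximum-likelihood decoding: the ground-state (minimum-energy) configuration."""
    n = len(h)
    return list(min(itertools.product([-1, 1], repeat=n),
                    key=lambda s: ising_energy(s, h, J)))

# Toy instance: a 3-spin ferromagnetic chain with noisy local fields.
h = [0.5, -0.1, 0.4]
J = {(0, 1): 1.0, (1, 2): 1.0}
print(ml_decode(h, J))        # ground-state decoding
print(me_decode(h, J, 2.0))   # finite-temperature decoding
```

On this tiny instance both decoders agree; the paper's point is that on noisy channels the finite-temperature marginals can average over near-ground-state excitations and yield slightly lower bit-error rates than the ground state alone. Brute force is exponential in the number of spins, which is exactly why the authors turn to a programmable annealer for sampling.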
We investigate the performance of error-correcting codes, where the code word comprises products of ...
Efficient approximation lies at the heart of large-scale machine learning problems. In this paper, w...
In this study we illustrate a Maximum Entropy (ME) methodology for modeling incomplete information a...
We propose a framework for learning hidden-variable models by optimizing entropies, in which entropy...
Entropy is a central concept in physics and has deep connections with information theory, which is o...
Maximum entropy models provide the least constrained probability distributions...
The channel capacity of a deterministic system with confidential data is an up...
Many problems in natural language processing can be viewed as linguistic classification problems, in...
ConIII (pronounced CON-ee) is an open-source Python project providing a simple interface to solving ...
The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sam...
The combination of mathematical models and uncertainty measures can be applied in the area of data m...
In the field of optimization using probabilistic models of the search space, this thesis identifies ...
We present a new statistical learning paradigm for Boltzmann machines based on a new inference pri...