Entropy is a central concept in physics and has deep connections with information theory, which is one of the foundations of modern machine learning. Specifically, energy-based models are unsupervised machine learning models that adopt a simple yet general formulation based on the principle of maximum entropy. Three chapters in my thesis are related to energy-based models, and one chapter uses a Gaussian coding rate function, which is also related to entropy. The Boltzmann machine is an energy-based model with strong connections to spin systems in physics. Boltzmann machines were conceived with bipolar real-valued spin states (up and down) and later generalized to complex-valued spin states with unit length. Building on the previous work on ...
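The energy-based formulation mentioned above can be made concrete with a minimal sketch (illustrative only, not taken from the thesis): a small Boltzmann machine over binary spins s_i ∈ {−1, +1}, where the Boltzmann distribution p(s) ∝ exp(−E(s)) is the maximum-entropy distribution consistent with a fixed expected energy. The coupling matrix `J` and bias vector `h` below are hypothetical example values.

```python
import itertools
import math

def energy(s, J, h):
    """E(s) = -sum_{i<j} J[i][j]*s_i*s_j - sum_i h[i]*s_i (Ising/Boltzmann energy)."""
    n = len(s)
    e = -sum(h[i] * s[i] for i in range(n))
    e -= sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(i + 1, n))
    return e

def boltzmann_probs(J, h, beta=1.0):
    """Exact p(s) = exp(-beta*E(s)) / Z by enumerating all 2^n spin states."""
    n = len(h)
    states = list(itertools.product([-1, 1], repeat=n))
    weights = [math.exp(-beta * energy(s, J, h)) for s in states]
    Z = sum(weights)  # partition function
    return {s: w / Z for s, w in zip(states, weights)}

# Example: 3 spins with ferromagnetic couplings; aligned configurations
# (+1,+1,+1) and (-1,-1,-1) have the lowest energy, hence highest probability.
J = [[0, 1, 0], [0, 0, 1], [0, 0, 0]]
h = [0.0, 0.0, 0.0]
p = boltzmann_probs(J, h)
```

Enumeration is only feasible for tiny systems; for larger models the partition function Z is intractable, which is what motivates sampling-based learning schemes such as those in the abstracts below.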
Bayesian networks (BNs) are representative causal models and are expressed as directed acyclic graphs...
We propose a framework for learning hidden-variable models by optimizing entropies, in which entropy...
We introduce a new family of energy-based probabilistic graphical models for efficient unsuperv...
We present a new statistical learning paradigm for Boltzmann machines based on a new inference princ...
We compare and contrast the statistical physics and quantum physics inspired approaches for unsuperv...
The brain's cognitive power does not arise from exacting digital precision in high-performance computi...
Many algorithms of machine learning use an entropy measure as an optimization criterion. Among the wide...
Exact Boltzmann learning can be done in certain restricted networks by the technique of decimation. ...
Ensemble methods of machine learning combine neural networks or other machine learning models in ord...
Restricted Boltzmann machines (RBMs) are energy-based neural networks which ar...
In this paper, an entropy term is used in the learning phase of a neural network. As learning progresse...
Optimisation problems typically involve finding the ground state (i.e. the minimum energy configurat...
Throughout this Ph.D. thesis, we will study the sampling properties of Restricted Boltzmann Machines...
We explore the training and usage of the Restricted Boltzmann Machine for unsupervised feature extra...
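Several of the abstracts above concern training RBMs for unsupervised feature extraction. A minimal sketch of the standard approach, contrastive divergence with one Gibbs step (CD-1), is given below; it is a generic illustration under assumed binary units, not the specific method of any cited work, and the toy data and class names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Binary-binary Restricted Boltzmann Machine trained with CD-1."""

    def __init__(self, n_visible, n_hidden):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible bias
        self.c = np.zeros(n_hidden)    # hidden bias

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0, lr=0.1):
        # Positive phase: hidden probabilities given the data.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one Gibbs step back to the visibles and up again.
        pv1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(pv1)
        # Contrastive-divergence approximation to the log-likelihood gradient.
        batch = v0.shape[0]
        self.W += lr * (v0.T @ ph0 - pv1.T @ ph1) / batch
        self.b += lr * (v0 - pv1).mean(axis=0)
        self.c += lr * (ph0 - ph1).mean(axis=0)

# Toy usage: learn a single fixed pattern; the hidden activations p(h|v)
# then serve as the extracted features.
data = np.tile([1.0, 1.0, 0.0, 0.0], (32, 1))
rbm = RBM(n_visible=4, n_hidden=2)
for _ in range(200):
    rbm.cd1_step(data)
features = rbm.hidden_probs(data)  # learned feature representation
```

CD-1 sidesteps the intractable partition function by replacing the model expectation with a one-step Gibbs reconstruction; longer chains (CD-k) or persistent chains trade compute for a less biased gradient.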