Normalizing flows are a popular approach for constructing probabilistic and generative models. However, maximum likelihood training of flows is challenging due to the need to calculate computationally expensive determinants of Jacobians. This paper takes steps towards addressing this challenge by introducing an approach for determinant-free training of flows inspired by two-sample testing. Central to our framework is the energy objective, a multidimensional extension of proper scoring rules that admits efficient estimators based on random projections and that outperforms a range of alternative two-sample objectives that can be derived in our framework. Crucially, the energy objective and its alternatives do not require calculating determinants...
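(For orientation: this truncated snippet does not spell out the energy objective, but it is presumably a proper-scoring-rule extension of the classical energy distance of Szekely and Rizzo, whose population form for two distributions P and Q is

    D_E(P, Q) = 2\,\mathbb{E}\|X - Y\| - \mathbb{E}\|X - X'\| - \mathbb{E}\|Y - Y'\|,
    \qquad X, X' \sim P, \; Y, Y' \sim Q \text{ independent.}

This admits an unbiased estimator built from pairwise distances between samples alone, which is what makes a determinant-free training signal plausible.)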
In this paper we propose STANLEY, a STochastic gradient ANisotropic LangEvin dYnamics algorithm, for sampling...
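(The truncated snippet does not include STANLEY's actual update rule. As a generic sketch of what one anisotropic Langevin step looks like, here is a minimal Python example with a constant diagonal preconditioner; the names anisotropic_langevin_step and step_sizes are hypothetical and not from the paper:

    import numpy as np

    def anisotropic_langevin_step(x, grad_energy, step_sizes, rng):
        # One Euler-Maruyama step of Langevin dynamics with a constant
        # diagonal (anisotropic) step-size matrix D = diag(step_sizes):
        #   x_{t+1} = x_t - D * grad E(x_t) + sqrt(2 D) * xi,  xi ~ N(0, I)
        noise = rng.standard_normal(x.shape)
        return x - step_sizes * grad_energy(x) + np.sqrt(2.0 * step_sizes) * noise

    # Usage sketch: sample from the Gaussian energy E(x) = 0.5 * ||x||^2,
    # whose gradient is simply x.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(2)
    step_sizes = np.array([1e-2, 1e-3])  # per-coordinate steps = anisotropy
    for _ in range(5000):
        x = anisotropic_langevin_step(x, lambda v: v, step_sizes, rng)

With a constant diagonal preconditioner this chain targets exp(-E) in the small-step limit; STANLEY additionally uses stochastic gradients and its own anisotropy scheme, which this sketch does not reproduce.)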
Normalizing flows are constructed from a base distribution with a known density and a diffeomorphism...
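(Concretely, if z ~ p_Z is the base distribution and f is the diffeomorphism with x = f(z), the change-of-variables formula gives the model density

    \log p_X(x) = \log p_Z\!\left(f^{-1}(x)\right) + \log \left| \det \frac{\partial f^{-1}(x)}{\partial x} \right|,

which is the standard construction behind the flow snippets in this list; the log-determinant term is exactly the expensive quantity that determinant-free training seeks to avoid.)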
Normalizing Flows (NFs) are emerging as a powerful class of generative models, as they not only allow...
This paper studies the cooperative learning of two generative flow models, in which the two models are...
We present a machine-learning model based on normalizing flows that is trained to sample from the is...
Energy-Based Models (EBMs) capture dependencies between variables by associating a scalar energy to ...
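(In the standard EBM formulation, a scalar energy E_\theta(x) defines a Gibbs density

    p_\theta(x) = \frac{\exp(-E_\theta(x))}{Z(\theta)}, \qquad Z(\theta) = \int \exp(-E_\theta(x))\, dx,

so low energy corresponds to high probability. The normalizing constant Z(\theta) is generally intractable, which is what makes training and sampling, e.g. via Langevin dynamics as in STANLEY above, nontrivial.)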
Energy-based models are popular in machine learning due to the elegance of their formulation and the...
A normalizing flow is an invertible mapping between an arbitrary probability distribution and a standard...
Normalising flows are tractable probabilistic models that leverage the power of deep learning to describe...
Normalizing flows are a class of deep generative models that provide a promising route to sample lat...
Normalizing flows provide an elegant approach to generative modeling that allows for efficient sampling...
Deep Learning is becoming a standard tool across science and industry to optimally solve a variety of...
Energy-Based Models (EBMs) are a class of generative models like Variational Autoencoders, Normalizing Flows...
Normalizing Flows (NFs) are able to model complicated distributions p(y) with strong inter-dimensional...
Numerous applications of machine learning involve representing probability distributions over high-dimensional...