Generative Flow Networks (GFlowNets) are recently proposed models for learning stochastic policies that generate compositional objects through sequences of actions, with probability proportional to a given reward function. A central problem for GFlowNets is improving their exploration and generalization. In this work, we propose a novel path regularization method based on optimal transport theory that places prior constraints on the underlying structure of GFlowNets. The prior is designed to help GFlowNets better discover the latent structure of the target distribution or to enhance their ability to explore the environment in the context of active learning. The path regularization controls the flow in GFlowNets to generate more diverse a...
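The core property described above, a sampler whose terminal-object probabilities match R(x)/Z, can be illustrated with a minimal tabular sketch. The toy environment (two-bit strings built one bit at a time), the reward values, and the trajectory-balance training loop below are all illustrative assumptions, not the method of any particular paper; since each object here has a unique trajectory, trajectory balance reduces to matching log Z + log P_F(x) to log R(x).

```python
import numpy as np

# Hypothetical rewards for the four terminal bit-strings.
R = {"00": 1.0, "01": 2.0, "10": 3.0, "11": 4.0}

# Tabular forward-policy logits: one pair of logits per non-terminal state.
# States are the prefixes "", "0", "1"; each action appends "0" or "1".
states = ["", "0", "1"]
logits = {s: np.zeros(2) for s in states}
log_Z = 0.0  # learned log partition function

def log_pf(s):
    """Log-probabilities of the two actions at state s (log-softmax)."""
    z = logits[s]
    return z - np.log(np.sum(np.exp(z)))

rng = np.random.default_rng(0)
lr = 0.05
for step in range(5000):
    # Sample a trajectory from the current forward policy.
    s, traj = "", []
    for _ in range(2):
        a = rng.choice(2, p=np.exp(log_pf(s)))
        traj.append((s, a))
        s = s + str(a)
    # Trajectory-balance residual: log Z + sum log P_F - log R(x)
    # (backward policy is trivial because the state graph is a tree).
    delta = log_Z + sum(log_pf(st)[a] for st, a in traj) - np.log(R[s])
    # Manual gradient steps on the squared residual.
    log_Z -= lr * 2 * delta
    for st, a in traj:
        g = -np.exp(log_pf(st))  # d log-softmax / d logits
        g[a] += 1.0
        logits[st] -= lr * 2 * delta * g

# After training, the sampler's terminal probabilities approach R(x) / Z.
Z = sum(R.values())
for x, r in R.items():
    p = np.exp(log_pf("")[int(x[0])] + log_pf(x[0])[int(x[1])])
    print(f"{x}: learned {p:.3f}  target {r / Z:.3f}")
```

Because the perfect fit (zero residual on every trajectory) is attainable in this tree-structured toy, the on-policy updates settle near the target distribution; the regularization discussed in the abstract concerns shaping these flows to improve diversity and exploration in much larger state graphs.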
There often is a dilemma between ease of optimization and robust out-of-distribution (OoD) generaliz...
Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative ...
Catastrophic forgetting (CF) happens whenever a neural network overwrites past knowledge while being...
Generative flow networks (GFlowNets) are a method for learning a stochastic policy for generating co...
Generative Flow Networks (GFlowNets) have been introduced as a method to sample a diverse set of can...
Generative Flow Networks (GFlowNets) have demonstrated significant performance improvements for gene...
We present energy-based generative flow networks (EB-GFN), a novel probabilistic modeling algorithm ...
The Generative Flow Network is a probabilistic framework where an agent learns a stochastic policy f...
Bayesian Flow Networks (BFNs) have recently been proposed as one of the most promising directions to u...
Within a broad class of generative adversarial networks, we show that discriminator optimization pro...
Sampling conditional distributions is a fundamental task for Bayesian inference and density estimati...
Existing generalization bounds fail to explain crucial factors that drive generalization of modern n...
Normalizing flows are a popular approach for constructing probabilistic and generative models. Howev...
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyc...
This paper studies the cooperative learning of two generative flow models, in which the two models a...