
Deep Generative Models

2024


On Divergence Measures for Training GFlowNets

A novel approach to training Generative Flow Networks (GFlowNets) by minimizing divergence measures such as the Renyi-$\alpha$, Tsallis-$\alpha$, and Kullback-Leibler (KL) divergences. Stochastic gradient estimators equipped with variance reduction techniques lead to faster and more stable training.
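As a rough illustration of the divergence-minimization idea (not the paper's exact estimator), the sketch below uses a toy categorical forward policy and a REINFORCE-style score-function estimator of the KL divergence to the reward-induced target, with a batch-mean baseline as a simple variance-reduction control variate. The names `log_reward` and `kl_surrogate` are illustrative assumptions.

```python
import torch

# Toy setup: a learnable categorical "forward policy" over terminal states,
# and an unnormalized target given by a toy reward vector (an assumption).
logits = torch.zeros(5, requires_grad=True)                  # policy parameters
log_reward = torch.log(torch.tensor([1., 2., 3., 2., 1.]))   # toy rewards R(x)

def kl_surrogate(n_samples: int = 256) -> torch.Tensor:
    """Surrogate loss whose gradient is a score-function estimator of
    grad KL(p_F || R/Z), variance-reduced by a batch-mean baseline."""
    dist = torch.distributions.Categorical(logits=logits)
    x = dist.sample((n_samples,))        # sample terminal states
    log_pf = dist.log_prob(x)            # log p_F(x), differentiable
    # Log-ratio f(x) = log p_F(x) - log R(x); the unknown log Z is a
    # constant whose score-weighted expectation vanishes.
    f = (log_pf - log_reward[x]).detach()
    baseline = f.mean()                  # control variate
    return ((f - baseline) * log_pf).mean()

opt = torch.optim.Adam([logits], lr=0.05)
for _ in range(500):
    opt.zero_grad()
    kl_surrogate().backward()
    opt.step()

# After training, softmax(logits) should approximate R(x) / sum(R).
print(torch.softmax(logits, dim=0))
```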

Analyzing GFlowNets: Stability, Expressiveness, and Assessment

How balance violations impact the learned distribution, motivating a weighted balance loss to improve training. For graph distributions, there are scenarios where balance is unattainable, and richer embeddings of children's states are needed to enhance expressiveness. To measure distributional correctness in GFlowNets, we introduce a provably correct novel assessment metric.
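A weighted balance loss can be sketched as below, assuming per-transition detailed-balance residuals and a hypothetical `weights` tensor (e.g., emphasizing transitions where violations matter most). This is an illustrative loss shape, not the paper's exact formulation.

```python
import torch

def weighted_db_loss(log_f_s: torch.Tensor,
                     log_pf: torch.Tensor,
                     log_f_next: torch.Tensor,
                     log_pb: torch.Tensor,
                     weights: torch.Tensor) -> torch.Tensor:
    """Weighted detailed-balance loss over a batch of transitions s -> s'.

    Detailed balance requires F(s) P_F(s'|s) = F(s') P_B(s|s'); the
    residual below is the log of that ratio, squared, then re-weighted
    per transition (the weighting scheme is a modeling choice).
    """
    residual = (log_f_s + log_pf) - (log_f_next + log_pb)
    return (weights * residual.pow(2)).mean()

# Toy usage with random values for a batch of 8 transitions;
# uniform weights recover the standard (unweighted) loss.
b = 8
loss = weighted_db_loss(torch.randn(b), torch.randn(b),
                        torch.randn(b), torch.randn(b),
                        torch.ones(b))
print(loss)
```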