Conference

2025


When do GFlowNets Learn the Right Distribution?

An analysis of the limitations and stability of GFlowNets under violations of the balance conditions, showing how such violations affect the accuracy of the learned distribution. We introduce a novel metric for assessing distributional correctness, improving evaluation beyond existing protocols.

ICLR 2025 (Spotlight, ~top 5% 🎉)
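
For context, GFlowNets are typically trained to satisfy balance conditions; a sketch of the standard detailed-balance and trajectory-balance constraints (generic notation, not taken from the paper):

$$
F(s)\,P_F(s' \mid s) = F(s')\,P_B(s \mid s'),
\qquad
Z \prod_{t=1}^{n} P_F(s_t \mid s_{t-1}) = R(x) \prod_{t=1}^{n} P_B(s_{t-1} \mid s_t),
$$

where $F$ is a state flow, $P_F$ and $P_B$ are the forward and backward policies, $Z$ estimates the partition function, and $R(x)$ is the reward of the terminal state $x = s_n$. "Balance violations" refers to these equalities holding only approximately after training.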

2024


On Divergence Measures for Training GFlowNets

A novel approach to training Generative Flow Networks (GFlowNets) by minimizing divergence measures such as the Rényi-$\alpha$, Tsallis-$\alpha$, and Kullback-Leibler (KL) divergences. Stochastic gradient estimators with variance reduction techniques lead to faster and more stable training.

NeurIPS 2024 (Poster)
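
For reference, the standard definitions of these divergences between the sampler's distribution $p$ and the reward-induced target $q$ (generic forms, not the paper's exact objectives):

$$
D_\alpha(p \,\|\, q) = \frac{1}{\alpha - 1} \log \int p(x)^{\alpha} q(x)^{1-\alpha} \, dx,
\qquad
D^{T}_\alpha(p \,\|\, q) = \frac{1}{\alpha - 1} \left( \int p(x)^{\alpha} q(x)^{1-\alpha} \, dx - 1 \right),
$$

both of which recover $D_{\mathrm{KL}}(p \,\|\, q) = \int p(x) \log \tfrac{p(x)}{q(x)} \, dx$ in the limit $\alpha \to 1$.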

2019


Time is of the Essence: a Joint Hierarchical RNN and Point Process Model for Time and Item Predictions

A joint model combining a Hierarchical RNN for session-based recommendations and a Point Process model for predicting return times. This approach improves both recommendation accuracy and return-time predictions over strong baselines.

WSDM 2019 (Poster)
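
As a rough sketch of how such hybrids are usually set up (a generic formulation, not the paper's exact parameterization), the session-level RNN hidden state $h_i$ can condition both the next-item distribution and a temporal point-process intensity for the return time:

$$
p(\text{item}_{i+1} \mid h_i) = \mathrm{softmax}(W h_i),
\qquad
\lambda(t \mid h_i) = \exp\!\big(v^\top h_i + w\,(t - t_i) + b\big),
$$

with the return-time term $\log \lambda(t_{i+1} \mid h_i) - \int_{t_i}^{t_{i+1}} \lambda(\tau \mid h_i)\, d\tau$ added to the recommendation loss, so both tasks share the same representation.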

2017


Content-Based Social Recommendation with Poisson Matrix Factorization

A latent variable probabilistic model for recommender systems that combines social trust, item content, and user preferences into a unified Poisson matrix factorization framework. This model jointly factorizes the user–item interaction matrix and item–content matrix, accounting for social relationships and content information to enhance recommendation accuracy.

ECML 2017
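
As a rough sketch of the underlying Poisson factorization machinery (the standard formulation; the specific priors and the social/content coupling below are assumptions for illustration, not taken from the paper):

$$
\theta_{uk} \sim \mathrm{Gamma}(a, b),
\qquad
\beta_{ik} \sim \mathrm{Gamma}(c, d),
\qquad
y_{ui} \sim \mathrm{Poisson}\!\Big( \sum_{k} \theta_{uk}\, \beta_{ik} \Big),
$$

with an analogous Poisson likelihood over the item–content matrix sharing the item factors $\beta_i$, and a social term tying each user's factors $\theta_u$ to those of trusted users.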

2014