
2024


On Divergence Measures for Training GFlowNets

A novel approach to training Generative Flow Networks (GFlowNets) by minimizing divergence measures such as Rényi-$\alpha$, Tsallis-$\alpha$, and Kullback-Leibler (KL) divergences. Stochastic gradient estimators combined with variance reduction techniques lead to faster and more stable training.
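As a rough sketch of the objectives involved (treating $p$ as the reward-induced target distribution and $q_\theta$ as the distribution induced by the GFlowNet's forward policy is an assumption about the setup, not the paper's exact notation), the Rényi-$\alpha$ divergence is

$$
D_\alpha(p \,\|\, q_\theta) \;=\; \frac{1}{\alpha - 1} \log \sum_{x} p(x)^{\alpha}\, q_\theta(x)^{1-\alpha},
$$

which recovers the KL divergence $D_{\mathrm{KL}}(p \,\|\, q_\theta) = \sum_x p(x) \log \tfrac{p(x)}{q_\theta(x)}$ in the limit $\alpha \to 1$. Training then amounts to stochastic-gradient minimization of such a divergence in $\theta$, with variance-reduced gradient estimators.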

2019


Time is of the Essence: a Joint Hierarchical RNN and Point Process Model for Time and Item Predictions

A joint model combining a Hierarchical RNN for session-based recommendations and a Point Process model for predicting return times. This approach improves both recommendation accuracy and return-time predictions over strong baselines.
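A minimal sketch of the point-process side, assuming a standard conditional-intensity formulation (the paper's exact parameterization may differ): with conditional intensity $\lambda^*(t)$ and history $\mathcal{H}$, the log-likelihood of the next return time is

$$
\log p(t_{\text{return}} \mid \mathcal{H}) \;=\; \log \lambda^*(t_{\text{return}}) \;-\; \int_{t_{\text{last}}}^{t_{\text{return}}} \lambda^*(s)\, ds,
$$

and in a joint model this term would be combined with the RNN's item-prediction loss, e.g. $\mathcal{L} = \mathcal{L}_{\text{item}} + \mathcal{L}_{\text{time}}$ (the additive combination here is an illustrative assumption).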

2017


Content-Based Social Recommendation with Poisson Matrix Factorization

A latent variable probabilistic model for recommender systems that combines social trust, item content, and user preferences into a unified Poisson matrix factorization framework. This model jointly factorizes the user–item interaction matrix and item–content matrix, accounting for social relationships and content information to enhance recommendation accuracy.
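A minimal generative sketch in the spirit of Poisson factorization (variable names and the exact way social and content signals enter are assumptions, not the paper's notation): with nonnegative user factors $\theta_u$, item factors $\beta_i$, and content factors $\eta_v$,

$$
\theta_u,\, \beta_i,\, \eta_v \sim \mathrm{Gamma}(a, b), \qquad
r_{ui} \sim \mathrm{Poisson}(\theta_u^{\top} \beta_i), \qquad
w_{iv} \sim \mathrm{Poisson}(\beta_i^{\top} \eta_v),
$$

so the shared item factors $\beta_i$ tie the user–item interaction matrix $(r_{ui})$ and the item–content matrix $(w_{iv})$ together; a social component would additionally let a user's expected interactions depend on the preferences of trusted users.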

2014