MONOTONIC ALPHA-DIVERGENCE MINIMISATION FOR VARIATIONAL INFERENCE

[ARTICLE] This paper introduces a novel family of iterative algorithms for α-divergence minimisation in Variational Inference, ensuring a systematic decrease at each step in the divergence between the variational and posterior distributions.

by Kamélia Daudel (ESSEC Business School), Randal Douc and François Roueff

In this paper, we introduce a novel family of iterative algorithms which carry out α-divergence minimisation in a Variational Inference context. They do so by ensuring a systematic decrease at each step in the α-divergence between the variational and the posterior distributions. In its most general form, the variational distribution is a mixture model, and our framework allows us to simultaneously optimise the weights and component parameters of this mixture model. Our approach permits us to build on various methods previously proposed for α-divergence minimisation, such as Gradient or Power Descent schemes, and we also shed new light on an integrated Expectation Maximization algorithm. Lastly, we provide empirical evidence that our methodology yields improved results on several multimodal target distributions and on a real data example.
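As a concrete illustration of the kind of update the abstract describes, recall that the α-divergence can be written D_α(q‖p) = ∫ f_α(q(y)/p(y)) p(y) dy with f_α(u) = u^α / (α(α−1)). When the variational distribution is a mixture q(y) = Σ_j w_j k_j(y) with fixed components, the partial derivative of D_α with respect to the weight w_j is b_j = E_{Y∼k_j}[f'_α(q(Y)/p(Y))], which naturally suggests multiplicative updates of the weights on the simplex. The sketch below implements one such scheme, an entropic-mirror-descent-style step of the kind the paper builds on. It is a minimal sketch under stated assumptions: the Gaussian kernels, bimodal target, step size and all function names are illustrative choices rather than the paper's, component parameters are held fixed, and nothing here reproduces the paper's exact algorithm or its monotonic-decrease guarantee.

import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): a 1-D Gaussian-mixture variational
# family with fixed component means and a common scale; only the mixture
# weights are updated in this sketch.
means = np.array([-4.0, -2.0, 0.0, 2.0, 4.0])
scale = 1.0

def log_p_tilde(y):
    # Unnormalised bimodal target density (illustrative choice).
    return np.logaddexp(norm.logpdf(y, -2.0, 0.6), norm.logpdf(y, 2.0, 0.6))

def log_q(y, w):
    # Log-density of the mixture q(y) = sum_j w[j] * N(y; means[j], scale).
    comp = norm.logpdf(y[:, None], means[None, :], scale)  # shape (M, J)
    return logsumexp(comp + np.log(w)[None, :], axis=1)

def mirror_descent_step(w, alpha=0.5, eta=0.5, n_samples=4000):
    # One entropic-mirror-descent-style update of the mixture weights.
    # b[j] is a Monte Carlo estimate of E_{Y ~ k_j}[f_alpha'(q(Y)/p(Y))],
    # with f_alpha'(u) = u**(alpha - 1) / (alpha - 1), i.e. the partial
    # derivative of the alpha-divergence with respect to w[j].
    b = np.empty(len(w))
    for j, m in enumerate(means):
        y = rng.normal(m, scale, size=n_samples)           # Y ~ k_j
        log_ratio = log_q(y, w) - log_p_tilde(y)
        b[j] = np.exp((alpha - 1.0) * log_ratio).mean() / (alpha - 1.0)
    w_new = w * np.exp(-eta * b)   # multiplicative (mirror) step on the simplex
    return w_new / w_new.sum()     # re-normalise the weights

w = np.full(len(means), 1.0 / len(means))  # start from uniform weights
for _ in range(50):
    w = mirror_descent_step(w)
print(np.round(w, 3))  # mass should concentrate near the target's modes at ±2

Starting from uniform weights, the mass drifts onto the components nearest the two target modes. Note that using the unnormalised posterior only rescales every b_j by the same constant, which behaves like a change of step size; the conditions under which such schemes guarantee a systematic decrease of the α-divergence are what the paper establishes.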

[Read the research paper here]
