ESSEC METALAB

RESEARCH

ALPHA-DIVERGENCE VARIATIONAL INFERENCE MEETS IMPORTANCE WEIGHTED AUTO-ENCODERS: METHODOLOGY AND ASYMPTOTICS

[ARTICLE] This paper studies the VR-IWAE bound, a generalization of the IWAE bound that targets alpha-divergence minimization with unbiased gradient estimators while sharing the properties of the VR bound.

by Kamélia Daudel (ESSEC Business School), Joe Benton, Yuyang Shi and Arnaud Doucet

Several algorithms involving the Variational Rényi (VR) bound have been proposed to minimize an alpha-divergence between a target posterior distribution and a variational distribution. Despite promising empirical results, those algorithms resort to biased stochastic gradient descent procedures and thus lack theoretical guarantees. In this paper, we formalize and study the VR-IWAE bound, a generalization of the importance weighted auto-encoder (IWAE) bound. We show that the VR-IWAE bound enjoys several desirable properties and notably leads to the same stochastic gradient descent procedure as the VR bound in the reparameterized case, but this time by relying on unbiased gradient estimators. We then provide two complementary theoretical analyses of the VR-IWAE bound and thus of the standard IWAE bound. Those analyses shed light on the benefits, or lack thereof, of these bounds. Lastly, we illustrate our theoretical claims on toy and real-data examples.
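To make the abstract concrete, here is a minimal numerical sketch (not the authors' code) of a VR-IWAE-style estimate built from log importance weights: it averages the (1 − alpha)-th power of the importance weights over N samples drawn from the variational distribution and rescales by 1/(1 − alpha), so that alpha = 0 reduces to the standard IWAE estimate. The function name, toy data and overall setup are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def vr_iwae_estimate(log_w: np.ndarray, alpha: float = 0.0) -> float:
    """Monte Carlo estimate of a VR-IWAE-style bound from log importance weights.

    log_w : array of shape (batch, N); entry log_w[b, i] holds
            log p_theta(x_b, z_i) - log q_phi(z_i | x_b) for N samples from q_phi.
    alpha : divergence parameter in [0, 1); alpha = 0 recovers the IWAE estimate.
    """
    assert 0.0 <= alpha < 1.0, "alpha must lie in [0, 1)"
    n_samples = log_w.shape[-1]
    # log( (1/N) * sum_i w_i^(1 - alpha) ), computed stably in log space
    scaled = (1.0 - alpha) * log_w
    log_avg = np.logaddexp.reduce(scaled, axis=-1) - np.log(n_samples)
    # rescale by 1 / (1 - alpha); the expectation is approximated by the batch mean
    return float(np.mean(log_avg) / (1.0 - alpha))

# Toy check with synthetic log weights: alpha = 0 gives the usual IWAE objective.
rng = np.random.default_rng(0)
log_w = rng.normal(size=(64, 16))          # fake log importance weights
print(vr_iwae_estimate(log_w, alpha=0.0))  # IWAE-style estimate
print(vr_iwae_estimate(log_w, alpha=0.5))  # VR-IWAE-style estimate
```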

[Please read the research paper here]
