Research Blog

Arash Vahdat

Recent Posts

A Continuous Relaxation For Training Discrete Variational Autoencoders

Posted by Arash Vahdat on Jul 6, 2018 10:50:36 AM

Advances in deep learning have pushed generative learning into new and complex domains such as molecule design, music, voice, image, and program generation. These advances have been made with continuous latent variable models, in spite of the computational efficiency and greater interpretability offered by discrete latent variables, simply because continuous latent variable models have proven much easier to train. Unfortunately, problems such as clustering, semi-supervised learning, and variational memory addressing all require discrete variables. Efficient training of machine learning models with discrete latent variables therefore remains an important open challenge.
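
To make the training difficulty concrete, here is a minimal sketch of one common continuous relaxation, the Gumbel-Softmax (Concrete) distribution. It is not the specific relaxation developed in this post; it only illustrates the general idea of replacing a non-differentiable discrete sample with a temperature-controlled continuous surrogate so that gradients can flow back to the distribution's parameters. All names and values below are illustrative assumptions.

```python
# Illustrative sketch only (not the method described in this post):
# a Gumbel-Softmax relaxation of a categorical latent variable.
import torch
import torch.nn.functional as F

def gumbel_softmax_sample(logits, temperature=0.5):
    """Draw a relaxed, differentiable one-hot-like sample from a categorical."""
    # Gumbel(0, 1) noise: argmax(logits + noise) would be an exact categorical sample,
    # but argmax has zero gradient almost everywhere.
    gumbel_noise = -torch.log(-torch.log(torch.rand_like(logits) + 1e-20) + 1e-20)
    # A softmax with temperature replaces the argmax; as temperature -> 0
    # the relaxed sample approaches a discrete one-hot vector.
    return F.softmax((logits + gumbel_noise) / temperature, dim=-1)

# Hypothetical example: a 10-way discrete latent code for a batch of 4 inputs.
logits = torch.randn(4, 10, requires_grad=True)
z = gumbel_softmax_sample(logits)   # relaxed sample, differentiable in logits
loss = z.pow(2).sum()               # placeholder downstream loss
loss.backward()                     # gradients reach the categorical parameters
print(logits.grad.shape)            # torch.Size([4, 10])
```

With a truly discrete sample, the backward pass above would fail to reach the logits; the relaxation trades exactness for a low-variance, reparameterized gradient, which is the core trade-off this post explores.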

Tags: QuPA, VAE, DVAE, Generative Learning
