Yoel Zeldes 4/1/2019

Mixture of Variational Autoencoders - a Fusion Between MoE and VAE


This technical article proposes a novel architecture that fuses Mixture of Experts (MoE) with Variational Autoencoders (VAE). It explores how to achieve label-free conditional generation (e.g., generating specific MNIST digits) by using a manager network to route inputs to specialized VAE 'experts', all trained in an entirely unsupervised manner.
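The routing idea described above can be pictured with a minimal sketch, assuming PyTorch. The names here (ExpertVAE, MixtureOfVAEs, the layer sizes, and the gate-weighted loss) are illustrative assumptions, not the author's implementation: a manager network produces soft routing weights over several small VAEs, and each expert's ELBO terms are weighted by its gate value, so no labels are ever used.

```python
# Illustrative sketch only (assumed PyTorch API); not the article's original code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ExpertVAE(nn.Module):
    """A small VAE that specializes on the inputs the manager routes to it."""
    def __init__(self, input_dim=784, latent_dim=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(input_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, latent_dim)
        self.logvar = nn.Linear(256, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                 nn.Linear(256, input_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.dec(z), mu, logvar

class MixtureOfVAEs(nn.Module):
    """A manager network softly routes each input to one of several expert VAEs."""
    def __init__(self, num_experts=10, input_dim=784):
        super().__init__()
        self.experts = nn.ModuleList(ExpertVAE(input_dim) for _ in range(num_experts))
        self.manager = nn.Sequential(nn.Linear(input_dim, 128), nn.ReLU(),
                                     nn.Linear(128, num_experts))

    def forward(self, x):
        gate = F.softmax(self.manager(x), dim=-1)                 # [B, E] routing weights
        recons, mus, logvars = zip(*(e(x) for e in self.experts))
        return (gate,
                torch.stack(recons, 1),   # [B, E, D]
                torch.stack(mus, 1),      # [B, E, latent]
                torch.stack(logvars, 1))  # [B, E, latent]

def loss_fn(x, gate, recons, mus, logvars):
    """Gate-weighted sum of per-expert VAE losses (reconstruction + KL)."""
    rec = ((recons - x.unsqueeze(1)) ** 2).sum(-1)                # [B, E]
    kl = -0.5 * (1 + logvars - mus ** 2 - logvars.exp()).sum(-1)  # [B, E]
    return (gate * (rec + kl)).sum(-1).mean()

# One unsupervised training step on a stand-in batch of flattened MNIST-sized inputs.
model = MixtureOfVAEs(num_experts=10)
x = torch.rand(32, 784)
gate, recons, mus, logvars = model(x)
loss = loss_fn(x, gate, recons, mus, logvars)
loss.backward()
```

Because each expert only pays for inputs its gate weight assigns to it, experts tend to specialize on different regions of the data (e.g., different digits), which is what makes label-free conditional generation possible: sampling from a chosen expert's latent prior yields that expert's "kind" of digit.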

