Wasserstein Autoencoders


Idea

Wasserstein Autoencoders (WAE) are proposed as an alternative to Variational Autoencoders (VAE) for generative modeling: both train an autoencoder whose encoded data distribution is pushed to match a prior over the latent space, so that decoding samples from the prior produces new data. The main idea is to minimize a penalized form of the Wasserstein distance between the data distribution and the model distribution, which leads to a reconstruction cost plus a regularizer penalizing the discrepancy between the aggregated encoded distribution and the prior, in place of the per-sample KL-divergence term used in VAEs.

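As a concrete illustration of the regularizer, here is a minimal PyTorch sketch of the objective in the style of the paper's WAE-MMD variant, where the divergence between encoded codes and the prior is estimated with a kernel MMD (the paper also gives an adversarial WAE-GAN variant). The function names, the inverse multiquadratic kernel scale, and the penalty weight `lam=10.0` are illustrative assumptions rather than the authors' reference implementation, and the encoder/decoder producing `z_q` and `x_recon` are assumed to be defined elsewhere.

```python
import torch

def imq_kernel(x, y, scale=1.0):
    """Inverse multiquadratic kernel k(x, y) = C / (C + ||x - y||^2)."""
    c = 2.0 * x.size(1) * scale          # C = 2 * latent_dim * scale, a common heuristic
    sq_dists = torch.cdist(x, y) ** 2    # pairwise squared Euclidean distances
    return c / (c + sq_dists)

def mmd(z_q, z_p):
    """Kernel MMD estimate between encoded codes z_q ~ Q_Z and prior samples z_p ~ P_Z."""
    n = z_q.size(0)
    off_diag = ~torch.eye(n, dtype=torch.bool, device=z_q.device)
    k_qq = imq_kernel(z_q, z_q)[off_diag].mean()   # within-Q term, diagonal excluded
    k_pp = imq_kernel(z_p, z_p)[off_diag].mean()   # within-P term, diagonal excluded
    k_qp = imq_kernel(z_q, z_p).mean()             # cross term
    return k_qq + k_pp - 2.0 * k_qp

def wae_mmd_loss(x, x_recon, z_q, lam=10.0):
    """Reconstruction cost plus lam * MMD(Q_Z, P_Z), with a standard normal prior."""
    recon = ((x - x_recon) ** 2).flatten(1).sum(dim=1).mean()  # per-example squared error
    z_p = torch.randn_like(z_q)                                # samples from P_Z = N(0, I)
    return recon + lam * mmd(z_q, z_p)
```

Training then reduces to ordinary autoencoder training with this extra penalty on each minibatch of codes, which is one reason the MMD variant needs no adversarial inner loop.
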
Background

Method

Observations