Gaussian Mixture Variational Autoencoder - GitHub
https://github.com/jariasf/GMVAE 02.10.2020 · Implementation of the Gaussian Mixture Variational Autoencoder (GMVAE) for unsupervised clustering in PyTorch and TensorFlow. The probabilistic model is based on the model proposed by Rui Shu, which is a modification of the M2 unsupervised model proposed by Kingma et al. for semi-supervised learning. Unlike other implementations that use …
[1611.02648] Deep Unsupervised Clustering with Gaussian ...
https://arxiv.org/abs/1611.02648 08.11.2016 · We study a variant of the variational autoencoder model (VAE) with a Gaussian mixture as a prior distribution, with the goal of performing unsupervised clustering through deep generative models. We observe that the known problem of over-regularisation that has been shown to arise in regular VAEs also manifests itself in our model and leads to cluster …
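The core change described in this abstract is replacing the standard-normal VAE prior with a mixture of Gaussians, log p(z) = log Σ_k π_k N(z; μ_k, σ_k² I), so that mixture components can absorb cluster structure in the latent space. A minimal NumPy sketch of that prior density (not code from either repository; the function name and diagonal-covariance parameterisation are my own illustrative assumptions):

```python
import numpy as np

def log_gm_prior(z, means, log_sigmas, weights):
    """Log-density of a latent vector z under a diagonal Gaussian mixture prior:
    log p(z) = log sum_k pi_k * N(z; mu_k, diag(sigma_k^2)).

    means, log_sigmas: arrays of shape (K, D); weights: shape (K,), summing to 1.
    """
    z = np.asarray(z, dtype=float)
    d = z.shape[-1]
    var = np.exp(2.0 * log_sigmas)                       # per-dimension variances sigma_k^2
    # log N(z; mu_k, sigma_k^2 I) for each of the K components
    log_norm = -0.5 * (d * np.log(2.0 * np.pi) + 2.0 * log_sigmas.sum(axis=1))
    log_comp = log_norm - 0.5 * (((z - means) ** 2) / var).sum(axis=1)
    # weight by pi_k and combine with a numerically stable log-sum-exp
    a = np.log(weights) + log_comp
    m = a.max()
    return m + np.log(np.exp(a - m).sum())
```

With a single component at the origin (K = 1, μ = 0, σ = 1) this reduces to the standard normal log-density, matching the regular VAE prior the paper compares against.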
GitHub - Nat-D/GMVAE: Deep Unsupervised Clustering with ...
https://github.com/Nat-D/GMVAE 04.03.2020 · Abstract. We study a variant of the variational autoencoder model with a Gaussian mixture as a prior distribution, with the goal of performing unsupervised clustering through deep generative models. We observe that the standard variational approach in these models is unsuited for unsupervised clustering, and mitigate this problem by leveraging ...
Learning Latent Superstructures in Variational ...
https://arxiv.org/abs/1803.05206 14.03.2018 · We investigate a variant of variational autoencoders where there is a superstructure of discrete latent variables on top of the latent features. In general, our superstructure is a tree structure of multiple super latent variables and it is automatically learned from data. When there is only one latent variable in the superstructure, our model reduces to one that assumes the …