[1412.6581] Variational Recurrent Auto-Encoders
https://arxiv.org/abs/1412.6581 · Dec 20, 2014 · In this paper we propose a model that combines the strengths of RNNs and SGVB: the Variational Recurrent Auto-Encoder (VRAE). Such a model can be used for efficient, large scale unsupervised learning on time series data, mapping the time series data to a latent vector representation. The model is generative, such that data can be generated from samples of the latent space. An important ...
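The snippet above describes the VRAE recipe at a high level: an RNN encoder compresses a whole time series into a single latent vector, and an RNN decoder generates a sequence back from a sample of that latent. The following is a minimal numpy sketch of that idea, not the paper's actual implementation; all dimensions, weight names, and the plain-tanh recurrence are illustrative assumptions, and the SGVB-style reparameterization trick (z = mu + sigma * eps) is shown without the training objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for illustration
T, x_dim, h_dim, z_dim = 10, 3, 8, 2

# Encoder RNN: read the whole sequence, keep only the final hidden state
W_xh = rng.normal(0, 0.1, (h_dim, x_dim))
W_hh = rng.normal(0, 0.1, (h_dim, h_dim))

def encode(x_seq):
    h = np.zeros(h_dim)
    for x_t in x_seq:
        h = np.tanh(W_xh @ x_t + W_hh @ h)
    return h

# Map the final state to the parameters of q(z|x), then sample with the
# reparameterization trick, as in SGVB
W_mu = rng.normal(0, 0.1, (z_dim, h_dim))
W_logvar = rng.normal(0, 0.1, (z_dim, h_dim))

def sample_z(h):
    mu, logvar = W_mu @ h, W_logvar @ h
    eps = rng.normal(size=z_dim)
    return mu + np.exp(0.5 * logvar) * eps

# Decoder RNN: z initializes the hidden state, then the sequence is
# rolled out step by step
W_zh = rng.normal(0, 0.1, (h_dim, z_dim))
W_hx = rng.normal(0, 0.1, (x_dim, h_dim))

def decode(z, steps):
    h = np.tanh(W_zh @ z)
    out = []
    for _ in range(steps):
        h = np.tanh(W_hh @ h)
        out.append(W_hx @ h)
    return np.stack(out)

x = rng.normal(size=(T, x_dim))     # a toy time series
z = sample_z(encode(x))             # the whole series as one latent vector
x_rec = decode(z, T)                # reconstruction with the input's shape
```

Because the decoder consumes only z, new sequences can be generated by feeding it samples from the prior, which is the "generative" property the abstract highlights.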
[1506.02216] A Recurrent Latent Variable Model for Sequential Data
https://arxiv.org/abs/1506.02216 · Jun 7, 2015 · A Recurrent Latent Variable Model for Sequential Data. In this paper, we explore the inclusion of latent random variables into the dynamic hidden state of a recurrent neural network (RNN) by combining elements of the variational autoencoder. We argue that through the use of high-level latent random variables, the variational RNN (VRNN) can ...
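The key structural difference this abstract points at, versus the VRAE above, is that the VRNN draws a fresh latent random variable at every timestep inside the recurrence, rather than one global latent per sequence. A minimal numpy sketch of one recurrence step under that idea follows; the weight shapes, the tanh cell, and the state-conditioned Gaussian prior are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dimensions, chosen only for illustration
T, x_dim, h_dim, z_dim = 5, 3, 8, 2

# Prior p(z_t | h_{t-1}): previous hidden state parameterizes a Gaussian
W_prior = rng.normal(0, 0.1, (2 * z_dim, h_dim))
# Recurrence sees the input, the sampled latent, and the previous state
W_rec = rng.normal(0, 0.1, (h_dim, x_dim + z_dim + h_dim))

def step(h, x_t):
    # Unlike a VAE with one global z, a fresh z_t is sampled at every
    # timestep from a prior conditioned on the previous hidden state
    mu, logvar = np.split(W_prior @ h, 2)
    z_t = mu + np.exp(0.5 * logvar) * rng.normal(size=z_dim)
    h_new = np.tanh(W_rec @ np.concatenate([x_t, z_t, h]))
    return h_new, z_t

h = np.zeros(h_dim)
xs = rng.normal(size=(T, x_dim))    # a toy input sequence
latents = []
for x_t in xs:
    h, z_t = step(h, x_t)
    latents.append(z_t)
```

Since the prior's mean and variance depend on h, which itself depends on past latents, the per-step distributions change over time; that state-dependent stochasticity is what the abstract means by latent random variables in the dynamic hidden state.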