- Deep Variational Clustering Framework for Self-labeling of Large-scale Medica...
  The proposed framework is composed of two networks (see Fig 1). The encoder network q with parameters φ computes qφ(z|x) : xi → zi. The encoder maps an input image xi ∈ X to...
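The encoder described in this entry can be sketched as a Gaussian qφ(z|x) with the reparameterization trick. This is an illustrative stand-in only: the single linear layer, the layer sizes, and all names are assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch (not the paper's architecture): a Gaussian encoder
# q_phi(z|x) maps an input x_i to the mean and log-variance of a latent
# Gaussian, and z_i is drawn via the reparameterization trick.
class GaussianEncoder:
    def __init__(self, x_dim, z_dim):
        # phi: here just one linear layer producing [mu, log_var]
        self.W = rng.normal(0.0, 0.1, size=(x_dim, 2 * z_dim))
        self.b = np.zeros(2 * z_dim)
        self.z_dim = z_dim

    def __call__(self, x):
        h = x @ self.W + self.b
        mu, log_var = h[: self.z_dim], h[self.z_dim:]
        eps = rng.standard_normal(self.z_dim)
        z = mu + np.exp(0.5 * log_var) * eps  # z_i ~ q_phi(z|x_i)
        return z, mu, log_var

enc = GaussianEncoder(x_dim=784, z_dim=16)  # e.g. a flattened 28x28 image
x_i = rng.random(784)
z_i, mu, log_var = enc(x_i)
print(z_i.shape)  # (16,)
```

Sampling z via mu + sigma * eps (rather than drawing z directly) is what keeps the mapping differentiable with respect to φ during training.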
- Density Estimation Using Real NVP
- Alternating Back-Propagation for Generator Networks
- Wasserstein GAN
- Flexible Prior Distributions for Deep Generative Models
  The dataset-induced prior distribution is learned using a secondary GAN, named PGAN. This prior is then used to further train the original GAN.
- Stacked Wasserstein Autoencoder
  The proposed model is built on the theoretical analysis presented in [30,14]. Similar to the ARAE [14], our model provides flexibility in learning an autoencoder from the input...
- Learning minimal representations of stochastic processes with variational aut...
  Stochastic processes have found numerous applications in science, as they are broadly used to model a variety of natural phenomena. The dataset consists of trajectories of...
- PRVNet: A Novel Partially-Regularized Variational Autoencoders for Massive MI...
  The dataset used in this paper is for CSI feedback compression in MIMO-OFDM systems.
- A Bayesian Non-parametric Approach to Generative Models
  Generative models have emerged as a promising technique for producing high-quality images that are indistinguishable from real images.
- Disentangled Representation Learning
  Disentangled representation learning aims to train a model that can separate the underlying factors of variation in observed data.
- Importance Sampling with Variational Autoencoders
  The dataset used in the paper comes from a high-dimensional, non-parametric importance sampling problem.
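The idea behind this entry, estimating an expectation under a hard target by sampling from a learned proposal and reweighting, can be shown with self-normalized importance sampling. A fixed Gaussian proposal stands in here for the VAE-based one; the target, integrand, and all numbers are toy assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative self-normalized importance sampling. A plain Gaussian q
# stands in for the learned (VAE-based) proposal; p and the integrand
# are toy choices, not from the paper.
def log_p(x):
    # Unnormalized log-density of the target: N(2, 0.5^2)
    return -0.5 * ((x - 2.0) / 0.5) ** 2

def log_q(x):
    # Log-density of the proposal (up to a constant): N(0, 2^2)
    return -0.5 * (x / 2.0) ** 2

n = 100_000
x = rng.normal(0.0, 2.0, size=n)     # draw from the proposal q
log_w = log_p(x) - log_q(x)          # importance log-weights
w = np.exp(log_w - log_w.max())      # stabilize before exponentiating
w /= w.sum()                         # self-normalize the weights

est = np.sum(w * x)                  # estimate of E_p[x], close to 2
print(round(est, 2))
```

Self-normalizing the weights means neither p nor q needs a known normalizing constant, which is exactly why a learned proposal density is usable here.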
- Lifelong Teacher-Student Network Learning
  A unique cognitive capability of humans is their ability to acquire new knowledge and skills from a sequence of experiences. Meanwhile, artificial intelligence systems...
- SGVAE: Sequential Graph Variational Autoencoder
  Generative models of graphs are well-known, but many existing models are limited in scalability and expressivity. We present a novel sequential graphical variational autoencoder...
- CULT: Continual Unsupervised Learning with Typicality-Based Environment Detec...
  The FashionMNIST and MNIST datasets are used for continual unsupervised learning with variational autoencoders and generative replay.
- Adversarial Feature Learning
- Dynamical Variational Autoencoders: A Comprehensive Review
  A comprehensive review of dynamical variational autoencoders.
- Auto-encoding variational Bayes