Joint Observations
The dataset is a set of joint observations used both to train the generative model and to evaluate the performance of the proposed method. -
Learning (Very) Simple Generative Models Is Hard
The dataset is used to study the computational complexity of learning pushforwards of Gaussians under one-hidden-layer ReLU networks.
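For orientation, a minimal Python/NumPy sketch of the object being learned: a standard Gaussian pushed forward through a one-hidden-layer ReLU network. The dimensions, weights, and the presence of an output layer are illustrative assumptions, not the paper's exact parametrization.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative sizes (assumptions): latent dim d, hidden width k, output dim n.
    d, k, n = 4, 8, 4
    W1 = rng.standard_normal((k, d))   # first-layer weights
    b1 = rng.standard_normal(k)        # first-layer biases
    W2 = rng.standard_normal((n, k))   # output-layer weights

    def pushforward_sample(m):
        """Draw m samples from the pushforward of N(0, I_d) under the ReLU network."""
        z = rng.standard_normal((m, d))        # Gaussian latents
        h = np.maximum(z @ W1.T + b1, 0.0)     # one hidden ReLU layer
        return h @ W2.T                        # generated samples

    x = pushforward_sample(1000)   # samples whose distribution a learner must recover
    print(x.shape)                 # (1000, 4)
-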
Synthesizing Human Faces using Latent Space Factorization and Local Weights (...
A 3D face generative model with local weights to increase the model's variations and expressiveness. -
Prediction-Focused Mixtures
The dataset used in the paper is a concatenation of M independent mixtures, each with K_m components.
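A minimal Python/NumPy sketch of sampling such data; the one-dimensional Gaussian components, the values of M and K_m, and the mixing weights are illustrative assumptions rather than the paper's choices.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative setup (assumptions): M independent mixtures, the m-th having K_m components.
    M = 3
    K = [2, 3, 4]                                      # K_m for each mixture
    weights = [rng.dirichlet(np.ones(k)) for k in K]   # mixing proportions per mixture
    means = [5.0 * rng.standard_normal(k) for k in K]  # component means per mixture

    def sample(n):
        """Each observation concatenates one independent draw from each of the M mixtures."""
        cols = []
        for m in range(M):
            comp = rng.choice(K[m], size=n, p=weights[m])         # component index per row
            cols.append(rng.normal(loc=means[m][comp], scale=1.0))
        return np.column_stack(cols)                              # shape (n, M)

    X = sample(1000)
    print(X.shape)   # (1000, 3)
-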
Hierarchical Exponential-family Energy-based (HEE) model on CIFAR10
The HEE model uses CIFAR10 to demonstrate its ability to generate high-quality images. -
Hierarchical Exponential-family Energy-based (HEE) model
The HEE model uses 2D synthetic datasets and FashionMNIST to validate its capabilities. -
Variational Discriminator Bottleneck
The dataset is not explicitly described, but the authors mention using a 34-degrees-of-freedom humanoid character and a phase-functioned... -
Generative Models for 3D Objects
Generative models for 3D objects -
GAN and VAE from an Optimal Transport Point of View
The entry describes generative models rather than a dataset: the paper analyzes a Wasserstein GAN and a Wasserstein VAE from an optimal transport point of view.
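For orientation, both models can be tied to the Kantorovich formulation of optimal transport between the data distribution and the model distribution; in generic notation (an orientation aid, not necessarily the paper's exact statement):

    W_c(P_{\mathrm{data}}, P_G) \;=\; \inf_{\gamma \in \Pi(P_{\mathrm{data}}, P_G)} \; \mathbb{E}_{(x, y) \sim \gamma}\big[\, c(x, y) \,\big]

where \Pi(P_{\mathrm{data}}, P_G) is the set of couplings with the given marginals and c is a ground cost such as the Euclidean distance; the Wasserstein GAN approaches this quantity through its dual form, while Wasserstein-style autoencoders relax the primal problem.
-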
Synthetic two-dimensional data and MNIST digits
The experiments use synthetic two-dimensional data and MNIST digits. -
Density Estimation Using Real NVP
This dataset has no description
-
Alternating Back-Propagation for Generator Networks
This dataset has no description
-
Wasserstein GAN
This dataset has no description
-
Flexible Prior Distributions for Deep Generative Models
The dataset-induced prior distribution is learned using a secondary GAN named PGAN; this prior is then used to further train the original GAN. -
Max-Margin Deep Generative Models
Deep generative models (DGMs) are effective at learning multilayered representations of complex data and at performing inference on input data by exploiting their generative ability. -
Score-based generative model for function spaces
The entry describes the method rather than a dataset: the paper proposes a score-based generative model for function spaces.
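For orientation, such models carry score-based diffusion over to function-valued (infinite-dimensional) data; the standard finite-dimensional denoising score matching objective they build on is, in generic notation (an assumption for orientation, not the paper's exact formulation):

    \mathcal{L}(\theta) \;=\; \mathbb{E}_{t}\, \mathbb{E}_{x_0}\, \mathbb{E}_{x_t \mid x_0} \Big[\, \lambda(t)\, \big\| s_\theta(x_t, t) - \nabla_{x_t} \log p_t(x_t \mid x_0) \big\|^2 \,\Big]

where s_\theta is the learned score network, p_t(x_t | x_0) is the forward noising kernel, and \lambda(t) is a time-dependent weighting.
-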
GenCO: Generating Diverse Designs with Combinatorial Constraints
GenCO: Generating Diverse Designs with Combinatorial Constraints -
DDIM or PLMS
Diffusion-model sampling methods, such as DDIM or PLMS, as used by text-to-image models.
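A minimal Python/NumPy sketch of one deterministic DDIM update (the eta = 0 case); the noise schedule and the noise predictor below are placeholders standing in for a trained, text-conditioned diffusion model.

    import numpy as np

    def ddim_step(x_t, eps, alpha_bar_t, alpha_bar_prev):
        """One deterministic DDIM step from timestep t to the previous timestep.

        x_t:            current noisy sample
        eps:            predicted noise eps_theta(x_t, t) from the diffusion model
        alpha_bar_t:    cumulative product of (1 - beta_s) up to step t
        alpha_bar_prev: the same quantity at the previous (less noisy) step
        """
        x0_pred = (x_t - np.sqrt(1.0 - alpha_bar_t) * eps) / np.sqrt(alpha_bar_t)
        return np.sqrt(alpha_bar_prev) * x0_pred + np.sqrt(1.0 - alpha_bar_prev) * eps

    # Toy usage with a placeholder noise predictor and schedule.
    rng = np.random.default_rng(0)
    x = rng.standard_normal((4, 4))               # stand-in for an image/latent tensor
    alpha_bars = np.linspace(0.9999, 0.01, 50)    # toy schedule: index 0 is clean, index 49 is noisiest
    for t in range(len(alpha_bars) - 1, 0, -1):
        eps_pred = rng.standard_normal(x.shape)   # placeholder for eps_theta(x, t)
        x = ddim_step(x, eps_pred, alpha_bars[t], alpha_bars[t - 1])
-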
OpenCLIP and LAION-5B
Building blocks of language-guided image diffusion models, such as the OpenCLIP image-text encoder and the LAION-5B image-text dataset.
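A minimal sketch of loading a LAION-trained OpenCLIP encoder and scoring image-text similarity with the open_clip package; the model name, pretrained tag, and image path are illustrative assumptions and may need to be adapted to locally available checkpoints.

    import torch
    import open_clip
    from PIL import Image

    # Assumed checkpoint names; open_clip.list_pretrained() shows what is actually available.
    model, _, preprocess = open_clip.create_model_and_transforms(
        "ViT-B-32", pretrained="laion2b_s34b_b79k"
    )
    tokenizer = open_clip.get_tokenizer("ViT-B-32")
    model.eval()

    image = preprocess(Image.open("example.jpg")).unsqueeze(0)   # placeholder image path
    text = tokenizer(["a photo of a cat", "a photo of a dog"])

    with torch.no_grad():
        image_features = model.encode_image(image)
        text_features = model.encode_text(text)
        image_features = image_features / image_features.norm(dim=-1, keepdim=True)
        text_features = text_features / text_features.norm(dim=-1, keepdim=True)
        similarity = (100.0 * image_features @ text_features.T).softmax(dim=-1)

    print(similarity)   # probabilities over the candidate captions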