Hamiltonian Neural Networks for Solving Differential Equations
The Hamiltonian neural network architecture is used to solve systems of differential equations. The Hamiltonian NN is an evolution of earlier unsupervised NNs for finding solutions to DEs that... -
Learning (Very) Simple Generative Models Is Hard
The dataset is used to study the computational complexity of learning pushforwards of Gaussians under one-hidden-layer ReLU networks. -
Interactive Simulations of Backdoors in Neural Networks
This work addresses the problem of planting, and defending against, cryptographically based backdoors in artificial intelligence (AI) models. The motivation comes from our lack of... -
Analyzing individual neurons in pre-trained models
The dataset is used for analyzing individual neurons in pre-trained models. -
A Logical Calculus of the Ideas Immanent in Nervous Activity
This is the foundational McCulloch–Pitts paper modeling nervous activity with propositional logic; it is theoretical and uses no dataset. -
Energy-based out-of-distribution detection
Proposes an energy score computed from a classifier's logits as a replacement for the softmax confidence score when detecting out-of-distribution inputs. -
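As a minimal sketch of the energy score commonly associated with this title (assuming the standard formulation E(x; T) = -T log Σ_i exp(f_i(x)/T) over the logits; the function name and example logits are illustrative):

```python
import numpy as np

def energy_score(logits, T=1.0):
    # E(x; T) = -T * log sum_i exp(f_i(x) / T), computed stably
    # via the log-sum-exp trick. Lower energy -> more in-distribution.
    z = np.asarray(logits, dtype=float) / T
    m = z.max()
    return -T * (m + np.log(np.exp(z - m).sum()))

# Thresholding the energy yields an OOD detector:
peaked = energy_score([8.0, 0.5, 0.3])  # one confident logit -> low energy
flat = energy_score([1.1, 1.0, 0.9])    # flat logits -> higher energy
assert peaked < flat
```

Unlike the maximum softmax probability, the energy score is not normalized across classes, which is what makes it better aligned with the input's density under the model.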
Enhancing the reliability of out-of-distribution image detection in neural ne...
Improves the reliability of out-of-distribution image detection by applying temperature scaling and small input perturbations to a pre-trained network's softmax scores. -
A baseline for detecting misclassified and out-of-distribution examples in ne...
Establishes the maximum softmax probability as a baseline score for detecting misclassified and out-of-distribution examples in neural networks. -
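A minimal sketch of the maximum-softmax-probability baseline this title refers to (function name and example logits are illustrative):

```python
import numpy as np

def msp_score(logits):
    # Maximum softmax probability: a low MSP flags inputs that are
    # likely misclassified or out-of-distribution.
    z = np.asarray(logits, dtype=float)
    z = z - z.max()                    # subtract max for numerical stability
    p = np.exp(z) / np.exp(z).sum()
    return p.max()

confident = msp_score([8.0, 0.5, 0.3])  # peaked logits -> MSP near 1
uncertain = msp_score([1.1, 1.0, 0.9])  # flat logits -> low MSP
assert confident > uncertain
```

Despite its simplicity, this score is the standard point of comparison for the more elaborate detectors in the two entries above.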
Google Edge TPU
The dataset consists of 24 state-of-the-art Google edge neural network models, including CNNs, LSTMs, Transducers, and RCNNs. -
Zenkai - Framework for Exploring Beyond Backpropagation
Zenkai is an open-source framework designed to give researchers more control and flexibility over building and training deep learning machines. -
Multifunctional Agent
The dataset used in the paper is a set of embodied recurrent neural networks that perform object categorization and pole-balancing tasks. -
Leapfrogging for parallelism in deep neural networks
The paper considers a neural network with L layers, numbered 1, ..., L, in which each hidden layer has N neurons. -
NITI: INTEGER TRAINING
The datasets used in this paper are MNIST, CIFAR10, and ImageNet. -
Hierarchical Exponential-family Energy-based (HEE) model on CIFAR10
The HEE model uses CIFAR10 to demonstrate its ability to generate high-quality images. -
Hierarchical Exponential-family Energy-based (HEE) model
The HEE model uses 2D synthetic datasets and FashionMNIST to validate its capabilities. -
Non-asymptotic approximations of neural networks by Gaussian processes
No dataset is explicitly described in the paper; the authors study the extent to which wide neural networks can be approximated by Gaussian processes. -
Training Over-Parameterized Deep Neural Networks
The dataset used in this paper is a collection of training data for over-parameterized deep neural networks. -
Neural Certificates for Safe Control Policies
This paper develops an approach to learning a control policy for a dynamical system that is guaranteed to be both provably safe and goal-reaching.