- Learning Trajectories are Generalization Indicators
  This paper explores the connection between the learning trajectories of Deep Neural Networks (DNNs) and their generalization capabilities when they are optimized using stochastic gradient descent. (A sketch of recording such a trajectory appears after this list.)
- Adam: A method for stochastic optimization
  This dataset is used to test the robustness of watermarking methods against adaptive attacks. (The Adam update rule itself is sketched after this list.)
- Two-Layer Neural Networks
  The dataset is used to analyze the convergence of stochastic gradient descent for two-layer neural networks. (See the two-layer SGD sketch after this list.)
- Limiting Dynamics of SGD
  The dataset used in this paper is a collection of pre-trained convolutional neural networks trained on ImageNet. (See the pre-trained-model sketch after this list.)
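
For the first entry, here is a minimal sketch of what a learning trajectory can mean in practice: the sequence of (step, loss, parameter) snapshots visited by minibatch SGD. The toy least-squares objective, hyperparameters, and variable names are illustrative assumptions, not the paper's actual setup.

```python
# Minimal sketch (assumed setup): log the (step, loss, parameters) sequence
# visited by minibatch SGD on a toy least-squares problem. This sequence is
# the "learning trajectory"; the objective and hyperparameters are made up.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 10))                 # toy inputs
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=256)

w = np.zeros(10)
lr, batch = 0.05, 32
trajectory = []                                # list of (step, loss, w snapshot)

for step in range(200):
    idx = rng.choice(len(X), size=batch, replace=False)
    xb, yb = X[idx], y[idx]
    grad = xb.T @ (xb @ w - yb) / batch        # gradient of 0.5 * mean squared error
    w -= lr * grad                             # SGD update
    loss = 0.5 * np.mean((X @ w - y) ** 2)
    trajectory.append((step, loss, w.copy()))

print(f"trajectory length: {len(trajectory)}, final loss: {trajectory[-1][1]:.4f}")
```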
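
For the Adam entry, the update rule from the cited paper keeps exponential moving averages of the gradient and its square, applies bias correction, and takes a scaled step. The NumPy sketch below implements that rule with the usual default hyperparameters; the toy quadratic it is applied to is an assumption for illustration.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. m, v are the running first/second moment estimates,
    t is the 1-based step count (needed for bias correction)."""
    m = beta1 * m + (1 - beta1) * grad           # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2      # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                 # bias-corrected estimates
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # parameter update
    return w, m, v

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.ones(3)
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 501):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.05)
print(w)  # approaches the minimizer at the origin
```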
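
For the two-layer entry, the sketch below trains a one-hidden-layer ReLU network on synthetic data with plain minibatch SGD, the kind of setting such a convergence analysis concerns. The data, layer widths, and learning rate are placeholder assumptions, not the paper's experimental configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, h = 512, 5, 64                          # samples, input dim, hidden width
X = rng.normal(size=(n, d))
y = np.sin(X @ rng.normal(size=d))            # synthetic scalar targets

W1 = rng.normal(size=(d, h)) / np.sqrt(d)     # first-layer weights
W2 = rng.normal(size=(h, 1)) / np.sqrt(h)     # second-layer weights
lr, batch = 0.1, 64

for step in range(2000):
    idx = rng.choice(n, size=batch, replace=False)
    xb, yb = X[idx], y[idx, None]
    z = xb @ W1                               # hidden pre-activations
    a = np.maximum(z, 0.0)                    # ReLU
    pred = a @ W2
    # Backpropagation for loss = 0.5 * mean((pred - yb) ** 2)
    gpred = (pred - yb) / batch
    gW2 = a.T @ gpred
    gz = (gpred @ W2.T) * (z > 0)
    gW1 = xb.T @ gz
    W1 -= lr * gW1                            # plain SGD updates
    W2 -= lr * gW2

final = 0.5 * np.mean((np.maximum(X @ W1, 0.0) @ W2 - y[:, None]) ** 2)
print(f"final training loss: {final:.4f}")
```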
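
For the last entry, a collection of pre-trained ImageNet CNNs is typically drawn from a model zoo. The PyTorch sketch below loads one such network from torchvision and takes a few SGD steps while logging the weight norm, as an example of a quantity whose late-phase ("limiting") dynamics could be tracked; the choice of ResNet-18, the random placeholder batch, and the logged statistic are assumptions, not the paper's procedure.

```python
# Sketch only (assumed workflow): load a pre-trained ImageNet CNN from the
# torchvision model zoo and take a few SGD steps on a placeholder batch,
# logging the weight norm. Real experiments would use an ImageNet data loader.
import torch
import torchvision

model = torchvision.models.resnet18(weights=torchvision.models.ResNet18_Weights.IMAGENET1K_V1)
model.train()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
criterion = torch.nn.CrossEntropyLoss()

for step in range(5):
    images = torch.randn(8, 3, 224, 224)      # placeholder images
    labels = torch.randint(0, 1000, (8,))     # placeholder ImageNet class labels
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        w_norm = torch.sqrt(sum(p.pow(2).sum() for p in model.parameters()))
    print(f"step {step}: loss={loss.item():.3f}, weight norm={w_norm.item():.1f}")
```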