URL pre-training dataset
A dataset of 20 million unlabeled URLs for pre-training -
Point-BERT: Pre-training 3D Point Cloud Transformers with Masked Point Modeling
Point-BERT is a new paradigm for learning point cloud Transformers. It pre-trains standard point cloud Transformers with a Masked Point Modeling (MPM) task. -
Bengali Handwritten Digit Dataset
A dataset of 70,000 handwritten samples of Bengali numerals for recognition using an artificial-neural-network-based architecture pre-trained with a stacked denoising autoencoder. -
BERT: Pre-training of deep bidirectional transformers for language understanding
This paper proposes BERT, a deep bidirectional Transformer pre-trained for language understanding.
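The masked pre-training objective shared by the BERT and Point-BERT entries above can be sketched in a few lines: randomly replace a fraction of input tokens with a mask symbol, and train the model to recover the originals. The sketch below is illustrative only; the token values, masking rate, and function name are assumptions, not code from any of the listed works.

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Randomly replace a fraction of tokens with [MASK].

    Returns the corrupted sequence and per-position labels: the
    original token where it was masked (the prediction target),
    None elsewhere. This mirrors the masked-modeling objective
    used to pre-train BERT-style models.
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(MASK)   # corrupt the input
            labels.append(tok)    # target: recover the original
        else:
            masked.append(tok)
            labels.append(None)   # position not predicted
    return masked, labels

masked, labels = mask_tokens(["the", "cat", "sat", "on", "the", "mat"],
                             mask_prob=0.5, seed=1)
```

During pre-training, a Transformer consumes `masked` and is trained to predict the non-`None` entries of `labels`; Point-BERT applies the same idea to discretized point-cloud patches instead of word tokens.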