2 datasets found

Formats: JSON | Tags: Transformer Model

  • ARAGPT2

    ARAGPT2 is a stacked Transformer-decoder model trained with a causal language modeling objective on 77GB of Arabic text (see the generation sketch after this list).
  • TPPoet

    A dataset of classical Persian poems used to train a decoder-only Transformer model that generates unconditioned rhyming couplets.
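Both entries describe decoder-only models trained with the causal language modeling objective, i.e. predicting each token from the tokens before it. As a minimal sketch of what that enables, the example below loads ARAGPT2 through the Hugging Face transformers library and generates a continuation. The model ID aubmindlab/aragpt2-base is an assumption about where the checkpoint is hosted; adjust it to the actual release.

```python
# Minimal causal-LM generation sketch for ARAGPT2.
# Assumption: the checkpoint is available on the Hugging Face Hub under
# "aubmindlab/aragpt2-base"; substitute the actual model ID if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aubmindlab/aragpt2-base"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Causal LM: the model predicts each next token from the preceding ones,
# so generation is repeated next-token sampling from a prompt.
prompt = "يعتبر التعلم العميق"  # "Deep learning is considered ..."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```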
You can also access this registry using the API (see API Docs).
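For programmatic access, the sketch below queries a registry search endpoint for the listing above. The base URL and the CKAN-style /api/3/action/package_search route are assumptions made for illustration; the API Docs describe the actual interface.

```python
# Hypothetical query against the registry API for the datasets listed above.
# Assumption: a CKAN-style search action; replace BASE_URL and the route
# with the values given in the API Docs.
import requests

BASE_URL = "https://example-registry.org"  # placeholder, not the real host
resp = requests.get(
    f"{BASE_URL}/api/3/action/package_search",
    params={"q": "transformer", "rows": 10},
    timeout=30,
)
resp.raise_for_status()
for result in resp.json()["result"]["results"]:
    print(result["name"], "-", result.get("notes", "")[:80])
```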