XLNet: Generalized Autoregressive Pretraining for Language Understanding

XLNet is a generalized autoregressive pretraining method for language understanding.
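"Generalized autoregressive pretraining" refers to XLNet's permutation language modeling objective: the model maximizes the likelihood of a sequence under random factorization orders, so each token is predicted from bidirectional context in expectation. A minimal sketch of the idea, using a hypothetical helper `permutation_lm_targets` (not the paper's code):

```python
import random

def permutation_lm_targets(tokens, seed=0):
    """Sketch of permutation language modeling: sample one
    factorization order and record, for each step, the context
    (tokens already seen in that order) and the prediction target."""
    rng = random.Random(seed)
    order = list(range(len(tokens)))
    rng.shuffle(order)  # random factorization order
    steps = []
    for i, pos in enumerate(order):
        context = [tokens[p] for p in order[:i]]  # tokens visible at this step
        steps.append((context, tokens[pos]))      # predict tokens[pos] from context
    return steps

steps = permutation_lm_targets(["New", "York", "is", "a", "city"])
# Every token appears once as a target; averaging over many sampled
# orders exposes the model to contexts on both sides of each position.
```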

Data and Resources

Cite this as

Zhilin Yang, Zihang Dai, Yiming Yang, Jaime Carbonell, Ruslan Salakhutdinov, Quoc V. Le (2025). Dataset: XLNet: Generalized Autoregressive Pretraining for Language Understanding. https://doi.org/10.57702/mt69o3yg

DOI retrieved: January 3, 2025

Additional Info

Field Value
Created January 3, 2025
Last update January 3, 2025
Defined in https://doi.org/10.48550/arXiv.2105.03994
Citation https://doi.org/10.48550/arXiv.2006.15020
Author Zhilin Yang
More authors Zihang Dai, Yiming Yang, Jaime Carbonell, Ruslan Salakhutdinov, Quoc V. Le
Homepage https://arxiv.org/abs/1906.08237