
Unified language model pre-training for natural language understanding and generation

A dataset for unified language model pre-training for natural language understanding and generation.

Data and Resources

Cite this as

Li Dong, Nan Yang, Wenhui Wang, Furu Wei, Xiaodong Liu, Yu Wang, Jianfeng Gao, Ming Zhou, Hsiao-Wuen Hon (2024). Dataset: Unified language model pre-training for natural language understanding and generation. https://doi.org/10.57702/4v41nnxp

DOI retrieved: December 2, 2024

Additional Info

Created: December 2, 2024
Last update: December 2, 2024
Author: Li Dong
More authors: Nan Yang, Wenhui Wang, Furu Wei, Xiaodong Liu, Yu Wang, Jianfeng Gao, Ming Zhou, Hsiao-Wuen Hon
Homepage: https://arxiv.org/abs/1906.10500