
Diffusion-LM Improves Controllable Text Generation

Controlling the behavior of language models (LMs) without re-training is a major open problem in natural language generation. We develop a new non-autoregressive language model based on continuous diffusion that we call Diffusion-LM.

Data and Resources

This dataset has no data

Cite this as

Xiang Lisa Li, John Thickstun, Ishaan Gulrajani, Percy Liang, Tatsunori B. Hashimoto (2024). Dataset: Diffusion-LM Improves Controllable Text Generation. https://doi.org/10.57702/y6eyps02

Private DOI: This DOI is not yet resolvable.
It is available for use in manuscripts and will be published when the dataset is made public.

Additional Info

Created: December 3, 2024
Last update: December 3, 2024
Defined In: https://doi.org/10.48550/arXiv.2205.14217
Author: Xiang Lisa Li
More Authors: John Thickstun, Ishaan Gulrajani, Percy Liang, Tatsunori B. Hashimoto
Homepage: https://arxiv.org/abs/2205.14217