Diffusion-LM Improves Controllable Text Generation

Controlling the behavior of language models (LMs) without re-training is a major open problem in natural language generation. We develop a new non-autoregressive language model based on continuous diffusions that we call Diffusion-LM.
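The core idea is that Diffusion-LM operates on continuous word embeddings and denoises Gaussian noise into text. The forward noising process can be sketched as below; the function names and the linear beta schedule are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def beta_schedule(T, beta_min=1e-4, beta_max=0.02):
    """Hypothetical linear noise schedule over T diffusion steps."""
    return np.linspace(beta_min, beta_max, T)

def q_sample(x0, t, alpha_bar, rng):
    """Sample x_t ~ q(x_t | x_0) = N(sqrt(abar_t) * x_0, (1 - abar_t) * I).

    x0 is a matrix of continuous token embeddings; as t grows,
    the embeddings are progressively replaced by Gaussian noise.
    """
    noise = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * noise

T = 1000
betas = beta_schedule(T)
alpha_bar = np.cumprod(1.0 - betas)  # cumulative signal-retention factor

rng = np.random.default_rng(0)
x0 = rng.standard_normal((8, 16))      # 8 token embeddings of dimension 16
x_mid = q_sample(x0, T // 2, alpha_bar, rng)
x_end = q_sample(x0, T - 1, alpha_bar, rng)

# By the final step almost all signal is gone: alpha_bar[-1] is near zero,
# so x_end is approximately pure Gaussian noise.
print(float(alpha_bar[-1]))
```

Generation runs this process in reverse: a learned network iteratively denoises a sequence of latent vectors, and a classifier can steer each denoising step toward a control target without retraining the LM.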

Data and Resources

Cite this as

Xiang Lisa Li, John Thickstun, Ishaan Gulrajani, Percy Liang, Tatsunori B. Hashimoto (2024). Dataset: Diffusion-LM Improves Controllable Text Generation. https://doi.org/10.57702/y6eyps02
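For convenience, the citation above can be written as a BibTeX entry; the entry key and `@misc` type are assumptions, while the field values come from this record.

```bibtex
@misc{li2024diffusionlm_dataset,
  author = {Li, Xiang Lisa and and Thickstun, John and Gulrajani, Ishaan
            and Liang, Percy and Hashimoto, Tatsunori B.},
  title  = {Dataset: Diffusion-LM Improves Controllable Text Generation},
  year   = {2024},
  doi    = {10.57702/y6eyps02},
  url    = {https://doi.org/10.57702/y6eyps02}
}
```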

DOI retrieved: December 3, 2024

Additional Info

Created: December 3, 2024
Last update: December 3, 2024
Defined in: https://doi.org/10.48550/arXiv.2205.14217
Author: Xiang Lisa Li
More authors: John Thickstun, Ishaan Gulrajani, Percy Liang, Tatsunori B. Hashimoto
Homepage: https://arxiv.org/abs/2205.14217