
Towards Efficient Dialogue Pre-training with Transferable and Interpretable Latent Structure

This paper proposes a novel dialogue model with a latent structure that is easily transferable from the general domain to downstream tasks in a lightweight and transparent way.

Data and Resources

Cite this as

Xueliang Zhao, Lemao Liu, Tingchen Fu, Shuming Shi, Dongyan Zhao, Rui Yan (2024). Dataset: Towards Efficient Dialogue Pre-training with Transferable and Interpretable Latent Structure. https://doi.org/10.57702/9rpvoyo9

DOI retrieved: December 17, 2024

Additional Info

Created: December 17, 2024
Last update: December 17, 2024
Defined In: https://doi.org/10.48550/arXiv.2210.12461
Author: Xueliang Zhao
More Authors: Lemao Liu, Tingchen Fu, Shuming Shi, Dongyan Zhao, Rui Yan
Homepage: https://arxiv.org/abs/2106.09567