ZeroQuant-FP: A Leap Forward in LLMs Post-Training W4A8 Quantization Using Floating-Point Formats

The paper does not explicitly describe the dataset it uses; it notes only that it is a large language model dataset.
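For context, the W4A8 floating-point scheme named in the title quantizes weights to 4-bit floats (e.g. E2M1) and activations to 8-bit floats (e.g. E4M3). The sketch below is a minimal, illustrative NumPy simulation of round-to-nearest quantization onto such tiny float grids; it is not the paper's implementation, it uses a simple per-tensor max scale, and it ignores the OCP FP8 special-value conventions (so its E4M3 maximum is 480 rather than 448):

```python
import numpy as np

def fp_grid(exp_bits, man_bits):
    """All non-negative values representable by a tiny IEEE-like float
    with the given exponent/mantissa widths (no inf/nan; illustrative)."""
    bias = 2 ** (exp_bits - 1) - 1
    vals = [0.0]
    for e in range(2 ** exp_bits):
        for m in range(2 ** man_bits):
            if e == 0:  # subnormals
                v = (m / 2 ** man_bits) * 2.0 ** (1 - bias)
            else:
                v = (1 + m / 2 ** man_bits) * 2.0 ** (e - bias)
            vals.append(v)
    return np.unique(np.array(vals))

def quantize_fp(x, exp_bits, man_bits):
    """Simulate FP quantization: scale x so its max lands on the grid max,
    round each magnitude to the nearest representable value, rescale."""
    grid = fp_grid(exp_bits, man_bits)
    amax = np.abs(x).max()
    scale = amax / grid.max() if amax > 0 else 1.0
    mag = np.abs(x) / scale
    idx = np.argmin(np.abs(mag[..., None] - grid), axis=-1)
    return np.sign(x) * grid[idx] * scale

# Hypothetical usage: FP4 (E2M1) weights, FP8 (E4M3) activations
w = np.random.default_rng(0).standard_normal((8, 8))
w_q = quantize_fp(w, exp_bits=2, man_bits=1)   # W4
a_q = quantize_fp(np.abs(w), exp_bits=4, man_bits=3)  # A8
```

The E2M1 grid this produces (0, 0.5, 1, 1.5, 2, 3, 4, 6 and their negatives, up to scaling) matches the standard FP4 value set.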

Data and Resources

Cite this as

Xiaoxia Wu, Zhewei Yao, Yuxiong He (2025). Dataset: ZeroQuant-FP: A Leap Forward in LLMs Post-Training W4A8 Quantization Using Floating-Point Formats. https://doi.org/10.57702/sjdl7oz5

DOI retrieved: January 2, 2025

Additional Info

Created: January 2, 2025
Last update: January 2, 2025
Defined In: https://doi.org/10.48550/arXiv.2307.09782
Author: Xiaoxia Wu
More Authors: Zhewei Yao, Yuxiong He
Homepage: https://github.com/microsoft/DeepSpeed