
DRFLM: Distributionally Robust Federated Learning with Inter-client Noise via Local Mixup

The authors propose a general framework that addresses two challenges simultaneously: inter-client data heterogeneity and intra-client data noise. The framework uses distributionally robust optimization to mitigate the negative effects of data heterogeneity and incorporates mixup into the local training process to reduce the impact of intra-client data noise.
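
The abstract describes two ingredients: mixup-based local training on each client and a distributionally robust aggregation across clients. Below is a minimal sketch of both, assuming a standard PyTorch setup and a KL-ball uncertainty set over client weights; the function names and the exact reweighting rule are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of local mixup plus DRO-style
# client reweighting. `local_mixup_step` and `dro_client_weights` are
# hypothetical helper names chosen for this example.

import numpy as np
import torch
import torch.nn.functional as F

def local_mixup_step(model, optimizer, x, y, alpha=0.2):
    """One local training step with mixup to dampen intra-client label noise."""
    lam = np.random.beta(alpha, alpha)          # mixing coefficient ~ Beta(alpha, alpha)
    idx = torch.randperm(x.size(0))             # random pairing of samples in the batch
    x_mix = lam * x + (1 - lam) * x[idx]        # convex combination of inputs
    logits = model(x_mix)
    # Loss is the same convex combination applied to the two label sets.
    loss = lam * F.cross_entropy(logits, y) + (1 - lam) * F.cross_entropy(logits, y[idx])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def dro_client_weights(client_losses, temperature=1.0):
    """Distributionally robust reweighting (assumed KL-ball formulation):
    clients with larger local loss get larger aggregation weight via
    exponential tilting, i.e. a softmax over the reported losses."""
    losses = torch.tensor(client_losses)
    return torch.softmax(losses / temperature, dim=0)
```

In a federated round, each client would run `local_mixup_step` over its batches and report its average loss; the server would then aggregate the client updates using `dro_client_weights` instead of the uniform FedAvg average, so that the hardest-hit distributions carry more weight.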

Data and Resources

This dataset has no data

Cite this as

Bingzhe Wu, Zhipeng Liang, Yuxuan Han, Yatao Bian, Peilin Zhao, Junzhou Huang (2024). Dataset: DRFLM: Distributionally Robust Federated Learning with Inter-client Noise via Local Mixup. https://doi.org/10.57702/yz6rbzpo

Private DOI
This DOI is not yet resolvable. It is available for use in manuscripts and will be published when the dataset is made public.

Additional Info

Field          Value
Created        December 3, 2024
Last update    December 3, 2024
Defined In     https://doi.org/10.48550/arXiv.2204.07742
Author         Bingzhe Wu
More Authors   Zhipeng Liang, Yuxuan Han, Yatao Bian, Peilin Zhao, Junzhou Huang