DRFLM: Distributionally Robust Federated Learning with Inter-client Noise via Local Mixup

The authors propose a general framework that simultaneously addresses two challenges in federated learning: inter-client data heterogeneity and intra-client data noise. The framework applies distributionally robust optimization to mitigate the negative effects of data heterogeneity, and incorporates the mixup technique into the local training process to reduce the impact of intra-client data noise.
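As a rough illustration of the local mixup step described above, the sketch below mixes each training example in a batch with a randomly paired one using a Beta-distributed coefficient. The function name `mixup_batch` and the parameter `alpha` are illustrative choices, not taken from the paper, and this is a minimal sketch rather than the authors' implementation.

```python
# Hypothetical sketch of the mixup step applied during local client training.
# Names and parameters (mixup_batch, alpha) are illustrative assumptions.
import numpy as np

def mixup_batch(x, y, alpha=0.2, rng=None):
    """Return convex combinations of each example with a randomly paired one."""
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha)      # mixing coefficient drawn from Beta(alpha, alpha)
    perm = rng.permutation(len(x))    # random pairing within the batch
    x_mix = lam * x + (1.0 - lam) * x[perm]
    y_mix = lam * y + (1.0 - lam) * y[perm]  # soft labels dampen per-example noise
    return x_mix, y_mix

# Toy batch: 4 examples with 3 features, one-hot labels over 2 classes
x = np.arange(12, dtype=float).reshape(4, 3)
y = np.eye(2)[[0, 1, 0, 1]]
x_mix, y_mix = mixup_batch(x, y, alpha=0.2, rng=np.random.default_rng(0))
```

Because each mixed label is a convex combination of two one-hot labels, a single mislabeled example contributes only partially to any training target, which is the intuition behind using mixup against intra-client noise.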

Cite this as

Bingzhe Wu, Zhipeng Liang, Yuxuan Han, Yatao Bian, Peilin Zhao, Junzhou Huang (2024). Dataset: DRFLM: Distributionally Robust Federated Learning with Inter-client Noise via Local Mixup. https://doi.org/10.57702/yz6rbzpo

DOI retrieved: December 3, 2024

Additional Info

Created: December 3, 2024
Last update: December 3, 2024
Defined in: https://doi.org/10.48550/arXiv.2204.07742
Author: Bingzhe Wu
More authors: Zhipeng Liang, Yuxuan Han, Yatao Bian, Peilin Zhao, Junzhou Huang