FedCD: Improving Performance in non-IID Federated Learning

Federated learning has been widely applied to enable decentralized devices, each holding its own local data, to learn a shared model. However, learning from real-world data is challenging, as it is rarely independently and identically distributed (IID) across edge devices, a key assumption of current high-performing, low-bandwidth algorithms. We present a novel approach, FedCD, which clones and deletes models to dynamically group devices with similar data distributions.
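The clone-and-delete idea described above can be illustrated with a toy round: each device scores every candidate model on its local data, models that perform well across devices are cloned (so subgroups of similar devices can specialize further), and models that perform poorly everywhere are deleted. This is only a minimal sketch of the general mechanism; the function names, scoring stand-in, and thresholds (`clone_thresh`, `delete_thresh`) are hypothetical and not taken from the paper.

```python
def local_score(model, device_data):
    """Toy stand-in for evaluating a model on a device's local data.

    Returns a high score when the model's 'specialty' matches the device's
    dominant data label, mimicking non-IID local performance. Hypothetical,
    for illustration only.
    """
    return 1.0 if model["specialty"] == device_data["label"] else 0.1

def fedcd_round(models, devices, clone_thresh=0.8, delete_thresh=0.2):
    """One illustrative FedCD-style round (assumed structure, not the
    authors' exact algorithm): score each model across all devices,
    delete models that score poorly everywhere, and clone models that
    score well so similar devices can group around them.
    """
    scores = {m["id"]: 0.0 for m in models}
    assignment = {}
    for dev in devices:
        # Each device attaches to the model that performs best locally,
        # implicitly grouping devices with similar data.
        best = max(models, key=lambda m: local_score(m, dev))
        assignment[dev["id"]] = best["id"]
        for m in models:
            scores[m["id"]] += local_score(m, dev) / len(devices)
    # Delete models that perform poorly on every device's data.
    survivors = [m for m in models if scores[m["id"]] >= delete_thresh]
    # Clone models that perform well broadly, seeding new subgroups.
    clones = [dict(m, id=m["id"] + "_clone")
              for m in survivors if scores[m["id"]] >= clone_thresh]
    return survivors + clones, assignment
```

With two device groups (labels "A" and "B") and a third model matching neither, the mismatched model is deleted while each device is assigned to the model suited to its local distribution.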

Data and Resources

This dataset has no data

Cite this as

Kavya Kopparapu, Eric Lin, Jessica Zhao (2024). Dataset: FedCD: Improving Performance in non-IID Federated Learning. https://doi.org/10.57702/wqbepbn5

Private DOI: This DOI is not yet resolvable. It is available for use in manuscripts and will be published when the dataset is made public.

Additional Info

Created: December 17, 2024
Last update: December 17, 2024
Defined in: https://doi.org/10.48550/arXiv.2006.09637
Author: Kavya Kopparapu
More authors: Eric Lin, Jessica Zhao
Homepage: https://doi.org/10.1145/nnnnnnn.nnnnnnn