Federated learning is a powerful distributed learning scheme that allows numerous edge devices to collaboratively train a model without sharing their data. However, training is resource-intensive for edge devices, and limited network bandwidth is often the main bottleneck. Prior work typically addresses these constraints by condensing models or messages into compact formats, e.g., via gradient compression or distillation. In contrast, we propose ProgFed, the first progressive training framework for efficient and effective federated learning. It inherently reduces computation and two-way communication costs while maintaining strong performance of the final models. We theoretically prove that ProgFed converges at the same asymptotic rate as ...
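The sketch below illustrates the progressive training idea described above; it is a minimal toy example, not the authors' implementation. It assumes a model split into four sequential stages, a lightweight auxiliary head attached to each partial prefix, a fixed growth schedule (one new stage every two rounds), and plain federated averaging over synthetic data; all names (stages, sub_model, grow_every) are illustrative.

# Minimal sketch of progressive federated training (illustrative assumptions only).
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)

# Global model split into sequential stages; only a prefix is trained early on.
stages = nn.ModuleList([
    nn.Sequential(nn.Linear(32, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 10)),                      # final classifier stage
])
# One lightweight auxiliary head per partial prefix (dropped once the full model is active).
heads = nn.ModuleList([nn.Linear(64, 10) for _ in range(len(stages) - 1)])

def sub_model(num_active):
    """Prefix of the model plus its auxiliary head: the only part clients train and exchange."""
    body = nn.Sequential(*[stages[i] for i in range(num_active)])
    if num_active < len(stages):
        return nn.Sequential(body, heads[num_active - 1])
    return body

def local_update(model, x, y, lr=0.1, steps=5):
    """A client's local SGD on its private data; returns the updated parameters."""
    model = copy.deepcopy(model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        nn.functional.cross_entropy(model(x), y).backward()
        opt.step()
    return model.state_dict()

def fedavg(state_dicts):
    """Plain parameter averaging across client updates."""
    avg = copy.deepcopy(state_dicts[0])
    for k in avg:
        avg[k] = torch.stack([sd[k].float() for sd in state_dicts]).mean(dim=0)
    return avg

num_clients, rounds, grow_every = 4, 8, 2
client_data = [(torch.randn(16, 32), torch.randint(0, 10, (16,))) for _ in range(num_clients)]

for rnd in range(rounds):
    num_active = min(rnd // grow_every + 1, len(stages))
    global_model = sub_model(num_active)
    # Only the active sub-model travels both ways, which is where the two-way savings come from.
    exchanged = sum(p.numel() for p in global_model.parameters())
    updates = [local_update(global_model, x, y) for x, y in client_data]
    global_model.load_state_dict(fedavg(updates))         # writes back into the shared stages
    print(f"round {rnd}: active stages = {num_active}, parameters exchanged = {exchanged}")

Because each round rebuilds the sub-model from the same shared stage modules, weights learned in early shallow phases carry over as deeper stages are activated, while the number of parameters exchanged per round grows only as the schedule adds stages.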
Federated learning has advanced considerably over the last few years but still faces many challenges, suc...
Progressive compression allows images to start loading as low-resolution versions, becoming clearer ...
Due to privacy and regulatory reasons, sharing data between institutions can be difficult. Because o...
Federated learning is an effective solution for edge training, but the limited bandwidth an...
Training a large-scale model over a massive data set is an extremely computation and storage intensi...
In the modern paradigm of federated learning, a large number of users are involved in a global learn...
Methods for training models on graphs distributed across multiple clients have recently grown in pop...
Federated learning (FL) enables edge devices to cooperatively train a model while maintain...
Federated learning enables cooperative training among massively distributed clients by sharing their...
We introduce FedDCT, a novel distributed learning paradigm that enables the usage of large, high-per...
Federated Learning (FL) allows training machine learning models in privacy-constrained scenarios by ...
Federated Learning (FL) has attracted increasing attention in recent years. A leading training algor...
The distributed training of deep learning models faces two issues: efficiency and privacy. First of ...