Federated optimization (FedOpt), which aims to train a learning model collaboratively across a large number of distributed clients, is vital for federated learning. The primary concerns in FedOpt are model divergence and communication efficiency, both of which significantly affect performance. In this paper, we propose a new method, LoSAC, to learn from heterogeneous distributed data more efficiently. Its key algorithmic insight is to locally update the estimate of the global full gradient after each regular local model update. Thus, LoSAC keeps clients' information refreshed in a more compact way. In particular, we study the convergence of LoSAC. Moreover, a bonus of LoSAC is the ability t...
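To make the stated insight concrete, below is a minimal sketch (not the authors' code) of one client's local loop, assuming a variance-reduction-style correction: after each regular local step, the client swaps its stale contribution out of its running estimate of the global full gradient. All names (local_round, global_grad_est, old_local_grad, grad_fn, num_clients) are hypothetical.

```python
import numpy as np

def local_round(w, global_grad_est, old_local_grad, grad_fn,
                num_clients, lr=0.1, steps=5):
    """One client's local round with a per-step refresh of the
    global-full-gradient estimate (illustrative, not LoSAC's exact update).

    w               -- current model parameters (np.ndarray)
    global_grad_est -- estimate of the global full gradient, taken here to
                       be the average of per-client gradients
    old_local_grad  -- this client's previous contribution to that average
    grad_fn         -- callable returning the local gradient at w
    """
    for _ in range(steps):
        g_local = grad_fn(w)
        # Step along the local gradient corrected toward the global
        # estimate, to counter client drift on heterogeneous data.
        w = w - lr * (g_local - old_local_grad + global_grad_est)
        # Locally refresh the global-full-gradient estimate after *each*
        # regular local model update by replacing the stale local term.
        global_grad_est = global_grad_est + (g_local - old_local_grad) / num_clients
        old_local_grad = g_local
    return w, global_grad_est, old_local_grad
```

Under this reading, each local step both advances the local objective and keeps the shared gradient estimate current, which is one way to interpret "keep clients' information refreshed in a more compact way."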
In recent centralized nonconvex distributed learning and federated learning, local methods are one o...
Distributed Mean Estimation (DME) is a central building block in federated learning, where clients s...
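As a hedged illustration of what a DME protocol looks like, the sketch below uses stochastic 1-bit quantization per coordinate, a standard scheme in the DME literature; it is an assumption here, not necessarily this paper's protocol, and all function names are hypothetical.

```python
import numpy as np

def encode(x, rng):
    """Stochastic 1-bit quantization: round each coordinate up to the
    vector max (or down to the min) with probability proportional to
    its position in [min, max], so decoding is unbiased."""
    lo, hi = x.min(), x.max()
    p = (x - lo) / max(hi - lo, 1e-12)   # probability of rounding up
    bits = rng.random(x.shape) < p
    return bits, lo, hi

def decode(bits, lo, hi):
    return np.where(bits, hi, lo)

def distributed_mean(client_vectors, seed=0):
    """Server-side estimate of the mean from clients' 1-bit encodings."""
    rng = np.random.default_rng(seed)
    decoded = [decode(*encode(x, rng)) for x in client_vectors]
    return np.mean(decoded, axis=0)
```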
There is a growing interest in the distributed optimization framework that goes under the name of Fe...
Federated Learning is a machine learning paradigm where we aim to train machine learning models in a...
As an emerging technology, federated learning (FL) involves training machine learning models over di...
Federated learning (FL) aims to minimize the communication complexity of training a model over heter...
Federated Averaging (FEDAVG) has emerged as the algorithm of choice for federated learning due to it...
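Since FEDAVG is the reference point for several of these papers, a minimal sketch of one communication round may help; client sampling, weighting by local data size, and learning-rate schedules are all simplified assumptions here.

```python
import numpy as np

def fedavg_round(global_w, client_grad_fns, lr=0.1, local_steps=5):
    """One FEDAVG round: each client runs local SGD from the shared model,
    and the server averages the resulting local models (uniform client
    weights assumed for simplicity)."""
    local_models = []
    for grad_fn in client_grad_fns:
        w = global_w.copy()
        for _ in range(local_steps):
            w = w - lr * grad_fn(w)   # local SGD step on client data
        local_models.append(w)
    return np.mean(local_models, axis=0)
```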
Federated learning enables a large number of edge computing devices to learn a model without data sh...
Federated learning (FL) is an emerging machine learning paradigm involving multiple clients, e.g., m...
In Federated Learning (FL), a number of clients or devices collaborate to train a model without shar...
Federated learning is an emerging distributed machine learning framework which jointly trains a glob...
We propose a solution to address the lack of high-probability guarantees in Federated Learning (FL) ...
The uneven distribution of local data across different edge devices (clients) results in slow model ...
In the setting of federated optimization, where a global model is aggregated periodically, step asyn...