This paper proposes a locally differentially private federated learning algorithm for strongly convex but possibly nonsmooth problems that protects the gradients of each worker against an honest but curious server. The proposed algorithm adds artificial noise to the shared information to ensure privacy and dynamically allocates the time-varying noise variance to minimize an upper bound of the optimization error subject to a predefined privacy budget constraint. This allows for an arbitrarily large but finite number of iterations to achieve both privacy protection and utility up to a neighborhood of the optimal solution, removing the need for tuning the number of iterations. Numerical results show the superiority of the proposed algorithm ov...
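The mechanism sketched in this abstract — workers clip their local gradients, add zero-mean Gaussian noise whose variance varies over rounds according to a schedule chosen under a total privacy budget, and the server only ever aggregates the perturbed gradients — can be illustrated with a minimal example. The sketch below is an assumption-laden illustration, not the paper's algorithm: the geometric decay schedule `sigma_schedule`, the quadratic toy objectives, and all function and parameter names are made up for this example, whereas the paper derives the variance allocation by minimizing an optimization-error upper bound subject to the privacy constraint.

```python
import numpy as np

def sigma_schedule(T, sigma0, rho):
    """Hypothetical time-varying noise scale: geometrically decaying,
    fixed offline for all T rounds. (The paper instead optimizes this
    allocation against an error bound under the privacy budget.)"""
    return [sigma0 * rho**t for t in range(T)]

def clip(g, c):
    """Clip the gradient to l2-norm at most c so the added Gaussian
    noise yields a well-defined local DP guarantee."""
    norm = np.linalg.norm(g)
    return g if norm <= c else g * (c / norm)

def private_federated_sgd(workers, x0, T, lr=0.1, clip_norm=1.0,
                          sigma0=1.0, rho=0.95, rng=None):
    """A locally differentially private FedSGD loop: each worker perturbs
    its clipped gradient before sharing, so an honest-but-curious server
    never observes raw gradients."""
    rng = rng or np.random.default_rng(0)
    x = x0.copy()
    sigmas = sigma_schedule(T, sigma0, rho)
    for t in range(T):
        noisy_grads = []
        for grad_fn in workers:
            g = clip(grad_fn(x), clip_norm)
            # Artificial Gaussian noise with round-dependent variance.
            g_priv = g + rng.normal(0.0, sigmas[t] * clip_norm, size=g.shape)
            noisy_grads.append(g_priv)
        x -= lr * np.mean(noisy_grads, axis=0)  # server-side aggregation
    return x

# Toy usage: each worker holds a strongly convex quadratic f_i(x) = 0.5*||x - a_i||^2,
# whose gradient is x - a_i. The iterates approach a neighborhood of the mean of the a_i.
targets = [np.array([1.0, 2.0]), np.array([3.0, -1.0]), np.array([0.0, 0.5])]
workers = [lambda x, a=a: x - a for a in targets]
x_hat = private_federated_sgd(workers, x0=np.zeros(2), T=200)
print(x_hat)
```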
Federated learning (FL) is an emerging technique that trains machine learning models across multiple...
We consider the problem of reinforcing federated learning with formal privacy guarantees. We propose...
We study federated learning (FL)--especially cross-silo FL--with non-convex loss functions and data ...
This paper studies the problem of federated learning (FL) in the absence of a trustworthy server/cli...
Federated Learning (FL) is a paradigm for large-scale distributed learning which faces two key chall...
As a popular distributed learning framework, federated learning (FL) enables clients to conduct coop...
To preserve participants' privacy, Federated Learning (FL) has been proposed to let participants col...
Advanced adversarial attacks such as membership inference and model memorization can make federated ...
We consider private federated learning (FL), where a server aggregates differentially private gradie...
Repeated parameter sharing in federated learning causes significant information leakage about privat...
Federated learning (FL) makes it possible to train on a massive amount of data privately due to its decentralized ...
Federated learning (FL) has emerged as a privacy solution for collaborative distributed learning whe...
Federated learning (FL) that enables edge devices to collaboratively learn a shared model while keep...
The task of preserving privacy while ensuring efficient communication is a fundamental challenge in ...
Since its inception, Federated Learning (FL) has successfully dealt with vario...