FedSVD: Asynchronous Federated Learning With Stale Weight Vector Decomposition
| Main Authors: | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/11015799/ |
| Summary: | Federated learning (FL) has emerged as a collaborative learning framework that addresses the critical needs for privacy preservation and communication efficiency. In synchronous FL, each client waits for the global model, which is aggregated from the trained models of all participating clients. To alleviate the downtime associated with waiting for all clients to complete training, asynchronous FL enables independent aggregation of client models. In asynchronous FL, the global model is continuously updated during client training, leading to the inevitable issue of stale updates when clients return their models to the server. These outdated updates hinder the convergence of the global model during aggregation. To address this staleness problem, we propose FedSVD, a method that leverages vector decomposition of stale weights. FedSVD evaluates each client's trained weights in terms of their staleness relative to the current global model and decomposes the weights into two vectors: one pointing in the direction of the previous global model update, and another orthogonal to it. The global model is then updated using only the orthogonal vector, as the parallel vector is considered already accounted for in the current global model. Experimental results show that FedSVD outperforms existing baseline methods on benchmark datasets under various client conditions and data distributions. |
| ISSN: | 2169-3536 |
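The decomposition described in the summary, splitting a stale client update into a component parallel to the previous global update and a component orthogonal to it, can be sketched with standard vector projection. The following NumPy snippet is a minimal illustration under that reading of the abstract, not the authors' implementation; the function name, the treatment of weights as flattened vectors, and the epsilon guard are all assumptions.

```python
import numpy as np

def orthogonal_component(client_delta, prev_global_update, eps=1e-12):
    """Decompose a stale client update relative to the previous global
    update direction and return only the orthogonal component (the
    parallel component is treated as already reflected in the global model)."""
    g = prev_global_update
    # Scalar coefficient of the projection of the client delta onto g.
    coeff = np.dot(client_delta, g) / (np.dot(g, g) + eps)
    parallel = coeff * g
    return client_delta - parallel

# Toy example: previous global step along the x-axis.
g = np.array([1.0, 0.0])
delta = np.array([0.6, 0.8])
orth = orthogonal_component(delta, g)
# The parallel part (0.6 along g) is discarded; dot(orth, g) is ~0.
```

In practice the server would apply `orth` (possibly scaled by a staleness-aware weight) to the current global model instead of the raw stale delta; that aggregation step is outside this sketch.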