Communication-Balancing Threshold for Event-Triggered Federated Learning

Bibliographic Details
Main Authors: Juhyeong Yoon, Jun-Pyo Hong, Jaeyoung Song
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/11096591/
Description
Summary: Federated Learning (FL) enables training models across distributed devices while preserving data privacy by avoiding raw data sharing. However, it suffers from significant communication overhead. Event-Triggered FL (ETFL) addresses this issue by allowing devices to transmit updates only when substantial changes occur in the model. Nevertheless, this approach may result in imbalanced communication, where some devices communicate more frequently than others, leading to uneven model performance and slower overall convergence. To address this, we propose a new threshold-based method that dynamically adjusts each device's communication frequency. Our method ensures balanced communication across devices and reduces the time required for each training iteration, ultimately accelerating convergence. Furthermore, we analyze how a device's communication affects the difference between its local model and the global model. Through extensive experiments, we demonstrate that the proposed method significantly reduces communication imbalance and achieves faster convergence compared to existing approaches. These results highlight the importance of balancing communication in federated learning to improve overall performance and ensure fairness across devices.
ISSN: 2169-3536
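
The abstract does not specify the authors' threshold-update rule, so the following is only a minimal Python sketch of the event-triggered idea it describes. The rebalancing step is a hypothetical proportional rule, and all names (should_transmit, rebalance_thresholds, delta, eta) are illustrative rather than the paper's notation: each device transmits only when its local model has drifted past a per-device threshold, and thresholds are periodically adjusted so frequently transmitting devices send less often and quiet devices send more.

```python
import numpy as np

def should_transmit(w_local, w_last_sent, delta):
    # Event trigger: transmit only when the local model has drifted
    # beyond this device's threshold since its last transmission.
    return np.linalg.norm(w_local - w_last_sent) > delta

def rebalance_thresholds(deltas, send_counts, eta=0.1):
    # Hypothetical balancing rule (illustrative, not the authors'
    # method): raise the thresholds of devices that transmit more
    # often than average and lower those of quieter devices, pushing
    # all devices toward a common communication frequency.
    send_counts = np.asarray(send_counts, dtype=float)
    mean_count = send_counts.mean()
    return deltas * (1.0 + eta * (send_counts - mean_count) / (mean_count + 1e-9))

# Example: device 0 has transmitted often and device 3 rarely, so their
# thresholds are raised and lowered respectively after rebalancing.
deltas = rebalance_thresholds(np.full(4, 0.5), [12, 3, 7, 2])
```

Under such a rule, a device that triggers often sees its threshold grow, making future triggers rarer; this is one simple way to equalize per-device communication frequency in the spirit the abstract motivates.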