Faster Convergence With Less Communication: Broadcast-Based Subgraph Sampling for Decentralized Learning Over Wireless Networks

Decentralized stochastic gradient descent (D-SGD) is a widely adopted optimization algorithm for decentralized training of machine learning models across networked agents. A crucial part of D-SGD is the consensus-based model averaging, which heavily relies on information exchange and fusion among th...
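The abstract refers to the consensus-based model averaging step of D-SGD. The following minimal NumPy sketch illustrates that step only: a local gradient update followed by weighted averaging with neighbors. The ring topology, Metropolis mixing weights, and quadratic per-agent losses are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Minimal sketch of D-SGD rounds on a ring graph, assuming a doubly
# stochastic mixing matrix W (Metropolis weights) and synthetic quadratic
# losses f_i(x) = 0.5 * ||x - t_i||^2. Illustrative only; not the paper's
# broadcast-based subgraph sampling scheme.

rng = np.random.default_rng(0)
n_agents, dim, lr = 5, 3, 0.1

# Ring topology: each agent communicates with its two neighbors.
A = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    A[i, (i - 1) % n_agents] = A[i, (i + 1) % n_agents] = 1

# Metropolis-Hastings weights yield a symmetric, doubly stochastic W.
deg = A.sum(axis=1)
W = np.zeros_like(A)
for i in range(n_agents):
    for j in range(n_agents):
        if A[i, j]:
            W[i, j] = 1.0 / (1.0 + max(deg[i], deg[j]))
    W[i, i] = 1.0 - W[i].sum()

targets = rng.normal(size=(n_agents, dim))   # per-agent optima t_i
x = rng.normal(size=(n_agents, dim))         # per-agent model copies

for step in range(200):
    grads = x - targets                      # gradient of 0.5*||x - t_i||^2
    x = W @ (x - lr * grads)                 # local SGD step, then consensus averaging

# Agents' models converge toward a common point; disagreement shrinks.
print("disagreement:", np.linalg.norm(x - x.mean(axis=0)))
```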


Bibliographic Details
Main Authors: Daniel Perez Herrera, Zheng Chen, Erik G. Larsson
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Open Journal of the Communications Society
Online Access: https://ieeexplore.ieee.org/document/10879080/