KL-FedDis: A federated learning approach with distribution information sharing using Kullback-Leibler divergence for non-IID data
Data heterogeneity, or non-IID (non-independent and identically distributed) data, is one of the prominent challenges in Federated Learning (FL). In the non-IID setting, each client holds its own local data, which may not be independently and identically distributed. This arises because clients i...
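The Kullback-Leibler divergence at the core of the approach quantifies how far one client's data distribution deviates from a reference distribution. The sketch below is a minimal illustration of that quantity, not the paper's implementation; the client label histogram and uniform reference are assumptions for the example:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D_KL(p || q) between two discrete distributions.

    A small epsilon avoids log(0); both inputs are renormalized so
    raw counts and probabilities are handled uniformly.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Illustrative example: a skewed (non-IID) client's label counts
# compared against a balanced global reference distribution.
client_counts = [90, 5, 5]          # client sees mostly one class
uniform_ref = [1/3, 1/3, 1/3]       # balanced reference

print(kl_divergence(client_counts, uniform_ref))  # large -> strong skew
print(kl_divergence(uniform_ref, uniform_ref))    # near 0 -> identical
```

A higher divergence signals a more heterogeneous client, which is the kind of distribution information a server could use when aggregating client updates.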
| Main Authors: | Md. Rahad, Ruhan Shabab, Mohd. Sultan Ahammad, Md. Mahfuz Reza, Amit Karmaker, Md. Abir Hossain |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Elsevier, 2025-03-01 |
| Series: | Neuroscience Informatics |
| Subjects: | |
| Online Access: | http://www.sciencedirect.com/science/article/pii/S277252862400027X |
Similar Items
- Affine Calculus for Constrained Minima of the Kullback–Leibler Divergence
  by: Giovanni Pistone
  Published: (2025-03-01)
- Deciphering the Social Vulnerability of Landslides Using the Coefficient of Variation-Kullback-Leibler-TOPSIS at an Administrative Village Scale
  by: Yueyue Wang, et al.
  Published: (2025-02-01)
- Towards few-shot learning with triplet metric learning and Kullback-Leibler optimization
  by: Yukun Liu, et al.
  Published: (2025-06-01)
- Kullback–Leibler Divergence-Based Fault Detection Scheme for 100% Inverter Interfaced Autonomous Microgrids
  by: Ali Mallahi, et al.
  Published: (2025-06-01)
- Assessing the Impact of Physical Activity on Dementia Progression Using Clustering and the MRI-Based Kullback–Leibler Divergence
  by: Agnieszka Wosiak, et al.
  Published: (2025-01-01)